Senior Data Engineer

Devleaps
Hoofddorp, Netherlands
4 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Hoofddorp, Netherlands

Tech stack

Clean Code Principles
API
Amazon Web Services (AWS)
Azure
Google BigQuery
Cloud Computing
Data as a Service
ETL
Data Security
Data Systems
Data Warehousing
GitHub
Python
Software Version Management
Apigee
Bitbucket
API Design
Terraform
API Management
Serverless Computing

Job description

  • Own the full lifecycle of data products: ingest, transform (dbt), store (BigQuery or equivalent), and serve via APIs (Python on Cloud Run or similar)

  • Translate business needs into robust data contracts with well-designed schemas, versioning, and change management

  • Design for scale and reuse, replacing brittle, ad-hoc solutions with maintainable and scalable data products

  • Expose and secure data via APIs and warehouse interfaces with proper access controls and compliance

  • Automate infrastructure with Terraform, containerize services, and optimize for performance and cost

  • Collaborate cross-functionally with analysts, product managers, and engineers; lead technical discussions and mentor others
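The first responsibility above, serving warehouse data via a Python API, can be sketched roughly as follows. This is a hypothetical illustration, not the team's actual service: the stdlib `http.server` stands in for a Cloud Run service, and the `ORDERS` list stands in for a BigQuery query result.

```python
# Hypothetical sketch of the "serve via APIs" step. The stdlib http.server
# stands in for a Cloud Run service; ORDERS stands in for a warehouse
# query result (BigQuery in production).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

ORDERS = [  # stand-in rows; real code would run a parameterized query
    {"order_id": 1, "region": "EU", "amount": 120.0},
    {"order_id": 2, "region": "US", "amount": 80.5},
]

def serve_orders(region=None):
    """Filter rows the way a parameterized warehouse query would."""
    rows = [r for r in ORDERS if region is None or r["region"] == region]
    return {"version": "v1", "count": len(rows), "data": rows}

class OrdersHandler(BaseHTTPRequestHandler):
    """GET returns the full, versioned order feed as JSON."""
    def do_GET(self):
        body = json.dumps(serve_orders()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run locally:
# HTTPServer(("localhost", 8080), OrdersHandler).serve_forever()
```

Note the `version` field in every response: tagging the payload makes later schema changes explicit to consumers, which is the point of the "versioning and change management" bullet above.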

Requirements

At Devleaps, we're not just hiring engineers; we're looking for people who love to build, break, fix, and improve.

People who get excited about solving real problems, not just writing clean code. For us, a great job isn't just about the work. It's about learning, growing, and being part of a team that genuinely cares. No egos. No pointless meetings. Just meaningful tech, smart people, and a culture where your voice matters.

Are you passionate about data? Let's talk.

The Mission

Design and deliver reusable, domain-oriented data products, not one-off, point-to-point data feeds. Instead of building temporary extracts for individual requests, you'll create durable, versioned data assets that serve multiple teams and evolve with the business.
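One way to make "durable, versioned data assets" concrete is a data contract with an explicit compatibility rule. The sketch below is illustrative only (the contract shape and field names are invented): additive changes pass, while removing or retyping a field that consumers depend on fails.

```python
# Hypothetical data-contract sketch: a schema dict with a version number,
# plus a check that a new version only ever adds fields.
CONTRACT_V1 = {
    "version": 1,
    "fields": {"customer_id": "STRING", "signup_date": "DATE"},
}

def is_backward_compatible(old, new):
    """New version may add fields but must not remove or retype old ones."""
    for name, ftype in old["fields"].items():
        if new["fields"].get(name) != ftype:
            return False
    return new["version"] == old["version"] + 1

# An additive change: existing consumers keep working.
CONTRACT_V2 = {
    "version": 2,
    "fields": {
        "customer_id": "STRING",
        "signup_date": "DATE",
        "country": "STRING",
    },
}
```

In practice a check like this would run in CI against the warehouse schema (dbt model contracts offer a built-in variant of the same idea), so a breaking change is caught before it reaches consumers.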

Tech Stack You'll Use

  • Python: Core language for building APIs and data services

  • Terraform: Infrastructure as Code

  • Cloud Platforms: GCP preferred (Cloud Run, BigQuery), but AWS or Azure experience is welcome (multi-cloud experience is even better)

  • dbt: Data modeling and transformation

  • Apigee (or equivalent): API management and monitoring

  • ETL/ELT: Pipeline orchestration, testing, and lineage

  • CI/CD: GitHub Actions, Bitbucket Pipelines, etc.

  • Networking: Secure service-to-service communication

Must-Haves

  • 5+ years of experience building and maintaining end-to-end data products in production environments

  • Strong expertise in Python and cloud data warehouses (preferably BigQuery)

  • Solid experience with dbt, API design, and data modeling/lineage

  • Proven ability to guide stakeholders from vague requests to well-structured, reusable data solutions

  • Clear, proactive communicator; confident working directly with technical and business teams

Nice-to-Haves

  • Experience with Apigee, Terraform modules, and serverless containers

  • Strong foundation in observability, security-by-design, and cost management

  • Background in complex data modeling (e.g., dimensional models, domain-driven design)

Apply for this position