Data Engineer
Multiplied
Amsterdam, Netherlands
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Compensation: € 7.5K
Job location: Amsterdam, Netherlands
Tech stack
Clean Code Principles
API
Artificial Intelligence
Airflow
Azure
Continuous Integration
Information Engineering
Data Integration
ETL
Data Vault Modeling
Distributed Computing Environment
Django
Microsoft Dynamics CRM
Python
Standard SQL
React
Flask
Backend
FastAPI
Vue.js
Data Lake
Angular
PySpark
Information Technology
Deployment Automation
GraphQL
Machine Learning Operations
Front End Software Development
API Design
Terraform
Data Pipelines
Docker
Databricks
Job description
As a Senior Data Engineer, you will be responsible for designing, building, and optimizing scalable data solutions within a modern Azure lakehouse environment. You will play a leading role in establishing robust data foundations and work closely with data scientists and business stakeholders to turn data into actionable insights.

Responsibilities:
- Leading the design and implementation of scalable ETL/ELT pipelines on Databricks
- Working within a medallion architecture (bronze, silver, gold)
- Building reusable, metadata-driven pipeline frameworks
- Managing and optimizing Delta Lake and Azure Data Lake Storage Gen2
- Setting high standards for data quality, governance, and performance
- Implementing monitoring, alerting, and observability
- Developing APIs and backend services in Python (FastAPI, Flask, or Django)
- Delivering reusable, production-grade components for wider team usage
- Supporting lightweight front-end solutions (e.g. React or Angular) where needed
- Engineering and evolving an Azure-based lakehouse platform
- Using Terraform and Azure DevOps for IaC, CI/CD, and deployment automation
- Integrating data with ERP and external systems (MS Dynamics is a plus)
- Mentoring and supporting junior data engineers
- Promoting best practices such as TDD, clean code, and solid documentation
- Collaborating with data scientists on AI and machine learning use cases
- Staying up to date with emerging tools and technologies and applying them where relevant
Requirements
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- 5+ years of hands-on experience in data engineering
- 1-2+ years in a senior or lead role
- Strong Python skills (PySpark for pipelines + API frameworks)
- Solid SQL skills with experience optimizing complex queries
- Proven experience with Databricks, Delta Lake, and distributed data processing
- Experience with AI/ML pipelines or working closely with data science teams
- Hands-on experience with Azure Cloud, Terraform, and Azure DevOps
- Experience designing APIs and integration patterns (REST/GraphQL)
Nice to have:
- Experience with Apache Airflow or similar orchestration tools
- Familiarity with Docker and Kubernetes
- Front-end experience (React, Angular, or Vue)
- Knowledge of data modeling (Kimball, Data Vault, star/snowflake schemas)
Benefits & conditions
- Salary up to €90,000 gross per year
- Strong secondary benefits package
- A collaborative, hands-on, and technically strong team
- Plenty of room for ownership and technical decision-making
- The opportunity to work with modern data and AI technologies