Senior Data Engineer
Job description
- Data Architecture & Pipelines: Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and deliver structured and unstructured data.
- Data Governance & Quality: Implement governance frameworks, data catalogs, and quality assurance processes to ensure compliance, integrity, and security of all data assets.
- Data Integration & Storage: Manage data lakes, warehouses, and streaming platforms; optimize storage solutions for performance and cost-efficiency.
- AI & Analytics Enablement: Prepare and curate datasets for machine learning and advanced analytics projects, ensuring accessibility and reliability for data scientists and AI engineers.
- Performance Optimization: Monitor and optimize data workflows for speed, scalability, and resilience in cloud and on-prem environments.
- Collaboration: Work closely with cross-functional teams (data scientists, software engineers, project managers) to deliver end-to-end data solutions.
- Continuous Improvement: Stay updated on emerging technologies in data engineering, big data, and cloud platforms; drive adoption of best practices.
- Client Interaction: Support clients in understanding data requirements, present technical solutions, and provide expertise on data-driven strategies.
Requirements
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Experience: 3-5 years in data engineering, big data, or data platform development.
- Technical Skills:
  - Strong proficiency in Python and SQL; experience with distributed systems (Spark, Databricks).
  - Cloud Platforms:
    - Microsoft Azure: Azure Data Factory, Azure Data Lake Storage, Microsoft Fabric, Azure Cosmos DB.
    - Other Clouds: AWS (Glue, RDS, Athena, Redshift, S3), GCP (BigQuery, Dataflow).
  - Data Orchestration & Workflow: Airflow, Prefect, Dagster, ADF.
  - Data Transformation: dbt.
  - Data Warehousing & Modeling: Snowflake, DuckDB, BigQuery.
  - Streaming & Real-Time Processing: Kafka, Azure Event Hubs.
  - Visualization & BI: Power BI, Tableau.
  - Familiarity with CI/CD pipelines (Azure DevOps, GitHub Actions) and containerization (Docker, Kubernetes).
  - Infrastructure as Code: Terraform.
- Analytical Thinking: Ability to design efficient data workflows and troubleshoot complex data issues.
- Communication: Clear and concise communication skills for technical and non-technical stakeholders.
- Ethical Standards: Commitment to data privacy, security, and compliance with relevant regulations.
- Bonus: Experience with MLOps and DataOps.
- Languages: Fluency in both written and spoken Dutch and English is required.
Benefits & conditions
- Opportunity to work on cutting-edge data and AI projects with diverse clients.
- Collaborative and supportive work environment.
- Continuous learning and professional development opportunities.
- Competitive salary and benefits package.
- Flexible working hours and work-life balance initiatives.