Data Engineer
Talentor
6 days ago
Role details
Contract type: Temporary contract
Employment type: Part-time (≤ 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Compensation: € 7K gross per month
Job location: Utrecht
Tech stack
Java
Airflow
Google BigQuery
Data Transformation
Data Stores
Python
PostgreSQL
Operational Data Store
Performance Tuning
Query Optimization
SQL Databases
Scripting (Bash/Python/Go/Ruby)
Data Pipelines
Apache Beam
Job description
As a Data Engineer on the Warehouse Data Availability team, you'll be an integral part of a collaborative unit including data and BI engineers, a lead engineer, a product manager, and an engineering manager. Your contributions will include:
- Architecting, developing, and sustaining data pipelines for extracting and transforming data from diverse warehouse systems.
- Leveraging Postgres and BigQuery as primary data repositories.
- Constructing and refining data transformations through the use of dbt.
- Managing and automating workflows utilizing Apache Airflow.
- Engineering robust and scalable processing solutions with GCP Dataflow.
- Producing clean, well-maintained SQL and Python code.
- Enhancing the performance, reliability, and observability of existing pipelines.
- Partnering with stakeholders to translate their data requirements into effective technical implementations.

This is an assignment in Utrecht until December 2026, for 32 hours per week (hybrid, with two office days per week). On this assignment, you will work on a flex contract via our agency. The salary range for this role is €6,500-€7,000 gross per month based on 40 hours per week.
Requirements
If you're a Medior or Senior engineer ready to make a significant impact and drive technical excellence, let's connect!

- 5+ years of experience as a Data Engineer, preferably within the warehouse, logistics, or operational data domains.
- Strong proficiency in SQL, encompassing data modeling, query optimization, and performance tuning.
- Hands-on experience with Postgres and BigQuery, or similar enterprise-grade data warehouses.
- Experience with data pipeline orchestration tools, specifically Airflow.
- Familiarity with data transformation tools, specifically dbt.
- Solid experience with Python for data manipulation, scripting, and automation.
- Experience with CI/CD pipelines.
- Experience with Java is highly preferred.
- Experience with GCP Dataflow or Apache Beam for building data processing pipelines.