Python Developer (ZZP)
Job description
The Python Data Engineer works on external client projects to design and deliver robust data-processing solutions that support the client's business vision.
The role focuses on building, testing, and optimizing Python-based data pipelines and workflows, particularly for high-volume or time-series datasets.
Working closely with data architects and platform teams from the early design stages, the engineer contributes to scalable, reliable data platforms and enables high-quality data consumption across systems.
- Data Pipelines & Processing: Design, build, and maintain scalable data pipelines and data-processing components for high-volume and time-series datasets.
- Data Modeling & Storage: Design and optimize data models and storage layers for analytics and downstream consumption, ensuring data quality and reliability.
- Data-Oriented Services: Develop Python-based services and jobs to support data ingestion, transformation, and validation workflows.
- Delivery & Collaboration: Contribute to CI/CD pipelines, monitoring, and operational reliability. Collaborate with backend and platform engineers to ensure smooth integration with consuming systems.
- Backend Integration (Secondary): Contribute to the design and implementation of scalable backend services and APIs using Python, with a focus on reliability, performance, and maintainability.
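To give a flavor of the pipeline and validation work described above, here is a minimal, dependency-free sketch of a batch job that ingests raw time-series records, validates them, and aggregates readings per hour. All names (`validate_record`, `hourly_averages`) are illustrative, not taken from the client's actual stack:

```python
# Minimal batch-pipeline step for time-series data:
# ingest raw records, validate them, aggregate readings per hour.
from datetime import datetime
from collections import defaultdict
from statistics import mean

def validate_record(record):
    """Keep only records with a parseable timestamp and numeric value."""
    try:
        ts = datetime.fromisoformat(record["timestamp"])
        value = float(record["value"])
    except (KeyError, ValueError):
        return None
    return ts, value

def hourly_averages(records):
    """Group valid readings by hour and average them."""
    buckets = defaultdict(list)
    for record in records:
        parsed = validate_record(record)
        if parsed is None:
            continue  # drop invalid rows; a real pipeline would log or route them
        ts, value = parsed
        buckets[ts.replace(minute=0, second=0, microsecond=0)].append(value)
    return {hour: mean(values) for hour, values in buckets.items()}

raw = [
    {"timestamp": "2024-01-01T10:15:00", "value": "1.0"},
    {"timestamp": "2024-01-01T10:45:00", "value": "3.0"},
    {"timestamp": "not-a-date", "value": "9.9"},   # rejected by validation
    {"timestamp": "2024-01-01T11:00:00", "value": "4.0"},
]
print(hourly_averages(raw))
```

In practice this logic would run inside an orchestrated job (e.g. on a cloud data platform) against far larger volumes, but the shape, ingest, validate, transform, is the same.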
Requirements
- Bachelor's or Master's degree in IT or a related field
- Strong experience in Python development (5-8 years)
- Strong Python development for data engineering workloads (data processing, batch/streaming jobs, analytics workflows).
- Solid SQL skills and experience with data modeling and query optimization.
- Experience working with cloud-based data platforms and data pipelines.
- Excellent communication skills: you present the architectural design and technical solution to the client and the team, and set clear goals to work towards.
- You have a strong problem-solving and ownership mindset.
- You have experience with Agile/Scrum methodologies.
- Your communication skills in English are excellent (Dutch is a plus).
Nice-to-Have:
- Experience with lakehouse or data lake architectures (experience with Databricks preferred).
- Familiarity with orchestration and scheduling tools.
- Experience exposing data via APIs or supporting data consumption use cases.
- Experience in the energy domain.