Full-Stack Data Engineer

Eneco
Rotterdam, Netherlands
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Rotterdam, Netherlands

Tech stack

Azure
Code Review
Databases
Continuous Integration
Data Systems
Database Design
DevOps
Integrated Development Environments
Python
PostgreSQL
Performance Tuning
Software Engineering
SQL Databases
Data Ingestion
Snowflake
Backend
Git
Kubernetes
Data Management
Data Pipelines
Microservices

Job description

At Eneco, we're driving the energy transition forward with our One Planet plan, with the ambition to become the first CO₂-neutral energy company by 2035. Data plays a crucial role in achieving this goal.

As a Full-Stack Data Engineer within the Asset Performance Optimization (APO) team, you will contribute to building and operating reliable data platforms that support heat asset monitoring, optimization, and analytics. You will work closely with senior engineers, product owners, and domain experts to turn data into actionable insights, while growing your technical depth and impact.

As a Medior Data Engineer, you will actively contribute to the development, maintenance, and improvement of the APO data landscape across several evolving products and platforms. You will support initiatives such as operational monitoring, optimization tooling, and performance analytics for district heating systems and related assets.

Your responsibilities include:

  • Contributing to improvements in APO data ingestion pipelines and platform components
  • Supporting the transition of POCs and project-based solutions into stable run & maintain environments
  • Implementing and maintaining production-ready data pipelines and backend services
  • Ensuring data products are reliable, observable, and well-documented
  • Collaborating with senior engineers on technical decisions related to Snowflake, Python services, Kubernetes workloads, and database design
  • Actively participating in code reviews and knowledge sharing within the team

You'll be responsible for

  • Developing and maintaining data pipelines and backend services for APO use cases
  • Working with Snowflake data models and PostgreSQL databases to support analytics and reporting
  • Helping ensure reliability, performance, and observability of Kubernetes-based services
  • Writing clean, maintainable Python code following agreed engineering standards
  • Translating functional requirements into solid technical implementations
  • Documenting solutions and contributing to technical standards and best practices

Is this about you?

  • 3+ years of experience as a Data Engineer or Backend Engineer in a data-intensive or operational environment
  • Solid Python skills and familiarity with software engineering best practices
  • Good SQL knowledge and hands-on experience with Snowflake and/or PostgreSQL
  • Experience using Git in a collaborative development environment
  • Familiarity with CI/CD concepts and DevOps tooling (experience with Azure DevOps is a plus)
  • Basic to intermediate experience with Kubernetes and containerized applications
  • Willingness to learn, take ownership of components, and grow toward more complex technical responsibilities
  • Comfortable working in a dynamic environment where not everything is fully defined yet

You will join Eneco's Asset Performance Optimization (APO) team, where data and engineering directly support the transition to a CO₂-neutral energy system. In this environment, you'll collaborate closely with business and technical stakeholders to transform complex operational challenges into scalable, production-ready data platforms. You'll work on systems that monitor and optimize heat assets in real time, helping evolve APO products from experimental solutions into stable, future-proof operational services - all within a culture that values ownership, technical excellence, and meaningful impact on the energy transition.

  • Contribute directly to the energy transition by building data platforms that help Eneco achieve its goal of becoming CO₂-neutral by 2035.
  • Collaborate closely with domain experts and engineers to translate complex operational challenges into practical, data-driven solutions.
  • Build production-ready data solutions that move beyond POCs and optimize real-world heat assets at operational scale.

Apply for this position