Full-Stack Data Engineer
Job description
At Eneco, we're driving the energy transition forward with our 'One Planet plan' and striving to become the first CO₂-neutral energy company by 2035. Data plays a central role in helping us reach these ambitious objectives. As a Data Engineer, you'll work alongside colleagues from diverse business areas, including Machine Learning Engineers, Data Scientists, and Data Analysts, in an environment that values collaboration and continuous learning. Together, you'll advance Eneco's digital innovations and contribute meaningfully to a cleaner, more sustainable future. We support professional growth, celebrate achievements, and learn from every experience.
As a Data Engineer, you will be an integral part of a cross-functional team, working hands-on to design, build, and maintain cloud-based data platforms and pipelines. You will collaborate closely with engineers, analysts, and other stakeholders to deliver robust, scalable, and innovative data solutions. In this role, you'll work with a wide range of data sources, such as customer data, smart meters, wind turbines, weather, markets, and internal processes. You'll play a key role in processing and leveraging this data to generate valuable business insights and to create new products, services, and strategies. You will have the opportunity to take ownership of technical challenges, continuously improve our data infrastructure, and help shape the future of data-driven value creation at Eneco.
- Supporting Machine Learning Engineers, Data Scientists, and Data Analysts by preparing and making high-quality data easily accessible.
- Designing and implementing data solutions for new projects from scratch.
- Deploying, monitoring, and managing data pipelines and applications in production.
- Writing well-documented code and reviewing code from colleagues.
- Collaborating with stakeholders to gather and discuss project requirements.
Must have:
- 3-5 years of experience as a Data Engineer.
- Experience with core data engineering responsibilities, such as ETL/ELT, orchestration, warehousing, and monitoring and alerting, including their best practices.
- Experience with software engineering best practices: code reviews, version control, CI/CD, monitoring, and DevOps principles.
- Experience deploying applications in the cloud (preferably Azure).
- Experience with data platforms and tools such as Databricks, Snowflake, dbt, Airflow, Spark, or equivalent.
- Experience with monitoring tools (e.g., Grafana).
- Proficiency in SQL and Python.
Nice to have:
- Experience in data modeling, data governance, data quality, access control, and documentation.
- Experience with Kubernetes.
- API development experience (e.g., FastAPI).
At Eneco, you'll join a collaborative, purpose-driven environment where data and technology play a central role in accelerating the energy transition. You will work alongside experienced Data Engineers, Machine Learning Engineers, Data Scientists, and Analysts to develop cloud-based platforms that transform diverse data streams, from smart meters to renewable assets, into valuable insights and innovative solutions. You'll be part of a culture that promotes continuous learning, shared ownership, and technical excellence, all while contributing to Eneco's ambition to become the first CO₂-neutral energy company by 2035.
Please apply directly via our careers website. Applications via email will not be considered.
- Work on high-impact, real-world data by building scalable cloud platforms that directly support Eneco's mission to become CO₂-neutral by 2035
- Collaborate in a highly multidisciplinary environment, working closely with Data Scientists, Machine Learning Engineers, and business stakeholders to turn complex data into actionable insights and products
- Grow your expertise in a modern data stack, with hands-on ownership of end-to-end data solutions using cloud technologies like Azure, Databricks, and advanced orchestration and monitoring tools