Data Engineer
Job description
We are looking for a Senior Data Engineer to help drive the energy transition by transforming raw data into actionable insights. You will design, build, and maintain data solutions that support business decision-making across the organization. This role combines hands-on engineering, coaching, and cross-team collaboration.
Responsibilities
Data Engineering & Development
- Develop and maintain scalable data pipelines in the Microsoft Azure Cloud environment.
- Work with tools such as Databricks, Microsoft Fabric (future), and Airflow.
- Build data engineering solutions, including interfaces to and from the data platform.
- Transform large volumes of raw data such as sensor data into meaningful, high-quality information.
Data Solutions & Architecture
- Translate business information needs into effective technical data solutions.
- Contribute to the development of BI and data services, including decentralization and self-service BI (Microsoft Fabric).
- Support business units such as Asset Management, Operations, Maintenance, and Capacity Management in becoming more data-driven.
Collaboration & Coaching
- Coach and mentor fellow engineers within a DevOps team.
- Work closely with platform teams, data providers, architects, and other stakeholders.
- Help balance business priorities with technical debt management.
- Participate in operational and maintenance tasks within the team.
Innovation & Continuous Improvement
- Quickly learn new tools and programming languages.
- Drive innovation and bring colleagues along in adopting new technologies and approaches.
- Maintain a strong customer-centric mindset in all activities.
Requirements
- Bachelor's degree level of thinking and working (HBO level).
- At least 5 years of relevant professional experience.
Technical Expertise
- Proven experience translating business features into technical solutions within existing architectural frameworks.
- Demonstrable experience with:
  - SQL Server
  - Databricks
  - Data lake concepts (e.g., Delta Lake)
  - ELT/ETL processes
  - Python and SQL scripting
- Experience working with APIs (preferred).
- Familiarity with software engineering best practices, including:
  - OTAP
  - Version control (Git)
  - Test automation
  - CI/CD pipelines
Data Modeling & BI
- Knowledge of various data modeling techniques.
- Experience with Data Vault is a plus.
- Knowledge of Microsoft Fabric, especially in the context of migrating from Databricks, is an advantage.
Soft Skills
- Strong communication skills.
- Experience coaching and guiding colleagues.
- High energy, curiosity, and a positive, proactive mindset.
Tools & Technologies
- Cloud: Microsoft Azure
- Data Engineering: Databricks, Airflow
- Languages: Python, SQL
- Data Concepts: Delta Lake, Data Lakehouse, ELT/ETL
- DevOps: Git, CI/CD, OTAP
- BI & Self-Service: Microsoft Fabric (future)
Additional Information
Languages:
- Fluent in Dutch and English