Data Engineer
Job description
As a Data Engineer, you will play a key role in transforming raw data into powerful insights that drive strategic decisions across the organization. Leveraging modern technologies such as Azure, Terraform, Databricks, Kafka, and Power BI, you will design and implement complex ETL workflows and real-time streaming pipelines, with a strong focus on IoT use cases. You will also contribute to building a state-of-the-art data lakehouse architecture using Delta Lake tables.
Your work will empower teams both onshore and offshore to access, analyse, and act on data more effectively than ever before. You'll collaborate closely with end users and stakeholders to deliver robust, scalable solutions aligned with IT best practices.
- Design and implement data pipelines using Azure services, with a strong focus on Databricks;
- Develop and maintain real-time data streaming pipelines to process IoT data, using tools such as Kafka, Azure IoT Hub, and stream processing frameworks like Spark Streaming, Flink/PyFlink, or Azure Stream Analytics;
- Build and optimize a modern data lakehouse architecture, employing Delta Lake for efficient and reliable data storage and management;
- Create and maintain an environment for Power BI dashboards and reports that deliver actionable insights for business stakeholders;
- Translate business requirements into robust, scalable data solutions by working closely with cross-functional teams;
- Enforce data governance and quality standards to ensure the integrity, accuracy, and security of the data infrastructure;
- Proactively monitor, troubleshoot, and optimize data pipelines to meet evolving business needs;
- Design and develop scalable, reusable, and reliable ETL batch processing patterns.
Requirements
We're on the lookout for a self-starting, skilled, and meticulous Data Engineer who thrives in a collaborative environment. You have a strong foundation in data engineering and a proven ability to build scalable, high-quality data solutions. Your expertise in automated testing of data pipelines will be critical in maintaining the integrity and performance of our data infrastructure.
You bring a strong understanding of data engineering fundamentals and experience with Azure Cloud, Databricks, and databases. You're passionate about automation and innovation, quick to adapt to change, and eager to continuously learn. With your proactive attitude and strong communication skills, you work seamlessly with colleagues across teams.
What you bring:
- An HBO or BSc degree in Computer Science, Engineering, Information Technology, or a related field;
- Proficiency in SQL and Python for data processing;
- Demonstrable experience in data modeling, warehousing, ETL, and real-time streaming;
- Strong problem-solving skills, with the ability to perform under pressure and meet deadlines;
- Excellent communication skills in English, with a history of collaborating in cross-functional teams;
- Experience working in Scrum/Agile environments, using tools like Jira or Azure DevOps;
- Hands-on experience with CI/CD pipelines and tools such as Git and Terraform.
Benefits & conditions
Working at Allseas offers you the chance to work in a dynamic, fast-paced, entrepreneurial environment, with creative thinking, collaboration and a down-to-earth culture at its core.
You can expect:
- Competitive, industry-benchmarked salary and an excellent pension;
- Performance-based salary raises and bonuses;
- 30 holiday days per year and flexible working hours;
- Company-sponsored fitness scheme;
- Internal clubs, committees, parties and sporting events.