Data Engineer
Job description
Working within an Agile methodology, your day-to-day work will cover varied and exciting challenges across a broad range of technologies. We expect you to be able to create new Proofs of Concept (PoCs). Moreover, you will follow a CI/CD approach using blueprints and reusable standard assets already available at company level.
You will be implementing solutions for projects supporting the achievement of Sustainability Goals as per the Corporate Sustainability Reporting Directive (CSRD) in the Group Finance Value Stream.
Your role
As a Data Engineer, your main responsibilities will involve:
- Analyzing data models and deriving logical conclusions.
- Modeling processes.
- Hands-on development and monitoring of the Azure cloud platform and its associated components for data ingestion, transformation, and processing.
- Diagnosing, isolating, and resolving complex problems in the data infrastructure, including performance tuning and optimization.
- Designing and writing programs according to functional and non-functional requirements.
- Developing and maintaining technical documentation.
- Following established configuration/change control processes.
Requirements
We are seeking a dynamic and experienced Data Engineer with a proactive attitude and a willingness to learn to join our DBI organization within the Group Finance Value Stream.
- University-level education or equivalent.
- 5+ years of experience working as a Data Engineer.
- Ability to write complex SQL queries.
- A good understanding of distributed computation.
- Experience with Git.
- Experience with Azure Cloud.
- Experience with Databricks.
- 2+ years of experience with Python.
- 1+ years of experience with PySpark.
- 1+ years of experience working as a Software Engineer.
- 1+ years of experience with dbt.
- Experience with Python libraries such as pandas or NumPy.
- Experience with scikit-learn, PyTorch, Keras, or similar.
Nice to have:
- Knowledge of design patterns.
- Experience with containerization of solutions (Docker, Kubernetes).
- Experience with the Poetry dependency-management tool for Python.
- Continuous Integration / Continuous Deployment (Azure DevOps).
- Experience in data exploration using notebooks (Databricks notebooks).
- Experience with experiment tracking and model registries (MLflow).
- Understanding of how to deploy, operationalize, and monitor ML models.
- Experience with prompt engineering.
- Experience automating routine tasks (e.g. data refreshes) using Bash and Python scripting.