Data Engineer (Azure, Databricks), (Remote) - International organisation

The White Team
17 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
€ 57K

Job location

Remote

Tech stack

Data analysis
Azure
Bash
Cloud Computing
Computer Programming
Continuous Delivery
Continuous Integration
Information Engineering
Data Integration
ETL
Data Systems
Software Design Patterns
DevOps
Distributed Systems
Python
Machine Learning
NumPy
Shell Script
SQL Databases
Data Processing
Scripting (Bash/Python/Go/Ruby)
PyTorch
Prompt Engineering
Keras
Git
Pandas
Containerization
PySpark
Scikit-learn
Kubernetes
Software Version Control
Docker
Databricks

Job description

We are seeking a highly skilled Data Engineer to join our team, contributing to the design, development, and optimization of data solutions within cloud-based and distributed environments. The ideal candidate will have hands-on experience with Azure Cloud, Databricks, and PySpark, complemented by strong proficiency in SQL, Python, and modern data engineering tools. This role offers the opportunity to work on advanced data integration, analytics, and machine learning projects, ensuring efficient data processing, automation, and deployment of scalable solutions.

Requirements

The Data Engineer demonstrates strong analytical and technical capabilities, with the ability to design and implement efficient data architectures and pipelines in cloud environments. They possess solid programming skills in Python and SQL, combined with expertise in the Azure and Databricks platforms. The role requires a deep understanding of distributed computing, data modelling, and ETL processes, as well as practical experience with machine learning frameworks and DevOps practices, including CI/CD, version control, and containerization. Strong problem-solving, collaboration, and communication skills are essential to translate business needs into scalable, high-quality data solutions.

IT skills:

  • Microsoft Azure Cloud (including Azure DevOps).

  • Databricks, PySpark, DBT.

  • Python and SQL.

  • Pandas, NumPy.

  • Machine learning frameworks such as Scikit-learn, PyTorch, and Keras.

  • Git.

  • Docker, Kubernetes.

  • Bash, Python scripting.

  • Attunity.

  • Continuous Integration / Continuous Deployment (CI/CD).

  • Python Poetry, Databricks notebooks.

  • Distributed computation, software design patterns, prompt engineering, data exploration and analysis, model deployment and monitoring.

Language:

  • English (C1).
