Data Engineer (Medior)

MultiMinds
Aalst, Belgium

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Aalst, Belgium

Tech stack

Java
JavaScript
API
Artificial Intelligence
Amazon Web Services (AWS)
Application Integration Architecture
Azure
Computer Programming
Databases
Data Architecture
Information Engineering
ETL
Data Systems
Data Visualization
Linux
Distributed Systems
Elasticsearch
Hadoop
Python
Enterprise Messaging Systems
Neo4j
NoSQL
SQL Databases
Data Streaming
Tableau
Talend
Jupyter Notebook
Data Storage Technologies
Grafana
Spark
Multi-Cloud
Data Strategy
Event Driven Architecture
Containerization
Kubernetes
Data Analytics
Kafka
Vertica
Kibana
Data Pipelines
Marketing Cloud
Docker

Job description

As a Data Engineer, you will play a key role in preparing and transforming data for a wide range of analytical and operational use cases. You'll design, build, and maintain robust data pipelines and applications that power modern marketing cloud environments. Working closely with our customers, you'll support their technical teams in identifying needs, analysing risks, and implementing scalable data solutions. You'll collaborate with data analysts and fellow engineers to rapidly prototype ideas, validate strategies, and bring innovative data solutions to life. With your technical expertise and hands-on mindset, you'll ensure data flows efficiently across systems, enabling organisations to make smarter, data-driven decisions.

Your responsibilities in a nutshell

  • Build and maintain scalable and reliable data pipelines
  • Connect systems and platforms through APIs and integrations
  • Set up and manage data storage solutions and analytical databases
  • Collaborate with clients to understand requirements, analyse risks, and implement technical solutions
  • Work in multi-cloud environments using various ETL tools and technologies
  • Implement and operationalise algorithms (AI/ML) developed within the team
  • Partner with data analysts to create proofs of concept and validate data strategies
  • Provide guidance on best practices and solve complex data and analytics challenges
  • Contribute to knowledge sharing, training, and team development
  • (Optionally) Coordinate and support other data engineers

Requirements

We're looking for someone who combines strong technical expertise with a problem-solving mindset. You enjoy working with data at scale, thrive in dynamic environments, and are eager to continuously learn and grow.

Experience and expertise

  • At least 3 years of experience in data engineering or a related role
  • Strong programming skills (e.g. Python, JavaScript, Java)
  • Experience with Jupyter Notebooks
  • Solid understanding of database technologies (SQL, NoSQL, Graph)
  • Strong Linux knowledge and experience working with cloud platforms (AWS, Azure, or GCP)
  • Ability to set up and configure infrastructure and environments from scratch

Technical strengths

  • Experience with ETL/ELT technologies (e.g. Hadoop, Spark, Kafka, Talend)
  • Familiarity with modern data architectures and distributed systems
  • Interest in or experience with AI and machine learning technologies

Nice-to-have

  • Experience with analytical databases (e.g. Elasticsearch, ClickHouse, Neo4j)
  • Knowledge of data visualisation and dashboarding tools (e.g. Grafana, Kibana, Tableau)
  • Experience with containerization and orchestration tools (Docker, Kubernetes)
  • Familiarity with messaging systems and event-driven architectures

Collaborative and communicative

  • Ability to work closely with clients and cross-functional teams
  • Strong communication skills and a proactive mindset
  • Comfortable working in an international environment (good English required)

Innovative and curious

  • Passion for exploring new tools, technologies, and data solutions
  • Eagerness to experiment, prototype, and improve continuously

Professional skills

  • Designing and building data pipelines and workflows
  • Developing and integrating APIs
  • Designing and managing data storage and database systems
  • Implementing and maintaining ETL/ELT processes
  • Analysing and optimising data flows and system performance
  • Supporting deployment and monitoring of data applications
  • Testing and validating data solutions
  • Translating business needs into technical implementations
  • Ensuring data quality, reliability, and scalability
  • Contributing to technical documentation and best practices
  • Supporting technical training and knowledge sharing
  • Working with cloud platforms and distributed systems

Personal skills

  • Planning and organising
  • Coaching and knowledge sharing
  • Flexibility and adaptability
  • Initiative and proactivity
  • Continuous self-development
  • Problem-solving mindset
  • Creativity
  • Customer focus
  • Critical thinking
  • Analytical mindset
  • Decision-making
  • Sense of responsibility
  • Strong communication
  • Team collaboration
  • Independence
  • Curiosity and eagerness to learn
  • Digital mindset
