Senior Data Engineer I

Loop
Antwerp, Belgium
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Antwerp, Belgium

Tech stack

Artificial Intelligence
Data analysis
Azure
Cloud Computing
Data as a Service
Information Engineering
Data Infrastructure
ETL
Data Transformation
Data Systems
DevOps
GitHub
Identity and Access Management
Jinja (Template Engine)
Python
Performance Tuning
SQL Databases
Data Processing
Data Storage Technologies
Snowflake
Grafana
Kubernetes
Information Technology
Terraform
Data Pipelines
Docker

Job description

In this role, you'll work with a state-of-the-art data stack (DBT, dltHub, Jinja, Snowflake, and Dagster) and extend your impact into cloud operations across GCP/Azure. You'll take direct ownership of building, deploying, and optimizing data pipelines and cloud infrastructure, ensuring high reliability, scalability, and security. You'll collaborate closely with Data Engineers, AI Engineers, and Data Analysts to design robust, cloud-native data solutions that drive actionable insights and empower the business.

As a Senior Data Engineer I, you'll ensure data and infrastructure flow seamlessly across GCP/Azure environments. You'll automate, monitor, and optimize cloud-based data services, enabling the Analytics and AI teams to make confident, data-driven decisions. You'll take ownership of both data pipelines and their cloud foundations, ensuring cost efficiency, resilience, and compliance as our operations scale rapidly across multiple regions. As a senior technical contributor, you'll influence architectural decisions, define engineering standards, and mentor peers while fostering a strong culture of knowledge sharing within the data and AI community at Loop. Your work will ensure that our data platform remains secure, high-performing, and future-proof: a critical enabler of business success.

  • Design, develop, and maintain ETL/ELT pipelines using Python; experience with dltHub is a plus.
  • Optimize data storage and performance within Snowflake.
  • Implement workflow orchestration with Dagster for reliable data processing (a brief sketch appears at the end of this description).
  • Automate infrastructure and deployments, applying IaC and DevOps best practices.
  • Monitor, secure, and optimize cloud environments, ensuring uptime, cost efficiency, and compliance.
  • Collaborate with AI engineers, data analysts and business stakeholders to align data solutions with strategic goals, ensuring high data quality and accessibility.
  • Establish engineering standards, document best practices, and mentor junior engineers.
  • Contribute to knowledge-sharing initiatives that elevate data literacy and technical capability across teams.

  • Within your first month, you'll immerse yourself in Loop's data and cloud ecosystem, learning how analytics, AI, and business teams interact with our data platform. You'll help refine development practices, strengthen observability, and contribute to establishing standards for data quality, orchestration, and infrastructure management.
  • By your third month, you'll take full ownership of a key data domain or pipeline, leading design and implementation across GCP/Azure/Snowflake. You'll optimize core workflows, align architecture decisions with AI and Analytics teams, and deliver measurable improvements in scalability, reliability, or efficiency.
  • By your sixth month, you'll be driving multiple data initiatives, owning systems end-to-end from design to production. You'll help shape Loop's data roadmap, mentor peers, and define standards for how we build and operate data systems at scale. Your leadership will ensure that our cloud-native infrastructure continues to enable faster insights and data-driven innovation across the company.
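
As a flavor of the orchestration work described above, here is a minimal sketch of a Dagster asset graph in Python. It is illustrative only: it assumes the open-source dagster package, and the asset names (raw_orders, orders_clean) and sample rows are hypothetical, not Loop's real pipelines.

    from dagster import Definitions, asset

    @asset
    def raw_orders():
        # Stand-in for a dltHub extraction step; static rows keep the
        # sketch self-contained and runnable.
        return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": None}]

    @asset
    def orders_clean(raw_orders):
        # Drop rows with missing amounts; in a production setup this
        # kind of transformation would usually live in a DBT model.
        return [row for row in raw_orders if row["amount"] is not None]

    # Running `dagster dev` against this module lets the UI discover
    # and materialize both assets in dependency order.
    defs = Definitions(assets=[raw_orders, orders_clean])

In production, the extraction step would typically come from a dltHub source and the transformation from DBT models, with Dagster coordinating both.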

Requirements

You combine deep data engineering expertise with a cloud operations mindset. You're passionate about building robust systems that scale, perform, and stay resilient. You enjoy collaborating with multidisciplinary teams and take a proactive approach to automation and optimization. Your curiosity drives innovation, your mentorship elevates peers, and your technical leadership ensures that Loop's data foundation is strong, flexible, and future-ready.

  • 5+ years of experience building secure and scalable cloud data infrastructure; a degree in Computer Science or a related field is a plus.
  • Proficiency in Python and SQL for data manipulation, automation, and orchestration.
  • Hands-on experience with Snowflake for data modeling, performance tuning, and integration with cloud ecosystems (see the sketch after this list).
  • Strong CloudOps expertise in GCP and/or Azure, including compute, storage, IAM, and networking.
  • Proven experience with DBT for data transformation, testing, and documentation.
  • Familiarity with workflow orchestration tools such as Dagster.
  • Experience implementing Infrastructure as Code (IaC) using Terraform.
  • Experience with DevOps practices, including containerization and CI/CD tooling such as Docker, Kubernetes, and GitHub Actions.
  • Familiarity with implementing monitoring and observability tools (e.g., Grafana, ELK).
  • Strong communication and collaboration skills, working effectively with cross-functional and non-technical stakeholders.
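
To illustrate the Python/SQL and Snowflake tuning skills listed above, the sketch below checks how well a table is clustered, one common starting point for Snowflake performance work. It assumes the snowflake-connector-python package; the environment variable names, the ANALYTICS_WH warehouse, the ORDERS table, and the ORDER_DATE key are hypothetical placeholders.

    import os

    import snowflake.connector

    # Credentials are read from the environment; variable names here
    # are placeholders, not a real account.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse name
    )
    try:
        cur = conn.cursor()
        # SYSTEM$CLUSTERING_INFORMATION reports clustering depth and
        # partition overlap for a table and candidate clustering key.
        cur.execute(
            "SELECT SYSTEM$CLUSTERING_INFORMATION('ORDERS', '(ORDER_DATE)')"
        )
        print(cur.fetchone()[0])
    finally:
        conn.close()

This is a sketch under stated assumptions, not a prescribed workflow; the same connector supports the parameterized queries and automation the role describes.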

Apply for this position