Senior Data Engineer - Snowflake & Data Sharing (Berlin)


Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
From € 75K gross / annum

Job location

Berlin (hybrid)

Tech stack

Java
Airflow
Amazon Web Services (AWS)
Azure
Cloud Computing
Information Systems
Information Engineering
Data Security
Data Sharing
Data Systems
Web Development
Electronic Data Interchange (EDI)
Iterative and Incremental Development
Python
Cloud Services
Software Engineering
Data Streaming
Google Cloud Platform
Snowflake
Infrastructure as Code (IaC)
B2B Software
Containerization
Information Technology
Data Delivery
Terraform
Data Pipelines
Docker

Job description

We're looking for a Senior Data Engineer (d/f/m) to join our software engineering team in Berlin. As an individual contributor (IC), you'll work closely with our Head of Engineering. You'll not only develop new cross-cloud data sharing features from scratch but also improve the stability of the platform and all data pipelines.

  • Enable customers to connect diverse data sources and publish marketplace-ready data products, fueling seamless data exchange and monetization
  • Ensure solid, observable, and efficient data flows by operating Python pipelines orchestrated with Prefect and running on AWS ECS (a minimal sketch follows this list)
  • Connect and unify data across multiple cloud environments, enabling secure and high-performance data exchange between diverse customer systems and platforms
  • Collaborate across teams to align technical solutions with customer and business needs, driving engineering excellence and platform reliability
  • Design, develop, and maintain our Snowflake-based cross-cloud data platform with scalable and future-proof architecture
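
To give a flavour of the stack, here is a minimal illustrative sketch of a Prefect-orchestrated pipeline of the kind described above. It assumes Prefect 2.x; the flow, task names, and data source are hypothetical and not part of the role description:

    from prefect import flow, task

    @task(retries=2)
    def extract(source: str) -> list[dict]:
        # Stand-in for reading from a customer data source (hypothetical).
        return [{"source": source, "value": 42}]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for a write to the Snowflake-based platform (hypothetical).
        print(f"loaded {len(rows)} rows")

    @flow(log_prints=True)
    def data_product_pipeline(source: str = "demo") -> None:
        # In production, a flow like this would run as a container on AWS ECS.
        load(extract(source))

    if __name__ == "__main__":
        data_product_pipeline()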

Requirements

  • 5+ years of hands-on experience as a Data Engineer or in a similar role, building and maintaining large-scale data systems in a Snowflake environment
  • Expert knowledge of Snowflake and Python, with experience designing efficient, scalable, and secure data pipelines
  • Proficiency in data orchestration frameworks such as Airflow, Dagster, or dbt; experience with Prefect is a strong plus
  • Solid understanding of containerization and cloud infrastructure, particularly with Docker and AWS ECS
  • Proven experience in data platform architecture, data delivery, and data lifecycle best practices
  • Strong sense of ownership and accountability, with the ability to prioritize, assess criticality, and deliver results under minimal supervision
  • Experience working in agile, cross-functional teams, embracing iterative development and continuous improvement
  • Degree in Computer Science, Information Systems, Application Programming, or a related technical field
  • Hands-on experience with Infrastructure as Code (IaC) tools such as Terraform
  • Background in international B2B software applications, ideally within the e-commerce industry
  • In-depth knowledge of multiple cloud service providers (e.g., AWS, GCP, Azure) and experience working in cross-cloud environments
  • Genuine passion for Data Engineering, with additional experience in web application development or adjacent software domains

Benefits & conditions

The start date is 1 January 2026, the work location is Berlin (hybrid), and the base salary starts at €75K gross per annum (based on experience).

About the company

Monda believes that any company should be able to share and access the data they need to fuel AI. Therefore we create a borderless data sharing ecosystem to fuel the AI revolution and accelerate human progress. We encourage and empower any company in the world to share and monetize their data safely.

Our Engineering Team: We are a passionate, multicultural engineering team dedicated to turning complex data challenges into seamless software, always deciding, acting, and delivering with the customer experience at the core.

Our Data Tech Stack: Snowflake, Prefect, Python, Django, AWS, Terraform, Cloudflare, Docker, GitHub, Heroku

Our Tech Challenges:

  • Simplify cross-cloud data product creation: Enable easy onboarding of data sources from multiple cloud environments and ensure reliable data delivery for true cross-cloud sharing, supporting seamless data marketplace integrations across AWS, GCP, Azure, Snowflake, and Databricks.
  • Fuel the AI revolution: Streamline data customization and multi-asset data product management, empowering integrations with leading data marketplaces such as Datarade, Snowflake Marketplace, Databricks, Google Cloud Analytics Hub, and SAP Datasphere.
  • Drive innovation in data platforms: Tackle the challenges of scalability, reliability, and performance in a rapidly evolving, multi-cloud ecosystem while enabling business-ready, high-quality data products.

Apply for this position