Staff Software Engineer (Tech Lead) - AI Data Platform (Berlin)

Berlin, Germany
Posted 2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
€90K – €110K per year

Job location

Remote
Berlin, Germany

Tech stack

Artificial Intelligence
Airflow
Amazon Web Services (AWS)
ARM
Google BigQuery
Cloud Database
Cloud Storage
Data Infrastructure
Data Sharing
Data Warehousing
Django
Python
DataOps
Data Ingestion
Azure
Snowflake
Low Latency
Data Management
Vertica
Network Server
Data Pipelines
Redshift
Databricks

Job description

Engineer the data backbone of the AI economy: We're looking for an individual contributor (IC) with proven experience in designing and building secure, high-scale data platforms. You've worked with a heterogeneous cloud data architecture supporting both structured and unstructured data sources. In addition, you have deep expertise in automated data pipeline orchestration and data observability. Preferably, you have already released AI data agents into production systems for users, creating measurable customer value. You'll take ownership to build, architect, and innovate on our AI Data Platform:

  • Build: Maintain and develop our high-scale data product platform, running close to 400K data pipeline jobs per month at petabyte scale (one such pipeline step is sketched below)
  • Architect: Design and improve our cross-cloud data architecture and infrastructure to ensure high-scalability, low-latency, and cost-efficiency
  • Innovate: Implement AI-driven features and AI data agents that support customers in creating & exposing semantically-rich, AI-ready data products (e.g. via MCP)
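
To give a flavor of the day-to-day work, here is a minimal, hypothetical sketch of a single ingestion step (S3 into Snowflake) of the kind our pipeline jobs run at scale. It uses the Snowflake Python connector; all identifiers (stage, table, warehouse, credentials) are placeholders, not our production setup.

    # Hypothetical sketch: copy one day's Parquet batch from an external
    # S3 stage into a Snowflake table. All identifiers are placeholders.
    import os

    import snowflake.connector

    def load_batch(batch_date: str) -> None:
        """Run a COPY INTO for a single daily batch."""
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="LOAD_WH",
            database="DATA_PRODUCTS",
            schema="RAW",
        )
        try:
            with conn.cursor() as cur:
                # @raw_events_stage is assumed to be an external stage
                # pointing at an S3 bucket; batch_date is validated upstream.
                cur.execute(
                    f"""
                    COPY INTO raw_events
                    FROM @raw_events_stage/events/{batch_date}/
                    FILE_FORMAT = (TYPE = PARQUET)
                    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
                    """
                )
        finally:
            conn.close()

    if __name__ == "__main__":
        load_batch("2025-01-01")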

Requirements

  • Data Warehousing: In-depth experience with Snowflake (*required) and similar platforms like Databricks, BigQuery, Redshift, Azure Synapse, or ClickHouse
  • Data Pipeline Orchestration: Hands-on experience with Prefect (preferred) or similar tools like Airflow, Dagster, Flyte, Mage, or Metaflow (see the sketch after this list)
  • Data Ingestion & Egress: Proven experience loading/unloading data from/to S3-compatible cloud data storage like Amazon S3, GCS, Azure Blob Storage, etc.
  • AI Data Agents: Experience building agents, skills/CLI, and MCP servers with Snowflake Cortex AI (preferred), Google Vertex AI, Databricks, or similar
  • Application Engineering: Expert-level experience designing and coding data-intense back-ends with Django and Python (*required) or similar
  • Data Observability: Experience in monitoring data quality & anomalies with tools like Metaplane, Monte Carlo, Soda, Great Expectations, or similar
  • Cross-Cloud Data Sharing (bonus): Experience sharing data products with Snowflake Data Sharing, Delta Sharing, BigQuery Sharing, Azure Data Share
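
For illustration only, a minimal Prefect 2.x flow combining two of the areas above: orchestration and a basic data-quality check. Names, thresholds, and the row count are invented; in practice, dedicated observability tooling (Soda, Great Expectations, etc.) would replace the hand-rolled assertion.

    # Hypothetical Prefect 2.x flow: orchestrate one ingestion step plus a
    # simple volume check. All values and names are placeholders.
    from prefect import flow, task, get_run_logger

    @task(retries=3, retry_delay_seconds=60)
    def ingest(batch_date: str) -> int:
        # In a real pipeline this would run a COPY INTO (see sketch above)
        # and return the number of rows actually loaded.
        return 42_000  # placeholder row count

    @task
    def check_volume(rows_loaded: int, min_expected: int = 1_000) -> None:
        # Hand-rolled anomaly check; observability tools formalize this.
        logger = get_run_logger()
        if rows_loaded < min_expected:
            raise ValueError(
                f"Volume anomaly: {rows_loaded} rows (< {min_expected})"
            )
        logger.info("Volume check passed: %d rows loaded", rows_loaded)

    @flow
    def daily_batch(batch_date: str) -> None:
        rows = ingest(batch_date)
        check_volume(rows)

    if __name__ == "__main__":
        daily_batch("2025-01-01")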

Benefits & conditions

  • Ownership: Participation in our employee stock options program (VESOP)
  • Balance: 30 days of vacation and flexible working hours (aligned with team)
  • Mobility: €63 monthly public transport budget (covers your Deutschland-Ticket)
  • Flexibility: 3 days office / 2 days home office (options for remote work periods)
  • Hardware: High-end hardware of your choice (Mac/Linux)

We look forward to receiving your application and encourage you to apply whatever your background, even if you don't fulfill all requirements.

We commit to responding quickly to your application. The process will be led personally by our tech founder, and you will get the chance to speak with our engineering team about your code and our platform. Fixed term: n/a (permanent position). Salary: EUR 90,000-110,000 per year. Applications to: Monda Labs.

About the company

Monetize data, fuel AI: Monda helps companies turn data into products, and products into revenue. Our AI data monetization platform makes it easy to build and deliver AI-ready data products, ready to be consumed by (agentic) AI.

Apply for this position