Senior AI Data Engineer

DOCTOLIB SAS
Paris, France
25 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Paris, France

Tech stack

Java
Artificial Intelligence
Data analysis
Google BigQuery
Data Warehousing
Cursor (AI code editor)
Distributed Systems
Mobile Application Software
Jinja (Template Engine)
Python
SQL Databases
Tableau
TypeScript
Data Ingestion
Swift
Kotlin
Debezium
Amplitude Analytics
Kafka
Machine Learning Operations
React Native
Data Pipelines

Job description

As a Senior AI Data Engineer, your mission will be to build robust data pipelines - from data capture to monitoring - to power our AI systems and support our ambition to transform healthcare delivery. You will be embedded in the AI teams delivering AI-powered features to healthcare professionals and patients.

Working in the tech team at Doctolib involves building innovative products and features to improve the daily lives of care teams and patients. We work in feature teams in an agile environment, while collaborating with product, design, and business teams.

Your responsibilities include but are not limited to:

  • Design and implement data capture and ingestion systems ensuring data quality, privacy compliance (anonymization, consent, retention), and GDPR adherence
  • Build, optimize, and maintain end-to-end data pipelines using Python, Dagster, BigQuery, SQL/Jinja, and DBT (a minimal sketch follows this list)
  • Enable AI model development by providing datasets for training, evaluation, and annotation workflows
  • Develop custom monitoring solutions, including online metrics pipelines and dashboards (Amplitude, Metabase, Tableau), to track AI system performance
  • Collaborate with the Data Platform teams to optimize infrastructure, ensure scalability, and manage costs effectively
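
For illustration only, here is a minimal sketch of what such a pipeline could look like with Dagster and BigQuery in Python; the project, dataset, and table names are hypothetical placeholders rather than Doctolib's actual schema.

    # Hypothetical sketch of a Dagster asset materializing a training dataset
    # from BigQuery. Project, dataset, and table names are placeholders.
    import pandas as pd
    from dagster import Definitions, asset
    from google.cloud import bigquery

    @asset
    def consultation_training_set() -> pd.DataFrame:
        """Anonymized dataset powering training/evaluation of an AI feature."""
        client = bigquery.Client(project="example-analytics-project")  # placeholder GCP project
        query = """
            SELECT consultation_id, specialty, anonymized_transcript, label
            FROM `example-analytics-project.ml_datasets.consultation_labels`
            WHERE consent_given = TRUE  -- only rows with patient consent
              AND ingestion_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)  -- retention window
        """
        return client.query(query).to_dataframe()

    defs = Definitions(assets=[consultation_training_set])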

About our tech environment

  • Our solutions are built on a single, fully cloud-native platform that supports web and mobile app interfaces and multiple languages, and adapts to country- and healthcare-specialty-specific requirements. To address these challenges, we are modularizing our platform, which runs in a distributed architecture, into reusable components.
  • Our stack is composed of Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native.
  • We leverage AI ethically across our products to empower patients and health professionals. Discover our AI vision here and learn about our first AI hackathon here!
  • Our data stack includes: Kafka/Debezium for data ingestion, Dagster/DBT for orchestration, GCS/BigQuery for data warehousing, and Metabase/Tableau for BI and reporting.
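
As a rough illustration of the ingestion side of this stack, the following sketch consumes Debezium change events from Kafka with the confluent-kafka Python client; the broker address, consumer group, and topic name are assumptions, not the real configuration.

    # Hypothetical sketch: reading Debezium CDC events from Kafka.
    # Broker address, group id, and topic name are illustrative placeholders.
    import json
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # placeholder broker
        "group.id": "ai-ingestion-sketch",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["pg.public.consultations"])  # placeholder Debezium topic

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            event = json.loads(msg.value())
            # Debezium envelopes carry the post-change row state in "payload.after"
            row = event.get("payload", {}).get("after")
            if row:
                print(row)  # in practice: validate, anonymize, and land into GCS/BigQuery
    finally:
        consumer.close()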

Requirements

  • You have at least 7 years of experience as an Analytics Engineer, Data Engineer, or in a similar role
  • You are proficient in Python, SQL, and DBT for building data pipelines
  • You have hands-on experience building data pipelines for AI/ML systems in production
  • You have a good understanding of ML model lifecycle (training, evaluation, deployment, monitoring)
  • You have initial experience with the Google Cloud Platform (GCP) stack
  • You have strong collaboration skills and can work effectively with data science and engineering teams

Now it would be fantastic if you:

  • Have experience with Vertex AI, MLflow, or similar ML platforms
  • Have experience with AI model monitoring and observability tools
  • Have worked with annotation platforms and labeling workflows
  • Have experience with Cursor / Claude
  • Are familiar with GDPR regulations

Benefits & conditions

  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • Works Council subsidy to refund part of a sport club membership or a creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card
