Staff DataOps Engineer

DOCTOLIB SAS
Paris, France
10 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Paris, France

Tech stack

Java
API
Artificial Intelligence
Airflow
Google BigQuery
Computer Programming
Continuous Integration
Data Governance
Data Infrastructure
Data Warehousing
Distributed Systems
Identity and Access Management
Mobile Application Software
Python
Machine Learning
DataOps
Software Engineering
TypeScript
Google Cloud Platform
Swift
FastAPI
Kotlin
Kubernetes
Data Management
React Native
Terraform
Data Pipelines

Job description

As a Staff DataOps / Platform Engineer, your mission will be to shape our data platform strategy and architecture, driving enterprise-scale solutions that accelerate machine learning initiatives, enable engineering excellence, and unlock business insights. You will be working in a team at the heart of Doctolib's data-driven transformation, enabling innovation through robust, scalable data infrastructure that empowers engineers, AI teams, and business teams across the organization.

Working in the tech team at Doctolib involves building innovative products and features to improve the daily lives of care teams and patients. We work in feature teams in an agile environment, while collaborating with product, design, and business teams.

Your responsibilities include but are not limited to:

  • Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
  • Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
  • Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
  • Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
  • Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization

About our tech environment

  • Our solutions are built on a single, fully cloud-native platform that supports web and mobile app interfaces and multiple languages, and is adapted to each country's and healthcare specialty's requirements. To address these challenges, we are modularizing our platform, running it in a distributed architecture built from reusable components.
  • Our stack is composed of Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native.
  • We leverage AI ethically across our products to empower patients and health professionals. Discover our AI vision here and learn about our first AI hackathon here!

Requirements

  • You have 7+ years of post-graduation experience as a Senior Data Platform Engineer, Senior Data Engineer, or in a similar role, with a history of architecting and scaling robust data platforms.

  • You have extensive experience with Google Cloud Platform and a command of Kubernetes and Terraform for automated deployments. You are an authority on implementing network and IAM security best practices. You have deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as BigQuery.

  • You are highly skilled in programming with Python, and have a solid understanding of software development principles

  • You are an excellent troubleshooter who excels at diagnosing and fixing data infrastructure issues and identifying performance bottlenecks, and a strong communicator who can articulate complex technical concepts to both technical and non-technical audiences

Now, it would be fantastic if you:

  • Have hands-on experience building and deploying APIs, with a strong preference for frameworks like FastAPI, and are skilled at designing APIs that provide reliable and efficient access to data
  • Have actively applied data governance principles to manage data quality, security, and compliance, and understand how to implement controls that protect sensitive information
  • Are experienced with CI/CD tools and methodologies for data-related projects, and can build automated pipelines that streamline development, testing, and deployment
  • Have practical experience with cloud cost optimization and FinOps principles, actively experimenting with cost-reduction strategies, and are comfortable analyzing infrastructure spending and optimizing resource allocation

Benefits & conditions

  • Free mental health and coaching services through our partner Moka.care
  • For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
  • Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
  • A Work Council subsidy to refund part of a sport club membership or a creative class
  • Up to 14 days of RTT
  • Lunch voucher with Swile card

Apply for this position