Staff DataOps Engineer
Job description
The Data Foundation team is at the heart of Doctolib's data-driven transformation, enabling healthcare innovation through robust, scalable data infrastructure. As a Staff DataOps Engineer, you are the technical leader who shapes our data platform strategy and architecture. You will drive the design and implementation of enterprise-scale data solutions that power insights for our 300,000+ healthcare practitioners and 60 million patients.

Working at the intersection of engineering excellence and business impact, you will lead cross-functional initiatives, mentor engineering teams, and establish the technical standards that enable Doctolib's continued growth across Europe. Your success in the first six months and beyond will be measured by your ability to deliver strategic data infrastructure improvements, reduce operational costs through intelligent automation, and establish architectural patterns that scale with our business.

Your responsibilities include but are not limited to:
- Strategic Data Architecture Leadership: Design and implement enterprise-scale data infrastructure strategies, conducting thorough impact and cost analysis for major technical decisions, and establishing architectural standards across the organization
- Advanced Pipeline Engineering: Build and optimize complex, multi-region data pipelines handling petabyte-scale datasets, ensuring 99.9% reliability and implementing advanced monitoring and alerting systems
- Cost Optimization & Performance Engineering: Lead cost analysis initiatives, identify optimization opportunities across our data stack, and implement solutions that reduce infrastructure spend while improving performance and reliability
- Technical Leadership & Mentoring: Provide technical guidance to data engineers and cross-functional teams, conduct architecture reviews, and drive adoption of best practices in DataOps, security, and governance
- Innovation & Research: Evaluate emerging technologies, conduct proof-of-concepts for new data tools and platforms, and lead the technical roadmap for data infrastructure modernization
Requirements
- 7+ years of experience as a DataOps Engineer or in a similar role, with a history of architecting and scaling robust data platforms.
- Deep technical proficiency in orchestrating data pipelines using Airflow or Dagster, deploying applications to the cloud, and leveraging modern data warehouses such as Redshift or BigQuery (a minimal example follows this list).
- Highly skilled in Python programming, with a solid understanding of software development principles.
- Extensive experience with cloud infrastructure on AWS, Azure, or GCP, and a command of Terraform for automated deployments. You are an authority on implementing network and IAM security best practices.
- An excellent troubleshooter who excels at diagnosing and fixing data infrastructure issues and identifying performance bottlenecks.
- A strong communicator who can articulate complex technical concepts to both technical and non-technical audiences, fostering effective collaboration across teams.
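To give a flavor of the orchestration work described above, here is a minimal Airflow sketch in the TaskFlow style. The DAG name, schedule, and data are illustrative placeholders, not an actual Doctolib pipeline:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def practitioner_metrics():
    """Illustrative extract-transform-load pipeline."""

    @task
    def extract() -> list[dict]:
        # Stand-in for a read from a production source system.
        return [{"practitioner_id": 1, "appointments": 12}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Keep only practitioners with at least one appointment.
        return [r for r in rows if r["appointments"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for a write to the warehouse (e.g. Redshift or BigQuery).
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


practitioner_metrics()
```

In a real pipeline the inline logic would live in shared libraries, with retries, SLAs, and the monitoring and alerting this role owns configured on each task.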
Now, it would be fantastic if you also have:
- API Expertise: You have hands-on experience building and deploying APIs, with a strong preference for frameworks like FastAPI. You're skilled at designing APIs that provide reliable and efficient access to data (see the sketch at the end of this list).
- Data Governance & Security: You're not just aware of data governance; you've actively applied its principles to manage data quality, security, and compliance. You understand how to implement controls that protect sensitive information.
- CI/CD Implementation: You're experienced with CI/CD tools and methodologies for data-related projects. You can build automated pipelines that streamline development, testing, and deployment, reducing manual effort and improving reliability.
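As an illustration of the API work mentioned above, here is a minimal FastAPI sketch; the endpoint, model, and in-memory store are hypothetical stand-ins for a warehouse-backed data service:

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Illustrative data access API")

# Hypothetical in-memory store standing in for warehouse-backed queries.
METRICS = {"appointments_per_day": 42.0}


class Metric(BaseModel):
    name: str
    value: float


@app.get("/metrics/{name}", response_model=Metric)
def read_metric(name: str) -> Metric:
    # Return a typed payload, or a 404 for unknown metric names.
    if name not in METRICS:
        raise HTTPException(status_code=404, detail="Unknown metric")
    return Metric(name=name, value=METRICS[name])
```

Served with `uvicorn module_name:app`, this exposes `GET /metrics/{name}` and returns a validated JSON payload.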