Senior Data Engineer - Analytics
Job description
As a Senior Analytics Engineer, your mission will be to work on building data products that provide insights and support decision-making across Doctolib, helping to transform access to healthcare. You will be working in a team developing data pipelines and solutions that empower our organization with data-driven insights while supporting our AI strategy.
Working in the tech team at Doctolib involves building innovative products and features to improve the daily lives of care teams and patients. We work in feature teams in an agile environment, collaborating with product, design, and business teams.

Your responsibilities:
- Building and maintaining data pipelines using Python to support Doctolib's AI strategy
- Developing and maintaining data marts in BigQuery using SQL/Jinja and DBT
- Creating dashboards for high-level reporting using Tableau
- Collaborating with stakeholders to understand their data needs and define specifications
- Ensuring data quality, security (GDPR compliance), and availability through monitoring and optimization
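To give a flavor of the data-quality and GDPR work described above, here is a minimal, hypothetical Python sketch of a null-rate check and an email-pseudonymization helper. The column names, threshold, and function names are illustrative assumptions, not Doctolib code:

```python
from typing import Any


def null_rate(rows: list[dict[str, Any]], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for row in rows if row.get(column) is None)
    return missing / len(rows)


def failing_columns(
    rows: list[dict[str, Any]], columns: list[str], max_null_rate: float = 0.05
) -> list[str]:
    """Return the columns whose null rate exceeds the allowed threshold."""
    return [c for c in columns if null_rate(rows, c) > max_null_rate]


def mask_email(email: str) -> str:
    """Pseudonymize an email for GDPR-safe reporting (keeps only the domain)."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"


if __name__ == "__main__":
    rows = [
        {"patient_email": "alice@example.com", "visit_id": 1},
        {"patient_email": None, "visit_id": 2},
    ]
    # patient_email fails: 50% nulls exceeds the 5% threshold
    print(failing_columns(rows, ["patient_email", "visit_id"]))
    print(mask_email("alice@example.com"))
```

In practice checks like these would typically run as DBT tests or Dagster asset checks rather than standalone scripts; the sketch only shows the shape of the logic.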
About our tech environment
- Our solutions are built on a single, fully cloud-native platform that supports web and mobile app interfaces and multiple languages, and adapts to country- and specialty-specific healthcare requirements. To address these challenges, we are modularizing the platform into reusable components that run in a distributed architecture.
- Our stack is composed of Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native.
- We leverage AI ethically across our products to empower patients and health professionals.
- Our data stack includes Kafka/Debezium for data ingestion, Dagster and DBT for orchestration and transformation, GCS/BigQuery for data warehousing, and Metabase/Tableau for BI and reporting.
Requirements
- You have at least 7 years of experience as an Analytics Engineer, Data Engineer, or a similar role
- You are proficient in Python, SQL, and DBT for building data pipelines
- You have experience with BI tools such as Tableau, Metabase, or similar platforms
- You have initial experience with the Google Cloud Platform (GCP) stack
- You have a good understanding of functional data domains (Sales, Finance, Web Analytics, Product)
- You have strong communication skills and can collaborate effectively with business teams
Now it would be fantastic if you:
- Have experience with AI products
- Have experience with Vertex AI
- Are familiar with GDPR regulations
Benefits & conditions
- Free mental health and coaching services through our partner Moka.care
- For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
- Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
- Work Council subsidy to refund part of a sport club membership or a creative class
- Up to 14 days of RTT (French reduced-working-time days off)
- Lunch voucher with Swile card
- Bicycle subsidy