DevOps - Data Platform
Job description
As a DevOps Engineer, your mission will be to improve and maintain an efficient, reliable and scalable platform that enables Data Product developers and owners to develop, deploy and maintain their data products autonomously, at scale, with clear and maintained interfaces and full observability, ensuring seamless data flow across the organization and enabling data-driven decision-making.
- Maintain the Data Product Controller to give stakeholders full responsibility for managing their data products in an automated way (via CI/CD and infrastructure as code), securely, reliably, and with full governance and ownership
- Maintain the Data and AI Platform orchestrator (Dagster) to enable Data Product developers to orchestrate their Data Products in a decentralized way (Data Mesh), owning their release process and job pipelines (see the sketch after this list)
- Monitor the data platform for performance and reliability, identify and troubleshoot issues, and implement proactive solutions to ensure availability
- Provide observability components that give developer teams and data product consumers the right level of insight into costs, data quality and data lineage
- Monitor platform costs and identify optimization and saving opportunities, collaborating with data engineers, data scientists, and other stakeholders
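
To give candidates a concrete flavor of the decentralized orchestration described above, here is a minimal sketch of how a data product team might define its own Dagster assets, job, and schedule. Everything in it is illustrative: the asset names, logic, and schedule are hypothetical, not part of our actual platform.

```python
# Illustrative sketch only: asset names and logic are hypothetical.
from dagster import Definitions, ScheduleDefinition, asset, define_asset_job

@asset
def raw_orders() -> list[dict]:
    # In a real data product, this would read from BigQuery or GCS.
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]

@asset
def daily_order_summary(raw_orders: list[dict]) -> dict:
    # Dagster infers the dependency on raw_orders from the parameter name.
    return {
        "order_count": len(raw_orders),
        "revenue": sum(order["amount"] for order in raw_orders),
    }

# The owning team ships its own job and schedule alongside its code,
# which is what keeps orchestration decentralized (Data Mesh).
summary_job = define_asset_job(
    "daily_order_summary_job", selection="*daily_order_summary"
)

defs = Definitions(
    assets=[raw_orders, daily_order_summary],
    jobs=[summary_job],
    schedules=[ScheduleDefinition(job=summary_job, cron_schedule="0 6 * * *")],
)
```

Because the job and schedule live in the team's own repository and ship through its own CI/CD pipeline, the platform team never has to gatekeep individual data product releases.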
About our tech environment
- GCP (BigQuery, Google Cloud Storage, Pub/Sub, Cloud Run, IAM, Monitoring)
- Container & Orchestration: Google Kubernetes Engine, ArgoCD for GitOps
- IaC: Terraform/Crossplane
- Monitoring: Datadog
- Versioning/Continuous Integration: Git
- Data Orchestrator: Dagster
- We leverage AI ethically across our products to empower teams
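
As a taste of the cost-observability side of the role, the sketch below shows how platform costs might be pulled from a standard GCP billing export into BigQuery using Python. The project, dataset, and table names are hypothetical placeholders, and real setups vary.

```python
# Hedged sketch: assumes a standard GCP billing export to BigQuery.
from google.cloud import bigquery

def top_services_by_cost(project: str, days: int = 7) -> list[tuple[str, float]]:
    client = bigquery.Client(project=project)
    query = """
        SELECT service.description AS service, ROUND(SUM(cost), 2) AS total_cost
        FROM `my-billing-project.billing_export.gcp_billing_export_v1_XXXXXX`  -- hypothetical table
        WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL @days DAY)
        GROUP BY service
        ORDER BY total_cost DESC
        LIMIT 10
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("days", "INT64", days)]
    )
    rows = client.query(query, job_config=job_config).result()
    return [(row.service, row.total_cost) for row in rows]

if __name__ == "__main__":
    for service, cost in top_services_by_cost("my-gcp-project"):
        print(f"{service}: ${cost}")
```

Queries like this typically feed dashboards or Datadog metrics so that data product owners can see the cost of what they run.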
Requirements
- You have more than 5 years of experience as a DataOps Engineer or in a similar role, with a proven track record of building and maintaining complex data infrastructures
- You have strong proficiency in data engineering and infrastructure tools and technologies (Kubernetes, ArgoCD, Crossplane)
- You have expertise in programming languages like Python
- You are familiar with cloud infrastructure and services, preferably GCP, and have experience with infrastructure-as-code tools such as Terraform
- You have excellent problem-solving skills with a focus on identifying and resolving data infrastructure bottlenecks and performance issues
Now it would be fantastic if you:
- Have knowledge of data governance principles and best practices for data security
- Have experience with continuous integration and continuous delivery (CI/CD) pipelines for data
Benefits & conditions
- Free mental health and coaching services through our partner Moka.care
- For caregivers and workers with disabilities, a package including an adaptation of the remote policy, extra days off for medical reasons, and psychological support
- Work from EU countries and the UK for up to 10 days per year, thanks to our flexibility days policy
- Up to 14 days of RTT (French reduced working-time days)
- A subsidy from the works council to refund part of the membership fee for a sports club or a creative class
- Lunch voucher with Swile card