Data Engineer
Job description
We are looking for a strong Data Engineer to join our Data Team. You'll work on core data infrastructure, support large-scale data initiatives, and help build a robust, reliable, and future-proof data ecosystem as we scale globally.
You should be experienced with modern data engineering tools, comfortable working with diverse storage systems and formats, and capable of applying software engineering best practices.
Key Projects You Will Work On
You will be deeply involved in several strategic initiatives:
- Collecting richer patient data to support marketing optimization and personalization.
- Call center and chat automation, as well as improving quality and analytics for communication channels.
- Improving overall data quality, observability, lineage, and reliability across datasets.
- Migrating from Redshift to ClickHouse to improve performance, scalability, and cost efficiency.
Tech Stack You'll Use
Our modern data platform is built on:
- DBT (data modeling and transformation)
- Airflow (orchestration)
- Redshift → ClickHouse (DWH migration in progress)
- Various storage formats: Parquet, JSON, CSV, and more
Data Pipeline Development
- Design, build, and maintain reliable data pipelines using Airflow, DBT, and AWS services.
- Ensure pipelines are modular, testable, and follow best software engineering practices.
- Implement unit tests, integration tests, and data validations for ETL/ELT workflows.
Data Architecture & Warehousing
- Contribute to the Redshift-to-ClickHouse migration, optimizing schema design, storage formats, and query performance.
- Evaluate and optimize data storage across structured and semi-structured formats.
- Ensure data models are stable, efficient, and maintainable.
Data Quality & Observability
- Implement and improve data quality checks, automated validations, and anomaly detection.
- Contribute to data lineage, documentation, and metadata management.
- Monitor and troubleshoot issues in pipelines, storage, and transformations.
Cross-Team Collaboration
- Work closely with Marketing, Product, Engineering, and Operations teams to collect and deliver high-quality data.
- Translate business needs into scalable and cost-efficient data solutions.
- Support data consumers with access, tooling, and best practices.
Requirements
- 5+ years of experience as a Data Engineer working on large-scale data platforms.
- Strong experience with DBT, Airflow, and cloud environments (preferably AWS).
- Deep understanding of data warehouse technologies, ideally Redshift and/or ClickHouse.
- Strong knowledge of data storage formats (Parquet, JSON, CSV) and distributed data patterns.
- Solid software engineering fundamentals:
- Writing clean, maintainable code
- Version control
- CI/CD workflows
- Automated testing for pipelines
- Strong SQL skills and experience optimizing queries.
- Familiarity with Python for ETL, orchestration, or tooling.
Nice to Have
- Understanding of data governance, access control, and security.
- Experience supporting data for marketing automation and CRM systems.
- Knowledge of event-driven data processing and streaming.
Benefits & conditions
- Opportunity to work on critical, high-impact data systems across multiple regions.
- A modern data stack.
- Collaboration with cross-functional teams across engineering, operations, and marketing.
- A mission-driven environment where your work directly impacts patient experience and company growth.
- Support for courses, certifications, and career development.
- Work fully remotely or from our central Barcelona HQ.