Data Engineer (Codemotion 26)
Job description
We are looking for an outstanding Data Engineer to join our Next Generation TV project. In this role, you will work at the core of our data platform, enabling analytics, monitoring, personalization, and product intelligence across our TV ecosystem.

- Design, build, and maintain scalable data pipelines ingesting data from TV platforms, backends, and user devices.
- Process and transform high-volume events and telemetry data (user behavior, app usage, playback metrics, errors).
- Develop and optimize ETL / ELT workflows to feed analytics, reporting, and operational use cases.
- Model TV-related data (audience, content consumption, performance metrics) for analytical and product needs.
- Ensure data quality, observability, and reliability across the entire data lifecycle.
- Collaborate closely with product, backend, analytics, and operations teams to understand how data supports decision-making.
- Support use cases such as platform monitoring, incident analysis, audience insights, and feature experimentation.
- Maintain and evolve data infrastructure on cloud environments.
- Proactively improve performance, scalability, and cost efficiency of data solutions.
- Document pipelines, datasets, and best practices to ensure long-term maintainability.
Requirements
Are you passionate about turning data into insights that power next-generation TV experiences?

- 4+ years of experience as a Data Engineer or in a similar role.
- Strong experience working with Python and SQL in data-intensive environments.
- Solid knowledge of relational and analytical databases.
- Experience building data pipelines handling large volumes of events and logs.
- Hands-on experience with cloud platforms, preferably AWS (S3, Glue, Lambda, Redshift, EMR, etc.).
- Experience with data lakes and modern analytics architectures.
- Strong data modeling and performance optimization skills.
- Experience with version control systems (Git).
- Analytical mindset and strong problem-solving skills.
- Technical English proficiency (minimum B2 level).
- Spanish proficiency (C1/C2).

Nice to Have:

- Experience with streaming and real-time data (Kafka, Kinesis, Spark, Flink).
- Experience with workflow orchestration tools (Airflow, Prefect, Dagster).
- Familiarity with TV, media, OTT, or streaming platforms.
- Experience supporting observability and monitoring use cases.
- Knowledge of data governance, security, and privacy best practices.