Data Quality Engineer - TV Insight

RED
Siezenheim, Austria

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
€ 35K

Job location

Siezenheim, Austria

Tech stack

Airflow
Amazon Web Services (AWS)
Google BigQuery
Cloud Computing
Continuous Integration
Data Validation
Information Engineering
Data Infrastructure
Data Integrity
ETL
Dataspaces
Data Warehousing
Digital Assets
Python
Standard SQL
DataOps
Software Engineering
Snowflake
Information Technology
Data Inconsistencies
Data Pipelines
Docker

Job description

OWNERSHIP OF DATA INFRASTRUCTURE AND PIPELINES

You'll lead the operation, monitoring, and continuous improvement of data pipelines across all data domains. You guarantee data availability and correctness within production systems and implement robust scheduling, monitoring, and alerting solutions for all data workflows.
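
To ground this in the stack listed above, the following is a minimal sketch of what a scheduled, monitored workflow could look like in Airflow (2.4+). The DAG name, task, and alerting callback are illustrative assumptions, not details from the posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Stand-in alerting hook; a real setup might page on-call or post to Slack.
    print(f"Task {context['task_instance'].task_id} failed")


def load_viewership():
    # Hypothetical extract/load step for one data domain.
    print("loading...")


with DAG(
    dag_id="viewership_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older 2.x versions use schedule_interval
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
        "on_failure_callback": notify_on_failure,
    },
):
    PythonOperator(task_id="load_viewership", python_callable=load_viewership)
```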

AUTOMATION & PROCESS OPTIMIZATION

You'll automate ETL workflows and manual maintenance processes, developing CI/CD-based automation for data pipelines and deployments. You introduce standardized data quality checks and reconciliation frameworks to ensure scalable, efficient, and reliable data operations.
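
As a rough illustration of a standardized reconciliation check, here is a sketch in Python. The table name, counts, and tolerance are invented, and the count-fetching layer is deliberately left out since it would be project-specific.

```python
"""Minimal sketch of a row-count reconciliation check, assuming counts have
already been fetched from the source system and the warehouse."""


def reconcile(table: str, source_count: int, warehouse_count: int,
              tolerance: float = 0.0) -> None:
    # Relative drift between the two counts; guard against division by zero.
    drift = abs(source_count - warehouse_count) / max(source_count, 1)
    if drift > tolerance:
        # Raising makes the check usable as a CI step or pipeline task:
        # a non-zero exit blocks the deployment or fails the run.
        raise ValueError(
            f"{table}: source={source_count}, "
            f"warehouse={warehouse_count}, drift={drift:.2%}"
        )


if __name__ == "__main__":
    # Toy invocation with made-up numbers; real counts would come from queries.
    reconcile("viewership_events", source_count=1_000_000,
              warehouse_count=999_900, tolerance=0.001)
```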

DATA QUALITY MANAGEMENT

You define and enforce data validation rules and performance metrics, proactively identifying, analyzing, and resolving data inconsistencies. You drive continuous improvements in data integrity and completeness, ensuring stakeholders can rely on data for critical decisions.
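
One hedged way to express such validation rules is as declarative SQL checks that must each return zero offending rows. The rules and table below are illustrative only, and run_query stands in for whatever warehouse client (BigQuery, Snowflake) the team uses.

```python
"""Sketch of declarative, engine-agnostic validation rules: each rule is a
SQL query that returns a row only when the rule is violated."""

from typing import Callable

RULES = {
    "no_null_channel_ids":
        "SELECT 1 FROM viewership_events WHERE channel_id IS NULL LIMIT 1",
    "no_duplicate_event_keys":
        "SELECT event_key FROM viewership_events "
        "GROUP BY event_key HAVING COUNT(*) > 1 LIMIT 1",
}


def validate(run_query: Callable[[str], list]) -> list[str]:
    # run_query executes a SQL string and returns the result rows;
    # any rule that yields rows is reported as a failure.
    return [name for name, sql in RULES.items() if run_query(sql)]


if __name__ == "__main__":
    # Toy stub standing in for a real database client.
    failures = validate(lambda sql: [])
    print("failed rules:", failures or "none")
```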

COLLABORATION & CROSS-TEAM ENABLEMENT

You partner with data scientists and analysts to ensure data reliability for modeling, experimentation, and reporting. Where needed, you contribute to internal documentation and knowledge sharing, helping teams understand and effectively use data assets and pipelines.

PLATFORM EVOLUTION & ARCHITECTURE

You evolve TV-Insight's data platform towards self-service capabilities and high scalability. You participate in architecture design for future data products and act as a subject-matter expert for pipeline automation, data observability, and best practices in data engineering.

Requirements

University degree in Computer Science, Software Engineering, Data Science, Data Engineering, Information Systems or a related field
5+ years of experience in data engineering, data quality, or pipeline automation
Proven experience with ETL frameworks (e.g., Airflow, Prefect, dbt)
Strong knowledge of SQL, Python, and data warehouse concepts (e.g., BigQuery, Snowflake)
Experience with CI/CD, Docker, and cloud-based deployment (e.g., GCP, AWS)
Solid understanding of data validation, monitoring, and alerting frameworks
Analytical mindset with strong attention to detail
Strong communication skills and a team-oriented attitude
Self-driven, reliable, and able to manage complex data operations independently

Benefits & conditions

For legal reasons we are obliged to disclose the minimum salary according to the collective agreement for this position, which is EUR 2,509 gross per month. However, our attractive compensation package is based on market-oriented salaries and is therefore significantly above the stated minimum.

As an employer, we value diversity and support people in developing their potential and strengths, realizing their ideas and seizing opportunities. We believe passionately that employing a diverse workforce is central to our success. We welcome applications from all members of society irrespective of age, skin colour, religion, gender, sexual orientation or origin.

Apply for this position