Senior Data Pipeline Engineer
Job description
This is a high-ownership hire. You will work closely with product and engineering, shape how we collect and process external data, and have real architectural authority from day one.

You think in systems, not scripts. You have built crawling or data ingestion infrastructure at a company where external data was a core input to the product, not a side project. You have dealt with the ugly realities: sites that change layouts overnight, anti-bot systems, rate limits, inconsistent APIs, and missing data. You have owned pipelines end to end.
You are self-directed. You make architectural decisions, not just execute tickets. You care about data quality almost obsessively. You are the kind of person who builds three validation checks where one would be "fine."
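To make that "three validation checks" mindset concrete, here is a minimal sketch of a layered validation pass over scraped records. It is purely illustrative: the field names, bounds, and `Track` type are hypothetical, not our actual schema.

```python
from dataclasses import dataclass

@dataclass
class Track:
    # Hypothetical record type for illustration only
    title: str
    streams: int
    release_year: int

def check_required(rec: dict) -> bool:
    # Check 1: schema — every required field present and non-empty
    return all(rec.get(k) not in (None, "") for k in ("title", "streams", "release_year"))

def check_ranges(rec: dict) -> bool:
    # Check 2: plausibility — numeric fields within sane bounds
    try:
        return int(rec["streams"]) >= 0 and 1900 <= int(rec["release_year"]) <= 2100
    except (KeyError, TypeError, ValueError):
        return False

def check_no_duplicates(records: list[dict]) -> list[dict]:
    # Check 3: cross-record — drop case-insensitive duplicate titles, keep first seen
    seen, out = set(), []
    for rec in records:
        key = rec["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out

def validate(records: list[dict]) -> list[Track]:
    clean = [r for r in records if check_required(r) and check_ranges(r)]
    clean = check_no_duplicates(clean)
    return [Track(r["title"], int(r["streams"]), int(r["release_year"])) for r in clean]

raw = [
    {"title": "Song A", "streams": 1200, "release_year": 2021},
    {"title": "song a", "streams": 1200, "release_year": 2021},  # duplicate of the first
    {"title": "Song B", "streams": -5, "release_year": 2021},    # implausible stream count
    {"title": "", "streams": 10, "release_year": 2020},          # missing title
]
print(len(validate(raw)))  # → 1
```

Each check catches a failure the others would miss, which is why one "fine" check is rarely enough against real-world scraped data.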
Requirements
Must have:
- 5+ years of experience building data pipelines and/or web crawling infrastructure in production environments
- Strong Python skills: Scrapy, BeautifulSoup, Selenium, Playwright, or equivalent
- Experience with pipeline orchestration tools such as Airflow, Prefect, or Dagster
- Solid understanding of databases (PostgreSQL, MongoDB, or similar) and data modeling
- Experience handling messy, inconsistent real-world data and building validation layers around it
- Comfort with cloud infrastructure (AWS, GCP, or Azure)
- A track record of building systems that run reliably without constant supervision
- Deep familiarity with AI tools and workflows. AI should already be part of how you work every day. You are joining an AI-native company building AI-powered products.
Nice to have:
- Experience with headless browsers and anti-detection techniques at scale
- Familiarity with the music industry, streaming platforms, or media data
- Experience with data warehousing (BigQuery, Snowflake, Redshift)
- Exposure to message queues and streaming systems (Kafka, RabbitMQ, Redis)
- Experience building internal tooling and dashboards
Benefits & conditions
- Your pipelines feed the AI agents that run scouting, deal modeling, royalties, release planning, and marketing for our clients.
- You inherit data assets built up over more than a decade and make the infrastructure yours.
- The tech stack itself was rebuilt from scratch. No legacy code to work around.
- Small but highly skilled team. Your work ships fast and touches everything.
- Remote-friendly with flexible hours. Office available in Zurich for those who prefer it.