Senior Data Engineer

Vay
Berlin, Germany
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Remote
Berlin, Germany

Tech stack

API
Amazon Web Services (AWS)
Apache HTTP Server
Google BigQuery
Cloud Computing
ETL
Data Systems
Data Warehousing
GitHub
Protocol Buffers
Python
Openflow
Software Engineering
SQL Databases
Snowflake
Grafana
State Machines
AWS Lambda
Data Analytics
Kafka
Apache NiFi
Data Management
CloudWatch
Terraform
Data Pipelines
Go

Job description

As a Senior Data Engineer, you'll help us scale our pipelines while optimizing their speed and efficiency, with direct impact on our ability to scale our fleet of cars as we launch into new cities and markets. You will sit at the heart of the Data Analytics team in Berlin, where you'll work closely with data producers, Data Analysts, and Data Scientists to build data products that directly impact product quality, operational efficiency, and business performance.

You'll step into an experienced, high-calibre data team with a modern, well-designed tool stack already in place. At the same time, the platform and team have plenty of room to evolve, giving you real ownership, influence, and the opportunity to shape how data supports the company as it scales. You'll be a senior, hands-on contributor at the core of our data platform, owning critical pipelines, data models, and infrastructure end to end. Your work will directly enable analytics, decision making, and product performance as we scale.

  • ETL pipelines and APIs: Own and build reliable ETL pipelines and APIs across the data stack, using tools such as AWS Lambda, AWS Step Functions, GitHub Actions, Apache Kafka, and NiFi/Openflow
  • Data modeling: Design, implement, and continuously optimise data models in the Snowflake data warehouse using dbt, aligned closely with the needs of Data Analysts and business stakeholders
  • Data contracts: Define, negotiate, and maintain Protobuf data contracts to ensure clear and stable data interfaces between systems
  • Cloud infrastructure: Manage and evolve cloud infrastructure using Terraform and GitHub Actions, with a strong focus on automation, maintainability, and scalability
  • Monitoring and data quality: Monitor platform health and data quality using tools such as Grafana, AWS CloudWatch, AWS Alerts, and Apache Superset, identifying and resolving issues early
  • Platform evolution: Partner with the wider data team to continuously improve the reliability, performance, and scalability of the data platform as the company grows

Requirements

You will be a hands-on Senior Data Engineer who is comfortable owning data products, models, and pipelines end to end. You will operate across the data stack, work autonomously, and focus on delivering sustainable, high-impact solutions with quality and reusability in mind.

You will bring a strong technical foundation and the ability to integrate quickly into a small, experienced team, contributing with minimal guidance and sound technical judgement. You will have:

  • End-to-end ownership: Demonstrated experience owning data pipelines, data models, and data products from design through production
  • Impact-driven delivery: A track record of building data solutions or analyses that directly improved product, operational, or business outcomes
  • Advanced SQL: Ability to write, analyse, and optimise complex, high-performance SQL queries
  • Data modeling expertise: Experience designing and maintaining scalable, reusable data models using dbt or equivalent tooling
  • Cloud data platforms: Hands-on experience working with modern cloud data warehouses such as Snowflake or BigQuery
  • Infrastructure ownership: Confident managing and evolving AWS and Snowflake infrastructure, ideally using Terraform
  • Modern data tooling: Familiar with contemporary data platforms and ETL or reverse ETL tools such as Airbyte, Hightouch, Kafka, or NiFi/Openflow
  • Engineering capability: Comfortable programming in Python, with Golang as a plus, and applying software engineering best practices in a data context
  • Autonomous execution: Comfortable working independently from problem definition through delivery, without the need for close supervision
  • Quality focus: Prioritise building robust, maintainable, and dependable solutions

About the company

We're rewriting the rules of urban mobility. At Vay, customers tap a button and a car arrives - with no one inside - powered by our world-first Remote Driving technology on real public streets. We're live in Las Vegas and scaling fast, powered by a strategic investment of up to $410 million from Grab. Our mission is simple: replace private car ownership with a faster, cleaner, door-to-door mobility model. If you want to build something real, visible, and genuinely transformative, you'll feel right at home here.

Apply for this position