Senior Analytics Engineer

DeepL
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
£58K

Job location

Remote

Tech stack

Airflow
Data analysis
Google BigQuery
Clickstream
Cloud Computing
Databases
Information Engineering
Data Governance
Data Systems
Data Warehousing
Python
Online Analytical Processing
SQL Databases
Data Processing
Scripting (Bash/Python/Go/Ruby)
Snowflake
Core Data
Low Latency
Data Management
Data Pipelines
Databricks

Job description

We are looking for a Senior Analytics Engineer to join a versatile team that builds and maintains the data models, pipelines, and data products used across DeepL. The team owns the core data products and pipelines, drawing on data from both internal and external sources. You will collaborate with product managers, data scientists, data engineers, and analysts to solve complex technical and business problems, and you will work with revenue, customer, and product usage data, building a deep analytical understanding of these datasets.

  • Data pipeline development: Design, implement, and maintain scalable data pipelines and architecture in both our on-prem and cloud environments. Build the next generation of DeepL's data products while contributing to the maintenance and migration of our legacy assets.
  • Data modelling: Develop and optimise data models to support analytics and reporting needs, ensuring data quality, consistency, and integrity.
  • Collaboration: Work closely with data scientists, product analysts, and business stakeholders to understand data requirements and deliver solutions.
  • Performance optimisation: Monitor and enhance the performance of data systems, ensuring low latency and high reliability.
  • Data quality: Act as an owner for the quality and availability of our data. Apply data governance frameworks and champion data contracts to guarantee confidence in our analytics datasets.
  • Documentation & best practices: Maintain comprehensive documentation and promote best practices in data engineering and analytics.
  • Mentorship: Guide and mentor junior team members, fostering a culture of continuous learning and improvement.

Requirements

  • Experience: 5+ years in data or analytics engineering roles, with a focus on cloud data platforms.
  • Data modelling: Expertise in designing and implementing data models for analytics.
  • SQL: Advanced proficiency in writing complex queries and optimising performance.
  • Python: Experience in data manipulation and scripting.
  • Cloud platforms: Hands-on experience with cloud data warehouses such as Snowflake, Databricks, or BigQuery; experience with an on-prem data warehouse such as ClickHouse is valuable.
  • Orchestration and transformation: Experience with data orchestration and transformation tools such as dbt, Airflow, or similar.
  • Databases: Understanding of best practices for querying OLAP, clickstream, and real-time event processing databases.
  • Product mindset: Able to work directly with business stakeholders to translate analytics needs into user-friendly, highly performant data products.
  • Troubleshooter: Experience diagnosing and resolving data issues and outages. Prepared to participate in on-call rotations, perform post-mortems, and proactively recommend improvements for engineering excellence.
  • Soft skills: Excellent problem-solving abilities, strong communication skills, and a collaborative mindset.

About the company

Helping people overcome communication barriers is the heart of what we do. Founded in Germany in 2017 by a team of engineers and researchers, DeepL has developed the world’s most accurate AI translation technology—enabling real-time, human-sounding translation.

Accessible via a web translator, browser extensions, desktop and mobile apps, and an API, DeepL supports a best-in-class translation experience in 34 languages and counting. Our 550-person team operates across four European hubs in Germany, the Netherlands, the UK, and Poland.
