Lead Analytics Engineer (Databricks | Lakehouse | Data Modelling)

Careerwise
Charing Cross, United Kingdom

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Charing Cross, United Kingdom

Tech stack

Artificial Intelligence
Data analysis
Azure
Continuous Integration
Data Architecture
ETL
Data Warehousing
Python
SQL Optimization
Spark
Pandas
Microsoft Fabric
Data Lake
Star Schema
Data Pipelines
Databricks

Job description

Location: London, United Kingdom (Hybrid: 3 days onsite per week)

Employment Type: Full-time/Permanent

We're looking for a Lead Analytics Engineer to join a fast-growing AI & Data team, where you'll play a critical role in shaping how data drives executive decision-making.

You will own the analytics layer, design scalable data models, and transform raw data into trusted, business-ready insights.

If you enjoy working with Databricks, modern lakehouse architectures, and data modelling, this role is for you.

Key Responsibilities

  • Architect and build scalable data pipelines on Databricks
  • Design and implement data models (Star Schema, Facts & Dimensions)
  • Own the semantic/analytics layer (single source of truth)
  • Work with Delta Live Tables, Unity Catalog, Workflows
  • Optimize and troubleshoot complex ETL pipelines
  • Collaborate with business and technical teams to deliver impactful insights
  • Drive best practices across data architecture & engineering

What We're Looking For

  • Strong hands-on experience with Databricks (Spark, Delta Lake)
  • Advanced SQL + Python (Databricks Notebooks, Pandas)
  • Proven experience in data modelling (Star Schema, SCD, CDC)
  • Experience building data warehouses/lakehouses
  • Good understanding of Medallion Architecture
  • Experience with Azure DevOps and CI/CD pipelines
  • Ability to translate data into business value

Nice to Have

  • Azure certifications (AZ-900, DP-203, DP-500)
  • Databricks certifications
  • Exposure to Microsoft Fabric
  • Experience with AI-assisted development tools

How to Apply

Send your CV highlighting hands-on experience as a Databricks Lakehouse Architect and Data Modeller.

