Principal Data Engineer

Oritain
Charing Cross, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Charing Cross, United Kingdom

Tech stack

Continuous Integration
Data as a Service
Information Engineering
Data Governance
ETL
Data Mart
Data Warehousing
Python
Azure SQL
NetSuite
Operational Databases
Salesforce
SQL Databases
Data Streaming
Data Processing
Azure
Data Strategy
Data Lake
Cosmos DB
Terraform
Data Pipelines
Databricks

Job description

  • Define and own the technical strategy and architecture for our entire data platform: ingestion, storage, processing, governance, and consumption

  • Design and implement scalable, reliable ETL/ELT pipelines handling complex scientific datasets, supply chain inputs, and business data

  • Lead the design of canonical data models for our data warehouse and operational stores, ensuring quality, consistency, and integrity

  • Define and maintain a single source of truth for clients, suppliers, and transactions across systems (Salesforce, NetSuite, internal databases)

  • Implement data governance, security policies, automated quality checks, and robust monitoring across all pipelines

  • Work with the infrastructure team to provision Azure data resources using Terraform or equivalent IaC tooling

  • Partner with our Science teams to ensure accurate ingestion and translation of raw scientific data

  • Mentor engineers across the team on best practices for building and consuming data services

Requirements

  • 7+ years in data engineering, with significant time in a Principal, Lead, or Architect role defining data strategy

  • Deep, practical experience with Databricks, covering both architecture and implementation

  • Strong hands-on experience across the Azure data stack: Data Factory, Data Lake, Synapse Analytics, Azure SQL/Cosmos DB

  • Expert-level Python and SQL, with a strong focus on clean, tested, performant data processing code

  • Proven track record designing and implementing scalable data warehouses and data marts

  • Experience with workflow orchestration, CI/CD for data pipelines, and IaC (Terraform)

Desirable

  • Experience with scientific, geospatial, or time-series data

  • Background in governance or compliance environments

  • Familiarity with streaming data technologies

The Recruitment Process

We like to keep our recruitment process smooth and efficient, giving you the opportunity to showcase your skillset in a comfortable environment. The process for this position is a two-stage interview and a short take-home task.

About the company

  • Monthly Wellbeing Allowance

  • Breakfast, snacks, Friday lunch & barista coffee machine in the office

  • Learning Portal with over 100,000 assets available to support professional development