Data Engineer
Job description
We are seeking a highly experienced Senior Data Engineer to lead the design, development, and optimization of modern, cloud-based data platforms across Azure and GCP environments.
This is a hands-on technical leadership role where you will take ownership of enterprise-scale data engineering initiatives, define architecture standards, and mentor engineers while delivering high-quality, scalable solutions.
You will play a pivotal role in shaping a strategic cloud data platform, enabling advanced analytics and data-driven decision-making across the organisation. Working within a high-performing cloud and data team, you'll help drive innovation, introduce modern data capabilities, and ensure platforms are secure, resilient, and cost-efficient.
Data Architecture & Engineering Leadership
- Lead the design and implementation of scalable, secure, and robust data pipelines using ETL/ELT frameworks across GCP
- Define and promote best practices for data engineering, modular design, and orchestration
- Provide technical leadership, mentoring, and support across the data engineering team
Cloud Data Platform & Infrastructure
- Design and build enterprise-grade data solutions across Azure and GCP (Data Lake, Synapse, BigQuery, event streaming)
- Own infrastructure using Infrastructure as Code (Terraform/Bicep) and implement CI/CD pipelines
- Ensure platforms are highly available, fault-tolerant, and optimised for performance and cost
Pipeline Design & Optimisation
- Develop batch, real-time, and streaming pipelines using tools such as Airflow, ADF, dbt, and GCP-native services (Dataflow, Dataproc, Cloud Composer)
- Design and implement scalable GCP ETL pipelines using BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage
- Monitor, troubleshoot, and optimise pipelines, storage, and query performance
Data Governance, Security & Compliance
- Embed security and privacy-by-design principles, including encryption, masking, and access controls
- Ensure compliance with GDPR and enterprise data governance standards
- Implement data lineage, classification, and audit logging
Integration & Interoperability
- Integrate with enterprise systems (ERP, APIs, SFTP, event streams)
- Build reusable ingestion frameworks for structured and semi-structured data (JSON, XML, Parquet, Avro)
Data Quality & Observability
- Implement data quality and testing frameworks (dbt, Great Expectations, or similar)
- Establish monitoring, alerting, and observability across pipelines and platforms
Collaboration & Delivery
- Work closely with engineers, analysts, and stakeholders to deliver scalable data solutions
- Contribute to agile planning, estimation, and delivery
- Document architecture, standards, and data models
Requirements
- 7+ years' experience in data engineering, including leading delivery in enterprise environments
- Expert-level SQL and Python for data transformation and automation
- Strong hands-on experience with GCP data services, including:
  - BigQuery (data warehousing & ELT)
  - Dataflow (Apache Beam for ETL processing)
  - Pub/Sub (event streaming ingestion)
  - Cloud Storage (data lake ingestion layers)
  - Cloud Composer (Airflow orchestration)
- Proven experience building and optimising lakehouse architectures (Delta Lake, Databricks, Snowflake, BigQuery)
- Hands-on experience with Airflow, ADF, dbt, and data modelling (dimensional/star/snowflake)
- Experience with APIs and event streaming platforms (Kafka, Azure Event Hubs, Pub/Sub)
- Strong understanding of data security, GDPR, encryption, and IAM
- Experience with CI/CD pipelines, version control, and DevOps practices
Desirable Skills
- Experience in manufacturing or FMCG environments (ERP, MES, TPM)
- Familiarity with data governance/cataloguing tools (e.g., Microsoft Purview)
- Exposure to multi-cloud or hybrid architectures
- Experience with BI tools (Power BI, Tableau)
- Certifications such as:
  - GCP Professional (Cloud Architect, Data Engineer, DevOps, Security)
Benefits & conditions
Senior / Lead Data Engineer
London, Hybrid (1-3 days in London)
Up to £90,000 plus 25% bonus
This is an exciting opportunity to lead data engineering within a growing organisation that is scaling its cloud data platform.