Data Engineer (GCP)

Xcede
Charing Cross, United Kingdom

Role details

Contract type
Permanent contract
Employment type
Part-time (≤ 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Charing Cross, United Kingdom

Tech stack

Airflow
Google BigQuery
Cloud Computing
Continuous Integration
Data Architecture
Data Governance
Data Infrastructure
ETL
Data Security
Data Structures
Data Systems
Data Warehousing
Python
Query Optimization
DataOps
SQL Databases
Data Classification
Snowflake
Spark
Terraform
Data Pipelines
Databricks

Job description

As a Data Engineer, you'll design, build, and operate scalable, reliable data pipelines and data infrastructure. Your work will ensure high-quality data is accessible, trusted, and ready for analytics and data science, powering business insights and decision-making across the company.

What you'll do

  • Build and maintain data pipelines for ingestion, transformation, and export across multiple sources and destinations
  • Develop and evolve scalable data architecture to meet business and performance requirements
  • Partner with analysts and data scientists to deliver curated, analysis-ready datasets and enable self-service analytics
  • Implement best practices for data quality, testing, monitoring, lineage, and reliability
  • Optimise workflows for performance, cost, and scalability, e.g. tuning Spark jobs, query optimisation, and partitioning strategies (see the sketch after this list)
  • Ensure secure data handling and compliance with relevant data protection standards and internal policies
  • Contribute to documentation, standards, and continuous improvement of the data platform and engineering processes
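
To make the optimisation bullet above concrete, here is a minimal PySpark sketch of the kind of tuning involved: filtering before the shuffle, sizing shuffle parallelism, and writing date-partitioned output so downstream queries can prune. The bucket paths and column names (events, event_date, user_id) are hypothetical illustrations, not part of the role.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("events-optimisation")
        # Size shuffle parallelism to the cluster instead of the default 200
        .config("spark.sql.shuffle.partitions", "64")
        .getOrCreate()
    )

    # Hypothetical source: raw event data in Parquet on GCS
    events = spark.read.parquet("gs://example-bucket/raw/events/")

    daily = (
        events
        # Filter before aggregating so less data is shuffled
        .filter(F.col("event_date") >= "2024-01-01")
        .groupBy("event_date", "user_id")
        .agg(F.count("*").alias("event_count"))
    )

    (
        daily
        .repartition("event_date")   # align the shuffle with the partition key
        .write.mode("overwrite")
        .partitionBy("event_date")   # lets downstream queries prune by date
        .parquet("gs://example-bucket/curated/daily_events/")
    )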

Requirements

  • 5+ years of experience as a Data Engineer, building and maintaining production-grade pipelines and datasets
  • Strong Python and SQL skills with a solid understanding of data structures, performance, and optimisation strategies
  • Familiarity with GCP and its ecosystem: BigQuery, Composer, Dataproc, Cloud Run, and Dataplex
  • Hands-on experience with orchestration tools (e.g., Airflow, Dagster, Databricks Workflows) and distributed processing in a cloud environment (a minimal sketch follows this list)
  • Experience with analytical data modelling (star and snowflake schemas), data warehousing, ETL/ELT patterns, and dimensional concepts
  • Experience with data governance concepts: access control, retention, data classification, auditability, and compliance standards
  • Familiarity with CI/CD for data pipelines, IaC (Terraform), and/or DataOps practices
  • Experience building observability for data systems (metrics, alerting, data quality checks, incident response)
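
As a minimal illustration of the orchestration and observability bullets, the sketch below (assuming Airflow 2.4+ for the schedule argument) chains a daily load task to a row-count quality gate whose failure surfaces through Airflow's normal alerting. The DAG id, callables, and hard-coded count are hypothetical stand-ins for real ingestion and warehouse queries.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_partition(ds, **_):
        # Stand-in for the real ingestion step (e.g. a BigQuery load job)
        print(f"loading events for {ds}")


    def check_row_count(ds, **_):
        # Stand-in for a real quality check (e.g. a warehouse row count);
        # raising fails the task so alerting and incident response kick in
        row_count = 1_000  # hypothetical; would come from a query in practice
        if row_count == 0:
            raise ValueError(f"no rows loaded for {ds}")


    with DAG(
        dag_id="daily_events_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        load = PythonOperator(task_id="load_partition", python_callable=load_partition)
        quality_gate = PythonOperator(task_id="check_row_count", python_callable=check_row_count)

        load >> quality_gate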

Apply for this position