GCP Data Engineer

Postal address
Charing Cross, United Kingdom

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£85K

Job location

Charing Cross, United Kingdom

Tech stack

Java
Airflow
Google BigQuery
Cloud Database
Cloud Storage
Continuous Integration
Data Architecture
Data Governance
ETL
Data Transformation
Data Systems
Database Design
DevOps
Distributed Computing Environment
Data Flow Control
Python
Data Streaming
Google Cloud Platform
Build Management
Infrastructure Automation Frameworks
Data Management
Cloud Migration
Data Pipelines
Apache Beam

Job description

This GCP Data Engineer role will focus on designing data architectures on Google Cloud Platform, building high-performance pipelines, and enabling reliable, secure and governed data solutions that support business growth and decision-making.

  • Design and build scalable data pipelines, ETL/ELT workflows and cloud data architectures on GCP

  • Develop solutions using services such as BigQuery, Dataflow, Spanner and Cloud Storage
  • Build and maintain code using Python, Java or Scala for data transformation and processing
  • Optimise data pipelines, queries and workloads for performance and scalability
  • Implement data quality, validation and governance controls
  • Collaborate with cross-functional teams to deliver end-to-end data solutions
  • Ensure alignment with security, privacy and compliance standards
  • Troubleshoot issues across pipelines and processing workflows
  • Maintain documentation across data flows, platforms and architectures

If you are an experienced GCP Data Engineer looking to build enterprise-scale data platforms using modern Google Cloud technologies, this role offers strong exposure to complex delivery and cloud transformation.
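To give a flavour of the data quality and validation work described above, here is a minimal sketch in Python of the kind of record-level validation step a pipeline might run before loading data into a warehouse such as BigQuery. The field names and rules are illustrative assumptions, not taken from the job specification.

```python
# Illustrative data-quality check: route records to a clean set or a
# dead-letter set, as a pipeline would route them to separate sinks.
# The schema (id, event_time, amount) is a hypothetical example.

REQUIRED_FIELDS = {"id", "event_time", "amount"}

def validate(record: dict) -> tuple[bool, list[str]]:
    """Return (is_valid, reasons) for a single input record."""
    reasons = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        reasons.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        reasons.append("amount is not numeric")
    return (not reasons, reasons)

def split_valid_invalid(records):
    """Partition records into (valid, invalid) lists, keeping the
    failure reasons alongside each rejected record for auditing."""
    valid, invalid = [], []
    for record in records:
        ok, reasons = validate(record)
        (valid if ok else invalid).append((record, reasons))
    return valid, invalid
```

In a production GCP pipeline the same predicate would typically live inside an Apache Beam `ParDo` with tagged outputs, so that rejected records flow to a dead-letter table rather than being silently dropped.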

Requirements

  • Strong experience as a Data Engineer within GCP environments
  • Hands-on experience with BigQuery, Dataflow and Spanner
  • Strong programming capability in Python, Java or Scala
  • Experience with Apache Beam / Dataflow and distributed processing frameworks
  • Experience designing and managing ETL / ELT pipelines
  • Solid understanding of data modelling and database design
  • Experience with workflow orchestration tools such as Apache Airflow
  • Strong understanding of data governance, security and scalability

Desirable skills

  • Experience with Pub/Sub, Cloud Composer or Cloud Data Fusion
  • Exposure to CI/CD, DevOps and Infrastructure as Code
  • Experience with real-time streaming architectures
  • Knowledge of modern data governance frameworks
