Data Platform Engineer (GCP, Terraform, Airflow)
Primus Connect
Charing Cross, United Kingdom
Role details
- Contract type: Contract
- Employment type: Full-time (> 32 hours)
- Working hours: Shift work
- Languages: English
- Compensation: £150K
- Job location: Remote (Charing Cross, United Kingdom)
Tech stack
Artificial Intelligence
Airflow
Big Data
Google BigQuery
Directed Acyclic Graphs (DAGs)
Data Cleansing
Information Engineering
Data Infrastructure
Software Debugging
Distributed Computing Environment
SQL Databases
Google Cloud Platform
Data Ingestion
Large Language Models
Spark
Build Management
PySpark
Kubernetes
Terraform
Data Pipelines
Docker
Job description
Data Platform Engineer (GCP, PySpark, Airflow) - Contract
- Rate: £550-£575 per day (Outside IR35)
- Location: Remote (UK)
- Travel: London once per month (preferred)
- Duration: 3 months (initial)
- Start: ASAP
The Role
We're hiring a Data Platform Engineer (GCP, PySpark, Airflow) to support a modern data platform/Lakehouse environment.
This is a hands-on engineering role focused on building and optimising data pipelines, while also contributing to the underlying platform and infrastructure.
You'll work closely with the data engineering team, delivering production-grade pipelines and ensuring systems are reliable, scalable, and performant.
Key Responsibilities
- Design and build end-to-end data pipelines (PySpark/SQL)
- Develop and manage workflows using Airflow (DAGs, scheduling, orchestration); a minimal example DAG is sketched after this list
- Work with BigQuery for large-scale data processing and analytics
- Monitor, debug, and optimise pipelines for performance and reliability
- Collaborate with data analysts and stakeholders to validate outputs
- Support and improve GCP data platform infrastructure
- Build and manage Terraform (IaC) for scalable environments
- Implement robust data ingestion and transformation patterns
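The posting itself contains no code, but purely as an illustration of the Airflow/BigQuery work described above, a minimal daily DAG might look like the sketch below. All project, dataset, and table names are hypothetical, and it assumes Airflow 2.4+ with the Google provider installed; this is not the client's codebase.

```python
# Hypothetical sketch only: a daily Airflow DAG that runs a BigQuery
# transform. Project/dataset/table names and the SQL are invented.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_pipeline",   # hypothetical pipeline name
    schedule="0 6 * * *",             # run daily at 06:00 (Airflow 2.4+ arg)
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Rebuild one day's partition of a curated table from raw events.
    transform_events = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": (
                    "SELECT * FROM `my-project.raw.events` "
                    "WHERE event_date = '{{ ds }}'"
                ),
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "curated",
                    "tableId": "events${{ ds_nodash }}",  # daily partition
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

Scheduling, retries, and backfills then come from Airflow itself rather than hand-rolled cron jobs.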
Required Skills
- Strong experience as a Data Platform Engineer/Data Engineer (GCP)
- Hands-on experience with PySpark/Apache Spark (an illustrative job is sketched after this list)
- Strong experience with Airflow (or similar orchestration tools)
- Experience working with BigQuery
- Solid understanding of distributed data processing
- Experience building data pipelines end-to-end
- Experience with Terraform (modules, reusable infrastructure)
- Strong debugging and problem-solving skills
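Likewise, the PySpark and end-to-end pipeline requirements above describe work of roughly the following shape; this is a minimal hypothetical example (bucket paths and column names are invented), not the client's code:

```python
# Hypothetical PySpark job: ingest raw JSON events, apply basic cleansing,
# and write partitioned Parquet for downstream analytics.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_cleansing").getOrCreate()

raw = spark.read.json("gs://my-bucket/raw/events/")  # hypothetical GCS path

cleaned = (
    raw.dropDuplicates(["event_id"])                 # remove duplicate events
       .filter(F.col("event_ts").isNotNull())        # drop rows missing timestamps
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partition by date so downstream jobs and external tables can prune reads.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://my-bucket/curated/events/"
)
```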
Nice to Have
- Experience with data quality tools (Dataplex, Great Expectations, Soda); a minimal hand-rolled check is sketched after this list
- Exposure to AI/ML or LLM-based workflows
- Experience with containerisation (Docker/Kubernetes)
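Tools like Dataplex, Great Expectations, and Soda formalise checks of the kind below; this hand-rolled PySpark version (hypothetical paths and columns) just shows the underlying idea of failing a run when invariants break:

```python
# Hypothetical data quality gate: fail the pipeline run if key invariants
# on the curated table break. Paths and column names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("gs://my-bucket/curated/events/")

null_ids = df.filter(F.col("event_id").isNull()).count()
dupes = df.count() - df.dropDuplicates(["event_id"]).count()

# Raise so the orchestrator (e.g. Airflow) marks the task as failed.
if null_ids or dupes:
    raise ValueError(f"DQ failure: {null_ids} null ids, {dupes} duplicate ids")
```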
What We're Looking For
- Engineers who can work independently and deliver quickly
- Strong ownership mindset, able to troubleshoot and improve systems
- Comfortable working across data and platform
- Focused on delivery rather than over-engineering
Why This Role
- Work on a modern GCP data platform
- High-impact role across data and platform engineering
- Strong engineering culture focused on delivery and ownership
- Flexible remote working with minimal travel