Senior Data Engineer

PTSG
Hale, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Hybrid
Hale, United Kingdom

Tech stack

Query Performance
Java
Airflow
Google BigQuery
Cloud Computing
Code Review
Information Engineering
Data Governance
Data Integrity
Data Warehousing
DevOps
Dataflow
GitHub
Identity and Access Management
Python
SQL Databases
Google Cloud Platform
Snowflake
Data Build Tool (dbt)
Spark
Infrastructure as Code (IaC)
GitLab
Kubernetes
Information Technology
Real Time Data
Kafka
Terraform
Apache Beam
Docker

Job description

  • Pipeline Architecture: Design, develop, and maintain high-performance batch and real-time data pipelines using Dataflow (Apache Beam), Cloud DataProc, and Cloud Pub/Sub.
  • Data Modeling: Architect sophisticated data warehouses in BigQuery, utilizing partitioning, clustering, and optimized schema designs (Star/Snowflake) to ensure lightning-fast query performance.
  • Infrastructure as Code (IaC): Manage and deploy cloud infrastructure using Terraform, ensuring environment consistency across Dev, Staging, and Production.
  • Orchestration: Build and manage complex workflows using Cloud Composer (Apache Airflow) to ensure data integrity and timely delivery.
  • Optimization & Governance: Monitor and optimize BigQuery costs and performance; implement robust data governance, security, and IAM policies.
  • Mentorship: Lead code reviews, define engineering standards, and mentor junior engineers on GCP-native patterns.
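For context on the pipeline work described above, here is a minimal sketch of the batch read → transform → aggregate pattern, written in plain Python with generator stages standing in for Apache Beam transforms. All function names and data are illustrative, not taken from the posting:

```python
# Minimal batch-pipeline sketch: plain-Python generator stages stand in
# for Beam-style transforms (read, map, group/combine per key).
# All names and sample data are illustrative.
from collections import defaultdict

def read_records(rows):
    """Source stage: yield raw input rows (stands in for a Read transform)."""
    yield from rows

def parse(records):
    """Map stage: split "user,amount" strings into (key, value) pairs."""
    for rec in records:
        user, amount = rec.split(",")
        yield user, float(amount)

def sum_by_key(pairs):
    """Aggregation stage: total values per key (GroupByKey + CombinePerKey)."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

if __name__ == "__main__":
    raw = ["alice,10.0", "bob,2.5", "alice,5.0"]
    result = sum_by_key(parse(read_records(raw)))
    print(result)  # {'alice': 15.0, 'bob': 2.5}
```

In a real Dataflow job each stage would be a PTransform in a Beam `Pipeline`, which also supports streaming sources such as Cloud Pub/Sub; the lazy, stage-by-stage shape of the code is the same idea in miniature.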

Requirements

Screening questions: Do you have experience in Terraform? Do you have a Master's degree?

Must have:

  • Cloud Platform - Expert-level knowledge of Google Cloud Platform (GCP).
  • Languages - Mastery of Python and/or Java/Scala; Advanced SQL.
  • Data Warehousing - Deep expertise in BigQuery (slots, reservations, BQML).
  • Processing - Experience with Apache Beam, Spark, and Kafka.
  • DevOps - CI/CD pipelines (GitHub Actions/GitLab), Docker, and Kubernetes (GKE).
  • Demonstrable experience in data engineering, with at least 3 years focused specifically on the Google Cloud ecosystem.
  • Mindset: A "security-first" approach to data and a passion for automation over manual intervention.

Nice to have:

  • Transformation - Experience with dbt (data build tool) is highly preferred.
  • Certification: Professional Google Cloud Data Engineer certification is a significant plus.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.

Benefits & conditions


  • A competitive salary
  • 25 days holiday plus bank holidays
  • Company pension scheme
  • Life Assurance (3 x salary)
  • Employee discounts on everyday shopping, fashion, tech, holidays, meals out, gyms & more
  • Hybrid working
  • A supportive, friendly office culture, and plenty of chances to learn

This is a full-time, permanent role, working 8am to 5pm Monday to Friday. You will be based at either our Altrincham office (WA15 8FH) or Castleford head office (WF10 5HW) on a hybrid basis (working from home, with an office day approximately once a week).

Apply for this position