Data Engineer
Job description
We aren't just migrating legacy systems at PTSG; we are building the future. We're looking for a Senior Data Engineer to join our team. As a Senior GCP Data Engineer, you will be the architect and builder of our data backbone. We're looking for someone who doesn't just "move data" but designs elegant, scalable, and resilient systems that turn raw information into a competitive advantage. You'll be a mentor to the team, a partner to our Data Scientists, and a champion of Google Cloud best practices.

This is a full-time, permanent role, working 8am to 5pm Monday to Friday. You will be based at either our Altrincham office (WA15 8FH) or Castleford head office (WF10 5HW) on a hybrid basis (working from home, with an office day approximately once a week).

What You'll Do:
- Pipeline Architecture: Design, develop, and maintain high-performance batch and real-time data pipelines using Dataflow (Apache Beam), Cloud Dataproc, and Cloud Pub/Sub.
- Data Modeling: Architect sophisticated data warehouses in BigQuery, using partitioning, clustering, and optimized schema designs (star/snowflake) to ensure fast query performance.
- Infrastructure as Code (IaC): Manage and deploy cloud infrastructure using Terraform, ensuring environment consistency across Dev, Staging, and Production.
- Orchestration: Build and manage complex workflows using Cloud Composer (Apache Airflow) to ensure data integrity and timely delivery.
- Optimization & Governance: Monitor and optimize BigQuery costs and performance; implement robust data governance, security, and IAM policies.
- Mentorship: Lead code reviews, define engineering standards, and mentor junior engineers on GCP-native patterns.
Requirements
Must have:
- Cloud Platform: Expert-level knowledge of Google Cloud Platform (GCP).
- Languages: Mastery of Python and/or Java/Scala; advanced SQL.
- Data Warehousing: Deep expertise in BigQuery (slots, reservations, BQML).
- Processing: Experience with Apache Beam, Spark, and Kafka.
- DevOps: CI/CD pipelines (GitHub Actions/GitLab), Docker, and Kubernetes (GKE).
- Demonstrable experience in data engineering, with at least 3 years focused specifically on the Google Cloud ecosystem.
- Mindset: A "security-first" approach to data and a passion for automation over manual intervention.

Nice to have:
- Transformation: Experience with dbt (data build tool) is highly preferred.
- Certification: Professional Google Cloud Data Engineer certification is a significant plus.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Benefits & conditions
- A competitive salary
- 25 days holiday plus bank holidays
- Company pension scheme
- Life Assurance (3 x salary)
- Discounts on everyday shopping, fashion, tech, holidays, meals out, gyms & more
- Hybrid working
- A supportive, friendly office culture, and plenty of chances to learn