Data Engineer - Highly competitive salary

Anson McCade
Bristol, United Kingdom
5 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Bristol, United Kingdom

Tech stack

Agile Methodologies
Artificial Intelligence
Airflow
Automation of Tests
Google BigQuery
Continuous Integration
ETL
Data Warehousing
Relational Databases
Data Flow Control
PostgreSQL
Microsoft SQL Server
MySQL
Oracle Applications
Data Streaming
Google Cloud Platform
Data Storage Technologies
Spark
Data Lake
Kafka
Data Management
Software Version Control
Data Pipelines

Job description

We're partnering with a leading technology consultancy that helps organisations harness the power of data to modernise platforms and drive business outcomes. As a Data Engineer, you'll be at the forefront of designing and delivering cloud-native solutions on Google Cloud, turning complex datasets into actionable insights.

In this role, you'll work on diverse projects, from batch and streaming pipelines to data warehouses, data lakes, and AI-powered analytics platforms. This is a hands-on role in which you'll guide delivery, shape best practices, and mentor other team members.

Responsibilities

  • Lead the design, development, and deployment of scalable data pipelines using BigQuery, Dataflow, Dataproc, and Pub/Sub

  • Automate ETL/ELT workflows and orchestrate pipelines with tools such as Cloud Composer
  • Contribute to architecture and end-to-end solution design for complex data platforms
  • Set engineering standards and ensure high-quality code, deployment, and documentation practices
  • Collaborate with clients and internal teams, translating business requirements into practical solutions
  • Mentor and coach junior engineers to grow their skills and adopt best practices

Why This Role?

This is a chance to work on high-impact, cloud-native projects as a Data Engineer, taking ownership of technical decisions, shaping delivery practices, and developing your career. You'll join a supportive environment where mentoring and learning are highly valued, and your work will directly contribute to the success of complex data programmes.

Requirements

  • Proven experience building production-ready solutions on Google Cloud
  • Expertise with batch and streaming frameworks like Apache Spark or Beam
  • Strong understanding of data storage, pipeline patterns, and event-driven architectures
  • Experience with CI/CD, version control, automated testing, and Agile delivery
  • Ability to communicate clearly to both technical and non-technical stakeholders
  • Mentoring or coaching experience

Bonus skills: Kafka, enterprise data platform migrations, RDBMS experience (Postgres, MySQL, Oracle, SQL Server), and exposure to ML pipelines.

Security Eligibility

Candidates must be eligible for UK Security Clearance (SC or DV) if required.
