Data Engineer

Experis
Ubley, United Kingdom
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£66K

Job location

Ubley, United Kingdom

Tech stack

Airflow
Amazon Web Services (AWS)
Azure
Big Data
Cloud Computing
Continuous Integration
Data Architecture
Data Integration
ETL
Data Security
Data Warehousing
DevOps
Hadoop
Python
Performance Tuning
Scrum
Standard SQL
SQL Databases
Technical Data Management Systems
Spark
Kafka
Apache NiFi
Data Management
Data Pipelines

Job description

  • Design, build and maintain scalable data pipelines (batch and/or streaming) within secure environments
  • Develop and optimise ETL / ELT processes for high-volume, structured and semi-structured datasets
  • Work with stakeholders to translate complex requirements into technical data solutions
  • Ensure data platforms meet security, accreditation, and information assurance standards
  • Support data quality, lineage, monitoring, and performance optimisation
  • Contribute to data architecture and platform evolution, including cloud and on-prem solutions
  • Collaborate in Agile delivery teams (Scrum / SAFe environments)

Technologies:

  • Airflow
  • AWS
  • AWS Glue
  • Azure
  • Big Data
  • Cloud
  • DevOps
  • ETL
  • GCP
  • Support
  • Python
  • SQL
  • Security
  • Hadoop
  • Kafka
  • Spark

We are seeking an experienced DV-cleared (SC considered) Data Engineer to support a high-profile, mission-critical programme within a secure government environment. The role focuses on designing, building, and maintaining robust data pipelines and platforms that support advanced analytics, intelligence, and operational decision-making. The position is based in Bath, requires on-site presence three days a week, and offers a competitive rate of £550 outside IR35. Join our team to be part of this impactful opportunity!

Requirements

  • Proven experience as a Data Engineer in secure or government environments
  • Strong skills in Python, SQL, and data pipeline development
  • Experience with data integration tools (e.g. Airflow, NiFi, Azure Data Factory, AWS Glue, or similar)
  • Hands-on experience with cloud platforms (AWS, Azure, or GCP; secure tenants preferred)
  • Knowledge of data modelling, data warehousing, and big data technologies
  • Familiarity with DevOps / CI/CD practices within restricted environments
  • Strong understanding of data security, governance, and access controls

Apply for this position