Senior AWS Data Engineer - SC Cleared

Sanderson Recruitment Plc
Charing Cross, United Kingdom
2 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
£117K

Job location

Charing Cross, United Kingdom

Tech stack

Agile Methodologies
Amazon Web Services (AWS)
Business Analytics Applications
Confluence
JIRA
Big Data
Computer Programming
Databases
Continuous Integration
ETL
Data Transformation
Data Virtualization
Data Warehousing
DevOps
Amazon DynamoDB
Job Scheduling
Python
Open Source Technology
Oracle Applications
Pentaho Data Integration
Scrum
Power BI
Standard SQL
SAS (Software)
Workflow Management Systems
Grafana
Electronic Medical Records
SAP Business Objects
GitLab
SC Clearance
Data Lake
Tools for Reporting
Data Delivery
CloudWatch
Terraform
Data Pipelines
Dynatrace
Redshift

Job description

We are seeking an experienced Senior AWS Data Engineer to support a large-scale central government data transformation programme. The role sits within a major data delivery area that provides data and analytics solutions aligned to critical business priorities, including cloud migration, fraud reduction, and revenue protection.

The primary focus of the role is the migration of data from legacy on-premises platforms (primarily Oracle and Informatica) to a modern, AWS cloud-native architecture.

You will be part of an Agile delivery team, working closely with engineers, architects, project managers, business analysts, and key stakeholders.

Responsibilities

  • Support cloud transformation initiatives, working closely with the technical lead on design and stakeholder engagement

  • Provide technical guidance and support to junior engineers within the team
  • Design, develop, and test robust data pipelines for ingestion, processing, and transformation
  • Implement and maintain ETL/ELT workflows feeding data warehouses, data lakes, or lakehouse platforms
  • Leverage open-source and AWS-native tools to deliver scalable data solutions
  • Apply DevOps practices, including CI/CD and automation
  • Ensure solutions align with agreed architectures and non-functional requirements
  • Proactively identify opportunities to improve engineering quality and delivery practices

Requirements

  • Active SC Clearance (must already be held)
  • Strong hands-on experience with core AWS data services, including:
      • AWS Glue
      • Lambda
      • S3
      • Redshift
  • Strong programming skills in Python
  • Solid working knowledge of SQL
  • Experience with data warehouse and database technologies, including AWS Redshift and AWS RDS
  • Experience designing or supporting AWS-based data lakes using S3
  • Proven experience delivering large-scale data engineering solutions
  • Experience working in Agile (Scrum) delivery environments
  • Strong communication and stakeholder-management skills
  • Proactive, collaborative, and delivery-focused mindset

Desirable Skills

  • Knowledge of open table formats (Iceberg, Delta)
  • Experience with additional AWS services (CloudWatch, SNS, Athena, DynamoDB, EMR, Kinesis)
  • Data modelling
  • Job scheduling and orchestration tools
  • Data virtualisation tools (e.g. Denodo)
  • ALM tools (Jira, Confluence)
  • CI/CD tooling (GitLab, Terraform)
  • Reporting tools (Power BI, Business Objects, Pentaho BA)
  • Analytics platforms (e.g. SAS Viya)
  • Observability tools (Grafana, Dynatrace)

Apply for this position