PySpark Engineer (AWS Glue) - Hybrid Position

Akkodis
Stevenage, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£80K

Job location

Stevenage, United Kingdom

Tech stack

Amazon Web Services (AWS)
Cloud Computing
Databases
ETL
Data Transformation
Data Migration
Data Mining
Data Structures
Data Warehousing
DevOps
Identity and Access Management
Python
Pentaho Data Integration
Performance Tuning
Role-Based Access Control
SQL Databases
Talend
Scripting (Bash/Python/Go/Ruby)
Informatica PowerCenter
Snowflake
Boomi
Indexer
SC Clearance
PySpark
Star Schema
MuleSoft

Job description

  • Analyse existing data structures and understand business and technical requirements for migration initiatives.
  • Design and deliver robust data migration strategies and ETL solutions.
  • Develop automated data extraction, transformation, and loading (ETL) processes using industry-standard tools and scripts.
  • Work closely with stakeholders to ensure seamless migration and minimal business disruption.
  • Plan, coordinate, and execute data migration projects within defined timelines.
  • Ensure the highest standards of data quality, integrity, and security.
  • Troubleshoot and resolve data-related issues promptly.
  • Collaborate with wider engineering and architecture teams to ensure migrations align with organisational and regulatory standards.
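The extract-transform-load responsibilities above can be sketched in plain Python. This is a minimal illustration only: in the role itself this logic would typically run inside an AWS Glue PySpark job, and the field names, cleaning rules, and sample data here are all hypothetical.

```python
from datetime import datetime

# Hypothetical source rows, standing in for data extracted from a legacy database.
RAW_ROWS = [
    {"customer_id": "1001", "name": "  Alice  ", "joined": "2021-03-14"},
    {"customer_id": "",     "name": "Bob",       "joined": "2020-01-01"},   # missing key
    {"customer_id": "1003", "name": "Carol",     "joined": "not-a-date"},   # bad date
]

def transform(rows):
    """Clean and validate rows: trim text, parse dates, drop invalid records."""
    clean = []
    for row in rows:
        if not row["customer_id"]:
            continue  # records without a primary key cannot be migrated
        try:
            joined = datetime.strptime(row["joined"], "%Y-%m-%d").date()
        except ValueError:
            continue  # a real pipeline would quarantine unparseable rows for review
        clean.append({
            "customer_id": int(row["customer_id"]),
            "name": row["name"].strip(),
            "joined": joined.isoformat(),
        })
    return clean

cleaned = transform(RAW_ROWS)
print(cleaned)  # only the valid, normalised record survives
```

The same shape scales up in PySpark, where the per-row cleaning becomes DataFrame filters and column expressions rather than a Python loop.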

Technologies:

  • AWS
  • Cloud
  • ETL
  • IAM
  • Informatica
  • MuleSoft
  • Python
  • PySpark
  • RBAC
  • SQL
  • Security
  • Snowflake
  • Talend
  • AWS Glue
  • DevOps

Requirements

  • Expert-level SQL skills for complex query development, performance tuning, indexing, and data transformation across on-premise databases and AWS cloud environments.
  • Strong hands-on experience with ETL processes and tools (Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi) or scripting using Python, PySpark, and SQL.
  • Solid understanding of data warehousing and modelling techniques (Star Schema, Snowflake Schema).
  • Familiarity with security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, as well as AWS security features including IAM, KMS, and RBAC.
  • Ability to identify and resolve data quality issues across migration projects.
  • Strong track record of delivering end-to-end data migration projects and working effectively with both technical and non-technical stakeholders.
  • SC Clearance is required; candidates must either hold it or be eligible to obtain it.
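The star-schema modelling mentioned above can be illustrated with a small in-memory example using Python's built-in sqlite3 module. All table and column names here are invented for the sketch; a production warehouse would of course live in a platform such as Snowflake rather than SQLite.

```python
import sqlite3

# In-memory database standing in for a data warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One central fact table referencing two dimension tables: the core star-schema shape.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE fact_sales  (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-01-02")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 1, 10.0), (2, 1, 2, 15.0), (3, 2, 1, 7.5)])

# A typical analytical query: join the fact table out to a dimension and aggregate.
cur.execute("""
SELECT p.product_name, SUM(f.amount) AS total
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
GROUP BY p.product_name
ORDER BY p.product_name
""")
totals = dict(cur.fetchall())
print(totals)  # per-product sales totals
```

The design point is that facts (measurable events) sit in one narrow table keyed to descriptive dimensions, which keeps analytical joins simple and predictable to tune.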

About the company

We are looking for a Security-Cleared PySpark expert to take the reins on a range of highly ambitious data migration projects supporting high-impact programmes across the UK. This is a unique opportunity to work on cutting-edge cloud, software, and infrastructure projects that shape the future of technology in both the public and private sectors. As part of our collaborative team, you will deliver scalable, next-generation digital ecosystems, with a salary of up to £80,000 and a variety of wider benefits.

Apply for this position