SC Cleared Data Engineer

iO Associates
Charing Cross, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£65K

Job location

Charing Cross, United Kingdom

Tech stack

Amazon Web Services (AWS)
Azure
Big Data
Cloud Computing
Code Review
Continuous Integration
Information Engineering
Data Infrastructure
Data Integration
ETL
Data Security
Data Systems
DevOps
Cloud Services
DataOps
Spark
Multi-Cloud
Electronic Medical Records
Infrastructure as Code (IaC)
Gitlab
Event Driven Architecture
Containerization
Infrastructure Automation Frameworks
Data Management
Video Streaming
Functional Programming
Terraform
Software Version Control
Data Pipelines

Job description

  • Develop and maintain data pipelines and ETL processes using cloud-native technologies (e.g. Spark).

  • Support integration across cloud and on-premise data sources.

  • Help design and implement event-driven data pipelines, improving performance and scalability.

  • Contribute to the development of data access layers and reusable ETL components.

  • Support onboarding of new data sources, including preparation and transformation of datasets.

  • Work with senior engineers to apply data engineering best practices and standards.

  • Contribute to CI/CD pipelines and Infrastructure as Code (IaC) activities.

  • Operate within AWS and Azure environments, supporting deployment and maintenance of data solutions.

  • Work with services including:
      • AWS: EC2, EMR, S3, Lambda
      • Azure: Blob Storage and related services

  • Collaborate with wider teams to ensure data quality, reliability, and performance.

  • Participate in knowledge sharing and continuous improvement initiatives.

Essential

  • Hands-on experience with cloud platforms (AWS and/or Azure; exposure to both advantageous).

  • Solid understanding of ETL processes and data integration concepts.

  • Basic experience with Infrastructure as Code tools (e.g. Terraform) and/or CI/CD tools (e.g. GitLab).

  • Familiarity with key cloud services such as:
      • AWS: EC2, S3 (EMR/Lambda exposure beneficial)
      • Azure: Blob Storage

  • Understanding of version control and collaborative development practices.

  • Awareness of data security, governance, and compliance considerations.

Desirable

  • Experience working with event-driven architectures or streaming technologies.

  • Exposure to data access layers and data platform design.

  • Experience in public sector or regulated environments.

  • Knowledge of cloud networking fundamentals.

  • Background in fraud, analytics, or large-scale data use cases.

Security Requirements

  • This role requires Security Clearance (SC).

  • Candidates must be eligible and willing to undergo the clearance process.

Engineering Practices

  • Growing understanding of DevOps and Agile delivery practices.

  • Commitment to writing maintainable, scalable code.

  • Enthusiasm for automation and reusable engineering patterns.

  • Active participation in code reviews and collaborative development sessions.

Requirements

We are seeking a skilled Data Engineer to support the design, development, and optimisation of data platforms across complex transformation programmes. This position is ideal for an engineer with hands-on experience in data pipelines, cloud platforms, and ETL processes who wants to deepen their technical expertise within a collaborative, delivery-focused environment.

You'll play a key role in building reliable, scalable data solutions that enable analytics, reporting, and operational use cases.

  • Proven experience in data engineering or ETL development.

  • Strong knowledge of data pipelines and processing frameworks (e.g. Spark or similar).

Apply for this position