Software Developer SFIA4 - Data/Analytics

Lucid Support Services Ltd
2 days ago

Role details

Contract type
Contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Remote

Tech stack

Agile Methodologies
Airflow
Amazon Web Services (AWS)
Data analysis
Continuous Integration
ETL
Data Security
Distributed Computing Environment
File Transfer
Python
Operational Databases
Trend Micro
Software Engineering
Tripwire
Workflow Management Systems
Jupyter Notebook
Data Processing
Data Ingestion
Spark
Gitlab
Containerization
Infrastructure Automation Frameworks
Terraform
Data Pipelines
Docker
Vulnerability Analysis

Job description

Job Title: Software Developer (x2) - SC Eligible
Location: Remote/UK-based with occasional on-site presence
Duration: 6-12 months (initial contract)
Clearance: Must be eligible for Security Check (SC) clearance
SFIA Level: 4 - Software Development

Rate: £325 (Outside London)/£375 (London)

Role Overview

We're seeking two experienced Software Developers to support the ongoing maintenance and enhancement of a significant central government data asset and its associated data pipelines. The project aggregates event data from multiple citizen interaction channels, transforming and integrating it into a unified analytical resource accessible through internal analytical platforms.

You will play a key role in designing, maintaining, and improving reliable data ingestion and publishing pipelines, ensuring strong performance, resilience, and stability. The work includes hands-on development, technical leadership, and mentorship of junior colleagues.

Responsibilities

  • Lead development across sets of related stories within agile sprints.
  • Take ownership of system knowledge and proactively share understanding with other developers and stakeholders.
  • Collaborate with Product Owners, Business Analysts, and wider technical teams to clarify requirements and deliver robust solutions.
  • Operate and enhance production data pipelines and publishing services.
  • Identify and implement improvements for system performance, scalability, and security.
  • Coach and guide junior team members, fostering technical growth and best practice adoption.

Technical Skill Requirements

  • Strong proficiency in Python for data manipulation and ETL processes.
  • Hands-on experience with AWS Services (including compute, storage, and security components).
  • Infrastructure automation using Terraform.
  • Distributed data processing with Apache Spark.
  • Workflow orchestration using Apache Airflow.
  • Containerisation and environment management with Docker.
  • CI/CD workflows via GitLab.
  • Familiarity with security scanning agents (e.g. Trivy, Trend Micro, Wiz).
  • Experience with Jupyter Notebooks for data analysis and development.
  • Understanding of integration with government or enterprise-level data products (e.g. Data Access Layers, Secure File Transfer services).

Candidate Profile

  • Demonstrated ability to work in fast-paced, agile development environments.
  • Strong problem-solving skills and ability to design scalable, maintainable solutions.
  • Excellent communication and collaboration skills.
  • Prior experience in public sector or government data projects (desirable but not essential).

If you are available and interested in this opportunity, please apply for further information. Please note that due to high volumes of applications we are unable to contact every applicant. If you do not hear back from us within 7 days of sending your application, please assume that you have not been successful on this occasion.

At Lucid, we celebrate difference and value diverse perspectives, underpinned by our values 'Honesty, Integrity and Pragmatism'. We are proud to provide equal opportunities in line with our Diversity and Inclusion policy and welcome applications from all suitably qualified or experienced people, regardless of personal characteristics. If you have a disability or health condition and seek support throughout the recruitment process, please do not hesitate to contact us via the details below.

Apply for this position