Senior Data Engineer
Job description
- Designing and delivering end-to-end data solutions (ingestion, integration, storage, processing, analysis)
- Building robust data pipelines and ETL workflows for complex, high-volume datasets
- Developing tools and automation to enable data-driven insights
- Working with cloud platforms (AWS, Azure, GCP) in secure environments
- Supporting machine learning workflows and data science initiatives
- Collaborating with DevOps, Data Scientists, Analysts, and Cyber teams
- Leveraging big data and orchestration technologies (Spark and NiFi)
- Leading small work packages and mentoring junior engineers
- Contributing to technical design decisions and engineering best practice
Requirements
A leading Defence prime based in Reading is looking for a security-cleared Senior Data Engineer to play a key role in shaping data capability within a greenfield, mission-critical Defence & Aerospace environment (e.g. aircraft telemetry, flight data or similar). This domain experience is an essential requirement; candidates without it cannot be considered.
This is a hands-on opportunity to design and deliver secure, scalable data pipelines and platforms supporting complex engineering and operational datasets, including high-volume sensor and aerospace data.
If you enjoy building from scratch, solving complex data challenges, and working in secure, high-impact environments, this role is built for you.
Key skills
- Live security clearance (SC) - essential
- Proven experience as a Data Engineer working with aerospace data (sensor/aircraft platform data, test datasets or similar) - essential; candidates without this experience cannot be considered
- Strong Python and SQL skills, with experience building and optimising pipelines
- Solid understanding of SQL and NoSQL database architectures
- Experience with ETL pipelines, orchestration tools and APIs
- Hands-on experience with cloud platforms (AWS, Azure, or GCP)
- Familiarity with Docker and modern engineering tooling
- Exposure to big data ecosystems (Spark, Hadoop, NiFi)
- Familiarity with Infrastructure as Code (Terraform, Ansible)
- Understanding of machine learning models and deployment