Data Engineer
Job description
Our client is a large-scale, delivery-focused technology organisation operating an Advanced Technology Centre in Newcastle, supporting complex data and digital programmes across both public and private sector environments.
As a Data Engineer, you will design, build, and maintain scalable data solutions that enable analytics, AI, and operational insight. You'll work as part of collaborative, agile teams to deliver cloud-based data platforms that support intelligent decision-making at scale.
This role is available at Senior Analyst and Specialist levels, offering clear technical development and progression opportunities within a supportive, hands-on engineering environment.
You'll have the opportunity to:
- Build and optimise large-scale data pipelines supporting analytics and AI use cases
- Work on real-time and batch data processing platforms
- Contribute to modern data architectures and engineering standards
- Deliver cloud-native data solutions using industry-leading tools
- Collaborate closely with analytics, AI/ML, and product teams
- Grow rapidly within a technically strong, delivery-oriented environment
As a Data Engineer, you will:
- Build, optimise, and maintain scalable data pipelines (Java primary, with Python exposure)
- Develop real-time streaming and event-driven integrations
- Integrate data from multiple sources including streaming, batch, and APIs
- Work with managed cloud services such as Kinesis, MSK, Lambda, and Glue
- Contribute to data modelling, architecture patterns, and engineering best practice
- Ensure data quality, lineage, governance, and security controls are embedded
- Deploy and maintain data applications using CI/CD tooling
- Use Infrastructure as Code to manage cloud environments
- Collaborate across engineering, analytics, and product teams
- Support and guide junior engineers where appropriate
Working as a Data Engineer, you will gain exposure to:
- Large-scale streaming and event-driven data platforms
- Modern data architectures such as medallion and domain-oriented patterns
- Enterprise data platforms such as Databricks, Snowflake, or equivalent
- CI/CD pipelines and Infrastructure-as-Code practices
- Client-facing delivery within regulated and security-conscious environments
- Engineering teams delivering real-time, low-latency systems
Why Join?
- Work within a modern Advanced Technology Centre in Newcastle
- Deliver data solutions that support analytics, AI, and operational insight
- Develop strong cloud and streaming data engineering skills
- Clear progression across senior technical pathways
- Hybrid working model balancing flexibility with collaboration
Requirements
- Strong programming experience in Java (preferred) or Python
- Experience building ETL/ELT or streaming data pipelines
- Hands-on experience with Kafka, Flink, or Spark (streaming experience preferred)
- Good understanding of stream-processing concepts (event time, state, backpressure)
- Experience with cloud platforms (AWS preferred; Azure or GCP also considered)
- Knowledge of software engineering best practices (testing, CI/CD, Git)
- Experience with containerised workloads (Docker, Kubernetes/EKS)
- Comfort working in Agile delivery environments
- Strong communication skills across technical and non-technical stakeholders
- Eligibility to work on projects requiring UK Government BPSS and active SC clearance