Senior Data Engineer - OMNI & ODS Platforms

ETeam Inc
Glasgow, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
£105K

Job location

Glasgow, United Kingdom

Tech stack

Java
API
Computing Platforms
Build Automation
Cloud Engineering
Code Review
Continuous Integration
Data as a Service
Data Architecture
Information Engineering
Data Governance
Data Infrastructure
Data Stores
Data Systems
DevOps
Distributed Data Store
Distributed Systems
Event-Driven Programming
Fault Tolerance
Python
MongoDB
NoSQL
Operational Data Store
Secure Coding
Data Streaming
Datadog
Data Logging
Data Ingestion
Database Optimization
Spring Boot
Git
Event-Driven Architecture
Containerization
Kubernetes
Infrastructure Automation Frameworks
Apache Flink
Integration Frameworks
Kafka
Spark Streaming
Data Management
API Design
REST
Stream Processing
Software Version Control
Docker
Microservices

Job description

We are seeking a highly skilled Senior Data Engineer to join the Development & Engineering team, working on the OMNI and Operational Data Store (ODS) platforms. This role is heavily data-centric, focused on building and evolving scalable, event-driven data platforms and back-end services that power downstream analytics, operational use cases, and real-time integrations.

You will work closely with internal engineering teams, contributing to architecture discussions, design reviews, and shared delivery ownership, ensuring data solutions are secure, resilient, observable, and aligned with enterprise standards.

Data Platform & Engineering

Design, develop, and enhance data-driven services on the OMNI and ODS platforms using Java (Spring Boot) and/or Python. Build event-driven and streaming data pipelines supporting real-time and near-real-time data processing. Develop and maintain microservices and data APIs that are scalable, resilient, and performance-optimised. Implement data ingestion, transformation, enrichment, and persistence layers, ensuring data quality, consistency, and reliability. Work with MongoDB and other data stores, applying appropriate NoSQL data modelling and indexing strategies.
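Purely as an illustration of the layering described above (not part of the role description), the ingestion, transformation, enrichment, and persistence stages can be sketched as small composable functions in Python. The stage names and the in-memory "store" are hypothetical; a real service would persist to MongoDB and validate against a schema.

```python
# Illustrative sketch of an ingest -> transform -> enrich -> persist
# pipeline. Names are hypothetical; the dict "store" stands in for a
# real data store such as MongoDB.

def ingest(raw_records):
    """Drop records that fail a basic quality check (missing id)."""
    return [r for r in raw_records if "id" in r]

def transform(records):
    """Normalise field names and types."""
    return [{"id": r["id"], "amount": float(r.get("amt", 0))} for r in records]

def enrich(records, reference):
    """Join each record with reference data (e.g. a currency lookup)."""
    return [{**r, "currency": reference.get(r["id"], "GBP")} for r in records]

def persist(records, store):
    """Upsert by id, so reprocessing the same batch is safe."""
    for r in records:
        store[r["id"]] = r
    return store

store = {}
raw = [{"id": "a", "amt": "2.5"}, {"amt": "9"}, {"id": "b", "amt": "1"}]
persist(enrich(transform(ingest(raw)), {"a": "USD"}), store)
```

Because `persist` upserts by id, re-running the pipeline over the same batch leaves the store unchanged, which is the reliability property the role calls for.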

Streaming & Distributed Data Systems

Build and operate distributed data pipelines using Apache Kafka and related ecosystem components. Design robust event schemas and messaging patterns to support decoupled, scalable data flows. Address challenges related to ordering, idempotency, replayability, and fault tolerance in streaming systems.
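The idempotency and replayability concerns above can be sketched, again purely illustratively, with a broker-free Python example: deduplicating events by ID means a replayed batch is applied at most once. In production this "seen" state would live in a durable store rather than in memory.

```python
# Minimal sketch of idempotent event processing: a processed-ID set
# guards against duplicate deliveries when a stream is replayed.

def process_stream(events, state=None):
    """Apply each event at most once, keyed by its 'id' field."""
    state = state if state is not None else {"seen": set(), "total": 0}
    for event in events:
        if event["id"] in state["seen"]:
            continue  # duplicate delivery or replay: skip
        state["seen"].add(event["id"])
        state["total"] += event["amount"]
    return state

batch = [{"id": "e1", "amount": 10}, {"id": "e2", "amount": 5}]
state = process_stream(batch)
state = process_stream(batch, state)  # replaying the batch is a no-op
```

This is the same property at-least-once messaging systems such as Kafka push onto consumers: deliveries may repeat, so processing must be safe to repeat.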

Architecture, Security & Resilience

Contribute to data architecture and platform design, ensuring alignment with enterprise standards and long-term scalability. Ensure all data solutions follow secure coding and data protection practices, including authentication, authorisation, and encryption where applicable. Implement strong error handling, logging, monitoring, and alerting to support operational excellence.

DevOps & Delivery

Apply DevOps best practices across the data engineering life cycle:
- Git-based version control and code reviews
- CI/CD pipelines for data services and pipelines
- Automated builds, testing, and deployments

Use containerisation and orchestration technologies (e.g. Docker, Kubernetes) to support cloud-native, scalable data platforms. Support production stability, proactively identifying and addressing performance, scalability, and reliability bottlenecks.

Collaboration & Ways of Working

Participate in joint design sessions, architecture discussions, and technical reviews with the client's internal engineering teams. Share data engineering knowledge and best practices, contributing to a strong engineering culture. Take shared ownership of delivery outcomes from design through to production support.

Requirements

The successful candidate will play a key role in designing, developing, and optimising data ingestion, streaming, transformation, and persistence layers, leveraging modern microservices architectures and data streaming technologies. The role requires strong hands-on engineering skills combined with a DevOps mindset and the ability to collaborate effectively across teams.

Essential:
- Strong back-end engineering experience with Java (Spring Boot) and/or Python in a data-centric environment.
- Proven experience designing and building data-driven microservices architectures.
- Hands-on experience with event-driven systems and data streaming, ideally using Apache Kafka.
- Solid experience with MongoDB and a strong understanding of NoSQL data modelling principles.
- Strong knowledge of API design (RESTful services) and data integration patterns.
- Experience building and operating distributed systems, with a focus on scalability, resilience, and performance.
- Solid understanding of secure coding practices, data protection, and access controls.
- Practical experience with DevOps practices, including CI/CD pipelines, containerisation, and infrastructure automation.
- Strong problem-solving skills, with high attention to data quality, reliability, and code standards.
- Excellent communication skills and the ability to operate effectively in cross-functional engineering teams.

Desirable:
- Experience with stream and data processing frameworks such as Kafka Streams, Apache Flink, or Spark Streaming.
- Exposure to cloud platforms and managed data or streaming services.
- Familiarity with observability tooling (metrics, logs, traces) in distributed data systems.
- Experience working within large enterprise or regulated environments, such as financial services.
- Understanding of data governance, lineage, and data quality controls.

About the company

Glasgow, Scotland - £402/day (must be PAYE through umbrella) - Contract
Posted by: eTeam Workforce Limited
Posted: Wednesday, 22 April 2026

We are a global recruitment specialist providing support to clients across EMEA, APAC, the US, and Canada. We have an excellent job opportunity for you.

Apply for this position