Data Engineer

Solvace
Manchester, United Kingdom
2 days ago

Role details

Contract type
Temporary to permanent
Employment type
Part-time (≤ 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
£130K

Job location

Remote
Manchester, United Kingdom

Tech stack

.NET
Artificial Intelligence
Amazon Web Services (AWS)
Apache HTTP Server
Code Coverage
Databases
Data as a Service
Information Engineering
ETL
Cursor (AI code editor)
Linux
Digital Data
GitHub
Python
PostgreSQL
Microsoft SQL Server
DataOps
Statistical Process Control (SPC)
SQL Databases
Data Processing
Spark
Change Data Capture
Backend
Data Lake
Kubernetes
Apache Flink
Real Time Data
Kafka
Data Management
Terraform
Stream Processing
Data Pipelines
Legacy Systems
Databricks
Microservices

Job description

This is an entirely forward-looking role. While an active platform modernisation is underway (the engineering team is migrating a .NET platform from SQL Server to PostgreSQL and containerising for Kubernetes), this role's primary focus is building new data pipelines and analytics infrastructure, not maintaining legacy systems. That said, collaboration with the platform engineering team is important, as the data pipelines need to ingest from both the existing SQL Server estate and the new PostgreSQL databases. Experience with Weaviate or another vector database is also relevant: one is currently used for semantic search in the AI agent layer.
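
As a sketch of what bridging both estates might look like: two in-memory sqlite3 databases stand in for the SQL Server and Aurora PostgreSQL connections (real pipelines would use the appropriate drivers, e.g. pyodbc and psycopg2), and the table and column names are purely illustrative:

```python
import sqlite3

def ingest(conn: sqlite3.Connection, source: str) -> list[dict]:
    """Pull rows from one engine and tag them with their origin."""
    rows = conn.execute("SELECT id, metric, value FROM quality_checks").fetchall()
    return [{"source": source, "id": r[0], "metric": r[1], "value": r[2]} for r in rows]

# Stand-ins for the legacy SQL Server estate and the new PostgreSQL target.
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE quality_checks (id INTEGER, metric TEXT, value REAL)")
legacy.execute("INSERT INTO quality_checks VALUES (1, 'defect_rate', 0.02)")

modern = sqlite3.connect(":memory:")
modern.execute("CREATE TABLE quality_checks (id INTEGER, metric TEXT, value REAL)")
modern.execute("INSERT INTO quality_checks VALUES (2, 'defect_rate', 0.01)")

# A pipeline run unions both estates into one raw feed.
combined = ingest(legacy, "sqlserver") + ingest(modern, "postgres")
```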

Why Join?

  • Greenfield data platform - the entire data engineering practice needs to be built from scratch. A lakehouse POC is being evaluated, but the production architecture is yours to define - patterns, tooling, and platform decisions from day one

  • Direct AI impact - every pipeline you build directly unblocks AI agent capabilities. The KAI copilot is live in production and evolving towards agentic AI - autonomous agents that take actions, orchestrate workflows, and operate across systems. This role provides the data foundation that makes those agentic capabilities possible

  • Event streaming architecture - opportunity to design and implement a Kafka-based event backbone, shaping the platform's real-time data architecture

  • Manufacturing AI - opportunity to work at the intersection of industrial operations and generative AI, a domain with massive untapped potential. Rich, real-world data: quality inspections, KPIs, maintenance records, safety observations, process parameters across global manufacturing sites

  • Small team, high autonomy - the Innovation Hub operates with startup-level autonomy inside a funded, growing company

  • Leadership trajectory - the data function is critical to the platform and will grow. As an early hire, you have an open career path that grows with the function - earned, not guaranteed
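
The event-backbone bullet above leans on a core Kafka property: ordering is guaranteed only within a partition, so keying events by entity keeps each entity's history in order. A minimal pure-Python sketch of that idea (no broker involved; the simple byte-sum partitioner stands in for Kafka's murmur2 hash, and the event names are invented):

```python
def partition_for(key: str, num_partitions: int) -> int:
    # Kafka's default partitioner hashes the key and mods by partition
    # count; a stable byte-sum hash stands in for murmur2 here.
    return sum(key.encode()) % num_partitions

NUM_PARTITIONS = 3
partitions = {p: [] for p in range(NUM_PARTITIONS)}

# Events for one work order always share a key, so they always land on
# the same partition -- and Kafka preserves order within a partition.
events = [
    ("order-17", "created"),
    ("order-42", "created"),
    ("order-17", "inspected"),
    ("order-17", "closed"),
]
for key, event in events:
    partitions[partition_for(key, NUM_PARTITIONS)].append((key, event))

# Reading back, order-17's history is intact regardless of how other
# keys were spread across partitions.
order_17 = [e for part in partitions.values() for k, e in part if k == "order-17"]
```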

Requirements

  • Python - primary language for data pipelines, scripting, and integration with the AI stack. This is a Python-first team

  • SQL (SQL Server + PostgreSQL) - deep experience required across both engines. The platform runs a large-scale multi-tenant database estate with per-client schemas (local, global, corporate). The new platform targets Aurora PostgreSQL, with several modules already migrated. Pipelines must bridge both.

  • Databricks / Apache Spark - a lakehouse POC is being evaluated with Unity Catalog and Delta Sharing via AWS PrivateLink. Experience with Databricks or equivalent analytics platforms is important for assessing and scaling the right approach

  • Apache Kafka - event streaming experience is essential. The platform is evaluating Kafka as the strategic backbone for ordered event logs, consumer group replay, and guaranteed partition ordering as the microservices fleet scales

  • ETL/ELT pipeline design - building and maintaining pipelines into a medallion architecture (bronze/silver/gold layers) on Databricks or equivalent analytics platform

  • Delta Lake - understanding of lakehouse table formats, time travel, and ACID transaction patterns

  • AWS data services - RDS, S3, Lambda, Glue or equivalent. The platform runs entirely on AWS

  • Linux / CLI proficiency - must be comfortable working on the command line
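
The medallion layering above can be sketched without Spark. In production the layers would be Delta tables on Databricks; plain Python stands in here, and the site and metric names are made up:

```python
# Bronze: raw records exactly as ingested (duplicates, nulls and all).
bronze = [
    {"site": "MCR", "metric": "defect_rate", "value": 0.02},
    {"site": "MCR", "metric": "defect_rate", "value": 0.02},  # duplicate
    {"site": "LYO", "metric": "defect_rate", "value": None},  # bad record
    {"site": "LYO", "metric": "defect_rate", "value": 0.04},
]

# Silver: cleaned and deduplicated, conformed to a known schema.
seen, silver = set(), []
for row in bronze:
    key = (row["site"], row["metric"], row["value"])
    if row["value"] is not None and key not in seen:
        seen.add(key)
        silver.append(row)

# Gold: business-level aggregates ready for analytics or the AI layer.
gold: dict = {}
for row in silver:
    gold.setdefault(row["site"], []).append(row["value"])
gold = {site: sum(vals) / len(vals) for site, vals in gold.items()}
```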

AI-Assisted Development Methodology

Solvace is transitioning towards AI-assisted development as a core engineering practice. Candidates should demonstrate:

  • Hands-on experience with AI coding tools - Claude Code, OpenAI Codex, GitHub Copilot, Cursor, or similar. We're looking for engineers who have integrated these tools into their professional workflow, not just experimented casually

  • Spec-driven development - ability to write clear technical specifications that can be used to drive both human and AI-assisted implementation, with strong evaluation criteria and test coverage

  • Portfolio evidence - professional projects or side projects that demonstrate AI-assisted development practices. Contributions to or experimentation with emerging projects like OpenClaw are a strong signal

  • Testing and evaluation rigour - experience building robust test suites, automated quality gates, and evaluation frameworks that ensure AI-assisted code meets production standards
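
As a concrete (hypothetical) illustration of the spec-driven and quality-gate points above: the spec is stated up front, and the acceptance checks double as the gate any implementation - human- or AI-written - must pass. `parse_shift` and its behaviour are invented for this example:

```python
# Spec: parse_shift("06:00-14:00") returns (start, end) as minutes since
# midnight; any malformed input raises ValueError. The spec doubles as
# the evaluation criteria for a human- or AI-assisted implementation.

def parse_shift(spec: str) -> tuple[int, int]:
    try:
        start, end = spec.split("-")
        sh, sm = map(int, start.split(":"))
        eh, em = map(int, end.split(":"))
    except Exception as exc:
        raise ValueError(f"bad shift spec: {spec!r}") from exc
    return sh * 60 + sm, eh * 60 + em

# Quality gate: assertions every candidate implementation must pass.
assert parse_shift("06:00-14:00") == (360, 840)
try:
    parse_shift("nonsense")
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```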

Nice-to-Have

  • Apache Flink - for real-time stream processing as the platform moves from batch to streaming analytics

  • Apache Iceberg - experience with open table formats alongside or as an alternative to Delta Lake

  • Go or Rust - valued as evidence of strong backend engineering and systems thinking, even though Python is the primary language

  • Multi-tenant data architectures - the platform runs per-client schemas with tenant isolation requirements. Understanding tenant isolation patterns is a significant advantage

  • Terraform / IaC - infrastructure is managed via Terraform; the Databricks POC has been codified in Terraform

  • dbt or similar transformation framework

  • Manufacturing / industrial data domain experience (sensor data, quality metrics, OEE, SPC)

  • CDC (Change Data Capture) - the platform is building CDC pipelines for the SQL Server to PostgreSQL transition
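
For the CDC bullet above, a toy capture-and-replay sketch: production CDC would typically be log-based (e.g. Debezium or native replication) rather than trigger-based, but sqlite3 triggers show the shape - changes are captured in order at the source and replayed onto the target. All table names are illustrative:

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE parts (id INTEGER PRIMARY KEY, qty INTEGER)")
src.execute("CREATE TABLE changelog (op TEXT, id INTEGER, qty INTEGER)")
# Triggers capture every change as it happens -- the "change data".
src.execute("""CREATE TRIGGER cap_ins AFTER INSERT ON parts
               BEGIN INSERT INTO changelog VALUES ('I', NEW.id, NEW.qty); END""")
src.execute("""CREATE TRIGGER cap_upd AFTER UPDATE ON parts
               BEGIN INSERT INTO changelog VALUES ('U', NEW.id, NEW.qty); END""")

src.execute("INSERT INTO parts VALUES (1, 10)")
src.execute("UPDATE parts SET qty = 12 WHERE id = 1")

# Replay the captured changes onto the target, in order.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE parts (id INTEGER PRIMARY KEY, qty INTEGER)")
for op, pk, qty in src.execute("SELECT op, id, qty FROM changelog"):
    if op == "I":
        dst.execute("INSERT INTO parts VALUES (?, ?)", (pk, qty))
    else:
        dst.execute("UPDATE parts SET qty = ? WHERE id = ?", (qty, pk))
```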

Apply for this position