Data Analytics Engineer - Azure, Fabric, SQL, PySpark/Python
Contract (Temp to Perm)
- Start date: ASAP
- Contract length: 3 months initial (with potential to convert to permanent)
- Location: Fully remote (occasional travel to Flintshire and London may be required)
- Rate: Up to £700 per day (Umbrella)
Overview
We are looking for an experienced Data Analytics Engineer to join a growing Data Centre of Excellence, with the possibility of converting to a permanent role once the initial contract concludes.
This is a founding-style position where you'll help establish scalable analytics foundations, enabling consistent, high-quality insight delivery across the business.
You'll work closely with insight analysts and technology teams to create reusable metrics, robust semantic models, and business-ready datasets within a modern data platform environment (Microsoft Fabric).
What the Data Analytics Engineer will be doing
- Building and optimising semantic models (star schemas, measures, calculation patterns)
- Creating and maintaining a reusable KPI and metrics library with clear definitions and ownership
- Delivering business-ready "gold" datasets aligned to medallion architecture principles
- Defining analytics engineering standards and ways of working (coding standards, documentation, testing, release practices)
- Establishing data contracts, versioning, and change control
- Partnering with data engineers to ensure data availability, reliability, and performance
- Enabling analysts through templates, exemplars, and best-practice guidance
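To illustrate the kind of reusable KPI and metrics library described above, here is a minimal sketch in plain Python. All metric names, formulas, and owners below are hypothetical examples, not definitions from this role or any specific platform:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    """A single governed KPI: one name, one canonical formula, one owner."""
    name: str
    definition: str       # business-readable description
    sql_expression: str   # canonical calculation, reused across all reports
    owner: str            # accountable team or person
    version: str = "1.0"  # bumped under change control


# Central registry so every report and semantic model pulls the same definition.
METRICS = {
    m.name: m
    for m in [
        Metric(
            name="gross_margin_pct",
            definition="Gross profit as a percentage of net revenue",
            sql_expression="SUM(gross_profit) / NULLIF(SUM(net_revenue), 0) * 100",
            owner="Finance Analytics",
        ),
        Metric(
            name="on_time_delivery_rate",
            definition="Orders delivered by the promised date over total orders",
            sql_expression="SUM(on_time_flag) / NULLIF(COUNT(order_id), 0)",
            owner="Supply Chain Analytics",
        ),
    ]
}


def get_metric(name: str) -> Metric:
    """Look up a metric by name; raises KeyError for undefined KPIs."""
    return METRICS[name]
```

In practice such a registry would live alongside the semantic model (for example, generating DAX measures or SQL views from the canonical expressions), so that definitions and ownership stay in one version-controlled place.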
Skills & experience for the Data Analytics Engineer
- Analytics or BI engineering background
- Advanced SQL with proven semantic modelling experience
- Understanding of dimensional modelling and star schemas
- Experience working with modern data architectures (lakehouse/medallion concepts)
- Comfortable operating in pre-production environments and setting standards in a greenfield context
- Microsoft Fabric experience (Lakehouse/Warehouse, pipelines, notebooks, semantic models) and Azure
- PySpark/Python exposure
- Experience in commercial, supply chain, or manufacturing domains (healthcare a plus)
- Previous experience with Snowflake is desirable
Guidant, Carbon60, Lorien & SRG - The Impellam Group Portfolio are acting as an Employment Business in relation to this vacancy.