Data Engineer

Big Red Recruitment
Bradford, United Kingdom
10 days ago

Role details

Contract type
Temporary to permanent
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£60K

Job location

Bradford, United Kingdom

Tech stack

Microsoft Excel
Azure
Data Architecture
Information Engineering
ETL
Data Transformation
Data Systems
SQL Databases
Data Lake
Reporting Tools
Data Pipelines
Legacy Systems
Databricks

Job description

You will be working with a major UK-based organisation that operates across a portfolio of well-known brands, delivering at scale within a complex distribution environment. With deeply integrated ERP and finance systems, the business is now undergoing a significant shift in how it uses data. Following recent system changes, there is a clear focus on modernising reporting and building a more transparent, reliable data landscape that can support future growth.

Project overview

This programme is focused on rebuilding critical financial reporting that has been disrupted, including the complex margin and rebate calculations that sit at the heart of commercial performance. Alongside this, you will help design and deliver scalable data pipelines, transformation logic, and reporting outputs within a modern Azure environment. The work will establish a repeatable blueprint that can be rolled out across multiple business units, playing a key role in the transition away from legacy systems to a clean, future-ready data platform.

What you will be doing

You will take ownership of building and maintaining robust data pipelines in Azure, working hands-on to deliver reliable, high-performance data solutions. This includes developing workflows in Databricks and/or Synapse, and transforming data from a range of sources, such as CSV and Excel, into clean, usable datasets. A key part of the role will involve translating complex pricing and margin logic into scalable data models, ensuring accuracy and consistency across reporting. You will also focus on improving data quality, validation processes, and overall performance, while supporting the move away from legacy reporting tools. Alongside this, you will contribute to the design of a scalable data architecture that can be leveraged across multiple business units.
Tech environment

You will be working across a modern Azure stack including Data Lake, Synapse, Databricks, Data Factory, and SQL, with exposure to complex enterprise data environments.

Requirements

We are looking for someone with strong experience across Azure data engineering tools and a proven track record of building ETL or ELT pipelines in production environments. You should have solid SQL and data transformation skills, along with experience handling large and complex datasets. Equally important is your ability to work closely with analysts and business stakeholders, translating requirements into effective data solutions. Experience working with ERP or finance data would be highly beneficial, as would any background in transformation or migration programmes where legacy systems are being modernised.

Apply for this position