Data Engineer

Connells Group
Central Milton Keynes, United Kingdom

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Central Milton Keynes, United Kingdom

Tech stack

Agile Methodologies
Amazon Web Services (AWS)
Confluence
JIRA
Azure
Cloud Computing
Cloud Engineering
Continuous Integration
Data Architecture
Information Engineering
Data Systems
Data Warehousing
GitHub
Python
SQL Databases
Data Streaming
Data Ingestion
Spark
Data Strategy
Git
Data Lake
Core Data
Information Technology
Data Pipelines

Job description

We're looking for a talented and passionate Core Data Engineer to join our Group Technology team in Milton Keynes. You'll play a key role in delivering Connells Group Reporting Data, covering architecture, data modelling, design and pipelines. You'll work closely with technical specialists to innovate, share ideas and continually enhance team capability. Your work will support strategic decision-making across the Group by delivering accurate, timely reporting for all brands and business units.

We offer a hybrid working arrangement with 1 day per week in our Milton Keynes office.

Key Responsibilities

  • Apply best practices for data design, ensuring scalable, high-quality and consistent architecture and modelling.
  • Provide timely and accurate root-cause analysis and resolution within SLA timeframes.
  • Work with the Core Data Lead to build a unified team work plan aligned with business objectives and data initiatives.
  • Collaborate with Reporting Engineers to define, develop and enhance the Common Data Model.

Team Roles & Responsibilities

  • Work within the overall data architecture, ensuring that it aligns with the business's data strategy, scalability and future requirements. Continuously optimise for improved data flow, accessibility and security.
  • Design and develop data ingestion processes, integrating multiple data sources. Maintain the Common Data Model to ensure organisation-wide consistency.
  • Apply Agile principles for iterative and collaborative development.
  • Ensure data pipeline quality, reliability and performance. Develop, test and implement monitoring to ensure effective operation.
  • Support cross-functional data projects, providing expertise as required.
  • Deliver data solutions that support project goals and business outcomes.
  • Proactively monitor systems and pipelines, identifying issues early and responding promptly to minimise disruption.

Requirements

  • Proven experience in Data Engineering, with strong hands-on experience in Python, data modelling and data warehousing.
  • Strong background in resolving incidents, requests and changes, and in problem-solving within SLAs.
  • Hands-on experience with Spark, Data Architecture, SQL and Delta Lake.
  • Cloud development experience (AWS, GCP or Azure).
  • Working knowledge of Medallion Architecture.
  • Demonstrated capability in implementing and supporting pipelines in demanding environments.
  • Strong communication skills and confidence presenting ideas and technical approaches.
  • Willingness to learn, adopt and improve best practices and standards.
  • Ability to work effectively in complex, high-pressure environments using both legacy and modern technologies.
  • Strong analytical thinking and attention to detail.

Desirable

  • Experience with Fabric, Azure, JIRA, Confluence, CI/CD, Git, GitHub and GitHub Actions.
  • Certifications in Data Engineering, Cloud, Data Modelling or Data Architecture.
  • Experience with development lifecycle processes for data pipelines.
  • STEM degree (Computer Science, Mathematics, Engineering, Physics) or equivalent practical experience.
