Data Architect - Bristol - Hybrid Opportunity

Hays plc
Bradley Stoke, United Kingdom

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Remote
Bradley Stoke, United Kingdom

Tech stack

Artificial Intelligence
Data analysis
Application Lifecycle Management
Azure
Big Data
Data Architecture
Information Engineering
Data Infrastructure
ETL
Data Systems
Distributed Systems
Fraud Prevention and Detection
GitHub
Machine Learning
Metadata Standards
Feature Engineering
Data Ingestion
Spark
Data Strategy
Microsoft Fabric
Data Lake
Data Lineage
Deployment Automation
Bicep
Machine Learning Operations
Terraform
Data Pipelines
Databricks

Job description

They are a specialist insurance and risk solutions provider, supporting clients with tailored coverage and expert advice across a range of sectors. The business is known for its client-focused approach, strong market relationships and commitment to delivering practical, dependable solutions. With a collaborative culture and a focus on professional development, they offer a supportive environment where people are trusted, valued and encouraged to grow their careers within a forward-thinking organisation.

Your new role

As a Data Architect, you'll play a key role in shaping how data is designed, managed and used across the business. You'll set the architectural direction for the data estate - from the point data first lands on the platform, through the Bronze, Silver and Gold layers of the Medallion Architecture, and all the way to analytics, AI and self-service reporting. Working within the Microsoft Azure and Databricks ecosystem, you'll help build a data platform that's scalable, flexible and built to last. Your work will directly support high-impact use cases, including advanced analytics, pricing models, AI/ML solutions and regulatory reporting - ensuring teams across the business can trust and use data with confidence.

You'll collaborate closely with colleagues in Data Engineering, Data Science, Pricing, Platform Engineering and MLOps, acting as a trusted partner and technical authority. Together, you'll define clear standards for how data is structured, governed and consumed, helping the organisation grow without unnecessary complexity or fragmentation. This is a great opportunity for someone who enjoys balancing strategic thinking with hands-on collaboration - influencing how data is used across the business as it continues its journey towards its Top 5 in 5 ambition.
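The Bronze / Silver / Gold flow described above can be sketched in miniature. A real pipeline would run on Databricks against Delta Lake tables; plain Python dicts stand in here, and all field names and cleansing rules are illustrative assumptions, not details from the posting:

```python
# Hypothetical sketch of a Medallion-style flow: raw Bronze records are
# cleansed into Silver, then aggregated into a Gold data product.

def to_silver(bronze_rows):
    """Cleanse raw Bronze records: drop incomplete rows, normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("policy_id") is None or row.get("premium") is None:
            continue  # a real pipeline would quarantine these for review
        silver.append({
            "policy_id": str(row["policy_id"]),
            "premium": float(row["premium"]),
            "region": (row.get("region") or "UNKNOWN").upper(),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate Silver into a Gold product: premium totals by region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["premium"]
    return totals

bronze = [
    {"policy_id": 101, "premium": "250.0", "region": "sw"},
    {"policy_id": None, "premium": "99.0", "region": "sw"},   # incomplete, dropped
    {"policy_id": 102, "premium": "175.5", "region": "sw"},
    {"policy_id": 103, "premium": "300.0", "region": "ne"},
]
print(to_gold(to_silver(bronze)))  # {'SW': 425.5, 'NE': 300.0}
```

The point of the layering is that each stage has one job: Bronze preserves what landed, Silver enforces quality and types, and Gold is shaped for consumption by analytics, pricing and AI/ML workloads.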
What you'll need to succeed & What you'll be responsible for

Data Architecture & Modelling
Define and own the architectural principles, standards and policies governing SBG's data estate from the landing zone through to the Gold layer.
Design and govern the Medallion Architecture (Bronze / Silver / Gold), ensuring every layer is built for analytics, AI/ML and self-service consumption.
Own data modelling standards - conceptual, logical and physical - and ensure models are fit for both regulatory reporting and AI-driven insight.
Define Unity Catalog structure, metadata standards and data lineage governance across the estate.

Data Ingestion & Processing
Define ingestion standards and data contracts for data arriving from the landing zone into the Bronze layer, working in partnership with the Development and Application Management team.
Design and optimise ETL/ELT pipeline frameworks using Databricks, Delta Lake and Azure Data Factory.
Ensure Silver and Gold layer data products are fit for purpose for analytics, pricing, AI and ML model consumption.
Optimise data pipelines for efficiency, cost-effectiveness and high performance, leveraging Databricks for big data processing and machine learning.

Governance & Standards
Act as the architectural authority for the data estate - reviewing designs, enforcing standards and preventing platform fragmentation as SBG scales.
Ensure all data architecture decisions align with regulatory requirements - FCA, GDPR, Solvency II, IFRS 17 and BCBS 239.
Define and maintain data architecture policies and guidelines ensuring long-term scalability and sustainability.

Analytics & AI Enablement
Design the Gold layer to ensure data products are structured, documented and accessible for self-service analytics and AI/ML model consumption.
Collaborate with MLOps and Data Science teams to define data product standards and feature engineering patterns.
Evaluate and lead adoption of emerging Azure and Databricks capabilities - including Microsoft Fabric, OneLake and DirectLake - where they advance the data architecture.
Drive innovation by evaluating and implementing emerging cloud-based data technologies to enhance SBG's competitive advantage.

Risk & Regulatory
Proactively identify, manage and mitigate architectural and data risks encountered in day-to-day delivery.
Act with integrity, adhering to FCA regulatory frameworks and ensuring the best possible outcomes for customers.
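The ingestion standards and data contracts mentioned above can be illustrated with a minimal sketch: a contract declares what the source delivery team must provide, and records are checked before they are admitted to the Bronze layer. The contract fields and rules here are hypothetical, not taken from the posting:

```python
# Hypothetical data contract for records arriving from the landing zone.
# Field names and expected types are illustrative assumptions.
CLAIMS_CONTRACT = {
    "claim_id": str,
    "policy_id": str,
    "amount": float,
    "incident_date": str,  # ISO 8601 date expected from the source team
}

def validate(record, contract):
    """Return a list of contract violations; an empty list means accepted."""
    errors = []
    for field, expected_type in contract.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}, "
                          f"got {type(record[field]).__name__}")
    return errors

good = {"claim_id": "C-1", "policy_id": "P-9", "amount": 1200.0,
        "incident_date": "2024-05-01"}
bad = {"claim_id": "C-2", "amount": "1200"}  # wrong type, missing fields

print(validate(good, CLAIMS_CONTRACT))  # []
print(validate(bad, CLAIMS_CONTRACT))
```

In practice such contracts are versioned and agreed with the delivery team, so a schema change upstream surfaces as a contract violation at the Bronze boundary rather than as silent corruption in Silver or Gold.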

Requirements

Strong stakeholder management across business, IT and compliance teams.
Excellent communication, collaboration and influencing skills at all levels of an organisation.
Experience leading data architecture and engineering teams in an enterprise environment.
Ability to define and implement a data strategy aligned with business objectives.
Proven track record of delivering enterprise-scale data solutions with a focus on performance, security and scalability.
Experience in regulated financial services, ensuring compliance with industry standards.
Deep expertise in data modelling - conceptual, logical and physical.
Data warehousing and data lake architecture for high-performance analytics.
ETL/ELT pipeline development and optimisation to support large-scale data processing.
Data integration across structured and unstructured sources, ensuring high availability.
Metadata management and governance to maintain data quality and lineage.
Experience defining data contracts and ingestion standards between source delivery teams and the data estate.
Deep expertise in Microsoft Azure cloud services - ADF, ADLS, Synapse, Purview.
Databricks - Delta Lake architecture, optimisation and advanced data processing.
Apache Spark for large-scale distributed computing and performance tuning.
Microsoft Fabric - OneLake and DirectLake integration.
Azure Synapse Analytics for enterprise-scale data warehousing.
Infrastructure-as-Code (Terraform or Azure Bicep) to automate cloud deployments.
CI/CD pipelines with Azure DevOps or GitHub Actions for automated deployment of data pipelines.
MLOps best practices - MLflow, Databricks Model Serving, Feature Store.
Knowledge of IFRS 17, BCBS 239, UK Data Protection Act and Solvency II compliance.
Experience with pricing models, claims processing and fraud detection in the insurance sector.
Strong problem-solving skills and ability to translate business needs into technical solutions.
Ability to document and present complex data architectures to technical and non-technical stakeholders.

What you'll get in return

Hybrid working - 2 days in the office and 3 days working from home.
25 days annual leave, rising to 27 days after 2 years' service and 30 days after 5 years' service, plus bank holidays.
Discretionary annual bonus.
Pension scheme - 5% employee, 6% employer.
Flexible working - we will always consider applications from those who require less than the advertised hours.
Flexi-time.
Healthcare Cash Plan - claim cashback on a variety of everyday healthcare costs.
Electric vehicle salary sacrifice scheme.
Hundreds of exclusive retailer discounts.
Professional wellbeing, health & fitness app - Wrkit.
Enhanced parental leave, including time off for IVF appointments.
Religious bank holidays - if you don't celebrate Christmas and Easter, you can use these annual leave days on other occasions throughout the year.
Life Assurance - 4 times your salary.
25% car insurance discount.
20% travel insurance discount.
Cycle to Work Scheme.
Employee Referral Scheme.
Community support day.

Apply for this position