Senior Data Architect
Role: Senior Data Architect
Location: London, UK - 2 days onsite & 3 days remote
Work Mode: Hybrid
Engagement Type: Contract (time-bound)
This contract Data Architect role is focused on stabilising, documenting, and enabling BAU operation of critical data engineering pipelines within a complex healthcare data estate.
The contractor will operate in a delivery-led, time-bound engagement, supporting Phase 1 and Phase 2 pipeline operations by producing clear architectural artefacts, data flow analysis, and operationally usable documentation that reduce risk, enable service continuity, and support handover into managed BAU support.
The role is hands-on and outcomes-driven, requiring the contractor to be productive from day one, work with partially documented systems, and extract critical knowledge from data engineering SMEs.
Objectives for the Contract
The Data Architect contractor will be accountable for:
- Rapidly understanding how critical data pipelines work today
- Making implicit and tribal knowledge explicit
- Producing architecture and data flow artefacts that are:
  - Operable
  - Supportable
  - Transferable into BAU
- Enabling service readiness rather than defining long-term future-state platforms

This role does not own platform modernisation and must operate within clearly defined project boundaries.
Key Deliverables & Responsibilities
Critical Pipeline Architecture Support
- Analyse and document existing critical data engineering pipelines.
- Contribute directly to:
  - Critical Pipeline Inventory
  - Service criticality and operational risk assessments
- Identify:
  - Architectural dependencies
  - Single points of failure
  - Knowledge concentration risks
- Ensure architectural documentation aligns with operational continuity needs, not just design intent.
Data Flow & Source System Analysis
- Produce end-to-end data flow mappings for critical pipelines, covering:
  - Source systems and data collection mechanisms
  - Processing stages and centralised transformations
  - Intermediate storage and handoffs
  - Downstream consumers and dependencies
- Document:
  - Submission triggers (scheduled, event-driven, manual)
  - Data volumes, cadence, and SLA sensitivities
  - Source system availability and dependency risks
- Explain and contextualise the current heavy central processing model, identifying:
  - Statutory or regulatory drivers
  - Legacy or convenience-driven complexity
Business Logic & Architectural Knowledge Capture
- Capture and structure business rules, transformations, validations, and aggregations per critical pipeline.
- Clearly distinguish:
  - Regulatory/statutory logic
  - Operationally required logic
  - Legacy technical debt
- Populate and maintain a Business Logic Repository suitable for:
  - Incident resolution
  - Knowledge transfer
  - Ongoing BAU support
BAU Service Model Enablement
- Ensure architectural outputs support the defined BAU service model, including:
  - Operational playbooks
  - Incident response procedures
  - Escalation and dependency clarity
- Provide architectural input into:
  - SLA/OLA alignment
  - Change management and release controls
  - Disaster recovery (DR) robustness assessments
- Validate architectural readiness for handover into managed operations.
Knowledge Transfer & Handover Support
- Actively support knowledge transfer activities with:
  - Healthcare data pipeline SMEs
  - Operational and support teams
- Ensure all architectural artefacts are:
  - Clear
  - Complete
  - Validated with SMEs
- Support shadow and supervised operation periods as required.
- Enable formal handover and sign-off for BAU readiness.
Pipeline Improvement Identification (Architectural Input)
- Identify pipeline-level improvement opportunities with an operations-first mindset.
- Assess recommendations against:
  - Immediate operational benefit
  - Risk reduction
  - Alignment with platform modernisation boundaries
- Support classification of improvements as:
  - Tactical (Do Now)
  - Strategic (Defer to Platform Modernisation)
- Ensure no recommendations create rework or conflict with future platform changes.
Required Contracting Profile
Essential Experience
- Proven experience as a Data Architect on complex legacy data estates.
- Strong background working with:
  - Data engineering pipelines
  - Batch/scheduled processing
  - Centralised data processing models
- Demonstrated ability to:
  - Rapidly understand undocumented or poorly documented systems
  - Produce operationally usable architecture and data flow artefacts
- Experience supporting BAU operations, service transition, or operational handover.
- Comfortable working under time-boxed delivery constraints with minimal onboarding.