Data Architect
Job description
- Lead the design and development of a strategic data platform, supporting proof-of-concept builds and long-term architecture direction
- Define and implement data architecture standards, frameworks, and best practices across multiple teams
- Drive adoption of data governance, metadata management, and data quality frameworks
- Design scalable lakehouse architectures to support analytics, reporting, and downstream consumption
- Provide technical leadership on distributed data processing and real-time/event-driven architectures
- Collaborate with senior stakeholders to influence data strategy, platform investment, and architectural decisions
- Guide engineering teams on performance optimisation, scalability, and modern data engineering practices
- Ensure robust data security, access controls, and platform governance across cloud environments
Requirements
- Candidates must have recent experience working with enterprise-level financial services organisations (e.g. Tier 1 banks)
- Candidates must be based within a commutable distance of Central London due to the on-site requirement (2 days per week); non-local candidates will not be considered
This is a senior-level role focused on shaping and delivering a modern data platform, with responsibility for architecture design, governance frameworks, and proof-of-concept development. The successful candidate will play a key role in influencing enterprise-wide data strategy and enabling scalable, high-performance data solutions.

- Strong experience in cloud-based data architecture, with deep expertise in AWS data platforms (Glue, Lambda, S3, Redshift, Athena)
- Experience working with Databricks and modern lakehouse architectures (e.g. Delta Lake or Iceberg)
- Advanced proficiency in Python, PySpark, and SQL, with a focus on scalable and optimised data solutions
- Proven experience designing and implementing data models, including dimensional modelling and schema evolution
- Strong understanding of data governance, metadata management, and data quality frameworks
- Experience with distributed and real-time data processing (e.g. Kafka, Spark Streaming)
- Exposure to dbt and analytics engineering best practices
- Experience with containerisation technologies such as Docker
- Strong understanding of cloud security, IAM, and fine-grained data access controls (e.g. Immuta)
- Demonstrated ability to influence senior stakeholders and operate at an architectural level
Desirable Experience
- Experience working within large-scale, regulated financial services environments
- Exposure to data mesh principles and domain-driven data architectures
- Experience defining enterprise-wide data standards and operating models
Additional Information
- 2 days per week on-site in Central London is mandatory
- Candidates must be immediately available or on a short notice period
- Strong communication and stakeholder management skills are essential
If you're looking to play a key role in shaping a strategic data platform within a leading financial services organisation, this is an excellent long-term opportunity.