Data Architect

Stott and May
Charing Cross, United Kingdom
6 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Remote
Charing Cross, United Kingdom

Tech stack

Third Normal Form
API
Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Azure
Bash
Cloud Computing
Data Architecture
Data Systems
Data Vault Modeling
Data Warehousing
GitHub
Python
Machine Learning
Redis
SQL Databases
Cloud Monitoring
Snowflake
Grafana
Data Strategy
FastAPI
Data Lake
Data Management
Physical Data Models
Terraform
Serverless Computing
Docker

Job description

We are seeking an experienced Data Architect to work with a high-performing data team, designing and delivering advanced data solutions that support the future data strategy of the organisation. You will harness modern technologies including Data Vault, Snowflake, DBT, Airflow and AWS/Azure to shape scalable, future-proof data architectures.

In this role, you will be responsible for defining, owning and governing data models across delivery teams, ensuring alignment with enterprise architecture principles and business objectives. You will collaborate closely with engineers, product managers, data scientists and the wider architecture community to design innovative, high-value data products and services.

  • Triage new data requirements, assess their architectural impact and provide estimates for changes to data models.
  • Design, develop and enhance business and physical data models to meet evolving analytical and reporting needs.
  • Act as the custodian of data models, defining modelling standards and ensuring consistent adoption across delivery teams.
  • Create data architecture solutions that meet business needs and align with target-state architecture.
  • Document, communicate and centrally manage data models and associated artefacts.
  • Collaborate with the wider architecture community to promote alignment, reuse and innovation.
  • Apply strong expertise in data warehousing and end-to-end analytics architecture to bring structure and clarity to reporting environments.
  • Design and implement data models using methodologies such as 3NF, Dimensional and Data Vault.
  • Stay up to date with industry advancements, including machine learning and modern algorithmic techniques at scale, and embed best practice into data design.
  • Influence engineering and product teams to adopt robust data modelling and architecture standards.
  • Develop a deep understanding of business processes and the data they generate, identifying opportunities to structure and use data to drive business value.
  • Provide technical leadership and mentorship to data engineers.
  • Partner with data scientists to productionise research models on Snowflake and AWS.
  • Engage with product and business stakeholders to align data and AI solutions with enterprise strategy.

Requirements

  • Strong experience as a Data Architect or senior Data Modeller within complex data and analytics environments.
  • Deep expertise in data warehousing and end-to-end analytics architecture.
  • Proven experience with data modelling methodologies including 3NF, Dimensional and Data Vault.
  • Strong understanding of modern data platforms and cloud-based architectures.
  • Experience designing data solutions that incorporate machine learning or advanced analytics use cases.
  • Excellent communication and influencing skills, with the ability to engage both technical and non-technical stakeholders.
  • Strong stakeholder management and cross-functional collaboration skills.
  • Highly analytical mindset with the ability to translate business requirements into effective data architecture solutions.
  • Experience working with or supporting cloud migrations.

Languages: Python (primary), SQL, Bash
Cloud: Azure, AWS
Tools: Airflow, DBT, Docker, Fargate, FastAPI
Data Platforms: Snowflake, Delta Lake, Redis, Azure Data Lake
Infrastructure and Operations: Terraform, GitHub Actions, Azure DevOps, Grafana, Azure Monitor

  • Experience working with enterprise data platforms such as Snowflake and Azure Data Lake.
  • Experience deploying data or machine learning models as APIs using FastAPI or Azure Functions (a minimal sketch follows this list).
  • Understanding of monitoring, model performance tracking and observability best practices.
  • Familiarity with orchestration tools such as Airflow or Azure Data Factory.
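
To illustrate the FastAPI model-serving pattern referenced above, here is a minimal sketch of exposing a trained model as an API. The model artefact, feature names and route are hypothetical placeholders, not part of this role's codebase; a production deployment would add validation, authentication and monitoring.

    # Minimal sketch of serving a trained model over HTTP with FastAPI.
    # The model artefact, feature names and route are illustrative only.
    from fastapi import FastAPI
    from pydantic import BaseModel
    import joblib  # assumes the model was persisted with joblib

    app = FastAPI(title="model-service")
    model = joblib.load("model.joblib")  # hypothetical artefact path

    class Features(BaseModel):
        # hypothetical input schema; real feature names depend on the model
        tenure_months: float
        monthly_spend: float

    @app.post("/predict")
    def predict(features: Features) -> dict:
        # scikit-learn style inference call; swap in the real model's API
        score = model.predict([[features.tenure_months, features.monthly_spend]])[0]
        return {"score": float(score)}

In practice, a service like this would typically be run with uvicorn, containerised with Docker, and observed via the monitoring stack listed above (Grafana, Azure Monitor) to track request latency and model performance.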

Apply for this position