Data Architect

Spait Infotech Private Limited
Charing Cross, United Kingdom
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Junior
Compensation
£60K–£110K per year

Job location

Remote
Charing Cross, United Kingdom

Tech stack

API
Agile Methodologies
Amazon Web Services (AWS)
Azure
Big Data
Google BigQuery
Data Architecture
Data Governance
Data Integration
ETL
Data Systems
Data Warehousing
Software Design Patterns
Dimensional Modeling
Metadata Management
NoSQL
Scrum
SQL Databases
Data Streaming
Snowflake
Git
Data Lake
Data Lineage
Star Schema
Kafka
Data Management
Physical Data Models
Data Delivery
Software Version Control
Redshift
Databricks

Job description

  • Design scalable, secure, and high-performance data architectures, including data models, pipelines, and integration solutions.
  • Develop conceptual, logical, and physical data models aligned with business requirements.
  • Work closely with data engineers, analysts, and business teams to ensure reliable data delivery and architecture consistency.
  • Partner with cloud engineers to define and implement cloud-based data platforms (AWS, Azure, or GCP).
  • Establish best practices for data quality, governance, metadata management, and data lineage.
  • Lead or contribute to the implementation of modern data technologies (e.g., data lakes, data warehouses, streaming architectures).
  • Ensure data solutions meet performance, security, privacy, and compliance requirements (including GDPR).
  • Document architecture, standards, and design patterns for cross-team use.
  • Participate in Agile/sprint planning, estimation, and technical discussions.
  • Mentor and support less experienced team members (for mid/senior candidates).

Requirements

  • Strong understanding of data modelling techniques (dimensional modelling, star schema, relational/NoSQL concepts).

  • Experience with modern cloud data platforms, ideally AWS, Azure, or GCP.
  • Familiarity with data warehouse and data lake technologies (Snowflake, Redshift, BigQuery, Synapse, Databricks).
  • Proficiency in SQL and experience working with large datasets.
  • Understanding of ETL/ELT pipeline development and orchestration.
  • Knowledge of data governance, security, and compliance frameworks.
  • Awareness of API-based data integration and streaming technologies (Kafka, Kinesis, EventHub).
  • Experience with version control (Git) and Agile ways of working.
  • Strong analytical and problem-solving abilities.
  • Ability to communicate technical concepts clearly to business and engineering stakeholders.
  • Detail-oriented with a strong focus on reliability and quality.
  • Ability to work independently within a remote environment.
  • Collaboration and teamwork mindset.
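As a rough illustration of the dimensional-modelling and ETL skills listed above, the sketch below loads raw records into a minimal star schema (one fact table, two dimensions) and runs a typical analytical query. It uses Python's built-in sqlite3 module; all table names, column names, and sample data are hypothetical and not part of this role's actual stack.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimension tables.
# All table and column names below are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")

# Minimal ETL step: extract raw rows, assign surrogate keys, load.
raw_rows = [("Alice", "2024-01-05", 120.0), ("Bob", "2024-01-05", 80.0)]
customers = {n: i + 1 for i, n in enumerate(sorted({r[0] for r in raw_rows}))}
dates = {d: i + 1 for i, d in enumerate(sorted({r[1] for r in raw_rows}))}

cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(k, n) for n, k in customers.items()])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(k, d) for d, k in dates.items()])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(customers[n], dates[d], amt) for n, d, amt in raw_rows])

# Typical star-schema query: fact table joined to a dimension, aggregated.
total = cur.execute("""
    SELECT SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    WHERE d.iso_date = '2024-01-05'
""").fetchone()[0]
print(total)  # 200.0
```

The same join-to-dimension pattern scales to warehouse platforms such as Snowflake, Redshift, or BigQuery; surrogate keys keep the fact table narrow and the dimensions independently maintainable.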

Benefits & conditions

Job Types: Full-time, Permanent

Pay: £60,000.00-£110,000.00 per year

Benefits:

  • Work from home
