Principal Data Architect

Meraki Talent Limited
Glasgow, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£120K

Job location

Glasgow, United Kingdom

Tech stack

Artificial Intelligence
Amazon Web Services (AWS)
Databases
Data Architecture
Information Engineering
Data Systems
Distributed Systems
Python
PostgreSQL
Message Queuing Telemetry Transport (MQTT)
Neo4j
Operational Databases
Data Streaming
Systems Architecture
Unstructured Data
Data Strategy
Data Lake

Job description

Meraki Talent have just engaged with a Glasgow-based business who are revolutionising their industry. After significant investment, they are looking to scale the business and need a Principal Data Architect to design and lead the evolution of their data architecture.

Your mission is to define & implement how data flows across their platforms, how it is stored, synchronized, governed, and shared, both internally and with external partners.

You will enjoy solving complex technical problems that blend system architecture, data engineering and distributed systems.

Responsibilities

  • Own the business's AI-native data strategy.
  • Define the enterprise data architecture to ensure data is "ML-ready" from the moment of ingestion.
  • Establish a Data Lakehouse architecture on AWS to manage the massive scale of raw, unstructured data.
  • Advanced relational & semantic modelling.
  • Industrial telemetry & edge synchronization.
  • Governance & enterprise readiness.

Requirements

You are an experienced Data Architect with strong Python experience in production data systems, and you will have worked in safety-critical environments such as Med/Health Tech, Labs, Pharma, Science, Defence or Space, or in physical domains such as robotics, automotive, aerospace or industrial automation.

  • Experienced Data Architect with strong Python experience.
  • Deep experience with PostgreSQL, ideally in AWS RDS.
  • Proven experience designing high-throughput telemetry / IoT / industrial data systems generating very large volumes of time-series data.
  • Hands-on understanding of stream ingestion patterns (MQTT).
  • Experience with graph or vector databases (Neo4j, Pinecone, pgvector) and modelling complex, highly relational domains.
  • Familiarity with modern data stack components (e.g., data lakes, streaming, or batch/real-time pipelines).

Apply for this position