Principal Data Engineer

TEKsystems
Charing Cross, United Kingdom
2 days ago

Role details

Contract type
Contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Charing Cross, United Kingdom

Tech stack

Artificial Intelligence
Amazon Web Services (AWS)
Data analysis
C Sharp (Programming Language)
Continuous Integration
Data as a Service
Information Engineering
Data Governance
Identity and Access Management
Python
Systems Development Life Cycle
Role-Based Access Control
DataOps
SQL Databases
Privacy Controls
Large Language Models
Spark
Data Lake
PySpark
Collibra
Kafka
Data Management

Job description

Principal Data Platform Architect (Principal IC)

Location: Richmond (Hybrid - 2 days per week on-site)

(Req title: Principal Software Engineer - Data Platform & Product Analytics)

Drive architecture and governance for a modern data estate, lead multiple teams through influence (no direct reports), and enable AI-powered analytics while improving reliability, security, and cost.

Non-Negotiables (you must clearly show these on your CV)

  1. Architecture ownership of a data platform (lake/lakehouse/mesh, data products, pipelines) - not just contribution.
  2. Hands-on delivery with Python + Spark/PySpark - real, demonstrable pipeline design, build, validation, monitoring.
  3. Governance/security/privacy outcomes, including catalog/lineage/metadata/quality plus GDPR and privacy controls (IAM/RBAC/ABAC, encryption, retention/minimisation, auditing).
  4. Leadership by influence - leading multiple teams and communicating with senior stakeholders (including directors) without direct line management.

If your profile is primarily hands-on data engineering without architectural ownership and leadership-by-influence, this role will not be a fit.

Why this role matters

We're hiring a principal-level data platform/domain architect who blends strong hands-on credibility (Python/PySpark + modern data stack) with principal-level influence and architectural leadership. Your work will elevate governance, privacy, security, and platform reliability while enabling AI and LLM-driven analytics across a high-value data domain.

This role is central to shaping a trusted, cost-efficient, AI-ready data ecosystem.

What you'll do

  • Own the architecture for a domain-scale data platform (Data Lake/Lakehouse/Mesh), aligning to strategy and business needs.
  • Provide principal-level leadership through influence, guiding delivery teams, engineering managers, and product owners toward aligned technical direction.
  • Lead data governance and privacy: metadata, catalog, lineage, quality, GDPR compliance, access controls (IAM/RBAC/ABAC), encryption, retention/minimisation.
  • Offer deep hands-on technical guidance in Python/PySpark: validation frameworks, error handling, monitoring, and CI/CD for data workflows.
  • Shape batch and real-time pipelines (Spark/Glue; Kafka/SQS).
  • Drive observability & operational excellence: alerting, incident response, SLAs/SLOs, automation.
  • Lead FinOps practices - measure, monitor, and reduce cloud data costs.
  • Champion AI/LLM adoption, including Copilot best practices for engineering productivity.

What you'll bring

Essential

  • Architecture ownership for large-scale data platforms (lake/lakehouse/mesh; data products; domain leadership).
  • Strong leadership-by-influence: navigating multi-team environments, senior stakeholder communication, and decision clarity.
  • Hands-on proficiency with Python + Spark/PySpark in real delivery.
  • Deep experience with AWS data services (S3, Glue, Lake Formation; Kafka/SQS).
  • Strong SDLC, DataOps, monitoring, alerting, CI/CD, incident response.
  • Proven delivery of governance/privacy frameworks: catalog, lineage, quality, GDPR, access controls, encryption.

Strong Differentiators

  • Demonstrated cloud cost optimisation results (FinOps).
  • Experience enabling AI/LLM/Copilot for engineering workflows.
  • Use of data catalog & lineage tools (Collibra, Atlas, OpenMetadata, etc.).
  • Broader language exposure (SQL; C# is welcome, though Python is core).

How you'll work

  • Principal IC (no direct reports) acting as the domain's lead architect and technical authority.
  • Operate as the bridge between engineering teams and senior leadership, making trade-offs clear and decisions scalable.
  • Hybrid working model: Richmond office - 2 days per week.

Interview process

  1. Technical deep-dive (conversation-led; no live coding).
  2. Director behavioural interview (leadership, influence, communication).

What success looks like (6-12 months)

  • A clearly adopted architecture direction for the domain.
  • Measurable gains in reliability, observability, and cloud cost reduction.
  • Embedded governance-by-default (catalog, lineage, quality, access controls, privacy).
  • Team-wide adoption of AI/Copilot best practices.

Job Title: Principal Data Engineer

Location: London, UK

Job Type: Contract

Trading as TEKsystems. Allegis Group Limited, Bracknell, RG12 1RT, United Kingdom. No. 2876353. Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as "Allegis Group"). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands. If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice available at our website.

To access our Online Privacy Notice, which explains what information we may collect, use, share, and store about you, and describes your rights and choices about this, please go to our website.

We are part of a global network of companies and as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland and European Economic Area subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland and the USA. If you would like to exercise your privacy rights, please visit the "Contacting Us" section of our Online Privacy Notice on our website for details on how to contact us. To protect your privacy and security, we may take steps to verify your identity, such as a password and user ID if there is an account associated with your request, or identifying information such as your address or date of birth, before proceeding with your request. We comply with our commitments under the UK Data Protection Act, the EU-U.S. Privacy Shield and the Swiss-U.S. Privacy Shield.

