Lead Software Engineer
Job description
Prudential & Analytics (P&A) is on a significant transformation journey, modernising regulatory reporting, risk analytics, and model-ready data through sustained investment in people, data, and cloud technology.
As a Lead Software Engineer, you'll provide hands-on technical leadership to high-performing engineering teams, delivering secure, compliant, and scalable cloud-native data platforms and analytics services on Google Cloud Platform (GCP). You'll play a key role in modernising legacy analytics and data engineering platforms, setting strong engineering standards, and translating complex prudential risk requirements into robust technical solutions.
You'll deliver data pipelines, analytics data platforms, and regulatory reporting solutions that underpin critical prudential and risk outcomes across the bank.
This role is suited to a technical lead who enjoys working at the intersection of data engineering and software engineering, and who thrives in a regulated UK banking or financial services environment.
What you'll do
- Translate complex business, risk, and regulatory requirements into practical, high-quality technical solutions.
- Own engineering plans and delivery across Agile teams, with accountability for outcomes, quality, and pace.
- Provide technical leadership and influence engineering decisions across a modern cloud-native data and software engineering stack on GCP.
- Design, build, and evolve data pipelines, data services, and analytics platforms supporting regulatory reporting and risk analytics.
- Champion engineering excellence by embedding CI/CD pipelines, automation, testing, and robust quality gates.
- Partner closely with DevOps, QA, Security, Architecture, and Product teams to deliver integrated solutions.
- Mentor and develop engineers, setting clear standards and building a culture of ownership and continuous improvement.
Requirements
- Strong experience in data engineering, including data modelling, normalisation, and entity relationships for analytics at scale.
- Hands-on experience designing and building ETL/ELT data pipelines and orchestration on Google Cloud Platform (GCP), using tools such as Airflow / Cloud Composer.
- Solid software engineering fundamentals, including version control (Git/GitHub), pull request workflows, code reviews, automated testing (unit and integration), and CI/CD.
- Proficiency in Python for data engineering and service development, writing clean, efficient, and maintainable code.
- A pragmatic data governance approach, embedding access controls, data lineage, cataloguing, and data quality into delivery.
- Experience working in a regulated environment (banking or financial services preferred), with exposure to prudential or regulatory data under frameworks such as Basel III and CRD IV.
- Proven experience in technical and people leadership, including coaching engineers, setting engineering standards, and aligning teams around shared outcomes.
Nice to have
- Experience modernising legacy analytics or data platforms into cloud-native data pipelines and services.
- Experience with GKE / Kubernetes, containerised services, and cloud storage patterns.
- Knowledge of Retail or Commercial Banking products and platforms to support architecture design and data migration discussions.
- Familiarity with data product thinking, including domains, data contracts, SLAs and SLOs.
- Experience building observability for data systems, including metrics, alerting, lineage, and anomaly detection.
- Experience with performance optimisation in BigQuery, including partitioning, clustering, and materialised views.
- Experience using AI-enabled software development tools and modern engineering practices.