GCP Data Engineer
Job description
We Are Dcoded are partnered with a specialist consultancy delivering high-impact data, analytics, and cloud programmes across financial services. Their end-client, a leading digital bank, is scaling capability across its regulatory data environment and requires an experienced GCP Data Engineer to support a Basel III framework build-out.
This assignment sits within a high-performing engineering squad and will focus on enhancing data pipelines, regulatory data models, and analytics capability underpinning Basel III/III.I compliance.
Key responsibilities
- Develop, optimise, and maintain end-to-end data pipelines and ETL workflows within Google Cloud Platform (GCP).
- Work closely with data, risk, and regulatory SMEs to ensure datasets meet Basel III/III.I standards.
- Support analytical reporting by integrating and modelling data for Tableau dashboards.
- Build and enhance BigQuery architectures, ensuring scalability, performance, and governance.
- Contribute to data quality frameworks, lineage, controls, and documentation across the regulatory data landscape.
- Collaborate with cross-functional engineering, analytics, and compliance teams in an agile delivery environment.
Requirements
- Strong commercial experience as a Data Engineer within GCP environments.
- Proficiency with BigQuery, Cloud Composer, Dataflow, or similar GCP-native tooling.
- Proven background delivering data solutions in financial services, ideally banking.
- Demonstrable understanding of Basel III regulations (Basel III.I highly advantageous).
- Experience supporting analytical/reporting teams using Tableau.
- Strong SQL engineering and data modelling skills.
- Comfortable operating in fast-paced, regulated environments.