GCP Data Engineer
EXL Service
Edinburgh, United Kingdom
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Edinburgh, United Kingdom
Tech stack
Airflow
Data Governance
ETL
Data Transformation
Data Systems
Data Warehousing
DevOps
Python
Data Processing
Calculation of Risk Weighted Assets
Infrastructure as Code (IaC)
Data Management
Terraform
Data Pipelines
Legacy Systems
Job description
We are seeking an experienced GCP Data Engineer with strong expertise in DBT, Cloud Composer, Python, and Terraform. This role will focus on migrating legacy data platforms and regulatory use cases (e.g., risk, finance, RWA) to GCP, while actively contributing to design and development. The ideal candidate combines strong technical depth with leadership, working with a team of engineers to deliver scalable, high-quality data solutions. As part of your duties, you will:
- Lead the design, development, and deployment of data pipelines on GCP.
- Drive migration of legacy data platforms and use cases to GCP, ensuring minimal disruption and optimal performance.
- Build and manage data transformation workflows using DBT.
- Orchestrate pipelines using Cloud Composer (Apache Airflow).
- Develop robust, reusable code in Python for data processing and automation.
- Implement Infrastructure as Code (IaC) using Terraform for scalable and repeatable deployments.
- Collaborate with business and technology stakeholders to understand requirements and translate them into technical solutions.
- Ensure data quality, governance, and best practices across all implementations.
- Provide technical leadership, mentor team members, and guide design decisions.
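To give a flavour of the Python and data-quality duties above, here is a minimal, self-contained sketch of a pipeline step with a validation gate. All class, function, and field names are hypothetical illustrations, not taken from the role or EXL's systems:

```python
from dataclasses import dataclass

# Hypothetical record shape for a risk-data feed; real schemas would
# come from the legacy platform being migrated.
@dataclass
class Trade:
    trade_id: str
    notional: float
    currency: str

def validate(records):
    """Data-quality gate: split out records that would corrupt downstream
    calculations (missing IDs, non-positive notionals, bad currency codes)."""
    good, bad = [], []
    for r in records:
        if r.trade_id and r.notional > 0 and len(r.currency) == 3:
            good.append(r)
        else:
            bad.append(r)
    return good, bad

def transform(records):
    """Toy transformation step: normalise currency codes to upper case."""
    return [Trade(r.trade_id, r.notional, r.currency.upper()) for r in records]

if __name__ == "__main__":
    feed = [
        Trade("T1", 1_000_000.0, "gbp"),
        Trade("", 500.0, "usd"),    # rejected: missing trade_id
        Trade("T3", -10.0, "eur"),  # rejected: non-positive notional
    ]
    good, bad = validate(feed)
    loaded = transform(good)
    print(f"loaded={len(loaded)} rejected={len(bad)}")  # loaded=1 rejected=2
```

In a real deployment of the stack this posting describes, steps like these would typically run as tasks in a Cloud Composer (Airflow) DAG, with dbt handling the SQL-based transformations in BigQuery.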
Requirements
- Strong hands-on experience with Google Cloud Platform (BigQuery, Cloud Storage, etc.).
- Proven experience in DBT for data transformation.
- Expertise in Cloud Composer / Apache Airflow for workflow orchestration.
- Advanced proficiency in Python.
- Solid experience with Terraform for infrastructure provisioning.
- Demonstrated experience in migrating legacy systems (on-prem or other cloud) to GCP.
- Strong understanding of data warehousing concepts and ETL/ELT frameworks.
- Experience in leading teams and managing end-to-end delivery.
Preferred Qualifications
- Experience in large-scale data transformation programs.
- Familiarity with CI/CD pipelines and DevOps practices.
- Exposure to data governance and regulatory environments (e.g., banking/financial services).
- Strong problem-solving and stakeholder management skills.
Soft Skills
- Strong leadership and communication skills.
- Ability to work in a fast-paced, collaborative environment.
- Proactive mindset with a focus on ownership and delivery.
To be considered for this role, you must already be eligible to work in the United Kingdom.
About the company
EXL (NASDAQ: EXLS) is a global data and artificial intelligence ("AI") company that offers services and solutions to reinvent client business models, drive better outcomes and unlock growth with speed. EXL harnesses the power of data, AI, and deep industry knowledge to transform businesses, including the world's leading corporations in industries including insurance, healthcare, banking and financial services, media and retail, among others. EXL was founded in 1999 with the core values of innovation, collaboration, excellence, integrity and respect.
We are headquartered in New York and have more than 60,000 employees spanning six continents. For more information, visit the EXL website.