GCP Data Engineer
We are looking for a Senior GCP Data Engineer to build and manage scalable data platforms on Google Cloud.
The ideal candidate will have strong experience in designing batch and real-time data pipelines, working with large-scale datasets, and delivering robust data solutions for enterprise applications.
Key Responsibilities
Design and build data pipelines (batch & streaming) on GCP
Work with BigQuery, Dataflow, Cloud Storage, Spanner
Develop real-time data streaming solutions (Kafka / Beam)
Implement Infrastructure as Code using Terraform
Deploy workloads on Kubernetes (GKE)
Build and manage CI/CD pipelines (Jenkins / Spinnaker)
Requirements
Strong experience with GCP
Hands-on with BigQuery, Dataflow / Apache Beam
Experience in data engineering (batch & streaming)
Knowledge of Kafka / Spark / Big Data tools
Proficiency in Python / Java / SQL
Experience with Terraform, Kubernetes, CI/CD
Good understanding of data modelling
Nice to Have
dbt experience
Banking domain experience
GCP certification (Associate Cloud Engineer / Professional Data Engineer)