GCP Data Engineer
Spait Infotech Private Limited
Belfast, United Kingdom
2 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Intermediate
Compensation: £95K
Job location: Remote (Belfast, United Kingdom)
Tech stack
Java
Airflow
Google BigQuery
Cloud Computing
Cloud Computing Security
Cloud Storage
Cluster Analysis
ETL
Data Security
Data Systems
Data Warehousing
Data Flow Control
GitHub
Identity and Access Management
Python
Networking Basics
Performance Tuning
SQL Databases
Google Cloud Platform
Git
Kubernetes
Google Cloud Functions
Terraform
Software Version Control
Data Pipelines
Apache Beam
Docker
Job description
- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows on GCP.
- Work with services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, Cloud Functions, etc.
- Implement best practices for data modelling, data warehousing, performance tuning, and data quality.
- Optimize cloud workloads for efficiency, reliability, and cost management.
- Collaborate with data analysts, data scientists, product managers, and engineering teams to deliver high-quality data solutions.
- Write clean, efficient, and reusable code using Python, SQL, and optionally Scala or Java.
- Build and maintain CI/CD pipelines for data workflows using tools like Cloud Build, GitHub Actions, or similar.
- Ensure data security, compliance, governance, and privacy best practices across systems.
- Contribute to solution architecture (for senior roles).
- Provide technical mentoring and guidance to junior engineers (for senior roles).
Requirements
- Experience with Terraform (screening question: do you have experience in Terraform?).
- Eligibility: candidates must have valid eligibility to work in the UK.
- Strong experience with Google Cloud Platform (GCP).
- Proficiency with BigQuery: SQL, partitioning, clustering, optimization.
- Experience building pipelines using Dataflow (Apache Beam), Cloud Composer (Airflow), or Pub/Sub.
- Excellent SQL and Python development skills.
- Strong understanding of ETL/ELT, data modelling, and data warehouse principles.
- Experience with version control (Git) and CI/CD tools.
- Knowledge of cloud security, IAM, and networking basics.
- Experience with containerized workloads (Docker, Kubernetes) is a plus.
- Familiarity with Terraform or infrastructure-as-code is desirable.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Effective communication and stakeholder-management skills.
- Ability to mentor junior team members (for senior-level applicants).