Big Data Engineer (GCP)
OMEGAHIRES
Phoenix, United States of America
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Compensation: $115K
Job location: Phoenix, United States of America
Tech stack
Java
Agile Methodologies
Airflow
Business Analytics Applications
Big Data
Google BigQuery
Computer Programming
Continuous Integration
Data Architecture
Information Engineering
Data Governance
ETL
Data Systems
Data Warehousing
Database Queries
Data Flow Control
Python
Query Optimization
Cloudera
Data Streaming
Unstructured Data
Workflow Management Systems
Data Processing
Google Cloud Platform
Data Lake
Kafka
Stream Processing
Data Pipelines
Job description
We are seeking an experienced Big Data Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and optimize scalable data pipelines and analytics solutions. The ideal candidate will have hands-on experience with BigQuery and GCP data services, and will collaborate closely with data scientists, architects, and business stakeholders to deliver high-performance, reliable data systems.
Data Engineering & Pipeline Development
- Design, develop, and maintain scalable data pipelines using GCP services.
- Build efficient ETL/ELT processes for structured and unstructured data (a minimal load sketch follows this list).
- Ensure data quality, integrity, and availability across systems.
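For illustration only, here is a minimal ELT load sketch using the google-cloud-bigquery Python client; the bucket, project, dataset, and table names are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal ELT sketch: land raw JSON files from Cloud Storage in BigQuery
# and let downstream SQL handle transformation. All names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the raw files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-raw-bucket/events/*.json",  # hypothetical source bucket
    "example-project.raw_zone.events",        # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(f"Loaded {load_job.output_rows} rows")
```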
GCP & Big Data Technologies
- Work extensively with BigQuery, Dataflow, and Dataproc for data processing and analytics.
- Optimize BigQuery queries for performance and cost efficiency (see the dry-run sketch after this list).
- Leverage GCP-native tools for scalable and resilient data architectures.
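As one concrete example of cost-aware optimization, BigQuery supports dry runs that estimate bytes scanned before a query spends money. The sketch below assumes a hypothetical table partitioned on an event_date column.

```python
# Estimate scan cost with a dry run before executing. Selecting only the
# needed columns and filtering on the partition column keeps scans small.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT user_id, event_type           -- avoid SELECT *
    FROM `example-project.analytics.events`
    WHERE event_date = '2024-01-01'      -- prune to a single partition
"""

job = client.query(query, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Estimated scan: {job.total_bytes_processed / 1e9:.2f} GB")
```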
Programming & Processing
- Develop data processing solutions using Python, Java, or Scala.
- Implement batch and real-time data processing frameworks (a batch example follows this list).
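For flavor, a minimal batch pipeline in Apache Beam (the SDK behind Dataflow) might look like the sketch below; the input file and its layout (headerless CSV with an amount in the second column) are assumptions.

```python
# Minimal Apache Beam batch pipeline: read, transform, aggregate, write.
# Runs locally with the default runner; the same code can target Dataflow
# via pipeline options. File paths and the CSV layout are hypothetical.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("input.csv")  # assumed headerless
        | "ParseAmount" >> beam.Map(lambda line: float(line.split(",")[1]))
        | "SumAmounts" >> beam.CombineGlobally(sum)
        | "Format" >> beam.Map(str)
        | "Write" >> beam.io.WriteToText("output")
    )
```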
Workflow Orchestration & Automation
- Design and manage workflows using Airflow or Cloud Composer (a sample DAG follows this list).
- Automate data pipelines and integrate with CI/CD processes.
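A sample daily DAG, assuming Airflow 2.x; the DAG name and task bodies are placeholders for real extract and load logic.

```python
# Sketch of a daily two-task Airflow DAG. The callables are stubs; in
# practice they would invoke extract and load code like the sketches above.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")  # placeholder

def load():
    print("load data into BigQuery")  # placeholder

with DAG(
    dag_id="example_daily_etl",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```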
Collaboration & Delivery
- Partner with data scientists, analysts, and business teams to understand requirements.
- Participate in Agile ceremonies and contribute to sprint deliverables.
- Ensure timely delivery of high-quality data solutions.
Requirements
- 7+ years of experience in Big Data Engineering.
- Strong hands-on experience with GCP services (BigQuery, Dataflow, Dataproc).
- Proficiency in Python, Java, or Scala for data engineering.
- Strong SQL skills with experience in query optimization.
- Experience with workflow orchestration tools (Airflow/Composer).
- Familiarity with Agile methodologies and CI/CD practices.
- Strong problem-solving and analytical skills.
Nice to Have
- Experience with real-time streaming (Pub/Sub, Kafka); a consumer sketch follows this list.
- Knowledge of data warehousing and data lake architectures.
- Exposure to data governance and security best practices.
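For the streaming item, a minimal Pub/Sub consumer sketch; the project and subscription IDs are hypothetical placeholders.

```python
# Minimal Pub/Sub streaming consumer: pull messages via a callback and
# acknowledge each one. Project and subscription IDs are hypothetical.
from concurrent.futures import TimeoutError as FuturesTimeout

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("example-project", "events-sub")

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received: {message.data!r}")
    message.ack()  # acknowledge so Pub/Sub stops redelivering

streaming_pull = subscriber.subscribe(subscription, callback=handle)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except FuturesTimeout:
    streaming_pull.cancel()
```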
Benefits & conditions
- $80,000-115,000 per year