Data Engineer

Hire IT People
1 month ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Remote

Tech stack

Java
Google BigQuery
Cloud Storage
Databases
ETL
Data Flow Control
Python
Google Cloud Platform
Information Technology
Data Pipelines
Programming Languages

Job description

  • Design, develop, and maintain scalable data pipelines and ETL processes on Google Cloud Platform (GCP).
  • Implement data ingestion, transformation, and storage solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Collaborate with cross-functional teams to gather requirements and design data models that meet business needs.
  • Optimize data pipelines for performance, reliability, and cost-effectiveness.
  • Ensure data quality and integrity throughout the data lifecycle by implementing data validation and monitoring processes.
  • Troubleshoot and resolve issues related to data pipelines, infrastructure, and performance.
  • Stay up to date with industry trends and best practices in data engineering and GCP technologies.
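As a flavor of the data-quality work described above, here is a minimal, hedged sketch of row-level validation with a dead-letter split, the kind of check that often sits in front of a BigQuery load. All field names (`user_id`, `event_type`, `event_ts`) and the schema are hypothetical, and the example uses only the Python standard library rather than any specific GCP SDK.

```python
from datetime import datetime

# Hypothetical schema: fields every ingested row is assumed to carry.
REQUIRED_FIELDS = {"user_id", "event_type", "event_ts"}

def validate_row(row: dict) -> list:
    """Return a list of validation errors for one row (empty list = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    ts = row.get("event_ts")
    if ts is not None:
        try:
            # Expect ISO-8601 timestamps, e.g. "2024-01-01T00:00:00".
            datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            errors.append(f"bad timestamp: {ts!r}")
    return errors

def validate_batch(rows):
    """Split a batch into valid rows and (row, errors) rejects.

    Rejects would typically be routed to a dead-letter sink
    (e.g. a Pub/Sub topic or a quarantine table) for inspection.
    """
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected
```

In a real pipeline this logic would run inside a transform step (for example a Dataflow `DoFn`), with rejected rows monitored and alerted on rather than silently dropped.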

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Active certification in Google Cloud Platform (GCP), such as Google Certified Professional Data Engineer.
  • Proven experience in designing and implementing data pipelines and ETL processes.
  • Strong proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Experience with programming languages such as Python, Java, or Scala.
  • Familiarity with data modeling concepts and database technologies.
  • Excellent problem-solving skills and attention to detail.
  • Effective communication and collaboration skills.