Data Engineer - Google Cloud Platform

PRIMUS Global Services, Inc.
Madison, United States of America
5 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$104K

Job location

Madison, United States of America

Tech stack

Agile Methodologies
Artificial Intelligence
Airflow
Google BigQuery
Cloud Engineering
Cloud Storage
Continuous Integration
Directed Acyclic Graphs (DAGs)
Data Architecture
Information Engineering
Data Integration
Data Warehousing
Software Debugging
Python
Performance Tuning
SQL Databases
Unstructured Data
Google Cloud Platform
Data Ingestion
Delivery Pipeline
Git Flow
Infrastructure Automation Frameworks
Google Cloud Functions
Terraform
Data Pipelines

Job description

We have an immediate need for a Senior Data Engineer in Madison, WI (Onsite) to support enterprise-scale data platform initiatives in a cloud-first environment.

In this role, you will design, build, and maintain scalable data pipelines and architectures using Google Cloud Platform services such as BigQuery, Cloud Run, Cloud Storage, Pub/Sub, and Cloud Composer. You will leverage Python and SQL to develop robust data processing workflows, while using Terraform to automate infrastructure provisioning. The role involves building and managing CI/CD pipelines, developing and monitoring Airflow DAGs, and ensuring reliable data ingestion, transformation, and delivery. You will also focus on performance tuning, production support, debugging, and implementing cloud-native architectures. Collaboration with global teams and clear documentation will be key to delivering high-quality, scalable solutions.
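To illustrate the ingest-transform-deliver pattern this role centers on, here is a minimal, library-free Python sketch. All names (`policy_id`, `amount`, the sample rows) are illustrative; in practice these stages would run against BigQuery, Pub/Sub, and Cloud Composer rather than in-memory strings.

```python
import csv
import io
import json

def ingest(raw_csv: str) -> list[dict]:
    # Ingest: parse raw CSV input into a list of row records.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    # Transform: cast amounts to float and drop zero-value rows.
    out = []
    for r in records:
        amount = float(r["amount"])
        if amount > 0:
            out.append({"policy_id": r["policy_id"], "amount": amount})
    return out

def deliver(records: list[dict]) -> str:
    # Deliver: serialize to newline-delimited JSON, a format BigQuery can load.
    return "\n".join(json.dumps(r) for r in records)

raw = "policy_id,amount\nP-1,120.50\nP-2,0\nP-3,75.00\n"
ndjson = deliver(transform(ingest(raw)))
```

Each stage is a pure function, so the same structure maps naturally onto Airflow tasks chained in a DAG.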

Requirements

The ideal candidate should have strong hands-on experience with Google Cloud Platform services, Python, SQL, and Terraform, along with expertise in data engineering concepts such as data integration, data quality, and data architecture. Knowledge of relational and dimensional data modeling, experience with unstructured data, and familiarity with tools like Airflow, CI/CD pipelines, and Git workflows are essential. Strong communication skills and the ability to work in Agile environments are critical, while exposure to the insurance domain and AI tools will be an added advantage.

Apply for this position