Senior Data Engineer

CVS Health
Hartford, United States of America
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$204K

Job location

Hartford, United States of America

Tech stack

Java
Artificial Intelligence
Airflow
Batch Processing
Bigtable
Google BigQuery
Cloud Computing
Cloud Engineering
Cloud Storage
Data Governance
ETL
DevOps
Distributed Systems
Data Flow Control
Identity and Access Management
Python
Query Optimization
SQL Databases
Data Processing
Scripting (Bash/Python/Go/Ruby)
Data Ingestion
Delivery Pipeline
Spark
Firebase
Git
Containerization
AI Platforms
Kubernetes
Kafka
Machine Learning Operations
Video Streaming
Data Pipelines
Apache Beam
Docker

Job description

  • Design, build, and maintain scalable data pipelines using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery.

  • Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts.

  • Optimize BigQuery performance through partitioning, clustering, and query tuning.

  • Implement data governance, security, and compliance best practices within GCP.

  • Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing.

  • Monitor and troubleshoot data pipeline performance, failures, and cost efficiency.

  • Collaborate with data scientists, analysts, and software engineers to support business requirements.

  • Ensure data quality, validation, and integrity using appropriate testing frameworks.

Requirements

  • Strong expertise in GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.).

  • Proficiency in SQL, Python, and Java for data processing and automation.

  • Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform.

  • Strong understanding of data modeling, warehousing, and distributed computing.

  • Experience with real-time and batch processing architectures.

  • Knowledge of CI/CD pipelines, Git, and DevOps best practices.

  • Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.).

Required Qualifications

  • 5+ years of experience with machine learning pipelines on GCP (Vertex AI, AI Platform, etc.).

  • Exposure to Kafka, NiFi, or other streaming technologies.

  • Experience with containerization and orchestration (Docker, Kubernetes, GKE).

  • GCP certifications (e.g., Professional Data Engineer, Associate Cloud Engineer).

  • Bachelor's degree.

Benefits & conditions

This pay range represents the base hourly rate or base annual full-time salary for all positions in the job grade within which this position falls. The actual base salary offer will depend on a variety of factors including experience, education, geography and other relevant factors. This position is eligible for a CVS Health bonus, commission or short-term incentive program in addition to the base pay range listed above.

Our people fuel our future. Our teams reflect the customers, patients, members and communities we serve, and we are committed to fostering a workplace where every colleague feels valued and that they belong.

Great benefits for great people

We take pride in offering a comprehensive and competitive mix of pay and benefits that reflects our commitment to our colleagues and their families.

This full-time position is eligible for a comprehensive benefits package designed to support the physical, emotional, and financial well-being of colleagues and their families. The benefits for this position include medical, dental, and vision coverage, paid time off, retirement savings options, wellness programs, and other resources, based on eligibility.
