W2: Data Engineer - Charlotte, NC (Hybrid)

Stellent IT LLC
San Jose, United States of America
8 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
$194K

Job location

San Jose, United States of America

Tech stack

Airflow
Amazon Web Services (AWS)
Code Review
ETL
Data Warehousing
Relational Databases
Database Queries
DNS
Identity and Access Management
Virtual Private Networks (VPN)
Python
Key Management
Enterprise Messaging Systems
Networking Basics
Operational Databases
Oracle Fusion Middleware
TCP/IP
Workflow Management Systems
Data Processing
System Availability
Spark
CloudFormation
Concourse
Pandas
Event Driven Architecture
PySpark
Kafka
Bitbucket
CloudWatch
API Gateway
REST
Terraform
Data Pipelines
Confluent

Job description

Must-have skills: PySpark, AWS, Terraform, and building data pipelines

  • Collaborate with Data Engineers, Software Engineers, Data Scientists, and Technical Leads to gather requirements and define technical solutions
  • Design, build, and maintain scalable data pipelines (batch & real-time) within AWS
  • Develop and optimize data warehouse solutions to support analytics and reporting use cases
  • Ensure data quality, reliability, and long-term scalability across all pipelines
  • Partner with Data and Solution Architects on key technical decisions
  • Identify data gaps and implement automated solutions to enhance data availability and usability
  • Manage and troubleshoot production data environments in AWS
  • Build and maintain ETL workflows, orchestration pipelines, and supporting infrastructure
  • Implement monitoring, alerting, and automated remediation for production issues
  • Participate in code reviews and contribute to best practices across the team

Requirements

  • Experience working within AWS environments
  • Strong experience with AWS services, including:
      • Redshift, S3, EMR, Glue, Lambda, Athena
      • CloudWatch, CloudTrail, SNS, SQS, Step Functions, QuickSight
  • Experience with data warehousing solutions (Redshift, Athena)
  • Hands-on experience building ETL pipelines and data models
  • Experience with real-time and batch data processing
  • Proficiency in Python, Spark, PySpark, and Pandas
  • Strong SQL skills and experience with RDBMS platforms
  • Experience with Kafka or other messaging systems (Confluent preferred)
  • Familiarity with event-driven architectures
  • Experience with workflow orchestration tools (Airflow or Step Functions)
  • Experience with Infrastructure as Code (Terraform or CloudFormation)
  • Experience with CI/CD pipelines (Bitbucket, Concourse, or similar)
  • Experience with secrets management (Vault, AWS Secrets Manager)
  • Strong understanding of IAM roles, policies, and AWS security best practices
  • Knowledge of networking fundamentals (DNS, TCP/IP, VPN)
  • Experience with REST APIs and API Gateway
