Senior Data Engineer

Eliassen Group
Raleigh, United States of America
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$156K

Job location

Raleigh, United States of America

Tech stack

Java
Agile Methodologies
Amazon Web Services (AWS)
Data analysis
Profiling
Continuous Integration
Information Engineering
Data Infrastructure
ETL
Data Mart
Data Vault Modeling
Data Warehousing
DevOps
Identity and Access Management
Job Scheduling
Python
Maven
Object-Oriented Software Development
Performance Tuning
Scrum
Query Optimization
Ansible
SQL Databases
Scripting (Bash/Python/Go/Ruby)
SQL Optimization
Snowflake
CloudFormation
Containerization
Data Lake
Information Technology
Data Management
Functional Programming
CloudWatch
Docker
Jenkins
Control-M

Job description

Our client seeks a Senior Data Engineer to design, build, and maintain operational and analytical capabilities across modern data platforms. The role focuses on Snowflake, AWS, and Python to enable scalable data lakes and warehousing. You will drive solution design, data analysis, production rollout, and support while shaping a growing data infrastructure.

  • Design and implement scalable data solutions on Snowflake and AWS for data lake and warehouse workloads.
  • Build and maintain ELT/ETL pipelines to move and transform data to and from Snowflake (a minimal sketch of this pattern follows the list).
  • Perform data analysis, modeling, and profiling to support analytics and operational use cases.
  • Optimize SQL and Snowflake performance, including query tuning and cost efficiency.
  • Develop automation and CI/CD pipelines to enable reliable deployments and operations.
  • Leverage AWS services such as EC2, IAM, S3, EKS, KMS, CloudWatch, and CloudFormation to operate data platforms.
  • Use Python or Java for data engineering, orchestration, and utility development.
  • Implement scheduling and orchestration using enterprise tools.
  • Apply container technologies such as Docker and Kubernetes for packaging and runtime.
  • Collaborate in Agile teams to improve efficiency and deliver high-quality data products.
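A minimal sketch of the ELT pattern described above, assuming the snowflake-connector-python package; the account credentials, warehouse, stage, and table names are illustrative placeholders, not part of the client's environment:

    # Hypothetical ELT load: stage files land in S3, Snowflake does the transform.
    import os
    import snowflake.connector

    def load_orders() -> None:
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="LOAD_WH",   # placeholder warehouse
            database="ANALYTICS",  # placeholder database
            schema="RAW",
        )
        try:
            cur = conn.cursor()
            # Extract/Load: COPY INTO pulls files from a pre-configured external S3 stage.
            cur.execute("""
                COPY INTO raw.orders
                FROM @s3_orders_stage
                FILE_FORMAT = (TYPE = PARQUET)
                MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
            """)
            # Transform: aggregate in-warehouse into a mart table (the "T" of ELT).
            cur.execute("""
                CREATE OR REPLACE TABLE mart.daily_order_totals AS
                SELECT order_date, SUM(amount) AS total_amount
                FROM raw.orders
                GROUP BY order_date
            """)
        finally:
            conn.close()

    if __name__ == "__main__":
        load_orders()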

Requirements

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, a 401(k) with company matching, and life insurance.

  • 10+ years of experience with a Bachelor's or Master's degree in a technology-related field.
  • 6+ years in data warehousing and data mart concepts and implementations.
  • 4+ years building ELT/ETL pipelines with Snowflake.
  • 4+ years using AWS services including EC2, IAM, S3, EKS, KMS, CloudWatch, and CloudFormation.
  • 1+ years with object-oriented programming in Python or Java.
  • Advanced SQL or SnowSQL knowledge.
  • Hands-on SQL query optimization and performance tuning.
  • Experience with job scheduling tools such as Control-M.
  • Proven data analysis and data modeling skills, including Dimensional or Data Vault.
  • Experience with Docker and Kubernetes.
  • Experience with DevOps, CI/CD, and related tooling such as Maven, Jenkins, Stash, Ansible, and Docker.
  • Experience with Agile methodologies such as Kanban or Scrum (preferred).
  • Ability to handle ambiguity and work in a fast-paced environment.
  • Effective interpersonal skills to collaborate with multiple teams.
  • Strong Snowflake and AWS expertise, including S3, Lambda, and CloudFormation.
  • Python scripting experience supporting data engineering workflows.
  • Leadership experience (preferred).
  • Bachelor's or Master's degree in a technology-related field such as Engineering or Computer Science.
  • AWS-related certifications (preferred).
