Data Engineer
Alpha Inc.
Chicago, United States of America
2 days ago
Role details
Contract type: Temporary contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Compensation: $166K
Job location:
Remote
Chicago, United States of America
Tech stack
Airflow
Amazon Web Services (AWS)
Business Software
Cloud Database
ETL
Data Transformation
Data Security
Data Structures
Data Visualization
Python
Performance Tuning
Unstructured Data
Scripting (Bash/Python/Go/Ruby)
Snowflake
Build Management
Kafka
Tools for Reporting
Data Pipelines
Job description
We are seeking a Senior Data Engineer to design and build scalable data pipelines supporting enterprise data initiatives. This role will focus on extracting, transforming, and loading both structured and unstructured data into a modern cloud data platform, primarily leveraging AWS and Snowflake. You will play a key role in building end-to-end data workflows, orchestrating pipelines, and enabling downstream data consumption through optimized data structures and user-facing interfaces.
- Design and develop end-to-end data pipelines using Python
- Extract data and documents from enterprise data stores using connectors
- Load and manage data in AWS (S3) and Snowflake
- Process both structured and unstructured data for downstream consumption
- Build and maintain data models and tables for business applications
- Orchestrate workflows using Apache Airflow for scheduling and monitoring
- Collaborate on building UI/data access layers on top of Snowflake
- Ensure data quality, scalability, and performance across pipelines
- Support cross-functional teams with data and reporting needs

- Build a cloud-based data pipeline using Python
- Load data into AWS S3, then into Snowflake
- Transform raw data into structured formats for business use
- Orchestrate the entire workflow using Airflow
- Enable end-user access via UI on top of Snowflake
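The "transform raw data into structured formats" step above can be sketched in plain Python. This is a minimal illustration, not code from the employer: the function name, field names, and sample record are all hypothetical.

```python
import json

def transform_records(raw_lines):
    """Parse raw JSON-lines input into structured rows ready for loading.

    Hypothetical example: field names ("id", "name", "amount") are
    illustrative, not taken from the job posting.
    """
    rows = []
    for line in raw_lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines in the raw extract
        rec = json.loads(line)
        rows.append({
            "id": rec.get("id"),
            "name": (rec.get("name") or "").strip(),   # normalize whitespace
            "amount": float(rec.get("amount", 0)),     # coerce to numeric
        })
    return rows

# Example: one raw record plus a blank line
sample = ['{"id": 1, "name": " Acme ", "amount": "19.99"}', ""]
print(transform_records(sample))
```

In a real pipeline a function like this would sit in an Airflow task between the S3 extract and the Snowflake load.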
Requirements
- Strong experience in Python (data pipeline development, scripting)
- Hands-on experience with AWS services (especially S3, Lambda)
- Expertise in Snowflake (data loading, modeling, performance tuning)
- Experience with Apache Airflow for orchestration and scheduling
- Solid understanding of ETL/ELT processes
- Experience handling both structured and unstructured data
Nice-to-Have Skills
- Experience building UI or data access layers on top of Snowflake
- Familiarity with data visualization/reporting tools
- Knowledge of Kafka (not required but beneficial for other projects)
- Experience with traditional ETL tools
Benefits & conditions
- Medical Insurance (for full-time employees)
- Dental and Vision Insurance
- Life Insurance, Short-Term Disability, Long-Term Disability, etc.