Snowflake Data Engineer - ETL (W2)
Job description
We are seeking a Data Engineer to design, build, and modernize data pipelines and data platforms in a cloud environment. This role focuses on migrating legacy data systems to AWS and Snowflake, developing scalable ETL/ELT pipelines, and supporting analytics and business intelligence through strong data modeling and architecture. Day to day, this person will work hands-on with pipelines, collaborate with cross-functional teams, and contribute to a cloud data modernization initiative, including a legacy-to-Snowflake migration (Snowflake migration expertise required).

Top 5 skills:
- AWS (Glue, CDK, cloud data services)
- Snowflake
- Python
- ETL/ELT
- Data pipeline development (Informatica preferred)

Responsibilities:
- Design, build, and maintain scalable data pipelines
- Migrate legacy pipelines (Informatica, Teradata) to AWS/Snowflake
- Develop and optimize ETL/ELT processes
- Build and manage data warehouses and data models (star schema, dimensional)
- Work with RDBMS (Oracle, SQL Server, RDS) and MongoDB
- Implement data governance, security, and quality standards
- Collaborate with business stakeholders and data teams
- Develop and test data features and integrations
- Monitor performance and optimize cloud costs
- Contribute to Agile development processes
Requirements
Required:
- Strong experience with AWS and Snowflake
- Proficiency in Python for data engineering
- Experience building ETL/ELT pipelines
- Data warehousing and data modeling experience
- Experience with relational databases (Oracle, SQL Server)
- Git-based version control

Nice to have:
- MongoDB or document database experience
- Experience migrating legacy data platforms
- AWS Glue and CDK
- Agile tools (JIRA, Confluence)
- Experience mentoring junior engineers