Lead Data Engineer

AVIAN LLC
Jackson Township, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Jackson Township, United States of America

Tech stack

Agile Methodologies
Amazon Web Services (AWS)
Data analysis
Big Data
Cloud Engineering
Continuous Integration
Information Engineering
Data Governance
ETL
Data Migration
Data Security
Data Systems
Data Vault Modeling
Data Warehousing
Software Debugging
DevOps
Python
Performance Tuning
SQL Databases
Data Storage Management
Data Ingestion
Snowflake
Spark
Git
PySpark
Information Technology
Integration Frameworks
Jenkins
Programming Languages

Job description

Key Skills: Snowflake, SQL, Python, Spark, AWS Glue, Big Data concepts.

Responsibilities:

Lead the design, development, and implementation of data solutions using AWS and Snowflake.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Develop and maintain data pipelines, ensuring data quality, integrity, and security.
Optimize data storage and retrieval processes to support data warehousing and analytics.
Provide technical leadership and mentorship to junior data engineers.
Work closely with stakeholders to gather requirements and deliver data-driven insights.
Ensure compliance with industry standards and best practices in data engineering.
Apply knowledge of insurance, particularly claims and loss, to enhance data solutions.

Requirements

Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer, with a focus on AWS and Snowflake.
Strong understanding of data warehousing concepts and best practices.
Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
Experience in the insurance industry, preferably with knowledge of claims and loss processes.
Proficiency in SQL, Python, and other relevant programming languages.
Strong problem-solving skills and attention to detail.
Ability to work independently and as part of a team in a fast-paced environment.
Experience with data modeling and ETL processes.
Familiarity with data governance and data security practices.
Certification in AWS or Snowflake is a plus.

Must have:

6-8 years of relevant experience in data engineering and delivery.
5+ years of relevant work experience with Big Data concepts.
Experience with cloud implementations.
Strong experience with SQL, Python, and PySpark.
Good understanding of data ingestion and data processing frameworks.
Good experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
Able to code, debug, tune performance, and deploy applications to the production environment.
Experience working in Agile methodology.

Good to have:

Experience with DevOps tools (Jenkins, Git, etc.) and practices, including continuous integration and delivery (CI/CD) pipelines.
Experience with cloud implementations, data migration, Data Vault 2.0, etc.
