Software Engineer II

M&T Bank
Wilmington, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Junior
Compensation
$153K

Job location

Wilmington, United States of America

Tech stack

Airflow
Unit Testing
ETL
Data Mapping
Data Warehousing
Hadoop
Hive
Python
Microsoft SQL Server
Systems Development Life Cycle
Shell Script
Software Deployment
Software Engineering
SQL Databases
Integration Testing
Data Processing
Snowflake
Spark
Software Application Programming
Backend
Git
Information Technology
Bitbucket
Data Management
Programming Languages

Job description

Job Description:

  • Develop, design, and support ETL processes.
  • Prepare and manage the technical components of project plans.
  • Perform unit testing of jobs and assist with SIT (System Integration Testing) and UAT (User Acceptance Testing).
  • Create pipelines to deploy code to higher environments.
  • Work closely with data stewards on data mapping and implement extraction and transformation of data from various sources.
  • Work with ETL tools and programming languages to create ETL workflows.
  • Collaborate with other development, operations, and technology staff in overall systems development.
  • Maintain code repositories.
  • Document ETL processes, procedures, and specifications.
  • Maintain the efficient operation and effectiveness of supported applications.
  • Fine-tune production applications for performance.
  • Support production applications and resolve production incidents from the incident queue.

Requirements

Minimum requirements: Bachelor's degree (or foreign equivalent) in Computer Science or a related technical field, plus five (5) years of experience in the job offered or as a Software Developer, Software Programmer, Computer Engineer, or a related occupation.

Requires one (1) year of experience with each of the following:

  • Designing and Developing Data Warehouse Applications using Hadoop ecosystem tools, including Apache Hive;
  • Using Shell Scripting as part of the application design;
  • Designing and Developing Applications in Python;
  • Using Snowflake as the backend for Data Warehouse Applications;
  • Using the ETL tool Informatica for building ETL workflows;
  • Maintaining code repositories and creating pipelines using Git or Bitbucket;
  • Using Apache Spark for data processing;
  • Using a scheduling tool, such as Apache Airflow or Automic, to schedule jobs and workflows and monitor their daily runs; and
  • Using SQL to query data from Microsoft SQL Server.

Benefits & conditions

Job Location: 1100 North Market Street, Wilmington, DE 19801. Position requires in-office work four (4) days every week.
