Data Engineer

Innoworx Technology Services LLC
New York, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

New York, United States of America

Tech stack

API
Agile Methodologies
JIRA
Batch Processing
Databases
Data Validation
ETL
Data Warehousing
Linux
Hadoop
Hadoop Distributed File System
Hive
Issue Tracking Systems
Python
Log Analysis
DataOps
Shell Script
SQL Databases
Scripting (Bash/Python/Go/Ruby)
Snowflake
GIT
Data Management
Software Version Control
Data Pipelines
ServiceNow

Job description

  • Provide L2 application/data operations support for analytics and ETL workloads, focusing on SQL-based investigation, monitoring of scheduled jobs, basic Python/Shell automation, and clear stakeholder communication.
  • Monitor and support daily data pipelines and batch jobs across Snowflake/Hadoop environments.
  • Investigate incidents and service requests using strong SQL; perform data validation, reconciliation, and root-cause triage.
  • Develop and maintain lightweight Python/Shell scripts to automate health checks, log parsing, alerts, and routine recovery steps.
  • Operate job schedules using TWS (IWS): monitor job streams, handle failures, perform safe restarts/reruns, manage dependencies and calendars as per runbooks.
  • Own tickets end-to-end in the ITSM tool (e.g., ServiceNow/Jira): classify, prioritize, communicate status, and drive to closure with vendors and internal teams.
  • Perform environment and application health checks, create/maintain runbooks and knowledge articles, and contribute to problem management and RCA documentation.
  • Collaborate with L3 engineers, data engineers, and product owners; participate in Agile ceremonies and release/smoke validations as needed.
  • Produce daily shift handovers and weekly operational reports.
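The "lightweight Python/Shell scripts to automate health checks, log parsing, alerts" responsibility above can be sketched as follows. This is a minimal illustration, not part of the posting: the log-line format, job names, and timestamps are hypothetical examples of the kind of batch-job output such a script might parse.

```python
import re
from collections import Counter

# Hypothetical pipeline log-line format, e.g.:
# "2026-03-30 02:15:04 ERROR load_orders step=merge rows=0"
LOG_LINE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>INFO|WARN|ERROR) "
    r"(?P<job>\S+)"
)

def summarize_log(lines):
    """Count log levels per run and collect ERROR lines for triage."""
    levels = Counter()
    errors = []
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # skip unparseable lines rather than failing the health check
        levels[m["level"]] += 1
        if m["level"] == "ERROR":
            errors.append((m["ts"], m["job"], line.strip()))
    return levels, errors

if __name__ == "__main__":
    sample = [
        "2026-03-30 02:14:58 INFO load_orders step=extract rows=120000",
        "2026-03-30 02:15:04 ERROR load_orders step=merge rows=0",
        "garbage line that should be ignored",
    ]
    levels, errors = summarize_log(sample)
    print(levels["ERROR"], len(errors))  # → 1 1
```

In practice a script like this would tail the real log file and raise an alert (ticket, page, or email) when the ERROR count crosses a threshold.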

Requirements

  • Strong SQL and Database knowledge (Snowflake/Hadoop).
  • Strong Python/Shell scripting knowledge.
  • Good communication skills.

Secondary (Good to have) Skills:

  • TWS Scheduler
  • Agile Development

Target start date: March 30, 2026

Additional Requirements

  • 2-5 years in application support, data operations, or ETL/analytics platform support.
  • Strong SQL and database skills, including writing complex queries and troubleshooting performance on at least one platform; exposure to Snowflake and/or Hadoop (Hive/HDFS) preferred.
  • Strong scripting with Python and Shell on Linux/Unix (file I/O, error handling, log parsing, simple APIs, scheduling fundamentals).
  • Solid understanding of batch processing, data warehousing/ETL concepts, and incident management practices.
  • Good communication skills: clear written updates, user/vendor coordination, and concise handovers.
  • Familiarity with version control (Git) and working in ticketing systems (ServiceNow/Jira).
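The data-validation and reconciliation skill listed above typically amounts to comparing a staging table against its warehouse target. A minimal sketch, using an in-memory sqlite3 database as a stand-in (against Snowflake the same SQL would run through a connector cursor instead); the table and column names are hypothetical:

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table, key_col):
    """Compare row counts and flag keys present in source but missing in target."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    # EXCEPT yields keys loaded into staging that never reached the target
    missing = cur.execute(
        f"SELECT {key_col} FROM {source_table} "
        f"EXCEPT SELECT {key_col} FROM {target_table}"
    ).fetchall()
    return src_count, tgt_count, [row[0] for row in missing]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        "CREATE TABLE stg_orders (order_id INTEGER);"
        "CREATE TABLE dw_orders (order_id INTEGER);"
        "INSERT INTO stg_orders VALUES (1), (2), (3);"
        "INSERT INTO dw_orders VALUES (1), (2);"
    )
    print(reconcile_counts(conn, "stg_orders", "dw_orders", "order_id"))
    # → (3, 2, [3])
```

A mismatch in counts or a non-empty missing-key list is what would feed the root-cause triage and reconciliation steps described in the responsibilities.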

Good-to-Have Skills

  • TWS/IWS (IBM Workload Scheduler) hands-on experience
