BI / ELT Developer - Remote / Telecommute

CYNET SYSTEMS INC.
Providence, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Remote
Providence, United States of America

Tech stack

API
Agile Methodologies
Databases
Data Architecture
Information Engineering
Data Integration
ETL
Data Transformation
Data Structures
Data Warehousing
Database Design
DevOps
Scrum
Salesforce
Software Technical Review
Freeform SQL
Cloud Platform System
Snowflake
Git
Data Layers
Semi-structured Data
Deployment Automation
Data Management
Data Pipelines

Job description

Technical Leadership:

  • Lead end-to-end data engineering delivery including ingestion, transformation, and persistence layers.
  • Own technical decisions related to database architecture, schema design, and data modeling.
  • Review and guide development of complex SQL queries and performance-optimized data pipelines.
  • Act as the primary escalation point for data-related technical issues.

ETL and Data Engineering:

  • Design and implement ETL pipelines using Matillion and Snowflake for structured and semi-structured data.
  • Build reusable and metadata-driven transformation frameworks.
  • Optimize data loads, transformations, and Snowflake performance.
  • Ensure data quality, consistency, and reconciliation across pipelines.
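As a rough illustration of the "metadata-driven transformation framework" idea in the bullets above, the sketch below flattens semi-structured JSON records into flat rows using a column-to-path mapping. All names and the mapping format are invented for this sketch; in a real Matillion/Snowflake pipeline the mapping would typically live in a control table and the transformation would run in SQL.

```python
import json

# Hypothetical metadata: maps each target column to a key path in the
# source JSON. In practice this would be stored in a control table.
COLUMN_MAP = {
    "customer_id": ["customer", "id"],
    "order_total": ["order", "total"],
    "city": ["customer", "address", "city"],
}

def extract_path(record, path, default=None):
    """Walk a list of keys into a nested dict; return default if any key is missing."""
    for key in path:
        if not isinstance(record, dict) or key not in record:
            return default
        record = record[key]
    return record

def transform(raw_json_lines, column_map=COLUMN_MAP):
    """Flatten semi-structured JSON lines into flat rows driven by the metadata map."""
    rows = []
    for line in raw_json_lines:
        record = json.loads(line)
        rows.append({col: extract_path(record, path)
                     for col, path in column_map.items()})
    return rows

raw = ['{"customer": {"id": 7, "address": {"city": "Providence"}}, '
       '"order": {"total": 42.5}}']
rows = transform(raw)
```

Because the target schema is described entirely by the metadata map, adding a column means editing the map rather than the code — the reusability the posting asks for.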

Automation and DevOps:

  • Drive automation across ETL deployments, validation, reconciliation, and regression checks.
  • Enable CI/CD pipelines using Git and automated deployment practices.
  • Collaborate with DevOps teams to enforce coding standards and release controls.
  • Reduce manual intervention through automation, monitoring, and alerting.
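A minimal sketch of the automated reconciliation check mentioned above: compare row counts and summed amounts between a source extract and a loaded target, so a pipeline can fail fast instead of relying on manual spot checks. The function name, keys, and tolerance are assumptions for the sketch, not an actual framework API.

```python
def reconcile(source_rows, target_rows, amount_key="amount", tol=1e-6):
    """Compare a source extract against a loaded target.

    Returns a dict of named checks plus an overall 'passed' flag,
    suitable for driving an automated alert or failing a CI step.
    """
    source_total = sum(r[amount_key] for r in source_rows)
    target_total = sum(r[amount_key] for r in target_rows)
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "amount_match": abs(source_total - target_total) <= tol,
    }
    checks["passed"] = all(checks.values())
    return checks

src = [{"amount": 10.0}, {"amount": 5.5}]
tgt = [{"amount": 10.0}, {"amount": 5.5}]
result = reconcile(src, tgt)
```

Wired into a CI/CD pipeline, a failing `passed` flag would block the release, which is how checks like this reduce manual intervention.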

Data Architecture and Modeling:

  • Design and maintain enterprise data warehouse schemas.
  • Develop fact and dimension models and conformed data layers.
  • Collaborate with Data Architects to align logical and physical data models.
  • Ensure scalability, extensibility, and performance of data structures.
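To illustrate the fact/dimension modeling the bullets above describe, here is a toy star-schema load: raw order events are split into a customer dimension with surrogate keys and a fact table that references those keys. The event fields and table shapes are invented for the sketch; in a warehouse this would be expressed in SQL against Snowflake.

```python
def build_star(events):
    """Split raw order events into a dimension (with surrogate keys) and a fact table."""
    dim_customer = {}   # natural key -> surrogate key
    fact_orders = []
    for e in events:
        nk = e["customer_id"]
        if nk not in dim_customer:
            # Assign surrogate keys in arrival order, as a warehouse sequence would.
            dim_customer[nk] = len(dim_customer) + 1
        fact_orders.append({
            "customer_sk": dim_customer[nk],   # foreign key into the dimension
            "order_date": e["order_date"],
            "amount": e["amount"],
        })
    return dim_customer, fact_orders

events = [
    {"customer_id": "C1", "order_date": "2024-01-05", "amount": 10.0},
    {"customer_id": "C2", "order_date": "2024-01-06", "amount": 20.0},
    {"customer_id": "C1", "order_date": "2024-01-07", "amount": 5.0},
]
dim, fact = build_star(events)
```

Keeping facts keyed on surrogate rather than natural keys is what lets the same conformed dimension serve multiple fact tables as the model grows.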

Agile Delivery and Collaboration:

  • Lead data engineering workstreams within Agile sprints.
  • Support backlog grooming, estimation, and technical story readiness.
  • Collaborate with API, UI, Salesforce, and QA teams for data integration and validation.
  • Participate in sprint ceremonies and technical design reviews.

Requirements

  • 10+ years of experience in Data Engineering, BI, or Data Platforms.
  • Strong hands-on experience with Matillion and Snowflake.
  • Advanced SQL expertise and strong database design/data modeling skills.
  • Experience building large-scale ETL/ELT pipelines and data transformation processes.
  • Strong focus on automation and implementation of scalable solutions.
  • Experience with cloud-based data platforms (Snowflake preferred).
  • Experience working in Agile development environments.

Soft Skills:

  • Strong analytical and problem-solving abilities.
  • Excellent communication and collaboration skills.
  • Ability to lead technical initiatives and mentor team members.
  • Detail-oriented with a focus on data quality and performance.
  • Ability to manage multiple priorities in a fast-paced environment.
