Data Engineer

Eclaro International Inc.
New York, United States of America
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Remote
New York, United States of America

Tech stack

API
Azure
Cloud Computing
Code Review
Continuous Integration
Information Engineering
Data Integration
ETL
Dimensional Modeling
GitHub
Monitoring of Systems
Python
Standard SQL
Systems Integration
Delivery Pipeline
Informatica Cloud
Data Management
Terraform
Software Version Control
Data Pipelines
Jenkins
Databricks

Job description

  • The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and systems that deliver trusted data for analysis and product use cases.
  • This role partners with cross-functional teams to understand data needs and implement solutions that support both near-term and long-term objectives.
  • This role requires the ability to contribute to technical design, ensure data quality, and operate with increasing independence and accountability.
  • Develop and maintain batch and streaming data pipelines using modern tools and frameworks. Design transformations, optimize performance, and ensure reliable data delivery.
  • Design and implement scalable and maintainable data models and storage solutions that align with business needs and support efficient querying, analysis, and data integration efforts.
  • Engage in Agile Best Practices, help refine stories, identify dependencies, and proactively raise risks or concerns to ensure work is completed on time or escalated when needed.
  • Implement and enforce data quality controls, validation, and compliance standards across pipelines.
  • Support the deployment, scheduling, and monitoring of data pipelines and workflows to ensure consistent, reliable execution.
  • Maintain comprehensive documentation and advocate for coding standards, best practices, and reusable components.
  • Collaborate regularly with cross-functional teams to clarify data requirements, document assumptions, and deliver high-quality solutions.
  • Communicate clearly during stand-ups, design discussions, and retrospectives. Actively contribute to team code reviews and share learnings with peers.

Requirements

  • 2-5 years of experience in data engineering, data modeling, and ETL pipelines.
  • Proficient in SQL and Python for creating, improving, and fixing data pipelines.
  • Experience with cloud and data platforms, especially Azure and Databricks (Delta Live Tables and Unity Catalog).
  • Strong understanding of tools like SnapLogic, Azure Data Factory, and Jenkins for data integration and orchestration.
  • Practical experience with Terraform for infrastructure as code and managing deployment pipelines.
  • Experience integrating with APIs.
  • Knowledge of data quality and monitoring tools, particularly Soda or similar.
  • Proficient in version control and CI/CD workflows, using tools like GitHub.
  • Solid understanding of data modeling principles (e.g., dimensional modeling, normalization).
  • Comfortable working in agile teams, with a proactive approach to planning, organizing tasks, and collaborating.

About the company

ECLARO's client provides retirement savings, health & investment management, and insurance services. If you're up to the challenge, take a chance at this rewarding opportunity.

Apply for this position