Consultant, Data Engineer

IBM
Philadelphia, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Philadelphia, United States of America

Tech stack

JavaScript
Artificial Intelligence
Amazon Web Services (AWS)
Azure
Google BigQuery
Cloud Computing
Databases
Data Architecture
Information Engineering
Data Governance
ETL
Data Systems
Data Visualization
Data Warehousing
Cursor (AI code editor)
Database Development
Software Debugging
Programming Tools
Data Flow Control
Github
Python
Cloud Services
SQL Databases
Data Processing
Cloud Platform System
Data Ingestion
Snowflake
Information Technology
Data Management
Cloud Integration
Data Delivery
Data Pipelines

Requirements

We are in search of a skilled Consultant Data Engineer to join our expanding team of experts. This role will be pivotal in the design and development of Snowflake Data Cloud solutions, encompassing responsibilities such as constructing data ingestion pipelines, establishing sound data architecture, and implementing stringent data governance and security protocols.

The ideal candidate brings experience as a proficient data pipeline builder and adept data wrangler, deriving satisfaction from optimizing data systems from their foundational stages. Collaborating closely with database architects, data analysts, and data scientists, the Data Engineer will play a crucial role in ensuring a consistent and optimal data delivery architecture across ongoing customer projects.

This position demands a self-directed individual comfortable navigating the diverse data needs of multiple teams, systems, and products. If you are enthusiastic about the prospect of contributing to a startup environment and supporting our customers in their next generation of data initiatives, we invite you to explore this opportunity.

This role can be performed from anywhere in the US.

Required technical and professional expertise

  • Bachelor's degree in engineering, computer science, or an equivalent field.
  • 3+ years in related technical roles, with experience in data management, database development, ETL, and/or data preparation domains.
  • Experience developing data warehouses.
  • Experience building ETL / ELT ingestion pipelines.
  • Proficiency in using cloud platform services for data engineering tasks, including managed database services (e.g., Snowflake, including its trade-offs versus Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow).
  • Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance.
  • Knowledge of how to manipulate, process, and extract value from large, disconnected datasets.
  • SQL and Python scripting experience required; Scala and JavaScript are a plus.
  • Cloud experience (AWS, Azure or GCP) is a plus.
  • Knowledge of any of the following tools is also a plus: Snowflake, Matillion, Fivetran, or dbt.
  • Strong interpersonal skills, including assertiveness and the ability to build strong client relationships.
  • Strong project management and organizational skills.
  • Ability to support and work with cross-functional and agile teams in a dynamic environment.
  • Advanced English required.

Preferred technical and professional experience

  • Cloud Integration Knowledge: Exposure to integrating cloud computing concepts and technologies with Snowflake platforms, enhancing data and AI use case implementation.
  • Advanced Data Engineering: Experience working with data engineering principles and practices to deliver high-quality solutions on Snowflake platforms, leveraging expertise in Snowflake and cloud computing.
  • Technical Solution Optimization: Experience applying technical expertise to optimize solutions on Snowflake platforms, ensuring seamless integration and optimal performance for data and AI use cases.
  • AI Development Experience: Familiarity with leveraging AI-assisted development tools (e.g., GitHub Copilot, Cursor, or similar) to accelerate coding, debugging, and solution design within data engineering workflows.

About the company

At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.

Apply for this position