Data Engineer - AI/Cloud Specialist
Job description
We are seeking an Application Programmer III to join our Product Master Environment team, the firm-wide provider of product and pricing reference data serving 400+ applications across multiple lines of business. The ideal candidate will have strong experience in modern data engineering across AWS, GCP, and Azure; expert skills in Python and SQL; and a proven ability to design, build, and operate scalable ETL/ELT pipelines using Snowflake and Databricks to deliver reliable, secure, and cost-efficient data products.
Responsibilities:
- Deliver technical solutions across all phases of the SDLC for data platforms.
- Design, implement, and optimize data ingestion, transformation, and orchestration workflows.
- Build and operate production-grade ETL/ELT pipelines across AWS, GCP, and Azure.
- Collaborate with data scientists, platform engineers, and product teams to deliver data products.
- Own application design, development, delivery, and post-production support.
- Enhance the existing application stack and manage key reference data applications.
Requirements
- 5-8+ years in data engineering or software engineering with hands-on pipeline development.
- Production experience with at least one cloud: AWS, GCP, or Azure.
- Strong proficiency in Python and SQL; Scala is a plus.
- Deep experience with Snowflake, including ELT, performance tuning, and security.
- Experience with Databricks and Spark, including Delta Lake and Structured Streaming.
- Experience integrating and modeling data from MongoDB for operational and analytics use cases.
- Knowledge of orchestration tools such as Airflow, ADF, or Cloud Composer, plus CI/CD, Git, and Terraform.
- Understanding of data modeling, distributed systems, file formats, and performance optimization.
- History of shipping production pipelines with monitoring, alerting, and documentation.
Preferred Skills:
- Streaming ingestion with Kafka, Kinesis, or Event Hubs.
- dbt for transformation and testing in Snowflake or Databricks.
- Security practices: encryption, KMS or Key Vault, tokenization, and network isolation.
- Cost governance and FinOps for data platforms.
- Exposure to ML feature pipelines and feature stores such as Databricks Feature Store.
- Bachelor's or Master's in Computer Science, Engineering, or a related field, or equivalent experience.
Benefits & conditions
- Competitive compensation and benefits.
- Opportunities for growth with global clients.
- A supportive, inclusive culture that values innovation and people.
- Exposure to cutting-edge technologies and projects.
About Our Commitment
BCforward is an equal opportunity employer. We value diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, or veteran status.