Data Engineer

Intercontinental Exchange
27 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Tech stack

Artificial Intelligence
Test Automation
Unit Testing
Continuous Integration
Data Engineering
Data Governance
Data Integration
Data Integrity
Data Transformation
Data Security
Data Systems
Cursor (AI Code Editor)
Software Debugging
Machine Learning
Open Source Technology
Performance Tuning
DataOps
SQL Databases
Feature Engineering
GitHub Copilot
Kubernetes
Deployment Automation
Data Management
Stream Processing
Software Version Control
Data Pipelines

Job description

We're seeking a talented Senior Data Engineer to join our Enterprise Architecture team in a cross-cutting role that will help define and implement our next-generation data platform. In this pivotal position, you'll lead the design and implementation of scalable, self-service data pipelines with a strong emphasis on data quality and governance. This is an opportunity to shape our data engineering practice from the ground up, working directly with key stakeholders to build mission-critical ML and AI data workflows.

We emphasize building systems that are maintainable, scalable, and focused on enabling self-service data access while maintaining high standards for data quality and governance. The ideal candidate is a problem-solver who enjoys working on complex data systems and is passionate about data quality. You thrive in collaborative environments but can also work independently to deliver solutions. You're comfortable working directly with technical and non-technical stakeholders and can communicate complex technical concepts clearly. Most importantly, you're excited about creating systems that empower others to work with data efficiently and confidently.

Responsibilities

  • Design, build, and maintain our on-premises data orchestration platform using best-of-breed open-source tools
  • Create self-service capabilities that empower teams across the organization to build and deploy data pipelines without extensive engineering support
  • Implement robust data quality testing frameworks that ensure data integrity throughout the entire data lifecycle
  • Establish data engineering best practices, including version control, CI/CD for data pipelines, and automated testing
  • Collaborate with ML/AI teams to build scalable feature engineering pipelines that support both batch and real-time data processing
  • Develop reusable patterns for common data integration scenarios that can be leveraged across the organization
  • Work closely with infrastructure teams to optimize our Kubernetes-based data platform for performance and reliability
  • Mentor junior engineers and advocate for engineering excellence in data practices

Requirements

  • 5+ years of professional experience in data engineering, with at least 2 years working on enterprise-scale data platforms
  • Demonstrated experience with orchestrating workflows, performance optimization, and operational management
  • Strong understanding of data transformation techniques, including experience with testing frameworks and deployment strategies
  • Experience with stream processing frameworks and technologies
  • Proficiency with SQL and Python for data transformation and pipeline development
  • Familiarity with containerized application deployment
  • Experience implementing data quality frameworks and automated testing for data pipelines
  • Ability to work cross-functionally with data scientists, ML engineers, and business stakeholders

Preferred Knowledge and Experience

  • Experience with self-hosted data orchestration platforms (rather than managed services)
  • Background in implementing data contracts or schema governance
  • Knowledge of ML/AI data pipeline requirements and feature engineering
  • Experience leveraging AI tools (e.g., GitHub Copilot, Cursor, Claude Code) to debug code, develop unit tests, and generate test cases from requirements documents
  • Experience with real-time data processing and streaming architectures
  • Familiarity with data modeling and warehouse design principles
  • Prior experience in a technical leadership role
