Senior Data Engineer

Equifax
Clayton, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Clayton, United States of America

Tech stack

Agile Methodologies
Artificial Intelligence
Airflow
Cloud Computing
Code Review
Data Security
Data Flow Control
JSON
Python
SQL Databases
Tableau
Scripting (Bash/Python/Go/Ruby)
Google Cloud Platform
Data Strategy
Git
Data Pipelines
Programming Languages

Job description

The Senior Data Engineer is responsible for architecting, implementing, and maintaining scalable data pipelines while ensuring data quality, security, and accessibility. Fulfilling this mission will enable the Finance team to transform raw information into a strategic asset, driving competitive advantage through advanced analytics and cloud-native solutions.

This role requires a visionary leader capable of synthesizing complex business strategies into a technical roadmap. You will spearhead the Finance Business Intelligence team's data strategy, balancing the delivery of immediate, high-impact results with long-term architectural integrity. As a mentor and technical authority, you will oversee the automation of complex workflows and lead critical security and governance initiatives to protect and optimize our data ecosystem.

This role requires being in the office three days per week, Tuesday through Thursday. This position does not offer immigration sponsorship (current or future), including F-1 STEM OPT extension support.

What you'll do:

  • Strategize & Architect: Develop a cohesive data strategy aligned with business unit objectives, designing scalable and secure pipelines on GCP.
  • Drive Simplification and Efficiency: Create and execute a strategic roadmap for simplifying existing pipelines, ensuring high performance and cost optimization.
  • Lead Agile Delivery: Champion Agile methodologies and iterative product development, collaborating with cross-functional squads to ensure high-velocity deployment, continuous improvement, and alignment with evolving business priorities.
  • Automate & Optimize: Streamline data workflows using Python, JSON, Composer, and Airflow, while proactively monitoring pipelines to resolve performance bottlenecks.
  • Lead Governance & Security: Implement best-in-class data security, access control, and lineage tracking, serving as the primary architect for security initiatives.
  • Innovate with AI/ML: Develop and deploy advanced solutions, including predictive models, recommendation engines, and agentic workflows, through individual contribution and cross-team collaboration.
  • Ensure Data Excellence: Lead data quality initiatives by designing comprehensive monitoring dashboards and maintaining rigorous data models.
  • Mentor & Empower: Guide junior data engineers, fostering a high-performing culture through knowledge sharing, code reviews, and Git best practices.

Requirements

  • BS degree in a STEM major or equivalent discipline; Master's degree strongly preferred
  • 8+ years of experience as a data engineer or related role, with experience demonstrating leadership capabilities
  • Cloud certification strongly preferred
  • Expert-level proficiency in programming languages such as Python and SQL, and advanced experience with scripting languages
  • Demonstrated proficiency across Google Cloud services
  • Experience building and maintaining complex data pipelines, troubleshooting complex issues, and transforming and loading data so it is usable for downstream projects; proficiency in Airflow strongly desired
  • Experience designing and implementing complex data models and applying advanced optimizations to improve performance
  • Team leadership experience and Git expertise strongly preferred

What could set you apart:

  • Financial Planning & Analysis experience
  • Google Cloud Professional certifications (e.g., Professional Data Engineer, Professional Cloud Architect)
  • Tableau certifications
  • Familiarity with data pipelines using Composer or Dataproc, or real-time processing with Cloud Dataflow
  • Proven experience in building and scaling AI/ML-driven products or agentic workflows

About the company

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
