Data Engineer
Job description
The Infrastructure AI and Data program is a cornerstone initiative designed to transform how our Global Business Unit (GBU) manages, governs, and leverages data to drive innovation and operational excellence. As we scale digital capabilities across complex engineering and delivery environments, the ability to harness data effectively is critical to enabling advanced analytics, AI-driven insights, and seamless collaboration across diverse stakeholders.
This role combines technical leadership with hands-on data engineering, focusing on designing and optimizing data pipelines, improving performance, and contributing to data modeling and architecture decisions. The Data Engineer will work with the Data Solutions Architect and functional SMEs to deliver scalable, secure, and high-quality data solutions.
"This position is designated as part-time telework per our global telework policy and may require at least three days of in-person attendance per week at the assigned office or project. Weekly in-person schedules will be determined by the individual and their supervisor, in consultation with functional or project leadership".
Major Responsibilities:
- Develop and maintain complex ETL/ELT pipelines to ingest and transform data from multiple sources into the UDP environment.
- Tune and optimize the performance of data workflows within Azure Databricks.
- Implement advanced data quality checks and validation routines to ensure the integrity and consistency of datasets (a minimal sketch follows this section).
- Collaborate with the Data Solutions Architect and functional SMEs to understand data requirements and deliver solutions aligned with business needs.
- Contribute to data modeling and architecture decisions to support analytics and reporting needs.
- Support integration of structured and semi-structured data into the lakehouse architecture.
- Document processes and contribute to best practices for data engineering within the PIIM team.
- Mentor junior engineers and promote best practices in coding, testing, and documentation.

For decades, Bechtel has worked to inspire the next generation of employees and beyond! Because our teams face some of the world's toughest challenges, we offer robust benefits to ensure our people thrive. Whether it is advancing careers, delivering programs to enhance our culture, or providing time to recharge, Bechtel has the benefits to build a legacy of sustainable growth. Learn more at Bechtel Total Rewards.
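To give a flavor of the pipeline and data quality work described under Major Responsibilities, here is a minimal PySpark sketch of an ELT step with a simple validation gate. The table names (raw.sensor_events, udp.curated_events) and the quality rule are hypothetical illustrations, not details from this posting.

```python
# Minimal, hypothetical ELT step with a data quality gate for Azure
# Databricks. Table names and the validation rule are illustrative
# assumptions, not taken from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest: read the raw layer.
raw = spark.read.table("raw.sensor_events")

# Transform: normalize the timestamp column and drop duplicate events.
curated = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])
)

# Quality gate: fail fast if a required key is missing.
null_ids = curated.filter(F.col("event_id").isNull()).count()
if null_ids > 0:
    raise ValueError(f"Quality check failed: {null_ids} rows with null event_id")

# Load: publish to the curated layer of the lakehouse.
curated.write.mode("overwrite").saveAsTable("udp.curated_events")
```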
Requirements
- Bachelor's degree in computer science or related field.
- Experience in data engineering or a related role.
- Strong proficiency in SQL and advanced experience with Python or Scala for data processing (see the illustrative example after this list).
- Hands-on experience with cloud-based data platforms (Azure preferred).
- Solid understanding of data modeling, ETL/ELT processes, and performance optimization techniques.
- Ability to lead technical discussions and provide guidance on data engineering best practices.
- Ability to troubleshoot and optimize data workflows.
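As a purely illustrative example of the SQL-plus-Python skills listed above, the sketch below deduplicates a staging table with a window function. The table and column names (udp.staged_records, record_id, updated_at) are hypothetical.

```python
# Illustrative Spark SQL deduplication: keep the most recent row per key.
# All table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

latest = spark.sql("""
    SELECT *
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY record_id
                                  ORDER BY updated_at DESC) AS rn
        FROM udp.staged_records
    ) ranked
    WHERE rn = 1
""").drop("rn")

latest.write.mode("overwrite").saveAsTable("udp.deduplicated_records")
```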
Required Knowledge and Skills:
- Experience with Databricks or Spark-based environments, including optimization strategies (a short sketch appears at the end of this posting).
- Familiarity with data governance tools and frameworks (e.g., Unity Catalog).
- Experience with JavaScript full-stack development environments, including MongoDB or similar NoSQL databases.
- Experience with CI/CD pipelines and version control (Git) for data workflows.
- Exposure to EPC (engineering, procurement, and construction) industry data structures and standards.
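The sketch below illustrates the kind of Spark optimization strategies referenced in this list: broadcasting a small dimension table to avoid a shuffle, filtering early on a partition column, and compacting a Delta table with OPTIMIZE and ZORDER (a Delta Lake maintenance command available on Databricks). All names are hypothetical and follow Unity Catalog's catalog.schema.table convention.

```python
# Illustrative Spark optimization techniques; table and column names
# (main.udp.curated_events, main.udp.dim_assets, asset_id, event_date)
# are hypothetical, using Unity Catalog's three-level namespace.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

facts = spark.read.table("main.udp.curated_events")
dims = spark.read.table("main.udp.dim_assets")

# Broadcast the small dimension table so the join avoids a full shuffle.
joined = facts.join(F.broadcast(dims), on="asset_id", how="left")

# Filter on the partition column early so Spark can prune partitions.
recent = joined.filter(F.col("event_date") >= "2024-01-01")
recent.write.mode("overwrite").saveAsTable("main.udp.recent_asset_events")

# Compact small files and co-locate rows for faster reads
# (Delta Lake maintenance on Databricks).
spark.sql("OPTIMIZE main.udp.curated_events ZORDER BY (asset_id)")
```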