Data Engineer
Job description
MS Companies is partnering with a confidential, industry-leading organization in the infrastructure and construction materials space to identify a mid-level Data Engineer to support the development of a modern, cloud-based data platform.
This role is ideal for a hands-on builder who thrives in Databricks and Azure environments and enjoys transforming raw data into scalable pipelines and meaningful business insights. You'll work cross-functionally with finance, operations, and technical teams to build and optimize data systems that directly impact decision-making and long-term data strategy, including future AI/ML initiatives.
Responsibilities
- Design, build, and maintain scalable data pipelines in a cloud environment
- Develop and optimize Databricks clusters and workflows
- Build and manage data storage solutions (e.g., Delta Lake, cloud warehouse)
- Integrate data platforms with Azure services and enterprise systems
- Support implementation of a modern cloud data warehouse and governance structure
- Develop and maintain CI/CD pipelines for data workflows
- Monitor and optimize performance, cost, and reliability of data processes
- Partner with business stakeholders to deliver data-driven insights and solutions
Physical and Environmental Requirements
This role operates in a professional, hybrid office environment.
Responsibilities may include extended periods working at a computer, participating in collaborative meetings, and onsite engagement Tuesday through Thursday.
Requirements
- 3+ years of experience in Data Engineering or Cloud Data Engineering
- Hands-on experience with Databricks and/or Snowflake
- Strong proficiency in Python and SQL
- Experience building and maintaining ETL/ELT pipelines
- Experience working in a cloud environment (Azure preferred)
- Strong understanding of data modeling, schema design, and optimization
- Exposure to CI/CD pipelines and workflow automation
- Bachelor's degree in Computer Science, Engineering, or related field
- Must be authorized to work in the U.S. without sponsorship
Preferred Skills & Qualifications
- Experience with Azure Data Factory, ADLS, or Azure DevOps
- Familiarity with Delta Lake architecture and Databricks optimization
- Experience with PySpark, Pandas, or similar frameworks
- Working knowledge of Linux, Bash, or Docker
- Experience with workflow orchestration tools (ADF, Databricks Jobs, etc.)
- Exposure to Power BI, Tableau, or data visualization tools
- Experience with ERP, financial, supply chain, or operations data
- Exposure to predictive analytics, NLP, or AI/ML initiatives
Benefits & Conditions
- Competitive salary with performance-based incentives
- Comprehensive health, dental, and vision coverage
- 401(k) with company match
- Hybrid work flexibility
- Opportunity to work on modern cloud data architecture
- Strong cross-functional exposure across finance and operations
- Long-term growth within a stable, growing organization