Data Engineer - Business Automation & Analytics
Job description
Xylem is seeking a Data Engineer - Business Automation & Analytics to join our global team. This role builds and supports data and analytics platforms that power Finance, Operations, and Customer tools across the organization.
NOTE: This is a SQL-first role focused on data transformation and analysis within relational databases, particularly ERP and finance systems. It is not a Spark- or cloud-first data engineering position. The majority of day-to-day work is performed in SQL, designing and optimizing relational data models that serve analytics and reporting needs.
The Data Engineer is responsible for the design, support, and management of critical internal data platform applications. You'll design, develop, and maintain scalable data pipelines, dashboards, and automations that enable smarter, data-driven decisions in support of Xylem's mission.
- Design and build SQL-centric, relational data pipelines that aggregate and clean data from multiple sources (DB2, Dynamics AX, Progress OpenEdge, Oracle, Salesforce, etc.)
- Develop, optimize, and maintain complex SQL-based relational models in PostgreSQL
- Orchestrate and automate SQL-driven transformations using Airflow and Python
- Deploy and maintain analytics infrastructure across Dev / Test / Prod Linux environments
- Design and publish Power BI datasets and dashboards using DAX and SQL
- Partner with process owners and business leads to define metrics and improve data quality
- Document data flows and automation logic to ensure maintainability and transparency
- Troubleshoot performance and data issues across systems and environments

Join the global Xylem team to be a part of innovative technology solutions transforming water usage, conservation, and re-use. Our products impact public utilities, industrial sectors, residential areas, and commercial buildings, with a commitment to providing smart metering, network technologies, and advanced analytics for water, electric, and gas utilities. Partner with us in creating a world where water challenges are met with ingenuity and dedication, and where we recognize the power of inclusion and belonging in driving innovation and allowing us to compete more effectively around the world.
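To make the day-to-day work concrete, here is a minimal sketch of the SQL-first pipeline pattern described in the responsibilities above. SQLite stands in for PostgreSQL, and plain Python functions stand in for Airflow tasks; all table names, column names, and sample values are hypothetical.

```python
# A minimal sketch of the SQL-first pipeline pattern: land raw rows,
# then let SQL do the cleaning and aggregation. SQLite stands in for
# PostgreSQL; plain functions stand in for Airflow tasks.
import sqlite3


def extract(conn):
    """Land raw rows as they might arrive from an ERP source."""
    conn.execute(
        "CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, "EMEA", 100.0), (2, "EMEA", 250.0), (3, "AMER", 80.0), (4, "AMER", None)],
    )


def transform(conn):
    """SQL does the heavy lifting: drop null amounts, aggregate per region."""
    conn.execute(
        """
        CREATE TABLE regional_sales AS
        SELECT region, SUM(amount) AS total_amount
        FROM raw_orders
        WHERE amount IS NOT NULL
        GROUP BY region
        """
    )


conn = sqlite3.connect(":memory:")
extract(conn)
transform(conn)
rows = conn.execute(
    "SELECT region, total_amount FROM regional_sales ORDER BY region"
).fetchall()
print(rows)
```

In the actual role, a transform like this would be a SQL statement run against PostgreSQL and scheduled by an Airflow task, but the shape of the work stays the same: cleaning and aggregating inside the database rather than in distributed compute.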
Requirements
Must-haves: heavy SQL/PostgreSQL, Airflow (pipelines/orchestration), Python, ERP/CRM data in finance/ops, and Linux (on-prem familiarity).
- Bachelor's degree in Data Engineering, Business Information Systems, or a related field
- 3+ years of hands-on experience building and maintaining data pipelines or ETL jobs
- Advanced SQL proficiency with hands-on experience transforming data in relational databases (PostgreSQL strongly preferred; other RDBMS experience also considered)
- Familiarity with Linux / Red Hat environments and version control (Git)
- Understanding of ERP, CRM, and MRP data models (Dynamics, Salesforce, etc.)
- Experience with Power BI or similar BI tools (DAX, data modeling)
- Solid communication, documentation, and follow-through
WHAT SUCCESS LOOKS LIKE
You are comfortable spending most of your time writing and optimizing SQL against relational databases, transforming complex ERP and finance data into clean, trusted datasets. You think in terms of relational modeling, query performance, and data quality, not distributed compute frameworks or cloud-native ingestion patterns.
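As a toy illustration of the query-performance mindset described above, the sketch below compares the plan for a filtered aggregate before and after adding an index. SQLite's EXPLAIN QUERY PLAN stands in for PostgreSQL's EXPLAIN; the invoices table and index name are hypothetical.

```python
# Toy illustration of query-performance thinking: inspect the plan for a
# filtered aggregate before and after adding an index. SQLite stands in
# for PostgreSQL; the table and index names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE invoices "
    "(invoice_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO invoices (customer_id, amount) VALUES (?, ?)",
    [(i % 50, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM invoices WHERE customer_id = 7"

# Without an index on customer_id, the planner falls back to a full scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_invoices_customer ON invoices (customer_id)")

# With the index in place, the planner searches the index instead.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # plan detail shows a scan of invoices
print(plan_after[0][-1])   # plan detail references idx_invoices_customer
```

PostgreSQL's EXPLAIN (and EXPLAIN ANALYZE) gives the same kind of feedback loop with far more detail, which is the everyday tool for this part of the role.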