Senior Enterprise Applications Engineer

Information Technology
Atlanta, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Atlanta, United States of America

Tech stack

Adobe InDesign
API
Agile Methodologies
Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Data analysis
Application Lifecycle Management
ARM
Data Architecture
Data Validation
Information Engineering
Data Governance
Data Integration
Data Integrity
ETL
Data Transformation
Data Security
Data Systems
Data Warehousing
Relational Databases
DevOps
GitHub
PostgreSQL
Machine Learning
Meta-Data Management
Microsoft SQL Server
MySQL
Scrum
Regression Testing
Power BI
TensorFlow
SQL Databases
Systems Integration
Tableau
Enterprise Software Applications
Data Storage Technologies
PyTorch
Snowflake
Technical Debt
Git
Scikit Learn
Real Time Data
Kafka
Data Management
Azure
Data Pipelines
Redshift

Job description

Cox Automotive is seeking a Senior Enterprise Applications Engineer to join our Enterprise Finance Technology team. This role is responsible for designing, building, and optimizing the data architecture, pipelines, and enterprise applications that power strategic decision-making across the organization. The Senior Engineer will provide information, insights, and analyses in support of financial consolidation, planning, forecasting, and reporting, combining deep data engineering expertise with strong business intelligence capabilities to ensure data is accessible, trusted, and actionable.

This position interacts with a variety of teams, including Finance, Accounting, FP&A, and cross-functional technology groups, to understand business needs and deliver data-driven solutions. The insights and infrastructure provided by this individual will be used to make informed prioritizations, support standard and ad-hoc reporting, and enable the development of new analytical capabilities. The Senior Engineer is expected to perform these duties with minimal daily oversight while mentoring junior team members, contributing to Agile planning, and staying current on emerging technologies.

Data Engineering, Quality & Pipeline Development

  • Design, build, and maintain scalable data pipelines and ETL/ELT workflows using Snowflake, AWS, dbt, and/or Informatica, integrating enterprise financial applications with data warehouses, relational databases (PostgreSQL, MySQL, SQL Server), and BI platforms.
  • Ensure data integrity by implementing automated quality checks, regression testing, validation frameworks, and anomaly monitoring, working with internal and external data providers to customize data feeds and mappings.

Project Delivery & Solution Design

  • Independently plan, manage, and deliver small to medium-sized projects end-to-end (requirements gathering, solution design, development, testing, deployment, documentation, and post-implementation support), while also contributing technical components to larger cross-functional programs.

Analytics, Reporting & Continuous Improvement

  • Support dashboard and reporting development using Power BI, Tableau, or similar tools to deliver KPI insights, and lead enhancements that improve financial processes, reduce manual work, and increase accuracy.
  • Leverage AI-assisted tools (e.g., Claude, GitHub Copilot, Snowflake Cortex, M365 Copilot) to accelerate development, data validation, and analysis workflows.

Agile Planning & Emerging Technologies

  • Partner with finance and analytics teams to understand day-to-day challenges and design viable data solutions; recommend improvements to processes, technology, and interfaces that reduce technical debt.
  • Stay current on new data technologies, AI, ML, Data Science, CPM platform innovations, and best practices; share insights and contribute to design standards across the organization.

Requirements

  • Bachelor's degree in a related discipline and 4+ years of experience in data engineering or architecture. The right candidate could also have a different combination, such as a master's degree and 2 years' experience; a Ph.D. and up to 1 year of experience; or 16 years' experience in a related field.

  • 4+ years of hands-on experience in data engineering, business intelligence, and/or enterprise analytics across multiple functional areas (reporting, dashboards, data pipelines, and data modeling).
  • Advanced SQL proficiency and experience with data integration tools (e.g., MS SQL Developer, dbt, Informatica).
  • Must have strong working experience with data modeling, data access, schemas, and data storage techniques within Snowflake.
  • Working experience in design, development, and implementation of scalable data pipelines in cloud environments (AWS, Snowflake).
  • Working experience with ETL/ELT patterns, data warehousing concepts, and data orchestration tools.
  • Working experience with relational databases such as Microsoft SQL Server, MySQL, and PostgreSQL.
  • Working experience with business intelligence tools and platforms (Power BI, Tableau, or similar).
  • Working experience with data quality tools.
  • Working experience with application lifecycle methodologies (e.g., waterfall, agile, iterative).
  • Demonstrated project management experience with complex system implementations.
  • Experience working with Git.
  • Experience working with AI related tools such as GitHub Copilot, Snowflake Cortex, or M365 Copilot.
  • Experience implementing new tools into environments.
  • Excellent analytical, problem-solving, and communication skills with the ability to present technical concepts to non-technical stakeholders.
  • Experience with AI/ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) for data transformation and predictive modeling.
  • Familiarity with data orchestration tools such as Apache Airflow, dbt, or Dagster.
  • Hands-on experience with cloud-native data platforms (e.g., Snowflake, AWS Redshift, Azure Synapse).
  • Knowledge of data governance and metadata management best practices.
  • Experience integrating external data sources and APIs into enterprise data ecosystems.
  • Strong understanding of CI/CD pipelines and DevOps practices for data engineering.
  • Ability to work in Agile environments and contribute to sprint planning and backlog grooming.
  • Exposure to real-time data streaming technologies (e.g., Kafka, Kinesis) is a plus.

About the company

Cox empowers employees to build a better future and has been doing so for over 120 years. With exciting investments and innovations across transportation, communications, cleantech and healthcare, our family of businesses - which includes Cox Automotive and Cox Communications - is forging a better future for us all. Ready to make your mark? Join us today!
