Senior Data Engineer
Job description
AECOM is seeking an experienced Senior Data Engineer to play a key role in designing, delivering, and optimising data platforms and solutions across a wide range of projects.
As a Senior Data Engineer, you will be responsible for leading major components of the data solution lifecycle, mentoring junior engineers, and ensuring the delivery of robust, scalable, secure, and value-driven data architecture. You will work closely with Data Analysts, Data Scientists, and cross-functional digital teams, supporting analytics use cases and occasionally contributing to light data-science activities such as feature engineering, exploratory analysis, or model operationalisation.
Role responsibilities:
Technical Leadership & Solution Delivery
- Lead concepts through our solution development lifecycle, applying standard principles such as optimisation and scalability.
- Oversee end-to-end data processes such as ingestion, transformation, modelling, and integration across multiple external-facing projects.
Collaboration & Stakeholder Engagement
- Facilitate technical workshops and requirements gathering sessions with stakeholders across the organisation and external clients.
- Collaborate with cross-functional data teams to translate client strategic and business requirements into technical specifications.
- Work closely with Data Analysts and Data Scientists on analytical projects, supporting work such as feature engineering and big-data analysis activities.
- Collaborate with project managers, architects, and technical teams to ensure seamless integration of data solutions within wider digital ecosystems.
Quality, Governance & Operational Excellence
- Promote and lead the adoption of data engineering best practices, including code quality, testing, CI/CD, and documentation standards.
- Lead the implementation of data governance controls, including metadata management, access controls, data lineage, PII protection, and compliance with organisational and regulatory requirements.
- Develop monitoring and alerting strategies for data solutions, maintaining high availability, performance, and reliability.
- Troubleshoot complex issues across infrastructure, data solutions, and custom analytical products.
Innovation, Prototyping & Continuous Improvement
- Lead prototyping and proof-of-concept efforts to evaluate emerging technologies and their value within AECOM's data ecosystem.
- Support the operationalisation and deployment of predictive models and analytics solutions.
- Continuously explore new cloud capabilities, data platforms, and modern data stack tools to drive innovation within the team.
Mentoring & Team Development
- Provide technical guidance, code reviews, and coaching to junior and mid-level data engineers.
- Champion knowledge-sharing, standardisation, and collaborative team practices.
Requirements
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent professional experience).
- Professional experience designing and delivering cloud-based data engineering solutions at scale.
- Advanced proficiency in at least one programming language commonly used in data engineering (Python preferred; Scala, Java, or C# also beneficial).
- Strong SQL skills and deep understanding of relational databases, non-relational stores, and data warehouse principles.
- Solid experience with data modelling methodologies (dimensional modelling, star/snowflake schemas, data vault, etc.).
- Strong grounding in analytical workflows and support for data-science activities (feature engineering, data preparation, exploratory analysis).
- Experience designing and operating ETL/ELT pipelines and modern workflow orchestration tools (e.g., Apache Airflow, Azure Data Factory, Azure Functions).
- Practical experience with CI/CD, version control (Git), testing frameworks, and DevOps practices.
- Understanding of APIs, REST principles, and data integration patterns.
- Experience implementing data quality, validation, and observability frameworks.
Preferred Qualifications
- Master's degree in Computer Science, Engineering, Mathematics, or related discipline.
- Professional certifications in cloud platforms (AWS, Azure, or GCP).
- Experience supporting or operationalising machine-learning models (e.g., model deployments, monitoring, ML pipelines).
- Exposure to advanced analytics frameworks (e.g., scikit-learn, MLflow, Databricks Runtime).
- Proficiency in containerisation and IaC (Docker, Kubernetes, Terraform, Bicep).
Soft Skills
- Excellent communication skills with the ability to simplify technical concepts for non-technical audiences.
- Strong analytical mindset, with the ability to identify issues, propose solutions, and make architecture recommendations.
- Ability to lead, mentor, and influence teams while remaining hands-on.
- A proactive, experimental, and continuous learning approach to emerging technologies.
- Strong organisational skills and the ability to manage multiple tasks in parallel.