Data Engineer - Newcastle
Accenture
Newcastle upon Tyne, United Kingdom
2 days ago
Role details
- Contract type: Permanent contract
- Employment type: Full-time (> 32 hours)
- Working hours: Regular working hours
- Languages: English
- Experience level: Senior
- Job location: Newcastle upon Tyne, United Kingdom
Tech stack
Java
API
Agile Methodologies
Artificial Intelligence
Amazon Web Services (AWS)
Azure
Google BigQuery
Cloud Computing
Code Review
Computer Programming
Continuous Integration
Data Architecture
Information Engineering
ETL
Data Systems
Software Design Patterns
DevOps
Distributed Systems
GitHub
Python
Software Engineering
Google Cloud Platform
Snowflake
Spark
Git
CloudFormation
Containerization
Kubernetes
Apache Flink
Kafka
Data Management
Terraform
Stream Processing
Data Pipelines
Docker
Jenkins
Databricks
Microservices
Job description
As a Data Engineer, you will design, build, and maintain scalable data solutions that enable analytics, AI, and operational insights. You'll work alongside client and internal teams to create robust data pipelines, ensure data reliability, and support cloud-based architectures that power intelligent decision-making.
Data Pipeline Development
- Build, optimize, and maintain scalable data pipelines using Java (primary), plus exposure to Python, Flink, Kafka, or Spark.
- Develop and support real-time streaming pipelines and event-driven integrations.
- Integrate data from multiple sources (streaming, batch, APIs) using AWS managed services (e.g., Kinesis, MSK, Lambda, Glue).
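To give a concrete flavour of the pipeline work described above, here is a minimal sketch of a transform stage in plain Java. The record shape, field names, and the "id,timestamp,value" input format are assumptions made for illustration only; a real pipeline would read from sources such as Kinesis or Kafka rather than in-memory lists.

```java
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

/** Illustrative pipeline transform stage: parse raw lines into typed events,
 *  applying a basic data-quality gate. Field names are hypothetical. */
public class PipelineStage {

    /** A parsed event; the fields here are assumptions for the example. */
    public record Event(String id, long timestampMillis, double value) {}

    /** Parse raw "id,timestamp,value" lines, dropping malformed records. */
    public static List<Event> transform(List<String> rawLines) {
        return rawLines.stream()
                .map(line -> line.split(","))
                .filter(parts -> parts.length == 3)   // structural quality gate
                .map(parts -> {
                    try {
                        return new Event(parts[0].trim(),
                                Long.parseLong(parts[1].trim()),
                                Double.parseDouble(parts[2].trim()));
                    } catch (NumberFormatException e) {
                        return null;                  // reject bad numerics
                    }
                })
                .filter(Objects::nonNull)
                .collect(Collectors.toList());
    }
}
```

The same map/filter shape carries over to streaming frameworks, where the stage would run continuously over an unbounded source instead of a finite list.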
Data Architecture & Standards
- Contribute to data modelling, data architecture best practices, and modern patterns (e.g., medallion architecture).
- Ensure data quality, lineage, governance, and security controls are applied consistently.
DevOps & Deployment
- Deploy and maintain data applications using CI/CD tooling (Azure DevOps, GitHub Actions, Jenkins).
- Use Infrastructure as Code (e.g., Terraform, CloudFormation) to manage cloud environments.
- Work with container technologies such as Docker and Kubernetes-based workloads.
Collaboration
- Work closely with analytics, ML/AI, and product teams to deliver clean, well-structured datasets.
- Participate in code reviews and internal knowledge-sharing sessions.
- Provide guidance to junior engineers where needed.
Requirements
- Strong programming proficiency in Java (preferred) or Python.
- Hands-on experience with at least one of: Kafka, Flink, Spark (Flink/Kafka preferred for streaming).
- Solid understanding of stream processing concepts (e.g., event time, state, backpressure).
- Understanding of software engineering best practices: testing, design patterns, CI/CD, Git.
- Experience building ETL/ELT or streaming data pipelines.
- Exposure to microservices and distributed system concepts.
- Experience working with cloud platforms, ideally AWS, but Azure/GCP also acceptable.
- Understanding of distributed compute, large-scale data systems, and performance considerations.
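As a small illustration of the stream-processing concepts listed above (event time and operator state), here is a self-contained tumbling-window counter in plain Java. All class and method names are assumptions for this sketch, not part of any framework API; engines like Flink provide this behaviour, plus watermarks and backpressure handling, out of the box.

```java
import java.util.Map;
import java.util.TreeMap;

/** Illustrative event-time tumbling-window counter. Events are bucketed by
 *  the time they occurred (event time), not by when they arrive, so late
 *  arrivals still land in the correct window. */
public class TumblingWindowCounter {

    private final long windowMillis;

    // Window start time -> event count: this map is the operator's "state".
    private final Map<Long, Long> state = new TreeMap<>();

    public TumblingWindowCounter(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    /** Assign an event to its window by event time and update state. */
    public void onEvent(long eventTimeMillis) {
        long windowStart = (eventTimeMillis / windowMillis) * windowMillis;
        state.merge(windowStart, 1L, Long::sum);
    }

    /** Current count for the window starting at the given timestamp. */
    public long countFor(long windowStart) {
        return state.getOrDefault(windowStart, 0L);
    }
}
```

Feeding events with timestamps 100 ms, 900 ms, and 1500 ms into one-second windows puts two events in the [0, 1000) window and one in [1000, 2000), regardless of arrival order.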
DevOps & Engineering Practices
- Experience with CI/CD tools (Azure DevOps, GitHub Actions, Jenkins, etc.).
- Infrastructure-as-Code (Terraform preferred).
- Experience with containerisation (Docker) and orchestration platforms (Kubernetes/EKS).
Certifications & Tools
- Exposure to enterprise data platforms (Databricks, Snowflake, BigQuery, or similar).
- Cloud certifications (AWS, Azure, GCP) are beneficial but not required.
Other Requirements
- Minimum 3 years' experience working on data engineering or large-scale data solutions.
- Comfortable working in Agile delivery teams.
- Strong communication skills and ability to collaborate with technical and non-technical stakeholders.
Desirable
- Experience in client-facing or consulting environments.
- Professional cloud or data engineering certifications.
- Experience mentoring or supporting junior engineers.
- Background in designing or operating real-time, low-latency systems.
About the company
Our Advanced Technology Centre is a hub of innovation where we deliver high-quality data and technology services to clients across both the public and private sectors. You'll join a collaborative culture that values diverse thinking, continuous learning, and opportunities for career growth within a global network of experts.
If you're looking for a dynamic role that offers hands-on experience with modern data technologies and the chance to shape large-scale data solutions, this position gives you the opportunity to develop and progress rapidly.