Data DevOps Engineer

Addition Solutions Ltd
Watford, United Kingdom
18 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£85K

Job location

Watford, United Kingdom

Tech stack

Amazon Web Services (AWS)
JIRA
Cloud Computing
Cloud Engineering
Continuous Integration
Data as a Service
Information Engineering
ETL
Data Warehousing
DevOps
Distributed Systems
GitHub
Python
Release Management
Power BI
Prometheus
SQL Databases
Datadog
Data Logging
Data Processing
Grafana
Spark
Kubernetes
Cloud Integration
CloudWatch
Terraform
Software Version Control
Data Pipelines
Docker
Jenkins

Job description

  • Owning cloud integration across AWS for BI workloads, ensuring infrastructure is consistent, secure, and scalable.
  • This is a brand-new role, introduced to bridge the gap between DevOps and the data departments.
  • Building and maintaining CI/CD pipelines that support ETL and reporting releases.
  • Managing code promotion processes, version control standards, and Jira integrations.
  • Overseeing non-production environments to ensure data freshness, alignment, and smooth testing.
  • Orchestrating data provisioning, refreshes, and automated workflows for analytics teams.
  • Optimising ETL and Power BI code to improve performance, efficiency, and reliability.
  • Implementing observability and logging frameworks to monitor data services and deployments.
  • Partnering with engineering, data, and reporting teams to coordinate releases and resolve technical challenges.
  • Embedding security, governance, and compliance practices across AWS environments.
  • Monitoring performance and cost usage, recommending improvements and efficiencies.
  • Driving continuous improvement across pipelines, tooling, automation, and release processes.

Requirements

  • Strong hands-on AWS experience across Redshift, S3, EMR, Lambda and infrastructure-as-code.
  • Demonstrated ability to align with DevOps while maintaining distinct responsibilities within the BI function.
  • Proficient with CI/CD tooling such as Jenkins or GitHub Actions.
  • Advanced Python scripting and solid SQL capability.
  • Experience with large-scale data processing (Spark/EMR) and data warehousing concepts.
  • Knowledge of Docker/Kubernetes and containerised deployment workflows.
  • Familiarity with Jira integrations, release management, and environment refresh processes.
  • Skilled in optimising ETL pipelines, Power BI models, DAX, and refresh strategies.
  • Strong troubleshooting skills across cloud, data pipelines, and distributed systems.
  • Experience with observability tools such as CloudWatch, Datadog, Prometheus or Grafana.
  • Comfortable working in agile environments with multiple concurrent release cycles.

What's in It for You:

  • The chance to work on a major national-scale transformation with significant technical scope.
  • A forward-thinking environment that embraces automation, innovation, and modern tooling.
  • Supportive teams, strong cross-functional collaboration, and room to influence best practice.
  • Career development across cloud engineering, DevOps, data engineering, and BI.
  • Inclusive culture where your contribution directly supports meaningful social impact.

Apply for this position