Senior Data Engineer

Piper Maddox
Charing Cross, United Kingdom
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Charing Cross, United Kingdom

Tech stack

Amazon Web Services (AWS)
Azure
Continuous Integration
ETL
Data Warehousing
Distributed Systems
Python
SQL Databases
Git
Data Lake
Data Management
Data Pipelines
Databricks

Job description

An established renewable energy and digital solutions business is expanding its Asset Performance Management (APM) technology team and is hiring an experienced Data Engineer to support large-scale operational renewable assets.

This role sits within a product-focused engineering group responsible for building and scaling data platforms used to monitor, optimise, and improve the performance of wind, solar, and energy storage assets globally.

You will work closely with software engineers, data scientists, and platform teams to design and operate high-quality data pipelines that directly underpin operational decision-making and analytics for live energy assets.

Responsibilities

  • Design, build, and maintain scalable data pipelines using Databricks (including Delta Live Tables).

  • Develop robust ETL/ELT workflows ingesting data from operational, telemetry, and third-party systems.
  • Optimise pipeline performance, reliability, and cost efficiency in cloud environments.
  • Ensure data quality, lineage, governance, and documentation across production systems.
  • Collaborate cross-functionally with analytics, product, and platform teams.
  • Support CI/CD automation for data pipeline deployment.
  • Contribute to reusable frameworks and engineering best practices within the team.

Requirements

Candidates must have prior, hands-on experience working with at least one of the following APM platforms:

  • Power Factors
  • Bazefield
  • GPM

This experience is critical, as the role involves working directly with data models, integrations, and operational outputs from these platforms.

Technical requirements

  • Proven experience as a Data Engineer in production environments.
  • Strong Python and SQL skills.
  • Hands-on Databricks experience (DLT, Delta Lake; Unity Catalog desirable).
  • Solid understanding of data modelling, data warehousing, and distributed systems.
  • Experience with cloud data platforms (Azure preferred; AWS or GCP acceptable).
  • Familiarity with Git-based workflows and CI/CD pipelines.
  • Exposure to analytics or ML-driven use cases is beneficial.

Nice to have

  • Databricks certifications (Associate or Professional).
  • Experience supporting asset-heavy or industrial environments.
  • Background in energy, utilities, or infrastructure data platforms.

Why this role

  • Work on live, utility-scale renewable assets rather than abstract datasets.
  • High-impact role within a mature but fast-evolving digital platform.
  • Strong engineering culture with real ownership and technical influence.
  • Long-term stability combined with ongoing platform growth and investment.
