Senior Algorithm Engineer (Python/Spark-Distributed Processing)
Xcede
Charing Cross, United Kingdom
Role details
Contract type: Permanent contract
Employment type: Part-time (≤ 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Remote; Charing Cross, United Kingdom
Tech stack
Amazon Web Services (AWS)
Business Logic
Azure
Cloud Computing
Distributed Data Store
Distributed Systems
Python
NumPy
Performance Tuning
SQL Databases
Data Processing
Spark
Pandas
PySpark
Dask
Databricks
Microservices
Job description
- Design, build and deploy algorithms/data models supporting pricing, forecasting and optimisation use cases in production
- Develop and optimise distributed Spark/PySpark batch pipelines for large-scale data processing (a minimal sketch follows this list)
- Write production-grade Python workflows implementing complex, explainable business logic
- Work with Databricks for job execution, orchestration and optimisation
- Improve pipeline performance, reliability and cost efficiency across high-volume workloads
- Collaborate with engineers and domain specialists to translate requirements into scalable solutions
- Provide senior-level ownership through technical leadership, mentoring and best-practice guidance
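By way of illustration only, here is a minimal sketch of the kind of PySpark batch pipeline these responsibilities describe. Every path, table and column name below is a hypothetical placeholder, not part of the listing:

```python
# Minimal sketch of a PySpark batch pipeline of the kind described above.
# All paths, table names and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_price_features").getOrCreate()

# Read raw usage events (hypothetical Parquet location)
events = spark.read.parquet("s3://example-bucket/raw/usage_events/")

# Explainable business logic: flag peak-hour consumption, aggregate per meter/day
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("is_peak", F.hour("event_ts").between(16, 19))
    .groupBy("meter_id", "event_date")
    .agg(
        F.sum("kwh").alias("total_kwh"),
        F.sum(F.when(F.col("is_peak"), F.col("kwh")).otherwise(0.0)).alias("peak_kwh"),
    )
)

# Partition output by date so downstream reads stay cheap
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_meter_features/"
)
```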
Requirements
- Proven experience delivering production algorithms/data models (forecasting, pricing, optimisation or similar)
- Strong Python proficiency and exposure to the modern data stack (SQL, Pandas/NumPy and PySpark; Dask/Polars/DuckDB a bonus)
- Ability to build, schedule and optimise Spark/PySpark pipelines in Databricks (Jobs/Workflows, performance tuning, production delivery)
- Hands-on experience with distributed systems and scalable data processing (Spark essential)
- Experience working with large-scale/high-frequency datasets (IoT/telemetry, smart meter, weather, time-series); see the window-function sketch after this list
- Clear communicator able to influence design decisions, align stakeholders and operate autonomously
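As a rough illustration of the time-series requirement above, a hedged sketch using a PySpark range window over hypothetical meter telemetry (all names are placeholders, not the employer's schema):

```python
# Hypothetical sketch: rolling statistics over high-frequency meter telemetry,
# the kind of time-series workload mentioned above. Names are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("meter_rolling_stats").getOrCreate()

readings = spark.read.parquet("s3://example-bucket/raw/meter_readings/")

# 1-hour trailing window per meter, ordered by epoch seconds
w = (
    Window.partitionBy("meter_id")
    .orderBy(F.col("event_ts").cast("long"))
    .rangeBetween(-3600, 0)
)

features = readings.withColumn("kwh_1h_avg", F.avg("kwh").over(w))
features.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/meter_features/"
)
```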
Nice to have
- Energy/utilities domain exposure
- Cloud ownership experience (AWS preferred, Azure also relevant)
- Experience defining microservices / modular components supporting data products (a minimal sketch follows)
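For the microservices bullet, a minimal, hypothetical sketch of a modular component exposing a data product; the endpoint, schema and stub response are assumptions for illustration, not the employer's design:

```python
# Hypothetical sketch of a small modular service exposing a data product.
# The endpoint, request schema and stub response are placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ForecastRequest(BaseModel):
    meter_id: str
    horizon_hours: int = 24

@app.post("/forecast")
def forecast(req: ForecastRequest) -> dict:
    # A real implementation would load a trained model; this returns a stub.
    return {"meter_id": req.meter_id, "forecast_kwh": [0.0] * req.horizon_hours}
```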