Senior Databricks Consultant
Job description
We're supporting a major financial-systems transformation programme and are looking for a Senior Databricks Engineer to help build and enhance a scalable data-engineering framework used across critical reporting processes. You'll operate as the senior hands-on engineer, shaping reusable libraries, optimising PySpark pipelines, and guiding offshore developers to deliver high-quality, production-ready code.
You'll work across ingestion, validation, transformation, and mapping layers within a Databricks-on-Azure environment, helping to establish consistent engineering patterns and ensuring all deliveries meet high standards of performance, traceability, and maintainability.
What you'd be doing:
- Building and extending reusable Databricks/PySpark libraries (ingestion, validation, transformation, mapping).
- Developing scalable, optimised PySpark pipelines aligned to metadata-driven or hybrid data models.
- Implementing robust validation and control logic suitable for regulated financial environments.
- Leading on engineering quality: running code reviews, mentoring offshore teams, and enforcing standards for readability, testing, and performance.
- Supporting integration paths into downstream reporting systems and managing return-feeds.
- Producing clear documentation, runbooks, and developer guides.

This is an urgent role. It is remote with once-monthly visits to the client site, paying up to £400 per day outside IR35.
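To illustrate the kind of reusable, metadata-driven validation logic described above, here is a minimal Python sketch. All names and rule shapes are hypothetical and not taken from the client's actual framework:

```python
from typing import Any

# Hypothetical rule metadata: each entry names a column, a check,
# and a predicate that the column's value must satisfy.
RULES: list[dict[str, Any]] = [
    {"column": "account_id", "check": "not_null",
     "predicate": lambda v: v is not None},
    {"column": "amount", "check": "non_negative",
     "predicate": lambda v: v is not None and v >= 0},
]

def validate(record: dict[str, Any],
             rules: list[dict[str, Any]] = RULES) -> list[str]:
    """Return the failed checks for one record as 'column:check' strings."""
    failures = []
    for rule in rules:
        value = record.get(rule["column"])
        if not rule["predicate"](value):
            failures.append(f"{rule['column']}:{rule['check']}")
    return failures
```

Because the rules are plain data rather than hard-coded branches, the same pattern scales to PySpark (e.g. translating each rule into a column expression) and keeps validation traceable, which matters in regulated reporting environments.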
Requirements
- Strong hands-on experience with Databricks and PySpark (optimisation, partitioning, orchestration).
- Solid Azure data-platform experience.
- Proven ability to build reusable frameworks/libraries with clean abstractions.
- Experience guiding or leading offshore engineering teams.
- Comfortable working with evolving requirements while maintaining strong engineering discipline.
- Bonus: exposure to financial or regulatory reporting, Tagetik, or metadata-driven/EAV models.