Senior Data Engineer
Pgmbm Law Ltd
2 days ago
Role details
- Contract type: Permanent contract
- Employment type: Full-time (> 32 hours)
- Working hours: Regular working hours
- Languages: English
- Experience level: Senior
- Job location:
Tech stack
Unity
API
Artificial Intelligence
Azure
Content Analysis
Data Architecture
Information Engineering
Data Governance
Data Infrastructure
Python
Standard SQL
Salesforce
Search Technologies
Privacy Controls
Large Language Models
Spark
Data Lake
AI Platforms
Machine Learning Operations
Key Vault
Databricks
Job description
You will architect and build the data foundations that power litigation supporting millions of clients across the world, create AI services that augment our lawyers and analysts, and deploy intelligent tooling directly into client-facing journeys. The impact is real, the scope is huge, and the team is exceptional.
Data Platform Engineering
- Own the design and build of our Databricks Lakehouse architecture: Delta Lake, Unity Catalog, and more.
- Build and maintain production-grade pipelines ingesting data from Litify/Salesforce, client submissions, and third-party sources.
- Champion medallion architecture (bronze/silver/gold) and enforce data quality at every layer.
- Optimise for performance and cost.
AI & Machine Learning Engineering
- Design and deploy internal AI services: RAG systems, LLM-powered document analysis, and intelligent data extraction pipelines.
- Build and integrate AI features into client-facing products.
- Work with data scientists and legal SMEs to translate complex quantification models into scalable, production-ready services.
- Stay ahead of the curve on the AI/ML tooling landscape (LangChain, MLflow, Databricks Model Serving, Vector Search).
Data Governance & Quality
- Implement and enforce data governance through Unity Catalog: lineage, access control, classification, and more.
- Define and maintain data quality frameworks.
- Ensure compliance with GDPR and relevant data privacy regulations across all pipelines and services.
Collaboration & Leadership
- Partner tightly with data colleagues, analysts, legal operations, and product teams to turn requirements into reality.
- Contribute to architectural decisions in Data and in Tech.
- Mentor junior engineers and help raise the bar across the team.
- Document what you build.
Requirements
- A drive to understand our business inside out: processes, history, systems… not just the data!
- 5+ years in data engineering, with at least 2 years hands-on with Databricks (Delta Lake, Spark, notebooks, workflows).
- Strong Python.
- Solid SQL skills and deep understanding of data modelling for analytics workloads.
- Azure fluency: Data Factory, Azure Data Lake Storage, Key Vault, and ideally some Synapse/Fabric exposure.
- Real understanding of data governance, security, and privacy by design.
- A bias for action.
Strong Advantages
- Experience with Databricks Unity Catalog, MLflow, or Model Serving.
- Experience building and shipping AI/ML-adjacent services: LLM integration, vector stores, embedding pipelines, or similar.
- Exposure to legal tech, case management systems (Salesforce/Litify a bonus), or heavily regulated industries.
- Knowledge of RAG architectures, LangChain/LlamaIndex, or OpenAI/Azure OpenAI APIs.
- Experience designing client-facing data products or APIs.
- Background in financial quantification, insurance, or litigation analytics.
Benefits & conditions
- 25 days' annual leave (plus 8 Bank Holidays)
- Private medical insurance
- Private pension scheme
- Life assurance
- Enhanced maternity and paternity leave
- Employee assistance programme
- Employee referral bonus
- E-bikes and gym discounts (available through salary sacrifice scheme)
- Season ticket loans