AWS Data Engineer (contract)

Opus Recruitment Solutions
Charing Cross, United Kingdom
2 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
£148K

Job location

Charing Cross, United Kingdom

Tech stack

Microsoft Excel
Airflow
Amazon Web Services (AWS)
Azure
Spreadsheets
Information Engineering
Data Governance
ETL
Data Warehousing
Disaster Recovery
Python
PostgreSQL
Software Engineering
SQL Databases
Cloud Platform System
Spark
Database Performance
Indexer
Data Lake
Databricks

Job description

Opus has partnered with a London financial services company seeking a Senior AWS Data Engineer with a background in application development to help architect and deliver a new PostgreSQL-driven data platform. The platform will support core functions across investment reporting, risk analysis, regulatory output, and performance measurement.

This is a high-impact greenfield build. You'll take full ownership of the platform's design and lead the transition away from Databricks and Excel-centric processes, shaping critical infrastructure that underpins the organisation's investment operations.

What You'll Work On

You'll be responsible for designing and developing a scalable, secure PostgreSQL environment capable of supporting:

Portfolio valuation and holdings data
Performance and attribution reporting
Risk and analytics outputs
Regulatory and trustee disclosures
Data governance and operational controls

You'll also oversee the migration of structured datasets from Databricks (Delta Lake/Spark) and replace manual Excel workflows with automated, well-governed pipelines that meet audit and regulatory standards.

Build and maintain a secure PostgreSQL-based data platform
Lead the shift away from Databricks and spreadsheet-dependent reporting
Create dimensional data models covering investments, pricing, performance and related domains
Develop reliable ETL/ELT processes in Python
Implement data quality, reconciliation and validation controls
Optimise database performance for analytical and reporting workloads
Ensure compliance with FCA and TPR regulatory guidelines
Set up access controls, security standards and permissioning
Establish monitoring, backup and disaster recovery solutions
Collaborate closely with investment, risk and finance teams

Requirements

5+ years' experience in data engineering, application development or data platform roles
Strong PostgreSQL knowledge, including indexing, optimisation and partitioning
Background in financial or investment data environments
Advanced SQL and Python skills
Experience migrating data from platforms like Databricks
Confident with dimensional modelling, star schemas, SCDs etc.
Experience within regulated financial services

Desirable

Experience in pensions, asset management or institutional investing
Understanding of performance measurement and attribution
Exposure to Airflow or dbt
Cloud platform familiarity (Azure or AWS)
Knowledge of data governance best practice

About You

Highly detail-driven with a strong approach to data quality
Comfortable operating in tightly regulated sectors
Able to explain technical concepts clearly to non-technical audiences
Practical, proactive and solution-focused
