Data Engineer
Tenth Revolution Group
Charing Cross, United Kingdom
4 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Compensation: £62K
Job location: Charing Cross, United Kingdom
Tech stack
Artificial Intelligence
Amazon Web Services (AWS)
Azure
Cloud Computing
Information Engineering
ETL
Data Warehousing
Identity and Access Management
Python
Microsoft SQL Server
Power BI
DataOps
SQL Databases
T-SQL
Data Logging
Data Processing
Data Ingestion
Azure Data Lake
Data Lineage
Deployment Automation
Data Pipelines
Databricks
Job description
In this role, you will:
- Build and maintain scalable data workflows that connect various internal platforms, such as customer management, HR, and financial systems.
- Automate data ingestion and transformation processes using contemporary cloud-based technologies (see the illustrative sketch after this list).
- Apply rigorous data quality checks and cleansing procedures to maintain integrity across datasets.
- Work closely with architecture teams to define unified data representations across business areas and develop comprehensive datasets tailored for reporting and advanced analytics.
- Harmonize performance indicators to enable consistent insights across departments.
- Ensure data workflows are modular, version-controlled, and integrated with automated deployment pipelines.
- Enhance the efficiency of data processing across both scheduled and real-time operations.
- Implement systems for tracking data lineage, tagging, and operational logging.
- Prepare structured datasets to support predictive modeling and other intelligent applications.
- Empower business users with reliable, self-service data assets and partner with analytics teams to transition prototypes into scalable solutions.
- Safeguard sensitive information in line with regulatory standards and enforce robust access management, encryption, and data lifecycle policies within data operations.
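For illustration only, here is a minimal sketch of the kind of ingestion, cleansing, and logging step described above, assuming a Databricks/PySpark environment; the paths, table, and column names are hypothetical and not taken from this posting.

```python
# Illustrative sketch only: ingest, cleanse, and log a simple dataset.
# Assumes a Databricks/PySpark environment; all names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hr-ingest-sketch").getOrCreate()

# Ingest raw HR records from a hypothetical landing zone in the data lake.
raw = spark.read.format("json").load("/mnt/landing/hr/employees/")

# Basic cleansing and quality checks: drop duplicates, reject rows missing the key.
clean = (
    raw.dropDuplicates(["employee_id"])
       .filter(F.col("employee_id").isNotNull())
       .withColumn("ingested_at", F.current_timestamp())
)

# Operational logging of simple data-quality metrics.
raw_count, clean_count = raw.count(), clean.count()
print(f"Ingested {clean_count} rows; rejected {raw_count - clean_count} failing checks")

# Persist to a curated Delta table (hypothetical name) for reporting and analytics.
clean.write.format("delta").mode("append").saveAsTable("curated.hr_employees")
```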
Technologies:
- Azure
- Cloud
- Data Warehouse
- Databricks
- ETL
- Support
- Python
- SQL
- AI
- AWS
- Power BI
- Security
Requirements
- We are looking for a candidate with extensive experience implementing solutions with Databricks, Azure Data Factory, SQL, and Python. A strong understanding of Microsoft SQL Server and data modeling is essential, along with knowledge of SQL, T-SQL, ETL/ELT, and data pipelines. Previous experience in a Data Engineering capacity is highly sought after. Candidates should also have strong hands-on experience with Data Warehouse and Data Lake technologies, preferably on Azure.
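As a purely illustrative example of working with this stack, the sketch below runs a T-SQL upsert from Python against a SQL Server database; the driver, server, credentials, and table names are placeholders rather than details of this role.

```python
# Illustrative ELT upsert step: Python + pyodbc + T-SQL MERGE.
# All connection details and object names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"    # placeholder server
    "DATABASE=analytics;UID=etl_user;PWD=change-me"  # placeholder credentials
)
cursor = conn.cursor()

# Upsert staged rows into a reporting dimension table.
cursor.execute("""
    MERGE dbo.dim_customer AS target
    USING staging.customer AS source
        ON target.customer_id = source.customer_id
    WHEN MATCHED THEN
        UPDATE SET target.name = source.name,
                   target.updated_at = SYSUTCDATETIME()
    WHEN NOT MATCHED THEN
        INSERT (customer_id, name, updated_at)
        VALUES (source.customer_id, source.name, SYSUTCDATETIME());
""")
conn.commit()
conn.close()
```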
Benefits & conditions
We offer a hybrid working model, with three days per week in the office. Our benefits include 28 days of holiday plus bank holidays, private medical healthcare, a pension scheme, and more.