Data Engineer

Prosource
Charing Cross, United Kingdom
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Remote
Charing Cross, United Kingdom

Tech stack

Unity
API
Azure
Databases
Data Governance
JSON
Python
Powershell
SQL Databases
T-SQL
Data Lake
PySpark
Infrastructure Automation Frameworks
Deployment Automation
Databricks

Job description

Reporting to the Data Platforms Manager, you will design, develop, and maintain scalable, cost-effective data solutions on the Azure Data Platform. You'll support the creation and management of Gold layer data models following the Medallion Architecture, ensuring alignment with platform best practices and governance standards. Working collaboratively, you'll help review, document, and migrate data from a recently acquired company, while identifying opportunities to automate inefficiencies away and optimise non-compliant solutions.

  • Design and implement scalable data solutions using Azure services including Data Factory, Databricks, and Data Lake Gen2
  • Build and maintain metadata-managed data pipelines that support the business by centralising critical data within a single platform
  • Collaborate with business analysts, data scientists, and business stakeholders to translate requirements into technical solutions
  • Optimise the performance and cost-efficiency of data workflows and adhere to data governance requirements around storage and retention
  • Work with stakeholders and business analysts to identify data quality requirements and develop rules to meet them
  • Use Azure DevOps to manage workload, update tasks, and develop CI/CD pipelines for deployment into other environments

Requirements

  • Strong proficiency in Azure Data Factory, Synapse Analytics, SQL Databases, and Data Lake Gen2
  • Demonstrable understanding of how to apply the Medallion Architecture to an Azure data warehouse
  • Advanced knowledge of CI/CD pipelines, deployment automation, Infrastructure as Code, and work management within Azure DevOps
  • Knowledge of development within Databricks, including PySpark, Delta Lake, Unity Catalog, and notebook development
  • Demonstrable experience in SQL, T-SQL, JSON, Python, and data consumption via APIs, with some understanding of DAX and PowerShell
  • Understanding of and experience with data quality and validation during the data load process
  • Ability to analyse complex data requirements and design efficient data platform solutions
  • Strong problem-solving skills to troubleshoot and resolve database issues
  • Excellent communication skills to collaborate with stakeholders and explain technical concepts to non-technical users
  • Meticulous attention to detail to ensure data accuracy and integrity
  • Ability to manage multiple projects and prioritise tasks effectively
  • Ability to work effectively with other IT professionals and departments
  • Flexibility to adapt to new technologies and changing business requirements

Benefits & conditions

For employees, we're committed to recognising and rewarding hard work. Our competitive salary and benefits package includes: Company Pension Scheme, Private Medical & Dental Insurance, Group Income Protection, Group Life Assurance, Cycle to Work, and Electric Car Salary Sacrifice Scheme. We also invest in your development. If you choose to self-study in your own time, we'll fund your study materials and exam fees - and once you pass, you'll receive an incentive bonus.

Apply for this position