Data Engineer
Job description
ARAG UK Group have an exciting opportunity to join our high-performing Digital Services team as a Data Engineer in our Bristol office. As a Data Engineer you will support the Data Engineering Manager with the design and implementation of the enterprise data lakehouse, data movement, and data models to meet the needs of the business. You will ensure the platform is trusted, secure, scalable, performant and fit for purpose, as well as:
- Leveraging contemporary technologies and data paradigms as appropriate
- Supporting a single analytical view of our data and information
- Implementing solutions against agreed, defined metrics
- Delivering business value by supporting action-oriented insights
- Ensuring it is built in line with our information management strategy and guiding principles.
- Ensuring "explainability" fit for audit in a regulated environment through appropriate data lineage and documentation.
In our collaborative environment, and as part of the data engineering team, you will work closely with many areas of the business, particularly Finance, our Reporting & Analytics team and our Data Platform teams. You will take responsibility and accountability for the collaborative design and build of data solutions that provide a secure, dependable and well-performing platform for information and analytical purposes.
Requirements
Good communication skills: you will confidently address a wide range of business areas with varied levels of technical understanding, adapting your delivery to convey difficult technical problems and solutions to non-technical colleagues. In addition, you will have:
- Experience implementing cloud-centric data solutions on platforms such as Azure, with the ability and desire to pick up new technologies, tools and paradigms and develop them further.
- Comfort building and administering data models, ensuring they are accessible and used appropriately, leveraging platforms such as Databricks, dbt and Power BI.
- Knowledge of automating deployments using a combination of tools such as PowerShell and Azure DevOps; the ability to use Python both to manipulate data and to build additional processes supporting data processing activities is fundamental.
- A passion for data and information, with a strong understanding of data architecture principles and information "storytelling" to maximise the value of our raw data.
- Familiarity with data processing paradigms such as ETL/ELT, Kimball and the Medallion data lakehouse architecture.