Analytics Engineer
Job description
As an Analytics Engineer, you will be responsible for designing, developing, and optimising our data infrastructure and analytical tools. You will work closely with data engineers, data analysts, and business stakeholders to ensure that data is reliable, accessible, and actionable. Your role will involve transforming raw data into a more refined form, creating data models, and ensuring the data pipelines are efficient and scalable.
- Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient data pipelines using tools like dbt, SQL, Python, and ETL frameworks.
- Data Modelling: Develop and maintain logical and physical data models that support the company's reporting and analysis needs. Implement best practices for data warehouse design.
- Data Transformation: Transform raw data from various sources into clean, reliable datasets that can be used for analysis and reporting, ensuring data quality and consistency.
- Collaboration with Stakeholders: Work closely with data analysts, data scientists, and business stakeholders to understand their data needs and translate these requirements into technical solutions.
- Performance Optimisation: Optimise existing data processes for performance and scalability, ensuring that data can be processed quickly and efficiently.
Requirements
- You have analytics engineering experience delivering analytical solutions (e.g. in the Databricks stack)
- You have experience with data modelling using tools such as Dataform or dbt
- You have experience orchestrating complex data processing pipelines
- You love building scalable, resilient analytical products
- You seek learning opportunities to deepen your expertise or broaden your knowledge
- You are a data engineer, analytics engineer or software engineer with an interest in the data domain (data modelling, data transformation, etc.), or a highly motivated technical data analyst
- You are comfortable working in an agile development environment that uses Terraform and continuous integration/continuous delivery (CI/CD) best practices, and you have experience with pair programming and deployment strategies
- You have experience designing and building large-scale data pipelines that utilise streaming technologies (e.g. Kafka Streams, Amazon Kinesis or similar), with an emphasis on the sourcing and transformation of data
Benefits & conditions
- Pension Scheme
- Discretionary Bonus Scheme
- Private Medical Insurance + Virtual GP
- Life Assurance
- Access to Furthr - a Climate Action app
- Free Mortgage Advice and Eye Tests
- Perks at Work - access to thousands of retail discounts
- 5% Flex Fund to spend on the benefits you want most
- 26 days holiday
- Flexible bank holidays, giving you an additional 8 days which you can choose to take whenever you like
- Progressive leave policies with no qualifying service periods, including 26 weeks full pay if you have a new addition to your family
- Dedicated personal learning and home office budgets
- And more…
Even better? You'll have access to these benefits from day one.
We want the best people
We're keen to meet people from all walks of life - our view is that the more inclusive we are, the better our work will be. We want to build teams which represent a variety of experiences, perspectives and skills, and we recognise talent on the basis of merit and potential.