Analytics Engineer
Job description
This critical role sits at the intersection of Data Engineering and Data Analytics, focusing on transforming raw data into high-quality, trustworthy, and easily consumable datasets and visualisations for business users. The ideal candidate will have a strong background in SQL, dbt, data modelling, and ETL/ELT principles, along with experience of modern cloud data platforms.
Core Responsibilities
ELT Pipeline Development: Design, develop, and maintain efficient, scalable transformation workflows, using tools like dbt to turn raw data loaded into our data warehouse (Snowflake) into clean, ready-to-use data models, applying transformation best practices throughout.
Data Modelling: Implement best-practice data modelling and software engineering techniques (e.g. CI/CD, testing, lineage) to ensure data structures are optimised for performance, accuracy, and ease of use in reporting and analytical applications.
Data Quality and Testing: Write comprehensive data quality checks, tests, and monitoring scripts to ensure the accuracy, completeness, and reliability of all transformed data assets. Establish and maintain documentation for all data transformations and models.
Collaboration: Work closely with Data Analysts and business users to understand their reporting needs and optimise data models to support their analytical use cases. Collaborate with Data Engineers on data ingestion strategies and platform optimisation.
Performance Optimisation: Tune and optimise SQL queries and data models to reduce latency and improve the performance of our data warehouse and downstream applications.
Tool Adoption: Champion the adoption of modern data stack tools and practices (e.g. Git, CI/CD).
Requirements
- Experience: 3+ years of experience in a data-focused role (e.g., Analytics Engineer, Data Analyst, BI Developer).
- SQL Mastery: Expert-level proficiency in writing and optimising complex SQL queries.
- Data Transformation Tooling: Hands-on experience with dbt (Data Build Tool) or similar data transformation frameworks is essential.
- Data Warehousing: Experience working with cloud-based data warehouses such as Snowflake, Google BigQuery, or Amazon Redshift.
- Data Modelling: Solid understanding of data warehousing concepts, ETL/ELT principles, and dimensional modelling techniques.
- Version Control: Proficiency with Git for collaborative development and version control.
- BI Tooling: Familiarity with reporting/BI tools like Looker.
Preferred Qualifications
- Python expertise for automation, integration, and orchestration.
- Experience with semantic layers.
- Experience with orchestration tools like Airflow.
- Knowledge of modern software engineering practices applied to data (e.g., modularity, code review, testing).