Data Engineer

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
£70K

Job location

United Kingdom

Tech stack

Artificial Intelligence
Amazon Web Services (AWS)
Cloud Computing
Computer Security
Continuous Integration
Data Architecture
Data Validation
Data Governance
ETL
Data Transformation
Data Warehousing
Relational Databases
Python
PostgreSQL
Performance Tuning
Query Optimization
Role-Based Access Control
Power BI
Cloud Services
Standard SQL
Salesforce
SQL Databases
Data Logging
Data Ingestion
Delivery Pipeline
Snowflake
Git
Data Lineage
OSS/BSS
Reporting Tools
Terraform
Data Pipelines

Job description

  • Design, build, and maintain high-quality data pipelines and models in Snowflake to support business analytics, BI, and operational reporting needs.
  • Translate the defined data architecture and standards into implemented solutions, including ingestion, transformation, storage, and performance optimisation.
  • Develop robust ELT/ETL pipelines using dbt and workflow/orchestration tools, ensuring reliability, maintainability, and adherence to engineering best practices.
  • Implement Snowflake warehouse configurations and query optimisation techniques to ensure efficient usage and predictable cost.
  • Apply data quality checks, lineage tracking, and security standards across the data estate, ensuring compliance with data policies, InfoSec controls, and regulatory requirements (a minimal quality-check sketch follows this list).
  • Leverage Snowflake capabilities to improve automation, reduce manual effort, and enhance data accessibility across the business.
  • Work closely with analysts, data consumers, and business stakeholders to support data product delivery, troubleshoot data issues, and enable effective usage of Snowflake datasets.
  • Implement dimensional models that provide clean, well-structured, reusable datasets for reporting and emerging ML/AI use cases.
  • Implement and maintain monitoring, alerting, logging, and cost-management processes for Snowflake and data pipelines (see the cost-monitoring sketch after this list).
  • Contribute to shared engineering standards to simplify development and accelerate delivery across the team.
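
To make the data-quality responsibility concrete, here is a minimal sketch of a post-load freshness and volume check in Python, assuming the snowflake-connector-python package; the warehouse, database, ORDERS table, and LOADED_AT column are hypothetical placeholders, not details taken from this posting.

    import os
    import snowflake.connector

    # Placeholder connection details; in practice these would come from a
    # secret store. Warehouse and database names are hypothetical.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        # Fail the run if the table is empty or has not been loaded in 24h.
        cur.execute(
            "SELECT COUNT(*),"
            " DATEDIFF('hour', MAX(LOADED_AT), CURRENT_TIMESTAMP())"
            " FROM ANALYTICS.PUBLIC.ORDERS"
        )
        row_count, hours_since_load = cur.fetchone()
        assert row_count > 0, "ORDERS is empty"
        assert hours_since_load is not None and hours_since_load <= 24, \
            f"ORDERS is {hours_since_load}h stale"
    finally:
        conn.close()

In a production pipeline, checks like this would more naturally live in dbt tests or the orchestration layer than in ad-hoc scripts.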
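
For the monitoring and cost-management bullet, a sketch along the same lines could poll Snowflake's documented SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; the seven-day window and credit budget below are illustrative assumptions, not company policy.

    import os
    import snowflake.connector

    WEEKLY_CREDIT_BUDGET = 100  # illustrative threshold only

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        cur = conn.cursor()
        # Per-warehouse credit consumption over the last seven days.
        cur.execute(
            "SELECT WAREHOUSE_NAME, SUM(CREDITS_USED)"
            " FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY"
            " WHERE START_TIME >= DATEADD('day', -7, CURRENT_TIMESTAMP())"
            " GROUP BY WAREHOUSE_NAME"
        )
        for warehouse, credits in cur.fetchall():
            if credits and credits > WEEKLY_CREDIT_BUDGET:
                print(f"ALERT: {warehouse} used {credits:.1f} credits this week")
    finally:
        conn.close()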

Technologies:

  • AI
  • AWS
  • CI/CD
  • Cloud
  • CRM
  • Data Warehouse
  • ETL
  • Fivetran
  • Git
  • OSS
  • PostgreSQL
  • Power BI
  • Python
  • RBAC
  • SQL
  • Salesforce
  • Security
  • Snowflake
  • dbt
  • Terraform

Requirements

  • Proven experience in delivering cloud-based data engineering solutions, ideally with Snowflake.
  • Strong hands-on proficiency with SQL, Python, and dbt for data transformations, modelling, and pipeline automation.
  • Practical experience with Snowflake and RBAC management (an example grant pattern is sketched after this list).
  • Experience with data ingestion and replication tools such as Airbyte, Fivetran, Hevo, or similar.
  • Working knowledge of cloud services (AWS preferred).
  • Strong understanding of data modelling and data governance principles.
  • Experience supporting BI/reporting tools (Power BI) and enabling them through well-designed Snowflake data models.
  • Solid knowledge of CI/CD and version-controlled development practices in git.
  • Exposure to CRM (Salesforce), BSS/OSS (Netadmin), Call Centre, Telephony, or similar enterprise data sources (desirable).
  • Experience migrating data platforms (e.g., PostgreSQL or other cloud RDBMS) into a data warehouse such as Snowflake with minimal disruption and strong data validation controls (desirable; a reconciliation sketch follows this list).
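
As an illustration of the RBAC point above, a common Snowflake pattern is a read-only role granted over a reporting schema. The role, warehouse, database, and schema names below are hypothetical; the GRANT syntax itself is standard Snowflake.

    import os
    import snowflake.connector

    # Hypothetical names; the statements use standard Snowflake RBAC syntax.
    GRANTS = [
        "CREATE ROLE IF NOT EXISTS REPORTING_READER",
        "GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE REPORTING_READER",
        "GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTING_READER",
        "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE REPORTING_READER",
        "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS"
        " TO ROLE REPORTING_READER",
    ]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SECURITYADMIN",  # role administration typically runs here
    )
    try:
        cur = conn.cursor()
        for stmt in GRANTS:
            cur.execute(stmt)
    finally:
        conn.close()

Teams often manage such grants declaratively, e.g. through Terraform (which also appears in this stack), rather than with imperative scripts.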
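
And for the migration point, a minimal row-count reconciliation between a PostgreSQL source and a Snowflake target might look like the following, assuming psycopg2 and snowflake-connector-python; the table list, schemas, and DSN environment variables are hypothetical.

    import os
    import psycopg2
    import snowflake.connector

    TABLES = ["customers", "orders", "invoices"]  # hypothetical table list

    pg = psycopg2.connect(os.environ["POSTGRES_DSN"])
    sf = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="ANALYTICS",  # hypothetical target database
        schema="RAW",          # hypothetical target schema
    )
    try:
        for table in TABLES:  # names are trusted constants, not user input
            with pg.cursor() as pg_cur:
                pg_cur.execute(f"SELECT COUNT(*) FROM public.{table}")
                source_count = pg_cur.fetchone()[0]
            sf_cur = sf.cursor()
            sf_cur.execute(f"SELECT COUNT(*) FROM {table}")
            target_count = sf_cur.fetchone()[0]
            status = "OK" if source_count == target_count else "MISMATCH"
            print(f"{table}: source={source_count} target={target_count} {status}")
    finally:
        pg.close()
        sf.close()

Row counts are only a first-pass control; column-level checksums or sampled comparisons would typically back them up.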

About the company

At Gigaclear, we are a growing fibre broadband company building fibre-to-the-premises infrastructure to serve some of the hardest-to-reach areas of the UK. By bringing high-quality broadband to these communities, we aim to rival urban connectivity. Join our Data Engineering team to play a crucial role in enhancing our data platform, working with advanced technologies, and contributing to the decision-making that shapes our future.

Apply for this position