Data Engineer | Remote

Datatech Analytics
Manor Park, United Kingdom
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate
Compensation
£62K

Job location

Remote
Manor Park, United Kingdom

Tech stack

Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Azure
Cloud Computing
Continuous Integration
Information Engineering
ETL
Data Warehousing
Python
SQL Databases
Large Language Models
Snowflake
Data Strategy
Git
Core Data
Data Management
Software Version Control
Databricks

Job description

A high-visibility opportunity to join an ambitious, values-led organisation as it refreshes its data strategy and modernises its intelligence platform. You'll be trusted early, work closely with stakeholders, and help build the foundations that drive better insight, smarter decisions, and genuine impact, using data for good.

  • Helping shape and deliver a refreshed data strategy and modern analytics platform
  • Building reliable, scalable ELT/ETL pipelines into a cloud data warehouse (Snowflake, Databricks, or similar)
  • Designing and optimising core data models that are dimensional, analytics-ready, and built to last
  • Creating trusted data products that enable self-service analytics across the organisation
  • Improving data quality, monitoring, performance, and cost efficiency
  • Partnering with analysts, BI teams, and non-technical stakeholders to turn questions into robust data assets
  • Contributing to engineering standards, best practice, and reusable frameworks
  • Supporting responsible AI tooling, including programmatic LLM workflows where appropriate

Requirements

This role is well suited to someone early in their data engineering journey (around 2+ years' experience) who's ready to step up. You'll join a supportive, encouraging environment with real runway to grow technically, while gradually developing ownership and leadership as your influence across the business increases.

  • 2+ years' experience in data engineering within a modern data stack
  • Strong SQL with a solid foundation in data modelling
  • Python (preferred) or a similar language for pipeline development and automation
  • Cloud experience across AWS, Azure, or GCP
  • Familiarity with orchestration and analytics engineering tools such as dbt, Airflow, or equivalents
  • Good habits around governance, security, documentation, version control (Git), and CI/CD

The kind of person who thrives here

Confident, curious, and motivated. You care about doing things properly, enjoy being trusted and visible in the business, and are genuinely interested in using data to create positive outcomes.
