Senior Data Engineer

MLabs
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$140K – $240K

Job location

Remote

Tech stack

API
Airflow
Data analysis
Google BigQuery
Information Engineering
Data Infrastructure
Data Integrity
ETL
Data Structures
Data Systems
Data Warehousing
Python
Blockchain
SQL Databases
Data Processing
Snowflake
Backend
Core Data
Data Management
Redshift

Job description

This is a rare chance to be the first dedicated Data Engineer at a fast-growing fintech operating at the intersection of payments, cards, and blockchain infrastructure.

The platform is already live, already scaling, and already moving real money across borders. What's missing is you: the person who designs and owns the data systems that everything else depends on.

The Opportunity

You'll be responsible for architecting and building the entire data platform from the ground up.

As the business scales to millions of users across card programs, payment rails, and blockchain networks, your pipelines will power:

  • Analytics and decision-making
  • Customer and partner reporting
  • Operational tooling
  • Compliance and risk workflows
  • On-chain and off-chain integrations

This is not a maintenance role. This is a 0→1, platform-defining role with real ownership and real impact.

You'll work directly with the CTO and partner closely with Product, Engineering, Operations, Compliance, and Analytics.

What You'll Do

Design, build, and maintain core data pipelines, including ingestion from payments processors, card issuers, blockchain nodes, internal services, and third-party APIs.

Own orchestration and workflow management, implementing Airflow, Dagster, or similar tools to ensure reliable, observable, and scalable data processing.

Architect and manage the data warehouse (Snowflake, BigQuery, or Redshift), driving performance, cost optimization, partitioning, and access patterns.

Develop high-quality ELT/ETL transformations to structure raw logs, transactions, ledgers, and on-chain events into clean, production-grade datasets.

Implement data quality frameworks and observability (tests, data contracts, freshness checks, lineage) to ensure every dataset is trustworthy.

Partner closely with backend engineers to instrument new events, define data contracts, and improve telemetry across the infrastructure.

Support Analytics and cross-functional teams by delivering well-modeled, well-documented tables that power dashboards, ROI analyses, customer reporting, and key business metrics.

Own data reliability at scale, leading root-cause investigations, reducing pipeline failures, and building monitoring and alerting systems.

Evaluate and integrate new tools across ingestion, enrichment, observability, and developer experience, raising the bar on performance and maintainability.

Help set the long-term technical direction for Rain's data platform as we scale across new products, regions, and chains.

Requirements

  • Data infrastructure builder - You thrive in early-stage environments, owning pipelines and platforms end-to-end and choosing simplicity without sacrificing reliability.

  • Expert data engineer - Strong Python and SQL fundamentals, with real experience building production-grade ETL/ELT.

  • Workflow & orchestration fluent - Hands-on experience with Airflow, Dagster, Prefect, or similar systems.

  • Warehouse & modeling savvy - Comfortable designing schemas, optimizing performance, and operating modern cloud warehouses (Snowflake, BigQuery, Redshift).

  • Quality-obsessed - You care deeply about data integrity, testing, lineage, and observability.

  • Systems thinker - You see data as a platform; you design for reliability, scale, and future users.

  • Collaborator - You work well with backend engineers, analytics engineers, and cross-functional stakeholders to define requirements and deliver outcomes.

  • Experienced - 5-7+ years in data engineering roles, ideally within fintech, payments, B2B SaaS, or infrastructure-heavy startups.

Nice to Have

  • Experience ingesting and processing payments data, transaction logs, or ledger systems.
  • Exposure to smart contracts, blockchain data structures, or on-chain event ingestion.
  • Experience building data tooling for compliance, risk, or regulated environments.
  • Familiarity with dbt and/or semantic modeling to support analytics layers.
  • Prior experience standing up data platforms from 0→1 at early-stage companies.

Benefits & conditions

Compensation & Benefits

  • $140K-$240K base salary + meaningful equity
  • Unlimited PTO (with a real minimum encouraged)
  • Flexible remote-first working
  • Home office stipend
  • Comprehensive health, dental & vision (US)
  • 401(k) with company match
  • Wellness budget (gym, fitness, recovery, etc.)
  • Regular team offsites (US + international)

Interview Process

  1. Intro call (culture & alignment)
  2. Technical interview (past projects & impact)
  3. 2-hour technical session (architecture + hands-on)
  4. Final conversation with the founder/CEO

Apply for this position