Data Engineer
Job description
Zilch is a fast-growing fintech innovator transforming the way consumers access credit and pay for everyday purchases. By combining responsible lending, real-time data intelligence, and a frictionless customer experience, we empower millions of people with transparent, interest-free payment options. We are scaling rapidly and investing in a world-class data foundation to support our next phase of growth.

We are seeking a Data Engineer to take ownership of the design, development, and evolution of our data platform. This role is critical to closing a skills gap within the Data team and ensuring that our data infrastructure is reliable, cost-efficient, observable, and fit for Zilch's long-term ambitions.
You will work closely with Analytics Engineers, Data Scientists, Software Engineers, and Platform teams to build engineered solutions rather than relying solely on third-party tools or ad-hoc processes.
Day-to-day responsibilities
Data Platform & Pipeline Engineering:
- Design, build, and maintain scalable, reliable data pipelines across Snowflake, AWS, and supporting services.
- Replace fragile or manual ingestion processes (e.g. user-generated files) with engineered, validated solutions that reject bad records at the point of entry (see the sketch after this list).
- Own and improve ingestion from internal services, affiliates, and third-party providers.
- Overhaul and standardise legacy pipelines and DAGs to improve maintainability and reliability.
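To give a flavour of what "validated ingestion" means here, the minimal sketch below rejects malformed rows before they reach the warehouse. The column names and rules are illustrative assumptions, not our actual schema.

```python
# Minimal validate-on-ingest sketch for a user-generated CSV feed.
# REQUIRED_COLUMNS and the validation rules are illustrative placeholders.
import csv
from datetime import datetime

REQUIRED_COLUMNS = {"transaction_id", "customer_id", "amount", "created_at"}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for a single row (empty list = valid)."""
    errors = []
    if not row.get("transaction_id"):
        errors.append("missing transaction_id")
    try:
        if float(row.get("amount", "")) < 0:
            errors.append("negative amount")
    except ValueError:
        errors.append("amount is not numeric")
    try:
        datetime.fromisoformat(row.get("created_at", ""))
    except ValueError:
        errors.append("created_at is not ISO-8601")
    return errors

def ingest(path: str) -> tuple[list[dict], list[dict]]:
    """Split a file into rows accepted for loading and rows rejected at entry."""
    accepted, rejected = [], []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            # Reject the whole file if the contract is broken at the header level.
            raise ValueError(f"file rejected: missing columns {sorted(missing)}")
        for row in reader:
            errors = validate_row(row)
            if errors:
                rejected.append({**row, "errors": errors})
            else:
                accepted.append(row)
    return accepted, rejected
```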
Orchestration, CI/CD & Infrastructure:
- Lead the evolution of our orchestration platform (Airflow), including performance improvements, standardisation, upgrades, and event-driven capabilities.
- Containerise Airflow tasks and supporting services to improve isolation, reproducibility, and CI/CD integration (a sketch follows this list).
- Design and maintain CI/CD pipelines for data and ML workloads, balancing development speed with increasing scale and complexity.
- Partner with Platform Engineering to align infrastructure patterns while retaining strong data-domain ownership.
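For illustration, a containerised Airflow task might look like the minimal sketch below. It assumes Airflow 2.x with the apache-airflow-providers-docker package installed; the DAG name, image, command, and environment are placeholders rather than our actual pipelines.

```python
# Minimal sketch of a containerised Airflow task (Docker provider assumed).
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="affiliate_ingest",              # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2},
) as dag:
    # Running the task in its own image isolates dependencies and lets the
    # same artifact be tested in CI and deployed to production unchanged.
    ingest = DockerOperator(
        task_id="ingest_affiliate_feed",
        image="registry.example.com/data/ingest:latest",   # placeholder image
        command="python -m ingest --source affiliates",    # placeholder command
        environment={"TARGET_SCHEMA": "RAW"},               # placeholder config
    )
```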
Observability, Reliability & Cost Optimisation:
- Design and implement end-to-end monitoring, alerting, and observability, including integration with Datadog and Slack.
- Improve alert reliability and error visibility across the data platform.
- Address performance issues such as high CPU usage and inefficient resource utilisation.
- Lead Snowflake cost optimisation, applying deep platform knowledge to balance performance and spend.
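As one example of the cost work, the sketch below pulls per-warehouse credit consumption from Snowflake's ACCOUNT_USAGE share and flags heavy days. The connection details are placeholders; in practice credentials would come from a secrets manager, not literals.

```python
# Sketch: surface per-warehouse daily credit burn over the last 30 days.
import snowflake.connector

QUERY = """
    select warehouse_name,
           date_trunc('day', start_time) as usage_day,
           sum(credits_used)             as credits
    from snowflake.account_usage.warehouse_metering_history
    where start_time >= dateadd('day', -30, current_timestamp())
    group by 1, 2
    order by credits desc
"""

def top_warehouses(threshold_credits: float = 50.0) -> list[tuple]:
    """Return (warehouse, day, credits) rows exceeding a daily credit threshold."""
    conn = snowflake.connector.connect(
        account="example_account",        # placeholder
        user="example_user",              # placeholder
        authenticator="externalbrowser",  # placeholder auth method
    )
    try:
        rows = conn.cursor().execute(QUERY).fetchall()
        return [r for r in rows if r[2] and float(r[2]) > threshold_credits]
    finally:
        conn.close()
```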
Data Governance & Quality:
- Design and implement data access controls and governance frameworks aligned with business and regulatory needs.
- Ensure strong data quality, lineage, and documentation across the platform.
- Automate lineage and exposure management (e.g. DBT exposures via Looker APIs).
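The exposure automation mentioned above could look roughly like the sketch below, which reads dashboards from Looker and writes DBT exposure definitions. It assumes the looker_sdk and PyYAML packages with API credentials in looker.ini or environment variables; the host, owner, and output path are illustrative placeholders.

```python
# Sketch: generate DBT exposures from Looker dashboard metadata.
import looker_sdk
import yaml

def build_exposures() -> dict:
    sdk = looker_sdk.init40()  # credentials resolved from looker.ini or env vars
    exposures = []
    for dash in sdk.all_dashboards():
        exposures.append({
            "name": f"looker_dashboard_{dash.id}",
            "type": "dashboard",
            "url": f"https://example.looker.com/dashboards/{dash.id}",  # placeholder host
            "owner": {"name": "Data Team"},                             # placeholder owner
            "description": dash.title or "",
        })
    return {"version": 2, "exposures": exposures}

if __name__ == "__main__":
    # Placeholder output path inside a DBT project.
    with open("models/exposures/looker_exposures.yml", "w") as f:
        yaml.safe_dump(build_exposures(), f, sort_keys=False)
```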
Machine Learning & Advanced Analytics Enablement:
- Support Data Science and ML Engineering by improving MLOps practices, deployment workflows, and infrastructure.
- Help standardise ML CI/CD and environment parity across development and production.
- Establish or support feature store patterns (e.g. Snowflake feature store, Feast, or AWS-native solutions); see the sketch after this list.
- Enable reproducible feature pipelines, backfills, and alignment with DBT and warehouse models.
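As a sketch of the feature-store pattern, the example below uses Feast for point-in-time-correct feature retrieval. The entity keys, feature names, and repo layout are illustrative assumptions; a Snowflake- or AWS-native store would follow the same retrieval idea.

```python
# Sketch: build a training set from a Feast feature store with
# point-in-time-correct joins (prevents feature leakage).
from datetime import datetime

import pandas as pd
from feast import FeatureStore

store = FeatureStore(repo_path=".")  # Feast repo with feature definitions checked in

# Entities plus event timestamps drive the point-in-time join, so training data
# matches what the model would have seen in production at that moment.
entity_df = pd.DataFrame({
    "customer_id": [1001, 1002],                        # placeholder entity keys
    "event_timestamp": [datetime(2024, 6, 1)] * 2,
})

training_df = store.get_historical_features(
    entity_df=entity_df,
    features=[
        "customer_spend:avg_basket_value_30d",           # hypothetical feature view
        "customer_spend:purchase_count_30d",
    ],
).to_df()
```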
Technical Leadership:
- Act as a senior technical contributor within the Data team, setting best practices and raising engineering standards.
- Influence platform tooling, architectural decisions, and long-term data strategy.
- Reduce technical debt and improve the overall health of the data estate.
Requirements
- Proven experience as a Senior Data Engineer or Software Engineer working on data-intensive platforms.
- Strong SQL skills and proficiency in Python.
- Hands-on experience with Snowflake (including performance and cost optimisation).
- 2+ years' experience with AWS, including data, compute, and container services.
- Strong experience with Airflow and data orchestration.
- Experience working with containerised workloads (Docker; ECS or Kubernetes).
- Familiarity with modern data stack tools such as DBT and Fivetran.
- Experience designing CI/CD pipelines and applying software engineering best practices.
- Strong understanding of data warehousing, schema design, and distributed systems.
- Focus on reliability, observability, and maintainability.
Benefits & conditions
Compensation & Savings:
- Pension scheme.
- Death in Service scheme.
- Income Protection.
- Permanent employees enjoy access to our Share Options Scheme.
- 5% back on in-app purchases.
- £200 for WFH Setup.
Health & Wellbeing:
- Private Medical Insurance including:
  - GP consultations (video, telephone or face-to-face).
  - Prescribed medication.
  - In-patient, day-patient and out-patient care.
  - Mental health support.
  - Optical, dental & audiological cover.
  - Physiotherapy.
  - Advanced cancer cover.
  - Menopause support.
- Employee Assistance Programme including:
  - Unlimited mental health sessions.
  - 24/7 remote GP & physiotherapy.
  - 24/7 helpline for emotional & practical support.
  - Savings & discounts on everyday shopping.
  - 1:1 personalised well-being consultations.
- Gym membership discounts.
Family Friendly Policies:
- Enhanced maternity pay.
- Enhanced paternity pay.
- Enhanced adoption pay.
- Enhanced shared parental leave.
Learning & Development:
- Professional Qualifications.
- Professional Memberships.
- Learning Suite for e-courses.
- Internal Training Programmes.
- FCA & Regulatory training.
Workplace Perks:
- Hybrid Working.
- Casual dress code.
- Workplace socials.
- Healthy snacks.