DataOps Engineer

Paymentology
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Remote

Tech stack

Airflow
Amazon Web Services (AWS)
Bash
Cloud Computing
Cloud Engineering
Information Engineering
Data Infrastructure
DevOps
Disaster Recovery
Python
DataOps
Data Logging
Scripting (Bash/Python/Go/Ruby)
Spark
Kubernetes
Data Management
Machine Learning Operations
Terraform
Go

Job description

You'll work closely with data engineers and senior technical stakeholders to design, implement, and operate the foundations of our data stack. This role is ideal for a mid-level engineer with strong DevOps fundamentals who is eager to deepen their expertise in data platforms, cloud infrastructure, and observability within a high-impact, global fintech environment.

  • Design and implement cloud infrastructure for a modern data platform using Infrastructure as Code, with a strong focus on scalability, security, and reliability.

  • Build and maintain CI/CD pipelines that support data engineering workflows and infrastructure deployments.
  • Implement and operate observability solutions including monitoring, logging, metrics, and alerting to ensure platform reliability and fast incident response.
  • Collaborate closely with data engineers to translate platform and workflow requirements into robust infrastructure solutions.
  • Apply best practices for availability, disaster recovery, and cost efficiency, while documenting infrastructure patterns and operational procedures.

Requirements

What it takes to succeed:

  • 3-5 years of hands-on experience in DevOps, Platform Engineering, or DataOps roles.
  • Experience supporting or contributing to data platforms or data infrastructure projects.
  • Exposure to modern data engineering tools such as dbt, Airflow, Apache Spark, or similar technologies is an advantage.
  • Hands-on proficiency with Infrastructure as Code, particularly Terraform.
  • Experience working with AWS or GCP and common cloud architecture patterns.
  • Practical experience or strong understanding of Kubernetes and containerised workloads.
  • Familiarity with observability tooling across monitoring, logging, metrics, and alerting.
  • Strong scripting skills in Python, Bash, or Go to automate operational processes.
  • Excellent problem-solving skills and the ability to work effectively in a collaborative, fully remote environment.
  • A strong inclination to develop DataOps and MLOps knowledge and capabilities.

About the company

At Paymentology, we're redefining what's possible in the payments space. As the first truly global issuer-processor, we give banks and fintechs the technology and talent to launch and manage Mastercard and Visa cards at scale across more than 60 countries. Our advanced, multi-cloud platform delivers real-time data, unmatched scalability, and the flexibility of shared or dedicated processing instances. It's this global reach and innovation that sets us apart. We're looking for a DataOps Engineer to join our Data Engineering team and help build a modern data platform from the ground up. This is a greenfield opportunity focused on infrastructure, automation, and observability, playing a critical role in enabling reliable, scalable, and secure data systems.

Apply for this position