Senior Data Engineer
Job description
We are seeking a highly skilled and experienced Senior Data Engineer to lead the development and management of our data platform. This pivotal role will focus on supporting critical data needs and developing foundational data models that are essential for advancing our cross-border remittances business. You will leverage your expertise in data warehousing, transformation, and modeling to build robust and scalable data solutions. Your responsibilities will encompass the entire build and deployment lifecycle of data movement and transformation, including managing users, scaling compute resources, and ensuring the platform operates optimally and efficiently. A key focus will be on maintaining robust data governance, ensuring data is well-defined, searchable, and trusted across the entire organization.
Data Modeling & Management
- Design, implement, and evolve data models for core datasets, including creating new models and enhancing existing ones to meet business requirements.
- Collaborate with stakeholders across the organization to translate commercial objectives into trusted data outcomes and dependable solutions.
- Establish and enforce data governance policies to guarantee data quality, security, and compliance.
- Implement and maintain data cataloging and metadata management solutions to make data easily discoverable and available for self-service.
Platform Operations & Optimization
- Manage all aspects of the build and deployment lifecycle for data movement and transformation processes.
- Oversee user access and resource allocation for compute infrastructure, ensuring efficient scaling and utilization.
- Ensure effective use of platform tools and resources for efficient ETL/ELT processing, providing customized tooling where appropriate.
- Manage the costs associated with all data platform technologies through effective tooling.
Technical Leadership & Innovation
- Provide technical leadership and mentorship to other data engineers within the team.
- Evaluate and integrate new data technologies and tools to enhance the capabilities and efficiency of the data platform.
Requirements
- Experience: At least 7 years of experience in data engineering, with a strong emphasis on developing data platforms, data modeling, and managing data assets.
- Data Platform Expertise: In-depth, hands-on experience with technologies for data ingestion, job orchestration, data warehousing, and reporting.
- ETL/ELT: Proven ability to design, implement, and manage robust ETL/ELT pipelines using various tools (e.g., Spark, dbt).
- Cloud Platforms: Strong experience with cloud data platforms (e.g., AWS, GCP, Azure), including compute, storage, and database services.
- Programming: Proficient in SQL and advanced programming in at least one language (e.g., Python, Scala, Java).
- Orchestration: Experience with workflow orchestration tools (e.g., Astronomer, Apache Airflow).
- Data Governance: Solid understanding and practical experience with data governance principles, data cataloging, and metadata management.
- Performance Optimization: Demonstrated ability to optimize data platform performance and manage compute resources efficiently.
- Problem-Solving: Excellent analytical and problem-solving skills with meticulous attention to detail.
- Communication: Strong communication and interpersonal skills, capable of articulating complex technical concepts to both technical and non-technical audiences.
Bonus Points
- Experience in the financial services or FinTech industry, specifically with cross-border payments.
- End-to-end experience building data platforms, connecting all components from ingestion to reporting, using industry-standard tools. These include: Fivetran (ingestion), Databricks (lakehouse), Astronomer/Apache Airflow (orchestration), and Metaplane/Datadog (observability).
- Proficiency in data modeling tools and techniques (e.g., dbt).
- Familiarity with modern data lake formats such as Delta Lake and Iceberg.
- Experience with real-time data processing and streaming technologies (e.g., Flink, Epsio).
- Experience with infrastructure as code (e.g., Terraform) for managing data platform resources.
- Familiarity with data observability and monitoring tools (e.g., Metaplane, Datadog).
- Proficiency in BI and data visualization tools; experience using Mode is preferred.
- Experience using Blackline or Autorek reconciliation software.
Benefits & conditions
We have five core benefits for our talent in the US, UK, Philippines, Poland, and South Africa. Specifically:
- Unlimited Annual Leave: Feel free to make the most of your time off and maintain a healthy work-life balance!
- Private Medical Cover: You can opt-in to a Private Medical Insurance scheme. This provides you with access to thorough medical coverage, so you can feel confident in your health and well-being.
- Retirement: We offer pension schemes to help you plan for and secure your future.
- Life Assurance: Life assurance is available to give you peace of mind and protect your loved ones in case of the unexpected.
- Parental Leave: We offer competitive parental leave schemes to ensure you are spending as much quality time with your new bundle of joy as possible.
We are also remote-first as an organisation, offering flexibility for you to work where you need to be most productive. In addition to the above, you will discover that we have a range of secondary perks (such as the cycle-to-work scheme and employee discounts) depending on your location, to help you thrive at Zepz!