Senior Data Test Engineer
Job description
Hippo is recruiting Senior Data Test Engineers to join our Hippo Herd. Senior Data Test Engineers work in multi-disciplinary teams that build, support, and maintain user-centred digital solutions that offer real value and work for everyone.

Hippo's Senior Data Test Engineers design and implement quality-check procedures for our products and services. You will be responsible for analysing features, designing test parameters, creating customised quality checks, and writing up final procedures. You'll also use your experience and expertise to enable our clients to continue to iterate and improve their products and services. You will help drive the team's technical deliverables, maintain client relationships, and be passionate about developing and upskilling others.

Please note: we are looking for candidates seeking growth at this level (Senior), so the advertised salary band is the lower end of our full banding for this position, allowing for progression in the role.
Your Role in a Nutshell:
- Collaborate early in the development lifecycle to define clear, testable requirements, especially around data transformations and end-user expectations.
- Write comprehensive test plans, including reconciliation, edge case, and business logic validations.
- Independently create and execute SQL queries to validate data pipelines, transformations, and outputs.
- Participate in Three Amigos, ticket refinements, and sprint planning to ensure QA considerations are represented from the start.
- Identify data issues and investigate the root causes by understanding the flow and transformation of data across systems.
- Proactively challenge assumptions and work closely with engineers, analysts, and product managers to clarify scope and intent.
- Document test cases and results clearly, ensuring bugs are actionable and reproducible.
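To give a flavour of the day-to-day work described above, here is a minimal sketch of a pipeline reconciliation check run in Python against SQLite. All table and column names (`orders_source`, `orders_target`, `order_id`, `amount`) are illustrative, not taken from any Hippo project:

```python
import sqlite3

# Hypothetical source and target tables standing in for two ends of a pipeline.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders_source (order_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE orders_target (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO orders_source VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO orders_target VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

# Reconciliation check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM orders_source").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM orders_target").fetchone()[0]
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# Reconciliation check 2: no source rows missing from the target.
missing = cur.execute("""
    SELECT s.order_id FROM orders_source s
    LEFT JOIN orders_target t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL
""").fetchall()
assert missing == [], f"Orders missing from target: {missing}"

print("reconciliation checks passed")
```

In practice the same pattern runs against Snowflake, Athena, or an equivalent warehouse rather than SQLite, and the assertions feed into a documented, reproducible test report.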
Requirements
- SQL expertise - able to read, write, and debug complex queries; comfortable working in Snowflake, Athena, or equivalent.
- Business logic understanding - capable of interpreting user requirements and validating whether logic has been correctly applied in ETL.
- Experience testing ETL processes and understanding data transformation flows from source to report.
- Ability to think like a data consumer - what will users want from this data, and how should it behave?
- Strong communication skills - able to clearly explain quality concerns, raise red flags, and contribute to a shared understanding of "done."
Nice-to-Have Skills
- Familiarity with DBT or similar tools (understanding where they fit in the ETL process).
- Awareness of data quality frameworks and concepts like null checks, uniqueness, and referential integrity.
- Exposure to Python (basic familiarity helpful on some platforms like Databricks).
- Understanding of event-driven data flows or having worked on data-centric projects.
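As a rough illustration of the data-quality concepts mentioned above, the following sketch runs null, uniqueness, and referential-integrity checks in Python against SQLite. The `customers`/`orders` schema is hypothetical:

```python
import sqlite3

# Hypothetical schema used only to demonstrate the three checks.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (customer_id INTEGER, email TEXT);
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'a@example.com'), (2, 'b@example.com');
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 1);
""")

# Null check: key columns should never be NULL.
nulls = cur.execute(
    "SELECT COUNT(*) FROM customers WHERE customer_id IS NULL OR email IS NULL"
).fetchone()[0]

# Uniqueness check: customer_id should identify exactly one row.
dupes = cur.execute("""
    SELECT customer_id FROM customers
    GROUP BY customer_id HAVING COUNT(*) > 1
""").fetchall()

# Referential integrity: every order must point at an existing customer.
orphans = cur.execute("""
    SELECT o.order_id FROM orders o
    LEFT JOIN customers c ON o.customer_id = c.customer_id
    WHERE c.customer_id IS NULL
""").fetchall()

assert nulls == 0 and dupes == [] and orphans == []
print("data quality checks passed")
```

Tools like dbt package checks of this shape as reusable tests, which is where the "understanding where they fit in the ETL process" point comes in.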