Data Quality Engineer
Job description
In this role, you'll operate at Feature Team level, focusing on building automation frameworks and supporting other engineers in establishing and running quality gates using the test pyramid and other modern test automation practices. You'll be involved in the design, development, and maintenance of software applications from a test automation perspective and help deliver the product successfully. This is a full-stack, individual-contributor Senior QE/SDET role with a primary focus on modern data technologies built on GCP; it requires event-driven architecture experience and involves leading the feature team on the test automation front.
Why Lloyds Banking Group
We're on an exciting transformation journey and there could not be a better time to join us. The investments we're making in our people, data, and technology are leading to innovative projects, fresh possibilities and countless new ways for our people to work, learn, and thrive.
Requirements
- Experience of driving advanced software testing techniques, applying modern automation test approaches in Data Engineering projects and ensuring robust data quality
- Hands-on experience of developing BDD-based automation frameworks for Data, ETL and event-driven applications using Python, Java or TypeScript
- Ability to solve complex automation use cases for new Data Products built on Data Mesh, Lakehouse and streaming architectures
- Working experience of modern data and event-driven technologies, such as:
  - Testing & Automation: PyTest, Cucumber, Behave, DBT, GreatExpectations, GCP DVT, Monte Carlo, Soda, Deequ, RestAssured, etc.
  - Data Engineering & Orchestration: BigQuery, Spanner, Apache Kafka, Airflow, Spark, Cloud Composer, DAGs, Apache Beam, Pub/Sub, Dataflow, DataStage, Teradata, Snowflake, ETL, SQL, etc.
  - Data Governance & Visualisation: Looker, Ataccama, Dataplex, Collibra, PowerBI
  - AI/Analytics: exposure to TensorFlow, PyTorch, Scikit-learn, OpenCV, LangChain and GenAI tools
- Experience of designing and executing complex automation testing strategies and frameworks covering functional and non-functional requirements of Data and AI platforms
- Ability to create, or accurately request, complex test data, taking into account referential integrity, data quality and governance
- Experience incorporating automated tests into CI/CD pipelines and DevOps tooling such as Jenkins, Harness, Terraform, Dynatrace, etc.
- Strong programming proficiency in Python (preferred), Java or JavaScript
- Experience with Agile tools (Jira, Confluence, Xray)
- Experience with source code management tools (Git, GitHub)
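As a flavour of the data-quality automation this role involves, here is a minimal sketch of a referential-integrity check written as a plain pytest test. The table names, columns and data are hypothetical; in a real pipeline the extracts would come from a warehouse such as BigQuery and the test would run inside a CI/CD quality gate.

```python
# Minimal sketch of an automated data-quality gate: a referential-integrity
# check expressed as a pytest test. All names and data are illustrative.

def missing_foreign_keys(child_rows, fk_column, parent_keys):
    """Return child rows whose foreign key has no match in the parent table."""
    parent_set = set(parent_keys)
    return [row for row in child_rows if row[fk_column] not in parent_set]

# Hypothetical extracts; a real framework would pull these from the warehouse.
CUSTOMERS = [{"customer_id": 1}, {"customer_id": 2}]
ORDERS = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 2},
]

def test_orders_reference_existing_customers():
    # Quality gate: every order must point at an existing customer.
    orphans = missing_foreign_keys(
        ORDERS, "customer_id", {c["customer_id"] for c in CUSTOMERS}
    )
    assert orphans == [], f"orders with missing customers: {orphans}"
```

Run with `pytest`, the test fails (and blocks the pipeline) as soon as an orphaned order appears in the extract.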