SDET
Job description
Job Summary: We are seeking a senior SDET to own and drive end-to-end quality for our cloud-based data ingestion and processing pipelines. The role combines test strategy, automation development, and collaboration with engineering and delivery teams to ensure robust, scalable, and highly reliable data services built on AWS. The ideal candidate is hands-on, technically strong in Java/Groovy, proficient with AWS data services, and adept at designing maintainable test frameworks.
Key Responsibilities
- Develop and implement comprehensive test strategies for data ingestion, transformation, and analytics workflows in an AWS environment.
- Design, build, and maintain automated test suites (API, end-to-end, integration, data validation, performance) using Java and Groovy.
- Collaborate with product owners, software engineers, and data engineers to understand requirements and translate them into test plans and acceptance criteria.
- Validate data quality and integrity across multiple data stores (S3, DynamoDB, Redshift) and data processing pipelines (AWS Glue jobs, ETL workflows).
- Create and maintain test data management practices, including synthetic data generation and seed data provisioning.
- Write, optimize, and maintain complex SQL queries for data validation and reconciliation tasks.
- Implement and maintain automated CI/CD pipelines (GitLab) to enable rapid, reliable test execution in CI environments.
- Contribute to the design and improvement of test frameworks (preferably Spock-based or equivalent Groovy/Java testing frameworks).
- Perform exploratory and risk-based testing to identify edge cases and potential failures early in the SDLC.
- Participate in code reviews, triage defects, and provide clear, actionable defect reports with reproducible steps.
- Monitor and report test results, create dashboards, and communicate quality status to stakeholders.
- Ensure compliance with security, privacy, and data handling standards relevant to cloud data platforms.
- Build and maintain automated test suites (API, data, UI if applicable) using Java/Groovy.
- Implement data validation checks across S3, DynamoDB, and Redshift; integrate with Glue job outcomes.
- Design test data scenarios that cover typical, boundary, and failure modes.
- Review user stories and acceptance criteria for testing implications; contribute to test plans and risk assessments.
- Execute tests in CI pipelines, analyze failures, and drive root-cause analysis.
- Create and maintain test reports, metrics, and quality gates for release readiness.
- Collaborate with DevOps to manage test environments and data provisioning.
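To make the reconciliation duties above concrete, here is a minimal, self-contained sketch in plain Java of a source-versus-target count check. The partition keys and counts are hypothetical; in a real pipeline test they would come from SQL queries (e.g. a GROUP BY on the partition key) against S3/Athena, DynamoDB, or Redshift:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of a source-vs-target reconciliation check.
// The per-key row counts here are hard-coded for illustration; a real test
// would populate them from SQL queries against the source and target stores.
public class ReconciliationSketch {

    // Returns the keys whose row counts differ between source and target,
    // each mapped to a human-readable description of the discrepancy.
    static Map<String, String> findDiscrepancies(Map<String, Long> source,
                                                 Map<String, Long> target) {
        Map<String, String> diffs = new LinkedHashMap<>();
        for (Map.Entry<String, Long> e : source.entrySet()) {
            Long got = target.get(e.getKey());
            if (!e.getValue().equals(got)) {
                diffs.put(e.getKey(), "expected=" + e.getValue() + ", actual=" + got);
            }
        }
        // Keys present only in the target are also discrepancies.
        for (String key : target.keySet()) {
            if (!source.containsKey(key)) {
                diffs.put(key, "unexpected key in target");
            }
        }
        return diffs;
    }

    public static void main(String[] args) {
        Map<String, Long> source = new LinkedHashMap<>();
        source.put("2024-01-01", 1000L);
        source.put("2024-01-02", 950L);

        Map<String, Long> target = new LinkedHashMap<>();
        target.put("2024-01-01", 1000L);
        target.put("2024-01-02", 949L); // one row lost in the pipeline

        System.out.println(findDiscrepancies(source, target));
    }
}
```

The same shape extends naturally to per-key checksums or row-level hashes when counts alone are not enough to prove integrity.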
Acceptance Criteria
- An automated test suite covering data ingestion pipelines and data validation runs reliably in CI/CD.
- Data validation scripts accurately detect discrepancies across S3, DynamoDB, and Redshift outputs.
- Glue job integration is verified through end-to-end tests with reproducible data scenarios.
- SQL queries produce correct data reconciliation results and perform within acceptable thresholds.
- Defect reports are clear and actionable, with steps to reproduce and expected vs. actual results.
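As an illustration of the validation-script criterion, the sketch below shows the shape of a record-level check in plain Java. The field names and rules are hypothetical; real checks would run against records read from the data stores:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Minimal sketch of a record-level data validation check of the kind the
// acceptance criteria describe. Field names ("id", "amount") and rules are
// hypothetical; real validations would reflect the pipeline's actual schema.
public class RecordValidationSketch {

    // Returns a list of human-readable violations for one record.
    static List<String> validate(Map<String, Object> record) {
        List<String> violations = new ArrayList<>();
        Object id = record.get("id");
        if (!(id instanceof String) || ((String) id).isEmpty()) {
            violations.add("id must be a non-empty string");
        }
        Object amount = record.get("amount");
        if (!(amount instanceof Number) || ((Number) amount).doubleValue() < 0) {
            violations.add("amount must be a non-negative number");
        }
        return violations;
    }

    public static void main(String[] args) {
        System.out.println(validate(Map.of("id", "r-1", "amount", 12.5))); // valid record
        System.out.println(validate(Map.of("id", "", "amount", -3)));     // two violations
    }
}
```

In practice such checks would be wrapped in Spock or JUnit tests and fed by iterators over S3 objects, DynamoDB scans, or Redshift result sets.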
Requirements
- Do you have experience in TestNG?
- Do you have a Master's degree?
- Min Experience: 7 years
Skills: Java, Groovy, Spock (preferred), JUnit, TestNG, AWS (Glue, S3, DynamoDB, Redshift), SQL databases, Data Lake concepts, Git, GitLab, Agile/Scrum, with a continuous improvement mindset.
Education: Bachelor's or Master's degree in Information Technology, Computer Science, Information Systems, or Computer Engineering.
- Experience: 8 years in IT with a strong focus on testing and automation.
- Programming: Proficient in Java and Groovy for test automation and framework development.
- Cloud Platform: Strong experience testing in AWS environments; practical knowledge of AWS Glue, S3, DynamoDB, Redshift, and related services.
- Data Testing: Expertise in data validation, data quality checks, schema validation, and end-to-end data pipelines.
- Frameworks: Hands-on experience with the Spock framework; familiarity with other JVM testing tools is a plus.
- SQL: Advanced SQL skills for data verification, joins, aggregations, window functions, and performance considerations.
- Version Control & CI/CD: GitLab or similar CI/CD tooling; experience with pipelines, artifacts, and test automation integration.
- Collaboration: Proven ability to work with cross-functional teams (developers, data engineers, product, operations) and drive quality decisions.
- Communication: Strong written and verbal communication; ability to articulate test results, risks, and recommendations to stakeholders.
Nice-to-Have:
- Experience with Redshift performance testing and data lakehouse concepts.
- Familiarity with data modeling concepts and ETL/ELT patterns.
- Experience with BDD/TDD approaches and test design techniques.
- Knowledge of security testing practices related to data at rest/in transit.
- Experience with containerization (Docker) and orchestration (Kubernetes) for test environments.