Duties and Responsibilities
- Validate data pipelines/workflows delivered by Data Engineers, and dashboards delivered by BI Engineers, against user story requirements
- Ensure privacy and security principles are assessed and designed into data solutions
- Write high-quality, generalised test datasets and test methods for verifying and validating new functional and non-functional data pipelines/workflows and dashboards, enabling automated testing wherever possible
- Generate golden datasets to efficiently stress-test software, and keep them under version control
- Report QA outcomes to Data Engineers, BI Engineers and Business Owners
- Develop standard QA rules to be applied across all data pipelines/workflows and dashboards
- Maintain a high-quality audit trail of your data pipeline/workflow and dashboard testing
- Manage and mentor junior members of the team
- Create standards, conventions, processes, and SLAs to reduce delivery time and increase the quality of the QA team's output
- Evaluate different tools and technologies to be used by the team.
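
The golden-dataset duty above can be pictured as comparing a pipeline's output against a version-controlled reference. A minimal sketch follows; the `validate_against_golden` helper, the record schema, and the sample values are all invented for illustration, not part of the role description.

```python
# Sketch: checking a pipeline's output against a version-controlled
# "golden" dataset. All names and the schema here are hypothetical.

def validate_against_golden(output_rows, golden_rows):
    """Return a list of human-readable discrepancies (empty list = pass)."""
    issues = []
    if len(output_rows) != len(golden_rows):
        issues.append(f"row count {len(output_rows)} != golden {len(golden_rows)}")
    for i, (out, gold) in enumerate(zip(output_rows, golden_rows)):
        if out != gold:
            issues.append(f"row {i}: {out!r} != {gold!r}")
    return issues

# Example run: one row drifted from the golden reference.
golden = [{"id": 1, "total": 10.0}, {"id": 2, "total": 7.5}]
produced = [{"id": 1, "total": 10.0}, {"id": 2, "total": 7.0}]
print(validate_against_golden(produced, golden))
```

In practice the golden rows would live alongside the pipeline code in version control, so every change to the pipeline is diffed against a known-good output.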
What are the requirements of the role?
- Understanding of Quality Assurance practices for data
- Hands-on experience writing unit tests and mocking data
- Hands-on experience analysing regression reports, automated builds, and KPIs
- Strong experience writing acceptance criteria or scenarios collaboratively with other stakeholders
- Experience across the full data development lifecycle
- Experience with SQL, Python, data modeling, ETL development, and data warehousing
- Experience with AWS technologies such as Redshift, Redshift Spectrum, Athena, and Hive
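
As an illustration of the unit-testing and data-mocking practice listed above, here is a minimal Python sketch using the standard-library `unittest.mock`; the `transform` function and its record schema are invented for the example.

```python
from unittest import mock

# Hypothetical transform under test: normalises a raw record fetched
# from some data source (names and schema invented for this sketch).
def transform(fetch_record):
    raw = fetch_record()
    return {"id": int(raw["id"]), "name": raw["name"].strip().lower()}

# Mock the data source instead of reading a real table, so the test
# is fast, deterministic, and needs no live infrastructure.
fake_fetch = mock.Mock(return_value={"id": "7", "name": "  Alice "})
result = transform(fake_fetch)
assert result == {"id": 7, "name": "alice"}
fake_fetch.assert_called_once()
```

The same pattern scales to mocking database cursors or API clients, letting pipeline logic be verified without touching production systems.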