Participants and "customers" in the testing process, part 2
At each stage, and for each specific task, it is important to use appropriate methods of interaction with the stakeholders of the development process. It can be effective to alternate and mix different approaches and practices; this will help you meet stakeholders' expectations regarding software quality, as well as those of the testing team.
Let's describe, step by step, how the testing team's interaction with other stakeholders can be organized.
QA during requirements analysis and elaboration
Customers and business analysts
- Discussing end-to-end checks and business processes
Analysts and developers
- Reviewing the checklists prepared by testers
- Designing a test model and test strategy
- Joint estimation and planning of tasks
Outside IT teams
- Identifying integration dependencies between systems
- Planning shared use of the testing environment
Testing environment support team
- Discussing requirements for testing environments: configuration, data, performance
- Planning work with the testing environment: schedules, access rights
QA during software development and debugging
Customers and business analysts
- Demonstrating new features
- Updating priorities in the course of development
Analysts and developers
- Discussing and preparing data sets for testing
- Reviewing and testing documentation: specifications, user guides, help, etc.
- Analyzing errors found during software debugging
Outside IT teams
- Obtaining data and preconditions for checking integration points
- Configuring and testing integrations using stubs (see the sketch after this list)
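Stubs stand in for external systems that are not yet available, so integration points can be exercised early. Below is a minimal sketch in Python; the service, endpoint path, port, and response shape are illustrative assumptions, not taken from any real system.

```python
# Minimal HTTP stub for a hypothetical external payment service (illustrative only).
# The /payments/status endpoint and its JSON shape are assumptions, not a real API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSES = {
    # Test data agreed with the integration team: order id -> payment status.
    "1001": {"order_id": "1001", "status": "PAID"},
    "1002": {"order_id": "1002", "status": "DECLINED"},
}

class PaymentStub(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expected request form: GET /payments/status/<order_id>
        order_id = self.path.rstrip("/").split("/")[-1]
        payload = CANNED_RESPONSES.get(order_id)
        if payload is None:
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Point the system under test at http://localhost:8080 instead of the real service.
    HTTPServer(("localhost", 8080), PaymentStub).serve_forever()
```

Once the real system becomes available, only the base URL in the configuration of the system under test needs to change, so the same checks can be rerun against the live integration.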
Testing/production environment support team
- Installing the software in the testing environment and setting up integrations with external systems
- Drafting and verifying the installation guides for the testing environment
- Planning the preparation of the preview environment (access rights, roles, data) for integration and acceptance testing
QA during integration testing and UAT
Customers and users
- Organizing user acceptance testing (UAT)
- Reproducing and analyzing errors found during UAT
- Collecting feedback on the acceptance testing results
Analysts and developers
- Preparing data for end-to-end testing (see the sketch after this list)
- Analyzing and correcting integration errors
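End-to-end testing works best with one consistent, reproducible data set shared by every system in the chain. Here is a minimal sketch of how such a set might be generated; the order fields and file format are illustrative assumptions, not part of any real schema.

```python
# Sketch: generate one consistent data set for an end-to-end run.
# The "order" fields below are illustrative assumptions, not a real schema.
import json
import random
from dataclasses import dataclass, asdict

@dataclass
class Order:
    order_id: str
    customer_id: str
    amount: int          # minor currency units, to avoid float rounding
    status: str

def build_dataset(seed: int, count: int) -> list[Order]:
    # A fixed seed keeps the data reproducible, so a failed end-to-end run
    # can be repeated with exactly the same inputs.
    rng = random.Random(seed)
    return [
        Order(
            order_id=f"E2E-{i:04d}",
            customer_id=f"CUST-{rng.randint(1, 50):03d}",
            amount=rng.randint(100, 100_000),
            status="NEW",
        )
        for i in range(count)
    ]

if __name__ == "__main__":
    # Export once; load the same file into every system in the chain.
    dataset = build_dataset(seed=42, count=20)
    with open("e2e_orders.json", "w") as fh:
        json.dump([asdict(o) for o in dataset], fh, indent=2)
```

Fixing the seed makes the run repeatable: if an end-to-end check fails, exactly the same inputs can be regenerated for debugging.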
Outside IT teams
- Performing joint end-to-end testing
Testing/production environment support team
- Installing the software in the preview environment
- Drafting and verifying the installation guides for the production environment
- Setting up integrations with related systems
- Checking the software's performance and functionality in the preview environment
QA at the pre-release and launch stage
Customers and users
- Participating in a pilot project (if needed)
Analysts and developers
- Drafting installation guides for the production environment
Outside IT teams
- Participating in end-to-end testing during the DryRun
Production environment support team
- Conducting the DryRun and a final smoke test in the preview environment (see the sketch after this list)
- Supporting the installation process during the DryRun and the major release
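A short automated smoke test run right after each installation (DryRun or release) confirms the environment is alive before deeper checks start. Below is a minimal standard-library sketch; the health-check URLs are placeholders for whatever endpoints the actual system exposes.

```python
# Sketch: post-installation smoke test. The endpoint URLs are placeholders;
# substitute the health checks of the actual system under test.
import sys
import urllib.request

CHECKS = [
    ("application is up",      "http://preview.example.internal/health"),
    ("integration gateway up", "http://preview.example.internal/integrations/health"),
]

def smoke_test() -> bool:
    ok = True
    for name, url in CHECKS:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                passed = resp.status == 200
        except OSError:
            # Covers connection errors, timeouts, and HTTP error statuses.
            passed = False
        print(f"{'PASS' if passed else 'FAIL'}: {name} ({url})")
        ok = ok and passed
    return ok

if __name__ == "__main__":
    # A non-zero exit code lets the deployment pipeline stop on a failed smoke test.
    sys.exit(0 if smoke_test() else 1)
```

Wiring the exit code into the deployment pipeline means a failed installation is caught immediately, before integration or acceptance checks begin.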
QA in supporting software in the production environment
Customers and users
- Obtaining and analyzing feedback on the software's release and operation
- Correcting user guides
Analysts and developers
- Reproducing, analyzing, and correcting errors in the released functionality
- Analyzing and resolving "technical" incidents reported by support
- Reviewing the team's work based on the release results: internal and external
Outside IT teams
- Analyzing and correcting integration incidents
- Adding new information on system integrations to the knowledge base
Support
- Correcting errors and conducting analysis of the installation process, post-release performance, and the software architecture
- Reviewing the installation guides for the production environment
Pavel Novikov
Program Manager