What is the ideal testing strategy for a Scrum team? Well, it depends. It depends on the maturity of the team, the team structure, whether it is distributed or not, and several other factors. Having worked with diverse organizations that operate in different styles, I have composed a set of daily tasks and expectations that I have seen work well for testing in Scrum teams.
Below are some assumptions to begin with:
- Sprints are two weeks in duration.
- The Scrum team consists of eight development team members (offshore/nearshore), a product owner, and a ScrumMaster.
- The software is a complex, multitier web-based product with certain data in the database.
- The role of the tester includes testing software in a “QA environment,” while developers are still doing unit tests in their environment.
- The build and deployment process may take up to 30 minutes.
- Build responsibility is on the development side only (in coordination with the continuous integration team).
- A branching structure (shown in the accompanying diagram) is used for reference.
- The bug life cycle, with its respective ownership, is defined (also shown in an accompanying diagram).
Given the assumptions above, I am dividing the sprint period into the following segments, from a QA perspective:
Day 1: Sprint planning
On the first day of the sprint, the entire development team is involved in the planning and in getting to know the features in scope for the upcoming sprint. Everyone should ask questions to build an understanding of the requirements. The test resources participate in user story clarification and estimation just like any other development team members.
Below are the elements of the main focus, from a QA perspective:
- Be part of the planning meeting.
- Review and clarify the acceptance criteria discussed by the PO and the team business analyst.
- Independently provide estimates for each user story, using affinity estimation and experience with the application.
- Raise your concerns if there is any discrepancy between the estimates proposed by peers and your own.
- Break down the tasks for each user story if possible; for reference, the list of QA tasks could be as follows:
- Analysis of requirements and ACs
- Design of TCs
- Peer review
- Review of TCs by business analyst
- Execution of TCs - Manual tester 1 (browser specific)
- Execution of TCs - Manual tester 2 (browser specific)
- Retest of defects
- Create automation logical structure
- Development of automation test scripts
- Automation code review
- Functional review and parallel execution
- Execution of automation test scripts
- Add estimates for each task if possible (the Scrum Guide does not expect the breakdown of tasks and estimates to be completed during sprint planning itself).
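As an illustration, the task breakdown above could be captured with per-task estimates and rolled up to a story-level figure. This is a minimal sketch; the task names come from the list above, but the hour figures are invented for the example.

```python
# Hypothetical per-task hour estimates for one user story.
# The task names mirror the QA breakdown above; the numbers are invented.
qa_tasks = {
    "Analysis of requirements and ACs": 2,
    "Design of TCs": 6,
    "Peer review": 1,
    "Review of TCs by business analyst": 1,
    "Execution of TCs (manual tester 1)": 4,
    "Execution of TCs (manual tester 2)": 4,
    "Retest of defects": 3,
    "Automation logical structure": 3,
    "Automation test scripts": 8,
    "Automation code review": 1,
    "Functional review and parallel execution": 2,
    "Automation execution": 1,
}

total_hours = sum(qa_tasks.values())
print(f"Estimated QA effort for the story: {total_hours}h")
```

Keeping the breakdown in a structured form like this makes it easy to revisit individual estimates at the daily stand-up as the sprint unfolds.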
Remember that these are all estimates, and the daily stand-up should be used to inspect and adapt the tasks as the sprint unfolds.
Days 2 – 4: Development of test cases and automation logical structure
The next three days are mostly spent by the manual testing team creating the test cases, and by the automation team creating the "automation logical structure." The automation test scripts are written against the manual test cases, so completed test cases are a prerequisite for the automation team. In the meantime, the automation team builds the logical structure, which is essentially the mapping layer: classes in the automation framework that represent the objects on each page (a page object model, in common terms).
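A minimal, driver-agnostic sketch of such a logical structure is shown below: one class per page, mapping on-screen elements to named locators and exposing user actions as methods. The locator strings and the driver interface are illustrative assumptions, not tied to any specific tool such as Selenium or Playwright.

```python
# One page-object class per screen; locators are illustrative CSS selectors.
class LoginPage:
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        # Express the user's intent, hiding locator details from the test.
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)


# A fake driver that records actions, standing in for a real browser driver.
class RecordingDriver:
    def __init__(self):
        self.actions = []

    def type(self, locator, text):
        self.actions.append(("type", locator, text))

    def click(self, locator):
        self.actions.append(("click", locator))


driver = RecordingDriver()
LoginPage(driver).login("alice", "s3cret")
```

Because the test scripts talk to page objects rather than raw locators, a UI change touches one class instead of every script that exercises that page.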
The manual test cases are written against the acceptance criteria defined by the product owner or the business analyst. These are mostly in a "Given, when, then" format. For automation purposes, each TC is often assigned a priority: 1 meaning "To be automated," 2 meaning "Nice to automate," and 3 meaning "No automation needed."
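As a sketch, a test case in this shape might be recorded as a simple structure carrying its Given/When/Then steps, the PBI it traces to, and its automation priority. The field names and IDs here are illustrative, not taken from any particular test-management tool.

```python
from dataclasses import dataclass


@dataclass
class TestCase:
    pbi_id: str   # product backlog item this case traces to (hypothetical ID)
    given: str
    when: str
    then: str
    priority: int  # 1 = "To be automated", 2 = "Nice to automate", 3 = "No automation needed"


tc = TestCase(
    pbi_id="PBI-101",
    given="a registered user on the login page",
    when="they submit valid credentials",
    then="they land on their dashboard",
    priority=1,
)

# The automation team can filter for the cases to script first.
to_automate = [c for c in [tc] if c.priority == 1]
```

Tagging priority on the test case itself lets the automation team pull its backlog straight from the manual suite, instead of negotiating it case by case.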
Overall during this period, the general objectives are transformed into tangible test conditions and test cases. During this phase, the test team must:
- Review the PBIs
- Evaluate the testability of those PBIs
- Identify and prioritize test conditions based on the analysis of test items, specifications, structure, and behavior of the software
- Design and prioritize high-level test cases
- Perform a peer review of the test cases with the business analyst
- Identify the necessary test data to support the test cases
- Design the test environment set-up and identify the required infrastructure and tools
- Create traceability between the PBIs and the test cases
Days 4 – 7: Manual testing and automation script creation and execution
Since development executes from Day 1, the changes are likely to be deployed to the testing environment as early as Day 3. The dev team is expected to send the release notes as soon as the changes are deployed to a dedicated testing environment, so that testing can start.
At this stage, the automation team should have all the manual test cases it needs to write the automation suites. The automation test-case creation takes the majority of their time, though the actual automation suite execution can take place within a day.
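Traceability between PBIs and test cases, created on Days 2 – 4 and kept up to date during execution, need not require a dedicated tool; a simple mapping is often enough. The IDs below are invented for illustration.

```python
# Hypothetical traceability matrix: each PBI maps to the TCs that cover it.
traceability = {
    "PBI-101": ["TC-101-01", "TC-101-02", "TC-101-03"],
    "PBI-102": ["TC-102-01"],
    "PBI-103": [],  # no coverage yet
}

# PBIs with no test cases are coverage gaps worth raising at stand-up.
uncovered = [pbi for pbi, tcs in traceability.items() if not tcs]
print("Uncovered PBIs:", uncovered)
```

Even a matrix this simple answers the two questions that matter mid-sprint: which PBIs have no coverage yet, and which test cases must be rerun when a PBI changes.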
Overall during this period, test procedures or scripts are specified by combining test cases in a particular order. The main tasks include:
- Finalize, implement, and prioritize test cases
- Develop and prioritize test procedures, creating test data and writing automated test scripts
- Create test suites
- Verify that the test environment has been set up correctly
- Verify and update traceability between PBIs and test cases
- Execute test procedures
- Log the outcome of the test execution
- Compare actual results with expected results
- Report test results
Days 8 – 9: PBI closure tasks/Bug fixes/Minor updates
In most mature Scrum teams, the product owner does not wait until the sprint review to see the increment developed by the development team. By the seventh day, the PO should ideally have had a first pass at the changes and be able to advise on any minor updates that need to be made. The magnitude of these changes should obviously be small enough that they can be completed in the remainder of the sprint without having to reprioritize any effort.
This period should mostly be used to verify these updates, finish pending bug fixes, and complete the remaining tasks listed in the Definition of Done.
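A Definition of Done check at this stage can be as mechanical as walking a checklist. The items below are examples only; each team defines its own DoD.

```python
# Illustrative Definition of Done checklist for a PBI; the items are
# examples, not a prescribed list. Each team agrees on its own DoD.
definition_of_done = {
    "acceptance criteria verified": True,
    "automation scripts green": True,
    "open defects resolved or deferred by the PO": False,
    "PO has reviewed the increment": True,
}

done = all(definition_of_done.values())
remaining = [item for item, ok in definition_of_done.items() if not ok]
print("Done:", done, "| Remaining:", remaining)
```

Making the checklist explicit turns "are we done?" from a debate into a list of concrete items still open on Days 8 – 9.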
Overall during this period, the data from completed test activities is collected to consolidate the information. The main tasks are:
- Focus on closing all open bugs.
- Go through the sprint backlog and note down any missed user stories for the retrospective meeting day.
- Discuss with the development teams if bugs are still open.
- Check which planned deliverables have been delivered.
- Close or update defect reports.
- Document the acceptance of the system.
Another task that helps is to prepare for the sprint review on this day. The review may involve a bigger audience from the organization, and it's advisable to do a dry run of the features to make sure any setup/caching/environmental issues are resolved.
Day 10: Sprint review and retrospective
In order to be demonstrated in the sprint review, the features should comply with the Definition of Done. The showcased features should be in a potentially releasable state, and the testers and/or business analyst should be able to demonstrate that the features meet the defined acceptance criteria.
There are several articles on strategies for and expectations of a retrospective, so I will simply end this article by reviewing the purpose of this meeting as discussed in the Scrum Guide:
- Inspect how the last sprint went with regard to people, relationships, process, and tools.
- Identify and order the major items that went well, and potential improvements.
- Create a plan for implementing improvements to the way the Scrum team does its work.