
An Agile Process for User Acceptance Testing

07/03/2013 by Bala Lakshminarayan

As a ScrumMaster, I observed that although the QA team at my previous company was hard-working and focused on testing and automation, we were still seeing a significant number of customer-reported issues. Product managers and other subject-matter experts (SMEs) were also testing the product, but their efforts were largely uncoordinated. Drawing on my prior background in software quality assurance management, where I had managed user acceptance testing (UAT) in addition to QA, and on my current role in Agile/Scrum project management, I initiated a process to implement UAT at this company.
What is user acceptance testing?
User acceptance testing is a process that obtains confirmation that a system meets mutually agreed-upon requirements. The UAT acts as a verification of the required business function and proper functioning of the system, emulating real-world usage conditions on behalf of customers.
Who typically performs UAT?

  • Product managers (mandatory)
  • SMEs from client care, compliance, professional services, and beta customers, depending on interest level and availability
When is UAT done?
UAT stories and tasks should be started during regular sprints, and the UAT tasks should be completed and marked done within the sprint. The relevant user stories should then be cloned, and the clones reopened, executed, and closed during the stabilization sprints. The first UAT pass performs initial validation during the sprint in which the feature is built; the second pass, during the stabilization sprints, validates business functionality on the final integrated code prior to deployment at a customer site.
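The cloning step described above can be sketched as a small helper. This is an illustrative Python sketch of the workflow, not Jira's actual clone operation; the flat field names (summary, status, sprint) are simplified assumptions:

```python
def clone_for_stabilization(story: dict, stabilization_sprint: str) -> dict:
    """Return a reopened copy of a completed UAT story, assigned to a
    stabilization sprint. The original story is left untouched.

    The "CLONE - " prefix mirrors Jira's default for cloned summaries.
    """
    clone = dict(story)
    clone.pop("key", None)                 # the clone receives a new key on creation
    clone["summary"] = "CLONE - " + story["summary"]
    clone["status"] = "Open"               # reopened for the second UAT pass
    clone["sprint"] = stabilization_sprint
    return clone


# Example: a story validated and closed during a regular sprint...
story = {
    "key": "UAT-101",
    "summary": "Validate billing-export business function",
    "status": "Done",
    "sprint": "Sprint 12",
}
# ...is cloned and reopened for the stabilization sprint.
stab_copy = clone_for_stabilization(story, "Stabilization Sprint 1")
```

The original story stays marked done as a record of the first UAT pass, while the clone carries the same business function forward for re-validation on the integrated code.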
What are stabilization sprints?
These are hardening sprints that we run just prior to a major product release. During stabilization sprints, no new code is written: quality assurance performs regression testing, product owners and SMEs perform UAT, and development fixes whatever issues are found.
Flow chart describing Agile process for UAT

Template for a user story for UAT
The user story for UAT will typically be written by a ScrumMaster. Only one UAT story needs to be written per release, along the lines of the template below; all the UAT tasks that validate the sprint's individual user stories can then be created under this one story.

As a product owner or an SME, I want to ensure that the code developed for this release performs the required business function defined by the user stories and acceptance criteria, so that I can ensure that the released software meets user requirements.
Acceptance criteria:
  • UAT tests perform the final verification of the required business function and validate proper functioning of the software, emulating real-world usage conditions.
  • UAT tests also ensure that the business functions of the user stories are validated not only at the component level but also from a system-integration perspective.
Structure of UAT in Jira
To illustrate how UAT can be structured for a release, we created a sandbox epic, user stories, and tasks in Jira under the Jira Playground project. A chart describing the sample UAT structure is shown below:
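For teams that script Jira, a structure like this can also be created programmatically. The sketch below builds create-issue payloads for Jira's REST API (POST /rest/api/2/issue); the project key PLAY, the issue keys, and the Epic Link custom field ID are placeholders — the real Epic Link field ID varies between Jira instances:

```python
def build_uat_story_payload(project_key: str, release: str, epic_key: str) -> dict:
    """Payload for the single UAT story of a release, linked to its epic.

    NOTE: "customfield_10008" stands in for the Epic Link field; its
    actual ID differs from one Jira instance to another.
    """
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Story"},
            "summary": f"UAT for release {release}",
            "description": (
                "As a product owner or an SME, I want to ensure that the code "
                "developed for this release performs the required business "
                "function defined by the user stories and acceptance criteria."
            ),
            "customfield_10008": epic_key,  # Epic Link (placeholder field ID)
        }
    }


def build_uat_task_payload(project_key: str, uat_story_key: str, story_summary: str) -> dict:
    """Payload for one UAT task that validates a single user story,
    created as a sub-task under the release's UAT story."""
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Sub-task"},
            "parent": {"key": uat_story_key},
            "summary": f"UAT: validate '{story_summary}'",
        }
    }


# One UAT story per release, with one validation task per user story under it.
story_payload = build_uat_story_payload("PLAY", "4.2", "PLAY-100")
task_payload = build_uat_task_payload("PLAY", "PLAY-101", "billing export")
```

Each payload would be posted with an authenticated HTTP client; the point of the sketch is the shape of the hierarchy — epic, one UAT story, and per-user-story validation tasks beneath it.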

Implementation process for UAT
I first presented to senior management my framework for an Agile UAT process using our Jira/GreenHopper Atlassian tool, with diagrams and flow charts, and pointed out the benefits of an organized Agile UAT process. Based on their positive feedback, I held a meeting with the directors and managers of product management and the product owners, presenting both the benefits and the process for implementing UAT. While there was some concern about the additional work product owners must do to support an organized Agile UAT effort, they agreed that the overall benefits were significant, and I secured buy-in from the product owner teams.
During UAT for our releases, product owners discovered significant bugs in which feature designs did not conform to the user stories and acceptance criteria. Participating SMEs also found significant system-integration bugs during the stabilization sprints. All of these issues were resolved before release, before any customer could encounter them.