Agile Performance Testing: An Experimental Approach

27 May 2013

Introduction

Performance plays an important role in the quality of any product, including software applications. Quality brings excellence and popularity to a product, which in turn brings more business to the organization.

A user won't wait long on a website to get a live cricket score or the item list in an online shopping store. Rather, he or she switches to another site that renders the required information faster. The credibility of a product is highly influenced by its performance, along with its features and functionality.

Performance requirements take top priority in regulated and highly critical applications, such as health care and financial products. This demands rigorous testing of the software at the system level so that it meets its performance requirements and satisfies customers.

When to do performance testing?

We're moving away from the traditional Waterfall model to Agile methods in the software development life cycle. Performance testing, like any other software development activity, needs to be part of the Agile method. In an Agile world, when should performance testing be conducted? In every sprint, or only at the end of product development, as it was in the Waterfall model? How do we test for performance sprint after sprint? How effective is it to conduct performance testing only at the end of product development?

There are plenty of questions and no clear answers. Every product team adopts its own approach to performance testing, depending on the type of product, the team, project timelines, stakeholders, the organization, and so on; some don't consider performance at all in their plans.

The Definition of Done

The Scrum team, along with the product owner and ScrumMaster, commonly defines the Definition of Done used to declare a product backlog item complete. Usually it includes incremental product architectural design, feature design, development, unit testing, integration testing, functional testing, and documentation. Most teams have no explicit performance requirement and assume it is taken care of as part of verification and validation (V&V) activities. I personally recommend that teams identify the performance requirements for the product and create a separate product backlog for them. Scrum teams that have separate performance backlogs sometimes include performance testing in the Definition of Done for the sprint. The question is, how effective is this?

If performance testing is part of the Definition of Done, then that testing can happen only during the last few days of the sprint cycle. The result:

  • Development activity stops before the sprint ends. Waterfall within Scrum?
  • The performance testing is done only for individual features by feature sprint teams. Performance testing is not done at the system level, integrating all the features developed in the sprint. Building performance risk at product level?
  • If feature development continues, these features aren't included in the performance testing of the same sprint. Not all the features developed in the sprint are tested. Conflict in meeting the Definition of Done?
  • There is a chance that not all team members are 100 percent utilized throughout the sprint. The team's velocity fluctuates.

An alternate empirical process

We evaluated a strategy to meet the performance requirements within a sprint, keep the Scrum team members 100 percent utilized, and still meet the Definition of Done. The team followed a simple approach best suited to our application: performance was one of the main requirements for this health care product, and we needed to test it from the initial phases of development.

This was achieved by remodeling the sprint cycles and teams. We introduced a new sprint team consisting of performance engineers. This sprint team took the build that had all the features integrated and delivered in the previous sprint. The Definition of Done for the feature teams did not include performance testing, whereas the new performance sprint team had only performance testing of all the features as its Definition of Done.

Thus we realigned the sprint teams and redefined the Definition of Done for feature sprint teams and performance test sprint teams.

Responding to change is one of the values of the Agile Manifesto. In keeping with that value, here is what changed in our sprint teams:

  • We had a separate sprint team consisting of performance test engineers.
  • This team took the build released at the end of the previous sprint.
  • The team performed performance testing at the feature level and at the system level as features were developed and added to the system in every sprint (a minimal sketch of such a per-sprint check follows this list).
  • Any performance issues found were addressed in subsequent sprints.
  • In between, the team ran reliability testing, starting with the initial features so that reliability issues surfaced early, and continued as features were added to the system.
  • The project scheduled additional sprints, called "hardening sprints," before the actual release of the product to take care of activities such as additional documentation, release documentation, and any remaining system-level V&V activities. Functional testing would already have been completed during every sprint, but it's always good to cross-verify the system as a whole before it is released.
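
To make this concrete, here is a minimal sketch of the kind of automated check a performance sprint team could run each sprint against the build delivered by the previous sprint. The endpoint URL, sample count, and response-time budget are hypothetical placeholders; the article does not describe our actual tools or targets, and a real team would more likely use a dedicated load-testing tool.

    # Minimal per-sprint performance smoke check (illustrative sketch only).
    # The URL, sample count, and budget below are hypothetical placeholders.
    import statistics
    import time
    import urllib.request

    BUILD_UNDER_TEST = "http://perf-env.example.local/patient-list"  # hypothetical endpoint
    SAMPLES = 20              # timed requests per run (assumed)
    P95_BUDGET_SECONDS = 2.0  # hypothetical budget for this use case

    def measure_once(url: str) -> float:
        """Time a single request against the previous sprint's build."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            response.read()
        return time.perf_counter() - start

    def main() -> None:
        durations = sorted(measure_once(BUILD_UNDER_TEST) for _ in range(SAMPLES))
        p95 = durations[int(0.95 * (len(durations) - 1))]
        print(f"median={statistics.median(durations):.3f}s  p95={p95:.3f}s")
        # Treat a blown budget as a failed check: the performance team raises it
        # as a backlog item for a subsequent sprint rather than blocking the
        # feature teams' Definition of Done.
        if p95 > P95_BUDGET_SECONDS:
            raise SystemExit(f"FAIL: p95 {p95:.3f}s exceeds budget of {P95_BUDGET_SECONDS}s")
        print("PASS")

    if __name__ == "__main__":
        main()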

The following picture summarizes our approach to performance testing of a product:

[Figure: How the feature sprint teams and the performance sprint team worked together]

The product grew as features were added and functionally tested, and it met the performance requirements of those features. Any remaining issues related to features developed in the final sprints, or to the system as a whole, were addressed in the initial days of the hardening sprints. The hardening sprints were not primarily meant to address development, V&V, or performance issues but to take care of the additional documentation required to obtain government approval before releasing the product.

Conclusion

The application was critical, and there could be no compromise in meeting its performance requirements. We practiced this performance testing strategy on our product, where the time taken to display the initial objects on screen was a critical use case.
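
As an illustration of how such a use case could be tracked sprint over sprint, the sketch below flags regressions in a recorded metric. The sprint labels, numbers, and tolerance are invented placeholders, not measurements from the product.

    # Sketch: flagging sprint-over-sprint regressions for a critical use case
    # ("time to display the initial objects on screen"). All values below are
    # invented placeholders, not measurements from the actual product.
    RESULTS_SECONDS = {
        "sprint-07": 1.8,  # hypothetical
        "sprint-08": 1.7,  # hypothetical
        "sprint-09": 2.1,  # hypothetical
    }
    REGRESSION_TOLERANCE = 0.10  # flag a >10% slowdown versus the prior sprint (assumed)

    def find_regressions(results: dict[str, float]) -> list[str]:
        """Return the sprints whose measurement regressed beyond the tolerance."""
        sprints = sorted(results)
        return [
            current
            for previous, current in zip(sprints, sprints[1:])
            if results[current] > results[previous] * (1 + REGRESSION_TOLERANCE)
        ]

    if __name__ == "__main__":
        for sprint in find_regressions(RESULTS_SECONDS):
            print(f"Raise a performance backlog item for {sprint}")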

Playing the role of ScrumMaster, I facilitated the team in burning down the product backlog items related to performance requirements and delivering the product to meet its crucial milestone. The product was successfully showcased at a trade show late last year and will be deployed for customers this year.

The customized, experimental approach we practiced during the last eight months worked for us in meeting the product's performance goals in an Agile world. This approach is highly influenced by the Scaled Agile Framework, and our Scrum team was one of the system teams specified by that framework. You can find more detailed information about the Scaled Agile Framework at www.scaledagileframework.com.

We continue to practice this approach to deliver the nonfunctional requirements of the product.


Comments

Rui Miguel Ferreira, CSP,CSM, 5/28/2013 2:46:27 PM
Hi, this is a very interesting approach.
In terms of team size, how do you determine the right size for this team? How do you achieve 100 percent member allocation?

If I understand correctly, this could be a typical QA team that runs tests every sprint. Is that correct?

Thanks a lot.
Sandeep Bramhe, CSM, 5/28/2013 3:54:24 PM
No matter what, you have overhead. It looks like you were lucky to have manageable issues and some time to spare during approval. It can get pretty sloppy pretty quickly if the DoD does not mandate it. The other approach I can see here is automating some part of the testing via TDD. Thanks for sharing, and good luck.
Prakash Mallappa Pujar, CSM, 5/28/2013 11:21:52 PM
Thanks Rui Miguel and Sandeep for your comments.
To answer Rui Miguel's query: yes, there is a separate QA team that executes the DoD from a QA standpoint every sprint, and its testing was automated. Since it is a dedicated QA team, it was 100 percent allocated. This team is a typical system team (as per the Scaled Agile Framework) and hence concentrates on the product rather than on a single feature.
