Agile Support to Testing

17 January 2013

Tina Johnson
Engility

January 5, 2012, 1030 hours (10:30 a.m.): Our government test organization's data management Scrum team was asked to support a testing effort one day after we had started our two-week sprint. We were asked to set up a website and a fresh data management tool database, build a data collection form, test it, and have it all available within a very short window of opportunity. The effort consisted of test and evaluation of an email migration from a server in Hawaii to another location Outside the United States (OUS). The Scrum team replied, "Yes, and . . ." in the spirit of Scrum: we could support the request, and we needed the flexibility to let other tasks drop off the sprint backlog. The product owner gave us the go-ahead. We were then able to test at the speed of need and practice true testing agility.

On January 5, we had a website and fresh database up, accessible, and live.

We built, tested, and updated (with several changes) a draft survey from January 10 through January 24.

The evaluators collaborated with the data management Scrum team to work out the details on connectivity and decide which exporting functions to create to feed the analysis and evaluation. Originally, we needed the final survey ready for prime time no later than January 17; that later slipped to January 24. We would issue surveys to a targeted 500 users prior to their "fail-back" to the OUS mini-pod. The goal was to establish a "before" baseline so the evaluators would have something to compare against after the switch.

We also had to keep the time difference in mind and backward-plan our schedule to accommodate OUS dates (Arizona time plus 16 hours).
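
The date shift is easy to script for planning purposes. Below is a minimal sketch, assuming only the fixed 16-hour offset stated above (Arizona does not observe daylight saving time); the helper name and example date are illustrative, not part of the team's actual tooling.

    from datetime import datetime, timedelta

    # Assumption from the text: the OUS site runs 16 hours ahead of Arizona,
    # and Arizona does not observe daylight saving time.
    OUS_OFFSET = timedelta(hours=16)

    def arizona_to_ous(arizona_time: datetime) -> datetime:
        """Shift a local Arizona timestamp to the corresponding OUS date/time."""
        return arizona_time + OUS_OFFSET

    # Example: a survey going live at 0800 Arizona time on January 24
    # corresponds to midnight at the start of January 25 at the OUS site.
    print(arizona_to_ous(datetime(2012, 1, 24, 8, 0)))  # 2012-01-25 00:00:00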

We also had to keep information assurance (IA) in mind, so we asked our resident IA subject matter expert (SME) whether there were any issues. There weren't, because the survey traffic would all (or mostly all) originate from military (.mil) addresses.

There were concerns about server performance and load testing. We answered that we could load test. As teammate, CSM, and coauthor Marc Rader stated, "This is one of those 'Yes, and . . . ' answers." The auto test team had the ability to do it, though it would take them some time to get it configured, plus we needed to be conscious of live test teams using the site as well. Rader notes, "Regarding loading of the server with survey takers, it was a concern because it had never been loaded that heavily. From a network point of view, we expected that bandwidth would be the choke point — the servers themselves were new and beefy."

We talked to our automated test team about creating sample/practice data; getting what we wanted would take a day or so. Sample-data generation and survey load testing were completed on January 16: an automated test tool hit the data management tool's test server with 275 simulated survey takers, "pounding" the user survey and loading it up with dummy data. The dummy data also let us test exporting the raw survey data to Excel format for analysis.
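
To show the general shape of such a load test, here is a minimal sketch in Python, assuming a simple HTTP form endpoint. The URL, form field names, and thread-pool approach are hypothetical; the team used its own automated test tool, not this script.

    import concurrent.futures
    import requests

    SURVEY_URL = "https://example.mil/survey/submit"  # hypothetical endpoint
    SIMULATED_USERS = 275

    def submit_dummy_survey(user_id: int) -> int:
        """Post one dummy survey response and return the HTTP status code."""
        payload = {
            "respondent": f"loadtest_user_{user_id}",  # hypothetical field names
            "q1_email_speed": "3",
            "q2_comments": "automated load-test response",
        }
        response = requests.post(SURVEY_URL, data=payload, timeout=30)
        return response.status_code

    if __name__ == "__main__":
        # Fire all simulated survey takers concurrently and tally the results.
        with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
            statuses = list(pool.map(submit_dummy_survey, range(SIMULATED_USERS)))
        print(f"{statuses.count(200)} of {SIMULATED_USERS} submissions returned HTTP 200")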

The evaluators and the data management Scrum team worked hand in hand to decide which Structured Query Language (SQL) queries to create and which exporting routines to run to aid the analysis. All of this was done up front, before the testing ever began, which helped get the analysis and reporting done much faster.

On January 24, the survey went live at 0801, when the evaluators sent out an email inviting the OUS users to complete it. We reported survey status at noon (our time) on both January 24 and 25. We also followed up on several surveys that had to be reset because the survey takers had not completed them.

To avoid surprises, and to ensure a quick turnaround of results and findings from the OUS data, we needed to thoroughly dry-run and war-game our analysis approach and methods, knowing ahead of time what answers we could hope to provide and how we would arrive at them. We were to report when each tool was monitoring and the ranges/averages of the metrics collected for each.

As of 1315 hours, January 25, we had 39 responses to the survey, about three-fifths of which were anonymous (two-fifths had actual respondent email addresses).

At 0600 on January 26, we turned off access to the survey and exported the data in Excel format for the evaluators, using the two SQL statements we had created for Excel-based analysis. From 0600 to 1200, the evaluators conducted analysis based on the Scrum team's collection and exporting routines to validate the surveys and establish a baseline of user-perceived test performance. The completed report was released at 1400 hours.
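
To give a sense of what such export routines can look like, here is a minimal sketch assuming a single survey_responses table in a SQLite database and a pandas-based Excel export. The table name, column names, and queries are hypothetical stand-ins, not the team's actual SQL statements or database.

    import sqlite3
    import pandas as pd  # writing .xlsx also requires an engine such as openpyxl

    EXPORT_QUERIES = {
        # Hypothetical stand-ins for the two SQL statements mentioned above.
        "raw_responses": "SELECT * FROM survey_responses ORDER BY submitted_at",
        "response_summary": """
            SELECT question_id,
                   COUNT(*)    AS answers,
                   AVG(rating) AS avg_rating
            FROM   survey_responses
            GROUP  BY question_id
        """,
    }

    def export_to_excel(db_path: str, output_path: str) -> None:
        """Run each export query and write the results to one Excel workbook,
        one worksheet per query, for the evaluators to analyze."""
        with sqlite3.connect(db_path) as conn, pd.ExcelWriter(output_path) as writer:
            for sheet_name, sql in EXPORT_QUERIES.items():
                pd.read_sql_query(sql, conn).to_excel(writer, sheet_name=sheet_name, index=False)

    if __name__ == "__main__":
        export_to_excel("survey.db", "ous_survey_export.xlsx")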

In summary, this quick testing effort showed how our Scrum team took on an important, unexpected effort and applied Agile principles in one government area's migration to Agile testing. The Scrum framework allowed our team to adjust to customer needs quickly, deliver working software, and collaborate with other teams. We did not need to worry about processes, documentation, contracts, or following a rigid plan set in stone. Our product owner allowed us to deviate from the planned sprint backlog and to improvise and overcome in support of a fast-paced and quickly changing testing environment. We embraced change by being flexible and practicing our "Yes, and . . ." philosophy to support the customers. We delivered early, not only in setting up the website and database but also in implementing the survey changes and SQL export routines and in working flexible hours to accommodate the customers. Truly, we enabled the test organization to "test at the speed of need."




