Case Study: Scrum Followed Properly Versus Not

18 March 2013

It's not the methodology or the tool that fails to deliver quality software — it's the people involved. I strongly believe this, and I'm here to share a case study of Scrum followed properly versus Scrum followed improperly. But first, a word about the essence of Scrum as documented in Do Better Scrum.

Most teams have common knowledge about the basics of Scrum: the roles, the artifacts, and the ceremonies.

But more often than not, teams overlook the essence of Scrum and fail to follow it properly. This case study is about two engineering teams, "Team A" and "Team B," both following Scrum . . . but one team missed the very essence of Scrum, while the other practiced it to the core.

Team A believed it was delivering software using Scrum because it had a team, a ScrumMaster, and a product owner. The team had the Scrum artifacts (a product backlog and a sprint backlog) and the Scrum ceremonies, including sprint planning, a daily Scrum, and a four-week sprint cycle. But still the team was struggling to deliver shippable software incrementally that could be rolled out to production. To its advantage, Team A was collocated, with the product owner and most of its stakeholders on site. But still, the results were not there in the form of regular delivery of the most valuable features to production. More important, the stakeholders weren't very happy.

The latest project was then handed to a virtual team, Team B. Team B not only practiced core Scrum principles but also believed in the essence of Scrum. Team B, being virtual, couldn't have daily Scrums because of time constraints, but, using collaboration tools, the team members were able to effectively and efficiently share what they had worked on the previous day, what their plan was for the current working day, and whether there were any impediments. Team B could also communicate robustly with the product owner through collaboration tools about user stories, acceptance criteria, the design of new features, and the Definition of Done. Team B held weekly calls with stakeholders to sync up on the status of the sprint. The team was disciplined in defining its sprint release cycle, including milestones such as the sprint planning meeting, code freeze, functional testing, integration testing, sprint reviews, stakeholder sign-off, and sprint retrospectives, each associated with its respective date.
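Team B's milestone-driven release plan can be thought of as a small piece of shared data. Here is a minimal Python sketch of such a plan; the milestone names come from the article, but the day offsets, start date, and function name are invented purely for illustration:

```python
from datetime import date, timedelta

# Hypothetical day offsets within a four-week sprint; the milestone
# names are from the article, the numbers are illustrative only.
MILESTONE_OFFSETS = {
    "sprint planning": 0,
    "code freeze": 17,
    "functional testing": 18,
    "integration testing": 21,
    "sprint review": 25,
    "stakeholder sign-off": 26,
    "sprint retrospective": 27,
}

def sprint_release_plan(sprint_start: date) -> dict:
    """Map each milestone to a concrete calendar date for one sprint."""
    return {name: sprint_start + timedelta(days=offset)
            for name, offset in MILESTONE_OFFSETS.items()}

# Publish the plan transparently, with dates, as Team B did.
plan = sprint_release_plan(date(2013, 3, 4))
for name, when in plan.items():
    print(f"{when.isoformat()}  {name}")
```

The value here is not the code itself but the practice it models: every milestone has a named owner date up front, so stakeholders know exactly when sign-off is expected.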

By sharing a transparent sprint release plan with dates and milestones, and following this plan meticulously, Team B demonstrated that it followed the essence of Scrum. As a result, it not only delivered the project on schedule and with quality but also was able to refactor the code written by Team A, getting rid of most of the technical debt.

So let's analyze the difference between the teams a bit more. Team A was less disciplined in ensuring visibility into its progress and honestly communicating about progress and risks. The team struggled to organize itself around the work and failed to deliver the most valuable features regularly. As a result, Team A wasn't able to receive valuable feedback from stakeholders. With no way to reflect on its work to improve, the team set itself up for failure. It was not a case of the team not working hard enough but rather a failure to properly plan, set expectations, and deliver frequently. Because the team was committing itself to four-week sprint cycles, the development team allowed itself some short-term code wins, delivering features without paying attention to the long-term goals of ensuring the scalability, reliability, availability, and maintainability of those features. The QA team was brought into the sprint cycles late, thus losing the advantage of giving early second opinions about the requirements and finding any gaps. By the time the QA team found any requirement gaps, the development team was in the middle of the next sprint cycle. Team A also missed setting expectations with stakeholders on the timelines for signing off on sprint items, thus not allowing stakeholders to provide timely feedback and missing the opportunity to improve on the delivered features.

In contrast, Team B, though virtual, was disciplined in planning, set expectations by being transparent, communicated robustly using collaboration tools, and ultimately was able to frequently deliver the most business-valued features.

I look forward to any feedback on how your team follows the essence of Scrum.

Comments

Gautam Roy, CSM, 3/18/2013 7:18:10 PM
Great insights!
Srinath Chandrasekharan, CSP,CSM, 3/20/2013 11:44:44 PM
A project's success is based on many parameters, the people and methodology being just two of these. Your article compares two projects on these two parameters. So while I agree that "It's not the methodology or the tool that fails to deliver quality software — it's the people involved," I feel the comparison could have been better illustrated by holding at least one of these parameters the same: either two teams following the same practices, where the people make the difference, or two projects with people of very similar skill levels, where the process makes the difference.
Sridhar Peddisetty, CSM, 3/22/2013 10:56:09 PM
Srinath,
I concur with your point about comparing using similar parameters, and that is what I did in the article: the two teams comprised members with more or less similar technical backgrounds. The point I was trying to drive home is that both teams "practiced" Scrum, but one missed the essence of Scrum while the other followed it in practice. In other words, the two teams differed in how they "followed" the process in practice.
Alan Atlas, CST,CSC,CSP,CSM,CSPO, 10/11/2013 12:15:44 PM
Hello Sridhar,

I'm a bit concerned for people reading this who might not be too well versed in Scrum. Some of what you describe regarding Team B in particular sounds a lot like waterfall and not Scrum. Typically we do not have a concept of "code freeze" in Scrum, nor do we separate functional and integration testing. Your process picture looks like it is quite influenced by waterfall, and I wanted to point that out to help people understand better.

alan
Sridhar Peddisetty, CSM, 10/26/2013 12:19:25 AM
Alan,
Thank you for sharing your points. There are a lot of ongoing debates on what constitutes Scrum and what does not. IMHO, Scrum is a lightweight framework that is adaptable. "Code freeze" and "functional or automated integration testing" are part of the SDLC, which applies irrespective of the delivery or project methodology (waterfall or agile). Just because we have a "code freeze" or "functional/integration testing" does not mean the methodology is more waterfall. Some teams use the concept of "hardening" instead of "code freeze." Functional and automated integration testing are part of any iterative development cycle, and using them as stage gates or milestones (even though continuous) is, in my opinion, permissible.
Wayne Moore, CSM, 11/20/2013 3:31:11 PM
Hi Sridhar,
Thanks for the article. I do agree with Alan on the concept of a "code freeze" or even "feature freeze" being a waterfall concept. My issues with it are, first, that it defers integration/regression testing until the end rather than making it an iterative, integrated process of the development team. Second, it places a great deal of stress on QA to do all this testing at the end. And third, it reduces the amount of time that could be spent developing requested features. While it might be a temporary measure for new Scrum teams, I would challenge them to eventually eliminate the freeze by doing continuous integration builds and Test-Driven Development.

Regards,

- wayne
