

Common Failure Points in Scrum Implementation

27 September 2017

Niranjan Nerlige V
Exelplus Services

There are numerous articles and presentations about Scrum anti-patterns. Based on my experience in coaching, consulting, and training in Scrum, I have compiled a list of what are, in my perspective, common Scrum failure points. It can be used as a checklist of items to guard against during the implementation of Scrum.

Of course, this is not to say that every team engages in all of these anti-patterns. But they are common enough that every team should be careful that they do not fall into these habits.

During sprint planning

  • The development team does not estimate taking the past performance into consideration. This is essential for knowing the team’s ability to determine the quantity of work they can take on in a sprint (whether estimated in story points or ideal days).
  • The actual capacity, based on the team members’ availability, is not taken into account by the development team.
  • No product backlog refinement is done after the previous sprint. Product backlog items for an upcoming sprint should have been estimated, ordered, and defined in detail.
  • The sprint backlog is not clearly compiled or complete (the team might not even have a task board), and the development team does not collaborate to define the selected product backlog items and sprint tasks.
  • The objective of the sprint is not clearly identified, which makes it difficult to think about what needs to be demoed at the end of the sprint.
  • The product owner is not involved in the sprint planning meeting, so expectations are not clearly understood.
  • Sprint planning often takes a long time, and team members get into long discussions to decide on the work to be undertaken. The timebox for sprint planning is not respected.
  • Teams do not identify the infrastructure items, exploration items, spikes, and research items in the product backlog.
  • Capacity allocation for the development work in a sprint is not understood properly. The team must take into account emergency leaves, bug fixing, research, and other work that is not development work.
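That last point about capacity allocation can be made concrete with a small sketch. The function and numbers below are purely illustrative (nothing here is prescribed by Scrum); the idea is simply to subtract known absences and reserve a share of time for non-development work before committing to sprint scope:

```python
# Illustrative sprint capacity calculation. All names and numbers are
# hypothetical; adjust the focus factor to your own team's context.
def sprint_capacity(members, sprint_days, focus_factor=0.7):
    """Estimate hours available for development work in a sprint.

    members: list of (hours_per_day, days_off) tuples, one per team member.
    focus_factor: reserves a share of time for bug fixing, research,
    meetings, and other non-development work.
    """
    gross_hours = sum(h * (sprint_days - off) for h, off in members)
    return gross_hours * focus_factor

# A five-member team, a 10-day sprint, two members with planned leave:
team = [(6, 0), (6, 0), (6, 1), (6, 3), (6, 0)]
print(sprint_capacity(team, 10))  # 6 * 46 hours, scaled by the focus factor
```

A team that skips this step tends to plan against its theoretical maximum and then miss the sprint goal whenever leave or support work intervenes.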

During the Daily Scrum

  • Teams lose focus during the meeting.
  • Teams take too long to complete the Daily Scrum, not respecting the timebox.
  • The ScrumMaster drives the whole meeting. The Daily Scrum is meant for the development team to discuss their progress.
  • Team members do not participate actively — some don’t even come to the meeting regularly.
  • The team does not adapt after the meeting by updating their daily task board to keep their sprint goal in mind. The Daily Scrum is meant to support an inspect-and-adapt approach every day.
  • Teams don’t get together and treat this as an opportunity to discuss their shared commitment for the sprint.
  • It is not just about discussing three questions. The team can also use this opportunity to discuss:
    • How well everyone is working together to swarm around the backlog items.
    • What is expected for the demo.
    • Progress toward the sprint goal.
    • How to adapt for the next day through taskboard updates.
    • Impediments identified.

During sprint execution

  • Succumbing to “waterfalling” within sprints — teams often unwittingly adopt a Waterfall approach within a sprint.
  • Teams do away with testing activity at the end of the sprint.
  • Teams don’t swarm when there is a critical issue or an impediment that needs to be resolved. Whenever there is a critical issue, it makes sense for all members of the team to collaborate and solve it together, with focused attention.
  • Outside help is not called upon when necessary. The team should avail themselves of subject matter experts, architects, etc.
  • The product owner is not involved in the frequent verification of the work during the sprint.
  • The Definition of Done is not tested at the completion of a user story, or not identified effectively.
  • The team is not T- or E-shaped, so its members cannot support each other effectively.
  • The Scrum team is not concerned with features but works only on layers (UI, middle, and database layers).
  • The team does not use the PDCA approach effectively (develop a hypothesis, determine the data needed to test the hypothesis, check/test, then act). In an Agile environment, use data to help drive decisions.
  • The team does not track its daily progress using task boards, burn-down charts, cumulative flow diagrams, etc.
  • The team does not run regression tests, due to lack of automation.
  • Continuous integration is not present; the build often takes a long time or is done at the end of the sprint.
  • Test-driven development and other technical practices are done partially or not done at all.
  • Teams do not limit the number of backlog items in progress during the sprint.
  • Team members don’t defer a story, keeping the product owner informed, when a serious impediment arises and work on the user story is likely to stall.
  • Team members do not actively collaborate to resolve defects and issues discovered within the sprint.
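The point above about tracking daily progress (task boards, burn-down charts, cumulative flow diagrams) can be sketched with a minimal burn-down calculation. The function and data below are hypothetical; the value is in comparing the actual remaining work against an ideal trend line each day, so drift is visible early:

```python
# Illustrative burn-down tracking: compare remaining work to an ideal line.
# Function name and sample data are hypothetical, not part of "official" Scrum.
def burn_down(total_points, sprint_days, remaining_by_day):
    """Return (day, actual_remaining, ideal_remaining) rows.

    remaining_by_day: story points still remaining, recorded at the end
    of each elapsed day of the sprint.
    """
    ideal = [total_points * (1 - d / sprint_days) for d in range(1, sprint_days + 1)]
    return [(d + 1, rem, round(ideal[d], 1)) for d, rem in enumerate(remaining_by_day)]

# A 10-day sprint with 40 committed points, five days in; if the actual
# line sits above the ideal line, the team inspects and adapts now,
# not at the sprint review.
for day, actual, ideal in burn_down(40, 10, [38, 36, 33, 30, 28]):
    print(f"day {day}: remaining {actual}, ideal {ideal}")
```

The specific chart matters less than the habit: the data must be updated daily, or it cannot support the inspect-and-adapt loop the Daily Scrum is meant to drive.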

During the sprint review

  • The demo is done without stakeholders present, and no one is empowered to take a measure of the sprint’s success.
  • The sprint review is done without the sprint goal in mind.
  • The team fails to mutually agree on what is left undone and then update the product backlog accordingly.
  • The release date for the remaining work is not clear; the burn-up chart is not clear.
  • Often the team doesn’t use the sprint review effectively by objectively considering:
    • Acceptance criteria for the user stories.
    • The Definition of Done for the sprint.
  • The first time the product owner sees the completed work is at the end of the sprint.
  • The product owner doesn’t ensure that key business stakeholders are available to attend and to assess the product or product increment.
  • The product owner and ScrumMaster do not focus on the review of the product or product increment presented but instead drift away into discussion of other areas.
  • The Development Team does not prepare well for the sprint review meeting by:
    • Ensuring that all user stories that are shown have been tried out before the meeting.
    • Ensuring that the work is really “Done.”
  • The entire group does not collaborate on what to do next, so that the sprint review fails to provide valuable input to properly done sprint planning.
  • The team forgets that the sprint review can be highly useful if these points are followed:
    • Don’t shrink it to a demo only.
    • Have discussions, assess the plans for the next sprints (for example, review the product backlog), have end users play around with the new features.
    • Do whatever you can to make sure that time is not wasted in the upcoming sprints on doing the wrong stuff. So please, don’t call this meeting “the demo.” It is much more. It is the sprint review! Feedback and “feedforward” are an essential part of this meeting.
  • The “Happiness Metric” is not used as a tool to collect feedback (I note that this is not part of “official” Scrum, but it is highly useful).

During the sprint retrospective

  • The team doesn’t prepare to discuss what went well and what could be improved in this meeting.
  • The team identifies too many (an unattainable number of) improvements for the next sprint.
  • The team misses out on the most important potential improvements.
  • The meeting is done incompletely, without inspecting all the necessary aspects: people, relationships, processes, and tools.
  • The team does not create a plan for implementing improvements to the way they work in future sprints.
  • The ScrumMaster doesn’t facilitate the meeting, and the meeting goes in a different direction or lacks certain elements (such as identifying areas for improvement).
  • The product owner’s feedback is not taken into consideration for making improvements.
  • Team members do not identify and explain their expectations of themselves and of one another.

In general

  • The team does not follow estimation techniques, relying on gut feel instead.
  • Teams are not cross-functional.
  • Technical debt is left to increase.
  • Product backlog refinement meetings don’t happen often enough or are inadequate, so that no preparation is done for the upcoming sprint.
  • The ScrumMaster is just a team administrator who does not truly understand his or her role.
  • The team does not track their past performance; i.e., there is no velocity tracking to help decide how much work they can take on.
  • The product owner is not available during the demos or the sprint planning and sprint review meetings.
  • The team does not understand that Scrum is not just a list of practices but constitutes a mindset for agility.
  • The team doesn’t understand that Scrum does not mean that no analysis should be done and no documentation should be written. They fail to understand that the way of documenting is just a bit different: it’s a continuous process.
  • The team doesn’t write user stories with the right amount of detail (enough, but not too much); each story should be estimated in story points or ideal days.
  • There is lack of automation of testing activities.
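The velocity-tracking point above lends itself to a tiny sketch. The function below is hypothetical (a simple moving average; teams use many variants), but it shows the essential idea: let recent, measured performance bound the next sprint's commitment instead of gut feel:

```python
# Illustrative velocity forecast: average recent sprints' completed points.
# The function name, window size, and history are hypothetical examples.
def velocity_forecast(completed_points, window=3):
    """Average the last `window` sprints' completed story points as a
    rough guide for how much work the team can take on next sprint."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

history = [21, 25, 19, 23, 24]     # points completed in past sprints
print(velocity_forecast(history))  # (19 + 23 + 24) / 3
```

Any comparable mechanism works; what matters is that the team records what it actually finished each sprint and consults that record during sprint planning.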
Again, I would stress that not all teams fail in all these ways (or Scrum would never work!). But these are the areas where I have seen Scrum teams break down most often, and therefore it is useful to be aware of these pitfalls before a Scrum implementation.

Opinions represent those of the author and not of Scrum Alliance. The sharing of member-contributed content on this site does not imply endorsement of specific Scrum methods or practices beyond those taught by Scrum Alliance Certified Trainers and Coaches.
