Measuring Alignment with the Principles of Agile

2 January 2014


What you cannot measure, you cannot manage.
What you cannot manage, you cannot optimize.

Most organizations and teams that have adopted Agile as a primary method for software delivery know the Agile Manifesto and the 12 Agile principles like the backs of their hands. However, they may not have made a conscious effort to measure their level of adherence to those principles. Often, a few important questions linger in the mind of the manager or leader: How do I know whether my teams are truly Agile? And when do I, as a manager or leader, make an informed decision to intervene at the right time and help the team achieve common goals? Although teams in an Agile environment are self-organizing and self-managing, a team practicing Agile methods for the first time can often use a little help in the form of good coaching and leadership. To be effective, a coach or leader needs data to make informed decisions.

Metrics are best when defined based on a clear set of objectives. Metrics are not the goal. Metrics are the means to monitor how one is performing against one's goals. Similarly, adopting Agile should not be the goal. Agile should be the means to achieve the organization's business-driven IT objectives. If adopting Agile practices is chosen as the means for achieving your objectives, then the journey starts by embracing and adhering to the Agile Manifesto and the 12 Agile principles.

Let's take a closer look at each of the 12 Agile principles and explore a suggested set of qualitative and quantitative measures that can potentially provide the necessary insight into teams' adherence to them.

Recollecting the 12 Agile principles:
  1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
  3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  4. Business people and developers must work together daily throughout the project.
  5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  7. Working software is the primary measure of progress.
  8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  9. Continuous attention to technical excellence and good design enhances agility.
  10. Simplicity -- the art of maximizing the amount of work not done -- is essential.
  11. The best architectures, requirements, and designs emerge from self-organizing teams.
  12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

Key aspects: Customer satisfaction, early and continuous delivery, and delivery of software that is considered valuable by the customer

Customers typically have two levels of satisfaction: "satisfied in the near term" and then "satisfied in the long term." An example of near-term satisfaction could be meeting the objectives of a two-week sprint, and an example of long-term satisfaction could be the ability to meet the objectives consistently and also help the business achieve its objectives.

One of the preferred and effective techniques for gathering insight into long-term customer satisfaction is conducting surveys. It is also a common practice in many organizations to publish a customer satisfaction scorecard with the ability to drill down to the portfolio or program level. Typically, the questions in surveys conducted at the organization level tend to be generic so that they are relevant for a wider population across the organization. Therefore, to measure effectively at the project level, customer satisfaction surveys specific to the project could be helpful.

Continuous delivery and adherence to schedule are probably best measured through an analysis of milestones planned versus achieved (counting both milestones achieved on the planned date and milestones achieved at a later date). To measure early and continuous delivery effectively, teams will have to consider the milestones associated with a release or deliverable of a work product or service. Alternatively, Agile earned value analysis metrics, such as the schedule performance index (SPI) -- considered a more mature technique -- can be used to gain both backward- and forward-looking insight into schedule adherence and continuous delivery. Further, SPI comes in handy when milestone dates are shifted or redefined due to progressive planning and adaptation.

The customer is in the best position to define what is considered high value. Of course, in technology-intensive projects, inputs from technical experts can also help define what is considered high value. Calculating ROI at an item level for both features and risks provides insight into whether valuable software is being shipped. The practice of prioritizing and ordering the backlog with high-value items at the top of the stack significantly influences the "early" in early and continuous delivery. In addition to ROI, Agile earned value analysis metrics such as the cost performance index (CPI) can be used to gain both backward- and forward-looking insight into adherence to the budgeted cost and the value realized.
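As an illustration of how SPI and CPI might be computed from story points, here is a minimal sketch. It assumes the simplification that each completed story point earns its planned value; the function name and all figures are hypothetical.

```python
# Simplified Agile earned value sketch. Assumes value is earned per
# completed story point; names and numbers are illustrative only.

def earned_value_indices(points_planned, points_done, budget, actual_cost):
    """Return (SPI, CPI) for a status date."""
    value_per_point = budget / points_planned   # planned value of one point
    pv = value_per_point * points_planned       # planned value to date
    ev = value_per_point * points_done          # earned value to date
    spi = ev / pv                               # schedule performance index
    cpi = ev / actual_cost                      # cost performance index
    return spi, cpi

spi, cpi = earned_value_indices(points_planned=100, points_done=80,
                                budget=50_000, actual_cost=45_000)
print(f"SPI={spi:.2f}, CPI={cpi:.2f}")  # SPI=0.80, CPI=0.89
```

An SPI or CPI below 1.0 suggests the team is behind schedule or over budget, respectively, which is what makes these indices forward-looking as well as historical.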

Suggested metrics and measures:
  • Customer satisfaction survey/Scorecard
  • Release milestone analysis (planned versus achieved)
  • Agile earned value analysis
  • Return on investment (ROI)

2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.

Key aspects: Ability to adapt to changing requirements even late in the game

When teams and management who have decided to adopt Agile practices for the first time put the ideas to work, they tend to think that "Agile" means no up-front planning. However, Agile methods actually advocate planning to the best possible extent up front, planning continuously, and planning progressively. It is important to have a vision, prepare a project charter with the best information available, and have a groomed product backlog with story points estimated during planning.

Performing planning activities up front helps IT demonstrate to the business the degree to which the teams have been adaptive and have welcomed changes in requirements. The ratio of the initial number of product backlog items to the sum of the initial number of backlog items and the backlog items that have been added, modified, or deleted provides insight into the degree to which the product backlog has changed. (In the traditional world, each of these changes would potentially have resulted in a change request, with reestimation of effort and additional costs involved.) Alternatively, teams can measure the stability of the product backlog -- in other words, requirements stability -- using story points. Requirements stability by story points can be defined as the ratio of the total number of story points in the product backlog at the end of a grooming exercise during planning to the total number of story points on a status date. Special effort may be required to gain insight into modified and deleted story points. Assuming that the product backlog is periodically revisited, groomed, and ordered, the metric can be indicative of the team's ability to adapt to changing requirements.
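The two stability definitions above reduce to simple ratios. A sketch with illustrative counts and hypothetical function names:

```python
# Requirements stability sketch, following the two definitions in the text.
# All inputs are illustrative counts, not data from a real backlog.

def stability_by_items(initial, added, modified, deleted):
    """Initial items / (initial items + items added, modified, or deleted)."""
    return initial / (initial + added + modified + deleted)

def stability_by_points(points_at_planning, points_on_status_date):
    """Story points after initial grooming / story points on the status date."""
    return points_at_planning / points_on_status_date

print(stability_by_items(50, 10, 5, 5))   # 0.714... -> ~71% stable
print(stability_by_points(200, 250))      # 0.8
```

A value near 1.0 means the backlog has barely changed; a lower value means more churn, which, as noted below, is not automatically a bad sign in an Agile context.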

That said, requirements instability need not necessarily be viewed as a negative indicator -- a typical traditional reaction. The degree to which the product backlog has changed can also be viewed as the number of opportunities the business had to strengthen the customer's competitive advantage.

Requirements stability along with SPI and CPI may provide the ability for teams and leadership to ask the right questions during adaptive planning and/or retrospective sessions and take corrective and preventive actions. When should the leadership intervene? Should the leadership help the team if more than 30 percent of the product backlog has changed? These are some of the questions that can help manage stakeholder expectations. This should not be viewed as command and control but rather as informed decision making, tuning and adjusting behavior in alignment with the 12th Agile principle.

Suggested metrics and measures:
  • Requirements stability (by number of backlog items)
  • Requirements stability (by story points or effort)

3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.

Key aspects: Delivering working software, regular/periodic delivery, shorter delivery cycle

Potentially shippable software is not the same as software that has been shipped, is in use, and whose value is being realized by the business and the users. Therefore, it is important to measure whether value-adding working software was indeed delivered to the customer, and quickly.

There are a couple of ways to gain insight into this. One way is to log the defects detected as part of the product backlog and measure the percentage of defects relative to the number of user stories in the product backlog; this is indicative of quality -- in other words, whether the software is working or not. The milestone analysis technique can again be used to measure the frequency at which the team was able to achieve the planned milestones.

Suggested metrics and measures:
  • Percentage of defects in the product backlog
  • Release milestone analysis (planned versus achieved)
  • Release burn-down

4. Business people and developers must work together daily throughout the project.

Key aspects: Collaboration between business and developers

Typically, the effort required of the product owner for a 2-week sprint could be anywhere from 11.25 to 70 hours, depending on the level of involvement. The 11.25 hours could include 4 hours of sprint planning; 15-minute daily stand-ups, adding up to 2 hours and 15 minutes over 2 weeks; 2 hours of sprint review; 2 hours of backlog grooming; and 1 hour of sprint retrospective. This is the bare minimum for the product owner during the sprint, assuming that adequate effort was spent having the backlog groomed and well defined before commencing the sprint. The effort spent by the business on the project is an indicative measure of the business and developers working together.

Once it is established that the business is spending an adequate number of hours on the project, the next objective is monitoring the effectiveness of the business and developers working together. One way to gauge this effectiveness is by calculating the percentage of escaped defects: the ratio of the number of defects that escaped into the production environment (in some cases, the UAT environment) to the total number of defects for the same period.
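The escaped-defect percentage described above is a single ratio; this sketch uses hypothetical figures:

```python
def escaped_defect_pct(escaped, total):
    """Defects that reached production (or UAT) as a percentage of all
    defects found in the same period."""
    return 100.0 * escaped / total

# Illustrative: 4 of 50 defects in a release escaped to production.
print(escaped_defect_pct(escaped=4, total=50))  # 8.0
```

A rising escaped-defect percentage can signal that the business is not engaged closely enough to catch misunderstandings before they ship.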

Suggested metrics and measures:
  • Project effort logged by business toward the project
  • Percentage of escaped defects

5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.

Key aspects: Motivate individuals; provide suitable environment plus support and trust

Similar to the customer satisfaction survey, one of the preferred and effective techniques for gathering insight into employee satisfaction is conducting surveys and collecting feedback. It is also a common practice in many organizations to publish an employee satisfaction scorecard. Numerous studies show that satisfied employees deliver better services than employees who are not satisfied. The sprint retrospective can be a great time to understand any dissent and dissatisfaction within the team. The number of nontechnical impediments logged can be indicative of the team's morale and issues that it is facing.

When you assign a task to an individual, you either receive the completed task back -- or you receive excuses. But if you trust an individual with a responsibility, you will most likely get more than you asked for. The aspect of allowing the teams to be self-organizing potentially creates an environment in which each individual team member takes responsibility to deliver the goods to the customer as promised. Providing a safe environment is the responsibility of the leader.

Suggested metrics and measures:
  • Employee satisfaction survey/Scorecard
  • Percentage of nontechnical impediments

6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

Key aspects: Effective communication, co-location, collaboration among team members

Although it is agreed that the most efficient and effective way to communicate is face to face, seldom are teams co-located in today's world. If teams could be co-located all the time, industry models such as Scaled Agile Framework and Distributed Agile would not have come into existence.

However, it is possible to plan on building a close-knit team by understanding up front the team's distribution by location (being aware of cultural differences) and the number of communication channels. Other, reactive indicators could be the number of impediments related to communication that are logged and the number of defects injected due to lack of communication. These lag, or reactive, indicators may provide leaders with the ability to take course-corrective actions.

Suggested metrics and measures:
  • Percentage of team distribution by location
  • Number of communication channels
  • Percentage of impediments related to communication
  • Percentage of defects with lack of communication as the cause

7. Working software is the primary measure of progress.

Key aspects: Measuring progress; working software

All of us know that Agile relies on value-based planning and delivery, unlike the traditional plan-driven approach. Working software alone is the primary measure of progress and is measured using the concept of velocity. Team velocity is simply the number of story points delivered by the team in each sprint. Another metric that potentially provides insight on top of velocity is delivery efficiency.

Delivery efficiency by sprint can be defined as the ratio of the number of story points delivered to the number of story points planned during sprint planning, expressed as a percentage. In addition to velocity (number of story points delivered), delivery efficiency provides the percentage of the number of story points that were delivered against those planned.
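A minimal sketch of delivery efficiency as defined above, with illustrative numbers:

```python
def delivery_efficiency(points_delivered, points_planned):
    """Story points delivered as a percentage of points committed
    during sprint planning."""
    return 100.0 * points_delivered / points_planned

# Velocity is simply the points delivered in the sprint; efficiency
# adds the planned-versus-delivered dimension on top of it.
velocity = 24                                            # points delivered
print(delivery_efficiency(velocity, points_planned=30))  # 80.0
```

Tracked sprint over sprint, the two numbers answer different questions: velocity shows how much the team delivers, and efficiency shows how reliably it delivers what it committed to.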

Suggested metrics and measures:
  • Velocity
  • Delivery efficiency

8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.

Key aspects: Sustainability, constant pace

A common question that many managers ask is whether team velocity is improving or not, and whether the team has reached a sustainable and constant pace.

In order to provide insight into how the team is performing with respect to reaching a steady pace with velocity, I am introducing the concept of acceleration. As defined in physics, acceleration is the rate of change of velocity. Acceleration provides insight into whether the team has improved on its previous sprints' velocity. If, for example, the team delivered 10 story points at the end of sprint 2 and 15 story points in sprint 3, then the team has an acceleration of 5 story points; if the team then delivered the same 15 story points in sprint 4, the acceleration is 0. Please note that acceleration should not be equated with speed or velocity (just as in physics). Acceleration only indicates the rate of change and is especially useful for learning whether the team has reached a reasonably predictable steady state in terms of the number of story points it can consistently deliver. Acceleration can also be negative.

Velocity, delivery efficiency, and acceleration

Formulae:
v = u + at
a = (v – u) / t
Assuming "t" is constant (sprint duration is kept constant)

Where
v = Final velocity
u = Initial velocity
a = Acceleration
t = Time taken
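Applying the formula with t held at 1 sprint, a short sketch that reproduces the example from the text:

```python
# Acceleration sketch: rate of change of velocity between consecutive
# sprints, with sprint duration held constant (t = 1 sprint).

def accelerations(velocities):
    """Sprint-over-sprint change in velocity: a = (v - u) / t with t = 1."""
    return [v - u for u, v in zip(velocities, velocities[1:])]

# Example from the text: 10 points in sprint 2, 15 in sprints 3 and 4.
print(accelerations([10, 15, 15]))  # [5, 0] -> improving, then steady
```

A run of accelerations hovering around zero suggests the team has reached the predictable steady state described above; a run of negative values is an early prompt for a retrospective conversation.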

The other metric that comes in handy is the resource utilization rate. This provides insight into whether an individual or a team is being utilized in an optimized manner. Leaders get an opportunity to take corrective measures if the team or an individual is frequently working extended hours.

Suggested metrics and measures:
  • Acceleration (a concept borrowed from physics)
  • Resource utilization rate

9. Continuous attention to technical excellence and good design enhances agility.

Key aspects: Technical excellence and good design, improve agility

Agile practices can be broadly classified into management practices, engineering practices, and business or product-related practices. Achieving technical excellence has many advantages: the system can accommodate changes without too much rework, can be scaled up, and can be built with fewer defects (if not defect free), along with longevity, portability, maintainability, and so on. All of these result in improved agility.

Design quality is sometimes subjective and is very much team and change dependent. A design quality index can be based on parameters such as the time needed to develop, maintainability, portability, scalability, etc. The design quality index is essentially a score that the teams can relate to for comprehending the quality of design. The effort required to refactor could be another useful indicator of the quality of design and source code. A lag metric that can be used to gain insights into design quality is percentage of defects injected due to poor design.
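Since the text leaves the design quality index open-ended, here is one possible sketch of it as a weighted score; the parameter names, weights, and 0-10 scale are assumptions for illustration, not a standard definition:

```python
# One possible design quality index: a weighted average over parameters
# the team agrees on. Parameters and weights below are illustrative.

def design_quality_index(scores, weights):
    """Weighted average of per-parameter scores (each on a 0-10 scale)."""
    total_weight = sum(weights.values())
    return sum(scores[p] * w for p, w in weights.items()) / total_weight

weights = {"maintainability": 3, "portability": 1, "scalability": 2}
scores = {"maintainability": 8, "portability": 6, "scalability": 7}
print(round(design_quality_index(scores, weights), 2))  # 7.33
```

What matters is less the exact formula than that the team agrees on the parameters and weights, so the score is comparable from sprint to sprint.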

Cyclomatic complexity can be useful in determining the amount of decision logic incorporated in the code, which drives code complexity. Another lag metric commonly used to gather insight into the quality of a module is defect density: the number of defects per unit size (lines of code, function points, etc.).

Suggested metrics and measures:
  • Design quality index
  • Effort required to refactor
  • Defect density
  • Cyclomatic complexity
  • Percentage of defects with design-related defect causes

10. Simplicity -- the art of maximizing the amount of work not done -- is essential.

Key aspects: Maximize amount of work not done

The best technique to monitor the team's ability to maximize the amount of work not done is by keeping the release burn-down up to date. Having a risk-adjusted backlog potentially helps ensure that the team has factored the risks and that any potential slips due to a risk are proactively managed.

Monitoring the effort required for the technical debt identified in the product backlog, and the number of defects that are open, provides additional insights into the backlog so that the product owner and the team can take course-corrective actions for maximizing the amount of work not done.

Suggested metrics and measures:
  • Release burn-down
  • Percentage technical debt in the product backlog
  • Percentage of open defects in the product backlog

11. The best architectures, requirements, and designs emerge from self-organizing teams.

Key aspects: Self-organizing team

The ability to gain insight into whether the team is truly self-organizing is probably the most challenging aspect to compute as a metric. However, it can be inferred through other means, such as discussion during the sprint retrospective and discussions with the team's ScrumMaster. Reviewing the list of impediments could also provide insight into the level of self-organization.

Caution: A few "creative" teams measure the number of times management intervenes. However, in many cases measuring this aspect has resulted in creating more nuisance value than any tangible benefit. The Agile coach and an experienced ScrumMaster are in the best positions to steer a team toward better self-organization.

Suggested techniques and measures:
  • Sprint retrospective
  • List of impediments due to team organizing issues

12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

Key aspects: Team retrospective, behavioral change, fine tuning, adjustments and adapting

This is, again, a team-specific measure and is related to self-organization.

Teams can start by measuring the efficiency at which the team, including the ScrumMaster, is able to close on action items identified during the sprint retrospective. A more mature practice is to weigh the benefit realized by moving on the action items identified during the sprint retrospective. The former aspect can be termed "retrospective efficiency," since the measure focuses on the ability to act on and close retrospective action items. The latter can be termed "retrospective effectiveness," since the metric focuses on the effectiveness of closing retrospective action items.
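Both measures can be sketched as simple ratios; the inputs and the benefit scale here are illustrative assumptions:

```python
# Retrospective efficiency and effectiveness sketch. "Benefit" here is
# any agreed scale (e.g., expected vs. realized benefit points); the
# figures are illustrative assumptions.

def retrospective_efficiency(closed, identified):
    """Action items closed as a % of those identified in retrospectives."""
    return 100.0 * closed / identified

def retrospective_effectiveness(benefit_realized, benefit_expected):
    """Benefit realized from closed action items as a % of benefit expected."""
    return 100.0 * benefit_realized / benefit_expected

print(retrospective_efficiency(closed=8, identified=10))                      # 80.0
print(retrospective_effectiveness(benefit_realized=6, benefit_expected=10))   # 60.0
```

A team can close most of its action items (high efficiency) and still realize little benefit from them (low effectiveness), which is exactly why the two measures are worth separating.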

Suggested metrics and measures:
  • Retrospective efficiency
  • Retrospective effectiveness

Conclusion

Measuring adherence to the Agile principles at both the organization and team levels gives teams and management a structured way to track their alignment with those principles. It enables proactive and course-corrective actions, so that they can reap the benefits of developing products using Agile methods.

What gets measured gets done.

Comments

Madhu Venantius Laulin Expedith, CSM, 1/3/2014 10:01:04 AM
Acceleration can also be useful when the team ends-up having varying sprint durations for whatever reasons and computing velocity becomes inaccurate because of the varying sprint duration. Acceleration takes into account the Time (Sprint Duration) as a multiple of “t”.
Zach Bonaker, CSP,CSM,CSPO, 1/3/2014 11:39:07 AM
This is a great article. Beyond the demonstration of metric ideas, the point you make about metrics being a measurement of goals is important.

Far too often, organizations get caught up in producing metrics without having a hypothesis or goal to test/measure.
Ganesh Shenbagaraman, CSM, 1/5/2014 1:21:17 AM
Great article, Madhu. Some of the new metrics you introduced connect the principles and practice in a very insightful way.
Madhu Venantius Laulin Expedith, CSM, 1/5/2014 6:29:47 PM
Thanks Zach and Ganesh
