

Relaxing the Iron Triangle to Deliver Quality

4 May 2016

Vikas Jain
Infosys Technologies

One of the fundamental principles underpinning the IT industry is the concept of the Iron Triangle. The Iron Triangle provides an interesting way of looking at project delivery through the perspective of three variables: scope, schedule, and cost (resources). It states that to deliver a project with the expected quality, when the project is going out of bounds, a manager has to adjust one of those variables: scope, schedule, or cost (resources).

Scope (what must be built): Scope encompasses tasks that were committed to the business within the specified time frame. Therefore, the idea of descoping is a difficult conversation and is usually avoided. Conversations related to undertaking extra work when there is available capacity are also typically avoided.

Schedule (when it must be built by): The schedule is the most difficult variable to influence, because changes to the delivery schedule are costly. Releases are already committed and, in some industries, advertisements and training are already planned.

Resources (how much it costs): Most of the time, we simply add resources to a late project to bring it back on track. I have often wondered whether, given this simplistic approach toward managing projects, I could brainstorm some middle ground.

While reading this, keep in mind that when executing projects, we have a commitment to an expected quality standard, depending on the risk appetite of the client.

The problem arises when we try to map everything to numbers. The first numbers that we provide are estimates, and after that, all future calculations are reverse calculations to make that estimate appear as the gospel truth. It is not difficult to understand that those numbers are somebody's best guess (yes, I used the word guess), made on the basis of available information from the current project (requirements, assumptions, etc.) and our mapping of the current project to past projects (similar or nearly similar).

This construct is fundamentally flawed: information changes, and ground-level experience with the project dictates its future work. Traditional project management methods have always found it challenging to embrace change; change management has always been a problem.

Is there a different potential approach toward managing work? We need a method that has seamless and nimble acceptance of change.

Relaxing the Iron Triangle

Let's try another variable. Almost everyone agrees that the schedule is difficult to change, but what about scope? This is where the idea of a team's cadence comes into play.

Scope as a variable is a difficult idea to digest, but shouldn't the scope delivered by the same team increase over time? The common practice has been to extrapolate scope delivery linearly over a period of time and, in more mature cases, to add an improved-productivity factor (most of the time ad hoc, or with minimal data). Either way, the work method around scope still involves calculations external to the team.
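The linear extrapolation described above can be sketched as a toy calculation. All numbers here are hypothetical, and `projected_scope` is an illustrative helper, not part of any planning method:

```python
# Toy illustration (hypothetical numbers): classic planning often projects
# scope delivery linearly from an initial estimate, sometimes with an
# ad hoc productivity uplift, rather than from the team's observed cadence.

def projected_scope(points_per_iteration, iterations, productivity_gain=0.0):
    """Linearly extrapolate delivered scope, compounding an assumed
    per-iteration productivity gain (often picked with minimal data)."""
    total = 0.0
    rate = points_per_iteration
    for _ in range(iterations):
        total += rate
        rate *= 1.0 + productivity_gain
    return total

flat = projected_scope(20, 6)          # 120.0 points, flat projection
uplift = projected_scope(20, 6, 0.05)  # ~136.0 points with an assumed 5% uplift
```

Note that both projections are driven by the initial estimate and an assumed gain, not by what the team actually delivers iteration by iteration, which is exactly the weakness the article points out.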

Scrum proposes a solution to this issue: Bifurcate the work in timeboxed iterations and create the product backlog by defining the scope in the form of prioritized requirements (which itself is dynamic). Then, give the challenge to the team to conquer. The team picks those user stories in the order given in the backlog, and starts delivering. The scope delivered over these timeboxed iterations varies, but in most cases it improves over time. The ownership of scope delivery is with the team and not with someone external.
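As a rough sketch of the pull described above, a team might take stories from the top of a prioritized backlog until its empirically observed capacity is reached. The story names, point values, and the `plan_sprint` helper are all illustrative, not part of the Scrum framework itself:

```python
# Illustrative sketch: pull user stories from the top of a prioritized
# product backlog into a timeboxed sprint, in backlog order, stopping
# when the next story would exceed the team's observed capacity.

def plan_sprint(backlog, capacity):
    """Take stories in priority order while their combined size fits
    the team's capacity; return the chosen stories and points used."""
    sprint, used = [], 0
    for story, points in backlog:
        if used + points > capacity:
            break
        sprint.append(story)
        used += points
    return sprint, used

backlog = [("login", 5), ("search", 8), ("checkout", 13), ("reports", 5)]
sprint, used = plan_sprint(backlog, capacity=18)
# sprint → ["login", "search"], used → 13; "checkout" waits for a later sprint
```

The key point the code makes concrete: capacity is an input observed from the team's own cadence, and the ordering comes from the backlog, so no calculation external to the team decides what gets delivered.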

That construct embraces change and ground-level realities in the work method itself. It empowers the team and has been successful in creating high-performing teams. We also minimize the risks of big design up front: work can start quickly, because the design and scope evolve, and we don't wait for some superhuman (lead, manager, or consultant) who can forecast the complete future of the project before starting it.

The myth about using 100% of resources

What happens when the network to which your computer is connected is running at 100% capacity? The network slows down significantly, doesn't it? The equipment moving the packets of data will experience numerous collisions and will have to send requests back to your computer to resend data. The equipment will also begin to simply "drop" packets because it can't process them quickly enough.

Similarly, knowledge work, such as software development, requires deep thinking (processing) and extensive collaboration (communication). It has always been challenging to estimate or account for this time (either we pad some time, or on an ad hoc basis assign some percentage to the estimate). The idea of 100% resource utilization has its roots in the manufacturing industry and is baggage from the past.
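The slowdown near full utilization can be made concrete with a standard single-server queueing result. In an M/M/1 model (used here purely as an illustration of the principle, not as a model of software teams), the mean time a job spends in the system is 1/(μ − λ), which blows up as the arrival rate λ approaches the service rate μ:

```python
# Classic M/M/1 queueing result: mean time in system T = 1 / (mu - lam),
# where mu is the service rate and lam the arrival rate. As utilization
# (lam / mu) approaches 100%, T grows without bound.

def mean_time_in_system(service_rate, utilization):
    """Mean time a job spends queued plus being served in an M/M/1 queue."""
    arrival_rate = service_rate * utilization
    return 1.0 / (service_rate - arrival_rate)

base = mean_time_in_system(10.0, 0.50)  # 0.2 time units at 50% load
busy = mean_time_in_system(10.0, 0.95)  # 2.0 time units at 95% load
# Going from 50% to 95% utilization makes the same job take 10x longer.
```

The same shape shows up in the network analogy above: the closer a shared resource runs to 100%, the more time is lost to queuing, collisions, and rework rather than useful output.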

Overcoming Parkinson's law

The knowledge industry (IT) has mysteriously fallen prey to Parkinson's law, which states that work expands to fill the time available. The implication is that if work expands to fill available time, then managers should overload backlogs to ensure that time is spent as efficiently as possible.

In traditional project management, some combination of estimators, project managers, and team members builds a list of tasks needed to deliver the project's requirements. These work-breakdown structures are ordered based on predecessors, successors, and the team's capacity. Each team member's load is meticulously balanced to a prescribed level (generally 100%). Once the project begins, the real world takes over, and wham! Something unanticipated crops up, or a group of tasks turns out to be more difficult than anticipated. These are considered scheduling errors. Rarely do additions to the schedule ever balance with the subtractions. As soon as the plan is disrupted, something has to give. And while replanning does occur, the typical approach is to work longer hours or to cut corners. Cutting corners and tired team members can, and generally do, lead to increased levels of technical debt.

In projects with high levels of transparency, these decisions reflect trade-offs that are based on business decisions. In some cases, the date might be more important than quality, cost, and the long-term health of the team. Making that type of decision rarely makes sense, but when it does, it must be made with knowledge of the consequences.

An Agile team has natural antidotes to Parkinson's law: the prioritized backlog, the burn-down chart, and the daily stand-up (Scrum) meeting. Team members discuss daily the work they have completed and the work they will complete. When the sprint backlog is drawn down, the team can (with the product owner's assent) draw new stories into the sprint. The burn-down chart helps the team understand how they are consuming their capacity to complete work.
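A burn-down chart is simply the remaining work recorded at each stand-up, usually compared against an ideal straight-line burn. A minimal sketch, with all point values invented for illustration:

```python
# Minimal burn-down sketch: remaining story points after each day of
# the sprint, alongside an ideal straight-line reference.

def burn_down(total_points, completed_per_day):
    """Return the remaining points after each day of the sprint."""
    remaining, out = total_points, []
    for done in completed_per_day:
        remaining -= done
        out.append(remaining)
    return out

def ideal_line(total_points, days):
    """Straight-line reference burning total_points down to zero."""
    return [total_points - total_points * d / days for d in range(1, days + 1)]

actual = burn_down(40, [3, 5, 0, 8, 6, 7, 6, 5])  # [37, 32, 32, 24, 18, 11, 5, 0]
ideal = ideal_line(40, 8)                          # [35.0, 30.0, ..., 5.0, 0.0]
```

Days where the actual line sits above the ideal line (here, the flat stretch on day 3) are the early signal that lets the team adjust scope or raise impediments before the sprint end, rather than discovering the slip afterward.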

Whether you use Agile or classic project management techniques, Parkinson's law can take hold. However, the typical response of planning for, and insisting on, 100% utilization might lead to a situation in which the cure is not worth the pain delivered in the treatment. In all cases, slack must be planned to account for the oft-remarked "stuff" that happens, and teams must be both responsible and accountable for delivering value within the time at their disposal.

