How does an organization implement something it has never done, for a client who has never done it before, using team members who have never done it, for end users who have never used it? That is the question posed to my company, Tantus Technologies (Tantus), when we set out to explore how to tackle a particular component of a federal customer's strategic plan: implementing Agile software development.
The following white paper details how, lacking the in-house expertise to implement Agile and unable to hire or contract out due to budgetary constraints, we went about educating ourselves, our internal customers, and our customer's end users in the first year of implementing an Agile approach for a federal customer.
Our federal customer's goal of Agile adoption in 2011 was very much in line with industry and sector trends at that time. Agile methods were growing across both the private and public sectors, with product companies and service providers alike adopting Agile at higher rates than ever before. Even organizations more aligned with traditional software development had recognized this wave and were providing guidance to their membership on how to adopt Agile practices. The Project Management Institute (PMI) found in its 2011 annual member survey that the use of Agile by its members had jumped from 12 percent of all projects in 2010 to 27 percent in 2011, a rise that led PMI to create and offer its Agile Certified Practitioner (PMI-ACP) certification. As a CMMI ML-3 appraised organization, we were also cognizant of the fact that the Software Engineering Institute had updated some of the process areas within the Capability Maturity Model Integration to help those using Agile interpret its practices.1
As a federal agency, our customer's foray into Agile software development was also in adherence with direction provided by the Office of Management and Budget (OMB) in 2010, which had advocated modular development as a chief component of its 25-Point Implementation Plan to Reform Federal Information Technology Management.2
The U.S. Government Accountability Office (GAO) followed OMB's plan in 2012 by issuing guidance on effective practices and challenges in applying Agile methods in a federal environment.3
With these internal and external drivers providing background as to the "what" and "why" of implementing Agile software development, the next focus for our team was the "how." The "how" question led directly to the heart of our three-pronged challenge.
Our challenges in implementing an Agile approach on our program were plentiful and complex. Implementing Agile practices and creating an Agile culture is a difficult task in any organization; in our case, we identified three specific categories of challenges that we had to overcome in order to be successful: organizational, cultural, and personnel. A look at each category follows.
Our federal program is a highly matrixed organization, supporting more than 23 systems with more than 80 team members, performing both operations and maintenance work along with projectized enhancements. The number of projects per year historically ranged from 50 to 70. Due to the number of projects and the diverse skill sets required to support all the various systems, identifying 5 to 7 cross-functional team members who could be fully dedicated to Agile projects was an unlikely option.
This challenge is important, because one of the key success factors to implementing Agile software development is having dedicated team members. Stable, dedicated cross-functional teams build cohesion together as they learn to understand the product, the methodology, and their coworkers. This stability leads to much more accurate estimation. The estimation of team productivity, measured in story points, becomes more reliable with each iteration; however, accuracy is predicated on the assumption that the team is kept together and timeboxes are respected. Dedicating a team, keeping it together, respecting the timebox, and continuing to estimate using story points should ultimately translate to the increased accuracy of team velocity -- or, put simply, the amount of functionality a team can produce per iteration. If we were to truly realize the benefits of Agile adoption, then addressing the organizational challenge of operating in a matrixed environment would be paramount to our success.
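To make the velocity arithmetic concrete, here is a minimal sketch of how story-point velocity and a backlog forecast can be computed. The numbers and function names are hypothetical illustrations, not figures from our program, and the forecast only holds under the assumptions described above: a stable team and a respected timebox.

```python
import math
from statistics import mean

def velocity(completed_points, window=3):
    """Rolling-average velocity: mean story points completed
    over the last `window` sprints."""
    return mean(completed_points[-window:])

def sprints_remaining(backlog_points, completed_points, window=3):
    """Forecast sprints needed to burn down the remaining backlog,
    assuming the team composition and sprint length stay fixed."""
    return math.ceil(backlog_points / velocity(completed_points, window))

# Hypothetical story points completed by a stable, dedicated team:
history = [18, 22, 20, 24, 23]
print(velocity(history))             # rolling average of the last 3 sprints
print(sprints_remaining(120, history))
```

Note how each additional sprint tightens the rolling average; this is why swapping team members or stretching the timebox, as discussed below, invalidates the forecast.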
We set out to implement Agile in an organization with a long history of Waterfall development. Our program had matured its processes over a number of years, taken great efforts to achieve CMMI ML-3, and imprinted on its team a culture of success via conformance with processes. Many of the lessons learned from past victories would need to be challenged or changed in an Agile environment, including the following:
Figure 1. Agile Literally Turns Traditional Development Upside-Down
Adopting Agile schedule and cost-estimation techniques: In an Agile environment, the traditional priority triangle is turned upside down. What was once fixed (scope) becomes a variable, and what were traditionally variables (resources and time) become fixed. As we started to develop our Agile capability, we did not know how the program's leadership team would react to this new estimation approach, especially after putting forth such efforts to create systems to identify and manage schedule and cost overruns.
Management versus leadership: Much of the culture and organizational structure of our federal program was built on a foundation of traditional Waterfall views toward project management. Projects were managed from the top down, with federal program leadership reviewing the work of federal project managers, and federal project managers reviewing the work of project leads. As with the schedule and cost estimation challenge, we did not know how this highly structured and traditional culture would adapt to the more dynamic nature of requirements, designs, and final work products produced in an Agile environment. Self-directed teams? Ever-changing scope? These concepts were strange, even potentially threatening, to the established hierarchy of the organization.
The personnel challenges we faced were in large part related to our lack of experience with this approach. Our company's core competencies were in supporting large federal financial and procurement systems using traditional development practices, and our operations and maintenance work was largely rooted in ITIL practices. As a result, the personnel we hired generally had a background suited to the work they would perform. While a few employees had some experience with some Agile practices, no one on the program was an Agile expert at that point. And with a company of our size (60 to 70 employees), we had limited reach-back to find expertise on our bench.
To compound the challenge, none of our federal customer's team members had significant experience in Agile software development either. In the absence of any in-house expertise, the alternatives were to hire strategically or to contract out; budgetary limitations ruled out both options.
In support of the strategic goal of Agile delivery, we developed and executed an Agile action plan. Key components of our plan included:
Evaluating and selecting an Agile framework -- we chose Scrum
Sending our personnel to training to become certified ScrumMasters
Bringing in an external Scrum coach for an executive training session to ensure our customer understood the Agile methodology and its benefits
Developing a training overview for new customers to educate them about their role as product owners
Conducting Agile brown-bag training sessions to develop a common understanding of Agile practices across the team
Creating an Agile software development life cycle (SDLC) including tools, templates, and techniques for development and delivery
Identifying a subset of pilot sprints upon which to execute using the new Agile SDLC
On this last point, identifying a subset of projects to pilot Agile, we found a ready and willing participant in an innovative new program charged with developing actionable business intelligence (BI) for the financial and operational aspects of our customer's business. The federal manager for this program stated his willingness to try new approaches to delivery. That was all we needed.
We applied our newly developed Agile framework for the BI program, while the standard Waterfall or structured iterative approach was used for most other projects. With a subset of projects in place to pilot our new approach, the Agile action plan was finalized. We began the year with a focus on developing our people and Agile methodology, planned and executed our pilot projects in midyear, briefed our program leadership on the results of our pilots, and updated our templates, policies, procedures, and guidelines based on sprint retrospective comments.
Some of our first sprint retrospective comments were very illuminating regarding what worked well and where we had the opportunity to improve. It was clear that Agile practices such as the daily stand-up, impediments list, and the increase in customer interaction worked well. It was also clear that most of the comments focused on improvement had to do with the pains of adopting a new methodology. The team struggled with the lack of dedicated teams, change/configuration management, Agile estimation, and full stakeholder participation.
We closed out our first year with an Agile Summit, which was a cumulative retrospective on all sprints conducted in the past year and included all of the team members who had participated in sprints. Figure 2 is a graphic representation of the overall timeline of our Agile action plan.
Figure 2. Agile Action Plan
By the end of FY 2012, we had increased Agile software development from less than 1 percent of our portfolio to 10 percent of all projects for that year. Along the way we identified additional successes as well as opportunities for improvement to carry forward in our evolution.
In keeping with the Agile principles of inspecting and adapting, we conducted the aforementioned Agile Summit to look at the year in review. Major findings and key success factors derived from our experience include the following:
An empowered and available product owner is key. It is challenging to get customer time and focus in the way that Agile prescribes. Be sure your customer is ready to start the sprint when you are, and be certain your customer can assign a representative who is empowered to make binding decisions. In the current federal environment, an era of increasing responsibility and decreasing budgets, federal employees are stretched as never before, and finding a customer who can dedicate his or her time to the sprint team is a serious challenge. Articles and training courses on Agile sometimes portray a utopian environment in which a sprint team works for a fully dedicated internal customer, with a product owner who is wholly focused on and reliant on your team for his or her success. In today's federal environment, we found a number of customers drowning in a sea of assignments, even as we stood by with a life preserver.
Dedicating staff is paramount to realizing the benefits of Agile. A matrixed team environment is not optimally aligned to the Agile model, and it presented us with a roadblock to realizing the full benefits of Agile. When people move in and out of teams, team cohesion suffers, more coaching is required, team velocity does not improve as quickly, and velocity itself becomes less predictable. Where you can, we strongly recommend dedicating teams solely to Agile development. Where you cannot, seek alternatives: dedicate as many of the sprint team members as possible, and for those who cannot be fully dedicated, a reasonable accommodation is to designate core hours during which they are dedicated to Agile work.
Right-size the amount of change you introduce. Introducing too much change hurts team productivity and hinders root-cause analysis of process improvements. In our experience, the first pilot sprints produced an inordinate number of retrospective comments. It was tempting to implement the vast majority of retrospective recommendations in order to increase the speed of improvement. Don't! Spreading process improvements out over time makes adoption of any change clearer for the sprint team (because there are fewer changes to remember) and makes root-cause analysis of each change's impact easier (because there are fewer change variables).
Respect the timebox. Resist the temptation to lengthen the sprint to fit more work into the timebox. More sprints mean smaller deliveries, which reduce risk and allow the team to learn faster. In our experience, we faced the challenge of having multiple products and product backlogs. Without enough sprint teams to conduct multiple iterations on one product, we found ourselves extending the timebox in order to complete the scope for the product owner. By doing this, even with our customer's best interests at heart, we violated one of the basic tenets of Agile development. In the process, we also invalidated any metrics we had regarding team velocity.
Consistently develop the product backlog. An often-overlooked component of Scrum is the continued development of the product backlog. There is often some amount of work that needs to occur before the sprint starts, such as initial requirements gathering, that allows for greater refinement of product backlog components. The Scrum Primer4 recommends that 5 or 10 percent of each sprint be dedicated to refining (or grooming) the product backlog. In our experience, this refinement is vital. Where we started sprints with only an epic user story, we often did not produce all functionality required or we took longer to develop than the timebox allowed.
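As a rough illustration of that 5 to 10 percent guideline (the function name and working-day figures below are our own assumptions, not from the Scrum Primer), the per-person refinement budget for a sprint works out as follows:

```python
def refinement_hours(sprint_days, hours_per_day=8.0, fraction=0.05):
    """Hours per team member to reserve for product backlog
    refinement, given a fraction of the sprint (5-10 percent
    per the Scrum Primer's suggestion)."""
    return sprint_days * hours_per_day * fraction

# A two-week sprint (10 working days) at 8 hours per day:
print(refinement_hours(10))                  # 5 percent of the sprint
print(refinement_hours(10, fraction=0.10))   # 10 percent of the sprint
```

In other words, for a typical two-week sprint, each team member would spend roughly half a day to a full day refining the backlog for upcoming sprints.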
Automated tools can alleviate some knowledge shortfalls on the team and help speed the process of Agile adoption. When first delving into the world of Agile adoption, we considered whether or not it would be wise to acquire an Agile tool. There were a number of well-respected tools in the market at the time (e.g., Mountain Goat, VersionOne, Atlassian). We were advised by several sources that a low-tech approach of handmade Kanban boards, burn-down charts, and impediments lists was preferable in the Agile world.
In the end, one of the lessons learned was that we needed to acquire a tool both because of our relative newness to Agile and because of our lack of full team colocation. After some analysis, we adopted the JIRA GreenHopper tool. Since its adoption, we have found our team working seamlessly through tasks. For project work, we found that using a tool is more efficient for product and sprint backlog management, impediments lists, and story point estimation. For operations and maintenance work, the use of JIRA's Kanban boards enables teams working in multiple locations on multiple systems to prioritize and track their work. It may be true that the most effective communications approach to Agile work is low-tech, high-personal interaction, but when starting out, the use of an automated tool can be a big help in bridging the knowledge gap while the team learns through experience.
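The board concept itself, whether handmade or automated, is simple. As an illustrative sketch (the class and card names below are hypothetical and have nothing to do with JIRA's internals), a Kanban board is essentially named columns of work-item cards, with a single operation for pulling a card from one column to the next:

```python
class KanbanBoard:
    """Minimal in-memory Kanban board: named columns holding work-item cards."""

    def __init__(self, columns=("To Do", "In Progress", "Done")):
        # One ordered list of cards per column.
        self.columns = {name: [] for name in columns}

    def add(self, card, column="To Do"):
        """Place a new card in the given column (default: the intake column)."""
        self.columns[column].append(card)

    def move(self, card, dest):
        """Find the card wherever it currently sits and pull it to `dest`."""
        for cards in self.columns.values():
            if card in cards:
                cards.remove(card)
                self.columns[dest].append(card)
                return
        raise ValueError(f"card not on board: {card!r}")

board = KanbanBoard()
board.add("Patch nightly ETL job")
board.move("Patch nightly ETL job", "In Progress")
print(board.columns)
```

The value of a tool like JIRA over this whiteboard-level model is not the data structure but the shared visibility it gives teams spread across multiple locations and systems.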
Developing an Agile culture is a process. It became apparent early on that it would be difficult for our team to get out of the Waterfall mentality (directed) and into the empowered Agile mentality (self-directed). This transition can be especially confusing to employees in a matrixed environment working on both Agile and Waterfall projects concurrently. Even more foreign to team members drowning for years in the Waterfall of traditional development was the concept of developing solutions vertically instead of horizontally. Even when provided with user story prioritization by the customer, our team members often still developed horizontally. By holding onto horizontal development instead of adopting vertical development, we often found ourselves in all-or-nothing positions at the scheduled time for deployments, because we did not have any potentially shippable increments of software that satisfied the Definition of Done in our organization. If your organization encounters this difficulty, it is imperative that the ScrumMaster address this mentality as soon as it becomes evident within the sprint. Remember, one of the main responsibilities of the ScrumMaster is to coach the development team in the practices of Scrum. It is incumbent on the ScrumMaster to identify when the team deviates from best practices in order for it to reach its full potential.
The rise of Agile software development is moving beyond just a trend and toward a phenomenon. To ignore this wave is to put your organization at risk of not being able to compete. But even with the lure of its benefits driving an organization forward, Agile adoption can be a difficult transition to master. The solution for creating a successful environment for Agile adoption lies within one of the principal tenets of the methodology itself: inspect and adapt. One need not look any further than the writings of Ken Schwaber and Jeff Sutherland, codevelopers of Scrum, who asserted in their landmark work, The Scrum Guide: "Scrum is founded on empirical process control theory, or empiricism. Empiricism asserts that knowledge comes from experience and making decisions based on what is known."5 Meaning it is in the doing that knowledge is gained. The cultural, organizational, and personnel challenges experienced at this federal customer were our particular challenges. But whatever challenges an organization faces with Agile adoption, the key to success is to lay out an action plan, execute it, and continually inspect and adapt processes to drive continuous improvement. Sometimes it just might take a Waterfall to become Agile.
5. Ken Schwaber and Jeff Sutherland, The Scrum Guide.