I still meet a lot of Agile software developers and teams who are not applying proper test automation practices, such as micro-testing, automated integration testing, and continuous integration. There are three reasons for this:
1. Developers don't see the value.
Some developers are just so good and experienced, and they make so few mistakes, that automated testing only adds overhead. These developers usually have more important things to do, like applying their über-skills to architecture, while the lesser souls fix their bugs. Getting teams to apply test automation practices with these developers on board is probably the hardest.
2. Managers don't see the value.
Some managers think automated testing is extra work. And since they have projects to deliver, there is no time to be fooling around. Somehow they overlook the fact that most project risk and uncertainty lies in the bug-fixing phase, before the product is ready to be deployed. If there are strong developers and ScrumMasters on the team, they will, we hope, be able to convince their managers of the value of testing — or, if not, do it anyway.
3. Legacy code exists.
Sometimes a code base has existed for a while without proper test automation practices, and it has evolved into tightly coupled code with many dependencies. It is very hard to create micro-tests for code with many dependencies. Even mocking is not an option anymore, since there are so many mock objects required for a single test that development teams find it almost impossible to start with micro-tests.
Usually reasons (1) and (2) will eventually lead to reason number (3). In the beginning, the code base is still manageable without test automation, since the total amount of code and features is still limited and few regression tests need to be run. So teams don't really see the problem they're going to face later. Over time the pressure on the manual testing of code and bug fixing will increase, since more code will need to be regression tested and the code base will become larger and more complex to understand. Finally, at some point, half of the time or even more will be spent on testing and bug fixing, whereas Agile teams should focus on delivering customer value.
Preventing bugs is key
What Philip Crosby meant by his quote "Quality is free" (also the title of a book he published in 1979) is that preventing defects is always cost efficient. This definitely applies to software development. It is best illustrated in the graph below, which plots the cost of a defect, or bug, against the phase in which it is found.
[Graph: minutes required to fix a bug, by the phase in which it is found]
Backlog grooming: In a backlog grooming session, developers will validate or disqualify their assumptions by asking questions of the product owner. By discussing requirements in detail, misconceptions will be eliminated and possible bugs will be prevented. The time required to eliminate bugs is minimal, since not a single line of code has been written.
Micro-testing: When micro-tests fail after changing parts of the code, the failing tests pinpoint the newly introduced defects for the developer. Since tests should be run often, the amount of code changed since the last time all tests passed will be so small that the developer can easily fix the newly introduced bugs, usually within minutes. Note that I prefer the term micro-tests over unit tests because the term unit doesn't emphasize enough that they need to be really small.
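To make this concrete, here is a minimal sketch of what micro-tests can look like. The function under test and all the names are hypothetical examples, and the tests are plain Python functions so the sketch runs without any test framework; in practice you would run them with your team's test runner.

```python
# A micro-test exercises one tiny behavior in isolation.
# The function under test (apply_discount) is a made-up example.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Each micro-test checks exactly one behavior, so a failure
# pinpoints the defect immediately.
def test_discount_applied():
    assert apply_discount(100.0, 25) == 75.0

def test_zero_discount_leaves_price_unchanged():
    assert apply_discount(80.0, 0) == 80.0

def test_invalid_percent_is_rejected():
    try:
        apply_discount(50.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass

if __name__ == "__main__":
    test_discount_applied()
    test_zero_discount_leaves_price_unchanged()
    test_invalid_percent_is_rejected()
    print("green bar: all micro-tests pass")
```

Because each test covers only a few lines of behavior, a red test after a small change tells you almost exactly where the new bug lives.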
Integration tests: When micro-tests don't catch a certain bug, this bug will stay in the system until another test exposes it. When it's found in an automated integration test, changes from multiple developers will have been merged. So it will require a bit more time to locate the offending code, and it may require effort from the other developers who are needed to eliminate the bug. Since integration tests should run regularly, the developers can still recall the changes that introduced the bug, and it's probably a question of an hour or less to fix the problem.
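Where a micro-test isolates one behavior, an integration test wires real components together and checks the whole interaction. The sketch below is a hypothetical example: a billing service persisting through a store, with an in-memory store standing in for a real database.

```python
# An automated integration test checks that components work
# together correctly; all the classes here are made-up examples.

class InMemoryOrderStore:
    """Stands in for a real database in this sketch."""
    def __init__(self):
        self._orders = {}

    def save(self, order_id, total):
        self._orders[order_id] = total

    def load(self, order_id):
        return self._orders[order_id]

class BillingService:
    """Computes an order total and persists it via the store."""
    def __init__(self, store):
        self._store = store

    def bill(self, order_id, items):
        total = sum(price for _name, price in items)
        self._store.save(order_id, total)
        return total

def test_billing_persists_total():
    # Wire the real service to a store and verify the round trip.
    store = InMemoryOrderStore()
    service = BillingService(store)
    service.bill("A-1", [("book", 12.50), ("pen", 2.50)])
    assert store.load("A-1") == 15.0

if __name__ == "__main__":
    test_billing_persists_total()
    print("integration test passed")
```

A failure here tells you the pieces no longer cooperate, even if every micro-test on the individual pieces still passes.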
Manual acceptance tests: When no automated tests expose the bug, we can only hope it will be exposed by a manual test. Usually these tests are run when a story is implemented or, worse, only just before a release. Weeks or even months may have passed, and it usually involves a bug-management system to track the bug. This causes additional overhead: a product owner will need to understand the impact and prioritize, a quality engineer will need to find a way to reproduce it, and developers will have to dig deep into their brains to get back to the change they made a while ago. At this stage, the total combined effort of everyone involved to eliminate such bugs runs from many hours to days.
Live bugs: The rest of your bugs will be released into the wild, exposing them to your end users. If users perceive the quality of your products to be low, this damages your brand image, and users are likely to consider alternative products. Fixing bugs in a live environment means either that fixes will be shipped in a next version (and your end users will experience the negative impact until then) or, if the severity is high, a patch release will need to go out, with all the additional overhead that brings.
It comes with the job?
Have you ever walked into a room where a team has been fixing bugs for a week or more? It's full of low-energy, frustrated developers either staring at their debuggers or figuring out what the heck a bug report is about. It's not exactly the part of their job they enjoy most, especially not when frantic product owners are hovering around, pushing them to get to a releasable build. "It comes with the job" is an often-heard phrase. No! "You haven't done your job properly," I say. That's the reason these bugs are in the system; they should have been prevented by using the tools that are readily available and that proper software craftsmanship demands.
The cost of bugs versus how long they exist in the system clearly shows that investing time in micro-testing and automated integration testing is worth your while. Investing a few hours now will save you endless frustrating hours spent debugging long-forgotten code changes and puzzling over bug reports. Plus it reduces risk.
But how do we start?
It's clear that it makes no business sense to start writing tests for the whole code base, since it would take months before you'd start adding value for your end users again. So how do teams, when they finally see the light, start using test automation? The same way the chicken crossed the road: one step at a time! If you didn't apply test practices from the start of your project, now is the time to get started. Use the ammunition above to convince the product owner (I hope you don't need to convince the ScrumMaster) and agree on changing the Definition of Done.
Additions to the Definition of Done
- Each new user story will need to be covered by micro-tests.
- Each new story will need to be covered by automated integration tests.
- New bugs that are discovered in later stages will need to be covered by micro-tests and automated integration tests.
There will be an initial investment in a build automation environment that automatically executes these tests, and setting up your first few tests will take some time. Set realistic goals. For example, try one user story first; the velocity of the sprint in which you start will drop, but not delivering any value at all would only fuel the nonbelievers. Don't let it discourage you; take baby steps and start enjoying the fun. What could developers love more than writing code? Writing code to prevent bugs so they can retire their debugging tools! Start to love the green bar when your micro-tests all pass!
Mind shift: Stop the line
As we inevitably introduce new bugs into the system, tests will start to fail, which is a good thing. Imagine what it would mean if these bugs were only uncovered at a later stage. In the old way, bugs would be added to a bug-management tool and would need clarification and review by a quality engineer and product owner before we would eventually start fixing them. There needs to be a mind shift by the developers: failing tests need to be fixed as soon as possible, right when they are found! Stop the line is a lean practice from the Toyota Production System, where the manufacturing line would be stopped when a defect was discovered to avoid producing defective products (waste). In software development, we need to stop producing more code and fix the failing tests first, or we will be piling more code onto a defective code base and increasing the time required to fix it later (waste). When teams use continuous integration tools but haven't had a successful build in days, it defeats the purpose of adding these tests. When each small change is validated, a single failing test only affects a few lines of code, and it will be a quick fix.
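The stop-the-line idea can be expressed as code, too. Here is a toy sketch (all test functions are hypothetical) of a build step that runs every test after each change and halts the pipeline on the first failure instead of letting broken builds pile up; a real team would delegate this to its continuous integration tool.

```python
# "Stop the line" as code: a toy build step that runs all tests
# and halts on the first failure. The tests are made-up examples.

def test_passes():
    assert 1 + 1 == 2

def test_also_passes():
    assert "ship".upper() == "SHIP"

def run_build(tests):
    """Run all tests; stop the line (return False) on any failure."""
    for test in tests:
        try:
            test()
        except AssertionError:
            print(f"STOP THE LINE: {test.__name__} failed; "
                  "fix it before adding more code")
            return False
    print("green build: safe to keep adding code")
    return True

if __name__ == "__main__":
    run_build([test_passes, test_also_passes])
```

The point of the sketch is the policy, not the mechanism: a red build blocks new work until it is green again.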
When you are just starting out with automated test practices, the number of bugs you prevent will be limited, since only a small part of the code will be "covered" by micro-tests and automated integration tests. So there will be bugs in the system that are only found in later stages. The responsible thing to do then is to first add a test that would have caught the bug, then fix the code; your test should now pass. This is an efficient way to identify new tests to add to the system, since the bug shows that your existing tests didn't catch it. If you keep following your new Definition of Done, the coverage will keep increasing, the automated builds you produce will be of ever higher quality, and you'll find yourself spending less time debugging and debating bug report details.
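The test-first bug fix described above can be sketched in a few lines. Everything here is a hypothetical example: a reported crash on empty input, a regression test written first (which fails against the old code), and then the fix that makes it pass.

```python
# Sketch of "first add a test that would have caught this bug,
# then fix the code". All names are made-up examples.

# Buggy original, which crashed on an empty list (the reported bug):
# def average(values):
#     return sum(values) / len(values)

def average(values):
    """Fixed version: an empty input yields 0.0 instead of crashing."""
    if not values:
        return 0.0
    return sum(values) / len(values)

def test_average_of_empty_list_is_zero():
    # Step 1: this regression test was written first; against the
    # old code it failed with ZeroDivisionError, reproducing the bug.
    assert average([]) == 0.0

def test_average_of_values():
    # Existing behavior must still hold after the fix.
    assert average([2, 4, 6]) == 4.0

if __name__ == "__main__":
    test_average_of_empty_list_is_zero()
    test_average_of_values()
    print("regression covered: all tests pass")
```

From now on this bug can never silently return: the regression test will stop the line the moment anyone reintroduces it.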
Since developers, quality engineers, and product owners will spend less time on bug fixing and bug management, more time will be available to add real value for your end users. Developers will write more code, quality engineers will have more time to focus on test scenarios and exploratory testing, and product owners will have more time to think about new features. Not to mention that you'll have happier customers and faster releases, so that you're outperforming the competition.
There are really no excuses to postpone test automation practices. Start now!