Agile test automation: How far should we go?
In my experience, test automation is one of the key players in Agile development, because the process is about rapid development and rapid feedback. There is a small window of opportunity to validate your vision and then adapt it as necessary, before it's too late.
When the product owner (PO) thinks about stories, it makes sense for him or her to communicate the big picture to the team, which enables the team to start thinking about two important parts of the process:
- How fast can we develop?
- How quickly can we test?
Most mature teams do this, and so have we. I used to write my blog posts based solely on my own experience and understanding, but I have since had mentors like Michael Küsters, owner of the blog Fail Fast, Move On. This post, then, is based on my opinion and my understanding of automated testing.
Being on an Agile team means that we are always addressing these questions: How far should we automate? What should not be included? How quickly can we develop, test, and deliver?
These questions are always important to answer before diving deep into writing code. One needs to think about them and look for all the opportunities to deliver quickly but with reasonable quality.
I would like to share my own experience with test automation. How did we answer those questions? First, I'll explain the problems that can occur when teams ignore these questions, then I will describe our own solutions.
Problems with having an unclear definition of "automation"
If a team does not know how much automation it needs, it is probably in a never-ending loop of overhead. Different schools of thought will argue that we need to automate everything, but it varies from product to product, and it depends on which phase of development you are in. By development phase, I mean how mature your product is at that particular moment. I will not discuss this aspect in this article, as my aim is to show, through example, how we tackled those questions.
As a new team, we had a very small window of opportunity, so we needed to deliver as fast as possible. Delivering fast does not always mean "quick and dirty," so we decided from the start that we would automate the core functionalities. As we moved on, the team felt that we were spending more time maintaining automated tests than delivering new functionality. One of the beauties of sprints is that the team quickly realizes what worked in its favor and what didn't.
Soon we raised the question in our retrospective: Why were we spending so much time maintaining automated tests rather than delivering value to the customer? After that, we did not waste much time identifying the core functionalities we should automate and deciding who should handle them.
Since the PO was the one who was writing stories and working closely with stakeholders, it was better that he answered the questions raised during the discussion:
- What are the core functionalities? Do we need to redefine them?
- Do we need to automate every new feature?
- How do we maintain balance between exploratory manual testing and automated testing, as both are pretty important?
We are fortunate in our PO. He is knowledgeable and is himself a Scrum trainer, so as a PO he knows exactly what is important and which actions should be taken first. Together with the team, the PO quickly redefined the core functionalities, and the team realized that those were already covered by existing tests. The team agreed that if a new feature had a high impact on existing user journeys, we would add a new automated test for it or update an existing one. Any feature without a high impact on the user journeys would be tested with manual exploratory techniques.
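The automated tests we agreed to keep for core user journeys can be as simple as an ordinary scripted check. Below is a minimal Python sketch of what such a journey test might look like; the `Cart` class, SKUs, and assertions are invented for illustration and are not our actual product code:

```python
# Hypothetical example: one "core user journey" (add items to a cart)
# covered by a small automated test. All names here are illustrative.

class Cart:
    """A toy in-memory cart standing in for a real application under test."""

    def __init__(self):
        self.items = {}

    def add(self, sku, qty=1):
        if qty < 1:
            raise ValueError("quantity must be positive")
        self.items[sku] = self.items.get(sku, 0) + qty

    def total_items(self):
        return sum(self.items.values())


def test_add_to_cart_journey():
    # Core journey: a user adds two products and sees the right item count.
    cart = Cart()
    cart.add("SKU-1")
    cart.add("SKU-2", qty=2)
    assert cart.total_items() == 3


test_add_to_cart_journey()
print("core journey test passed")
```

In a real product this journey would more likely drive the UI or an API, but the shape is the same: one test per high-impact journey, kept small so maintenance stays cheap.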
Maintaining a balance between exploratory tests and automated tests matters. In our case, the PO gave the self-organized team the freedom to decide which technique to use and when.
Because testing is often seen as a bottleneck during software development, it's better to discuss the related issues up front than to ignore them. It can be challenging to deliver fast with a sufficient amount of testing, but doing so gives the team the confidence to deliver. As an Agile team member who spends most of his time implementing techniques to improve the process for the betterment of the product, I have realized that testing is like development: You develop the tests incrementally along with the product. As the product increments, so does the testing.
I have seen teams fail because they invested their energy in areas where it did not make sense. Seen from a user's perspective, it becomes clear that users do not care what your testing looks like or how much technical debt you have; they care only about the value you offer them. If your application enables users to achieve their goals or solve their business problems, you are already halfway there.
The author thanks Michael Küsters, whose advice for this article was invaluable.
- Note that not all tests are automated, nor can they be; always accompany automated tests with exploratory testing.
- Use a risk-based approach to determine which tests to automate. To get the most value out of automation, automate only the most important business cases and scenarios.
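One way to make that risk-based approach concrete is to give each scenario a rough score and automate only those above a team-chosen threshold. The Python sketch below illustrates the idea; the scenarios, the impact-times-change-frequency scoring, and the threshold are all made-up examples, not a standard formula:

```python
# Illustrative risk-based triage: score each scenario by business impact
# and how often the feature changes, then automate only the riskiest ones.
# The data, scoring rule, and threshold are invented for this example.

scenarios = [
    {"name": "checkout",       "impact": 5, "change_freq": 4},
    {"name": "login",          "impact": 5, "change_freq": 2},
    {"name": "profile avatar", "impact": 1, "change_freq": 1},
]

def risk_score(scenario):
    """Rough risk: business impact weighted by how often the feature changes."""
    return scenario["impact"] * scenario["change_freq"]

# Automate scenarios above a team-chosen threshold; everything else
# stays in the exploratory-testing bucket.
THRESHOLD = 8
to_automate = [
    s["name"]
    for s in sorted(scenarios, key=risk_score, reverse=True)
    if risk_score(s) >= THRESHOLD
]
print(to_automate)  # ['checkout', 'login']
```

The exact weights matter less than the habit: revisit the scores each sprint, as the PO and team did when they redefined the core functionalities.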