I often get asked my views on testing agility: is it achievable? Can it keep pace with development? How flexible is the approach to change? The list goes on.
In truth, the level of agility within the test team needs to be driven by the level of agility in the organisation and the programme team. If your BA team is fixed in a traditional, war-and-peace specification mode, the chance of your test team being ‘agile’ is diminished: there will always need to be a quality bar, and always a sign-off against the requirements, whatever format they take.
Testing agility is no different from organisational agility: if there is resistance to a change in mindset, then perceived failure of lean principles won’t be far behind. To achieve testing agility, the team needs a voice and needs to be included in every aspect of the decision-making process. How can development estimates be sized correctly unless all aspects of acceptance are understood and planned? If your planning sessions only look at development tasks and dependencies, it will not take long for debt to build up over a number of sprints.
I’m not saying that including testing will remove all debt, far from it, but promoting collaboration from the start helps the team understand the objective of the sprint and work towards a common goal. In addition, when the task in hand is understood by everyone, a more efficient implementation is often found, one that can only be reached if the goal is shared by all.
In terms of agility, I am also frequently asked about automation levels and what the target should be. The simple answer is ‘as much as possible’, but realistically any approach to automation, whether carried out by your developers or testers, needs to be appropriate and must not become an overhead. The automation implemented needs to serve the team in maintaining quality; it should not be treated as a checklist item.
Personally, I like to see evidence from the development team that their unit test coverage is in line with the acceptance criteria they are working to. This could be a test-driven implementation, or a verification piece written alongside or after the code, before check-in. If this is integrated into a CI environment, even better, but again, I prefer solutions that work for the team and improve efficiency; if unit tests are run in isolation and that works for the team, then that is what matters most.
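To make that traceability concrete, here is a minimal sketch of what unit tests aligned to acceptance criteria might look like. The criterion, the function and all names are hypothetical, invented for illustration; the point is that each test names the behaviour the story promised, not an implementation detail.

```python
# Hypothetical acceptance criterion (assumed for this sketch):
# "Orders of 10 or more items receive a 5% discount."

def order_total(unit_price: float, quantity: int) -> float:
    """Hypothetical implementation under test."""
    total = unit_price * quantity
    if quantity >= 10:
        total *= 0.95  # 5% bulk discount, per the criterion above
    return round(total, 2)

# Each test maps directly to a clause of the acceptance criterion,
# so coverage can be traced back to the story at review time.
def test_no_discount_below_threshold():
    assert order_total(2.00, 9) == 18.00

def test_discount_applied_at_threshold():
    assert order_total(2.00, 10) == 19.00  # 20.00 * 0.95

test_no_discount_below_threshold()
test_discount_applied_at_threshold()
```

Whether these run under a test-first workflow or as a pre-check-in verification step, the mapping from test name to criterion is what gives the team its evidence.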
In terms of functional automation, over the years I have become less of a fan of front-end automation as part of sprint coverage, preferring to test the APIs and messaging layer instead. Again, this approach is designed to give the team the highest return. Advances in these tool sets mean the returns can be even higher if you can reuse the same tests to drive sprint-level load and security testing against the messages.
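A message-layer test can be as simple as checking each payload against the agreed contract, with no browser or UI in the loop. The sketch below uses an invented contract and payloads (no real service is called); the same contract definition could later seed generated inputs for load or security runs.

```python
# Hypothetical message contract: required fields and their types.
REQUIRED_FIELDS = {"order_id": str, "amount": float, "currency": str}

def validate_message(payload: dict) -> list:
    """Return a list of contract violations; an empty list means valid."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors

good = {"order_id": "A-100", "amount": 42.50, "currency": "GBP"}
bad = {"order_id": "A-101", "amount": "42.50"}  # wrong type, missing field

assert validate_message(good) == []
assert validate_message(bad) == ["wrong type for amount",
                                 "missing field: currency"]
```

Because the check operates on plain payloads, it runs in milliseconds per message, which is exactly what makes it reusable under load.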
Front-end automation is still important, especially in the multi-channel, digital age we live in, but any approach needs to be weighed in terms of maintainability and the overhead cost to the team. If that balance can be struck, then coupling unit, API and front-end automation together at runtime gives a powerful, repeatable set of tests that provides instant feedback on quality; any regression introduced is found on a build-by-build basis, reducing the overhead and debt on the team.
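The coupling itself can be thought of as a single build-time gate over all three layers. The toy runner below is a sketch only, with trivial stand-in checks; in practice each suite would be a real pytest or CI stage, but the shape of the feedback, pass counts and named failures per layer, every build, is the point.

```python
# Sketch of a build gate that runs all three automation layers together.
def run_suite(name, tests):
    """Run each check; report (layer, passes, names of failures)."""
    failures = [t.__name__ for t in tests if not t()]
    return name, len(tests) - len(failures), failures

# Trivial stand-in checks, one per layer (hypothetical).
def unit_check(): return 1 + 1 == 2
def api_check(): return {"status": "ok"}["status"] == "ok"
def ui_smoke_check(): return "<button>" in "<form><button></form>"

results = [run_suite("unit", [unit_check]),
           run_suite("api", [api_check]),
           run_suite("ui", [ui_smoke_check])]

# The build is green only if every layer is clean, so a regression
# in any layer is surfaced on the very build that introduced it.
build_green = all(not failures for _, _, failures in results)
assert build_green
```

A failed check turns the build red with the layer and test name attached, which is the build-by-build feedback described above.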
What this means for the tester is that there is now more room to use exploratory techniques to add further quality to the deliverable and to sanity-check the acceptance criteria before product owner walkthroughs. The experience gained from that exploration should feed future planning sessions: it enhances the team’s understanding of the current behaviour of the application, helping to ensure the implementation remains relevant to the business need and fully usable in the working environment.
Capturing visible performance degradation may prompt the next area for refactoring, or lead to a simplification of the solution design that meets usability targets and enhances the user experience. A tester who gains agility through experience and applies lean principles will not only feed the development pipeline, but also help maintain a relevant automation catalogue that provides that instant feedback.
The tester must feel empowered to engineer change, and to adapt their mindset to agile and lean processes for the good of themselves and the team.