Tuesday 22 April 2014

Test Strategy Retrospective

Once an agile team is established and has started delivering working software, a test strategy retrospective can help to determine whether everyone in the team knows what the test strategy is, and agrees on which aspects of the strategy have been implemented. 

Why do a test strategy retrospective?

When people talk about testing in agile they often refer to cross-functional teams. By definition a cross-functional team is a group of people with different functional expertise working toward a common goal [1].

Many agile practitioners include in their understanding of the term the idea that any individual within the team is capable of completing any task. They speak of resources becoming T-shaped, with a depth of skill in their specialist area and a breadth across disciplines other than their own [2][3]. As a simplified example, a tester may have depth of skill in testing with a breadth of skill across business analysis and development.

Although there is usually a specialist tester in a cross-functional team, they are not the only person doing testing. Instead testing can be performed by anyone, which means that the quality of testing will vary depending on who is performing it.

Those from a development background may test that something works, often by creating an automated check, and consider testing complete. Those from the business are requirements-driven and may test only to confirm that their needs are met. Those who are not testers are generally less interested in thinking about the ways in which a function doesn't work or could be exploited, so testing becomes more confirmatory and less investigative.
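To make the confirmatory pattern concrete, here is a minimal sketch of the kind of automated check a developer might write and consider "testing complete". The `calculate_discount` function and its requirement are hypothetical, invented purely for illustration:

```python
# Hypothetical requirement: "orders of $100 or more get a 10% discount".
def calculate_discount(order_total):
    return order_total * 0.10 if order_total >= 100 else 0.0

# Confirmatory checks: they verify exactly what the requirement states,
# then stop. They ask nothing about negative totals, boundary values
# such as 99.99, or absurdly large inputs -- the investigative questions
# a specialist tester would raise.
def test_discount_applied_at_100():
    assert calculate_discount(100) == 10.0

def test_no_discount_below_100():
    assert calculate_discount(50) == 0.0
```

The checks pass and the requirement is "covered", yet nothing here probes how the function might fail.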

It's apparent to a tester that a shift towards confirmation of requirements comes at the expense of other types of thinking. When faced with this eroding test coverage the specialist tester has two options: alliance or surrender.

By alliance, the specialist tester implements practices that ensure critical thinking and interrogation of the application retain their place. They may institute peer review of the testing performed by non-specialist testers. They may adopt pair testing as a means of complementing the thinking of their colleagues.

By surrender, the specialist tester adopts the belief that testing is confirming that the requirements have been met. They may support automated checks as the primary means of testing an application. They may advocate for a minimum viable product, where the quality of the application is "good enough" for market and nothing more [4].

In either scenario, alliance or surrender, the specialist tester is making a conscious decision to alter the test strategy of the team. They are actively thinking about the trade-off in adopting one practice over another, the implications to test coverage and the impact on the overall quality of the product. But they are often thinking and deciding as an individual.

In a cross-functional team the performance of testing is considered open to all, yet strategic thought about testing is often not. This means that testers, in the loosest application of the word, may be adopting a practice without understanding why.

You may argue that the specialist tester is the only person in a cross-functional team with the ability to create a test strategy, given that testing is the area in which they have a depth of skill. I don't disagree, but counter that the method by which a strategy is decided and shared is important. A tester who fails to make their strategic decisions visible is adopting a high level of risk: taking ownership of choices that may not be theirs to make. And the benefits of a strategy are limited when the tester fails to communicate it to the team so that it is understood and widely adopted.

So, how can a tester determine whether their cross-functional team understands the test strategy that is in place and the decisions that underpin it? By leading a test strategy retrospective.

Creating a visualisation

A test strategy retrospective is designed to be engaging and interactive: to get people who are not testers to think about what types of testing are happening and why. It should take approximately one hour.

The specialist tester should lead this retrospective but not participate in creating the visualisation. This prevents the team from being led by the opinion of the tester, and ensures that others engage their brains.

To run a test strategy retrospective you will need post-it notes in four different colours and a large surface to apply them to. A large boardroom table is ideal, as it allows the team to gather around all four sides. A large, empty wall is a good second choice.

Start the retrospective by clearly stating the purpose to those gathered: to visualise your test strategy and to check that there is shared understanding in the team about what testing is happening.

Take two post-it notes, one labelled IDEA and the other labelled PRODUCTION. Place these at the top left corner and top right corner of the surface, creating a timeline that reflects the process of software development from an idea to a deployed application.

Within this timeline, different types of test activities can occur. Some of these activities will be part of the test strategy, and some will not. Ask each team member to think about the test activities that are happening in the project, and those that should be.

Allocate five minutes for each person to write a set of post-it notes, each naming one test activity. The colour of the post-it note shows whether or not the activity is part of the test strategy and, if so, whether it is being implemented.

In this example, purple, pink and yellow post-it notes are used to mean:

Purple - a test activity that is part of the test strategy and is being implemented
Pink - a test activity that is part of the test strategy but is not being implemented
Yellow - a test activity that is not part of the test strategy

Each individual should stick their post-it notes on to the timeline at the point they think the test activity will occur. At the end of the five minutes there should be a haphazard display of a large number of post-it notes.

Ask the team to collaboratively group activities with the same name, and agree on the placement of activities within the timeline. Where different names have been used to refer to the same concept, keep these items separate. Once the team are happy with their visualisation, or the conversation starts to circle, call a halt.

An example of a test strategy retrospective visualisation is below.

Leading a discussion on strategy

If you've reached this point of the retrospective by killing a circular thread of conversation then that may be the first place to start a discussion. But there are a number of other questions to ask of this visualisation.

Are there groupings that include different coloured post-it notes? Why?

Have people used different terminology to refer to the same type of test activity? Why?

Why are there activities that are in the test strategy that aren't being implemented?

What are the activities that aren't in the strategy and should be? Do we want to include these?

Are there any activities that are missing from the visualisation altogether? What are they?

These questions not only uncover misunderstanding about the current state of testing, but they also surface the decisions that have been made in determining the strategy that is in place. The visualisation is a product of the whole team and they are invested in it, creating a catalyst for a deep discussion.
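For teams that want a digital record of the board after the retrospective, the post-it notes can be captured as simple (activity, colour) pairs and the first discussion question checked mechanically. This is a minimal sketch under that assumption; the activity names and notes are illustrative only:

```python
from collections import defaultdict

# Each post-it note captured as an (activity name, colour) pair.
# Illustrative data only, standing in for a photographed board.
notes = [
    ("unit checks", "purple"),
    ("unit checks", "purple"),
    ("exploratory testing", "yellow"),
    ("exploratory testing", "pink"),
    ("usability testing", "yellow"),
]

# Collect the set of colours used for each named activity.
groups = defaultdict(set)
for activity, colour in notes:
    groups[activity].add(colour)

# Groupings containing more than one colour signal disagreement in the
# team about the status of that activity -- a place to start discussion.
mixed = sorted(a for a, colours in groups.items() if len(colours) > 1)
print(mixed)  # ['exploratory testing']
```

The point is not the tooling; it is that mixed-colour groupings are exactly the disagreements the retrospective exists to surface.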

For example, the team above are in a surrender state; the test activities that are shown in purple are largely for automated checking. This illustrates that testing is primarily confirmatory, with tools verifying that the requirements have been met. Yet, judging by the number of yellow post-it notes on the right hand side of the timeline, a number of people in the team feel there should be more investigative testing. Who made the decision to focus on automation? It appears that this choice has not been widely publicised and agreed by the team as a whole. The retrospective offers an opportunity to discuss it.

In a cross-functional team where anyone can perform testing, it is important for there to be a shared understanding of both the practical approach to testing tasks and the underlying test strategy. By creating agreement about the type of test activities being performed, any person who picks up a testing task understands the wider context in which they operate. This helps them to make decisions about the boundaries of their task; what they should cover and what sits within another activity.


  1. Great article! I like how this makes the test approach transparent to the team. I think it would be interesting to see what each team member thinks each term means as well.

    It would be interesting to approach it from a different perspective though - instead of asking "what types of testing would we like to have", instead ask "what do we want to know about our product and when do we ideally want to find out the answers?" And then, "what can we do to achieve that?". That way we can expand the discussion to consider techniques outside of standard testing techniques, and really focus on what's right for this project.

  2. Great article Katrina!

    Does the visualisation become your living test strategy, or do you update a document based on the outcome?

    Do you have plans to extend the retrospective once the team have the shared understanding? For example, to review the types of testing that were performed for different features, to discuss the circumstances that different types of testing are appropriate for, and to review what testing might have been missed when bugs escape the test activities?

  3. Katrina, I absolutely love your approach! I have done a similar activity with my teams in the past but I was leading the meeting and while asking the team for feedback, I didn't leave it to the team to fully define the activities. The next time I do it I will certainly follow your system.

  4. Katrina, excellent insights on T-specialists, their specialisation and their viewpoints! Indeed elegant.
    Further, the "test strategy retrospective" is a brilliant, innovative and creative concept wherein the entire agile team will be aware of testing and will be able to visualise it precisely. Thank you for the insights!
    With regards, Arul Varman