Wednesday, 12 July 2017

Test Automation Canvas

Test automation frameworks grow incrementally, which means that their design and structure can change over time. As testers learn more about the product that they are testing and improve their automation skills, this learning is reflected in their code.

Recently I've been working with a group of eight testers who belong to four different agile teams that are all working on the same set of products. Though the testers regularly meet to share ideas, their test automation code had started to diverge. The individual testers had mostly been learning independently.

A manager from the team saw these differences emerging and was concerned that the automated test coverage was becoming inconsistent across the four teams. The differences they saw in testing made them question whether there were differences in the quality of delivery. They asked me to establish a common approach to automated test coverage by running a one-hour workshop.

I am external to the team and have a limited understanding of their context. From this position I did not want to change or challenge the existing approaches or ideas, particularly given the technical skills that I could see demonstrated by the testers themselves. I suspected that there were good reasons for what they were doing, but perhaps not enough communication about it.

I decided that a first step would be to create an activity that would get the testers talking to each other, gather information from these conversations, then summarise the results to share with the wider team.

To do this, I thought a bit about the attributes of a test automation framework. The primary reason that I had been engaged was to discuss test coverage. But coverage is a response to risk and constraints, so I wanted to know what those were too. I was curious about the mechanics of the suites: dependencies, test data, source control, and continuous integration. I had also heard varying reports about who was writing and reviewing automation in each team, so I wanted to talk about engagement and maintenance of code.

I settled on a list of nine key areas:

  1. RISKS - What potential problems does this suite mitigate? Why does it exist?
  2. COVERAGE - What does this suite do?
  3. CONSTRAINTS - What has prevented us from implementing this suite in an ideal way? What are our known workarounds?
  4. DEPENDENCIES - What systems or tools have to be functional for this suite to run successfully?
  5. DATA - Do we mock, query, or inject? How is test data managed? (These options are illustrated in the sketch after this list.)
  6. VERSIONING - Is there source control? What is the branching model for this suite?
  7. EXECUTION - Is the suite part of a pipeline? How often does it run? How long does it take? Is it stable?
  8. ENGAGEMENT - Who created the suite? Who contributes to it now? Who is not involved, but should be?
  9. MAINTAINABILITY - What is the code review process? What documentation exists?

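To make the DATA prompt (item 5) more concrete, here is a minimal Python sketch of the three options it names. Everything in it is hypothetical: the table, names, and values are invented for illustration and are not taken from any of the teams' suites.

```python
from unittest.mock import Mock
import sqlite3


def fresh_connection():
    # In-memory database standing in for a shared test environment.
    connection = sqlite3.connect(":memory:")
    connection.execute("CREATE TABLE customers (name TEXT, loyalty_years INTEGER)")
    connection.execute("INSERT INTO customers VALUES ('Grace', 3)")
    return connection


# MOCK: replace the real data source with a stand-in object,
# so the test never touches a database at all.
def test_with_mocked_repository():
    repository = Mock()
    repository.find_customer.return_value = {"name": "Ada", "loyalty_years": 5}
    assert repository.find_customer("ada-id")["loyalty_years"] == 5


# QUERY: read whatever data already exists in the environment,
# accepting that its content may change between runs.
def test_with_queried_data():
    row = fresh_connection().execute(
        "SELECT name FROM customers LIMIT 1"
    ).fetchone()
    assert row is not None


# INJECT: create the exact records the test needs as part of its own setup.
def test_with_injected_data():
    connection = fresh_connection()
    connection.execute("INSERT INTO customers VALUES ('Ada', 5)")
    row = connection.execute(
        "SELECT loyalty_years FROM customers WHERE name = 'Ada'"
    ).fetchone()
    assert row[0] == 5
```

The trade-off that each pair records under DATA is essentially which of these three styles their suite relies on, and why.
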
I decided to put these prompts into an A3 canvas format, similar to a lean canvas or an opportunity canvas. I thought that this format would create a balance between conversation and written record, as I wanted both to happen simultaneously.

Here is the blank Test Automation Canvas that I created:

A blank Test Automation Canvas

On the day of the workshop, the eight testers identified four separate automation suites under active development. They then self-selected into pairs, with each pair taking a blank canvas to complete.

It took approximately 20 minutes for each pair to discuss and record the information on their canvas. I asked them to complete the nine sections in the order that they are numbered in the earlier list: risks, coverage, constraints, dependencies, data, versioning, execution, engagement, and maintainability.

Examples of completed Test Automation Canvas

Then I asked the pairs to stick their completed canvases on the wall. We spent five minutes circling the room, silently reading the information that each pair had provided. As each pair had been thinking deeply about one specific suite, this time was for switching to thinking broadly across all of them.

In the last 15 minutes, we finished by visiting each canvas in turn as a group. At each canvas I asked two questions to prompt group discussion: is anything unclear, and is anything missing? This raised a few new ideas, and surfaced some misunderstandings between the different teams, so notes were added to the canvases.

After the workshop, I took the information from the canvases and created a single A3 summary of all four automation frameworks, plus the exploratory testing that is performed using a separate tool:

Example of Test Automation Summary

In the image above, each row is a different framework. The columns are rationale, coverage, dependencies, mechanics, and improvement opportunities. Within mechanics are versioning, review, pipeline, contributors, and data.
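
As a rough sketch of how that summary is structured, the snippet below models a single row as a Python data class. The field names mirror the columns just described; the example values are invented and do not describe any of the four real frameworks.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Mechanics:
    # The nested "mechanics" column of the summary.
    versioning: str
    review: str
    pipeline: str
    contributors: List[str]
    data: str


@dataclass
class FrameworkSummary:
    # One row of the A3 summary: a single automation framework.
    rationale: str
    coverage: List[str]
    dependencies: List[str]
    mechanics: Mechanics
    improvement_opportunities: List[str] = field(default_factory=list)


# Invented example row, purely to show the shape of the record.
api_suite = FrameworkSummary(
    rationale="Catch regressions in the public API before release",
    coverage=["authentication", "core transaction endpoints"],
    dependencies=["staging database", "third-party payment sandbox"],
    mechanics=Mechanics(
        versioning="git, trunk-based",
        review="pull request with one tester as reviewer",
        pipeline="runs nightly in CI",
        contributors=["two testers from one team"],
        data="injected via setup scripts",
    ),
    improvement_opportunities=["reduce run time", "add contract tests"],
)
```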

I shared this summary image in a group chat channel for the testers to give their feedback. This led to a number of small revisions and uncovered one final misunderstanding. Now I think that we have a reference point that clearly states the collective understanding of test automation among the testers. The next step is to share this information with the wider team.

I hope that having this information recorded in a simple way will create a consistent basis for future iterations of the frameworks. If the testers respect the underlying rationale of each suite and satisfy the high-level coverage categories, then slight differences in technical implementation are less likely to create the perception that there is a problem.

The summary should also support the testers in giving feedback during their code reviews. I hope that it provides a reference to aid constructive criticism of code that does not adhere to the statements that have been agreed. This should help keep the different teams on a similar path.

Finally, I hope that the summary improves visibility of the test automation frameworks for the developers, business people, and managers who work in these teams. I believe that the testers are doing some amazing work and hope that this reference will promote their efforts.

4 comments:

  1. I appreciate you respecting the teams, but bringing your own expertise and experience to bear on the problem. It seems very slick and useful. At the same time, not exactly the same as what was asked for.

  2. That is a really good approach, very informative.

  3. Creative adaptation of the business model canvas. I'd be interested in more details on the outcome of using it, if you decide to share that at some point.
