In my team, some of the testers have been with the organisation for a long time and hold a great deal of institutional knowledge about the systems and how to test them. They are capable of finding bugs that others in the team are not, because they have years of experience in the domain and often know exactly where to look for problems.
We also have a number of new people in the testing team, myself included, who don't yet know the intricacies of the business. Though we have strong testing ability, we haven't yet built our personal catalogue of domain-specific test heuristics that would help us find the types of bugs that are unique to this organisation.
The demographics of the team offer a strong opportunity for transfer of domain knowledge. When I map out the number of people in the team against their years of experience, there's a skyscraper of new staff and a long tail of domain experts, as shown below:
Having experienced some cumbersome knowledge sharing initiatives, I wanted to create something lightweight to transfer information across the team: something that would be easy to use and would remain relevant because the team took ownership of regular updates.
I am a big fan of the format and content of Elisabeth Hendrickson's Test Heuristics Cheat Sheet as a means for capturing test heuristics. I decided that I would like to try to create a domain-specific test heuristics cheat sheet, using the same quick reference format, for a page in our wiki. Given my own lack of domain knowledge, and my desire for this to ultimately be owned by the team, I decided to kick this off by facilitating a workshop session for all our testers.
I scheduled one hour with the team. Prior to the session, I asked them to read my Heuristics and Oracles post, which includes simple definitions for each term and links to popular resources and articles.
We began the session with a few minutes of discussion so that those who did not have the opportunity to read the post could talk to the people around them about what it contained. This meant that everyone had some theoretical knowledge before we began, and it also gave me some time as the facilitator to finish arranging the resources required for our first exercise, the marshmallow challenge.
I decided to run a condensed version of the marshmallow challenge, which differed slightly from the version described on the official website. We split into four groups, with a time limit of 10 minutes, and each group had to create the tallest tower possible from the limited resources provided to them (bamboo skewers, tape, string and blu-tak). As is traditional, the tower had to support a single marshmallow on the top.
This was a fun activity to loosen people up at the start of the session. It also provided a non-technical opportunity to talk about how we think. Once the winner had been established, we ran a team brainstorming activity to identify the heuristics in use during the challenge.
Each person who contributed a heuristic to the list was given a marshmallow, as I had leftovers from the challenge activity. Perhaps due to this extra incentive, the group generated a great list of heuristics including:
- Start with a stable base
- Test early - don't wait to put the marshmallow on top
- Support the structure with your hands as it is being built
- Don't be scared to use the resources even though they are limited
- Blu-tak is the best connecting adhesive (better than tape or string)
At this point we were about halfway through the session. Everyone had a reasonable theoretical and practical understanding of heuristics in general. For the second half of the session, we switched to talking about test heuristics in our domain.
I ran a team brainstorming activity to generate test heuristics in eight categories. I had identified four functional areas that were common across our various front-end applications, then added the four different channels through which we deliver.
Each category had a large piece of paper, from a flip chart, with the words "How do I test this?" in the top left corner and the words "How do I know it's a bug?" in the bottom right. Alongside each piece of paper were some printed screenshots from the applications as a visual prompt for thinking.
I gave the team 20 minutes to collaboratively brainstorm the test heuristics and oracles in each category. As they ran out of ideas in one area, they were encouraged to switch location to add their thoughts in another.
In the final 10 minutes of the session we went around the room and whoever was sitting in front of a piece of paper read out what was written on it. Though this was a little dry after a very interactive workshop, it allowed everyone to hear what other people had identified, giving them some initial visibility of the type of information that our own domain-specific test heuristics cheat sheet would contain.
After the session, I took the sheets from the brainstorming exercise and transcribed them into our organisation wiki. I adopted the same format as Elisabeth, so our domain-specific test heuristics cheat sheet looks something like:
I had some great feedback from the workshop session. I feel confident that everyone in the test team now has some understanding of what heuristics are and why it is useful for us to identify the testing heuristics that we use in our domain. Time will tell whether the cheat sheet is truly useful; the real test will be in how it evolves from this starting point. I know that we've barely scratched the surface of capturing all the domain knowledge in the team, but I hope we've taken a step in the right direction.