Technical background
We were developing a web application in Java, built with Maven. I created a separate module in the same source repository as the application code for the automated checks. It used Concordion to execute JUnit tests against the application in a browser via Selenium WebDriver. The suite ran from a Maven target, or individual tests could be run from the IDE. A test job in our Jenkins continuous integration server executed the tests regularly and produced an HTML report of the results using the HTML Publisher Plugin.
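For anyone unfamiliar with this stack, a fixture in the checks module might look something like the minimal sketch below. The class name, page URL and element IDs are hypothetical; it assumes a matching Concordion specification (e.g. AddFixture.html in the same package) that calls the fixture method and asserts on its return value.

```java
import org.concordion.integration.junit4.ConcordionRunner;
import org.junit.runner.RunWith;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

// Hypothetical fixture; Concordion pairs it with an instrumented
// specification in the same package and runs it as a JUnit test.
@RunWith(ConcordionRunner.class)
public class AddFixture {

    // Called from the specification. Drives the application in a real
    // browser and returns the on-screen result so Concordion can compare
    // it with the expected value written in the specification.
    public String add(String type, String number) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/sample-app/add"); // assumed URL
            driver.findElement(By.id("type")).sendKeys(type);
            driver.findElement(By.id("number")).sendKeys(number);
            driver.findElement(By.id("add")).click();
            return driver.findElement(By.id("result")).getText();
        } finally {
            driver.quit();
        }
    }
}
```

Binding the suite to a Maven phase is what lets Jenkins execute it on every run and publish the resulting HTML.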
What does it look like?
Below are the results for a Sample Application*. The job in Jenkins is green, and more information about this success can be found by hovering over the job and clicking through to the HTML report.
My past experience was that management engaged with the simple green / red feedback from Jenkins more than with any other type of test reporting. Rather than continuing to fight this, I decided to change what it could tell them. There will always be aspects of functionality where it does not make sense to add an automated check, bugs that fall beyond the reach of automation, and decisions about scope that cannot be easily captured by automation alone. I wanted to communicate that information in the only place that management were listening.
The report is designed to be accessible to a non-technical audience and includes a preamble to explain the report structure. The entry point is designed to provide a high-level visual overview of what testing has occurred, not just the results of automation. I found that the scrum master, product owner and project manager didn't drill further into the report than this. This is essentially a living test status report that does not contain any metrics.
Each feature was highlighted to reflect a high-level status of the checks executed within that space (that's why Sample Feature above is green). It linked to living documentation, as we used Specification by Example to define a set of examples in Gherkin format for each story.
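As I can't share the real specifications, here is an invented sketch of what one of those examples might look like; the feature name, scenario and values are purely illustrative:

```gherkin
Feature: Sample Feature

  Scenario: Add an item with a type and a number
    Given I am on the Add page of the Sample Application
    When I add an item with type "Widget" and number "42"
    Then the new item is displayed with type "Widget" and number "42"
```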
Although written in plain English with a business focus, these specifications were rarely accessed by management. Rather, they were used extensively by the business analysts and developers to review the behaviour of the application and the automated checks in place to verify it. The business analysts in particular would regularly provide unsolicited feedback on these examples, which is indicative of their engagement.
What do you think?
__
* This post was a while in coming because I cannot share my actual test reports. This is a doctored example to illustrate what was in place. Obviously...
Hi Katrina,
I really like this idea!
Could you tell me a little bit more about where the hierarchy in the mind-map comes from? For example, you have an "Add" node under "Sample Feature" with two child nodes "Type" and "Number". How do those correspond to the elements in your Gherkin document?
I presume you generated this report by hand, is that right? Or do you have a script that builds it automatically for you?
Hi Matt,
Thanks for your comment, I'm glad you like the idea.
The mind map is currently not generated automatically. The correlation between specifications and nodes in the map is created by hand. The idea is that the mind map shows not just the specifications at a higher level, but also information about testing beyond the specification. Though I think an auto-generated map would be pretty neat, it would remove some of the intent of what I was using the maps for.
That said, my colleague Nigel Charman is very keen to try and implement a dynamic mind map that is generated from the results of automation. But it's in a queue to work on, alongside many other things!
Thanks,
Katrina