09 July 2014

Software Testing World Cup, Test report

Credit
This is based on my team's effort in the European qualifying round for STWC (Software Testing World Cup). The team consisted of Agnetha Bennstam, Sofia Brusberg and me. Everything mentioned in this post is my interpretation of our collective work.

What is the Software Testing World Cup?
STWC is exactly what it sounds like: Testers from all over the world competing in software testing. You can read about it on the STWC web site but here's the bare minimum you need to understand this post:

To get to the finals you have to win a qualifying round. Typically there's one qualifying round per continent, with a maximum of 250 teams per round and 1-4 testers per team. A round is 3 hours long, during which the contestants test an application revealed just before the round starts. Throughout the competition, teams can ask the application's product owner questions, which are answered in a live video channel available to everyone. To get a better idea of how this works, Matt Heusser has shared recordings of the video feeds. At the end, teams are scored mainly based on a test report and the bugs they've filed.

Intro
To make any sense of this post you first need to check out our test report from STWC (in case you're worried: I've checked with the STWC judges that it's okay to share our report).

My hope is that this report can spark some new ideas for other testers and, despite being written under severe time pressure, serve as an example of what a test report can look like.

Overview
Our test report is made up of six sections: Coverage, Bugs, Other risks, Suggestions, Conclusions and Strategy. Generally I think the report turned out pretty well; it's concise (not counting the coverage section), covers what the stakeholder asked for (including improvements) and communicates what we discovered. With that said, there is definitely room for improvement...

Coverage
"What did we actually test?"

The reason for this section is that bugs, risks and suggestions don't say much about what we actually tested, just where we happened to find notable things. Coverage is there to ensure further testing is focused on risky and/or not yet tested areas.

Given the limited time, I think we did a decent job with the coverage section. Some of the categories are too broad and/or ambiguous (e.g. Search), making it easy to assume testing was performed when it actually wasn't. I also think we should have communicated the coverage in a more concise way.

With more time to spend and a better understanding of the product, I would have liked to add a visual map of the coverage as an overview. Right now I think it's bits and pieces, missing the big picture.

Another way to improve this section would be to give a quick estimate of how confident we are in each respective area. In our report there's no difference between "we made a quick test in this area" and "we spent a lot of time and effort testing this area", which hurts the section's usefulness.

Bugs
"How bad is it? Any reason to fear the product will explode in the customers' face?"

Lesson learned: add a more business-friendly description to bug reports and describe the business impact when actually reporting the bug! This time we didn't, and thus had to quickly translate bug descriptions while writing the report. The result is unfortunately not very impressive. But I did learn something and, based on what we discovered, I think the right bugs are listed.

Other risks
"Apart from severe bugs, what else did we observe/not observe that might affect release readiness?"

I like what we added to other risks, possibly with the exception of the potential availability issue described. I think that one was added because one of us pointed it out during the stressful last minutes, so we included it without really questioning whether it was relevant. Compared to several reported bugs that weren't mentioned, and based on our interpretation of the target audience, it should probably have been filed as a minor (or rather potential) bug instead of being part of the test report. But once again, overall I think this section turned out nicely.

Suggestions
"Potential future improvements we discovered"

One of my favorite sections in the report. It could have been somewhat more detailed, but it's to the point, relevant and what the stakeholder specifically asked for. I tend to always note down suggestions for improvements, but adding them to the test report was (I think) a first for me. I will definitely consider adding a suggestions/improvements part to future test reports!

Conclusion
"Taking risks into consideration, what's our recommendation going forward?"

From a formatting perspective it's bad to let the page break in the middle of a section. Also, I love the "edit Account editing" (time to proofread the report? Obviously not enough). But, looking at the conclusion, I still find it relevant and correct even with some time to reflect. Another thing I like is that it doesn't present the stakeholder with only one option; instead, it embraces the fact that we (testers) don't know the market and thus provides information relevant both for a "regular case" and a "we must release now" case.

Strategy
"What resources did we have at our disposal, what limitations did we have to adjust to and what ideas guided our testing?"

Since this report was given to an external customer, we figured a rough plan might help their internal test team, even though it's of course highly context dependent.

If you compare the time plan with my previous blog post, you can see we didn't use the last hour to update it. I think it's close enough since we didn't diverge too much, and updating the image would not have been a good way of spending our last precious minutes. However, a quick note on how we diverged from the plan would, I think, have been helpful information. Also, we wrote it in the form "we did", not "we plan to", which is simply wrong. Apart from that, nothing spectacular, but hopefully somewhat meaningful.

Coloring
The header colors are there to make the report easier to overview. Apart from the red one for bugs and other risks, they aren't very self-explanatory, but I do think they help readers find the information they're looking for, especially when looking at the report a second or third time. One thing to improve is to make the conclusion stand out more, as I imagine a stressed reader would like to find it immediately. A different choice of colors, or way of using them, might be one approach.

Overall
I think we got the overall layout and categories right, most of the content was relevant and, after all, we only had 3 hours to finish planning, testing and reporting! For a 3-hour competition I think it's a well-written report despite its shortcomings, which I hope I've highlighted well enough in this post.

To all the other STWC contestants: I would love to hear what you did differently in your reports and how that turned out; I'm eager to learn from it!

Finally: Thank you Agnetha and Sofia!

Lessons

  • Colors are great for adding visual structure and making the report easier to skim/overview
  • Proofread more than once; sloppy linguistic and grammatical errors hurt the report's credibility
  • Think about the actual business impact/description already when writing bug reports
  • It's hard to describe, especially in a concise way, what was tested and what was not
  • Don't write things beforehand, guessing how something will turn out; if you do, describe it as a plan
  • Improvements (what we called "suggestions") can definitely have a place in a test report
  • Don't save the report until the end; try to create as much as possible in parallel with the actual testing and planning

7 comments:

  1. Anonymous09 July, 2014

    Thanks for taking the time to share this, as well as your reasoning behind how/why you did what you did. I think this is a great example of an alternative to some of the exhaustive documentation testers tend to produce.

  2. Anonymous10 July, 2014

    Nice article. Did your group qualify?

    Replies
    1. Thanks!

      I don't know yet; the result won't be made public until late July. To be honest, I don't think we will qualify. The reason is we were simply too inefficient at finding bugs, especially in the beginning... But we'll see.

  3. Liked your blog. Thanks for sharing your valuable experience...

  4. Hey, nice post :) How did you get your post added to the STWC website? I also wrote a blog about my experience in STWC ASIA 2016 and published it on my site:
    http://highonblog.com/stwc-software-testing-world-cup-asia2016-the-beginning/

    Please guide me on how to add this blog to the STWC site

    Replies
    1. Nice!

      ... I think I did absolutely nothing, the organizers added it for me. That might be because I tagged it correctly when I shared it via Twitter or something.

      My advice though would be to just contact Matt Heusser (or any of the other organizers) and ask if they'd like to add it to their page.
