How I use the questions
I use these questions as a quick way to form a test plan. The plan might be anything from a high-level, early plan for an upcoming test project to something as small as the strategy for a specific test charter. The way I practically use the list is straightforward: I run through the questions and strike out the ones that are not relevant. "Not relevant" refers to questions like "any particular tours we should perform" when creating a high-level plan, "what are the key test/release milestones" for a plan covering a 90-minute test charter, or "how do we report bugs and status" if this is already stated by our overall process.
The questions not struck out I try to answer, either by myself or in a meeting with the people involved. "Involved" in this case might mean anything from a second tester to everyone in the project. The outcome is typically one of:
- I/we lack the necessary information or competence and need help answering this question
- I/we have an idea but I/we still want help/input from someone else
- I/we think I/we know the answer well enough but still want a second opinion
- I/we think I/we know the answer well enough
A few additional notes:
- The list should be adapted to fit your context; add, remove and modify questions accordingly (if you, for instance, always know which bug tracker to use, as most do, remove that question completely).
- You always have to take time and value into consideration. For a minor issue with low priority it's probably not worth inviting everyone in the project, as the cost of doing so is too high; your own judgement is cheaper and probably enough.
- The list can help e.g. project managers, line managers and developers think about testing concerns, so do spread (your modified version of) the list to these people.
- You can split the list into several smaller lists, e.g. one short list of questions to use when creating test charters and a longer/different one for project test plans (the sketch after these notes shows one way to do this).
- It might be wise to think about what's important for a test plan before you even look at the questions. Key details that definitely should be in your test plan might not be highlighted by any of the questions, and if you start by using the list you might miss them (your thinking is limited by the questions). Once again I think value is important: how much time and effort seems reasonable to spend on this particular test plan?
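If you keep the list as data rather than prose, both the striking out and the splitting into smaller lists become trivial filtering. Below is a minimal sketch in Python; the tags, contexts and the handful of questions shown are just illustrative examples, not a prescribed structure.

```python
# Minimal sketch: keep the master list as tagged data, then filter it
# per context instead of maintaining several separate lists.
# The tags and questions below are examples only.

QUESTIONS = [
    ("How much time do we have?", {"charter", "project"}),
    ("What are the key test/release milestones?", {"project"}),
    ("Any particular tours we should perform?", {"charter"}),
    ("How do we report bugs and status? To whom? Why?", {"project"}),
]

def questions_for(context):
    """Return the questions relevant to the given context ("strike out" the rest)."""
    return [text for text, tags in QUESTIONS if context in tags]

for question in questions_for("charter"):
    print("-", question)
```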
Credit
I created these questions together with my students, so credit should go to all the students involved. I also discovered that testers before me had created their own lists, and I would not have thought about some of the questions below if it weren't for their efforts.
So, before I present "my" version I want to highlight and thank the students as well as Michael Larsen, Ministry of Testing, The Test Eye and Michael Bolton. The four lists I just linked to all come with their own twist/benefit, so do check them out.
Questions
Notice that some questions are generally only applicable when consulting, some only in an agile context, others when testing and development are clearly separated, etc. As already pointed out: do remove or modify questions not relevant to your context.
Initial
I typically want to answer these questions before I go on with the rest, because these five typically impact my approach to the strategy in general.
- How much time do we have?
- Do we need to document the strategy? Why? How will we document it?
- What’s the primary objective (testing mission)?
- Are there any other objectives?
- Where can we get the initial information about the product/feature/system we need?
Planning
- What resources do I have available?
people, money, equipment, licenses, time, expertise, facilities...
- Is this a realistic amount of resources for this project?
- How flexible is the time plan and what happens if the product is not “ready” by the deadline?
- What project risks exist?
e.g. customers not clear about what they want.
- What is our backup plan if risk X happens?
- What is our backup plan in general if everything fails?
- What is likely to change?
e.g. team setup, requirements, developers' focus, delivery dates…
- Any meetings we should attend/discussions we should be part of?
- How do we handle handovers/new members/lost members?
- Who does what?
e.g. responsibilities and roles
- Any known problems along the road?
- Are there any workarounds or solutions to the known problems?
- Any regulations, rules, standards, certifications etc. limiting us or forcing us to work/act in a specific way?
- What administrative tools are (almost) enforced and what else do we need/benefit from?
- How do we plan the everyday work?
- What are the key test/release milestones?
- How flexible is the scope - can the project be down-scaled if some unexpected problem happens?
Prioritization
- What is most important (to test)?
- What is not important (to test)?
- What can be skipped altogether?
- What quality characteristics are most/least important?
- Any specific aspect of these characteristics that is more/less important?
- What is covered by other teams?
- How do we continuously verify we’re doing the right thing?
- What are our done criteria?
e.g. a strict deadline, customer acceptance tests or some other assessment of "good enough quality", and if so, by whom?
- What's the general requirement for quality?
Aim for the stars/critical system or “just don’t explode... too badly”
Information
- Where can I get information about X and who do I contact if that's not enough?
- Which claims exist?
- Which customers exist, can we contact them?
- Who can answer questions and which questions?
- What is still unknown/not clear about the project?
- How do we simplify the complexity?
Simplify the process of learning the product. An example might be "what sort of visual models would it help to create?".
- Any particular tours we should perform?
- Are there any general guidelines to how we deal with learning activities and knowledge sharing?
- How do we further inspire/reassure/encourage feedback, reviews and knowledge sharing?
- How do we stay up to date with what is happening in the project?
- How do we communicate with various information owners?
e.g. email (address), phone (number), instant messaging tool, via other person, meeting etc.
Support
- What kind of built-in testability will help testers? (see the sketch after this list for one concrete example)
- Which modifications can/must be done to the process in general to support the testing?
- What do we need to learn more about?
- Any particular configuration or test data we can prepare?
- Which tools can help us?
- What other teams should/can/must we cooperate with? When, how and why?
- Do I know who the developers are and can I talk to them?
- Do the developers have time allotted for us?
- Are there any problems getting their time/getting in touch with the developers?
- Will they keep working on this?
- What will the developers test? How does this impact our testing? Can we make suggestions regarding their testing?
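As a concrete example of built-in testability: letting testers inject a controllable dependency, here a fake clock, makes time-dependent behaviour deterministic and observable. The Session class below is invented purely for illustration.

```python
import time

class Session:
    """A session that takes its clock as a parameter (hypothetical example).

    Reading time through an injected clock instead of calling time.monotonic()
    directly is one form of built-in testability.
    """
    def __init__(self, timeout_seconds, clock=time.monotonic):
        self._clock = clock
        self._timeout = timeout_seconds
        self._started = clock()

    def expired(self):
        return self._clock() - self._started > self._timeout

# In a test, replace the real clock with a fake one the tester controls:
class FakeClock:
    def __init__(self):
        self.now = 0.0
    def __call__(self):
        return self.now

fake = FakeClock()
session = Session(timeout_seconds=30, clock=fake)
assert not session.expired()
fake.now = 31.0  # jump forward in "time" without actually waiting
assert session.expired()
```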
Testing
- How do I tell right from wrong?
Find potential oracles.
- Which oracles are generally most trustworthy?
- What testing risks exist?
e.g. unstable test environment or lack of knowledge about something.
- Which test techniques might be useful?
- What expertise do we lack?
- Which scenarios/typical use cases exist?
- Which heuristics might be useful?
- What logical components/modules are there?
- Is there anything I'm not allowed to do?
- Any (testing) tips the developers can give to the testers?
Product
- Which product risks exist?
e.g. complex algorithms likely to be buggy or new technology used.
- Is there any complexity we might be missing?
- Which functions will the system/application/feature have?
- Who’s the target audience?
- Which platforms, systems etc. should the product support?
- What requirements exist for this product?
- What problem is the product expected to solve? For whom?
- What problems have happened in the past with this product?
- Any existing functionality that is impacted?
- What must the product never do?
e.g. any data sent as plain text is strictly forbidden
Reporting
- What do I need to cover and how well?
- How do we track and visualize coverage and progress?
- Which stakeholders exist?
- How do we report bugs and status? To whom? Why?
"Why" as in: Which problems/questions will the receiver hope to solve/answer with our report. - What other artifacts/information do we need to share? To who? Why?
- When do we need to report what?
Additional sources
- Michael Larsen
- The Test Eye
- Ministry of Testing
- Michael Bolton (thank you David Greenlees for the reminder)