- A test process dictating a factory-style approach to testing.
- Progress was presented as pass/fail ratios, with a brief comment at best.
- Test cases were saved in a humongous, very general database application.
- Bugs and requirements were saved in another humongous, very general database application.
- Most documents were based on big, thorough templates.
- All development was split into features; a feature was then further split into smaller packages. A large feature could take over a year to finish (with an implementation description written to hand off to integration testing) while packages were typically released once every two weeks.
- The product was a huge real-time system (a middle node in an even bigger system), carried legacy from 1975 and was mostly written in a language older than C.
Early in 2012 I was asked if I wanted to join a pioneering (second), self-directed, cross-functional team at my former job. It was the golden opportunity I had been searching for in my wish to introduce more context-driven thinking. I answered yes immediately.
Eager to put everything I had been reading about into practice, I started scouting my possibilities. The first mission was to figure out what "self-directed team" actually meant. The answer was rather disappointing: "Well, you control the inner process but you have to provide the same output as everyone else" (documents, test case storage etc.).
I decided to interpret output a bit more... generally (ask Meike about my willingness to test or bend rules). I began asking the various receivers of progress reports, strategy documents and other output what questions they tried to answer with these artifacts. I then decided to interpret "answering the same questions" as "same output", which would later stir up a lot of criticism, but luckily no one seemed to know who was in charge of us at this point, so I got away with it for now.
I began to wildly change everything around me, more or less removing the whole foundation on which all our previous testing relied without adding any new one. For a while it seemed fine. I got shit done! It was fun, it was quick, it felt efficient, I was exploring!
July - Chaos
Vacation time, everyone was calm, just mentally preparing for a month of freedom.
... except anyone in contact with the testing in my team...
Everything was in chaos! In the midst of trying out everything I wanted to implement (but didn't understand), I had lost track. To make matters worse I had to, in the middle of this chaos, hand over my work to an experienced tester who would join my team starting the first day of my vacation. I scratched my head trying to figure out what I had done and what I wanted him to do. I sat down and started writing, trying to at least make it look like I had things under control. Doing this I realized my team had not been informed about half of the stuff I called "how we work with test in my team". The dreamy world I had been living in faded away as reality slowly materialized in the document in front of me. When done I could at least breathe a sigh of relief: I hadn't screwed up everything (yet). I handed over my work and went on vacation.
Vacation
A well-packed schedule was turned upside down when my youngest son was hospitalized with a severe bacterial infection. All the traveling back and forth to the hospital gave me plenty of time to reflect on my work. I slowly started to see ways of turning my mess into something valuable.
August - The turning point
Vacation ended and I headed back to work. The day I came back, my new testing colleague, who we can call Saam, since that's his name, left for his four weeks of freedom. I started looking at his testing. It was also chaotic, but a lot more structured (call it professional if you like) than mine. I started to clean up my mess by attacking the main problem at hand: how to know what has been tested?
The first solution was nothing fancy, but oh so important: a simple spreadsheet with functions to test on one axis and general conditions, like how the function works under load or during restarts, on the other. Each box that formed had a color representing its status: yellow for unknown, red for open bugs, white for invalid combinations and so forth. It was basically a simpler version of the Low Tech Dashboard, which I unfortunately didn't know about at the time (if I were to redo it I would use the Low Tech Dashboard as my foundation). The document started almost entirely yellow (unknown) but at least I had a plan.
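To make the grid concrete, here is a minimal sketch of the idea in code. This is not the original spreadsheet; the function names, conditions and extra statuses below are invented for illustration.

```python
# Minimal sketch of the coverage grid (illustrative only; the real
# spreadsheet used our own functions and conditions, not these).
functions = ["call setup", "call teardown", "billing"]       # one axis
conditions = ["basic flow", "under load", "during restart"]  # other axis

# Every cell starts out yellow (unknown), just like the document did.
UNKNOWN, OK, BUG, INVALID = "yellow", "green", "red", "white"
grid = {(f, c): UNKNOWN for f in functions for c in conditions}

# As testing progresses, cells get updated (and briefly commented
# in the real sheet to help explain the situation).
grid[("call setup", "basic flow")] = OK
grid[("billing", "under load")] = BUG  # an open bug would be noted here

# Print the grid as a rough text dashboard.
print(" " * 16 + "".join(c.rjust(16) for c in conditions))
for f in functions:
    print(f.ljust(16) + "".join(grid[(f, c)].rjust(16) for c in conditions))
```

The point is not the code but the shape of it: a two-axis status map that is trivial to update and can be read at a glance.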
When Saam came back we started talking about how to work with test in general and found out we had very similar ideas. From here on, things worked better and better, thanks to brilliant chemistry and adventurous but realistic approaches.
September to October - Upping the pace
Saam loved to dig into technical details, I loved the more overarching principles; together we rocked! As we learned more and more and started to form a routine, I began documenting the process that had emerged. It all ended up as a concise, rather visual, easy-to-grasp, four-page process description, probably the shortest document ever created in-house (not exaggerating even the slightest, of course).
It basically said:
- Start a wiki page for the feature. Fill it with information like lessons learned, operational instructions and where to find more detailed information. This wiki page is updated regularly throughout the whole feature, and the emphasis is on keeping it as short as possible, only including valuable, up-to-date material.
- Learn about the feature: read docs, speak with programmers and experiment with what already exists. While doing this, document test ideas and a brief test strategy. Don't plan too much, but make sure good ideas aren't lost.
- The moment before code is shipped to us, grab the developers and possibly a testing expert/domain expert. As simply as possible, present our test ideas so far and ask for feedback.
- When the code is shipped, use the test ideas as input, but don't in any way limit yourself to testing only those. The goal is to steadily add new ones as more is learned about the system and the feature.
- As the perception of the status of a certain area changes, update the current status (color code) in the spreadsheet and add a few comments to help explain the situation. This spreadsheet serves as the progress and coverage report.
- When done with testing in an area (group of charters) sit down together, look at the test ideas now documented and see if there is anything important left out based on current knowledge. Consider inviting a developer.
- When the whole feature is done, book a feature retrospective to reflect on the work done: what do we want to bring to future features, what could be improved, what should be avoided, new ideas, lessons learned, gaps in knowledge and so forth, both regarding the approach and the feature tested.
- The output of this meeting is an end report. The end report is basically the wiki page with ending notes added. The receivers of the end report are integration testers, testers testing the same or similar features, and stakeholders interested in the work done.
- Anything in this process is subject to change if the context suggests so.
The example above is stripped of everything company-specific, but that mostly concerned how things were solved in our particular context anyway. The original process also included a short description of how to actually work with exploratory testing in a way that provides visual structure, and a bit about flow. It had its flaws, sometimes based on lack of experience and sometimes on context (like how carefully we had to keep track of test ideas, since test data in this form was sometimes requested by customers), but it was a tremendous improvement.
By the way, the spreadsheet was later replaced by an 8-hour PHP hack... imagine being able to improve your situation by replacing a tool costing thousands of dollars with an 8-hour in-house hack. I felt pretty good about myself after that day.
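For the curious, here is a sketch of what such a hack might boil down to. The original was PHP and I'm only conveying the idea, not the code, so take this Python equivalent as purely illustrative (all names and statuses are made up):

```python
# Purely illustrative: render a color-coded coverage grid as a static
# HTML page. The real hack was PHP and looked different.
grid = {
    ("call setup", "basic flow"): "green",
    ("call setup", "under load"): "yellow",
    ("billing", "basic flow"): "red",
    ("billing", "under load"): "yellow",
}
functions = sorted({f for f, _ in grid})
conditions = sorted({c for _, c in grid})

header = "<tr><th></th>" + "".join(f"<th>{c}</th>" for c in conditions) + "</tr>"
rows = []
for f in functions:
    cells = "".join(
        f'<td style="background:{grid[(f, c)]};width:8em">&nbsp;</td>'
        for c in conditions
    )
    rows.append(f"<tr><th>{f}</th>{cells}</tr>")

# Write the dashboard so anyone on the team can open it in a browser.
with open("dashboard.html", "w") as out:
    out.write(f"<table border='1'>{header}{''.join(rows)}</table>")
```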
Up until this point very few seemed to care about our work. Some liked our new approach, and the qualitative reports we created were appreciated, but nothing more.
Suddenly someone noticed we had left out basically everything described in the official test process. Our work was questioned, our judgement was questioned and people demanded we fix our "mess" (document the test cases performed, for instance).
Luckily for us we had prepared ourselves to explain our decisions, how the new output fitted into the old system and how traceability was still achieved. We heard some grumbling, but since the testing in the feature seemed to be a success and the transition towards agile was in focus, it was actually received rather well. Especially my boss at the time, and the two test managers, deserve a lot of credit for their reactions (they also knew to some degree what was happening in the team, as we stopped making most of it a secret once things worked well again).
Today - Trying to support from the outside
The brilliant colleagues I had, including Saam, keep pushing for the ideas we introduced. I try my best to support them, but this blog post reminds me I could improve on that.
Important to note: the company in question has developed a lot since this story happened, so it's not in any way representative of how they work now. It's just a story about change, nothing else!
Lesson: Breaking rules should not be your first option
If you think what I did was irresponsible, I definitely agree. I do think you sometimes have to bend or break rules to make change happen, but when the stakes are high it should really be your last resort, done with great care. Speaking to whoever is in charge and changing things that way is generally a much better idea... what would have happened if Saam never joined the team, for instance?
If it's completely impossible to change something without going behind everyone's back, a new job is probably a much better option.
Lesson: You can change most things
To continue on my previous lesson: notice that I've never found something that, with the right skill, time and effort, didn't seem possible to change. I know such things exist (e.g. in strictly regulated industries) but they're rarely the problem. So trying over and over again, staying naive at a healthy level, is a great skill/trait.
Lesson: Experience is invaluable
If someone tells me:
- This idea of lightweight test processes is just bullshit
My answer would be:
- Let me tell you about when me and Saam...
Lesson: Testing requires skill
I knew the theory about how to organize testing in an exploratory way already in June, but it took many hours of practice to really become skilled enough to perform efficient exploratory testing. It's like driving: you can't learn just by reading, you actually have to practice (but reading can speed up the process).
Lesson: My unsuccessful experiments were just as valuable as my successful ones
When I started to reflect on my work during my vacation, I used my experiences so far to find problems to solve. Doing this, I developed an understanding of each problem that a successful experiment probably couldn't have provided me. This was useful later, when I ran into other problems that I could somehow relate back to problems I had already encountered.
Lesson: Reflection is just as important as planning
Reality constantly crushed my plans, but in the beginning I was so focused on moving forward that I completely neglected this. When I finally started reflecting, it became an invaluable tool to keep me on track. Plans didn't take surprises (changes, my misconceptions, my simplifications or my lack of understanding) into consideration. All of those required extensive reflection to figure out as I learned more and more.
Lesson: Adaptation is key
I tried to just pick an interesting concept and force it into my context; this failed badly most of the time. When I instead started to look at what I was trying to implement to see where and how it fitted, my attempts to introduce things suddenly became efficient. It's like solving a jigsaw puzzle, with the difference that you have an unlimited number of pieces, only a few of them can fit in a good way, when you actually find a good piece you still need to cut it to make it fit, and as you solve the puzzle the picture you're trying to make changes... so kinda not like a jigsaw puzzle at all, but maybe it gives you a useful mental image.
Lesson: Learn from others' experiences
One thing I used when trying to turn things around during my vacation was what other testers had told me, stuff like "we did it like this but it failed because of that". Exploratory testing was nothing new at the company; great testers already used it in secrecy, they just couldn't solve the problems they had run into trying to fit it into our process (problems like knowing what they had covered, and traceability). What they told me helped me realize problems I had to solve, raised questions I had to be comfortable answering, and provided solutions to problems I already knew about.
Lesson: Discovering my behavioral warning signs was great
Today I know I'm in trouble when, for instance, I stop sharing things with others or start to give very many reasons (rather than a few very clear ones) for why we should do something. Both are signs that I'm either off course or at least don't understand what I'm doing well enough. The latter can sometimes be a necessary step, but at least it's good to be aware. When I discover one of these signs I can immediately stop what I'm doing and start reflecting, to ensure I'm not drifting off in a dangerous direction.
Lesson: Being the only judge is dangerous
In my team I was the only one with testing experience (until Saam joined). This led to the rest of the team trusting my judgement when it came to decisions concerning test (which is to some extent reasonable). As soon as Saam and I started to work together we began questioning each other's opinions, ideas and directions, helping us to stay on a healthy course.
Lesson: Reasons are not proof
I can provide reasons for pretty much anything (including why you should jump off a cliff), but that doesn't make everything right (including the part where you jump off a cliff). I argued for my decisions early on, providing tons of reasons, even though I was way off. Now I know better than to just accept people's reasons; instead I look for lack of real-life experience, generalizations, false connections and other assumptions (including my own).
Lesson: It's important to have someone to discuss with
It's no coincidence that everything started to improve when Saam and I began working together. When I was alone (Saam came to similar conclusions), my creativity dropped, my willingness to stand up for the changes I wanted to make decreased, my motivation dropped significantly, my trust in my results dropped, learning didn't happen as rapidly, and finally I got stuck a lot more often and for longer (even when Saam couldn't help me, asking him started the process of involving someone who could, because obviously I wasn't stupid for asking whatever question it was). Having someone to quickly turn to whenever I needed a second opinion or some quick guidance was key.
Lesson: Exploratory testing needs support from your process
I didn't understand how crippled my exploratory testing efforts would be by a surrounding test process that wasn't adapted to it (test reporting, test strategy guidelines, traceability requirements etc.). For instance, the test management tool everyone had to use had no support for a rapidly changing test scope or qualitative progress reporting. To even give ourselves a fair chance of succeeding, we had to work that out first.
Considering what kind of environment we actually created, my advice would be to isolate yourself. Instead of trying to change the world around you (which many will object to, and in the end you will probably only be allowed to change half the stuff anyway, severely crippling your chances of success), try to build a micro universe where as much as possible is adjusted to fit an exploratory approach without colliding with the outside world. In agile, having an independent team with tested code as the only demanded output, working in an isolated part of the product or on a team-owned product, could be one way.
Lesson: Education can save time and improve quality of test
In the beginning the exploratory testing didn't work efficiently. One reason, I believe, was that I didn't really incorporate learning. Instead of stopping my testing effort when I felt my knowledge was insufficient, I continued, just putting out each fire I ran into (learning the bare minimum to continue). This was sometimes the right decision, but often not, as it constantly interrupted my testing flow. As soon as I started to spend longer stretches educating myself in an area (not just to solve one single problem), I got a better flow and started to recognize valuable tests I had previously not seen.
Lesson: If you can't argue for something you don't understand it
I've found a lovely exercise I wish I had performed earlier. Before I suggest a change, I run it in my head, trying to defend it against all the critique I can imagine. If I find a question I can't answer, I try to sort that out before I continue. Notice that the answers are context-dependent, so you can't just "learn" them; you have to go through this over and over (but after a few times the loop gets faster, as long as the suggestion is not too complex).
Example:
- We should remove the storage of test cases
- But without storage we can't reuse them
- Have we ever done that? Besides, doesn't it take more time to find a suitable existing test case than to write a new one?
- But we lose traceability
- We could store session debriefs or one-liners describing the tests we've performed; it would be a lot cheaper. By the way, when do we need traceability on that scale?
- Well, what if a customer asks for exactly what we've tested... or our test manager?
- Has that ever happened? And is it really worth all this time and storage money just to achieve that?
- But what about new testers, where will they go to learn testing?
- Our test cases are used for that purpose only due to lack of something better. I think the teams can manage this a lot better with pair testing and internal competence boosts. Also, the test scripts don't really explain why you should do certain things, partly because people stop writing when it becomes too complex (easier to figure out as you go), so the competence value is minimal as it is today.
...
Wrap up
This is one of my proudest moments, most frustrating experiences, most motivating explorations and definitely one of my most educational adventures. The lessons listed are just a brief start; we learned so much, and I still learn things from reading this now.
Finally, this blog post was almost entirely written in December last year. I can now see gaps in my understanding of exploratory testing, and how colored I was by only having worked at the company described. That's a lesson in itself.
Thanks for reading!