After reading an article tweeted by Tobbe Ryber I made an interesting connection...
Wouldn't a profile similar to what was described in the article be possible to create using Louise Perold's Test Planner from Let's Test?
So I quickly tried it, ending up with the following columns:
Problem: What problems would I be able to solve for a company?
How: How do I act/what do I do to solve the problems I say I can solve?
Success: How would my work benefit the company? What will come of it?
Failure: How do I usually screw up / fail to meet expectations (incl my own)?
Data: What usually varies between companies I've worked for (and how has that changed my success/usefulness)?
Issues: What do I typically need from the company to function well?
When doing this I suddenly thought: Maybe the same approach could be useful when planning my CAST presentation? Or maybe when talking about the way forward with the product I test? Or maybe my travel to Madison? Or...
Well, I've just spent a moment thinking about this so it might be crap but I still find it interesting that something I learned specifically for testing seems to be useful in so many other contexts.
Thank you Louise for providing me with an interesting tool that seems useful in very many cases! (Give me a day and the number of ways to use it will have grown substantially... I think!)
So how can I apply what Ilari, Huib, Jean-Paul, James, Johanna, Leo, Griffin, John, Steve, Zeger and Scott talked about in contexts other than testing? I'm inspired to figure that out next...
And how can you use what you learned in other ways than described/intended by the presenters? And what can you learn from doing that?
... Even more interesting: What, currently not test related, have you learned that could help you improve your testing?
Good luck exploring what you already know to learn something new!
28 May 2013
23 May 2013
Let's Test - a summary
Sunday evening
- You are never alone (unless you want to be)! Just sit down at a random table or start talking to someone you've never met before. If you don't dare do that, just stand still and someone will approach you. The number of people I got to meet in three days blew my mind! Thank you everyone!
- The energy also struck me. Every discussion had a wonderful flow; people cared, people were thinking and people were curious.
- Putting personalities and, sometimes, faces on people from Twitter was a cool thing.
- We need to identify and clear shallow agreements. Questions are a great tool for this.
- Context-Driven can mean a Paradigm, a Community and an Approach. Don't confuse the three. For instance, if your paradigm is Context-Driven you can't just jump in and out; you are Context-Driven (by choice). The community, on the other hand, is something you can be part of to some degree while still being part of another community or holding another paradigm.
- Greeters and Guides are the people that introduce new people to Context-Driven Testing and help them and others grow inside the community. These people are key to making the community thrive.
- People make their decisions based on what they observe, or rather what they perceive they observe. This makes observational skills key in most aspects of life. In testing they are our bread and butter in finding/providing important information.
- There are so many things biasing our observations (old models, language, instructions/provided information, distractions and a whole lot of other things) and we can't do much about many of them per se. However, knowledge and critical thinking can help us stay aware and choose different approaches to work around them.
- When communicating observations it's important to avoid ambiguity when possible: this is (what is "this"?), preferred (by whom?), few (in comparison to what? how many?).
Comments:
I love how Ilari approaches questions. He has an amazing ability to just absorb the question, think, ask for additional information and form a well-constructed answer in seconds, and through all this he always keeps his calm. I have a lot to learn from this guy! (especially as someone who opens his mouth before even starting to think)
I also like how Ilari left us with a lot of tools but not many instructions on how to use them. I've already taken on the challenge of finding ways to incorporate this new knowledge into my daily work.
- Give visualization a chance! You don't have to be an artist to sketch, make mind maps or add illustrations, but you do have to actually try!
- Cover your walls with sketches, diagrams, illustrations and other visual representations of your work.
- Get a magic marker (Edding 345, grey marker, edding.com, EAN: 4 004764 841592) and even your bad images will start to look cool.
This is what you should aim for in a tutorial. Left is before, right is after; can we agree that this is a significant improvement? (even without the magic marker, by the way). With better light and some quick editing these visualizations could also be used in future blog posts, a cool bonus!
Monday evening: Takeaways from discussions
- Every time I do even the slightest attempt to use the Socratic method I and, as it seems, others learn something. Driving a discussion using questions is a rewarding and useful skill!
- A role is not the same as responsibilities or skills.
- If everything is considered priority one, nothing becomes priority one.
- Where do you want to go, which career paths matter to you?
- To the business, bugs are not an interesting problem, but the complications they cause are!
- What is relevant changes over time; you might one day wake up and realize perceived quality is suddenly a much more important issue than functionality and rapid shipping (or in any other order).
For most summaries I've tried mixing what the presenter seemed to focus on with what I took away. For this keynote it's only what mattered to me, because there were three things that really did matter. I urge you to check out this keynote yourself when/if it becomes available! (in case you weren't there)
I could list the things discussed (well, actually I couldn't since I didn't write much of it down) but the list, as Huib said, is not what's relevant. Instead, start thinking about what might happen, how that affects your work, and prepare yourself for what's relevant to you. It's a huge topic and there are obviously no known answers, but you'd better be ready because the future will happen no matter what you want.
Session: Leo Hepis, Linguistics, How to Keep a Dialog Constructive
- Volunteer! You might feel uncomfortable being put in the spotlight, but look at it as an amplifier for learning. Also, if you're one of those who want to build a brand, I imagine this is a good thing to do: I've not lowered my respect for any of the volunteers in any of the sessions I've attended, but I've sure found a couple of people who've earned my respect by doing it (and sometimes you get away with cool stuff, right Kristoffer?)
- Keeping a discussion cooperative is key if you want to effectively transfer information.
- At least for me, it's easy to provide an answer before I'm sure of what answer I really should provide.
Session: Griffin Jones, What is Good Evidence?
- Griffin had a lovely list of attributes to check for in a piece of evidence, check out his slides!
- There's a difference between what we actually did and what we intended to do (hello "passed" test case, I have no idea what actually made you "pass").
- Discouraging feedback, like leaving out confusing/unexplained/contradictory details/data/observations or other things that might conflict with your verdict, hinders the possibility of critically analyzing a piece of evidence, which hurts the validity of your evidence.
Session: Louise Perold, The Test Planner
- Start with Problems: Which are the business problems we try to solve with our product?
- Make additional columns for each of the things below and populate them as well with items:
Success, what represents good
Failure, in what ways can the product fail, what consequences may occur
How, how can we test this product (logistics and techniques)
Data, what variables do we have
Issues, what do we need to be able to test
- Use the items from each column to inspire/spawn new items in the same or other columns. Repeat until satisfied.
Comments:
Probably the most loosened-up (laughing, smiling, chatting, casual) session I went to, which was a great thing. I don't know if it's a coincidence or wishful thinking, but I feel like I remember more from this one than from most others.
Bug hunters will notice "Data" seems a bit out of place... the answer is quick-and-dirty graphics editing at a really unprofessional level.
Tuesday evening: Test Lab
- A great test leader can really make a difference! Less control, more empowerment (e.g. coaching, empathy, support, freedom) is one of the keys to that for me.
- It's really fun to sit in a tight group and just work together, focused on finding relevant product information (feel free to call it bugs in this specific case). By the way, I'm at a company where we already do that and it's just as fun "at home"!
- It's hard to explain a bug purely in text; images and/or videos are often great complements.
Session: John Stevenson, Information Overload and Bad Decisions
- S L O W D O W N !
- Requirement documents, for instance, can prime/anchor you to certain solutions, limiting your ability to see important options. Be aware of this.
- We often make quick judgements instead of thinking. We need to be aware of this and apply critical thinking to avoid missing key observations.
Session: Steve Smith, Debugging Human Interactions
- There is a relationship between low self-esteem and very active defense reactions.
- We don't receive sensory input (like seeing, hearing etc.) in an objective way. Reactions to certain sensory input can for instance make us shut down/forget/ignore other sensory input.
- Being aware of how sensory input is "used" (forming one or more meanings of the input, feelings, feelings about having these feelings, preparing defense mechanisms and setting rules for commenting) can help you understand your own or someone else's reaction.
Session: Zeger Van Hese, Testing In The Age of Distraction - The Importance of (de)Focus
- Y O U D O N ' T N E E D T O R E S P O N D !
- Receptive distractions reload/refresh us. Examples: taking a walk or having a coffee
- Deceptive distractions drain us. Examples: meetings or emails
- Observe when you start to procrastinate and see if you can find certain patterns/scenarios.
Comments:
Zeger commented that flow, in the original description, "suspends critical abilities", which might be a bad thing for testers. My interpretation is that it rather inhibits whatever may distract you from proceeding; in a regular creative activity that thing is criticism, but in testing, criticism is more or less the "creative activity" itself. I would instead interpret flow, in the context of for instance exploratory testing, as inhibiting the risk of you rejecting a critical observation. But that's just a thought that popped up as I reread my notes. Feel free to comment or question (directed at anyone, not specifically Zeger).
Keynote: Scott Barber, Business Value in Testing
- Testing as an isolated activity has no value... but the resulting information is worth something.
- Sometimes the needs of the Business override the needs of the Users (e.g. a product must get out on the market so as not to miss a marketing window).
- It's important for testers to learn basic business lingo to improve our ability to communicate information. Hopefully that can also motivate business people to learn basic testing lingo as well.
Recurring themes throughout many of the conversations and presentations I attended
- We need to stop multitasking!
- Beware of the various kinds of bias that limit what/how we observe.
- People, people, people! Like it or not, our business is all about people!
- Keep things simple! E.g. leave out redundant/irrelevant information.
- We need to understand the business implications of what we do.
Some comments I liked
Notice! None of these are exact quotes (I suck at noting down exact quotes). I hope I haven't messed up the original meaning; please correct me if that's the case!
Great enough instead of good enough or perfect
/Who said this? Martin? Leo? Maria? It was after Steve Smith's session.
Jerry Weinberg is Yoda
/James Bach, keynote, completely taken out of context by the way
It's all about following your energy
/Huib Schoots, lunch
There's no reason why everyone can't become awesome
/Johanna Rothman, keynote
A few people that really made an impression on me
I started making this list but soon realized I had at least 20 people I wanted to add and it just became silly. But, stubborn as I am, I want to highlight three:
Richard Robinson
For a Miagi-Do belt, Rich got the challenge to form and lead a test team with the mission to test XBMC for roughly an hour and a half and finally debrief the results. I was fortunate enough to be part of this team; the experience itself was awesome and I think Rich did a kick-ass job! Whenever I'm doing something test leader-ish he'll be my role model! (read the first item in the Test Lab summary further up for some reasons)
Huib Schoots
Pure energy on two feet and with a well earned Buccaneer cap on his head. Interesting, helpful and intelligent. His (and Jean-Paul's) visualization tutorial also had an instant effect on my notes which was awesome! Next mission, inspired by the mentioned tutorial and other discussions, is to cover the office with various visualizations of our product.
Johanna Rothman
Apart from delivering a keynote that helped me realize/explain certain things (see summary further up), I really enjoyed just speaking to her. Beyond curiosity, and a willingness (or even urge) and ability to help, she seems to deeply care about everyone she speaks to. Amazing person; I truly value what she taught me during meals and through her keynote! Yeah, and it's hard not to mention how she stands up for what she believes in (I think everyone attending Let's Test knows what I mean).
A lot of other people deserve more than a short mention, like Helena, Steve, Jari, Ilari, Jesper, Martin, Kristoffer... but the line has to be drawn somewhere. Let me just say: thanks to all of you for the amazing discussions, exercises and lectures I've had during my last three days! I deeply appreciate every single one of them!
Wrap up: The Taxi Driver
In the taxi back to the train station I spoke to the taxi driver. I think his story summarizes well what the people at Let's Test were all about.
The short version is that he had been a hairdresser for 15 years but lost his energy. So he changed career and started driving a cab. What he loved about it was feeling free and the human interactions; he also liked how his taxi company's ethics resonated with his own. Listening to him, my impression was that he loved his job and truly cared about being the best taxi driver he could ever be. Even though I was leaving Let's Test, I was smiling the whole way; that guy would have fitted perfectly at Let's Test.
14 May 2013
Key testing skill: Thoughts into words
Background
During SWET 4, one thing that really struck me was how amazing the other participants were at quickly and accurately turning their thoughts into words. It was probably the most striking difference I noticed between myself and these experienced testers. Since then I've kept thinking about this skill and how to improve it. These are my thoughts so far...
Why
One of the early slides in the RST material starts with "Testing is in your head", which I think sums up why this skill is so crucial. You need it if you want anyone to understand anything about your testing. Some examples; you need it to get:
- Programmers to understand your bugs
- Stakeholders to understand why these bugs are important
- Your employer to know why (s)he needs you
- Your colleagues to know what you're doing and how it's going
- Anyone to understand your problems
- Stuff on paper (strategy, results, methodologies etc.)
- You to understand for yourself what you are doing/have done
Notice that the list doesn't mainly consist of stuff only some test thinking expert needs to master; it's stuff every tester needs to master!
How can you improve this skill?
Since this skill is fundamental to testing you will practice it to some degree just by doing ordinary work, but the ideas I've listed below are ways I've found more efficient, either as ways to practice the skill or to get feedback on how well you're doing.
Answer the fundamental questions
For many questions the answer is far less useful than the process of getting to an answer. Some examples in testing:
- What does it mean to test something?
- Why do we need testers?
- Can't programmers test themselves?
- What's the tester's role?
- What is a bug?
- Do testers need education? Why? Why not?
- Can we find all bugs? How? Why not?
- How do you know when you're done testing?
- How can we measure efficiency of testing?
- How do we know what's important?
- Why do we run extreme tests?
- Can we figure out beforehand what to test? How? Why not?
- If we know the status of a product shouldn't we decide whether or not to ship?
Can you answer them? Is your answer a quote from some famous tester, like "a bug is something that bugs somebody who matters"? How do you know who matters? How do you know what these people think? Is it enough if one in a million of them is bugged? Two? Five? A hundred? If we expect someone who will matter later to be bugged about this, is it a bug? If people who matter don't know about something, e.g. a missing link, but would love it, and be bugged if it was later removed, is the fact that it's missing a bug?
The questions I asked above are exactly the kind of questions I want you to ask and try to answer. It's a kind of mental gymnastics that will not only help you improve your ability to put thoughts into words, it will also help you learn a lot more about testing (which will be a common theme for the rest of this blog post).
Blog
I brought up the importance of blogging way back in my 2012 reflection posts; there you can find a bunch of motivational links and tips on why to blog.
Anyway, I get two things out of blogging relevant to the topic:
- Feedback on how well I present/explain my thoughts (if people read and react)
- An efficient way of practicing the act of putting thoughts into words
The second point depends on me writing about my own thoughts and not just referencing what other people say.
Talk about test
Talk to colleagues, speak with people on Twitter, go to conferences/meetups, present. All these things force you to practice the art of putting thoughts into words; some of them also force you to do it rapidly to keep the conversation going, which adds another dimension (one which, for instance, blogging doesn't).
Explain what you do to non-testers
Have you ever tried explaining what you do and why it's important/interesting to someone like, say, your mom? Or your boss? Or your best bud? Or some programmer you know? Or maybe the you from before you became a tester ("hey myself, you're going to become a software tester and you'll love it, let me explain why!")? Try it; it's a lot harder (at least for me) than you might think. It also really forces you to look deep into your thoughts to find out what things mean to someone not familiar with testing (sometimes it helps you figure out what stuff really means in a more general context than testing).
Ask for feedback
It's important to practice, but it's also important to know if your efforts are paying off. "Am I really better at explaining what I do now?" You can ask pretty much anyone how they interpreted your explanations and/or how easy they thought your explanation was to understand. Here are a few quick, general examples:
- Was my last bug report clear to you? Did you miss something?
- Was my last debrief useful? What could I have done to improve it?
- Did you understand my explanation, or which part was hard to understand?
- Is it clear to you why I think we need to use an exploratory approach?
Question stuff
If you walk around and accept not understanding what others try to communicate to you, I believe there's a great risk you do the same with your own explanations/thoughts. Try to get out of this by questioning anything you don't believe in and asking for another explanation whenever you don't understand something. I've found this really helps me question my own beliefs, assumptions and opinions.
A good question to ask yourself after any explanation:
Would my dumbest colleague understand this, and if not, why?
As a recommended reading on this I really love this blog post from Tommy MacWilliam.
Mentor someone
I've only done this briefly, but I find it a great way to force myself to explain in detail what I'm doing and why. It also provides a great opportunity to get feedback, both verbally and as I monitor my pupil's improvements.
Rapid Software Testing
The Rapid Software Testing course is largely about trying to get you to think like a top-notch tester. One of the key skills practiced during this course is explaining what you do (putting thoughts into words) and building the courage to do so. I highly recommend the class for many reasons; this is definitely one!
Summary
Putting thoughts into words is key in any form of testing (planning, executing, bug reporting, following up etc.). Ways I've discovered that work well for me to practice this skill are:
- Question the answers, why is this answer true? Is it really true in this context? When is it not true? Why?
- Try to come up with your own answers and refine them using questions
- Blog, or write about testing in some other way
- Talk testing whenever possible
- Ask for feedback
- Don't accept not understanding your own or others' explanations
- Become a mentor to force yourself to communicate your thoughts
- Rapid Software Testing is a great course to help you improve this skill
14 April 2013
ConTest (meetup) - Security Testing
What is ConTest?
ConTest (link in Swedish) is a local test meetup in Malmö, Sweden, started by Henrik Andersson (please correct me if I'm lying). Each meetup has a theme, and participants provide content by sharing short presentations/lightning talks (approx 5 mins) that are followed by a facilitated discussion.
First Talk: Martin Thulin: Learning Security Testing
Martin, just like me, started exploring the world of security testing quite recently. In his talk he went through his top three resources for self-education in security testing:
- Google security blog
General information about online security
- OWASP
Great cheat sheets, basic security information, a list of security testing tools and much, much more.
- Hack This Site
Step-by-step hacking challenges used for practice/learning.
But maybe most importantly he shared the message:
"Anyone can become a hacker"
Great topic and great presentation!
"Anyone can become a hacker"
Great topic and great presentation!
In the discussion that followed, tons of resources came up. I won't list them all here; instead I urge you to check out the #foocontest hashtag on Twitter.
Second Talk: Me: Quick Tests in Security Testing
Quick tests are cheap (quick, simple, low-resource), general tests that usually indicate the existence, or potential existence, of a common problem or group of problems. Even though they rarely prove the absence of a certain problem, they can often be a good start, helping you highlight common problems early (disclaimer: this definition/description of quick tests might not match James Bach's and Michael Bolton's, which is used e.g. in RST).
I spoke about two quick tests (for web) I've used:
- Refresh
- Cocktail string
Refresh means that whenever I find a page that seems to make a lot of database calls, do heavy calculations or connect to external servers (like an email or SMS server), I simply press and hold F5 (refresh) for a short while. What I'm looking for are database errors, any changes in content/design, error messages in general and, finally, fully context-dependent stuff like received mails when the page calls an email server, or alarming patterns in logs.
In practice I've used this to take down a couple of databases (or rather the connections to the databases). Simple and effective. Credits to Joel Rydén, by the way, who taught me this.
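For the script-minded, the gist of this quick test can be sketched in a few lines of Python. This is just my own illustrative sketch, not something from the talk: the error markers are example heuristics and fetch stands in for whatever HTTP client you'd actually use.

```python
def looks_broken(status_code, body):
    """Heuristic check: does a response hint at a refresh-induced failure?

    The markers below are illustrative examples; adapt them to whatever
    error text your stack actually produces.
    """
    error_markers = ("database error", "exception", "stack trace")
    lowered = body.lower()
    return status_code != 200 or any(m in lowered for m in error_markers)

def refresh_hammer(fetch, times=50):
    """Call fetch() many times back to back (a stand-in for holding F5)
    and collect the iterations whose response looked broken."""
    suspicious = []
    for i in range(times):
        status, body = fetch()
        if looks_broken(status, body):
            suspicious.append(i)
    return suspicious
```

In practice you'd wire fetch up to a real HTTP client and point it at the suspected page; holding F5 in a browser is of course even quicker to try first.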
The idea with the Cocktail String is described in a separate post.
As a bonus I can provide you with a few others:
- Scramble cookies
Quickly change the contents of the cookies a page creates. Try to generate errors by e.g. using strings where a number is expected, try other plausible values (0 is always interesting, negative values as well) and combine with the cocktail string.
- Back button
When viewing sensitive data, log out and try pressing back. Is the data still visible? A common and scary bug in some systems (systems expected to be used on shared computers or handling very sensitive data), irrelevant in others (remember browsers deal with caching differently).
- Bob
When creating an account, first try the password "bob". It's so insecure very few systems should allow it (but many do).
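The cookie-scrambling idea can also be scripted. Here's a minimal Python sketch of mine; the probe values are just examples of the kinds of mutations described above (a string where a number is expected, zero, a negative value, a cocktail string), and you'd feed the mutated dicts into whatever HTTP client you use:

```python
def scramble_cookie_values(cookies):
    """Generate mutated copies of a cookie dict for quick security probes.

    For every cookie, yield one variant per probe value, keeping all the
    other cookies untouched so each request changes exactly one thing.
    """
    probes = ["bob", "0", "-1", "'\"$%>*<!--"]
    for name in cookies:
        for probe in probes:
            mutated = dict(cookies)  # copy, so the original stays intact
            mutated[name] = probe
            yield mutated
```

Changing one cookie at a time keeps any error you trigger easy to attribute; combining mutations is a natural follow-up once a single probe looks interesting.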
Final Talk: Sigurdur Birgisson: Security, Usability... huh?
I was a bit disappointed by Sigge's talk; not that it was bad (it was actually awesome) but because I was hoping he would share something really smart about how to deal with the often conflicting wishes of security and usability (like captchas). It also had very little to do with security testing, but who cares...
So what was it about? Sigge talked about how he thought the quality characteristics (the CRUSSPIC STMPL mnemonic) were often seen as too abstract when talking to stakeholders. Instead he used the Software Quality Characteristics published on the Test Eye. He printed a card for each "sub-characteristic" and, for best effect, tried to add matching examples from the product being examined. The goals were to get the cards (characteristics) prioritized to aid the testing focus, to support a healthier discussion of what the product needed to do, and to make stakeholders care about them (it's not just about new features).
- It must be quick, quick is above everything else!
- What about data integrity?
When I said that, the customer started to hesitate
// Sigge
Henrik Andersson also shared an interesting story related to this presentation where he had gotten the top managers in a company to prioritize the original 8 quality characteristics (CRUSSPIC) in the context of a product and used this both when testing and when reporting status. Brilliant as well!
There is a lot more to say about this presentation and I might get back to it in the future. For now, just check out Sigge's blog. Finally, it made me think of ways to improve my own ongoing prioritization work with my product owner; for that I'm really grateful!
Summary of ConTest
Great people, great mix of people, great discussions, great presentations, great facility, great to meet Sigge before his Australia adventure, great to meet Henrik Andersson for the first time, great to try ConTest's format and great to get new insights about security testing. ConTest was a blast! Thank you Malmö, Foo Café and all the ConTest participants!
11 April 2013
Cocktail Strings - Quick Test for Web Security Testing
This is part of my talk at ConTest this evening. I started blogging about the whole event but time is running out, so I'll start with this and provide you with the rest tomorrow.
Cocktail String?
The Cocktail String is a form of quick test that can be used in security testing. The idea is a general string with a mix of terminating characters and other stuff that can be used to probe for MySQL injections, XSS and similar. Here's the example I provided:
'"$%>*<!--
The initial single and double quotes are there to screw up badly sanitized query strings or similar; the $ is to trigger an error in case the string is evaluated as PHP; %> ends a PHP or HTML tag; the star I don't know what it's for, but since it's often used as a wildcard in various situations I figured it might provoke something; and the ending <!-- is my secret weapon, as it safely, but with a cool effect, exposes the possibility to execute user-defined HTML (and thus potentially scripts like JavaScript).
Where to use?
The string, or variations of it, can be used wherever users can send data to a server: in forms (especially look for hidden form fields named "id" and similar), cookies, file uploads, HTTP requests etc.
What are you looking for?
Exactly what to look for depends on context but here are a few suggestions:
- Any kind of errors like no database connection, faulty query errors, code errors etc.
- Garbage being printed on the page
- Any irregularities on the page (even subtle design changes like an extra line break)
- Check HTML code for occurrences of the string
- Check for errors in any logs
- If used in combination with databases, check the database content
- Check cookie content
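As a rough sketch of the "check HTML code for occurrences of the string" suggestion: if the raw string comes back unescaped, markup injection is likely possible. The helper below is my own illustration, not from the post:

```python
import html

COCKTAIL = '\'"$%>*<!--'

def reflected_unescaped(page_html: str) -> bool:
    """True if the raw cocktail string appears in the HTML, meaning
    characters like < and " survived whatever sanitization was applied."""
    return COCKTAIL in page_html

# An unescaped reflection triggers the check; a properly escaped one doesn't.
print(reflected_unescaped("<p>" + COCKTAIL + "</p>"))                # True
print(reflected_unescaped("<p>" + html.escape(COCKTAIL) + "</p>"))   # False
```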
Actual examples
I named my user this on a server application. Everything was fine until I opened the administration interface; suddenly I couldn't see anything but a few garbage characters on a white background. The reason: the user's name was sanitized everywhere except on the admin pages.
Another similar example was when I used it as the name for an item that appeared in a log, blanking out everything after that log entry.
Finally I used it on a friend's project to highlight MySQL injection possibilities.
Ways to improve it
Many great suggestions for improving the string came up during the meetup. One was to add unicode versions of certain characters, since this fools some of the sanitization functions built into PHP as well as other sanitization plugins. Another was to add international characters, including scripts like Arabic and Japanese. Finally, Mattias suggested using an iframe instead of the comment tag at the end, since loading a scary page is even more striking than blanking out half the page (as well as actually proving that a really destructive bug exists). Credit to a lot of people for all the great suggestions (especially Simon for the unicode idea, I'll add that to tomorrow's testing!).
Finally, note that you'll have to adapt the string to your context; for instance, a string used on an ASP.NET application or a Java desktop application will look different.
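The unicode idea Simon suggested could be sketched like this: generate alternative encodings of the payload, since some sanitizers only filter the literal characters. This is a rough illustration; which encodings fool a given sanitizer varies:

```python
def encoded_variants(payload: str) -> list[str]:
    """Return the raw payload plus percent-encoded and HTML-entity
    encoded versions; some filters only catch the literal characters."""
    percent = "".join(f"%{ord(c):02X}" for c in payload)
    entities = "".join(f"&#x{ord(c):X};" for c in payload)
    return [payload, percent, entities]

for v in encoded_variants('\'"<'):
    print(v)
```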
Automation
One interesting comment was about automating this (which is typically what security scanners already do, by the way). First off, it's well suited for randomized automation that sends various combinations of dangerous characters into every field it finds and compares, for instance, the HTML output to a control. That might miss stuff and give a lot of false positives, but it's still a great way to work with the string. It also addresses one of the string's weaknesses: since it uses so many different, commonly filtered characters, the whole string might get filtered out because of one character while some of the others would have worked in isolation (this is typically what I mean when I say quick tests rarely provide evidence that a certain problem can't exist). With the speed of automation (assuming you achieve it) you could test the characters one by one as well as in more combinations.
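That one-by-one idea could look roughly like this. A sketch only: `submit` is a placeholder for however you actually send data to the application and fetch the resulting HTML:

```python
import html
from itertools import combinations

DANGEROUS = ["'", '"', '$', '%', '>', '<', '!', '-', '*']

def probe(submit):
    """Send dangerous characters one by one and in pairs instead of all
    at once, so a single filtered character doesn't mask the others.
    `submit` takes a payload and returns the resulting HTML."""
    findings = []
    for size in (1, 2):
        for combo in combinations(DANGEROUS, size):
            payload = "".join(combo)
            # Flag payloads that come back verbatim even though they
            # contain at least one HTML-significant character.
            if payload in submit(payload) and html.escape(payload) != payload:
                findings.append(payload)
    return findings

# Demo with a fake "server" that escapes < and > but nothing else:
fake_server = lambda p: p.replace("<", "&lt;").replace(">", "&gt;")
print(probe(fake_server))
```

Against the fake server above, the probe flags the quote characters (alone and in pairs) that pass through unescaped, while the filtered `<` and `>` never show up in the findings.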
Summary
So a bit of a messy post but long story short:
Instead of Bob, name your next test user '"$%>*<!-- and let it live in the lovely city of '"<!-- with interests in '"$% and >*<.
Good night!
09 April 2013
EAST meetup - talking metrics
Yesterday's EAST meetup was focused on test metrics, starting with us watching Cem Kaner's talk from CAST 2012 and then continuing with an open discussion on the webinar and metrics in general.
Cem's talk
Cem talked about the problem of setting up valid measurements for software testing, and the work he presented on threats to validity in measurements seemed very interesting (you can read more about that in the slides). He also talked about problems with qualitative measurements:
- Credibility, why should I believe you
- Confirmability, if someone else analyzed this, would that person reach the same conclusion
- Representativeness, does this accurately represent what happens in the big picture
- Transferability, would this transfer to another similar scenarios/organizations
So far so good.
But Cem also, as I interpreted it, implied that all quantitative measurements we know of are crap, but that if a manager asks for a certain number you should provide it. I agree we testers in general need to improve our ability to present information but, as I will come back to, I strongly disagree with providing bad metrics, even after presenting the risks with them.
How do you measure test in your company?
Everyone was asked: how do you measure test in your company? The most common answer was happy/sad/neutral smilies in green/red/yellow, which was quite interesting since it relates closely to emotions (at least as it's presented) rather than "data".
The meaning of the faces varied though:
- Express progress, start sad and end happy (="done")
- Express feelings so far, first two weeks were really messy so we use a sad face even if it looks more promising now
- Express feelings going forward, good progress so far but we just found an area that seems really messy so sad face (estimate)
In most cases some quantitative measures were used as input but weren't reported.
My personal favorite was an experiment Johan Åtting talked about, where a smiley representing the "mood" so far (item two in the list above) is placed on a scale representing perceived progress (see picture). It seemed like a straightforward and nice visual way to represent both progress and general gut feeling. The measurements of progress in this case were solely qualitative, if I understood correctly, but it would work just as well if you prefer measuring progress quantitatively.
There were also a couple of interesting stories, both from the great mind of Morgan. The first was an example of a bad measurement: managers based their bonuses on the lead time for solved bug reports. This was fine as long as developers had to prioritize among incoming reports, but when they improved their speed the measurement dropped like a stone (suddenly years-old, previously down-prioritized bugs were fixed) and managers got upset.
The second story was about a company where developers and "quality/production" (incl. testers) were separated. Testers in this case tested the components sent from developers in isolation before they went to production. However, when the components got assembled tons of problems arose, problems customers reported back to the developers without quality/production knowing about it. This led to a situation where quality/production couldn't understand why managers were upset about bad sales; the product was fine in their view. The situation improved when Morgan started to send the number of bugs reported by customers (a bad, quantitative measurement) to the quality/production department.
An interesting twist came later, when they tried to replace the bad quantitative measurement with something more representative and met a lot of resistance, since management had learned to trust this number. I asked him if, in retrospect, he would have done anything differently but we never got to an answer.
I shared a creative test leader's way of dealing with numbers. She reported certain information (like test case progress and bug count) but removed the actual numbers, so when presenting to upper management she simply had visual graphs to illustrate what she was talking about. As far as I know this was well received by all parties.
Finally an interesting comment from Per saying: "Often I find it more useful to say; give us x hours and I can report our perceived status of the product after that".
Epiphanies (or rather interesting insights)
During the discussions I had a bunch of interesting insights.
- Measurements are not a solution to trust issues!
- Instead of saying "we have terrible speech quality" or "the product quality is simply too bad", we could let the receiver listen to the speech quality or demo the aspects of the product we find bad. It's a very hands-on way to transfer the information we've found.
- Ethics is a huge issue. If we want someone to believe something we can (almost) always find compelling numbers (or analysis for that matter).
- Measuring progress or making estimations will not change the quality of test/product's state after a set amount of time (just as a reminder of what you don't achieve and the cost of measurements).
- If a "bad measurement" can help us reach a better state in general (like in Morgan's example), is it really a bad measurement? (touching on ethics again).
- When adding quantitative measurements to strengthen our qualitative assessments, are we digging our own grave? (risk of communicating: so it's not my analysis/gut feeling that matters, it's the numbers)
- How a metric is presented is often more important than the metric itself.
- In some of the cases brought up, the measurements weren't even missed once they were no longer reported.
- Don't ask what reports or data someone wants, ask what questions they want you to answer.
- Cem talked about the transfer problem for students, making it hard for them to see how, for instance, a social science study can relate to computer science. I think the same problem occurs when we move testing results into a world steered by economics (numbers).
- Even bad measurements might be useful to highlight underlying problems. Once again, Morgan's examples somewhat show this, and Per talked about how more warnings in their static code analysis was an indication that programmers might be too stressed (if I interpreted it correctly). In these cases it's important to state what the measurement is for and that it's just a potential indication.
Measurement tampering
We talked about how we easily slip into bad habits/behaviors when quantitative measurements become "the way" to determine test status.
Bad behavior when measuring test case progress:
- Testers saving easy tests so they have something to execute when the progress is questioned
- Testers running all simple tests first to avoid being questioned early on
- Testers not reporting actual progress, either to avoid too much pressure or to fake progress to calm people down.
- Testers writing tons of small, simple tests that test functionality in isolation, creating a lot of administrative overhead as well as a risk of not testing how components work together, all to ensure a steady test case progress.
- Test leaders/managers questioning testers when more test cases are added (it screws up "progress"); as a result, testers avoid broadening the test scope even when it's obviously too narrow.
Bad behavior when measuring pass/fail ratio:
- Ignoring bugs occurring during setup / tear down.
- Slightly modifying a test case to make it pass (making it conform with actual result rather than checking if that's a correct behavior).
- Slightly modifying a test case to make it pass (removing a failing check or "irrelevant part that fails")
Culture
We also talked about how the culture (not only related to test measurements) in various countries affects testing. One question was: why is CDT so popular in Sweden? Among the answers were low power distance (Geert Hofstede, credit to Johan Åtting), the Law of Jante and that we're not measured that much in general (e.g. late grading in school, no numeric grades etc.).
We also talked about the drawbacks of these cultural behaviors (like decisions being hard to reach since everyone should be involved/agree).
Finally, and mostly, we talked about how our view on testing and measuring sometimes collides with other cultures, with actual examples from India, Poland and the US. This discussion was a bit fragmented but feel free to grab me if you want to hear about it.
Summary
This was a great EAST meetup and I really feel sorry for those I know would have liked this topic but couldn't attend. Definitely a topic I hope (and think) we'll get back to!
Finally a lovely quote from Magnus regarding a private investigator determining time to solve a crime:
"Let's see, there are 7 drops of blood, this will take 3 weeks to solve".
Good night! .)
10 March 2013
EAST meetup - Trying a new format
What is EAST
Short version: EAST is a local network for software testers in the area I live (Linköping, Sweden). About once a month we meet to share thoughts, learn and socialize/network. If you want to know more you can read my post from my first EAST meetup.
New Format / Concept
Most previous meetups have followed the format: start with food and socializing, then 1-2 presentations, each followed by discussions. A few other formats have been used, like letting each participant briefly describe how testing is performed at his/her company (to get everyone involved), testing games and the pub evening with only beer and socializing. This time, some hands-on testing was on the menu.
Webinar
As usual we started with food, but instead of casually chatting with each other we watched Six Thinking Hats for Software Testers by Julian Harty (referring to Edward de Bono's thinking hats) together. The webinar was interesting but I missed the social part. Maybe a 5-15 minute webinar would have worked better for me.
Hands on
The second part was hands-on. First David, the facilitator, quickly presented the application we were to test and how to get it. The application in this case was an app for taxi drivers to help them keep track of their work (part of a full system for taxi companies). Then we would, in groups or alone, test the application using the six thinking hats in any way we saw fit.
A former Ericsson colleague and I started out exploring/scouting the app but soon felt we didn't get any momentum and joined another group of three. A vivid discussion about how to use each hat, and what fitted where, ended with us being ready to actually start generating test ideas about 5 minutes before the deadline, but that didn't matter; a lot of interesting stuff came out of the discussions.
Sharing results
When we were back together we started the sharing of ideas and experiences from the hands on part. Immediately it became clear the various groups had used the hats very differently.
We attacked the whole app and used the various hats from the perspective of an actual user (or rather our made-up idea of what it's like to be a taxi driver; lesson learned: a better understanding of the user, for instance their needs, knowledge and general behavior, would have helped a lot). This perspective gave us an interesting high-level view but it was a bit hard to stay focused as we often drilled down too deep, exposing too many details. Others focused on a smaller part of the app which, in this case, seemed a lot more efficient.
Of course, how to use the hats would depend on the mission in a real-life scenario, but in this case the mission wasn't set like that; the goal was just to try out the hats and learn from each other's ways of approaching them. For that, our perspective took a little too long and became a bit too complex, at least for beginners like us.
Wrap up
To wrap things up we had a more general discussion about the hats and format.
A few thoughts about the hats:
- In this scenario we had no problem coming up with ideas quickly, and the hats initially felt more limiting than helpful. In a situation where we were stuck, or had at least worked with the product a bit longer, the hats would probably have been more helpful. This might also change as we get better at using the hats, but only time will tell.
- Keep using the hats on items already identified would probably render interesting results. For instance first we could use the white hat to identify a data item, say cars in the Taxi app, then we could use the hats on this item to identify risks (lack of cars), opportunities (more efficient usage of cars), new data (various types of cars), feelings (lust to drive fast), "darkness" (breakage, with passenger in the car) and ideas (what if the cars were any kind of vehicle, like bicycles or trucks, what other users could then benefit from the app). Continuing to apply the hats on each new item added would help to dig really deep down.
- The hats would be interesting to use on the product elements or quality criteria categories described in the RST Appendices.
- You should be able to visualize the findings quite effectively using a mind map.
- The hats could be a great tool to help refining, populating or creating a model.
- Someone suggested it could be useful to help explaining the various parts of testing to, for instance, a programmer (not sure I understood this correctly but as I interpreted it, it could be used to help them understand the wider "find valuable information" perspective rather than the narrow "find faults" perspective of testing).
- If you're not stuck yet, starting with no particular hat and just adding stuff wherever it fits might be a more efficient way to start (this possibly changes when you're more used to the hats).
- The hats were helpful to find more non functional aspects to test.
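The recursive idea in the second bullet above could be sketched as a simple question generator. The hat-to-question mapping below is my own rough paraphrase of de Bono's hats, not something from the meetup, so adjust it to your preferred interpretation:

```python
# Rough, personal mapping from each hat to the question it prompts.
HATS = {
    "white":  "What facts and data do we have about {item}?",
    "red":    "What do we feel about {item}?",
    "black":  "What could go wrong with {item}?",
    "yellow": "What opportunities does {item} offer?",
    "green":  "What if {item} were something else entirely?",
    "blue":   "How should we organize our work on {item}?",
}

def hat_questions(item: str) -> list[str]:
    """Apply all six hats to one item. Each answer can become a new
    item, so the function can be applied again, one level deeper."""
    return [q.format(item=item) for q in HATS.values()]

for question in hat_questions("cars in the taxi app"):
    print(question)
```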
A few reactions to the format/execution itself:
- Generally people seemed to have liked the format a lot.
- We had some trouble understanding the app's usage/users. Using a known application (like MS Paint) or object (like a chair) might have been a better way to introduce the concept of the hats. On the other hand, using this app forced us to do some scouting and work with assumptions, which broadened the ways the hats could be used.
- Like I said before, the webinar was a cool way to mix it up, but personally I missed the regular chatter while eating. Still very cool, so a shorter webinar might be perfect.
Finally it was fun to just study my group during the hands-on part. Let's just say we had very different ways to attack the problem (relentless scouting, analyzing the input information or quickly getting a set of data to work with for instance).
Thanks!
Thanks to David Månsson, Aveso, who facilitated this meetup in a kick-ass way! Also thanks to all the participants for sharing interesting thoughts. Looking forward to future experiments; this one was definitely a success in my book!
12 February 2013
This is what I do, I'm a tester
We have an application we want to create.
First we have Angie. Angie decides how the application is planned to function, look etc. based on users' expectations and demands (sometimes she's just guessing, but she's good at that as well). She communicates her plan to Bob, Cody and Dora and lets them have a say about the ideas. Cody is the programmer (of course); he makes things happen by writing incorrect English with very many special characters (also known as programming). But Cody needs help to make his cool stuff look cool. That's why he loves Bob, the designer. They are usually trash-talking each other, but deep within it's pure love.
Cody and Bob do their work based on Angie's plan, but sometimes they, intentionally or unintentionally, drift away from it. Intentionally could, for example, mean that Cody realizes the plan conflicts with how the rest of the application is coded and simply tweaks the new code to fit the old, while unintentionally could simply be a misinterpretation. Also, many things are not explained in Angie's plan, so Cody and Bob have to take initiative in order not to constantly disturb Angie, as well as to make their work creative and fun. And Angie doesn't master design and code the way Bob and Cody do, so she has to trust their judgement.
Finally we have Dora. She looks at what Bob and Cody have created to get as much information as possible about the application's shape. Bad shape in this case can come in many forms (list definitely not complete):
- Stuff is simply wrong
2+2 is not 9, and trying to log in with an empty password shouldn't crash the application.
- Stuff seemed good on paper but in reality it sucks
A popup confirming you've made a change was a great idea until you updated 2000 items at once.
- Stuff is not consistent
Angie likes popups while Cody likes CSS boxes. This can become a problem, especially when there's not one but twenty Codys, all with their own preference.
- By-products might be a problem
A global search was a great feature until it was discovered that logging in now takes 10 minutes instead of 0.3 seconds due to search indexing.
- Some desired features weren't anticipated
"These short service messages seem really useful, maybe we should let users send them to each other?" "SMS? Useful? Well, I guess we could do that..."
- Bad people can do bad stuff
(Picture from xkcd.)
- Stuff simply didn't solve whatever stuff was supposed to solve
The new wizard (not the cool magical kind, the nasty GUI kind) was supposed to simplify the creation of new documents, but now the user is presented with so many options they could earlier ignore that it's just insanely complex! What the heck is a document structure template, I just want a new document, damn it!
Dora looks for all of this to help Bob and Cody (and Angie) look good, so that users (and managers) won't come with pitchforks trying to end their lives. She also helps stakeholders like Angie and others make better decisions by providing them with as much valuable information as possible about the application's shape. To make sure good decisions about the application can be made, it's important that Dora, just like Angie, is a great communicator.
Some people think Dora's job is about making sure 2+2 is 4. They are the same people who wouldn't care if the meat they buy is rotten as long as it's packaged according to spec. Others think Dora is a sadistic mistress, which she might be, but not during working hours. If she were, she would just stay quiet about the problems found and watch the programmers run in panic as users storm the building with their pitchforks.
- You just want to see stuff break, Dora!
- No, but I'd rather see it break here than in the face of thousands of users
I'm Dora! Or well, I'm not, but this is what I do, I'm a tester!
04 February 2013
Improving value from reading pt1: The School for Skeptics
Intro
"Improving value from reading" is a new series in this blog that I hope will be successful enough (as in value for me) to not just make it past the first post.
Some background...
One of many observations during my 2012 reflections was that I didn't seem to get much out of reading books. In this series I will try to use/apply the content of what I read, to hopefully change that.
The book
First out is the Swedish book Skeptikerskolan ("The School for Skeptics"). The book is divided into two parts: first an introduction to fallacies with examples, then a practical part with illustrations of how to practice skepticism when discussing religion, politics, news etc. I will only focus on the first part, by providing test-related examples for each fallacy in the book.
For those of you familiar with skepticism, please help me correct my examples as well as my English translations of definitions and how they are used. I've tried my best but don't want to spend too much time just getting the translations correct.
Fallacies
Appeal to Authority
There's no value in certifications, if you don't believe me just listen to Michael Bolton!
Personal Attack
James Bach is a high school dropout; he's in well over his head trying to challenge well-established test methodologies.
False dilemma
Either we have test cases or we'll have no idea of what's been tested!
Appeal to consequences
I clicked the login button and everything crashed so there's something wrong with the login code.
The straw man
Exploratory testers just care about their own work. From their point of view no one else should know anything about the tests or testing status.
Argument from ignorance
We've never figured out why the node is crashing but it must be due to some OS bug.
Personal incredulity
I can't see how this patch would cause the performance issues we see, it must be something else.
Tu quoque
The senior testers don't care about following the test process so I can ignore it as well.
Comment, seems like the definition online for Tu quoque doesn't correspond well to the explanation in the book (but the term is used), is this a valid example?
Does not follow
We found a lot of bugs testing the user administration, there's no use to continue testing, this piece of software is just crap.
Final consequences
We need to embrace context driven testing if we ever want to reach an efficient test process.
Begging the question
Test cases make our testing better so we need them.
Simultaneous
During traffic there is often screen flicker so they must be related.
Confusing currently unexplained with unexplainable
We've tried various ways to replicate the bug but failed, it's irreproducible.
Slippery Slope
They're trying to automate the smoke tests, soon we'll have to write automated scripts for every test we do.
Ad hoc explanation
- TMap can't improve any test process
- It improved my team's test process
- It can't improve any professional test process
No true Scotsman
- TMap can't improve any test process
- It improved my team's test process
- What you call test process is not really a test process
Argument to the stick
If you don't document your tests this project will fail miserably.
Appeal to Popularity
With over 200 000 certificates issued there's no doubt ISTQB is a must have for testers.
Appeal to Nature
Exploring software is just the natural way to test, it must be more efficient.
Guilt by association
Stop questioning his ideas, don't forget he was the brain behind both our previous successful releases!
"Improving value from reading" is a new series in this blog that I hope will be successful enough (as in value for me) to not just make it past the first post.
Some background...
One of many observations during my 2012 reflections was I didn't seem to get much out of reading books. In this series I will try to use/apply the content of what I read to hopefully change that.
The book
First out is the Swedish book Skeptikerskolan ("The School for Skeptics"). This book is divided into two parts, first an introduction to fallacies with examples when a practical part with illustrations of how to practice skepticism when discussing religion, politics, news etc. I will only focus on the first part by providing test related examples for each fallacy in the book.
For those of you familiar with skepticism, please help me correct my examples as well as my English translations of definitions and how they are used. I've tried my best but don't want to spend too much time just getting the translations correct.
Fallacies
Appeal to Authority
There's no value in certifications; if you don't believe me, just listen to Michael Bolton!
Personal Attack
James Bach is a high school dropout, he's in well over his head trying to challenge well established test methodologies.
False dilemma
Either we have test cases or we'll have no idea of what's been tested!
Appeal to consequences
I clicked the login button and everything crashed so there's something wrong with the login code.
The straw man
Exploratory testers just care about their own work. From their point of view no one else should know anything about the tests or testing status.
Argument from ignorance
We've never figured out why the node is crashing but it must be due to some OS bug.
Personal incredulity
I can't see how this patch would cause the performance issues we see, it must be something else.
Tu quoque
The senior testers don't care about following the test process so I can ignore it as well.
Comment: the definition of Tu quoque I find online doesn't correspond well to the explanation in the book (though the term is used there). Is this a valid example?
Does not follow
We found a lot of bugs testing the user administration; there's no point in continuing to test, this piece of software is just crap.
Final consequences
We need to embrace context driven testing if we ever want to reach an efficient test process.
Begging the question
Test cases make our testing better, so we need them.
Simultaneous
During traffic there is often screen flicker so they must be related.
Confusing currently unexplained with unexplainable
We've tried various ways to replicate the bug and failed, so it must be irreproducible.
Slippery Slope
They're trying to automate the smoke tests, soon we'll have to write automated scripts for every test we do.
Ad hoc explanation
- TMap can't improve any test process
- It improved my team's test process
- It can't improve any professional test process
No true Scotsman
- TMap can't improve any test process
- It improved my team's test process
- What you call test process is not really a test process
Argument to the stick
If you don't document your tests this project will fail miserably.
Appeal to Popularity
With over 200 000 certificates issued, there's no doubt ISTQB is a must-have for testers.
Appeal to Nature
Exploring software is just the natural way to test, it must be more efficient.
Guilt by association
Stop questioning his ideas, don't forget he was the brain behind both our previous successful releases!
27 January 2013
The year I became a passionate tester, part VI
December
The only (but really cool) event that occurred was Securitas Direct/Verisure's Christmas party. It was hosted in their beautiful office in Malmö. Can't say I'm jealous, I mean, who wants a workspace with an ocean view in three directions? Well, maybe a little, but at least I got over it and had a great time with future colleagues, including one of the two guys who'll be on my dream team in Linköping (not putting pressure on anyone). The only new year's resolution I'll make, by the way, is coming dressed up next year (costume party).
Apart from that, December was mainly a month of reflection, Twitter and family life. Via Twitter I started to get in contact with some really cool testers like Jari Laakso, Kristoffer Nordström and Jean-Paul Varwijk. Via reflections I started to grasp/understand/better appreciate my progress as a tester this year (reason for this blog post series) as well as wrap up the work done by me and my colleague at Ericsson. Cool things that need separate post(s) to really do them justice.
Wrap up of 2012
So, 2012 was an amazing year. When it started I considered testing an activity that mainly demanded endurance, patience and the ability to very rigorously read technical documents (all skills/traits I'm definitely not famous for, at least not in a good way). Now that 2012 has come to an end I instead value a whole bunch of other skills/traits like observation, creativity, general system knowledge, analysis and communication. I now find testing adventurous and exciting rather than repetitive routine work.
I've also met a ton of amazing people! Some have been thanked already but of course there are tons of you who I've not even mentioned. I will not attempt to list you all, instead THANKS TO ALL OF YOU WHO, IN ONE WAY OR ANOTHER, HAVE HELPED ME ON MY JOURNEY!
Some special mentions left to do:
EAST
EAST is a local open network (Linköping area) for testers, started by Johan Jonasson and Johan Åtting. When I started to look at its role in all this I realized:
- EAST was the place where I got convinced I needed to attend RST.
- EAST was the reason I picked up LinkedIn, and without LinkedIn I would not have come in contact with Maria Kedemo (new job).
- Speaking about job, I met Johan Åtting via EAST, that was the other amazing job offer this year.
- EAST was one of the big reasons I started blogging (first test related post is about my first EAST meetup).
- Johan Jonasson, Johan Åtting, Joel Rydén and other EAST participants have all been role models and sources of inspiration, key to last year's events.
- EAST became the crucial link between the testing world existing at my work and the testing world existing in the rest of the world.
I could go on but let's just say I'm forever grateful for EAST and I hope I can inspire/help other participants to start similar journeys. Thanks to all who have participated or in other ways contributed!
Blogging
At the start of 2012, blogging to me was more or less just a poor narcissistic attempt to make something hopelessly uninteresting interesting, or something used by the top elite within a business to share their ultimate knowledge about something. At the end of 2012 I tell everyone who wants to progress to blog, not for anyone else's sake (that day will hopefully come) but for their own personal development. Blogging has taught me the value of reflection and greatly improved my ability to put thoughts into words (which is essential to understand, analyse and explain something). It has generated new perspectives, changed priorities and given me a more sensible view on some things, and it has improved my English, my writing skills and my "technical storytelling".
Later when some of my blog posts became interesting to other testers it became a great way of getting in contact with other testers, getting feedback and acting as a portfolio. From my heart I urge you to try it out, not, to repeat myself, to become famous but to develop yourself!
Work
I undertook an amazing adventure at work that just didn't fit the format of this series. It didn't include dragons, fairies and elves but, considering how we traditionally have worked with testing, it sometimes felt just as exotic. Some day I'll blog about it, but for now I just want to thank my other partner in crime, Saam Eriksson, a passionate, smart tester I hope will continue and further improve our work now that I'm leaving Ericsson (3 working days left). You've meant a lot to me and my development this year, Saam. Thank you!
Everything comes with a price, dearie
24 hours a day, that's one of the few hard constraints we have in life. During these hours we should rest, eat, take care of family and friends, work, take care of ourselves, develop/follow dreams, follow through on commitments, manage everyday activities (pay bills, clean up, get to and from work etc.)... and probably other things I've missed. This constraint is a huge pain in the butt, especially if you, like me, have very many things you love and want to explore.
A few thoughts and lessons from 2012 about managing this puzzle:
- If you have kids, make sure you put them in the center of the puzzle.
- The puzzle won't (in a good way) solve itself when there is an overflow of activities.
- Each piece can be optimized (value/quality per time unit) by observing, reflecting, analyzing and prioritizing, but you need a balance there as well (premature optimization is the root of all evil, you know).
- The usually extrovert me quickly becomes introvert (not in the positive way) when I don't feel I have enough "me-time" and that affects all parts of my life.
- It's easy to be blinded by your own passion but don't forget there are other things in life like friends, family and yourself.
This year couldn't have come to a better start as my two fantastic boys got the most amazing little sister 11:10pm, January 1st.
Things I know will happen 2013:
- New amazing job at Securitas Direct/Verisure!
- Let's Test 2013!
Things I feel motivated to do now and thus might do during 2013:
- BBST Foundation course
- Reclaim my presentational skills
- Connect with more testers on Skype
- Figure out how to get more out of reading (or start prioritizing other learning activities over reading)
- Be a speaker at some testing conference
- Try Skype coaching (as a student, had a taste of it thanks to Peksi)
Things I aim to continue with:
- Be an active blogger
- Be active in EAST
- Practice/play/experiment with testing
- Experiment with my learning
- Meet more testers in person
Finally, and foremost, I will continue enjoying my life as a dad!
Blog series wrap up
Thank you for reading this; if nothing else it has inspired and taught me a lot! If you want to start a journey similar to mine, feel free to contact me for help, tips or mentoring via comments on this blog, Twitter (@brickuz) or any other way you figure out.
Finally there are two more people I need to thank:
First, my fiancée. Among the million reasons I have for that; thanks for supporting me, inspiring me and being an awesome mom to my kids! Thank you, thank you and thank you!
Second, myself. This was at times a rough ride where I had to stand up and take responsibility for my actions to be able to continue, make sacrifices, take risks and really challenge both my ego and fears to succeed. Thank... me... for this journey!
2013, here I come!