14 April 2013

ConTest (meetup) - Security Testing

What is ConTest?
ConTest (link in Swedish) is a local test meetup in Malmö, Sweden, started by Henrik Andersson (please correct me if I'm lying). Each meetup has a theme and participants provide content by sharing short presentations/lightning talks (approx 5 mins), each followed by a facilitated discussion.

First Talk: Martin Thulin: Learning Security Testing
Martin, just like me, started exploring the world of security testing quite recently. In his talk he went through his top three resources for self-education in security testing:
  • Google security blog
    General information about online security
  • OWASP
    Great cheat sheets, basic security information, list of security testing tools and much much more.
  • Hack this site
    Step by step hacking challenges used for practice/learning.
But maybe most importantly he shared the message:
"Anyone can become a hacker"

Great topic and great presentation!

In the discussion that followed, tons of resources came up. I won't list them all here; instead I urge you to check out the #foocontest hashtag on Twitter.

Second Talk: Me: Quick Tests in Security Testing
Quick tests are cheap (quick, simple, low resource), general tests that usually indicate the existence, or potential existence, of a common problem or group of problems. Even though they rarely prove the absence of a certain problem, they can often be a good start to help you highlight common problems early (disclaimer: this definition/description of quick tests might not match James Bach's and Michael Bolton's, used e.g. in RST).

I spoke about two quick tests (for web) I've used:
  • Refresh
  • Cocktail string
Refresh means that whenever I find a page that seems to make a lot of database calls, do heavy calculations or connect to external servers (like an email or SMS server), I simply press and hold F5 (refresh) for a short while. What I'm looking for is database errors, any changes in content/design, error messages in general and, finally, fully context-dependent stuff like received mails when the page is calling an email server, or alarming patterns in logs.

In practice I've used this to take down a couple of databases (or rather the connections to the databases). Simple and effective. Credits, by the way, to Joel Rydén who taught me this.
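
The refresh quick test above can be sketched as a tiny harness. The error markers and the `fetch` callable are my own assumptions; you would adapt both to your stack:

```python
import concurrent.futures

# Substrings that often betray a failing backend; extend for your context.
ERROR_MARKERS = ("database error", "sqlexception", "timed out", "500 internal")

def looks_broken(body: str) -> bool:
    """True if a response body contains a common error indicator."""
    lowered = body.lower()
    return any(marker in lowered for marker in ERROR_MARKERS)

def hammer(fetch, times=50, workers=10):
    """Fire `times` near-simultaneous requests via `fetch()` (a callable
    returning the page body) and return the suspicious-looking bodies."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        bodies = list(pool.map(lambda _: fetch(), range(times)))
    return [body for body in bodies if looks_broken(body)]
```

In practice `fetch` would wrap an HTTP GET to the page under test; anything `hammer` returns is worth a closer look, along with the server logs.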

The idea with the Cocktail String is described in a separate post.

As a bonus I can provide you with a few others:

  • Scramble cookies
    Just quickly change the contents of the cookies a page creates. Try to generate errors by e.g. using strings where a number is expected, try other plausible values (0 is always interesting, negative values as well) and combine with the cocktail string.
  • Back button
    When viewing sensitive data, log out and try pressing back. Is the data visible? Common and scary bug in some systems (systems expected to be used on any shared computer/handling very sensitive data), irrelevant in others (remember browsers deal with caching differently).
  • Bob
    When creating an account, first try the password "bob". It's so insecure very few systems should allow it (but many do).
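
The cookie-scrambling idea can be sketched as a small mutation generator. The specific mutations below are my own illustrative picks, not an exhaustive list:

```python
COCKTAIL = "'\"$%>*<!--"

def scramble(value: str):
    """Yield mutated cookie values worth trying for a given original value."""
    yield "0"                   # zero is always interesting
    yield "-1"                  # negative values as well
    yield COCKTAIL              # combine with the cocktail string
    if value.isdigit():
        yield "not-a-number"    # a string where a number is expected
        yield str(-int(value))  # negated original
```

You'd set each mutated value back on the request (e.g. via your HTTP client's cookie jar) and watch for errors or odd page changes.
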
Final Talk: Sigurdur Birgisson: Security, Usability... huh?
I was a bit disappointed by Sigge's talk, not because it was bad (it was actually awesome) but because I was hoping he would share something really smart about how to deal with the often conflicting wishes of security and usability (like captchas). It also had very little to do with security testing, but who cares...

So what was it about? Sigge talked about how he thought the quality characteristics (the CRUSSPIC STMPL mnemonic) were often interpreted as too abstract when talking to stakeholders. Instead he used the Software Quality Characteristics published on the Test Eye. He printed a card for each "sub characteristic" and, for best effect, tried to add matching examples from the product examined. The goals were to get the cards (characteristics) prioritized to aid the testing focus, to support a healthier discussion of what the product needed to do and to make stakeholders care about them (it's not just about new features).

- It must be quick, quick is above everything else!
- What about data integrity?
When I said that, the customer started to hesitate
// Sigge

Henrik Andersson also shared an interesting story related to this presentation, where he had gotten the top managers of a company to prioritize the original 8 quality characteristics (CRUSSPIC) in the context of a product, and used this both when testing and when reporting status. Brilliant as well!


There is a lot more to say about this presentation and I might get back to it in the future. For now, just check out Sigge's blog. Finally, it made me think of ways to improve my own ongoing prioritization work with my product owner; for that I'm really grateful!

Summary of ConTest
Great people, great mix of people, great discussions, great presentations, great facility, great to meet Sigge before his Australia adventure, great to meet Henrik Andersson for the first time, great to try ConTest's format and great to get new insights about security testing. ConTest was a blast! Thank you Malmö, Foo Café and all the ConTest participants!

11 April 2013

Cocktail Strings - Quick Test for Web Security Testing

This is a part of my talk at ConTest this evening. I started blogging about the whole event, but time is running out, so I'll start with this and provide you with the rest tomorrow.

Cocktail String?
A cocktail string is a form of quick test that can be used in security testing. The idea is a general string with a mix of ending characters and other stuff that can be used for MySQL injection, XSS and similar. Here's the example I provided:

'"$%>*<!--

The initial single and double quotes are there to screw up badly sanitized query strings or similar. The $ is to get an error in case the string is evaluated as PHP, and %> is to end a PHP or HTML tag. The star I don't know what it's for, but since it's often used as a wildcard in various situations I figured it might provide something. The ending <!-- is my secret weapon, as it safely, but with a cool effect, exposes the possibility to execute user-defined HTML (and thus potentially scripts like JavaScript).
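
To illustrate the <!-- effect, here is a deliberately unsanitized page template (hypothetical code, not from any real system): once the string is rendered verbatim, everything after the injected comment opener is hidden by the browser.

```python
COCKTAIL = "'\"$%>*<!--"

def naive_render(username: str) -> str:
    # No escaping at all -- exactly the kind of code the cocktail string exposes.
    return "<p>Hello " + username + "</p><nav>menu</nav><p>Rest of page</p>"

page = naive_render(COCKTAIL)
# The injected <!-- opens an HTML comment that is never closed, so the
# browser renders nothing after the greeting: half the page goes blank.
```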

Where to use?
The string, or variations of it, can be used wherever users can send data to a server: in forms (especially look for hidden fields named "id" and similar), cookies, file uploads, HTTP requests etc.

What are you looking for?
Exactly what to look for depends on context but here are a few suggestions:
  • Any kind of errors like no database connection, faulty query errors, code errors etc.
  • Garbage being printed on the page
  • Any irregularities on the page (even subtle design changes like an extra line break)
  • Check HTML code for occurrences of the string
  • Check for errors in any logs
  • If used in combination with databases, check the database content
  • Check cookie content
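
A simple check for the "occurrences of the string" item above: compare the raw string against the returned HTML. If the application escaped the input properly (simulated here with Python's `html.escape`), the raw string no longer appears:

```python
import html

COCKTAIL = "'\"$%>*<!--"

def reflected_unescaped(body: str) -> bool:
    """True if the raw cocktail string appears verbatim in the HTML,
    i.e. the input was reflected without being escaped."""
    return COCKTAIL in body

# Two hypothetical responses: one reflects the input raw, one escapes it.
vulnerable_page = "<p>Hello " + COCKTAIL + "</p>"
safe_page = "<p>Hello " + html.escape(COCKTAIL, quote=True) + "</p>"
```

This only catches verbatim reflection; partial filtering or encoding tricks still need a human eye on the page and the HTML source.
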
Actual examples
I named my user this on a server application. Everything was fine until I started the administration interface; suddenly I couldn't see anything but a few garbage characters on a white background. The reason was that the user's name was sanitized everywhere except on the admin pages.

Another similar example was when I used it as the name for an item occurring in a log, blanking out everything showing up after that log entry.

Finally I used it on a friend's project to highlight MySQL injection possibilities.

Ways to improve it
Many great suggestions to improve the string came up during the meetup. One was to add Unicode versions of certain characters, since this, for instance, fools some sanitation functions built into PHP as well as other sanitation plugins. Another was to add international characters, including stuff like Arabic and Japanese. Finally, Mattias suggested using an iframe instead of the comment tag at the end, since loading a scary page is even more striking than blanking out half the page (as well as actually proving a really destructive bug exists). Cred to a lot of people for all the great suggestions (especially Simon for Unicode, I'll add that for tomorrow's testing!).

Finally, notice you'll have to change this string to fit your context; for instance, a string used on an ASP.NET application or a Java desktop application looks different.

Automation
One interesting comment that came up was automating this (which, by the way, is typically what security scanners already do). First off, it's well suited for random automation that sends various combinations of dangerous characters into every field it finds and compares, for instance, the HTML output to a control. It might miss stuff and give a lot of false positives, but it's still a great way to work with it. Automation also addresses one of the weaknesses of the string: since it uses so many commonly filtered characters, the whole string might get filtered out because of one character while a couple of the others would have worked in isolation (this is typically what I mean when I say quick tests rarely provide evidence that a certain problem can't exist). With the speed of automation (assuming you achieve that) you could test the characters one by one as well as in more combinations.
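
The one-by-one idea can be sketched as a payload generator that emits single characters before pairs, so one filtered character can't mask another that would have worked on its own (the character set below is an assumption based on the string above):

```python
import itertools

# Characters/tokens from the cocktail string, tried individually first.
DANGEROUS = ["'", '"', "$", "%", ">", "*", "<!--"]

def candidate_payloads(max_len=2):
    """Yield payloads of increasing length: all singles, then all pairs."""
    for n in range(1, max_len + 1):
        for combo in itertools.permutations(DANGEROUS, n):
            yield "".join(combo)
```

Each payload would be sent into a field and the response compared to a clean control; automation speed makes the combinatorial blow-up bearable.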

Summary
So a bit of a messy post but long story short:
Instead of Bob, name your next test user '"$%>*<!-- and let it live in the lovely city of '"<!-- with interests in '"$% and >*<.

Good night!

09 April 2013

EAST meetup - talking metrics

Yesterday's EAST meetup was focused on test metrics, starting with us watching Cem Kaner's talk from CAST 2012 and then continuing with an open discussion on the webinar and metrics in general.

Cem's talk
Cem talked about the problem of setting up valid measurements for software testing, and the work he presented on threats to validity in measurements seemed very interesting (you can read more about that in the slides). He also talked about problems with qualitative measurements:

  • Credibility: why should I believe you?
  • Confirmability: if someone else analyzed this, would that person reach the same conclusion?
  • Representativeness: does this accurately represent what happens in the big picture?
  • Transferability: would this transfer to other similar scenarios/organizations?

So far so good.

But Cem also, as I interpreted it, implied that all quantitative measurements we know of are crap, but that if a manager asks for a certain number you should provide it. In this case I agree we testers in general need to improve our ability to present information but, as I will come back to, I strongly disagree with providing bad metrics, even after presenting the risks with them.

How do you measure test in your company?
Everyone was asked: How do you measure test in your company? The most common answer was happy/sad/neutral smilies in green/red/yellow, which was quite interesting since it relates closely to emotions (at least as it's presented) rather than "data".

The meaning of the faces varied though:

  • Express progress: start sad and end happy (="done")
  • Express feelings so far: the first two weeks were really messy, so we use a sad face even if it looks more promising now
  • Express feelings going forward: good progress so far, but we just found an area that seems really messy, so sad face (estimate)
In most cases some quantitative measures were used as input but weren't reported.

My personal favorite was an experiment Johan Åtting talked about, where a smiley representing the "mood" so far (refers to item two in the list above) is put on a scale representing perceived progress (see picture). It seemed like a straightforward and nice visual way to represent both progress and general gut feeling. The measurements of progress in this case were solely qualitative, if I understood correctly, but it would work if you prefer a quantitative way of measuring as well.


There were also a couple of interesting stories, both from the great mind of Morgan. The first was an example of a bad measurement, with managers basing their bonuses on lead time for solved bug reports. This was fine as long as developers had to prioritize among incoming reports, but when they improved their speed this measurement dropped like a stone (suddenly years-old, previously down-prioritized bugs were fixed) and managers got upset.

The second story was about a company where developers and "quality/production" (incl. testers) were separated. Testers in this case tested the components sent from developers in isolation before they went to production. However, when the components got assembled, tons of problems arose, problems customers reported back to the developers without quality/production knowing it. This led to a situation where quality/production couldn't understand why managers were upset about bad sales; the product was fine in their view. The situation improved when Morgan started to send the number of bugs reported by customers (a bad, quantitative measurement) to the quality/production department.

An interesting twist came later when they tried to replace the bad quantitative measurement with something more representative and met a lot of resistance, since management had learned to trust this number. I asked him if he would have done anything differently in retrospect, but we never got to an answer.

I shared a creative test leader's way of dealing with numbers. She reported certain information (like test case progress and bug count) but removed the actual numbers. So when presenting for upper management she simply had visual graphs to demonstrate what she was talking about. As far as I know this was well received by all parties.

Finally an interesting comment from Per saying: "Often I find it more useful to say; give us x hours and I can report our perceived status of the product after that".

Epiphanies (or rather interesting insights)
During the discussions I had a bunch of interesting insights.
  • Measurements are not a solution to trust issues!
  • Instead of saying "we have terrible speech quality" or "the product quality is simply too bad", we could let the receiver listen to the speech quality or demo the aspects of the product we find bad. It's a very hands-on way to transfer the information we've found.
  • Ethics is a huge issue. If we want someone to believe something we can (almost) always find compelling numbers (or analysis for that matter).
  • Measuring progress or making estimations will not change the quality of the test's/product's state after a set amount of time (just a reminder of what you don't achieve, and of the cost of measurements).
  • If a "bad measurement" can help us reach a better state in general (like in Morgan's example), is it really a bad measurement? (touching on ethics again).
  • When adding quantitative measurements to strengthen our qualitative ones, are we digging our own grave? (risk of communicating: so it's not my analysis/gut feeling that matters, it's the numbers)
  • How a metric is presented is often more important than the metric itself.
  • In some cases brought up, the measurements weren't even missed once they were no longer reported.
  • Don't ask what reports or data someone wants, ask what questions they want you to answer.
  • Cem talked about the transfer problem for students, making it hard for them to understand how a social science study can relate to, for instance, computer science. I think the same problem occurs when we move testing results into a world steered by economics (numbers).
  • Even bad measurements might be useful to highlight underlying problems. Once again, Morgan's examples somewhat show this, and Per talked about how more warnings in their static code analysis was an indication programmers might be too stressed (if I interpreted it correctly). In these cases it's important to state what the measurement is for and that it's just a potential indication.
Measurement tampering
We talked about how we easily get into bad habits/behavior when quantitative measurements become "the way" to determine test status.

Bad behavior when measuring test case progress:
  • Testers saving easy tests so they have something to execute when the progress is questioned
  • Testers running all simple tests first to avoid being questioned early on
  • Testers not reporting the actual progress, to avoid putting too much pressure on themselves or to fake progress to calm people down.
  • Testers writing tons of small, simple tests that typically test functionality individually, which creates a lot of administrative overhead as well as a risk of not testing how components work together, all to ensure a steady test case progress.
  • Test leaders/managers questioning testers when more test cases are added (it screws up "progress"); as a result, testers avoid broadening the test scope even when it's obviously flawed.
Bad behavior when measuring pass/fail ratio:
  • Ignoring bugs occurring during setup / tear down.
  • Slightly modifying a test case to make it pass (making it conform with actual result rather than checking if that's a correct behavior).
  • Slightly modifying a test case to make it pass (removing a failing check or "irrelevant part that fails")
Culture
We also talked about how the culture (not only related to test measurements) in various countries affects testing. One question was: why is CDT so popular in Sweden? Among the answers were low power distance (Geert Hofstede, cred to Johan Åtting), the Law of Jante and that we're not measured that much in general (e.g. late grading in schools, no grading in numbers etc.).

We also talked about drawbacks of these cultural behaviors (like it being hard to reach decisions since everyone should get involved/agree).

Finally, and mostly, we talked about how our view on testing and measuring sometimes collides with other cultures, with actual examples from India, Poland and the US. This discussion was a bit fragmented but feel free to grab me if you want to hear about it.

Summary
This was a great EAST meetup and I really feel sorry for the guys I know who like this topic but couldn't attend. Definitely a topic I hope (and think) we'll get back to!

Finally, a lovely quote from Magnus regarding a private investigator determining the time needed to solve a crime:
"Let's see, there are 7 drops of blood, this will take 3 weeks to solve".

Good night! :)

10 March 2013

EAST meetup - Trying a new format

What is EAST
Short version: EAST is a local network for software testers in the area I live (Linköping, Sweden). About once a month we meet to share thoughts, learn and socialize/network. If you want to know more you can read my post from my first EAST meetup.

New Format / Concept
Most previous meetups have followed the format: start with food and socializing, then 1-2 presentations, each followed by discussions. A few other formats have been used, like letting each participant briefly describe how testing is performed at his/her company (to try to get everyone involved), testing games, and the pub evening with only beer and socializing. This time some hands-on testing was on the menu.

Webinar
As usual we started with food, but instead of casually chatting with each other we watched Six Thinking Hats for Software Testers by Julian Harty (referring to Edward de Bono's thinking hats) together. The webinar was interesting but I missed the social part. Maybe a 5-15 minute webinar would have worked better for me.

Hands on
The second part was hands-on. First David, the facilitator, quickly presented the application we were to test and how to get it. The application in this case was an app for taxi drivers to help them keep track of their work (part of a full system for taxi companies). Then we would, in groups or alone, test the application using the six thinking hats in any way we saw fit.

At first a former colleague from Ericsson and I started out exploring/scouting the app, but we soon felt we didn't get any momentum and joined another group of three. A vivid discussion on how to use each hat and what fitted where ended with us being ready to actually start generating test ideas about 5 minutes before the deadline, but that didn't matter; a lot of interesting stuff came out of the discussions.

Sharing results
When we were back together we started sharing ideas and experiences from the hands-on part. It immediately became clear that the various groups had used the hats very differently.

We attacked the whole app and used the various hats from the perspective of an actual user (or rather our made-up view of what it's like to be a taxi driver; lesson learned: a better understanding of the user, for instance their needs, knowledge and general behavior, would have helped a lot). This perspective gave us an interesting high-level view, but it was a bit hard to stay focused as we often drilled down too deep, exposing too many details. Others focused on a smaller part of the app, which, in this case, seemed a lot more efficient.

Of course, how to use the hats would depend on the mission in a real-life scenario, but in this case the mission was not set like that; the goal was just to try out the hats and learn from each other's ways of approaching them. For this, our perspective took a little too long / became a bit too complex, at least for beginners like us.

Wrap up
To wrap things up we had a more general discussion about the hats and format.

A few thoughts about the hats:
  • In this scenario we had no problem coming up with ideas quickly, and the hats initially felt more limiting than helpful. In a situation where we were stuck, or had at least worked with the product for a bit longer, the hats would probably have been more helpful. This might also change as we get better at using the hats, but only time can tell how that turns out.
  • Keeping on using the hats on items already identified would probably render interesting results. For instance, first we could use the white hat to identify a data item, say cars in the taxi app; then we could use the hats on this item to identify risks (lack of cars), opportunities (more efficient usage of cars), new data (various types of cars), feelings (lust to drive fast), "darkness" (breakage, with a passenger in the car) and ideas (what if the cars were any kind of vehicle, like bicycles or trucks; what other users could then benefit from the app?). Continuing to apply the hats to each new item added would help to dig really deep.
  • The hats would be interesting to use on the product elements or quality criteria categories described in the RST Appendices.
  • You should be able to visualize the findings quite effectively using a mind map.
  • The hats could be a great tool to help refining, populating or creating a model.
  • Someone suggested it could be useful to help explaining the various parts of testing to, for instance, a programmer (not sure I understood this correctly but as I interpreted it, it could be used to help them understand the wider "find valuable information" perspective rather than the narrow "find faults" perspective of testing).
  • If you're not stuck yet, starting with no particular hat and instead just adding stuff wherever it fits might be a more efficient way to start (this possibly changes when you're more used to the hats).
  • The hats were helpful for finding more non-functional aspects to test.
A few reactions to the format/execution itself:
  • Generally people seemed to have liked the format a lot.
  • We had some trouble understanding the app's usage/user. Using a known application (like MS Paint) or object (like a chair) might have been a better way to understand the concept of the hats. On the other hand, using this app forced us to do some scouting and work with assumptions, which broadened the ways the hats could be used.
  • Like I said before, the webinar was a cool way to mix it up, but personally I missed the regular chatter while eating. Still very cool, so a shorter webinar might be perfect.
Finally it was fun to just study my group during the hands-on part. Let's just say we had very different ways to attack the problem (relentless scouting, analyzing the input information or quickly getting a set of data to work with for instance).

Thanks!
Thanks to David Månsson, Aveso, who facilitated this meetup in a kick-ass way! Also thanks to all the participants for sharing interesting thoughts. Looking forward to future experiments; this one was definitely a success in my book!

12 February 2013

This is what I do, I'm a tester

We have an application we want to create.

First we have Angie. Angie decides how the application is planned to function, look etc. based on users' expectations and demands (sometimes she's just guessing, but she's good at that as well). She communicates her plan to Bob, Cody and Dora and lets them have a say about the ideas. Cody is the programmer (of course); he makes things happen by writing incorrect English with very many special characters (also known as programming). But Cody needs help to make his cool stuff look cool. That's why he loves Bob, the designer. They are usually trash talking each other but deep within it's pure love.

Cody and Bob do their work based on Angie's plan, but sometimes they, intentionally or unintentionally, drift away from it. Intentionally could, for example, mean Cody realizes that the plan conflicts with how the rest of the application is coded and simply tweaks the new code to fit the old, while unintentionally could simply be a misinterpretation. Also, many things are not explained in Angie's plan, so Cody and Bob have to take initiatives in order not to constantly disturb Angie, as well as to make their work creative and fun. And Angie doesn't master design and code the way Bob and Cody do, so she has to trust their judgement.

Finally we have Dora. She looks at what Bob and Cody have created to get as much information as possible about the application's shape. Bad shape in this case can come in many forms (list definitely not complete):
  • Stuff is simply wrong
    2+2 is not 9 and trying to login with an empty password shouldn't crash the application

  • Stuff seemed good on paper but in reality it sucks
    A popup confirming you've made a change was a great idea until you updated 2000 items at once.

  • Stuff is not consistent
    Angie likes popups while Cody likes CSS boxes. This can become a problem, especially when there's not one but twenty Codys, all with their own preference.

  • By-products might be a problem
    A global search was a great feature until it was discovered that logging in now takes 10 minutes instead of 0.3 seconds due to search indexing.

  • Some desired features weren't anticipated
    "These short service messages seem really useful, maybe we should let users send them to each other?" "SMS? Useful? Well, I guess we could do that..."

  • Bad people can do bad stuff

    Picture from xkcd.

  • Stuff simply didn't solve whatever stuff was supposed to solve
    The new wizard (not the cool magical kind, the nasty GUI kind) was supposed to simplify the creation of new documents, but now the user is presented with so many options they could earlier ignore that it's just insanely complex! What the heck is a document structure template, I just want a new document, damn it!
Dora looks for all this to help Bob and Cody (and Angie) look good, so that users (and managers) won't come with pitchforks trying to end their lives. She also helps stakeholders like Angie and others make better decisions by providing them with as much valuable information as possible about the application's shape. To make sure good decisions about the application can be made, it's important that Dora, just like Angie, is a great communicator.

Some people think Dora's job is about making sure 2+2 is 4. They are the same people who wouldn't care if the meat they buy is rotten as long as it's packaged according to spec. Others think Dora is a sadistic mistress, which she might be, but not during working hours. If she was she would just stay quiet about problems found and see the programmers run in panic as users storm the building with their pitchforks.

- You just want to see stuff break, Dora!
- No, but I'd rather see it break here than in the face of thousands of users

I'm Dora! Or well, I'm not, but this is what I do, I'm a tester!

04 February 2013

Improving value from reading pt1: The School for Skeptics

Intro
"Improving value from reading" is a new series in this blog that I hope will be successful enough (as in value for me) to not just make it past the first post.

Some background...

One of many observations during my 2012 reflections was that I didn't seem to get much out of reading books. In this series I will try to use/apply the content of what I read, to hopefully change that.

The book
First out is the Swedish book Skeptikerskolan ("The School for Skeptics"). The book is divided into two parts: first an introduction to fallacies with examples, then a practical part with illustrations of how to practice skepticism when discussing religion, politics, news etc. I will only focus on the first part, providing test-related examples for each fallacy in the book.

For those of you familiar with skepticism, please help me correct my examples as well as my English translations of definitions and how they are used. I've tried my best but don't want to spend too much time just getting the translations correct.

Fallacies

Appeal to Authority
There's no value in certifications, if you don't believe me just listen to Michael Bolton!

Personal Attack
James Bach is a high school dropout; he's in well over his head trying to challenge well-established test methodologies.

False dilemma
Either we have test cases or we'll have no idea of what's been tested!

Appeal to consequences
I clicked the login button and everything crashed so there's something wrong with the login code.

The straw man
Exploratory testers just care about their own work. From their point of view no one else should know anything about the tests or testing status.

Argument from ignorance
We've never figured out why the node is crashing but it must be due to some OS bug.

Personal incredulity
I can't see how this patch would cause the performance issues we see, it must be something else.

Tu quoque
The senior testers don't care about following the test process so I can ignore it as well.
Comment: it seems like the definition of tu quoque found online doesn't correspond well to the explanation in the book (though the term is used). Is this a valid example?

Does not follow
We found a lot of bugs testing the user administration, there's no use to continue testing, this piece of software is just crap.

Final consequences
We need to embrace context driven testing if we ever want to reach an efficient test process.

Begging the question
Test cases make our testing better so we need them.

Simultaneous
During traffic there is often screen flicker so they must be related.

Confusing currently unexplained with unexplainable
We've tried various ways to replicate the bug but failed, it's irreproducible.

Slippery Slope
They're trying to automate the smoke tests, soon we'll have to write automated scripts for every test we do.

Ad hoc explanation
- TMap can't improve any test process
- It improved my team's test process
- It can't improve any professional test process

No true Scotsman
- TMap can't improve any test process
- It improved my team's test process
- What you call test process is not really a test process

Argument to the stick
If you don't document your tests this project will fail miserably.

Appeal to Popularity
With over 200,000 certificates issued, there's no doubt ISTQB is a must-have for testers.

Appeal to Nature
Exploring software is just the natural way to test, it must be more efficient.

Guilt by association
Stop questioning his ideas, don't forget he was the brain behind both our previous successful releases!

27 January 2013

The year I became a passionate tester, part VI

December
The only (but really cool) event that occurred was Securitas Direct/Verisure's Christmas party. It was hosted in their beautiful office in Malmö. Can't say I'm jealous, I mean, who wants a workspace with an ocean view in three directions? Well, maybe. At least I got over it and had a great time with future colleagues, including one of the two guys who'll be on my dream team in Linköping (not putting pressure on anyone). The only New Year's resolution I'll make, by the way, is to come dressed up next year (costume party).

Apart from that, December was mainly a month of reflection, Twitter and family life. Via Twitter I started to get in contact with some really cool testers like Jari Laakso, Kristoffer Nordström and Jean-Paul Varwijk. Via reflections I started to grasp/understand/better appreciate my progress as a tester this year (reason for this blog post series) as well as wrap up the work done by me and my colleague at Ericsson. Cool things that need separate post(s) to really do them justice.

Wrap up of 2012
So, 2012 was an amazing year. When it started I considered testing an activity that mainly demanded endurance, patience and the ability to read technical documents very rigorously (all skills/traits/knowledge I'm definitely not famous for, at least not in a good way). Now that 2012 has come to an end I value a whole bunch of other skills/traits/knowledge instead, like observation, creativity, general system knowledge, analysis and communication. I now find testing adventurous and exciting rather than repetitive routine work.

I've also met a ton of amazing people! Some have been thanked already but of course there are tons of you who I've not even mentioned. I will not attempt to list you all, instead THANKS TO ALL OF YOU WHO, IN ONE WAY OR ANOTHER, HAVE HELPED ME ON MY JOURNEY!

Some special mentions left to do:

EAST
EAST is a local open network (Linköping area) for testers, started by Johan Jonasson and Johan Åtting. When I started to look at its role in all this I realized:
  • EAST was the place where I got convinced I needed to attend RST.
  • EAST was the reason I picked up LinkedIn, and without LinkedIn I would not have come in contact with Maria Kedemo (new job).
  • Speaking of jobs, I met Johan Åtting via EAST; that was the other amazing job offer this year.
  • EAST was one of the big reasons I started blogging (first test related post is about my first EAST meetup).
  • Johan Jonasson, Johan Åtting, Joel Rydén and other EAST participants have all been role models and sources of inspiration, key to last year's events.
  • EAST became the crucial link between the testing world existing at my work and the testing world existing in the rest of the world.
I could go on but let's just say I'm forever grateful for EAST and I hope I can inspire/help other participants to start similar journeys. Thanks to all who have participated or in other ways contributed!

Blogging
Entering 2012, blogging to me was more or less just a poor narcissistic attempt to make something hopelessly uninteresting interesting, or something used by the top elite within a business to share their ultimate knowledge. At the end of 2012 I tell everyone who wants to progress to blog, not for anyone else's sake (that day will hopefully come) but for their own personal development. Blogging has taught me the value of reflection and greatly improved my ability to put thoughts into words (which is essential to understand, analyse and explain something). It has generated new perspectives, changed priorities or given me a more sensible view on some things, and it has improved my English, my writing skills and my "technical storytelling".

Later, when some of my blog posts became interesting to other testers, blogging became a great way of getting in contact with other testers, getting feedback and building a portfolio. From my heart I urge you to try it out; not, to repeat myself, to become famous but to develop yourself!

Work
I undertook an amazing adventure at work that just didn't fit the format of this series. It didn't include dragons, fairies and elves but, considering how we traditionally have worked with testing, it sometimes felt just as exotic. Some day I'll blog about it but for now I just want to thank my other partner in crime, Saam Eriksson, a passionate, smart tester I hope will continue and further improve our work now when I leave Ericsson (3 working days left). You've meant a lot to me and my development this year Saam, thank you!

Everything comes with a price, dearie
24 hours a day, that's one of the few hard constraints we have in life. During these hours we should rest, eat, take care of family and friends, work, take care of ourselves, develop/follow dreams, follow through on commitments, manage everyday activities (pay bills, clean up, get to and from work etc.)... and probably other things I've missed. This constraint is a huge pain in the butt, especially if you, like me, have very many things you love and want to explore.

A few thoughts and lessons from 2012 about managing this puzzle:
  • If you have kids, make sure you put them in the center of the puzzle.
  • The puzzle won't (in a good way) solve itself when there is an overflow of activities.
  • Each piece can be optimized (value/quality per time unit) by observing, reflecting, analyzing and prioritizing, but you need a balance there as well (premature optimization is the root of all evil, you know).
  • The usually extroverted me quickly becomes introverted (not in the positive way) when I don't feel I have enough "me-time", and that affects all parts of my life.
  • It's easy to be blinded by your own passion but don't forget there are other things in life like friends, family and yourself.
I survived the end of the world, here I come 2013!
This year couldn't have come to a better start as my two fantastic boys got the most amazing little sister at 11:10 pm on January 1st.

Things I know will happen in 2013:

Things I feel motivated to do now and thus might do during 2013:
  • BBST Foundation course
  • Reclaim my presentation skills
  • Connect with more testers on Skype
  • Figure out how to get more out of reading (or start prioritizing other learning activities over reading)
  • Be a speaker at some testing conference
  • Try Skype coaching (as a student, had a taste of it thanks to Peksi)

Things I aim to continue with:
  • Be an active blogger
  • Be active in EAST
  • Practice/play/experiment with testing
  • Experiment with my learning
  • Meet more testers in person

Finally, and foremost, I will continue enjoying my life as a dad!

Blog series wrap up
Thank you for reading this, if nothing else it has inspired and taught me a lot! If you want to start a similar journey like mine, feel free to contact me for help, tips or mentoring using comments on this blog, Twitter (@brickuz) or any other way you figure out.

Finally there are two more people I need to thank:

First, my fiancée. Among the million reasons I have for that; thanks for supporting me, inspiring me and being an awesome mom to my kids! Thank you, thank you and thank you!

Second, myself. This was at times a rough ride where I had to stand up and take responsibility for my actions to be able to continue, make sacrifices, take risks and really challenge both my ego and fears to succeed. Thank... me... for this journey!

2013, here i come!

23 January 2013

The year I became a passionate tester, part V

November 1st
Worth a special mention. The day was crazy for several reasons but three highlights:
  1. Securitas Direct asked for a second interview
  2. Invited to SWET 4
  3. My note taking post was retweeted by Michael Bolton (may sound silly but it was huge to me at that point)

RST (Rapid Software Testing)
RST started great, I was on top of my game, asking relevant questions, thinking, observing, reacting... all until the first break. I don't know exactly why but I lost track after that (had a great time anyway but was not happy with my own contribution to that).

When apologizing to James he responded:

I was going to say you should talk more.
First thing tomorrow you'll see something you haven't seen.
...
Well, don't worry about... or worry if you want, because I'm going to have you do the Mysterious Sphere problem.

I immediately scanned Google, RST blog posts and other resources but found nothing.

A good ten hours of sleep did the trick, and on day 2 I was on top of my game again! Just before lunch James turned to me and said something like:

After the break we'll see Erik here test the Mysterious Sphere, the only exercise that has made anyone cry during my class.

It was amazing! I still go "Ohhhh! That's why... I should have... what if... now I get it!" when I think about it. I'm forever grateful I got the chance despite my bad start!

Day 3 wrapped up an amazing experience! I left RST with my precious blue pouch (RST graduates will understand), new contacts, raised confidence in myself as a tester and 21 pages of sketches, ideas and epiphanies. Thanks James Bach, Tobbe Ryber, Robert, Niklas, Per, Tiago, Layer10 and all the others who made this experience possible/invaluable! If you haven't participated yet I suggest you check out David Greenlees' post on how he got to RST and schedule a meeting with your boss!

Get the most out of RST:

  • Get there well rested (important!); the course will demand a lot of you.
  • Look up the word heuristic to make sure you understand it.
  • Show you want to be challenged (be active), it'll make the course even better.
  • Check out the RST slides beforehand; I think that helped me stay focused on the discussions.
  • Don't bother printing the slides and bringing them to the course though, as slides will be skipped, the order changed and the exercise slides are hidden anyway.
  • Faking will only lead to problems.
  • Read up on the Socratic Method (a low-pressure example; expect more "pressure" in class)
  • Make sure you know how to get to the facility, have travel tickets in place etc.
  • Ask questions!
  • Drop the idea of finding the ultimate answer to the questions asked (important!). Most questions will not have one! Accepting that will help you both learn and contribute more.
  • Don't worry, things will just happen and you'll do great!
Thanks to Johan Jonasson, Joel Fridfjäll, Joel Rydén, David Månsson for contributing to the list together with James Bach, Michael Bolton and Paul Holland.

SWET 4: Intro
Please read Johan Jonasson's great post on SWET 4 for details on the event. Maria Kedemo has also shared some thoughts, especially check out the "what happens now" section, great reading!

After receiving my invitation I called Johan Jonasson:
- Am I really qualified for this?
- Go there and find out!

It was amazing! The presentations were interesting, the discussions that followed even more so, the lightning talks were cool and the people inspiring. I had the time of my life!

A funny thing: Maria Kedemo was presenting "Model based exploratory interviewing" or, in other words, her interviewing model when recruiting testers for Securitas Direct.

On the topic of switching roles: SWET started 2 days after RST ended, and it felt a bit weird to go from being James's "unknown student" at RST to becoming his "peer" at SWET. Weird as in a good, interesting way, I should add.

Want a quick reason why I think Securitas Direct and Sectra are two great companies for testers by the way? Both companies' test managers were among the 15 invited to SWET 4 (Johan Åtting couldn't come though and sent Joakim Thorsten instead).

I need a separate post to describe this crazy experience in any detail, but one piece of advice: if you ever get the chance, take it! I was really intimidated by the concept and by the merits of the other testers, but as I arrived I quickly understood why great testers love peer conferences: you (can) learn so much! Going was definitely one of the best decisions this year!

One final piece of advice: prepare your lightning talks. I changed topic ten minutes before walking up, and the result was messy.

Special thanks to Torbjörn Ryber, Henrik Emilsson and Rikard Edgren for arranging this amazing event as well as giving me the chance, and thanks to Martin Jansson, James Bach, Sigurdur Birgisson, Sandra Camilovic, Anna Elmsjö, Johan Jonasson, Maria Kedemo, Oscar Cosmo, Saam Koroorian, Simon Morley, Joakim Thorsten... and myself... for helping them make it amazing!

New job
After interesting discussions, cool exercises and talks with potential future colleagues it was time to make a decision. In the end it came down to details; both Securitas Direct and Sectra have amazing testers, culture, test managers and products. Finally I decided to go for Securitas Direct, choosing what I interpreted as the bigger challenge over Sectra's more helpful environment (all but one tester at Securitas Direct work in Malmö, not Linköping where I work).

Anyway, I think Joel Rydén (brilliant guy, test consultant at Securitas Direct and former employee at Sectra) summarized my situation quite well: "Whatever you choose it'll be a great choice". I'm thrilled to start my new job at the end of next week, something I'll talk more about in a later post!

Jiro dreams of sushi
This documentary is great enough to mention here. Watch it!

Kids
Finally... kids can smell when you really need sleep (like the day before RST and SWET). But at least that gives you a lot of time to hug them before you leave. Jokes aside, coming back from SWET 4 having my two boys rush into my arms was by far the best of all the cool things in November. Be passionate about testing but remember what's most important!

My boys pairing up to test/hack/crash a tablet.

Summary
November was crazy; it feels like I've only mentioned half the cool things that happened! Anyway, three key takeaways:
  • Don't worry about whether you're good enough or not, just get out there and learn until you are!
  • RST is a mind-blowing course that I recommend to every tester!
  • Practice putting thoughts into words (blog, reflect, join debates...), it's a critical skill for anyone, not just testers!

Starting level: Prepared
Finishing level: Self-recharging bomb of inspiration

16 January 2013

The year I became a passionate tester, part IV

October
I've covered January to September in three posts. October and November, however, were way too eventful to keep that tempo, so this post will only cover October... just in case you're puzzled by the headings (they are not months in some ancient calendar).

Work opportunities

Two days apart, immediately after I had made myself available to job offers, I was contacted by possibly the two most interesting companies I could think of (from what I could tell: great culture, interesting products, passionate testers, top-notch test managers and the right location). First, Maria Kedemo, test manager at Securitas Direct, asked me to consider a job opportunity they had announced several months before but not yet found a candidate for. Second, Johan Åtting, test manager at Sectra, and I had a discussion regarding my RST situation. Somewhere in that discussion he asked me to send an application. I was on cloud nine! I'll tell you how it turned out in part V...

Some thoughts on improving your chance of getting great job offers:

  1. Learn to test
    Practice, practice, practice
    Explore, experiment, play
    Read, watch and listen to great testers
    Get a mentor or ask for coaching
    Ask for help when you need it
    Courses, webinars, books
  2. Create a portfolio
    Start a test blog or website
    Share your story (or CV if you're boring like me)
    Share your work and experiences in whatever way you prefer
    Create a LinkedIn profile
    Create a Software Testing Club profile
  3. Broaden your network
    Be active in your local test community

    Socialize with testers on Twitter
    Start connecting with testers on Skype
    Go to a conference
    Be open to receive mentoring, help or coaching
  4. Get some reputation
    Be a presenter

    Accept challenges (like testing puzzles)

    Mentor or coach testers

    Answer questions on, for instance, test forums

    Make relevant comments on blog posts and articles

    Host a meetup, conference or similar
Course
Instead of the Rapid Software Testing course, my employer sent me to a centrally purchased course arranged by test researchers from MDH. As I didn't share the course instructors' opinions about testing, it provided a great opportunity to practice challenging ideas as well as arguing for my own. Some examples of things I challenged were the claim that a model has to be an actual simulation (James Bach would go bananas if he met the instructor for that part of the course), how exploratory testing is presented in the agile testing quadrants (see picture below) and the idea of having automation as a goal. To give the course some credit, it did teach me the theory and formal names of some things I already used, like equivalence partitioning.


The agile testing quadrants; I stay skeptical of this...

In the end I got a certificate, which I would have received even if I had sat quiet, understanding nothing, for four days (unrightfully, it wasn't in endurance).


Some thoughts on how to get more value out of a (testing) course:
  • Ask questions as soon as you don't understand something or how something is relevant
  • Reflect on why you disagree with ideas presented and try to express it (challenge)
  • Practice constructive feedback on the instructors
  • In your mind, try applying the ideas presented to your own context, does it make sense? Why not? Can you think of a context where it does?
  • If you have a colleague attending the course, reflect on its content together afterwards
  • Experiment with the stuff you've learned to make it stick
My first hosted local meetup
I had attended all the EAST meetups since I first heard about the group in May but, apart from inviting people to a pub evening, I had not hosted a meetup. That changed in October as ~20 testers gathered in a conference room at Ericsson discussing testing in an agile context, CAST 2012 and various other topics. Apart from almost having to squeeze 20 people into a room designed for 8 and giving everyone the wrong address, it all went smoothly. Definitely something I would like to do again.

Some thoughts on facilitating a (local) test meetup

  • Facility: restaurant/pub, talk to your employer, check for a sponsor
  • Inform: colleagues, other contacts (use consultants), LinkedIn, Twitter, flyers
  • It's always nice to have at least one friend/colleague you know will show up
  • Use any existing group (if no test group is available use an agile, craftsmanship, developer or other similar group) to spread the news
  • Scout: attend other meetups (test related or not) and conferences
  • Content: some (emergency) discussion topics are nice
  • Content: watching an online presentation together is an alternative to a live presentation
  • Don't worry about what people will think. The initiative is more than enough to please most!
  • Make things interesting by experimenting!
The blog post
To prepare myself for RST I did, as I often do, experiment with a way to practice what I wanted to learn. The big difference this time was that I blogged about it (Practice: Note taking). I had not anticipated the response. People retweeted it, started following me on Twitter (well, some quit after a while, but anyway) and old posts suddenly had their number of reads doubled and tripled.

Some blog tips from a newbie (most based on other blogs I like rather than my own):
  • Share your experiments
  • Share your experiences
  • Share your mistakes
  • Share your problems and ask for help
  • Share your insights
  • Share your ideas, but putting them into practice first will probably be more appreciated
  • Be brief
  • Tell a compelling story
  • Don't force-feed people with your posts, especially your not so great ones
  • Ask yourself: How is this relevant to someone else?
  • Ask yourself: Why is this important to me?
  • Use pictures/visualizations to communicate more (I'm great at this when presenting, I suck at it in my blog)... it's also more fun.
  • Uptight blog posts are shot on sight, relaaaax, it's sexy to be vulnerable. 
  • A blog post is an efficient tool to help reflecting (this was an amazing experience for me)
  • Use headings and lists to make stuff more readable / easier to get an overview

Pekka Marjamäki
James Bach praised "Peksi" during an RST course in Finland, so I looked up his work, was impressed and started following him (blog and Twitter). Soon after, he found my Why I've signed up for RST instead of buying a kick ass sofa and new computer post and contacted me asking if I would like to try an exercise as a warm-up for RST.

The exercise was about him claiming something (in this case "Programmers can't test their own code") and my job was to convince him he was wrong using only questions. I didn't fare that well as far as the mission goes (I had a strategy, which he gave me credit for, but let's just say it wasn't efficient) but I learned tonnes! I am forever grateful for his help and I hope we can do something similar again soon!


Finally, I think Peksi's initiative says a bit about the context-driven testing community in general. If you want to develop as a tester, just reach out and people will help you! To me there seem to be more teachers than pupils, so help out by asking for help.

A few ways to get in touch with great testers

Summary
October was in a sense a preparation step for the events in November, but as you've hopefully realized by now: the journey is just as important as the end goal. See you in part V!

Starting level: Committed
Finishing level: Prepared

09 January 2013

The year I became a passionate tester, part III

This is the continuation of my 2012 reflections series, reading part I and part II first is recommended.

The great adventure you will not hear about
My greatest adventure this year was the journey from a heavyweight to a lightweight test process in my team at Ericsson. However, that story just doesn't play well with the month-to-month format, so I'll save it for some rainy day in a distant future. Here is the 10-second version, and I'll also provide a few lessons learned at the end of this post:
  1. I tried to introduce exploratory testing
  2. I screwed up badly
  3. I finally started reflecting on my work
  4. I started to turn things around
  5. I became we
  6. We kicked ass, testing was more fun than ever
  7. People (at least some) started to take note
  8. We kicked ass, testing was better than ever
  9. We started preaching about what we thought was amazing
  10. Job finished; the result was, we believe, a heck of a lot better than with our traditional approach (talking quality of testing, motivation and how much we learned).
July and August - Making promises
Mostly vacation. A lot happened, but only one thing was suited for this post.

After having my RST (Rapid Software Testing) course request put on hold over and over since May, with no good explanation as to why (that came later), I promised myself two things if the request was denied:
  1. Make myself available to job opportunities
  2. At least look into the possibility of paying for the course myself.
September - The final no
Early in September I told my fiancée I wanted to attend RST, with or without support from my employer. She simply answered "if that's what you really want, I trust your judgement" (I love my fiancée even more sometimes).

When I finally got a no from my job, I simply requested vacation and registered for the course, and it actually felt exactly that easy. I felt ecstatic for so many reasons:
  • I had signed up for RST, the course I wanted to attend so badly!
  • James Bach was the person inspiring me to start this journey, having him as instructor meant tons to me!
  • I had proven to myself that I was really committed to becoming a great tester!
The countdown had begun...

Insights so far
Not too much of interest happened in July, August and September, so instead let's look at 3 useful insights from the adventures at work.

Schedule time for reflection
Continuous reflection is key to keeping yourself on track, but I noticed that the more I drifted off course, the less I naturally set aside time to reflect. One change that rendered great results for me was scheduling time for reflection. In my case I dedicated 1 hour per week where I dropped everything and just focused on what I was doing, issues, what my current direction/goal was and making models...

Visualize what you're doing
One of the turning points, going from chaos to success, was when I created my first visual model of what I was doing. The model was simply a timeline (for the feature/deliveries) where I added all the activities I was doing (without details). From that model a new one emerged: a model describing how I wanted to work. That model could easily fit on a piece of paper and became our central tool when describing our work, discussing improvements and reflecting in general. The model also helped me better explain what I was doing. In retrospect, I think making a similar model based on how we used to work would have made it even more powerful.
Also, the moment we could visualize our status, coverage and plan (a spreadsheet with some fancy additions; can't talk about the content though), our credibility went through the roof.

Challenge your own ideas
One exercise I've found useful is simply defending things I believe in against a ferocious attacker (either played by myself or, even better, a colleague). I've found it useful not only for evaluating ideas but also for improving my understanding of them, practicing arguing and putting thoughts into words.
Example:
- We should stop forcing testers to document test scripts; it's way too expensive.
- But what happens to traceability?
- We never use scripts when we check back on previous work, a well written title is enough. 
- But we save money each time we reuse a test case!
- That's a great idea, but in reality we rarely reuse test cases since it's quicker to write new ones, and the product still changes too rapidly for tests to stay valid even a month later.
- But it's a great way for new testers to learn how to test!
...

Starting level: Inspired
Finishing level: Committed