Manage your biases as a tester – Part 4/4

This is the fourth and last article of the series on cognitive biases. If you haven’t done so yet, please start with the first article: Manage your biases as a tester Part 1. In this last one, we’ll look at some biases from the category “What should we remember?” of Buster Benson’s categorization, as described in his article.

What should we remember?

Negativity bias

Something very positive will generally have less of an impact on a person’s behaviour and cognition than something equally emotional but negative.

Be careful not to be too negative about someone else’s work, because negative words will have a greater impact on people than equally positive ones. Because of the “IKEA effect”, everyone is very sensitive about their own work and wants it to be appreciated, not criticized. We software testers have a responsibility to be good communicators in all circumstances: for good news, but also for bad news. A little empathy will always be rewarded.

Fading affect bias

More commonly known as FAB, this is a psychological phenomenon in which information associated with negative emotions tends to be forgotten more quickly than information associated with pleasant emotions.

This one tells us that if you put too much negative emotion into the information you give to your stakeholders, then, because of the fading affect bias, that information will tend to be forgotten more quickly, which is absolutely not what you want. Do you remember that critical issue you shared with the Product Owners, the one that is still not fixed and not even at the top of the backlog? Do you remember how you presented it? Did you give the information in a neutral, unemotional and factual way? If not, this FAB may be what keeps your issue low in the backlog.


Recency bias

Give more credence to the most recent observations.

If, for example, you filed an issue 6 months ago and now test it again and observe a different behaviour, which observation should you trust more? The most recent seems to be the right one, as the product has evolved and this latest test is probably the best. But what if you misinterpreted the description that was written 6 months ago? What if you didn’t test exactly the same scenario, and that’s why you don’t observe the same behaviour?

You should always pay attention to contradictory information and check whether there is a real reason for the behaviour to have changed. Ask the developers, look in the git log, search in your issue tracker, etc., but please don’t just jump to the conclusion that it works now because you couldn’t reproduce it anymore. Sometimes the reality is far from being that simple.
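As a quick sketch of that kind of investigation, here is a minimal, hypothetical shell session (the issue ID "BUG-1234" and the commit message are illustrative; in practice you would run the `git log` queries in your real repository, not a throwaway one):

```shell
#!/bin/sh
# Sketch: before trusting the most recent observation, check whether anything
# actually changed. A throwaway repo stands in for your real project here.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .

# Simulate a fix landing some time after the issue was filed (illustrative).
git -c user.email=tester@example.com -c user.name=tester \
    commit -q --allow-empty -m "BUG-1234: fix login timeout"

# Did any commit reference the old issue ID?
git log --grep="BUG-1234" --oneline

# Or: what changed in the relevant area since the issue was filed?
# git log --since="6 months ago" --oneline -- src/login/
```

If `git log --grep` returns a commit, the behaviour change has an explanation; if it returns nothing, that is a hint you may be comparing two different scenarios rather than two versions of the product.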

 

Primacy bias

Give more credence to the first observations I make.

In contrast to the recency bias, don’t give more credence to your first observations either. In software testing, as we are used to testing for non-regression and can imagine the progression of a piece of software, my feeling is that this bias is presumably less likely to occur.


Google effect (digital amnesia)

The tendency to forget information that can be found readily online using Internet search engines such as Google. According to the first study about the Google effect, people are less likely to remember details they believe will be accessible online.

Last but not least, the Google effect. We tend not to remember the content of our searches, but rather the way we found it. Is it really a problem for today’s software testers? Is it a problem when you know how fast things are moving, meaning that the search you did 2 months ago now returns better results? If you’re afraid of losing things, some tools exist to back up your brain: see Evernote, Pocket, Google Keep, or Framabag/Wallabag if you care about your data.


Conclusion

The list of biases and fallacies is not exhaustive, and neither are the examples I gave in these four articles. I’d really appreciate getting some feedback and other ideas about biases a software tester has to deal with. Thanks for reading.


References
Buster Benson: “Cognitive bias cheat sheet – Because thinking is hard”
Michael Bolton: “Critical thinking for testers”
Maaike Brinkhof: “Mapping biases to testing”
Wikipedia: “List of cognitive biases”
Daniel Kahneman: “Thinking, Fast and Slow”

