User psychology: how confirmation bias impacts behaviour

Using behavioural science to understand cognitive biases can help us design user experiences that are more satisfying and increase the number of people who complete our conversion goals.

User psychology

The idea that our behaviour is governed by predictable biases is widespread and often used to explain why we behave the way we do. A bias is often defined as a cognitive shortcut, or rule of thumb, which helps us make decisions quickly without requiring our brains to perform complex calculations.

It is often implied, if not explicitly stated, that biases are a symptom of our brain’s laziness. Our brain, according to these views, simply can’t be bothered to do a better job.

The main symptom of this laziness is the brain’s willingness to let our feelings get in the way of our rational thoughts.

Indeed, ever since the publication of Daniel Kahneman’s hugely successful book “Thinking, Fast and Slow”, it has become widely accepted that some of our most profound behavioural shortcomings are a symptom of fast, unconscious thinking (akin to what we mean by “gut feeling”) left unchecked by the slow, deliberate thinking that could have “corrected” our initial reactions to a situation.

However, the latest research in behavioural science offers a much more nuanced understanding of our biases, as I will illustrate in this blog. But can we predict user behaviour simply because we understand common cognitive biases, or do we need additional techniques, such as A/B testing and user testing, to understand the unconscious needs and behaviour of our website visitors?

Confirmation bias in user psychology

Take the behavioural phenomenon known as “confirmation bias”, which has been described as “the human tendency to seek, interpret and remember information that confirms pre-existing beliefs”. Confirmation bias has been used to explain almost anything and everything, from why we feel bad about ourselves[1] to why love is blind.

Confirmation bias has also been blamed for the observation that in the Brexit debate people not only interpret the same information differently depending on which side of the divide they are on[2], but also get their information almost exclusively from sources that tend to confirm their views, in so-called echo chambers[3].

The Wason Selection Task

Possibly the most famous experiment used as evidence for confirmation bias is the “Wason Selection Task”. In this task, participants are asked to select information to test the truth or falsity of a proposition. Imagine that the picture below shows four cards, each of which has a digit on one side and a colour on the other. Which cards (note the plural) would you need to turn over to test the proposition that “if a card has an odd digit on one side, it is coloured red on the other side”? Give it a try before proceeding to the answer below.

[Image: four cards, one showing “3”, one showing an even digit, one red card and one brown card]

If you chose to turn over the “3” card and the red card you are in good company, as roughly 80% of people make the same choice, but you would also be wrong. The correct answer is that you should turn over the “3” card (so far, so good) and the brown card.

The fact that most people choose to turn over the red card rather than the brown card has been interpreted as evidence of confirmation bias: finding an odd digit on the back of the red card would constitute confirmatory, but not conclusive, evidence, whereas finding an odd digit on the back of the brown card would constitute contradictory evidence that is also conclusive.

It seems, therefore, that in this task people overwhelmingly seek confirmatory rather than contradictory evidence, even at the cost of conclusiveness.
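The logic behind the correct answer can be sketched in a few lines of Python. (The “8” on the even card is an illustrative assumption; the article only names the “3”, red and brown cards.)

```python
# Rule to test: "if a card has an odd digit on one side, it is red on the other".
# Each card shows one face; the other face is hidden.

def can_falsify(visible):
    """A flip is worth making only if it could reveal a counterexample
    to the rule, i.e. an odd digit paired with a non-red colour."""
    if isinstance(visible, int):
        # An odd digit could hide a non-red back; an even digit is irrelevant.
        return visible % 2 == 1
    # A non-red colour could hide an odd digit; a red back proves nothing
    # either way, because the rule says nothing about red cards.
    return visible != "red"

cards = [3, 8, "red", "brown"]
print([c for c in cards if can_falsify(c)])  # [3, 'brown']
```

Note that the red card never appears in the answer: whatever is on its back, the rule survives, which is exactly why flipping it is only ever confirmatory.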

Feelings and irrational behaviour

Explanations for confirmation bias often blame our emotions. The reasoning behind these explanations assumes that we don’t like the feelings associated with having to contemplate the possibility that our beliefs may be wrong. To ward off such feelings, we may think of reasons why the contradictory evidence could be irrelevant or wrong, or we may avoid being confronted with contradictory evidence altogether.[4]

Unfortunately, such an explanation explains very little because the obvious question that arises immediately is: “Why do we feel that way about contradictory evidence?”

To complicate matters, it turns out that our willingness to embrace contradictory evidence is highly context dependent.

Have a look at a slightly different version of the Wason Selection Task, represented in the picture below. Imagine that the four cards in the picture represent customers in a pub. Each card has the customer’s age on one side and what they are drinking on the other. Now think about which cards you would turn over to check that “all customers under the age of 18 are drinking a non-alcoholic drink”.

[Image: four cards, one showing “16”, one showing an adult’s age, one showing “beer” and one showing a non-alcoholic drink]

I bet that even if you hadn’t seen the previous version of the Wason Selection Task you would have turned over the “16” card and the “beer” card, which is the correct answer.
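The pub version has exactly the same logical structure as the card version, which a minimal sketch makes explicit. (The “25” and “lemonade” values are illustrative assumptions; the article only names the “16” and “beer” cards.)

```python
# Rule to test: "all customers under 18 are drinking a non-alcoholic drink".
# Structurally identical to "if odd digit, then red": under-18 plays the
# role of the odd digit, and a soft drink plays the role of red.

UNDER_18 = 18

def worth_checking(visible):
    """Only a customer who could turn out to be an under-18 drinking
    alcohol can falsify the rule."""
    if isinstance(visible, int):
        return visible < UNDER_18  # a minor might be drinking beer
    return visible == "beer"       # a beer drinker might be a minor

customers = [16, 25, "beer", "lemonade"]
print([c for c in customers if worth_checking(c)])  # [16, 'beer']
```

The same falsification logic that most people miss with abstract cards becomes obvious in this familiar setting.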

It appears that in real life confirmation bias is not as universally common as it is often made out to be.

So, do we know in what kind of circumstances confirmation bias is going to colour our responses to new information?

Fortunately, we do, but the answer requires a reappraisal of the word “bias”: from something that is always bad to something that can also be good.

Practical versus ‘rational’ thinking

Suppose we create yet another version of the Wason Selection Task. You are asked to test the hypothesis that “if you eat food that is past its use-by date, you get ill”. This time, however, there are no cards. You have to gather information from your environment in order to decide whether this proposition holds true or not. Take a moment to think about what information you would be looking for.[5]

Before we get to the answer, let’s think about how this new version relates to the first version of the Selection Task discussed above. Eating food that is past its use-by date is the equivalent of an odd digit, while being ill is equivalent to the colour red. Conversely, not eating food past its use-by date is equivalent to an even digit and not being ill is equivalent to the colour brown.

Thus, turning over the “3” card is equivalent to finding people who have eaten food that was past its use-by date. Clearly, if we find any such person and they are ill we have found our confirmatory evidence. But we also need to rule out any contradictory evidence, which means finding people who are not ill (i.e., the brown card) to see if they have eaten food that was past its use-by date. This second task, however, appears to be insurmountable.

Most people you encounter will not be ill, so what are the chances that any of them will have eaten food that was past its use-by date? Wouldn’t it make much more sense to look for people who are ill and check whether they have eaten food that was past its use-by date? This, of course, equates to turning over the red card in the original version.
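This practicality argument can be made concrete with a toy simulation, assuming (purely for illustration) that eating out-of-date food is rare and raises the probability of illness. Checking ill people then turns up far more cases relevant to the hypothesis per person checked than checking people who are well:

```python
import random

random.seed(0)

# Toy population. All probabilities are illustrative assumptions:
# 2% of people eat out-of-date food; doing so raises the chance of
# illness from 1% to 30%.
people = []
for _ in range(100_000):
    ate_bad_food = random.random() < 0.02
    ill = random.random() < (0.30 if ate_bad_food else 0.01)
    people.append((ate_bad_food, ill))

ill_people = [p for p in people if p[1]]
well_people = [p for p in people if not p[1]]

# How often does each kind of check turn up a relevant case?
frac_ill = sum(ate for ate, _ in ill_people) / len(ill_people)
frac_well = sum(ate for ate, _ in well_people) / len(well_people)
print(f"ate bad food among the ill:  {frac_ill:.2f}")
print(f"ate bad food among the well: {frac_well:.2f}")
```

Under these assumptions, a large share of ill people turn out to have eaten out-of-date food, while almost none of the well people have, so the “confirmatory” sampling strategy is by far the more efficient use of our time.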

So, whenever it is more practical to look for confirmatory evidence than for contradictory evidence, confirmation bias can save us a great deal of effort. Doing the ‘rational’ thing, in other words, is often not the most practical thing to do.[6]

Cognitive biases can serve us

In optimisation, understanding biases helps us formulate hypotheses for why website visitors behave the way they do. These hypotheses can be tested using techniques such as A/B testing and understood using research techniques such as user testing. However, as optimisers we are equally prone to biased decision making, which can affect the design of the variations used in our tests.

When visitors arrive at a website, we want them to stay and complete our conversion goals. But what do our visitors believe when they arrive? Do they believe it is worthwhile to stay and attempt to find what they are looking for or do they believe that would be a waste of time? In the former case their default position will be to stay unless they find evidence that their initial belief was wrong. In the latter, they will leave. Therefore, any confirmation bias will work in our favour in the former case, but against us in the latter.

Time is precious, which means people make up their minds quickly about whether they should stay or not. We have seen that confirmation bias is all about practicality over conclusiveness.

For visitors who are not sure about the value of staying we need to make it immediately obvious that looking for contradictory evidence is both practical and potentially rewarding.

At this point the skills of the behavioural scientist have to be combined with those of the interaction and UX designer. Together we can devise tests that are practical, grounded in sound data-driven hypotheses, and informed by theories from behavioural science.

If you’d like to learn more about how we use behavioural science to improve our services, please get in touch on 0161 713 2434 or email [email protected].


[1] Because we pay attention only to information that confirms our insecurities.
[4] Brain-imaging studies show a pattern of responses to contradictory evidence involving the emotional areas of the brain, suggesting that such feelings truly exist (e.g., Jonas Kaplan, Sarah Gimbel & Sam Harris, “Neural correlates of maintaining one’s political beliefs in the face of counterevidence”, Scientific Reports, 2016).
[5] This thought experiment has been modelled on a study by Mike Oaksford and Nick Chater – A Rational Analysis of the Selection Task as Optimal Data Selection – published in the Psychological Review in 1994.
[6] This is very similar to Politiek and Berndsen’s idea of hypothesis testing as risk behaviour (Journal of Behavioral Decision Making, vol. 13, pp. 107–123).

Nina Mack

10th September 2019
