# Confirmation bias in software engineering

Of all the limits on our cognition, confirmation bias is perhaps the ugliest. It is hard to get to grips with because it leads us to interpret even contradictory evidence as confirmatory. Does confirmation bias affect software professionals? Of course it does; we are all in its grip. Even the debate about estimates (or "no estimates", for those who prefer that) is shaped by our cognitive limitations.

[bibshow]Jorgensen and Papatheocharous take a shot at confirmation bias in software engineering and software development in a recent paper. [bibcite key="citeulike:13818735"] Their paper presents a literature review based on a Google Scholar search:

((“confirmation bias” OR “confirmatory bias”) AND (“software development” OR “software engineering”))

Confirmation bias has troublesome implications for us. For instance, one study showed that "agile believers" tended to interpret random data as favourable to agile. Another consequence of confirmation bias can be that customers and suppliers hold very different expectations of the delivery, and that these differences are not discovered until acceptance testing starts.

## Hourly or fixed price

In a small study, Jorgensen and Papatheocharous asked software professionals whether a certain project was better run on a fixed-price or a time-and-materials (hourly) basis. The result showed significant confirmation bias ($p=0.02$): professionals with a prior preference for fixed-price contracts selected fixed price, and vice versa.
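The kind of association the study reports, prior preference predicting the chosen contract model, can be checked with a standard chi-squared test of independence on a 2x2 table. The counts below are invented purely for illustration (the paper's raw data is not reproduced here); the test itself is a textbook Pearson chi-squared, sketched in plain Python:

```python
from math import erfc, sqrt

def chi2_2x2(table):
    """Pearson chi-squared test of independence for a 2x2 contingency table.

    Returns (statistic, p_value), where the p-value comes from the
    chi-squared distribution with 1 degree of freedom.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n  # expected count under independence
        stat += (obs - expected) ** 2 / expected
    # Survival function of chi-squared with 1 df: P(X > stat) = erfc(sqrt(stat / 2))
    p = erfc(sqrt(stat / 2))
    return stat, p

# Hypothetical counts: rows = prior preference (fixed price, hourly),
# columns = contract model chosen for the project (fixed price, hourly).
stat, p = chi2_2x2([[13, 5], [6, 12]])
print(f"chi2 = {stat:.2f}, p = {p:.3f}")  # chi2 = 5.46, p ≈ 0.019
```

With these made-up numbers the test lands near the paper's reported significance level: preference and choice line up far more often than independence would predict.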

This matches what I have seen myself many times. Over-belief in either contract model can lead to failed relationships between the parties and failed projects. And it is not just an individual bias; it is often institutionalized. Even if the individuals in an organization want to use a different contract model, it is very often impossible due to, for example, "procurement rules".

## Confirmation bias and estimates

*Image: a team estimating their next iteration using planning poker. What agile estimation units are they using?*

Another area they studied was estimation. It is quite obvious that there is a lot of confirmation bias in this field; just look at the "no estimates" debate. The debate is so rife with biases that one blog post about it even specifically points out that it is "unbiased".

Jorgensen and Papatheocharous do not look at no estimates but at analogy-based and regression-based estimation, examining which type of model comes out best in a meta-analysis. At first glance, analogy-based estimation is better. However, if "we exclude the comparisons where the researchers' own analogy-based estimation methods are compared with regression-based methods, the opposite conclusion is reached".

## What should we do?

True leadership cannot rely on faulty thinking. Given that we know that confirmatory bias is present, even in the scientific literature about estimates and other areas of software engineering and software development, what should we do about it?

The first step is awareness: you, I, and everybody else are affected by this bias.

The second step is to always consider at least three valid options before making a decision. If we consider only one, we walk straight into the trap; with two, we are still very close to it. Looking at three options makes it easier to avoid this intellectual trap. As consultants, we are often asked for our opinion on how various problems should be solved, and that is an area where we can make a difference by always presenting three or more valid options to the client.

What is step three? How do you avoid stepping into the confirmation bias trap?

[/bibshow]
