The human brain is plagued with cognitive biases – flaws in how we process information that cause our conclusions to deviate from the most accurate description of reality possible with available evidence. This should be obvious to anyone who interacts with other human beings, especially on hot topics such as politics or religion. You will see such biases at work if you peruse the comments to this blog, which is rather a tame corner of the social media world.
Of course most people assume that they are correct and everyone who disagrees with them is crazy, but that is just another bias.
The ruler of all cognitive biases, in my opinion, is confirmation bias. This is a tendency to notice, accept, and remember information which appears to support an existing belief and to ignore, distort, explain away, or forget information which seems to disconfirm an existing belief. This process works undetected in the background to create the powerful illusion that the facts support our beliefs.
If you are not aware of confirmation bias and do not take active steps to avoid it, it will have a dramatic effect on your perception of reality.
Ready for a new cognitive bias? Actually, I think this one is closely related to confirmation bias and just shows how complicated it is to think about how humans think. Our minds are noisy committees with multiple factors interacting in chaotic ways. Teasing apart specific mental phenomena can be tricky, and it is best to assume that no one psychological study can ever definitively do so. We need to look at any psychological question from multiple angles and triangulate to the consensus.
In a new study, psychologists Ben Tappin, Leslie Van Der Leer, and Ryan McKay tried to separate out confirmation bias from desirability bias. They defined confirmation bias as a bias toward a belief we already hold, while desirability bias is a bias toward a belief we want to be true. These are not always the same thing. I may want to believe that George Lucas is a talented and skilled director, but reluctantly have to accept evidence to the contrary.
In their study the authors surveyed 900 people prior to the 2016 US presidential election about which candidate they wanted to win the election, and which candidate they thought would win the election. The survey was conducted at a time when the polling data was ambiguous and did not show a clear winner. About half of the participants believed, based on the polls, that their preferred candidate would win.
That in itself is interesting. I would have thought the number would be higher. That was just one point in time during a particularly volatile election, however. I also wonder how a prediction differs from a belief – the question is not what people believe to be true, but what they predict is likely to happen. There are other factors involved, such as whether people are generally optimistic or pessimistic.
In any case, the researchers then exposed the subjects to new polling data and again asked them who they thought was most likely to win. The new polling data could confirm or oppose their pre-existing belief and, independently, their desired outcome. What the researchers found was that people were more likely to change their prediction if the new poll confirmed what they wanted to be true than if it disconfirmed their desire. Prior belief, however, did not predict how subjects would react to the new data.
The authors conclude that their data supports a desirability bias, but not a confirmation bias. While this is a reasonable conclusion, I don’t think we can generalize much from this one study (and the authors do not suggest that we can).
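As a toy illustration only (this is not the authors' actual model or analysis), here is a sketch in Python of what "a desirability bias but no confirmation bias" could look like in simulated data. Each simulated subject shifts their prediction toward a new poll, with an extra shift only when the poll favors the outcome they desire; comparing congruent and incongruent cells then recovers a desirability effect but no confirmation effect. All parameter names and values are invented for illustration.

```python
import random

random.seed(42)

# Hypothetical parameters, chosen to mimic the study's qualitative result:
BASE_SHIFT = 0.10     # baseline fractional movement toward the new poll
DESIRE_BONUS = 0.05   # extra movement when the poll matches the desired winner
BELIEF_BONUS = 0.00   # extra movement when the poll matches the prior belief

def updated_prediction(prior_p_a, desires_a, poll_favors_a):
    """Shift the subject's P(candidate A wins) toward the poll's outcome.

    The shift is larger when the poll is congruent with the subject's
    desire and/or prior belief, per the bonus parameters above.
    """
    believes_a = prior_p_a > 0.5
    shift = BASE_SHIFT
    if desires_a == poll_favors_a:
        shift += DESIRE_BONUS
    if believes_a == poll_favors_a:
        shift += BELIEF_BONUS
    target = 1.0 if poll_favors_a else 0.0
    return prior_p_a + shift * (target - prior_p_a)

def estimate_biases(n=10_000):
    """Estimate each bias by comparing congruent vs incongruent cells,
    using the fractional shift toward the poll as the outcome measure."""
    fracs = {True: {True: [], False: []}, False: {True: [], False: []}}
    for _ in range(n):
        prior = random.uniform(0.2, 0.8)        # "ambiguous polls" regime
        desires_a = random.random() < 0.5
        poll_favors_a = random.random() < 0.5
        new = updated_prediction(prior, desires_a, poll_favors_a)
        target = 1.0 if poll_favors_a else 0.0
        frac = abs(new - prior) / abs(target - prior)
        desire_cong = desires_a == poll_favors_a
        belief_cong = (prior > 0.5) == poll_favors_a
        fracs[desire_cong][belief_cong].append(frac)
    mean = lambda xs: sum(xs) / len(xs)
    # Average over the other factor when measuring each effect.
    desire_effect = (mean(fracs[True][True] + fracs[True][False])
                     - mean(fracs[False][True] + fracs[False][False]))
    belief_effect = (mean(fracs[True][True] + fracs[False][True])
                     - mean(fracs[True][False] + fracs[False][False]))
    return desire_effect, belief_effect

desire_effect, belief_effect = estimate_biases()
print(f"desirability effect: {desire_effect:.3f}")  # clearly positive
print(f"confirmation effect: {belief_effect:.3f}")  # near zero
```

The point of the sketch is only that the two biases are separable in principle: desire-congruence and belief-congruence can be crossed independently, which is what lets the study attribute the asymmetric updating to desire rather than prior belief.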
As I stated above, human thinking is complicated with many possible factors to consider. I don’t think we can make any general statements about how people treat “beliefs” – we need to at least identify different kinds of beliefs. We already know that people treat emotionally-held beliefs differently from emotionally neutral beliefs. We happily update the latter when we receive new information, but we cling tightly to the former and may even tighten our grip in the face of disconfirming information (a backfire effect). This phenomenon is referred to as motivated reasoning.
So, in this study, some of the subjects may have believed that their candidate would win as an emotional belief. Fewer people probably had an emotional attachment to the belief their candidate would lose. This would mean as new data came in people would cling to the notion their candidate would win, but not that they would lose.
Even this conclusion, however, is too simple. If you recall, Trump was claiming that the election was rigged. It is therefore possible that some Trump supporters believed Hillary would win because the election would be rigged. When new polling showed Trump might actually win, they could easily shift to the conclusion that Trump was popular enough to beat even a rigged election.
I also find that people can moderate or even change their opinion in the face of overwhelming evidence, if it is sufficient to overcome their motivated reasoning. For example, many times I have engaged with people who deny global warming. When confronted with solid evidence that the Earth is, in fact, warming, some deniers will continue to deny that the warming is real, but others will retreat to the position that even though the Earth may be warming, we don’t know if humans are causing it. Or, even if we know humans are causing it, we don’t know that the consequences will be bad. Or, even if the consequences will be bad, there is nothing we can do about it.
These softer positions, however, are held reluctantly. They may technically constitute a prior belief, but I doubt that confirmation bias would support them. Rather, the denier jumps on any evidence that seems to support what they want to believe, and will quickly revert to “global warming is not even happening” when given the chance.
The same is true for evolution deniers. They may, at some point, reluctantly acknowledge there is evidence for common descent, but always have one eye out for evidence they can use to cast doubt on even that.
In short, I think that desirability bias and confirmation bias are two sides of the same coin, and not easily disentangled. In reality there will often be a complex web of competing beliefs and desires. Further, not all beliefs are the same, as there is a spectrum of emotional and identity implications for specific beliefs. Further still, predictions about what will happen also introduce new biases, such as a potential optimism or pessimism bias.
What the current study mostly accomplishes is reminding us of this complexity.
Originally published on NeuroLogica Blog: “Confirmation Bias vs Desirability Bias.”