KEEP YOUR FACTS TO YOURSELF: WHY SOME ARGUMENTS CANNOT BE WON

The MMR vaccine causes autism. Barack Obama was not born on American soil. Man-made climate change is a fallacy, and while we're at it, the moon landing wasn't real either.

There's a psychological phenomenon that helps explain why people hold onto such beliefs in the face of indisputable evidence. When we have a strong belief, we have an instinctive and unconscious need to protect it. Unsurprisingly, we welcome information that reinforces our beliefs. We love this kind of information so much that we will, often unconsciously, seek out new information that supports our current position. This is called confirmation bias.

It's never been easier to confirm your bias in the age of the Internet. One of the most egregious examples I've encountered was a website (that I'd rather not link to) with a custom search engine that procures scientific studies supporting whatever conspiratorial result you seek. However, confirmation bias can be more subtle. Facebook employs algorithms that modify your newsfeed so it becomes biased towards the friends' opinions and articles you already agree with. Your Google searches are tailored to your browsing history.

You can't always control the information that comes your way. So what happens when we are presented with information that directly contradicts our belief systems? 

We all like to believe we are thoughtful, rational people, capable of critical thought and of learning new things. However, psychology has found that often the opposite is true: in the face of contradictory information, we may defend our positions with even more vigor.

To those of us who can't resist a futile Facebook argument, this is not a surprise. No amount of evidence to the contrary will convince anti-vaxxers that vaccines do not cause autism. Man-made climate change deniers persist in the face of a 99% consensus among climate scientists, and no amount of time listening to Bill Nye will convince a creationist that the Earth is more than 6,000 years old.

In 2010, psychologists Nyhan and Reifler conducted a study to understand how people process factual information contrary to their political beliefs. Subjects with a range of political ideologies were given one of two articles to read. One, a fake news article, suggested there were weapons of mass destruction (WMD) in Iraq prior to the US invasion. The other, a 'corrected' news article, outlined the absence of WMD stockpiles or an active WMD program, as concluded by the Duelfer Report.

After reading the article, subjects were then asked how much they agreed that Iraq possessed, or had an active program to develop, WMD prior to the US invasion. Comparisons were made between how subjects responded to the fake article and the corrected article, based on their political ideology.  

Generally, liberals did not believe there were WMD in Iraq, while conservatives tended to support Bush’s claim that there were. 

Very liberal subjects who read the corrected article disagreed that there were WMD in Iraq even more strongly than very liberal subjects who read the fake news article. This isn't surprising, as the corrected information supported liberals' political ideologies. Those with a more centrist ideology felt similarly about the statement, regardless of which article they read.

However, conservatives who read the corrected article believed more strongly that WMD were in Iraq than conservatives who read the fake article. So in the face of contrary evidence, these subjects held onto their convictions with greater intensity. Presenting new, corrected information had the opposite of its intended effect: it backfired. Hence, Nyhan and Reifler dubbed this phenomenon the backfire effect.

This effect extends to other belief systems as well. In 2015, Nyhan and Reifler sought to understand how misconceptions about the flu vaccine might be corrected.

Subjects were divided into three groups: one group was given information highlighting the dangers of influenza, another was given 'corrective information' (information that corrected common misconceptions about the flu vaccine), and a third control group received no additional information to read. Participants were also classified as having either low or high levels of concern over the side effects of the flu vaccine. Finally, subjects were asked whether they intended to get vaccinated against the flu.

Overall, corrective information did successfully reduce false beliefs about the flu vaccine. However, it did not increase the likelihood that subjects would actually get vaccinated. In fact, among subjects with high levels of concern over the vaccine's side effects, intent to vaccinate actually dropped after they were given corrective information. Again, the effect backfired.

Why do people subconsciously resist evidence that contradicts their beliefs? How might such behaviours be beneficial? When I asked a few of my behavioural biologist and psychologist friends (of whom I have a disproportionate number) for their opinions about this, they all indicated that such behaviours would have evolved to increase fitness in some way, shape, or form.

According to Dr. Danny Krupp, a research associate with the SALT lab at One Earth Future, belief systems exist as a way to further one's fitness. That is, people value things that they believe will benefit them. By defending your position, even in the face of contradictory evidence, you are defending the benefits you perceive that position to confer on you.

Dr. David Logue, an assistant professor at the University of Lethbridge, highlighted that in the animal kingdom, increases in social standing often translate to increases in fitness. Assuming this is true for humans, losing an argument can diminish one's social standing. Consequently, stubbornness may be a viable strategy.

In the same vein, Dr. Sandeep Mishra, an assistant professor at the University of Regina, suggested that if the beliefs held by an individual represent those held by that individual's social group, publicly maintaining the group's position, especially in the face of contradictory evidence, may be a way to demonstrate allegiance to one's social group.

We may never know why the backfire effect exists. But understanding its implications can offer insight into how we can better inform an electorate, or promote measures of public health, for example. The uncomfortable truth is that we are all susceptible to obtuse thinking. It raises the question: what corrective information has backfired on you?
