Have you ever fundamentally disagreed with someone on a personal issue that both of you were very passionate about? Climate change, reproductive rights, election integrity, gun reform… pick a topic. How did that go?
We’ve all been there… on both sides of the debate. On the one hand, our values and convictions are what make us who we are. On the other hand, acknowledging that a deeply held belief may be wrong, integrating new information, and evolving toward a more enlightened view of the world is neither instinctual nor easy. But it is the cornerstone of personal growth and evolved thinking.
Generally, the most vocal defenders of any position are the least willing to consider information that gives credence or value to a different one. Presenting actual facts and data intended to offer a different point of view or challenge their thinking just makes them dig in their heels and grow even more sure of their “rightness.”
We’d like to believe that we are logical beings and when our beliefs are challenged with facts, we incorporate the new information into our thinking. The truth is that when our convictions are challenged by contradictory evidence, our beliefs get stronger. Once something is added to our cognitive catalog of beliefs, it’s intricately woven into our sense of self.
When we get information that is consistent with our beliefs, our natural tendency is to lean in and add it as reinforcement. (See, I knew it!) Over time, we become less and less critical of any information that “proves” we’re correct.
But when the information is dissonant or contrary to our beliefs, we look for any reason to dismiss it. We find it biased, flawed, or not even worthy of consideration. (What a moron!)
When people are forced to look at evidence that conflicts with their mental models, the automatic instinct is to criticize, distort, or dismiss it.
In that process, they recall information stored away in their memory banks, experience negative emotions from the “threat” to their identity, and form stronger neural connections. And just like that, their convictions are stronger than before.
Access to information is “on demand,” tailored to your preferences, political identity, purchases, music, and memes – a nonstop stream of information confirming what you already believe to be true, without ever leaving the safety of your filtered bubble. Online algorithms, cookies, and tracked advertising have created the perfect conditions for the subconscious psychological beast responsible for this phenomenon.
Here are three subconscious biases that are likely undermining those tough conversations.
The Backfire Effect
Decades of research show how fervently we protect our beliefs when confronted with information that conflicts with them. It’s an instinctive defense mechanism that kicks in when someone presents information – even statistically sound research or data – that disputes our position. Those facts and figures backfire and only strengthen our conviction.
A good example of the backfire effect was discovered in a 2010 study. Researchers asked people to read a fake newspaper article containing a real quotation of George W. Bush, in which the former president asserted, “The tax relief stimulated economic vitality and growth and it has helped increase revenues to the Treasury.”
In some versions of the article, this false claim was then debunked by economic evidence: a correction appended to the end of the article stated that, in fact, the Bush tax cuts “were followed by an unprecedented three-year decline in nominal tax revenues, from $2 trillion in 2000 to $1.8 trillion in 2003.” People on opposite sides of the political spectrum read the same articles and the same corrections, and when the new evidence threatened their existing beliefs, they doubled down. The corrections backfired: the evidence made readers more certain that their original beliefs were correct. In fact, conservatives who read the correction were twice as likely to believe Bush’s claim was true as conservatives who did not read the correction. Why? Researchers theorize that conservatives had a stronger group identity than the other groups and were subconsciously defending their group by believing the claim.
Here is a more recent example of the backfire effect, perfectly illustrated at the 2022 NRA Convention:
I had a good conversation with this guy about hammers. https://t.co/zCjj4mgsue
— Jason Selvig (@jasonselvig) May 31, 2022
Confirmation Bias

A second cousin to the backfire effect is confirmation bias – the unconscious force that nudges us to seek out information that aligns with our existing belief system. Once we have formed a view, we embrace information that confirms it and dismiss information to the contrary. We pick out the bits of data that confirm our prejudices because it makes us feel good to be “right.” And when we want to be right badly enough, we become prisoners of our inaccurate assumptions.