Why Facts Don’t Win Arguments

Have you ever noticed how easy it is to dismiss a positive remark, while a negative one can bounce around in your head for days? What makes opposing, adverse, or unfavorable information so sticky? And why do conversations that don’t align with our mental models of the world consume so much mental energy?

Information that matches what you already believe slips by almost unnoticed, but information that conflicts with your existing beliefs grabs your attention – and often triggers irrational or illogical thinking.

Think about the last time you engaged in an online discussion with someone who was misinformed or just plain ignorant about a topic you have researched, studied, and hold strong convictions about. Take climate change, for example. Many of the most vocal climate change deniers will freely admit they aren’t “experts” before launching into a litany of reasons why the science is wrong.

A classic example of misinformed ignorance posing as expertise is Senator James Inhofe (R-OK), and not just because his “proof” of the global warming hoax was the snowball he brought for Senate show-and-tell. Inhofe has repeatedly maintained that “man-made global warming is the greatest hoax ever perpetrated on the American people.” It’s worth noting that, according to Oil Change International, Inhofe has received over $2 million in donations from the fossil fuel industry. His ignorance is astounding, as is his inability to hear any data, research, or information that conflicts with his position.

Don’t get me wrong. Scientific skepticism is healthy – necessary, even. It forces scientists to examine claims (their own and those of others) and systematically question all information in search of flaws and fallacies. But deniers like Inhofe vigorously criticize any evidence that substantiates climate change and embrace any argument that refutes it. Presenting actual facts and data that challenge their thinking just makes them dig in their heels and become even more sure of their position.

Skepticism is healthy both for science and society. Denial is irresponsible and dangerous.

Three Related Blind Spots to Objectivity

Confirmation Bias

Confirmation bias is the unconscious force that nudges us to seek out information that aligns with our existing belief system. Once we have formed a view, we embrace information that confirms it while ignoring information to the contrary. We pick out the bits of data that confirm our prejudices because doing so makes us feel good to be “right.”

When we want to be right badly enough, we become prisoners of inaccurate assumptions.

Extending the climate change example: a 2018 study was conducted with more than 1,000 South Florida residents whose homes were at risk from the direct or indirect effects of flooding, including a decline in property values as coastal property comes to be perceived as less desirable. Half of the participants received a map of their own city illustrating what could happen just 15 years from now, at the present rate of sea-level rise, if a Category 3 hurricane struck with storm-surge flooding. Those who viewed the maps were less likely to say they believed climate change was taking place than those who had not seen them, and they were also less likely to believe that climate change was responsible for the increased intensity of storms.

The Backfire Effect

A second cousin to confirmation bias is the backfire effect. Not only do we seek out information consistent with our beliefs, but we instinctively and unconsciously protect those beliefs when confronted with information that conflicts with them. This defense mechanism kicks in when someone presents information – even statistically sound research or data – that disputes our position. Those facts and figures backfire and only strengthen our misconceptions. In addition, the cognitive dissonance produced by conflicting evidence builds new neural connections that further entrench our original convictions.

A 2006 study examined why sound evidence fails to correct misperceptions. Subjects read mock news articles about polarizing political issues that contained either a misleading claim from a politician or the same claim followed by a correction. People on opposing sides of the political spectrum read the same articles and the same corrections, and when the new evidence threatened their existing beliefs, they doubled down. The corrections backfired: the evidence made them more certain that their original beliefs were correct.

The Dunning-Kruger Effect

The Dunning-Kruger Effect is based on the notion that we all have pockets of incompetence, and that confidence is often inversely correlated with knowledge or skill. People who are ignorant or unskilled in a particular subject area tend to believe they are much more competent than they are. Bad drivers believe they’re good drivers, cheapskates think they are generous, and people with no leadership skills think they can rule the world. How hard can it be?

Those who have the slightest bit of experience think they know it all. Then, as people gain experience, they begin to realize how little they actually know. This is the point at which they seek out the knowledge they need to build their expertise. Those at the level of genius recognize their talent and demonstrate confidence commensurate with their ability.

There is also a corollary to the effect. Just as highly incompetent people overestimate their abilities, highly competent people tend to underestimate theirs. Dunning and Kruger found that most people, regardless of how they perform on any given task, rate themselves at 7-8 out of 10.

So, the next time you are convinced of your “rightness,” it might be worth taking a minute to examine your biases.

Mark Twain may have said it best:

It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.

Melissa Hughes, Ph.D.
https://www.melissahughes.rocks/
Dr. Melissa Hughes is a neuroscience geek, keynote speaker, and author. Her latest book, Happier Hour with Einstein: Another Round, explores fascinating research about how the brain works and how to make it work better for greater happiness, well-being, and success. Having worked with learners from the classroom to the boardroom, she incorporates brain-based research, humor, and practical strategies to illuminate the powerful forces that influence how we think, learn, communicate, and collaborate. Through the practical application of neuroscience to everyday life, Melissa shares productive ways to harness the skills, innovation, and creativity within each of us and to contribute the intellectual capital that empowers organizations to succeed with social, financial, and cultural health.

17 CONVERSATIONS

  1. Thanks for putting this out there!

    I’ve read these details in other places.

    And I have been observing as I see all of this in play — in myself and in others!!

    An interesting phenomenon for sure.

    Best response seems to be to smile and acknowledge 🙂

    Ahhh… and then be ready to repeat.

    blessings,
    Cynthia

  2. Melissa, this article is fascinating. Thanks so much for sharing your insight and knowledge with us. I wasn’t familiar with the Dunning-Kruger effect. Or, should I say, I am familiar with it but never knew it had an official name. Thanks for such good food for thought and for helping me to learn something new.

    • Thank you, Kimberly! Many learning experiences are painful… few as painful as this is likely to be. I appreciate your support!

    • Thanks for taking the time to read and share your reflection, Mary. For me, understanding all of those mental shortcuts is empowering and the best way to spot the flawed thinking when I’m guilty of it.

  3. I love the topic of cognitive biases. I’ve never heard of the backfire effect, but it makes complete sense. The brain is such an interesting organ. But as with most things, knowing about these biases is half the battle. If you know they exist, you can check yourself and laugh, knowing your control is minimal.

    Cognitive biases thrive on social media. When you are sitting alone, you are correct, regardless of what the other person is saying. In person you may see their humanity, but online it’s a persona.

    Here are two lighter pieces I’ve written on the topic.

    https://obriencg.com/believe/
    https://obriencg.com/correctness-in-the-gsot/

    • Thanks for sharing your thoughts on this one, JoAnna. I enjoyed reading your pieces, too. The Santa example is a great analogy! It’s so interesting how complex and amazing the brain is, and how faulty it can be. This piece, in particular, has demonstrated how we jump to inaccurate assumptions when our belief systems are threatened.

  4. What’s a “meta” for? “Biases” of any type are what observers encounter when any/all members of an experiential set, including intensified experimental effects, endorse their own recidivism. Cf., Polanyi on “the personal heresy.”

  5. What a profound and impactful post, Melissa. Thank you for the ideas you’ve presented and the way you have done so.

    What might support people who are interested in this is consistently, and over time, asking ourselves: what don’t I know that I don’t know? For years I have steeped myself (and continue to do so!) in that question to access many limiting beliefs that entangled themselves inside the world of my head (heart and body). In the rigorous work to untangle myself from “limiting beliefs” (which I continue to do to this very moment), I began to pay attention to my body, to my gut instincts, to so much more “data,” if you will, than “words I had been told.” I began to watch myself, listen closely to the words that actually came out of my mouth, listen closely to the words that other people spoke, and watch behavior, facial expressions, the whole nine yards, and I would do my own research (you learn to go to original sources rather than what someone said about what someone said). Grounding in a quieter place of a witness self, I could begin to better rely on the truth of my lived experiences along with the soundness of another person’s or several people’s research (you learn to train the mind to be discerning) rather than only believing “words someone spoke,” “what my parents or the neighbors said,” or even “facts” presented in a book. Looking with curiosity from many angles and then really trusting my body wisdom, body sensations, and lived experiences, I have gotten better at discernment and trusting truths when I hear them or read about them. Engaging the prefrontal cortex helps immensely!! Living beyond the amygdala!! Yey!!

    A fascinating topic you’ve brought to our awareness! Thank you so much, Melissa!!

    • It’s so true, Laura, that we all fall prey to “limiting beliefs.” It’s just how the brain works. What I’ve found fascinating about this particular piece is the negative pushback I’m getting about the science. It’s a prime example of how we instinctively dismiss information that challenges our mental models. I’ve never had so many negative remarks about the science of biases until this one, which happens to use a polarizing topic as an example. Fascinating to me!
