Facts. We know them, we love them, we employ them in Internet arguments and heated discussions with family members during dinner. Indeed, using facts—as long as they are, you know, factual—is one of the most beloved ways we have of demonstrating our mastery of a subject.
Need to convince someone your political point is correct? Reference the president of your choosing to show history is on your side.
Want to demonstrate the efficacy of a particular medical treatment? Launch into a passionate recitation of your favorite statistics.
Hoping to persuade a friend to see things your way? Provide them with examples of how your approach has worked for others.
Wielded properly, facts can be powerful things. The challenge, however, is that the range of “properly” can be fairly narrow, especially when emotions get involved.
And for humans? Emotions are almost always involved.
When Facts Are Right but Not Really Effective
In 2010, researchers Brendan Nyhan and Jason Reifler used the phrase “backfire effect” to describe a persistent phenomenon they observed while studying political beliefs. The two were interested in the effects that factual corrections could have on misperceptions and devised several experiments to see what would happen when people were “fact-checked.”
The findings? Factual corrections didn’t do much to persuade the study’s participants to change their beliefs, even if those beliefs were based on misperceptions and misunderstandings. And in some cases, the researchers discovered that the factual corrections actually increased a participant’s commitment to his or her misperceptions, thus resulting in what they termed “the backfire effect.”
Now, there are a few caveats to any lessons we might draw from this research. One, the study’s groups were composed of undergraduates, and age tends to affect how we view everything, including our beliefs. Two, Nyhan and Reifler were specifically studying political opinions and ideology, two things people may view as integral to their identities. (And thus, get pretty emotional about.)
Does that mean facts will be effective as long as you’re not trying to convince Aunt Sally to vote for your favorite mayoral candidate? Unfortunately not.
Since the 1960s, psychologists have routinely observed that people tend to seek out information that confirms their existing beliefs while rejecting or ignoring anything that could undermine their opinions or conclusions. Confirmation bias, as the phenomenon is usually known, often keeps us focused on one possibility to the exclusion of all others, regardless of how many facts might disprove our favored hypothesis.
Our predisposition to adhere to what we already believe is complicated by another factor: the tendency to employ facts without sufficient or appropriate context.
When the Facts Are Right but the Story Is Wrong
About halfway through my doctoral program, I was wandering the halls in one building or another on LSU’s campus when I spotted a big, glossy poster featuring Abraham Lincoln. I think some savvy graphic designer had photoshopped Ray-Ban sunglasses on him, and there was also a blurb of text floating over his head about how timing is everything. To support that assertion, the poster’s creators quoted the following fact at the bottom of the image: just hours before John Wilkes Booth shot him, Lincoln signed the Secret Service into existence.
Which is true! The president did indeed sign a piece of legislation that created the Secret Service Division of the Treasury Department on the very day he made his ultimately fatal trip to Ford’s Theater. Of course, the new office’s responsibilities focused on investigating counterfeit money and had nothing to do with protecting the president’s personal safety. (That development didn’t happen until after 1901.) So, while the fact was true, the poster distorted its context.
And context matters just as much as factual accuracy if you want to credibly prove your point. With enough selectivity, most facts can be bent to bolster a claim, especially when debates or controversies become tinged with emotion.
Given these complications, do facts even matter? Yes. Sometimes, anyway.