Everyone is sharing this comic about the ‘backfire effect’ … but there’s a huge catch
You’ve probably already seen The Oatmeal comic in your social media feed several times now.
Its beautiful illustrations are paired with an elegant, clever explanation about something called the “backfire effect.” Basically, it describes why people double down on their beliefs when presented with contradictory information.
The cartoon is powerful because we can all relate to that feeling of using facts to inform a heated political debate or sway someone’s opinion and getting nowhere. And that’s exactly the problem the comic’s author, Matthew Inman, wanted to address, especially in the wake of Donald Trump’s election.
The only problem is that political scientists aren’t sure the backfire effect is a real thing, and if it does exist, it may be rare.
We know what you’re thinking: Why do the fact police have to ruin the best thing that happened to your social media feed all week? The cartoon is pretty, funny, smart, and even hopeful about the importance of finding common ground when we vehemently disagree.
That’s all great stuff, and very important. But what you should keep in mind while reading the cartoon is that the backfire effect can be hard to replicate in rigorous research. So hard, in fact, that a large-scale, peer-reviewed study presented last August at the American Political Science Association’s annual conference couldn’t reproduce the findings of the high-profile 2010 study that documented the backfire effect.
FWIW idea that backfire fx always happen = not even true in our initial study. But I’ve revised my priors a lot as we & others did more work
— Brendan Nyhan (@BrendanNyhan) March 26, 2017
Tom Wood and Ethan Porter, political scientists and assistant professors at The Ohio State University and George Washington University, respectively, and co-authors of the recent study, say they came to the subject of the backfire effect as “acolytes.”
They found this particular explanation of human behavior so compelling that they wanted to dedicate a good portion of their research to understanding and identifying it. So they challenged 8,100 people’s knowledge of abortion, gun violence, undocumented immigration, fracking pollution, and dozens of other issues that stir intense emotions. But study participants didn’t demonstrate the tendency to embrace falsehoods even more after being told the truth.
Technically, they did observe a backfire effect when people were questioned about the existence of weapons of mass destruction in Iraq, but even that finding came with caveats because of the question’s complicated wording.
“We were desperately looking for any evidence … and to our dismay it’s impossible to replicate,” says Wood.
This is important, Wood and Porter say, because if the backfire effect exists, it means something really depressing about our politics. After all, if sharing objective facts with someone leads them to believe the falsities you challenge more intensely, then what’s the point?
“If we believe that everybody is backfiring all the time, there’s very little hope for political engagement,” says Porter.
Now, this doesn’t mean that Inman’s comic is inherently wrong. Brendan Nyhan, the political scientist who co-authored the 2010 study, has found evidence in subsequent research that people may insist on false beliefs despite being presented with new information. At the same time, Nyhan has since collaborated with Porter and Wood on research that shows fact-checking can be effective.
Whether or not there is a backfire effect, the behavior Inman describes is real; political scientists know it as motivated reasoning and confirmation bias. These well-researched psychological phenomena mean that we can be prone to choosing information and data that support our worldview while diminishing or dismissing evidence that contradicts it. To be clear, that’s a lot different than learning something is false and endorsing that lie or half-truth even more. Moreover, Porter and Wood’s study indicates people do actually heed corrective information.
The trouble is that even when we learn that something is false, we may be able to acknowledge those facts without changing our political position accordingly. A person’s political identity, say Porter and Wood, isn’t easily influenced by learning, for example, that Trump routinely spreads false information about pretty much everything, or that Hillary Clinton has told her share of half-truths.
You can sum up that tension like this, says Wood: “My guy happened to tell a fib — sure no one is perfect — but I’m not going to go out and vote for the other guy.”
That still leaves the rest of us trying to figure out how to talk through our dueling beliefs, which is where Inman’s comic shines. “The emotional core of this is about this idea of how we resist things and how do we get [people] to soften,” he says.
Inman knows from his own experience on the internet that marshaling all the facts in the world can’t, for example, convince some people that climate change is real.
If the backfire effect is real, nihilism might be the most appropriate response to the prospect of influencing anyone’s attitudes or beliefs with facts.
But Inman rejects that approach and instead invokes our common humanity and ends with a bipartisan plea to listen.
“I’m not here to take control of the wheel,” he writes. “Or to tell you what to believe. I’m just here to tell you that it’s okay to stop. To listen. To change.”
Those common sense words of wisdom are the best part of the comic, and you don’t really need science to confirm that the ability to listen and change is essential to a more civil, informed politics.