BOB GARFIELD: From WNYC in New York, this is On the Media. I’m Bob Garfield.
BROOKE GLADSTONE: And I’m Brooke Gladstone. This show is devoted to the idea of doubt, its costs and benefits. America runs on the casting of doubt – doubt over global warming, doubt over the veracity of the Congressional Budget Office, doubt in the nation's intelligence agencies, in the honesty of all the engines of accountability, including the mainstream media and, lately, over the functioning of democracy itself. Here’s Kris Kobach, the Kansas Secretary of State and the Vice Chairman of Trump's Advisory Council on Election Integrity, Wednesday on MSNBC.
MSNBC HOST KATY TUR: Again, you think that maybe Hillary Clinton did not win the popular vote.
SEC. OF STATE KRIS KOBACH: We may never know the answer to that question. I know there are allegations –
KATY TUR: Why do you think we may never know the answer to that question, really? Are the votes for Donald Trump that led him to win the election in doubt, as well?
SEC. OF STATE KOBACH: Absolutely, they – if, if there are…
BROOKE GLADSTONE: It can be awfully hard to un-sow the seeds of doubt. Doubt and denial, even in the face of hard evidence, are often the easiest course, especially when they comport with your worldview. In fact, studies have shown the existence of a “backfire effect,” whereby people actually double down on their false ideas when they're confronted with irrefutable facts. I learned this in 2010, when I spoke to Brendan Nyhan, professor of government at Dartmouth College and one of the researchers behind the discovery of the backfire effect.
BRENDAN NYHAN: My coauthor, Jason Reifler, and I looked at, can the media effectively correct misperceptions, which seems like a simple question but no one had really tested that scientifically.
BROOKE GLADSTONE: And you found, actually, that when people had their misperceptions challenged, certain people, at least, were more likely to become more firmly entrenched in that belief.
BRENDAN NYHAN: That's right. People were so successful at bringing to mind reasons that the correction was wrong that they actually ended up being more convinced in the misperception than the people who didn't receive the correction. So the correction, in other words, was making things worse.
BROOKE GLADSTONE: The “backfire effect,” a delicious paradox and now, sadly, subject to its own correction. Do I really have to let it go? Brendan, welcome back to the show.
BRENDAN NYHAN: Thanks for having me.
BROOKE GLADSTONE: Okay, so you – we – have been talking about this for years. I even quoted you in my book, cited your research that a group of Republican volunteers, when shown evidence that there were no weapons of mass destruction in Iraq, doubled down on their beliefs. But recently, you've had your own views challenged. You were presented with the work of other researchers who found much less evidence of the backfire effect across a wider range of factual questions and, in fact, found that evidence tended to move people in the direction of accuracy.
BRENDAN NYHAN: That’s right. They found that giving people factual information did tend to improve the accuracy of their beliefs. They found little evidence of a backfire effect, which is encouraging. So Jason Reifler and I, who did the original backfire effect study, joined up with Ethan Porter and Thomas Wood to do a study together, to try to continue to move the field forward, using a set of research designs that we all agreed would be a good test of the hypothesis of interest in this case: what effect would corrective information have during the 2016 campaign, when partisan-motivated reasoning might have been at its height?
BROOKE GLADSTONE: So you took misstatements by Donald Trump, one during his convention speech and one during the first presidential debate, as a test case.
BRENDAN NYHAN: That's right, and we wanted to see what would happen if we gave people corrective information. In the convention speech, we were interested in the suggestion that crime rates were increasing. There was a small uptick in the data that came out soon before the Republican Convention, which Trump leveraged to suggest this epidemic of violent crime which didn't really exist. We were actually in a long-term trend of decreasing violent crime rates in the United States and we wanted to see what happened if people were given evidence about the longer-term decline in violent crime that we’d seen.
Well, let me start with the good news. We found that telling people that violent crime was actually down over the long term increased the accuracy of people's beliefs. They were less likely to think that violent crime was increasing in the long-term, including among Trump supporters. So, in other words, that corrected information didn't backfire. Trump supporters did not double down on his suggestions of this violent crime epidemic.
BROOKE GLADSTONE: Okay, so that was the good news.
BRENDAN NYHAN: The bad news is that exposure to that corrected information didn't change how favorable they felt towards Trump, which is what you might hope if fact checking posed a significant reputational risk to the politicians who were making false claims.
BROOKE GLADSTONE: So the suggestion is they just don't care that their favorite candidate plays fast and loose with the truth.
BRENDAN NYHAN: By the time we conducted our study, in the summer of 2016, it was hard for anybody to pretend that everything Donald Trump said was perfectly factually accurate.
BROOKE GLADSTONE: I’m still wondering, do you have to carve out an exception for Trump?
BRENDAN NYHAN: I don’t think we need to carve out an exception for Trump. It’s just we should take into account the extent to which people already had well-formed views of him. So if you're trying to think about why people might accept that he was making false claims, well, the pattern of that had become overwhelming by the time of the study. If you’re trying to think of why people wouldn’t change their opinion of him, people had very strong fixed opinions of him, relative to other people in politics. It may be the case that we can change people's minds about facts but they don't change their minds in the way we expect about the politician in question or the issue in question or even the behavior in question.
BROOKE GLADSTONE: Well, there is a recent example on a national scale of minds that may have been changed. This week, we saw the collapse of the Senate's attempt to repeal and replace the Affordable Care Act. According to an NPR/PBS NewsHour/Marist poll from the end of June, only 17% of Americans approved of the Senate Republican replacement – pretty striking, given that not so long ago support for Obamacare had dipped below 50%.
BRENDAN NYHAN: I think healthcare is a very instructive example. One of the things that researchers have been trying to figure out is when will reality break through for people, in a way that registers even when there are strong partisan filters? And what we saw in healthcare was people seemed to be reacting to the bill in a way that didn't break neatly along partisan lines. Compare that case, for instance, with Russia, where the issues are much more abstract, have less to do with people's lives and the personal stakes are lower, that looks much more like the partisan divide in this country and the divide over Donald Trump.
BROOKE GLADSTONE: It’s hard to study how facts change beliefs over time, isn’t it?
BRENDAN NYHAN: It’s very difficult. Our findings were quite encouraging about the immediate effect of corrective information. It seemed to make people’s beliefs more accurate. But when we look in the long term at public opinion polls, asking people, do you think there were weapons of mass destruction in Iraq or was Barack Obama born in this country, we don't see corrected information registering in the same way. Those beliefs often are quite resilient. The effect of this information may briefly register and then disappear over time, as we saw after the release of President Obama's long-form birth certificate, for instance.
BROOKE GLADSTONE: There's one thing that I've seen repeated over and over again. If the fact check comes from the same side as the person who told the lie, then you've got a really potent and highly credible source.
BRENDAN NYHAN: The need for sources who seem to be acting against their political interests –
BROOKE GLADSTONE: Mm-hmm.
BRENDAN NYHAN: - is something that’s come out of recent research by Adam Berinsky at MIT, among others. Those speakers may be especially credible to audiences that would otherwise be skeptical. When you see a Republican speaking out, saying that something Donald Trump said is untrue, that may cause you to update your beliefs more than if you heard that from a Democrat who always says that everything Trump claims is wrong.
Reporters should try to invest the effort to find those especially credible sources when they can. It’s a partisan world out there. There aren’t as many people as we’d like who will speak out against their side, but if we can find them they can be great messengers.
BROOKE GLADSTONE: You know, this brings me back to the process that you underwent with the people who found contradictory information. Despite the fact that you've made hay of the backfire effect for quite some time, you were not inclined to double down.
BRENDAN NYHAN: I'm trying to practice what I preach, Brooke – as best I can. It would be a terrible irony if evidence contradicting the backfire effect provoked me into doubling down on the backfire effect.
BROOKE GLADSTONE: You could have created one of those famous scientific feuds. Aren't they more typical than what you did?
BRENDAN NYHAN: Unfortunately, they do happen a lot. I think if you talk to anyone in any academic field, they can tell you about years- or decades-long feuds between competing research camps. I hope that our study can be one tiny contribution towards a more constructive approach to resolving conflicting findings and moving scientific knowledge forward.
BROOKE GLADSTONE: How has this process of being challenged, of going back into the research perhaps changed your process? Where is it taking you?
BRENDAN NYHAN: It’s really exciting for me. If I was just banging the drum on the backfire effect for the next 20 years, I would be a bored, bored professor. [BROOKE LAUGHING] I want to be challenged. Doubt is fundamental to the scientific process. If you don’t doubt your own findings, you’re not doing science.
[MUSIC UP & UNDER]
It’s healthy for all of us to call our beliefs into question. And that’s, obviously, not something that our political system, in particular, is encouraging or rewarding right now. It’s difficult, of course, to doubt yourself, but hopefully I’m a better researcher and person for it. And, again, this is great for the world.
BROOKE GLADSTONE: Brendan, thank you very much.
BRENDAN NYHAN: My pleasure.
BROOKE GLADSTONE: Brendan Nyhan is a professor of government at Dartmouth College.
BOB GARFIELD: Coming up, more dubious science, doubting scientists and the promise of more certainty, down the road.
BROOKE GLADSTONE: This is On the Media.