Conspiracy Theories and the Problem of Disbelief
Some people believe the most extraordinary things. Earth is flat, and airplane GPS is rigged to fool pilots into thinking otherwise. COVID-19 vaccines are a pretext to inject thought-controlling microchips into us all. The true president of the United States is Donald Trump; his inauguration will happen on January 20, make that March 4, make that a date to be arranged very soon.
The question “How could anybody believe this stuff?” comes naturally enough. That may not be the most helpful question, however. Conspiracy theorists believe strange ideas, yes. But these outlandish beliefs rest on a solid foundation of disbelief.
To think that Trump is actually still the president, as some in the QAnon movement do, you first have to doubt. You have to doubt the journalism practiced by any mainstream media outlet of any political persuasion; you have to doubt all the experts and the political elites; you have to doubt the judiciary, the military, and every other American institution. Once you have thoroughly disbelieved all of them, only then can you start to believe that Trump's ascension is just around the corner, or in lizard overlords or alien prophets.
Is the line between excessive doubt and excessive belief a distinction without a difference? I don't think so, because the distinction helps inform how to bring a conspiracy theorist back to reality. One must recognize that this is a person who already mistrusts what most authoritative sources say. One should ask calm questions, inviting the conspiracy theorist to explain and reflect on his beliefs, rather than advance evidence or quote the experts. The evidence and the experts, remember, are exactly what the conspiracy theorist has already rejected.
When someone has dismissed the obvious facts, repeating them will not persuade him to see sense. But when people are given time and space to explain themselves, they may start to spot the gaps in their own knowledge or arguments. The psychologists Leonid Rozenblit and Frank Keil coined the phrase “the illusion of explanatory depth” to refer to the way our self-assurance crumples when we are invited to explain apparently simple ideas.
A focus on excessive credulity distracts from the problem of excessive doubt, which is everywhere in our modern information ecosystem. We are all capable of motivated reasoning, of believing what we want to believe. But we are all also capable of doubting what we want to doubt, and studies have found that motivated reasoning has a special power when it takes the form of doubt.
A couple of decades ago, the psychologists Kari Edwards and Edward Smith conducted an experiment in which they asked their subjects to read simple arguments about politically fraught topics such as the death penalty. They then invited these people to produce further arguments and counterarguments. Unsurprisingly, Edwards and Smith found that preconceptions mattered: People found it easier to argue with the grain of their prior beliefs.
More striking was that this bias was clearer when people were on the attack, trying to refute an argument they disliked, than when they were weighing arguments they were inclined to defend. When trying to rebut an unwelcome position, people found it easy to make long lists of reasons to doubt. Disbelief flowed freely, and the bias in what people rejected was much clearer than the bias in what they accepted.
Propagandists have long understood this quirk of human psychology. In the 1950s, when Big Tobacco faced growing evidence that cigarettes were deadly, the industry turned doubt into a weapon. Realizing that smokers dearly wished to believe that their habit wasn’t killing them, Big Tobacco concluded that the best approach was not to try to prove that cigarettes were safe. Instead, it would merely raise doubts about the emerging evidence that they were dangerous. The famous “Frank Statement to Cigarette Smokers” from 1954 managed to look socially responsible while simultaneously reassuring smokers that “research scientists have publicly questioned” the significance of the new findings.
Publicly questioning things is what research scientists always do, but that didn’t matter. The artful message from the tobacco industry to smokers was “This is complicated, and we’ll pay attention to it so that you don’t have to.” When we are confronted with unwelcome evidence, we don’t need much of an excuse to reject it.
Trump seemed to channel this body of thought when he seized upon a moral panic about a few transparently silly stories—“fake news”—and created a catchphrase to smear serious journalists. While we in the media wrung our hands at the idea that people might believe the Pope had endorsed Trump, Trump himself realized that the real danger—and for him, the real opportunity—was different. It was not that people would believe such nonsense, but that they could be persuaded to disbelieve authoritative, carefully sourced journalism.
“Deepfakes”—the technology that creates plausible footage of people saying and doing things that they did not—provide a similar lesson. (Deepfakes of Tom Cruise seem to be popular right now.) One researcher reassured Radiolab that “if people know that such technology exists, then they will be more skeptical.” She may be wrong about that, but I am more worried that she is right—that deepfakes create a world of unlimited deniability. Say anything, do anything, and even if the cameras are rolling, you can claim it never happened. We’re not yet at that point, but the trajectory is hardly reassuring.
Journalists need to take the problem of weaponized doubt more seriously. Fact-checking outfits in particular, such as PolitiFact, FactCheck.org, and Snopes, must take care not to breed cynicism. The risk is that they create the sense that lies are ubiquitous, which is why the best fact-checkers spend as much effort explaining what is true as they do exposing what is false.
The ultimate cautionary tale here is Darrell Huff’s 1954 classic, How to Lie With Statistics. Huff’s book is clever, insightful, and impish, and it may be the best-selling book about statistics ever written. It is also, from cover to cover, a warning that statistics are all about misinformation, and that one should no more believe in them than in stage magic. Huff ended up testifying at a Senate hearing that the evidence linking smoking and cancer was as spurious as the evidence linking storks and babies. His unpublished sequel, How to Lie With Smoking Statistics, was paid for by a tobacco-lobby group.
Yes, it is easy to lie with statistics, but it is much easier to lie without them. It is dangerous to warn that lies are universal. Skepticism is important, but we should recognize how easily it can curdle into cynicism: a reflexive dismissal of any data or testimony that does not fit neatly into our preconceived ideas.
The events of January 6 showed us that conspiracy thinking can have serious consequences. But this is not just about the conspiracy theorists. The psychological traits that lead one down the conspiracy-theory rabbit hole are to some extent present in most of us. We all like to listen to people who agree with us. We are all prone to reject unwelcome evidence. We are all more engaged by dramatic stories than by gritty policy detail. And we all like to feel that we have insights into the world that others lack. Nobody likes to feel that they are being taken for a fool, so doubting early and often can seem like the smart thing to do. And if we want to think clearly about the world, skepticism is a good thing.
But it’s possible to have too much of a good thing. Indiscriminate belief is worrying, but indiscriminate doubt can be even worse.