Don’t Believe What They’re Telling You About Misinformation


Railing against social media for manipulating our zombie minds might just distract us from our collective societal failures. Illustration by Till Lauer

Millions of people have watched Mike Hughes die. It happened on February 22, 2020, not far from Highway 247 near the Mojave Desert city of Barstow, California. A homemade rocket ship with Hughes strapped in it took off from a launching pad mounted on a truck. A trail of steam billowed behind the rocket as it swerved and then shot upward, a detached parachute unfurling ominously in its wake. In a video recorded by the journalist Justin Chapman, Hughes disappears into the sky, a dark pinpoint in a vast, uncaring blueness. But then the rocket reappears and hurtles toward the ground, crashing, after ten long seconds, in a dusty cloud half a mile away.

Hughes was among the best-known proponents of Flat Earth theory, which insists that our planet is not spherical but a Frisbee-like disk. He had built and flown in two rockets before, one in 2014 and another in 2018, and he planned to construct a “rockoon,” a combination rocket and balloon, that would carry him above the upper atmosphere, where he could see the Earth’s flatness for himself. The 2020 takeoff, staged for the Science Channel series “Homemade Astronauts,” was supposed to take him a mile up—not high enough to see the Earth’s curvature but hypeworthy enough to garner more funding and attention.

Flat Earth theory may sound like one of those deliberately far-fetched satires, akin to Birds Aren’t Real, but it has become an object of cultish devotion among anti-scientific conspiracists, growing entangled with movements such as QAnon and COVID-19 skepticism. In “Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything” (Algonquin), the former Daily Beast reporter Kelly Weill writes that the tragedy awakened her to the sincerity of Flat Earthers’ convictions. After investigating the Flat Earth scene and following Hughes, she had figured that, “on some subconscious level,” Hughes knew the Earth wasn’t flat. His death set her straight: “I was wrong. Flat Earthers are as serious as your life.”

Weill isn’t the only one to fear the effects of false information. In January, the World Economic Forum released a report showing that fourteen hundred and ninety international experts rated “misinformation and disinformation” the leading global risk of the next two years, surpassing war, migration, and climatic catastrophe. A stack of new books echoes their concerns. In “Falsehoods Fly: Why Misinformation Spreads and How to Stop It” (Columbia), Paul Thagard, a philosopher at the University of Waterloo, writes that “misinformation is threatening medicine, science, politics, social justice, and international relations, affecting problems such as vaccine hesitancy, climate change denial, conspiracy theories, claims of racial inferiority, and the Russian invasion of Ukraine.” In “Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity” (Norton), Sander van der Linden, a social-psychology professor at Cambridge, warns that “viruses of the mind” disseminated by false tweets and misleading headlines pose “serious threats to the integrity of elections and democracies worldwide.” Or, as the M.I.T. political scientist Adam J. Berinsky puts it in “Political Rumors: Why We Accept Misinformation and How to Fight It” (Princeton), “a democracy where falsehoods run rampant can only result in dysfunction.”

Most Americans seem to agree with these theorists of human credulity. Following the 2020 Presidential race, sixty per cent thought that misinformation had a major impact on the outcome, and, to judge from a recent survey, even more believe that artificial intelligence will exacerbate the problem in this year’s contest. The Trump and DeSantis campaigns both used deepfakes to sully their rivals. Although they justified the fabrications as transparent parodies, some experts anticipate a “tsunami of misinformation,” in the words of Oren Etzioni, a professor emeritus at the University of Washington and the first C.E.O. of the Allen Institute for Artificial Intelligence. “The ingredients are there, and I am completely terrified,” he told the Associated Press.

The fear of misinformation hinges on assumptions about human suggestibility. “Misinformation, conspiracy theories, and other dangerous ideas latch on to the brain and insert themselves deep into our consciousness,” van der Linden writes in “Foolproof.” “They infiltrate our thoughts, feelings, and even our memories.” Thagard puts it more plainly: “People have a natural tendency to believe what they hear or read, which amounts to gullibility.”

But do the credulity theorists have the right account of what’s going on? People like Mike Hughes aren’t gullible in the sense that they’ll believe anything; they reject the scientific consensus, after all. Partisans of other well-known conspiracy theories (the government is run by lizard people; a cabal of high-level pedophilic Democrats operates out of a neighborhood pizza parlor) are impervious to the assurances of the mainstream media. Have we been misinformed about the power of misinformation?

In 2006, more than five hundred skeptics met at an Embassy Suites hotel near O’Hare Airport, in Chicago, to discuss conspiracy. They listened to presentations on mass hypnosis, the melting point of steel, and how to survive the collapse of the existing world order. They called themselves many things, including “truth activists” and “9/11 skeptics,” although the name that would stick, and which observers would use for years afterward, was Truthers.

The Truthers held that the attacks on the Pentagon and the World Trade Center were masterminded by the White House to expand government power and enable military and security industries to profit from the war on terror. According to an explanation posted by 911truth.org, a group that helped sponsor the conference, George W. Bush and his allies gagged and intimidated whistle-blowers, mailed anthrax to opponents in the Senate, and knowingly poisoned the inhabitants of lower Manhattan. On that basis, Truthers concluded, “the administration does consider the lives of American citizens to be expendable on behalf of certain interests.”

“Out of this dispute, a clear leader will emerge.”
Cartoon by Frank Cotham

The Truthers, in short, maintained that the government had gone to extreme lengths, including killing thousands of its own citizens, in order to carry out and cover up a conspiracy. And yet the same Truthers advertised the conference online and met in a place where they could easily be surveilled. Speakers’ names were posted on the Internet along with videos, photographs, and short bios. The organizers created a publicly accessible forum to discuss next steps, and a couple of attendees spoke to a reporter from the Times, despite the mainstream media’s ostensible complicity in the coverup. By the logic of their own theories, the Truthers were setting themselves up for assassination.

Their behavior demonstrates a paradox of belief. Action is supposed to follow belief, and yet beliefs, even fervently espoused ones, sometimes exist in their own cognitive cage, with little influence over behavior. Take the “Pizzagate” story, according to which Hillary Clinton and her allies ran a child sex ring from the basement of a D.C. pizzeria. In the months surrounding the 2016 Presidential election, a staggering number of Americans—millions, by some estimates—endorsed the account, and, in December of that year, a North Carolina man charged into the restaurant, carrying an assault rifle. Van der Linden and Berinsky both use the incident as evidence of misinformation’s violent implications. But they’re missing the point: what’s really striking is how anomalous that act was. The pizzeria received menacing phone calls, even death threats, but the most common response from believers, aside from liking posts, seems to have been leaving negative Yelp reviews.

That certain deeply held beliefs seem insulated from other inferences isn’t peculiar to conspiracy theorists; it’s the experience of regular churchgoers. Catholics maintain that the Sacrament is the body of Christ, yet no one expects the bread to taste like raw flesh or accuses fellow-parishioners of cannibalism. In “How God Becomes Real” (2020), the Stanford anthropologist T. M. Luhrmann recounts evangelical Christians’ frustrations with their own beliefs. They thought less about God when they were not in church. They confessed to not praying. “I remember a man weeping in front of a church over not having sufficient faith that God would replace the job he had lost,” Luhrmann writes. The paradox of belief is one of Christianity’s “clearest” messages, she observes: “You may think you believe in God, but really you don’t. You don’t take God seriously enough. You don’t act as if he’s there.” It’s right out of Mark 9:24: “Lord, I believe; help my unbelief!”

The paradox of belief has been the subject of scholarly investigation; puzzling it out promises new insights about the human psyche. Some of the most influential work has been by the French philosopher and cognitive scientist Dan Sperber. Born into a Jewish family in France in 1942, during the Nazi Occupation, Sperber was smuggled to Switzerland when he was three months old. His parents returned to France three years later, and raised him as an atheist while imparting a respect for all religious-minded people, including his Hasidic Jewish ancestors.

The exercise of finding rationality in the seemingly irrational became an academic focus for Sperber in the nineteen-seventies. Staying with the Dorze people in southern Ethiopia, he noticed that they made assertions that they seemed both to believe and not to believe. People told him, for example, that “the leopard is a Christian animal who observes the fasts of the Ethiopian Orthodox Church.” Nevertheless, the average Dorze man guarded his livestock on fast days just as much as on other days. “Not because he suspects some leopards of being bad Christians,” Sperber wrote, “but because he takes it as true both that leopards fast and that they are always dangerous.”

Sperber concluded that there are two kinds of beliefs. The first he has called “factual” beliefs. Factual beliefs—such as the belief that chairs exist and that leopards are dangerous—guide behavior and tolerate little inconsistency; you can’t believe that leopards do and do not eat livestock. The second category he has called “symbolic” beliefs. These beliefs might feel genuine, but they’re cordoned off from action and expectation. We are, in turn, much more accepting of inconsistency when it comes to symbolic beliefs; we can believe, say, that God is all-powerful and good while allowing for the existence of evil and suffering.

In a masterly new book, “Religion as Make-Believe” (Harvard), Neil Van Leeuwen, a philosopher at Georgia State University, returns to Sperber’s ideas with notable rigor. He analyzes beliefs with a taxonomist’s care, classifying different types and identifying the properties that distinguish them. He proposes that humans represent and use factual beliefs differently from symbolic beliefs, which he terms “credences.” Factual beliefs are for modelling reality and behaving optimally within it. Because of their function in guiding action, they exhibit features like “involuntariness” (you can’t decide to adopt them) and “evidential vulnerability” (they respond to evidence). Symbolic beliefs, meanwhile, largely serve social ends, not epistemic ones, so we can hold them even in the face of contradictory evidence.

One of Van Leeuwen’s insights is that people distinguish between different categories of belief in everyday speech. We say we “believe” symbolic ones but that we “think” factual ones are true. He has run ingenious experiments showing that you can manipulate how people talk about beliefs by changing the environment in which they’re expressed or sustained. Tell participants that a woman named Sheila sets up a shrine to Elvis Presley and plays songs on his birthday, and they will more often say that she “believes” Elvis is alive. But tell them that Sheila went to study penguins in Antarctica in 1977, and missed the news of his death, and they’ll say she “thinks” he’s still around. As the German sociologist Georg Simmel recognized more than a century ago, religious beliefs seem to express commitments—we believe in God the way we believe in a parent or a loved one, rather than the way we believe chairs exist. Perhaps people who traffic in outlandish conspiracies don’t so much believe them as believe in them.

Van Leeuwen’s book complements a 2020 volume by Hugo Mercier, “Not Born Yesterday.” Mercier, a cognitive scientist at the École Normale Supérieure who studied under Sperber, argues that worries about human gullibility overlook how skilled we are at acquiring factual beliefs. Our understanding of reality matters, he notes. Get it wrong, and the consequences can be disastrous. On top of that, people have a selfish interest in manipulating one another. As a result, human beings have evolved a tool kit of psychological adaptations for evaluating information—what he calls “open vigilance mechanisms.” Where a credulity theorist like Thagard insists that humans tend to believe anything, Mercier shows that we are careful when adopting factual beliefs, and instinctively assess the quality of information, especially by tracking the reliability of sources.

Van Leeuwen and Mercier agree that many beliefs are not best interpreted as factual ones, although they lay out different reasons for why this might be. For Van Leeuwen, a major driver is group identity. Beliefs often function as badges: the stranger and more unsubstantiated the better. Religions, he notes, define membership on the basis of unverifiable or even unintelligible beliefs: that there is one God; that there is reincarnation; that this or that person was a prophet; that the Father, the Son, and the Holy Spirit are separate yet one. Mercier, in his work, has focussed more on justification. He says that we have intuitions—that vaccination is bad, for example, or that certain politicians can’t be trusted—and then collect stories that defend our positions. Still, both authors treat symbolic beliefs as socially strategic expressions.

After Mike Hughes’s death, a small debate broke out over the nature of his belief. His publicist, Darren Shuster, said that Hughes never really believed in a flat Earth. “It was a P.R. stunt,” he told Vice News. “We used the attention to get sponsorships and it kept working over and over again.” Space.com dug up an old interview corroborating Shuster’s statements. “This flat Earth has nothing to do with the steam rocket launches,” Hughes told the site in 2019. “It never did, it never will. I’m a daredevil!”

Perhaps it made sense that it was just a shtick. Hughes did death-defying stunts years before he joined the Flat Earthers. He was born in Oklahoma City in 1956 to an auto-mechanic father who enjoyed racing cars. At the age of twelve, Hughes was racing on his own, and not long afterward he was riding in professional motorcycle competitions. In 1996, he got a job driving limousines, but his dream of becoming the next Evel Knievel persisted; in 2002, he drove a Lincoln Town Car off a ramp and flew a hundred and three feet, landing him in Guinness World Records.

When Hughes first successfully launched a rocket, in 2014, he had never talked about the shape of the planet. In 2015, when he co-ran a Kickstarter campaign to fund the next rocket flight, the stated motivation was stardom, not science: “Mad Mike Hughes always wanted to be famous so much that he just decided one day to build a steam rocket and set the world record.” He got two backers and three hundred and ten dollars. Shortly afterward, he joined the Flat Earth community and tied his crusade to theirs. The community supported his new fund-raising effort, which attracted more than eight thousand dollars. From there, his fame grew, earning him features in a documentary (“Rocketman,” from 2019) and that Science Channel series. Aligning with Flat Earthers clearly paid off.

Not everyone believes that he didn’t believe, however. Waldo Stakes, Hughes’s landlord and rocket-construction buddy, wrote on Facebook that “Mike was a real flat earther,” pointing to the “dozens of books on the subject” he owned, and said that Hughes lost money hosting a conference for the community. Another of Hughes’s friends told Kelly Weill that Flat Earth theory “started out as a marketing approach,” but that once it “generated awareness and involvement . . . it became something to him.”

The debate over Hughes’s convictions centers on the premise that a belief is either sincere or strategic, genuine or sham. That’s a false dichotomy. Indeed, the social functions of symbolic beliefs—functions such as signalling group identity—seem best achieved when the beliefs feel earnest. A Mormon who says that Joseph Smith was a prophet but secretly thinks he was a normal guy doesn’t strike us as a real Mormon. In fact, the evolutionary theorist Robert Trivers argued in “Deceit and Self-Deception” (2011) that we trick ourselves in order to convince others. Our minds maintain two representations of reality: one that feels true and that we publicly advocate, and another that we use to interact with the world effectively.

“I can say literally anything and they use it for spa music.”
Cartoon by Sarah Morrissette

The idea of self-deception might seem like a stretch; Mercier has expressed skepticism about the theory. But it reconciles what appear to be contradictory findings. On the one hand, some research suggests that people’s beliefs in misinformation are authentic. In “Political Rumors,” for example, Berinsky describes experiments he conducted suggesting that people truly believe that Barack Obama is a Muslim and that the U.S. government allowed the 9/11 attacks to happen. “People by and large say what they mean,” he concludes.

On the other hand, there’s research implying that many false beliefs are little more than cheap talk. Put money on the table, and people suddenly see the light. In an influential paper published in 2015, a team led by the political scientist John Bullock found sizable differences in how Democrats and Republicans thought about politicized topics, like the number of casualties in the Iraq War. Paying respondents to be accurate, which included rewarding “don’t know” responses over wrong ones, cut the differences by eighty per cent. A series of experiments published in 2023 by van der Linden and three colleagues replicated the well-established finding that conservatives deem false headlines to be true more often than liberals—but found that the difference drops by half when people are compensated for accuracy. Some studies have reported smaller or more inconsistent effects, but the central point still stands. There may be people who believe in fake news the way they believe in leopards and chairs, but underlying many genuine-feeling endorsements is an understanding that they’re not exactly factual.

Van der Linden, Berinsky, and Thagard all offer ways to fight fabrication. But, because they treat misinformation as a problem of human gullibility, the remedies they propose tend to focus on minor issues, while scanting the larger social forces that drive the phenomenon. Consider van der Linden’s prescription. He devotes roughly a third of “Foolproof” to his group’s research on “prebunking,” or psychological inoculation. The idea is to present people with bogus information before they come across it in the real world and then expose its falsity—a kind of epistemic vaccination. Such prebunking can target specific untruths, or it can be “broad-spectrum,” as when people are familiarized with an array of misinformation techniques, from emotional appeals to conspiratorial language.

Prebunking has received an extraordinary amount of attention. If you’ve ever read a headline about a vaccine against fake news, it was probably about van der Linden’s work. His team has collaborated with Google, WhatsApp, the Department of Homeland Security, and the British Prime Minister’s office; similar interventions have popped up on Twitter (now X). In “Foolproof,” van der Linden reviews evidence that prebunking makes people better at identifying fake headlines. Yet he says nothing about its effects on their actual behavior. Does prebunking affect medical decisions? Does it make someone more willing to accept electoral outcomes? We’re left wondering.

The evidential gap is all the more glaring because little research exists in the first place showing that misinformation affects behavior by changing beliefs. Berinsky acknowledges this in “Political Rumors” when he writes that “few scholars have established a direct causal link” between rumors and real-world outcomes. Does the spread of misinformation influence, say, voting decisions? Van der Linden admits, “Contrary to much of the commentary you may find in the popular media, scientists have been extremely skeptical.”

So it’s possible that we’ve been misinformed about how to fight misinformation. What about the social conditions that make us susceptible? Van der Linden tells us that people are more often drawn to conspiracy theories when they feel “uncertain and powerless,” and regard themselves as “marginalized victims.” Berinsky cites scholarship suggesting that conspiratorial rumors flourish among people who experience “a lack of interpersonal trust” and “a sense of alienation.” In his own research, he found that a big predictor of accepting false rumors is agreeing with statements such as “Politicians do not care much about what they say, so long as they get elected.” A recent study found a strong correlation between the prevalence of conspiracy beliefs and levels of governmental corruption; in the prevalence of such beliefs, Americans fell midway between people in Denmark and Sweden and people in middle-income countries such as Mexico and Turkey, reflecting a fraying sense of institutional integrity. More than Russian bots or click-hungry algorithms, a crisis of trust and legitimacy seems to lie behind the proliferation of paranoid falsehoods.

Findings like these require that we rethink what misinformation represents. As Dan Kahan, a legal scholar at Yale, notes, “Misinformation is not something that happens to the mass public but rather something that its members are complicit in producing.” That’s why thoughtful scholars—including the philosopher Daniel Williams and the experimental psychologist Sacha Altay—encourage us to see misinformation more as a symptom than as a disease. Unless we address issues of polarization and institutional trust, they say, we’ll make little headway against an endless supply of alluring fabrications.

From this perspective, railing against social media for manipulating our zombie minds is like cursing the wind for blowing down a house we’ve allowed to go to rack and ruin. It distracts us from our collective failures, from the conditions that degrade confidence and leave much of the citizenry feeling disempowered. By declaring that the problem consists of “irresponsible senders and gullible receivers,” in Thagard’s words, credulity theorists risk ignoring the social pathologies that cause people to become disenchanted and motivate them to rally around strange new creeds.

Mike Hughes was among the disenchanted. Sure, he used Flat Earth theory to become a celebrity, but its anti-institutionalist tone also spoke to him. In 2018, while seeking funding and attention for his next rocket ride, he self-published a book titled “ ‘Mad’ Mike Hughes: The Tell All Tale.” The book brims with outlandish, unsupported assertions—that George H. W. Bush was a pedophile, say—but they’re interspersed with more grounded frustrations. He saw a government commandeered by the greedy few, one that stretched the truth to start a war in Iraq, and that seemed concerned less with spreading freedom and more with funnelling tax dollars into the pockets of defense contractors. “You think about those numbers for a second,” he wrote, of the amount of money spent on the military. “We have homelessness in this country. We could pay off everyone’s mortgages. And we can eliminate sales tax. Everyone would actually be free.”

Hughes wasn’t a chump. He just felt endlessly lied to. As he wrote near the end of his book, “I want my coffee and I don’t want any whipped cream on top of it, you know what I mean? I just want this raw truth.” ♦
