Why dangerous conspiracy theories about the virus spread so fast — and how they can be stopped
Misinformation spreads online much like a virus itself. Although various types spread slightly differently, the transmission of the 5G conspiracy theory offers some insight into how false claims grow online.
How it starts
A “calamitous event” like the pandemic creates a “very fertile breeding ground for conspiracy theories,” said John Cook, an expert on misinformation with George Mason University’s Center for Climate Change Communication.
The onslaught of information and misinformation on social media, on cable news and in general conversation may create confusion, but it’s made even worse by human discomfort with ambiguity, especially when our lives are at stake.
Katie Pine, an assistant professor in the College of Health Solutions at Arizona State University, is currently interviewing people around the United States on how they’re navigating covid-19. She said people “feel like they’re inundated with information, but they don’t have the information they want,” and as a result, they might be more willing to believe outlandish claims.
In this case, it involved 5G, the newest, fastest type of cellular network, which began deploying globally in 2019. As is usually the case with new technologies, it has attracted its fair share of conspiracy theories. A general practitioner in Belgium named Kris Van Kerckhoven baselessly told the newspaper Het Laatste Nieuws in a Jan. 22 story that 5G was life-threatening and linked to the coronavirus, as Wired reported.
The newspaper quickly issued a correction and deleted the offending article from its website, but it was too late. Anti-5G groups began spreading the rumor, and some members of a frightened public, desperate for some sense of order, believed this deeply implausible lie.
“When people feel threatened or out of control or they’re trying to explain a big significant event, they’re more vulnerable or prone to turning to conspiracy theories to explain them,” Cook said. “Somewhat counterintuitively, it gives people more sense of control to imagine that, rather than random things happening, there are these shadowy groups and agencies that are controlling it. Randomness is very discomforting to people.”
The actual origin of a piece of misinformation, many experts agreed, doesn’t matter all that much, because it takes on a life of its own once released into the world.
“The big challenge here is human psychology, because our brains are built for making quick snap judgments,” Cook said. “It’s really hard for us to take the time and effort to think through things, fact-check and assess.”
How it spreads
“Misinformation spreads everywhere the same way information spreads everywhere,” said Leticia Bode, an associate professor in the communication, culture and technology master’s program at Georgetown University. And “repeating misinformation makes it seem more plausible over time.”
It’s certainly more difficult to fight misinformation if someone is purposely and relentlessly spreading it, which is partially the case with various 5G conspiracy theories. The Russian network RT America has been peddling disinformation about the mobile network since long before covid-19, in part, according to the New York Times, to slow the rollout in the United States and give Russia time to catch up.
RT helped plant the seeds of mistrust surrounding 5G. Van Kerckhoven watered them.
“There are people who believe one conspiracy theory or another because it fits their political beliefs, and there are some people for whom conspiracy theories are their beliefs,” said Mike Wood, a psychologist and expert on belief in conspiracy theories who studied the spread of misinformation during the Zika outbreak in 2016. “For those people, the specifics of the conspiracy theory don’t matter all that much.”
“A lot of times, these are recycled from earlier conspiracy theories,” he added. “Some of them are very predictable. If there’s a mass shooting, you know there are going to be conspiracy theories that it was a false flag done by a government, because there always are. In a pandemic, there’s immediately going to be conspiracy theories that the virus is either harmless, a bioweapon that’s going to kill everybody or an excuse for the government to give a vaccine that is going to kill everybody.”
Eventually, the 5G false claims spread from conspiratorial corners of the Internet into the mainstream, fueled by celebrities.
Singer M.I.A. baselessly tweeted, “I don’t think 5G gives you COVID19. I think it can confuse or slow the body down in healing process as body is learning to cope with new signals [wavelengths] frequency etc.” Actor John Cusack baselessly said that 5G will “be proven to be very very bad for people’s health,” in a tweet he later deleted. And Woody Harrelson explicitly linked 5G and the virus in two now-deleted Instagram posts. A Nigerian pastor named Chris Oyakhilome shared this conspiracy theory with his 2 million Facebook followers, while music producer Teddy Riley spread it on Instagram Live.
At that point, it became almost impossible to stop.
How it’s stopped
While the origin of a piece of information might not matter, the type of misinformation does. The idea that eating garlic can help fight the disease is a natural rumor — it’s just bad information. But a conspiracy theory supposes that a nefarious group of people are carrying out a plan, and that’s much more difficult to disprove.
“More natural rumors will go really fast and burn out,” said Kate Starbird, an associate professor with the Department of Human Centered Design and Engineering at the University of Washington. “But conspiracy theory rumors build up, and they persist.”
One reason for the disparity: Conspiracy theories often have something of a built-in safety mechanism in that they falsely implicate fact-checking organizations as being part of the conspiracy.
“For a conspiracy theory where the misinformation is wrapped up around this air of distrust, distrust of institutions, distrust of mainstream accounts, even distrust of the science, then any evidence that comes in that disproves the theory is seen as being part of the conspiracy,” Cook said.
So while there haven’t been many reports of people acting on some of the more outlandish claims that have gone around — such as the erroneous claim that drinking bleach will cure the virus — the 5G conspiracy theory has led to the real-world destruction of cell towers across Europe, according to government officials. In Britain alone, the New York Times reported, “more than 30 acts of arson and vandalism have taken place against wireless towers and other telecom gear” in the month of April, including towers in Liverpool and Birmingham on April 2 and 3.
Even when your readers or social media followers are open to a correction, it’s difficult to write about misinformation without spreading the very misinformation you’re debunking. “You need to put emphasis on facts,” Cook said. And “when a piece of misinformation is introduced, you must flag it as misinformation. So when people read it, they’re cognitively on-guard and less likely to believe it.”
But it can be done. Bode emphasized the importance of correcting untruths in person or on social media, saying, “You need to be nice and provide some indication that you know what you’re talking about. We generally recommend you provide some kind of link to an authoritative source like the CDC or the WHO.” Having multiple people weigh in with the truth is even more effective.
Perhaps most important is being careful about what information you choose to believe. Pine’s early findings suggest some people “wait until they see a headline in multiple places” before believing it. Even then, looking deeper is a good idea.
As Starbird put it, “Some of the conspiracy theories during a crisis event can be very attractive, so they can attract new sets of eyes.”