A Sense of Community Helped Spread COVID Conspiracies on Twitter

As COVID-19 spread around the world in 2020, it was followed by a mutating mass of conspiracy theories: The virus was a bioweapon created by a globalist cabal. Or China. Or Bill Gates. It was spread through 5G cell phone towers. Perhaps it didn’t exist at all, and all those reports of overwhelmed hospitals and thousands of deaths were a hoax.

Like infectious diseases, conspiratorial ideas can be traced as they disperse through social networks. “People aren’t born believing in conspiracy theories,” says Paul Vicinanza, a doctoral student in organizational behavior at Stanford Graduate School of Business who coauthored a new study in the American Sociological Review about the emergence and trajectory of various COVID-19 conspiracy theories on Twitter.

With Hayagreeva “Huggy” Rao, a professor of organizational behavior at Stanford GSB; Henrich Greve, PhD ’94, a professor of entrepreneurship at INSEAD; and Echo Yan Zhou, also a doctoral student in organizational behavior at Stanford GSB, Vicinanza found that people embrace conspiracy theories as a way to make sense of real threats, and that they join online communities that not only spread new theories but also provide attention and approval.

“We find that social connections are really what license this kind of thinking,” Rao says. “People seek positive attention, in this case through retweets, and voicing an extreme idea is often what gets rewarded.”

The pandemic offered a particularly rich opportunity to study online conspiracy theories as they took shape, evolved, and spread in real time. To understand the “cognitive architecture” of these ideas, the researchers collected around 700,000 COVID conspiracy tweets from January through July 2020. The tweets contained overlapping and often contradictory ideas. Some portrayed the virus as a bioweapon; others said it was a hoax or exaggeration. Some promoted anti-vaccination or QAnon-related messages. From this swirl of conspiracies, the researchers identified 13 specific topics with help from a machine learning model.
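The article does not say which model the team used, so the sketch below is purely illustrative: it shows how a generic topic-modeling pipeline (TF-IDF features plus non-negative matrix factorization from scikit-learn) could group a corpus of conspiracy tweets into a fixed number of topics for hand labeling. The toy tweets, the three-topic setting for this small example, and the library choice are all assumptions; only the figure of 13 topics drawn from roughly 700,000 tweets comes from the study.

```python
# Illustrative sketch only -- the study's actual model is not described in this
# article. This uses a generic topic-modeling pipeline (TF-IDF + NMF from
# scikit-learn) to show how a large tweet corpus can be grouped into topics.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# Toy stand-in for the ~700,000 conspiracy tweets collected in the study.
tweets = [
    "the virus was engineered in a lab as a bioweapon",
    "this so-called pandemic is a hoax, the hospitals are empty",
    "5g towers are what is really making people sick",
    "bill gates wants to put microchips in the vaccine",
    "death counts are being inflated to scare the public",
    "china released the virus on purpose",
]

# Turn raw tweet text into TF-IDF vectors.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)

# Factor the tweet-term matrix into latent topics. On the full corpus one
# would use n_components=13, the number of topics reported in the study;
# 3 keeps this toy example well-posed.
nmf = NMF(n_components=3, random_state=0)
doc_topics = nmf.fit_transform(X)   # tweet-by-topic weights
topic_terms = nmf.components_       # topic-by-term weights

# Print the top terms per topic so a researcher can label them by hand
# (e.g. "bioweapon", "hoax", "5G").
terms = vectorizer.get_feature_names_out()
for k, row in enumerate(topic_terms):
    top = [terms[i] for i in row.argsort()[::-1][:6]]
    print(f"topic {k}: {', '.join(top)}")
```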

With this defined sample of conspiratorial tweets, the researchers could observe when and why people engaged with specific theories and how those theories gained and lost prominence over time. They found that 40% of conspiracy theorists started by exploring relatively innocuous or credible ideas, which the researchers call “gateway conspiracies.”

“This undercuts the whole ‘people who believe in conspiracies are crazy’ idea,” Rao says. For instance, someone might first latch on to the notion that COVID-19 deaths were being miscounted or misreported. “That’s entirely plausible, and people then move from these gateway conspiracies to other more extreme ones.” In the data, 27% of users began by spreading theories about false-positive COVID tests, compared with just 7% who started with theories about Bill Gates.
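The article reports these entry-point percentages without describing the computation, but the trajectory analysis they imply is easy to sketch. Assuming each tweet has already been labeled with a user ID, a timestamp, and one of the 13 topics (the column names and the “gateway” label set below are hypothetical), a short pandas pass can recover each user’s first topic and the share of users who entered through a gateway conspiracy:

```python
# Illustrative sketch, not the study's code: assumes a DataFrame of labeled
# conspiracy tweets with columns user_id, timestamp, and topic (one of the
# 13 topics). The "gateway" topic set here is a hypothetical label choice.
import pandas as pd

tweets = pd.DataFrame({
    "user_id":   ["a", "a", "b", "b", "c"],
    "timestamp": pd.to_datetime([
        "2020-02-01", "2020-03-15", "2020-02-10", "2020-04-01", "2020-05-05",
    ]),
    "topic": ["false_positive_tests", "bioweapon", "death_miscounts",
              "bill_gates", "bill_gates"],
})

GATEWAY_TOPICS = {"false_positive_tests", "death_miscounts"}  # assumed labels

# Each user's first conspiracy tweet, by timestamp.
first_tweets = (tweets.sort_values("timestamp")
                      .groupby("user_id", as_index=False)
                      .first())

# Share of users whose entry point was a relatively innocuous "gateway" topic
# (the study reports roughly 40% under its own 13-topic scheme).
gateway_share = first_tweets["topic"].isin(GATEWAY_TOPICS).mean()
print(f"users entering via a gateway topic: {gateway_share:.0%}")

# Distribution of first topics, analogous to the article's comparison of
# false-positive-test theories (27%) with Bill Gates theories (7%).
print(first_tweets["topic"].value_counts(normalize=True))
```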

Are Algorithms and Bots to Blame?

Once people began tweeting about COVID conspiracies, their movement toward more extreme ideas took two basic steps. First, individuals found related ideas and their proponents through searches or with help from Twitter’s algorithm. Then, people began tweeting and retweeting new conspiracy theories to gauge the public response. “It’s almost like testing the waters to see how people react to the theories they’re spreading,” Zhou says. And in a competitive market for attention, more extreme ideas tended to get more retweets.

A key takeaway from these findings is that Twitter’s algorithms aren’t entirely to blame for the spread of conspiracy theories on the platform. A substantial body of research contends that social media companies’ recommendation engines incrementally push people toward fringe or radical beliefs. This new research instead points to the importance of peer interaction: Twitter users who share conspiratorial ideas become part of a larger “conspiracy community” that reinforces and amplifies their beliefs.

“It’s really important to recognize this social dynamic,” Greve says. “Algorithms create the exposure and might direct people into echo chambers, but algorithms don’t on their own radicalize people. That happens in these little societies.”

The researchers also revealed the flexibility, and apparent illogic, with which people adopt and discard different conspiracies. When COVID case counts climbed, Twitter users who had said the virus was a hoax might quickly switch to claiming it was a bioweapon. That these assertions can’t both be true, Greve says, indicates that people don’t necessarily propagate conspiracy theories because they wholeheartedly believe in them. More likely, conspiracy theories serve as a tool for finding meaning in a complex world and defending against unpredictable threats. Assembling a patchwork of diverse and divergent ideas, the researchers write, functions as “a hedge against increasing uncertainty.”

This insight helped the researchers differentiate between people and bots, which also spread conspiracy theories on Twitter. Individuals deployed conspiracies defensively and adopted new theories in response to social validation or growing case counts. Bots were less responsive to the news and other Twitter users. They were also more aggressive in their attempts to generate moral outrage and more consistent in their claims that nefarious forces created COVID-19.

“Any study of online conspiracy theories must account for the integral role bots play in amplifying the discourse,” Vicinanza says. “To the untrained observer, bots might appear more human-like than real human accounts: They specialize in sowing discord and distrust.”

The study’s results suggest the challenge of creating policies that effectively tamp down the spread of conspiracy theories on platforms like Twitter. “We often think about targeting the most extreme theories and banning people who proliferate them,” Vicinanza says. “But we demonstrate that what appear to be the most benign ideas can, in fact, become the most carcinogenic, as they allow more extreme beliefs to anchor.”

On top of that, the shifting nature of conspiratorial beliefs makes them difficult to eradicate: For every disputed or debunked idea, several more may evolve to take its place. “The broader issue is that the conventional ways of persuading people, which we understand so well from marketing, may not be that effective when it comes to conspiracy theories,” Rao says.

***
This article has been archived for your research. The original version was published by Stanford Graduate School of Business.