The race to curb the spread of COVID vaccine disinformation

Researchers are applying strategies honed during the 2020 US presidential election to track anti-vax propaganda.

Anti-vaccine protesters voice their disapproval outside pharmaceutical giant Pfizer’s New York City headquarters. Credit: Matthew McDermott/Polaris/eyevine

In March, Twitter put its foot down: users who repeatedly spread false information about COVID-19 vaccines will have their accounts suspended or shut down. It was a new front in a high-stakes battle over misinformation that could help to determine how many people get vaccinated, and how swiftly the pandemic ends.

The battle is also being fought in computer-science and sociology labs across the United States, where scientists who track the spread of false information on social media honed their skills during the US presidential election last year. They are now shifting focus, from false claims that the election was ‘stolen’ to untruths about COVID-19 vaccines. Some surveys suggest that more than one-fifth of people in the United States are opposed to receiving a vaccine.

Researchers are launching projects to track and tag vaccine misinformation and disinformation on social media, as well as collecting massive amounts of data to understand the ways in which misinformation, political rhetoric and public policies all interact to influence vaccine uptake across the United States.

Scientists have identified a wide variety of disinformation surrounding COVID-19 and vaccines, ranging from conspiracy theories that the pandemic was engineered to control society or boost hospital profits, through to claims that the vaccines are risky and unnecessary.

One research consortium, dubbed the Virality Project, is expanding on strategies pioneered during the election to help inform how platforms such as Twitter and Facebook tackle vaccine disinformation. Created by researchers at multiple US institutions — including Stanford University in California, the University of Washington in Seattle and New York University — the team is working with public-health agencies and social-media companies to identify, track and report disinformation that violates their rules.

Election and vaccine focus

US disinformation researchers have focused on the election and COVID-19 vaccines because of the potential for significant public harm in these areas, says Renée DiResta, research manager at Stanford’s Internet Observatory.

Although social-media companies would prefer not to be the truth police, these are topics where the stakes are so high that they have to take action, she says. When it comes to misinformation online, she adds, the potential for harm has to be weighed carefully against the right to free speech.

Efforts to counter misinformation were scaled up during and after the election, culminating earlier this year with decisions by both Twitter and Facebook to kick former US president Donald Trump off their platforms. More recently, both companies have unveiled policies aimed at ending disinformation about COVID-19 vaccines.

In February, Facebook announced that it was expanding efforts to take down false claims on its main platform and on Instagram, which it also owns. Twitter followed in early March. Both companies declared that they would not only remove posts and tweets that perpetuate false information, but also shut down accounts that repeatedly violate their policies.

Impact of super-spreaders

These policies align with research showing that false information on the web is propagated mainly by a relatively small number of super-spreaders, often high-profile partisan media outlets, social-media influencers and political figures, such as Trump. Twitter went one step further by revealing its five-strike policy, which clarifies when repeat offenders will see their accounts suspended or permanently revoked.
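As a rough illustration of what that concentration finding means in practice, here is a minimal, hypothetical sketch in Python: given a log of shares of posts that fact-checkers have flagged as false, it measures what fraction of all sharing comes from a small number of accounts. The file name, column names and top-20 cut-off are assumptions made for this example; they do not describe any platform’s data or the researchers’ actual pipeline.

# Hypothetical sketch: how concentrated is the sharing of flagged posts?
# File name and column names (account_id, post_id) are illustrative only.
from collections import Counter
import csv

def share_concentration(path: str, top_n: int = 20) -> float:
    """Fraction of all logged shares made by the top_n most active accounts."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["account_id"]] += 1
    total = sum(counts.values())
    top = sum(n for _, n in counts.most_common(top_n))
    return top / total if total else 0.0

if __name__ == "__main__":
    # e.g. one row per share of a post flagged as false
    print(f"Top 20 accounts: {share_concentration('flagged_shares.csv'):.1%} of shares")

A high value from such a measure, applied to real data, would be consistent with the super-spreader pattern the researchers describe; the sketch is only meant to make the idea concrete.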

That clarity about penalties is a good thing, say Virality Project researchers. “If people think they can just keep violating the policies, they are not a good deterrent,” says Carly Miller, a research analyst at the Stanford Internet Observatory.

Efforts such as the Virality Project do seem to help. In a separate project focused on election integrity last year, the same team of researchers issued more than 600 notifications to social-media platforms regarding accounts that had violated their policies, both before and after the November election. Facebook, Instagram, Twitter, TikTok and YouTube took notice and labelled, blocked or removed up to 35% of the offenders, according to a summary report released last month.

Assessing the precise impact of Twitter and Facebook’s COVID-19 policies will be difficult, because researchers don’t have access to the internal data and decisions of social-media companies, says DiResta. Nor did the companies respond to Nature’s requests for comment.

Although the latest efforts by Twitter and Facebook should help to reduce disinformation, they won’t necessarily get at the larger social and political dynamics that drive disinformation and concerns over vaccination, argues Amir Bagherpour, a political scientist studying disinformation at the Federation of American Scientists, an advocacy group in Washington DC.

Information observatory

A desire to understand what people are thinking about COVID-19, and why, inspired the COVID States Project, a massive effort to track public opinion that was launched last March with a US$200,000 grant from the US National Science Foundation.

The project, co-led by David Lazer, a political scientist at Northeastern University in Boston, Massachusetts, has been conducting surveys of as many as 25,000 people per month across all 50 US states, as well as collecting information on Twitter use by nearly 1.6 million people.

In February, around 21% of survey respondents said that they would not get the vaccine; the figure was 24% among health-care workers (ref. 1). As with the broader population, Lazer says, level of education is a driving factor: 33% of health-care workers with only a high-school education said they would not get a vaccine, compared with just 11% of those with graduate degrees.

Already, the team is learning about what does and does not work when it comes to countering health misinformation. Its results suggest that doctors and scientists are the most trusted sources, whereas messages from overtly partisan political figures are less likely to be believed.

“I think it’s going to be primary-care providers who will be leading the battle against vaccine resistance,” says Lazer. “People listen to their doctors, and if their doctors say it’s OK, that will affect their choices.”
