Facebook fails to label 80% of posts promoting bioweapons conspiracy theory
As social media companies promise to crack down on Russian disinformation about the war in Ukraine, studies show they continue to fall short, allowing disproven narratives to reach millions.
Facebook failed to label 80% of articles on its platform promoting a fast-spreading conspiracy theory that the US is funding the use of bioweapons in Ukraine, according to a study released Friday by the Center for Countering Digital Hate (CCDH).
The nonprofit disinformation research group studied a sample of posts shared between 24 February and 14 March that linked to external articles containing baseless claims about bioweapons. It found that in 80% of cases Facebook failed to label the posts as “missing context”, containing “partly false information”, or outright “false information”.
“If our researchers can identify false information about Ukraine openly circulating on its platform, it is within Meta’s capability to do the same,” said CCDH chief executive Imran Ahmed. “But we found that in the vast majority of cases, conspiracy theories are given a free pass.”
The bioweapons theory began to spread in the early days of the war in Ukraine among fringe QAnon accounts, ultimately making its way to larger platforms such as Fox News. The White House has condemned the myth, saying it may have been manufactured by Moscow to justify a possible use of chemical weapons against Ukraine.
But it continues to spread across social media, including Facebook.
CCDH researchers used the social analytics tool NewsWhip to identify more than 120 articles from external sites that made false or misleading claims about bioweapons labs or misrepresented statements by US officials. Articles in the sample had received more than 150,000 likes, comments and shares on Facebook.
CCDH has called on Meta to more thoroughly enforce its “false information” and “partly false information” labels and expand the use of its existing “missing context” label.
“Russia’s propaganda campaigns have benefited for years from Meta turning a blind eye to disinformation,” Ahmed said. “Despite taking action against state channels under enormous pressure, Meta is failing badly to contain major disinformation narratives that benefit Putin’s regime.”
Previous studies published by CCDH found Facebook struggled to enforce its own rules on labeling state-sponsored news sources. And it is not alone: another study, from Media Matters for America, found YouTube had not only failed to remove thousands of videos about the biolabs theory but had also profited from them through monetized channels.
While some experts concede Facebook has stepped up its crackdown on state propaganda, others say disinformation will continue to spread on social media as long as it is baked into the platforms’ views-driven business model.
“At the end of the day, the algorithm will always prioritize misinformation because it is contentious, and Facebook makes more money when we are arguing with one another,” Ahmed said. “To reduce disinformation spread, we have to cut it off at the source.”
This article was originally published by The Guardian.