
Banning anti-vax groups on Facebook only drives them to Twitter, study suggests

Banning Facebook groups that posted anti-vaccination content led to an increase in such content on Twitter in the following month, according to a new study that may prompt a reassessment of policy measures aimed at tackling vaccine misinformation.

Scientists warn that anti-vaccine content and health-related misinformation on social media have been amplified by the Covid-19 pandemic.

They say the 150 largest anti-vaccine social media accounts have gained at least 7.8 million followers since 2019, growing by nearly a fifth over the period.

While a growing number of social media platforms are taking an active role in content moderation, scientists, including Tamar Mitts from Columbia University in New York, say little is known about the spillover effects of such online speech regulation across platforms.

In a new study, presented at WebSci ’22: 14th ACM Web Science Conference 2022, researchers analysed how removing groups that promoted anti-vaccine content on Facebook affected engagement with similar content on Twitter.

As part of the research, scientists followed 160 Facebook groups discussing Covid-19 vaccines and tracked their removal from the platform between April and September 2021.

Researchers also identified users who cited these Facebook groups on Twitter, and then examined their online behaviour over time.
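
As a rough illustration of how such cross-platform citations could be detected, the sketch below matches Facebook group links in tweet text against a tracked list. The group URLs, tweet data and matching logic are hypothetical; the study's actual pipeline is not described in this article.

```python
import re

# Hypothetical URLs for the tracked Facebook groups (illustrative only;
# the article does not disclose the study's actual group list).
TRACKED_GROUPS = {
    "facebook.com/groups/example-vaccine-group",
    "facebook.com/groups/another-example-group",
}

# Pull Facebook group links out of raw tweet text.
GROUP_LINK = re.compile(r"facebook\.com/groups/[\w.-]+", re.IGNORECASE)

def cites_tracked_group(tweet_text: str) -> bool:
    """True if the tweet links to any of the tracked groups."""
    return any(link.lower() in TRACKED_GROUPS
               for link in GROUP_LINK.findall(tweet_text))

# Flag the users whose tweets cite a tracked group.
tweets = [
    {"user": "alice", "text": "read this: facebook.com/groups/example-vaccine-group"},
    {"user": "bob", "text": "nothing to do with vaccines"},
]
citing_users = {t["user"] for t in tweets if cites_tracked_group(t["text"])}
print(citing_users)  # {'alice'}
```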

The study found that users citing removed Facebook groups promoted more anti-vaccine content on Twitter in the month following the removals.

One of the study's findings, the researchers say, is that users exposed to the removal of anti-vaccination groups from Facebook became more committed to those ideas, increasingly posting anti-vax content in their Twitter activity.

Based on the research, scientists say a cross-platform approach is needed for content moderation to work.

Compared with Twitter accounts citing Facebook groups that were not removed, scientists say users citing the removed groups used 10-33 per cent more anti-vaccine keywords on the microblogging platform.
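
In rough terms, that comparison boils down to measuring each cohort's rate of anti-vaccine keyword use and taking the relative difference. A minimal sketch follows; the keyword list, tweets and cohort labels are all invented for the example, and the toy numbers do not reproduce the study's 10-33 per cent figure.

```python
# Toy illustration of the relative-difference calculation behind a figure
# like "10-33 per cent more anti-vaccine keywords". All data is made up.
ANTI_VAX_KEYWORDS = {"plandemic", "vaccineinjury", "bigpharma"}

def keyword_rate(tweets: list[str]) -> float:
    """Share of words in a cohort's tweets that are anti-vax keywords."""
    words = [w.lower().strip("#") for t in tweets for w in t.split()]
    hits = sum(w in ANTI_VAX_KEYWORDS for w in words)
    return hits / len(words) if words else 0.0

treated = ["#plandemic wake up", "bigpharma hides vaccineinjury data"]   # cited removed groups
control = ["got my booster today", "bigpharma pricing is a real issue"]  # cited surviving groups

t_rate, c_rate = keyword_rate(treated), keyword_rate(control)
print(f"treated: {t_rate:.2f}, control: {c_rate:.2f}, "
      f"relative increase: {(t_rate - c_rate) / c_rate:+.0%}")
```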

“Our results suggest that taking down anti-vaccine content on one platform can result in increased production of similar content on other platforms, raising questions about the overall effectiveness of these measures,” scientists wrote in the study.

Scientists say there is also a need to develop a better model to estimate the likelihood that an idea expressed on one social media platform originated on another.
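
One naive way to approach such a model, purely as an illustration and not what the authors propose, is to score a post on one platform against earlier posts on another and treat high textual similarity as evidence of cross-platform origin. The sketch below uses simple bag-of-words cosine similarity on invented posts.

```python
from collections import Counter
from math import sqrt

def bow(text: str) -> Counter:
    """Bag-of-words vector for a short post."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(count * b[word] for word, count in a.items())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Score a tweet against earlier Facebook posts: a high maximum similarity
# suggests the idea may have originated on the other platform.
facebook_posts = ["vaccines cause harm they say", "the weather was lovely today"]
tweet = "they say vaccines cause harm"
score = max(cosine(bow(tweet), bow(post)) for post in facebook_posts)
print(f"max similarity to earlier Facebook posts: {score:.2f}")
```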

***
This article has been archived for research purposes. The original version was published by The Independent.