Facebook broadens measures against QAnon, will remove groups
The company said Tuesday that it will remove Facebook pages, groups and Instagram accounts for “representing QAnon” – even if they don’t promote violence. Facebook did not immediately explain what it means for a page, group or account to “represent” QAnon.
Less than two months ago, Facebook said it would stop promoting QAnon and its adherents, though enforcement proved spotty. At the time, it said it would remove QAnon groups only if they promoted violence. That is no longer the case.
The company said it is starting to enforce the policy as of Tuesday but cautioned that it “will take time and will continue in the coming days and weeks.”
The QAnon phenomenon has sprawled across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos in recent years. QAnon has been linked to real-world violence, including criminal cases of kidnapping, as well as to dangerous claims that the coronavirus is a hoax.
But the conspiracy theory has also seeped into mainstream politics. Several Republicans running for Congress this year are QAnon-friendly.
By the time Facebook and other social media companies began enforcing their policies against QAnon, however limited those policies were, critics said it was largely too late. Reddit, which began banning QAnon groups in 2018, was well ahead of them, and to date it has largely avoided a notable QAnon presence on its platform.
Twitter did not immediately respond to a message for comment on Tuesday.
Copyright © 2020 by The Associated Press. All Rights Reserved.