Facebook bans QAnon across its platforms
Facebook said Tuesday that it is banning all QAnon accounts from its platforms, a significant escalation over its previous action and one of the broadest rules the social media giant has put in place in its history.
Facebook said the change updates a policy it created in August, which initially removed only QAnon accounts that discussed violence and resulted in the termination of 1,500 pages, groups and profiles.
A company spokesperson said the enforcement, which started Tuesday, will “bring to parity what we’ve been doing on other pieces of policy with regard to militarized social movements,” such as militia and terror groups that repeatedly call for violence.
“Starting today, we will remove Facebook Pages, Groups and Instagram accounts for representing QAnon. We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks,” Facebook wrote in a press release. “Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”
The spokesperson said the company believed it needed to limit the “ability of QAnon and Militarized Social Movements to operate and organize on our platform.”
QAnon is a conspiracy theory that grew out of the fringes of the internet and posits that high-profile Democrats and Hollywood celebrities are members of a child-eating cabal that is being secretly taken down by President Donald Trump, and that members of this fictitious cabal will soon be marched to their execution. The conspiracy theory relies on posts from Q, an anonymous user of the extremist message board 8kun, which was formerly called 8chan, who has been wrongly predicting the roundup of prominent Democrats since October 2017.
The Facebook spokesperson said the company is “not going after individual posts,” but whole accounts that spread the conspiracy theory, which has been tied to acts of violence.
QAnon accounts have become centralized hubs for coordinated disinformation campaigns in the last several weeks. Before last Tuesday’s debate even began, QAnon accounts pushed the conspiracy theory that former Vice President Joe Biden was secretly wearing an earpiece.
In the last week, the QAnon community has pushed the conspiracy theory that Trump is not sick with the coronavirus but is instead carrying out secret missions in a fictitious war predicted by QAnon followers.
QAnon accounts are also frequent spreaders of coronavirus disinformation, as many followers believe the virus does not exist or is not as deadly as scientists say.
“We have to think about the QAnon networks as the rails upon which misinformation is driven,” said Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School. “Every account, event and page are tracks where disinformation can be spread, so it is imperative that Facebook dismantle their infrastructure. Without Facebook, they are not rendered inert, but it will make it more difficult to quickly spread disinformation.”
“Of course, this all could have been done sooner, before Q factions aligned with militia groups and anti-vaxxers, to curtail the spread of medical misinformation and the mobilization of vigilante groups,” Donovan said.
With the new and complete ban, Facebook faces new hurdles to identifying accounts and enforcement. Reacting to the partial ban in August, QAnon groups and followers shifted tactics to evade moderation, dropping explicit references to Q, and “camouflaging” QAnon content under hashtags ostensibly about protecting children.