Facebook imposes major new restrictions on QAnon, stepping up enforcement against the conspiracy theory
“QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement,” the company said in its blog post.
The ban encompasses all Facebook pages and groups devoted to QAnon, as well as Instagram accounts whose names represent the conspiracy theory. It does not reach individual Facebook profiles or posts, meaning discussion of QAnon will not be forbidden outright on the platform.
This action comes after more than two years of mounting evidence that the QAnon conspiracy theory is rife with violent, hateful themes that regularly violate policies across Silicon Valley and have inspired numerous real-world crimes.
“Ultimately the real test will be whether Facebook actually takes measures to enforce these new policies — we’ve seen in a myriad of other contexts, including with respect to right-wing militias like the Boogaloos, that Facebook has repeatedly failed to consistently enforce its existing policies,” said Sen. Mark R. Warner (D-Va.), who has been pushing Facebook for more action against QAnon.
At the core of QAnon, which began in October 2017, are baseless allegations that Democratic officials and Hollywood celebrities engaged in unconscionable crimes, including raping and eating children, while seeking to subvert the Constitution. President Trump, the conspiracy theory holds, is quietly battling these evils.
The “Q” of QAnon is supposedly a high-level government official privy to these secrets because of a top-secret security clearance. The shadowy figure speaks only on the site 8kun, a successor to the now-closed 8chan, but the information for years spread almost instantly across mainstream social media platforms, powered by those analyzing Q’s pronouncements.
The conspiracy theory has grown particularly popular on the political right, with more than 70 Republican candidates for office embracing at least some elements of QAnon this year, according to tracking by liberal research group Media Matters. One adherent, Marjorie Taylor Greene, is virtually guaranteed to win a seat in Congress in November’s election.
QAnon this year has played a key role in spreading disinformation related to Covid-19 and the vaccines that might help remedy it, as the conspiracy theory has expanded to take on new themes, such as the supposed dangers of 5G cellular technology.
Facebook moved quickly on Tuesday to scrub QAnon content that had been widespread on the platform. A page with 130,000 followers, called “Q Pin” and devoted to “all things Q,” remained active on the platform six days after The Washington Post had raised questions about violent language appearing on its posts. It was removed within an hour of the announcement of the new policy.
Three times last month the page shared an “Army for Trump” website seeking to recruit volunteers to stand watch at the polls, among other responsibilities. On one post, a user commented to call Democrats “dead ducks in the water.” Another user falsely suggested Democrats had promised “Riots and Murders” surrounding the election and asked how Republicans would respond.
The language illustrates how difficult it was for the platform to police only QAnon content trafficking in violent rhetoric or imagery — Facebook’s previous standard but a fine line for a movement that envisions the mass arrest of Democrats and celebrities.
Facebook’s action drew praise on Tuesday from those who had been calling on the platforms to do more to combat QAnon.
“I’m pleased to see Facebook taking this action. And I hope we see other social media platforms follow their lead swiftly. But the growth of QAnon online is not just the fault of the social media platforms. We need political leadership — leadership from all elected officials, and from all sides of the political spectrum — to denounce QAnon and similar groups,” said Daniel J. Jones, a former FBI analyst and Senate investigator who led the review of the CIA’s torture program and is now president of Advance Democracy.
The group found similar problems on Twitter following its enforcement action in July, which led to QAnon content dropping by roughly half while large amounts remained on the platform.