QAnon groups have millions of members on Facebook, documents show
An internal investigation by Facebook has uncovered thousands of groups and pages, with millions of members and followers, that support the QAnon conspiracy theory, according to internal company documents reviewed by NBC News.
The investigation’s preliminary results, which were provided to NBC News by a Facebook employee, shed new light on the scope of QAnon activity and content on Facebook, a scale previously undisclosed by Facebook and unreported by the news media because most of the groups are private.
The top 10 groups identified in the investigation collectively contain more than 1 million members, with totals from more top groups and pages pushing the number of members and followers past 3 million. It is not clear how much overlap there is among the groups.
The investigation will likely inform what, if any, action Facebook decides to take against its QAnon community, according to the documents and two current Facebook employees who spoke on the condition of anonymity because they were not authorized to speak publicly on the matter. The company is considering an option similar to its handling of anti-vaccination content, which is to reject advertising and exclude QAnon groups and pages from search results and recommendations, an action that would reduce the community’s visibility.
An announcement about Facebook’s ultimate decision is also expected to target members of “militias and other violent social movements,” according to the documents and Facebook employees.
Facebook has been key to QAnon’s growth, in large part due to the platform’s Groups feature, which has also seen a significant uptick in use since the social network began emphasizing it in 2017.
There are tens of millions of active groups, a Facebook spokesperson told NBC News in 2019, a number that has probably grown since the company began serving up group posts in users’ main feeds. While most groups are dedicated to innocuous content, extremists, from QAnon conspiracy theorists to anti-vaccination activists, have also used the Groups feature to grow their audiences and spread misinformation. Facebook aided that growth with its recommendations feature, powered by a secret algorithm that suggests groups to users, seemingly based on interests and existing group memberships.
Facebook has been studying the QAnon movement since at least June. In July, a Facebook spokesperson told NBC News that the company was investigating QAnon as part of a larger look at groups with potential ties to violence.
A small team working across several of Facebook’s departments found 185 ads that the company had accepted “praising, supporting, or representing” QAnon, according to an internal post shared among more than 400 employees. The ads generated about $12,000 for Facebook and 4 million impressions in the last 30 days.
Some of the most recent ads included one for a “QAnon March for Children” in Detroit, and several retailers selling QAnon merchandise, according to Facebook’s searchable ad library. Many now-inactive QAnon ads also ran recently on Instagram, which is owned by Facebook. Most of the Instagram accounts that ran those ads were abandoned or removed, according to the ad library.
A Facebook spokesperson said the company has routinely enforced its rules on QAnon groups.
“Enforcing against QAnon on Facebook is not new: we consistently take action against accounts, Groups, and Pages tied to QAnon that break our rules. Just last week, we removed a large Group with QAnon affiliations for violating our content policies, and removed a network of accounts for violating our policies against coordinated inauthentic behavior,” the spokesperson, who asked not to be named for fear of harassment from the QAnon community, wrote in an emailed statement. “We have teams assessing our policies against QAnon and are currently exploring additional actions we can take.”
The potential crackdown follows a campaign by mainstream advertisers as well as lawmakers to curtail misinformation and hate speech on Facebook. Along with a significant summer advertising boycott, 20 state attorneys general and the congressional Democratic Women’s Caucus wrote separate letters last week urging Facebook to enforce its policies and clean up its platform.
Some members of Facebook’s cross-departmental team tasked with tracking QAnon for the internal investigation say they are concerned the company will decline to ban QAnon groups outright, opting for weaker enforcement actions, according to one current employee. Those employees have shared concerns with one another that QAnon could influence the 2020 election, the employee added, noting that the pages and groups most likely violate Facebook’s existing policies against misinformation and extremism.
Facebook and other platforms face a unique challenge in moderating QAnon communities, said Joan Donovan, director of the Shorenstein Center on Media, Politics and Public Policy at Harvard’s Kennedy School. The platforms act both as the “base infrastructure” for networking and spreading content and as a target of the conspiracy theory itself, which frames Facebook and other platforms as “oppressive regimes that seek to destroy truth,” Donovan said.
“Facebook is definitely the largest piece of the QAnon infrastructure,” Donovan said. “While people who have bought into these disinformation campaigns are already affected, preventing it from spreading to new groups and new audiences is one intervention, among many, that are needed. Unless there is some kind of coordination between platform companies to get rid of the main QAnon influencers, it will continuously pop back up.”
Facebook’s anticipated move follows Twitter’s more aggressive action against QAnon. In July, Twitter announced it had banned 7,000 QAnon accounts for breaking its rules around platform manipulation, misinformation and harassment. Twitter also said it would no longer recommend QAnon accounts and content, would stop such content from appearing in trends and search, and would block URLs associated with QAnon from being shared.
QAnon is a right-wing conspiracy theory that originally formed around the idea that President Donald Trump is leading a secret war against the “deep state,” a group of political, business and Hollywood elites who, according to the theory, worship Satan and abuse and murder children. These baseless claims emerged from posts by an anonymous user who goes by “Q” on a fringe internet forum.
QAnon grew out of the “pizzagate” conspiracy theory, which claimed that Hillary Clinton ran a pedophilia ring from a Washington pizza shop. Many of the most popular QAnon groups are also pizzagate groups, according to the leaked documents.
Both pizzagate and QAnon have been implicated in real-world violence, including armed standoffs, attempted kidnappings, harassment campaigns, a shooting and at least two murders — events noted by Facebook as part of its investigation, according to the documents. In 2019, the FBI designated QAnon as a potential domestic terrorist threat.
While QAnon is a product of the internet, born on fringe forums and spread through social media, the conspiracy theory has become politically mainstream in recent months. “Q” signs and merchandise were first spotted at Trump campaign rallies in 2018. More than 70 congressional candidates have endorsed some part of the QAnon ideology in 2020, according to the liberal watchdog Media Matters.
In 2019, Facebook took action against anti-vaccination pages and content, hoping to reduce the visibility of misinformation by strangling its reach, but it stopped short of a total ban. Despite that action, the largest anti-vaccination pages and groups have continued to grow in the last year, according to data from CrowdTangle, Facebook’s social media analysis tool.
Facebook has taken down QAnon accounts before, but previous removals have been based on behavior rather than content that violated policy. Last week, Facebook removed a QAnon group with nearly 200,000 members “for repeatedly posting content that violated our policies,” according to a Facebook spokesperson. In May, Facebook purged a small section of the U.S. QAnon community that included five pages, six groups and 20 profiles, citing “coordinated inauthentic behavior,” whereby accounts work together to push content and obscure their own networks.
Last week, Facebook removed 35 Facebook accounts, three pages and 88 Instagram accounts that operated from Romania and pushed pro-Trump messages, including the promotion of QAnon.