Why Facebook needs to be more paranoid about QAnon
In May, Facebook casually invited me to join a conspiracy cult that believes the world is controlled by a Satan-worshipping, baby-eating, deep-state coterie and can only be saved by US president Donald Trump.
“Join groups to connect with people who share your interests,” the social media network implored in a recommendation email. Below was a suggestion that I become part of a 135,000-strong Facebook group called “QAnon News & Updates — Intel drops, breadcrumbs, & the war against the Cabal”.
QAnon is an outlandish far-right conspiracy theory; in essence, an anonymous figure known as “Q” drip-feeds believers “classified” information about Trump’s fight against a diabolical collective of Democrats and business elites. As QAnon has ballooned, it has taken on menacing undertones: followers, calling themselves “digital soldiers”, are encouraged to take an oath to “defend” the US constitution. Last year, the FBI labelled fringe political conspiracy theories, QAnon included, a domestic terrorism threat.
But in 2020 it has metastasised from the fringes of internet culture into a mainstream phenomenon — Trump himself has publicly praised the group for its support — and has become a source of consternation for observers of the presidential election, now less than a month away. That is a problem for Facebook and for the US.
What is particularly jarring is that this is history repeating itself: once again, short-sightedness from Silicon Valley has allowed extremist thinking to flourish.
In 2018, former YouTube staffer Guillaume Chaslot criticised the video site’s recommendations algorithm for pushing some users down a conspiracy-theory rabbit hole. Google-owned YouTube’s recommendations generate 70 per cent of views on the platform and have been crafted to keep you engaged for as long as possible, giving the company more opportunity to serve advertising. In practice, Chaslot argued, this can mean repeatedly showing you similar content and deepening whatever biases you already hold. Such blind spots are built into the business model. The company promised in 2019 to do more to downrank the biggest conspiracy theories, though critics say it has yet to solve the problem convincingly.
So what had warranted Facebook’s QAnon advances towards me? The email was linked to my work Facebook page, which I use to monitor posts and live streams from Mark Zuckerberg and other Facebook executives. According to my search history, I had looked up the phrase “QAnon” several days earlier, likely triggering its recommendations algorithm.
Facebook’s algorithms appear no less toxic and stubborn by design today than YouTube’s were back then. Permitting such dangerous theories to circulate is one thing; actively contributing to their proliferation is quite another.
Internal Facebook research in 2016 found that 64 per cent of new members of extremist groups had joined due to its recommendation tools. Its QAnon community grew to more than four million followers and members by August, up 34 per cent from around three million in June, according to the Guardian.
Facebook has since made moves to clamp down on QAnon, removing pages from its recommendations algorithms, banning advertising and downranking content in a bid to “restrict their ability to organise on our platform”.
Still, it is alarming that Facebook acted only three years after the theory was born, particularly since Zuckerberg has announced a shift away from an open, friends-focused social network towards hosting more walled-off, private, interest-based groups.
There is no denying such groups pose unique challenges. Flagging and taking down foreign terrorist groups such as Isis is a fairly unambiguous exercise. But how does one rank conspiracy theories? Can an algorithm assess where collective paranoia ends and a more violent conspiracy theory begins — and what is the appropriate response if it can?
The irony is that companies like Facebook pride themselves on innovating and delivering the future. Yet they seem unable to escape their past, and that failure is dangerously shaping our present.
With its deep pockets, Facebook should have the expertise to monitor its public and private groups and its recommendations algorithms more fiercely, and to set a lower bar for downranking questionable conspiracy content. Perhaps tech companies themselves need to be paranoid about the unintended consequences of their business models. Otherwise, in elections to come, we will see history repeat itself.
Hannah Murphy is an FT technology correspondent