How can Big Tech best tackle conspiracy theories?
During the 2016 US election campaign, a conspiracy theory known as #Pizzagate went viral on rightwing websites.
The theory, which claimed that Democratic presidential candidate Hillary Clinton was involved in a child sex ring run from a Washington pizza restaurant, seemed laughable, until a #Pizzagate believer used an assault rifle to fire shots inside the restaurant.
Thankfully, nobody was hurt. But the episode raised two questions that remain pertinent four years on amid continued political polarisation: why do conspiracy theories proliferate? And is there a way to counter them effectively?
America’s tech giants have done extensive research into this subject, largely based on big data analysis and supplemented with insights from psychologists. But last year, a team of researchers at Jigsaw, a Google arm, joined forces with ethnographers at the consulting company ReD to take a different approach: face-to-face qualitative research that explored the attitudes of 42 conspiracy theorists in the UK and US in relation to ideas that ranged from the seemingly benign (say, that the earth is flat) to the dangerous (the white genocide theory) and, most recently, to pandemics.
Some of this material remains under wraps. But the Jigsaw and ReD team recently presented broad findings to a group known as Ethnographic Praxis in Context — and they are thought-provoking.
The key issue is how you approach conspiracy theorists. As Joseph Uscinski, a political scientist at the University of Miami, has pointed out, it is not clear that conspiracy theorists are more prevalent today than they were in earlier eras. “It’s a continuity,” he says.
But what makes our modern age striking is how quickly conspiracy theories can spread on the internet, and get picked up by mainstream media (and some politicians). Tech groups try to stop this with tactics that YouTube executives call the “four Rs”: removing dangerously misleading content; relegating such material in search results; raising better offerings in search rankings; and rewarding groups that debunk conspiracies. (The remarkable metabunk.org, created by science writer Mick West, is an example of the last.)
This “four Rs” approach suggests that the issue is separating dangerous conspiracies from less dangerous ones. But according to the Jigsaw and ReD research, this may not be sufficient.
When the ethnographers tracked conspiracy theorists, they realised that what mattered most was not whether theories were dangerous, but the degree to which people did (or did not) have an overwhelming attachment to them. “It is more important to distinguish between types of theorists rather than types of conspiracy theories,” their paper explained.
People who were deeply in the grip of a conspiracy mentality were as likely to believe benign conspiracies as dangerous ones — there is “no such thing as an innocuous conspiracy per se”, the researchers noted. But some people could be persuaded out of their ideas — so that even the dangerous theories they held became less threatening.
The team therefore suggested the need for a multi-tier strategy. Those deeply gripped by conspiracies will not accept logical counterarguments, but they may respond to emotional cues that are presented with empathy and respect (as West has also argued). Meanwhile, wavering theorists can sometimes be influenced by “upstream” interventions (say by elevating material that rebuffs conspiracy theories in search engines).
Either way, what causes someone to become hooked on a conspiracy theory is not just a result of individual psychological issues (although these play a part) but of social affiliations too. In Montana, the team studied “Jennifer”, who embraced conspiracy theories because these defined her friendship group.
Anyone hoping to debunk these ideas also needs to think hard about cultural signals. Take website design. Twenty-first century professionals typically give more credibility to information that comes from sites that look polished.
Conversely, the ethnographers discovered that conspiracy theorists are more likely to believe information that comes from scruffier, amateurish sites, since these seem more “authentic”. This point may not be obvious to techies at places such as Google — and is not the type of insight that big data analysis will reveal. But it is crucial.
Can such insights be harnessed by tech companies to avoid future #Pizzagates? There have been small successes: the research describes how one user, “Lois” in San Diego, backed away from a conspiracy theory about chemtrails (the claim that aircraft vapour trails contain chemicals deliberately sprayed on the public) after Google elevated alternative material to the top of the search engine.
It will not be easy to scale these up, however, or to counter the speed at which new conspiracy theories keep morphing. (The debunked #Pizzagate theory recently resurfaced unexpectedly on social media platforms — this time dragging in the singer Justin Bieber.)
This is particularly alarming when you consider that Uscinski’s research suggests that 51 per cent of Americans now partly believe at least one of the major conspiracy theories floating around; and that, with Covid-19 and its putative vaccine, we are witnessing a potent new source of conspiracy ideas — irrespective of the US election.
Follow Gillian on Twitter @gilliantett and email her at gillian.tett@ft.com