From QAnon to anti-vax: Can you cure a conspiracy theorist?

Jitarth Jadeja first fell down the rabbit hole when Donald Trump won the presidential election on 9 November 2016. As an Australian who had previously lived in the US, he had developed an avid interest in American politics through Reddit. He was a Bernie Sanders supporter and the election result left him disillusioned, confused and exasperated. He had recently been diagnosed with attention deficit hyperactivity disorder (ADHD), and as a university student found himself with plenty of spare time. He turned to the internet to make sense of the world around him.

He soon found Infowars – a far-right conspiracy theory website, which hosts professionally produced video interviews on YouTube with high-profile figures, including Trump himself. “I was socially isolated and had a chaotic mental state,” Jadeja tells Spotlight. “[Watching videos] was almost like an addiction.”

YouTube soon led him to a darker place. The algorithm recommended an interview about QAnon. Considered more of a cult than a political movement, QAnon centres on the idea that a secret “cabal” of liberal elites, who worship the devil and run a global paedophile ring, conspired against Trump during his term in office. Jadeja says with incredulity that its beliefs extend to a “judgement day” of public executions, followed by Trump unleashing a technology-fuelled “nirvana” on the world. “Because it was on Infowars, it gave it this air of legitimacy,” he says. He swiftly moved to fringe social media such as 4chan and Voat, where conspiracies thrive.

Jadeja became miserable and lonely. He had been manipulated into believing that everyone else was “asleep” while he was “awake”. But his world started to unravel in 2019 when he read an article by the journalist Mike Rothschild exposing holes in QAnon theories. Over six months, Jadeja crawled his way out of the rabbit hole the same way he’d burrowed his way in – through content consumption. In June 2019, he left conspiracies behind. He had also recently started medication for bipolar disorder, a life event that he feels provided him with mental clarity.

He is now a vocal anti-conspiracy advocate and shares his experiences to warn people of QAnon’s dangers. He says with remorse that he introduced his father to the cult, and now cannot convince him to leave. “I feel really guilty,” he says. “For me, hearing any conspiracies is triggering. I really try not to think of anyone as having bad intent now, or of being a bad person.”

Jadeja is one of millions who have fallen victim to social media’s algorithms. What started as one video cascaded into far-right content in the deepest enclaves of the internet. He likens the internet to “junk food”: “People binge and binge, and at first it’s tasty, but in the end, you feel like shit,” he says. “This [feature] of just offering people more of what they want is a real problem.”

Where misinformation comes from

Anti-vax protesters at the World Wide Rally For Freedom protest on 20 March 2021 in London. Image courtesy of Hollie Adams / Stringer / Getty Images

Significant world events often lead to a spike in conspiracy theories. Like the presidential election, Covid-19 brought misinformation (spreading false information in the belief that it is true) and disinformation (intentionally spreading false information to deceive others) back into public consciousness, from theories about 5G towers to claims that vaccines cause infertility. But these are not a creation of the internet age, says Daniel Jolley, a social psychologist and assistant professor at the University of Nottingham. We need only look at past events such as the John F Kennedy assassination to know that rumours can thrive offline too.

The most significant change is the speed and ease with which they can spread. “Conspiracy content can now be shared within seconds,” says Jolley. “Back in the era of radio and print, journalists were the gatekeepers and it would be down to an editor to publish a reader letter. Now, anyone can make an account on Twitter and start pumping out a range of ideas.”

Social media has democratised knowledge; we no longer formulate our opinions based on official news alone. “Now everyone can be a publisher,” says Steve Nowottny, editor of the fact-checking organisation Full Fact. “Influencers can publish to very large audiences without the usual editorial checks and balances.”

In a sense, this has been liberating, increasing our access to accurate as well as false information. But the internet’s vastness means that our consumption can still be quite limited. Rather than getting a balanced view from opposing sources, people become trapped within echo chambers. “The internet is a big place and it’s very easy to get lost in it,” says Nowottny. “It’s possible to find misinformation that supports other misinformation, and spiral from there.” Jadeja thinks that information overload has polarised people further and taken away the collective feeling that used to come from ubiquitous news sources. “There is no sense of community or shared reality anymore,” he says. “Ironically, by connecting with so many people, we’ve somehow never had less in common.”

Conspiracies in times of crisis

Protesters at a Unite For Freedom anti-lockdown protest in London, 24 April 2021. Image courtesy of Jessica Girvan / Shutterstock

Distressing national or global events can cause people to try to fill information gaps where there are currently no answers, says Nowottny. Facts are also commonly manipulated to fit conspiratorial agendas. Last year, the UK Health Security Agency released data showing higher rates of Covid-19 among vaccinated people, a pattern driven by many factors, including individual behaviour. This morphed into the claim that the vaccine was ineffective, which was amplified by the Brazilian president Jair Bolsonaro and subsequently spread to 13 countries in 11 languages.

The impact of influential figures legitimising false claims can be frightening. The US Capitol Building riots on 6 January 2021 were fuelled by Trump tweeting allegations of voter fraud and resulted in several deaths. Boris Johnson’s outburst linking Keir Starmer to a failure to prosecute Jimmy Savile resulted in the Labour leader receiving death threats and being mobbed by anti-vax protesters.

“Misinformation needs to be challenged in public forums,” says Nowottny. “It’s really important that the media, politicians and public figures are saying stuff that is true and are willing to correct themselves if they’ve said something that isn’t.” The spread of misinformation also has insidious longer-term effects, including increased animosity, hatred and prejudice towards minority groups, and higher levels of societal polarisation.

What makes someone susceptible?

Conspiracy belief has been linked to distrust of power and authority. For example, people with anti-vax views are more inclined to believe in conspiracies, be sensitive to infringements on personal freedom and support individualistic world views, according to an analysis of more than 5,000 people in 24 countries.

As such, believers of one conspiracy, such as around vaccines, might be inclined to believe another, such as climate change denial. Common threads include threats to personal and global health, the need to adhere to government policy and the need to cooperate with and trust science.

However, any of us could potentially find conspiracy theories attractive, says Jolley. In times of turbulence, being able to “blame” powerful forces can help people feel temporarily empowered, although this is short-lived – the feeling that someone is “out to get you” can ultimately lead to greater feelings of mistrust and powerlessness, he says.

Anyone might also use conspiracies to satisfy psychological needs that are not being met, says Karen Douglas, professor of social psychology at the University of Kent. These could be: “epistemic”, the need to know the truth and have certainty; “existential”, the need to feel safe and have control; and “social”, the need to maintain self-esteem and feel positive about the groups we belong to. This might explain why conspiracies were so prolific during the pandemic. “People were scared and looking for ways to cope with uncertainty, insecurity and loss of social contact,” says Douglas.

A feeling of disenfranchisement or societal exclusion can also be a strong propellant for seeking support elsewhere. In 2014, Caleb Cain, who was 21 at the time, found himself spiralling into alternative right-wing content on YouTube. Growing up in West Virginia, he had liberal, left-wing views as a teenager but came from a “poor, rural” background, often had “clashes with authority” and had an unstable relationship with his family, he tells Spotlight. After dropping out of college, Cain became depressed and isolated from friends, and the internet became his pastime.

What started as self-help videos soon became anti-feminist, Islamophobic and racist theory. Having always been interested in counterculture, he says the anti-establishment mentality of the alt-right appealed to him and “brainwashed” him for five years. “I had a traditionalist view of the world around the decline of the West, and started picking out enemies of that,” he says. “There is a big victim complex to it.”

It was only when he watched a fierce debate between political commentator Steven Bonnell and an alt-right activist that he began to doubt his views. “This made me curious and was a gateway to other material,” he says. “The same way I went into the alt-right, was the same process that I left it.” When the Christchurch mosque shootings happened in 2019, he started a YouTube channel called Faraday Speaks to help people who have been radicalised, and he now also does academic research in this area. “I realised this wasn’t an isolated thing,” he says. “I was influencing the world – I had tried to convince my friends and family [of my beliefs]. I was a little crumb of a poisonous cookie.”

The free speech conundrum

A QAnon supporter holds a sign up outside the White House in Washington. Image courtesy of Orlowski Designs LLC / Shutterstock

Research suggests that trauma, whether personal or collective, can attract people towards conspiracy. Simon*, an NHS health analyst, says that when his brother died in 2019, his sister-in-law’s behaviour started changing. The first Covid-19 lockdown then “turbocharged her descent into libertarian-style thinking” where she opposed restrictions and started consuming conspiracy content online. She has since abandoned her family to join a cult in the US called 5D Full Disclosure. Simon believes that the ease with which unverified, one-sided information can spread is dangerous, and worries that broadcasters such as Fox News, and now GB News, exacerbate cultural division. “We have been devastated,” says Simon. “It has torn our family apart. I think [the fact that] the UK is following the US is an existential threat.”

Full Fact’s Nowottny argues that free speech is as fundamental to democracy as the right to balanced, substantiated information. He is wary of the indiscriminate removal of content and thinks the government should place as much emphasis on promoting accuracy as it does on censoring false information. “The Online Safety Bill could do more to mandate the filling of information vacuums or gaps,” he says. “The absence of information can be as harmful as misinformation itself.” Recent platform features such as Twitter’s prompt to read an article before sharing are helpful additions, he says, as they provide “friction”, encouraging people to slow down and re-evaluate.

While Jadeja thinks that algorithms need to be drastically altered, he does not believe that conspiratorial content should be banned outright. “You can deplatform a person but you cannot deplatform an idea,” he says. “These people don’t just disappear. They move to other more unofficial Telegram or Signal groups where it’s harder to keep track of them.”

Teaching emotional resilience

There is an argument that there should be better education in evaluating online sources. Nowottny says there are core principles that can guide people: think before you share; pause if something gives you a strong emotional reaction; and if something sounds too good to be true, it probably is.

But this goes beyond digital literacy. Holistic education around societal tensions – such as explaining why racial divisions exist or why crime rates are high – would help people develop a more balanced world view, says Cain. Perhaps controversially, he also believes that conspiracy videos should be shown in schools, followed by a discussion of why they are wrong.

Given such strong links to psychological vulnerability, therapy should also be built into prevention and rehabilitation, he says. Cain himself now sees a therapist and thinks that teaching young people about emotional resilience, empathy, narcissism and sociopathy would help them become less reactive and stop them getting drawn in.

A more understanding approach is key to getting through, he says: “Liberal friends would just call me a racist – that didn’t work. It just pushed me further away.” Jadeja agrees that ostracising indoctrinated individuals does not encourage them to reintegrate. Discussing their behaviour – such as why they are isolating themselves from friends – can be more effective than chastising their beliefs. “We need to offer a path back into society for these people,” he says. “They need an incentive – they shouldn’t just be maligned, ignored and made fun of.”

Psychologist Jolley says that exhibiting empathy, through asking about the reasons behind someone’s beliefs and whether they feel anxious about something, can be more productive. “Having a humanised conversation [can be more effective] than instantly debunking what they’ve said,” he says.

A psychological “vaccine”

Embedded content courtesy of the University of Cambridge

While there is not much in the way of official conspiracy deradicalisation programmes, there are experimental projects. Sander van der Linden, professor of social psychology at the University of Cambridge, has co-created a novel approach to tackling misinformation, which involves treating it as a cognitive “virus”. He uses epidemiological models to assess the rate at which information “pathogens” spread online and their “infection” rate based on shares and size of online network. Research has found that falsehoods spread at six times the rate of facts on Twitter.
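
The epidemiological framing lends itself to a worked illustration. Below is a minimal, hypothetical sketch of an SIR-style (“susceptible, infected, recovered”) model of a false claim spreading through a population of users; the function name, parameters and numbers are assumptions for illustration only, not van der Linden’s actual models.

```python
# Illustrative toy model only: SIR-style spread of a false claim, in the spirit
# of the epidemiological framing described above. All parameter values are
# hypothetical assumptions, not figures from the research.

def simulate_misinformation_spread(population=10_000, initially_exposed=10,
                                   share_rate=0.35, correction_rate=0.1, days=60):
    """Track 'susceptible', 'infected' (believing and sharing the claim) and
    'recovered' (debunked or disengaged) users over time."""
    susceptible = population - initially_exposed
    infected = initially_exposed
    recovered = 0
    history = []
    for day in range(days):
        # New 'infections': shares from believers reaching susceptible users
        new_infected = share_rate * infected * susceptible / population
        # 'Recoveries': believers who encounter a correction or lose interest
        new_recovered = correction_rate * infected
        susceptible -= new_infected
        infected += new_infected - new_recovered
        recovered += new_recovered
        history.append((day, round(susceptible), round(infected), round(recovered)))
    return history


if __name__ == "__main__":
    for day, s, i, r in simulate_misinformation_spread()[::10]:
        print(f"day {day:2d}: susceptible={s:5d} believing={i:5d} recovered={r:5d}")
```

Raising the assumed correction_rate relative to share_rate – the rough analogue of corrections and “prebunking” reaching people faster than the claim spreads – flattens the curve of believers, which is the intuition behind treating misinformation as a pathogen.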

He has gone further to develop a psychological “vaccine”, which involves exposing someone to a “weakened” form of misinformation to trigger “cognitive antibodies and develop psychological immunity”. The misinformation is then followed by a “strong refutation”, often using humour or sarcasm, which exposes the manipulation techniques used by conspiracy groups. People may then require a “booster” as their cognitive “immunity” – or memory – fades.

Van der Linden says that treating people early is most effective, as over time misinformation can settle more “deeply” in the brain and individuals continue to retrieve false details from memory. An example is the now discredited link between autism and the measles, mumps and rubella (MMR) vaccine, which has been difficult for scientists to debunk because it took 12 years for the Lancet medical journal to retract the original study.

This technique may seem maverick, even controversial. But van der Linden’s team has conducted 15 trials, with results showing that it helps people spot fake news, makes them more confident at doing so and limits their sharing of false information. The research was put into practice through Go Viral, a game launched in 2020 jointly by the UK government and the University of Cambridge that helps people spot Covid-19 misinformation through a social media simulation.

The university previously launched another gamified study called Radicalise, which uses this same “psychological inoculation” technique to prevent people being recruited by extremist groups online. It uses a fictitious WhatsApp conversation to teach players about radicalisation methods used by terrorists. The study found that it significantly improved people’s ability to spot manipulative messages. Other interventions such as integrative complexity training, where individuals are taught to recognise and consider multiple perspectives, can also be used to help people who have been radicalised to develop empathy and “cognitive flexibility”, says van der Linden – the ability to adapt their thinking and behaviour to new situations.

Social media needs to be “fundamentally reshaped” to better incorporate similar psychology-based interventions, he says. Authority figures can also use techniques such as “ethical persuasion” – where you let people know you’re persuading them and explain why – and offer a more balanced view to empower people to make their own decisions. “People with the most negative attitudes towards vaccination react much less negatively if you tell them the vaccine is not 100 per cent effective and explain the [possible] side effects,” he says. “It’s about being a trustworthy actor and persuading people at the same time.”

There is no single solution to the proliferation of “fake news”. But whether it is spread maliciously or unintentionally, there is broad agreement that confronting it requires a humanised approach. Tech giants clearly have a duty to better control the insidious nature of their algorithms, but governments also have a role to play in reintegrating people into society. Whether it is the rise in white nationalism or a widespread aversion to a life-saving vaccine, this is not an internet issue – it is a real-world problem, and it affects all of us.

*Name has been changed to protect identity

