Fake or fact: how to recognise a conspiracy theory
Cast your mind back to 2008, and you may remember the imminent end of the world. According to various doomsayers, the opening of the Large Hadron Collider (LHC) at Cern was set to create a black hole that would destroy the Earth and everyone on it.
We are still standing today – but this fact has not stopped continued speculation about other dark deeds in those subterranean tunnels. In the stranger corners of the internet, you can find reports that the institute is riddled with Satanists, who are set to blast open a multi-dimensional portal into hell – rumours that have been fuelled by a video purporting to reveal a human sacrifice.
Unless you are a particularly devoted – and credulous – fan of Buffy the Vampire Slayer, it is easy to laugh at the idea that the LHC is the opening to the Hellmouth – but many other conspiracy theories are far harder to detect. Whether they concern the causes of Queen Elizabeth II’s death, misinformation about vaccine side-effects or the origins of the war in Ukraine, our social media newsfeeds are flooded with concocted claims, and there’s a good chance that you may have entertained at least one of them yourself.
If we are to protect ourselves from falling for these tall tales, it helps to understand a little about the psychology of the “conspiracy mindset”, and the reasons that stories of back-door deals and global cover-ups can sometimes seem so appealing.
Let’s begin by confronting the myth that belief in conspiracy theories is simply a sign of low intelligence: surveys show that people with higher education are surprisingly susceptible to misinformation. (Even the chess grandmaster Garry Kasparov has apparently entertained the bizarre “new chronology” conspiracy, which claims that our accounts of Ancient Greece, Egypt and Rome have been fabricated to hide the true history of the world.) The truth is that many conspiracy theories are designed to play on our brain’s biases when we are feeling particularly vulnerable, and a stellar academic record provides little protection.
It is no coincidence that conspiracy theories are most likely to emerge and spread during times of great uncertainty – brought on, for example, by an economic recession, war, or a global pandemic. These unsettling events often have complex causes that are hard to comprehend, and they threaten our feelings of control over our lives – so we look for ways to find sense in the chaos.
Conspiracy theories can offer a neat-and-tidy narrative that helps to settle our existential angst, often through the identification of scapegoats that can be blamed for the crisis. There is an absurd comfort in believing that specific people or organisations have planned a disaster for their own profit – particularly if you feel that you are one of the few who know the “truth” – rather than accepting that, often, terrible events can happen at random without some grand mastermind behind them.
Various psychological studies show that we are more likely to fall for conspiracy theories when we already feel anxious – and this is often surprisingly simple to prime. Dutch researchers, for example, found that simply asking someone to recall a time when they had felt disempowered and helpless was enough to encourage conspiratorial beliefs about a local political controversy.
Whether or not you fall for a false narrative to ease your existential angst will also depend on your thinking style. Consider how strongly you agree with the following statements:
I hardly ever go wrong when I listen to my deepest “gut” feelings to find an answer.
I tend to use my heart as a guide for my actions.
Or these statements:
I like the responsibility of handling a situation that requires a lot of thinking.
I find satisfaction in deliberating hard and for long hours.
People who agree with the first two statements have a more intuitive thinking style, while those who agree with the latter two have a more analytical thinking style.
There is nothing wrong with relying on our intuitions, per se; gut feelings can often serve us well, particularly when we are drawing on expertise built through years of experience in a particular field. (If you are a doctor, for example, a quick look at a patient may be enough to form a tentative diagnosis.) An overreliance on intuitive thinking can, however, cause someone to jump too rapidly to conclusions in areas where their judgment is less refined. And this tendency makes them more susceptible to conspiracy theories, since they don’t stop to think of the holes in the argument or the potentially contradictory evidence that would cause it to crumble, and instead accept any claims that feel “truthy”.
Like a mind-altering virus, conspiracy theories can sometimes become so deeply embedded in someone’s brain that they may struggle to see sense, even when they are presented with contradictory evidence. To abandon the theory would cause their whole world view to collapse – and so they double down on their beliefs. Fortunately, these people are in the minority – and the psychological research suggests that there are ways to reinforce our brain’s bullshit detection.
The first step is self-awareness. Given that uncertainty primes a conspiracy mindset, you might try to look for proactive ways to cope with the stress of unsettling world events, rather than doomscrolling through social media, where you are more likely to find misinformation that feeds your anxiety. If you do come across alternative explanations for the crisis, watch out for overly simplistic stories that pin the blame on a convenient scapegoat, and try to scrutinise the logic of their claims and the basis of their evidence.
Have the photos and videos been taken out of context? Or might they have been faked or Photoshopped? Are the experts’ credentials as impressive as they claim, and do they have proven knowledge in the relevant field? And do they cite surveys or reports created by a credible academic organisation? The peddlers of conspiracy theories often varnish their claims with a veneer of authority by using false experts and made-up institutions.
You might also look closely at the way that people respond to criticisms of their theories. Do they engage directly with the point in question? Or do they resort to ad hominem attacks and red herrings that draw attention from the weak spot in their argument? And do they call on “special pleading” – arguing, for instance, that the very absence of evidence is itself proof of a cover-up? This argument conveniently explains away a lack of objective facts, meaning that it can be used to support literally any absurd claim.
If you can spot these tricks, you’ll find it far easier to see the substance of their claims and the strength of their evidence for what they are, rather than being blinded by the smoke and mirrors of their rhetoric. Fortunately, there are plenty of available resources that detail the most important logical fallacies and the ways to identify them.
Don’t be ashamed if you have ever fallen for a conspiracy theory in the past: the scientific research consistently shows that we are all vulnerable to misinformation. Critical thinking involves learnable skills that improve through practice, so you can continue to build your bullshit detection kit throughout your life – whatever your starting point. The important point is to avoid making the same mistakes again. With these sharpened critical faculties, you will be well equipped to sidestep many intellectual potholes – including those that have led some to see the LHC as the road to hell.
The Expectation Effect: How Your Mindset Can Transform Your Life by David Robson is published by Canongate in paperback (£10.99). To support the Guardian and Observer, order your copy through guardianbookshop.com. Delivery charges may apply.