Why We Love Conspiracies: 3 Causes and 3 Cures

Up to a third of people globally endorse at least one conspiracy theory, from COVID origins to assassination cover-ups, climate change, and secret cabals.
Why are we so drawn to absurd tales and false truths? Are we open‑minded, bored, sensation-seeking, gullible, or all of the above?
The world of deepfakes, QAnon, and Elvis sightings spans a spectrum of absurdity. Such stories might be fun to entertain at first. But for believers they harden into perceived reality, and become dangerous. Understanding why we love them is the first step to knowing what to consider and what to dismiss.
So why do conspiracy theories spread?
It’s not because we’re foolish. Conspiracy theories are titillating and comforting. During political or social uncertainty, a false pattern can feel safer than a knowledge vacuum.
Novelty also makes falsehoods more shareable. It’s why “new news” spreads significantly faster and further than the truth. Platform algorithms then reward outrage and conspiracy‑leaning content.
The impulse to share is understandable. We want to be first with the gossip. Immediate gratification follows with an instant dopamine hit. Plus, we feel both special and superior as part of a select club.
The result? At best, misinformation, tension, and distorted decisions. At worst, polarization, tribalism, and fractured communities.
Conspiracy theories spread by rumor, often maliciously. Alex Jones claimed the 2012 Sandy Hook school massacre, in which 26 people died, was a hoax. People believed it. During COVID-19, large-scale U.S. estimates suggest anti‑vax narratives led to roughly 200,000 preventable deaths. Repetition then overrides verification, and illusory truth sets in.
There is no dominant personality type among those who believe conspiracy theories. So what makes us vulnerable to false narratives?
3 causes and 3 cures of conspiracy thinking
As a behavioral scientist, I find that susceptibility depends on three factors: our need for closure, our need for focus, and our need for belonging.
This starts with a desire to make sense of the world.
(i) Need for closure
We’re explanation-seeking creatures with a low tolerance for ambiguity. We seek cognitive closure on everything from Agatha Christie mysteries to missing person cases. Logic gaps in data or inconsistencies in narratives irritate us. For instance, was Marilyn Monroe really murdered?
Conspiracy theorists exploit this need for closure and pattern consistency. Misleading beliefs often explain complex or emotional events in simple terms.
The “no smoke without fire” reasoning takes hold.
Repeated exposure to misinformation makes sharing feel less morally wrong. Why? The information feels familiar.
How might you mitigate this risk? It starts with uncertainty tolerance.
1. Notice your instinct: How often do you think, “It might be true,” especially if the narrative is compelling? Watch for paranoia, the tendency to think others lie more than we do.
2. Normalize checking: Standardize the phrase, “Are you sure?” Studies have found that a simple reminder tripled accuracy in spotting untruths.
3. Empower reflection: In teams, label certain questions as “currently unanswered.” Use prompts like “What else could be true?” or “What might change my mind?” You give others the license to be wrong.
You don’t have to enjoy uncertainty, just tolerate it—and don’t outsource it to drama-loving storytellers.
If closure reduces uncertainty, focus, the second driver, determines whether you test what you hear.
(ii) Need for focus
Distracted thinkers are especially vulnerable to conspiracies. How often are you bombarded with stimuli, interruptions, and conflicting narratives? Too often, I suspect. Exhaustion lowers our cognitive armor and intellectual defenses. We skim ads, emails, or headlines. But when we’re interrupted, our limited attention is hijacked. This makes deception harder to anticipate. Half-truths emerge unchallenged.
When attention is fragmented, we rely on salient stories and word of mouth. Conspiracy theories find fertile ground in illusory patterns and invented connections.
How might you mitigate this risk? Structured skepticism offers a partial cure.
1. Introduce decision friction: Pause before forwarding, reposting, or repeating claims. Consider the impact on reputation if this is found to be untrue.
2. Ask questions and question answers: Our default is fast System 1 thinking rather than slower System 2 thinking. Even simple accuracy prompts help: “What did you not hear?” or “Could the opposite apply?”
3. Personalize rules: Consider “Never share based on a headline” or “Wait 24 hours before reacting.”
Our brains once rewarded spotting threats. Now they reward retweets with kudos and social belonging: the third driver of conspiracy theories.
(iii) Need for belonging
The idea of “a secret vendetta” or “over‑50s targeting” can become a perceived reality. Like-minded thinkers and selective facts reinforce beliefs, ignoring weak or absent evidence. For instance, committed Holocaust or climate change deniers are unlikely to entertain alternative explanations. Echo chambers then amplify misinformation.
Solomon Asch found that group pressure led 75 percent of participants to deny an obvious truth at least once. Cult followers at Jonestown and Heaven’s Gate also doubled down, even after prophecies failed.
As described in Tune In: How to Make Smarter Decisions in a Noisy World, we tune out common sense in 10 misjudgment traps. These are both predictable and controllable.
Intellectual humility mitigates the lure of false narratives.
1. Curate contradictory sources: Avoid flattery of existing views. Reward people who revise or change positions rather than dig in.
2. Practice humility: Treat beliefs as provisional. People who show greater intellectual humility tend to process information more accurately. They’re less attached to conspiratorial narratives.
3. Challenge consistently: Extreme theories thrive on what people think they know. Because we’re motivated to mishear, invite counterfactuals instead of settling for overconfident explanations.
Many think they’re immune to conspiracies. Ironically, resisting a challenge to your beliefs relies on the same reasoning: the confirmation bias cycle. In essence, pause to ask, “What might be true?” Then probe, “What other views might exist?” And maintain perspective: “What might change my mind?”
Gain clarity, not comfort
People may secretly love reading about these false beliefs, but they come at a cost. Intellectual humility, uncertainty tolerance, and structured skepticism can combat conspiracy thinking. Believers often feel smarter than skeptics. But true intelligence is knowing when to question yourself first.
It’s not possible to fully eliminate conspiracy theories, but it is possible to reduce their impact. You can influence how people think rather than what they think.
That preserves not only the integrity of thought but also content clarity—and common sense.