And if not, why do people keep believing that it is?

Whether it’s self-deception aimed at avoiding distress, false memories, contradictory religious notions about what happens when we die, or just getting the facts wrong, we all believe things that aren’t true. And not infrequently, we cling to those false beliefs, even when they butt up against other opinions—or evidence—to the contrary.
In today’s political climate, disputes over reality have become all too common. It’s come to the point that we can often predict people’s beliefs about certain hot-button issues—like the threat of climate change, the safety of vaccines, or whether there’s more than one gender—based on whom they voted for in the 2024 Presidential election.
Given that political polarization, when we try to imagine how our ideological opponents can get it so wrong, we often conclude that it’s because they’re stupid, or brainwashed, or victims of “mass psychosis” or “mass delusion.”
As a psychiatrist who knows a thing or two about delusion, I can assure you that we don’t usually believe things that aren’t true due to a lack of intelligence or because we’re mentally ill.
There’s not a single one of us who is immune to false belief. Not knowing everything (and not always admitting it) is simply a part of being human.
In my new book False: How Mistrust, Disinformation, and Motivated Reasoning Make Us Believe Things That Aren’t True, I try to paint a humanizing view of false belief. And while there are myriad reasons why each of us, on an individual level, can fall victim to falsehoods while insisting we’re right—including naïve realism, the Dunning-Kruger effect, confirmation bias, and cognitive dissonance—there’s a universal framework that can help us better understand the stubbornness of our conviction.
Although I don’t refer to it as such in the book (and the publisher wasn’t a fan of alliteration in the title), I’ve recently started calling it the “3M Model” of false belief based on the following components:
- Mistrust. As I explained in a previous post, the rationale for our beliefs is often rooted in intuition, faith, personal experience, or trust. We depend on trust when we look to others—as we often must—when we don’t have the answers ourselves. But with so many sources of information vying for our attention today, accessible in an instant through the peripheral brains that are our phones, it’s easy to end up placing our trust and mistrust in the wrong people. When we come to hold false beliefs, and cling to them despite counter-evidence, it’s often because we mistrust the people who are telling the truth and trust those who aren’t.
- Misinformation. When other people tell us things that aren’t true, they’re supplying us with misinformation. And when they deliberately feed us misinformation that they know is untrue—like when Big Tobacco told us that cigarette smoking doesn’t cause cancer or when Big Oil told us that global warming is a hoax—they’re spreading disinformation. Still, whether deliberate or not, false beliefs often take root in our brains because we encountered misinformation somewhere and decided that it was true. We all know that “fake news” exists, but we’re often not the best at distinguishing between it and the real thing. It doesn’t help that we’re often told that real news is fake and fake news is real.
- Motivated Reasoning. Whereas “confirmation bias” means that we tend to gravitate toward information that supports what we already believe, or want to believe, while swiping past or rejecting information that contradicts it, the related concept of motivated reasoning is a bit more involved. Confirmation bias refers to what we do when we encounter information; motivated reasoning describes what we do after we’ve actually scrutinized it, rationalizing to ourselves whether or not it represents quality information depending on our ideological identities. If it supports and preserves that identity (like how Fox News supports conservative viewpoints, while MSNBC supports liberal views), we reason that it’s great information. When it doesn’t, we reason the opposite, telling ourselves—and our ideological opponents—that the information is bogus, biased, or otherwise unreliable.
As I mentioned earlier, there are many reasons why we believe and disbelieve, sometimes getting it right and sometimes getting it wrong. But keeping the 3 M’s—mistrust, misinformation, and motivated reasoning—in mind helps us to understand why our beliefs are often predicted by our ideological affiliation and why we may refuse to concede that we might be wrong. Within our ideological groups, we’re told whom to trust and mistrust, we rely on informational sources that amount to echo chambers, and we employ motivated reasoning that closes us off from counter-evidence and meaningful counter-argument that could change our minds.
People often ask why facts don’t change our minds. Mistrust, misinformation, and motivated reasoning go a long way toward explaining that it’s because we often don’t agree on what the facts are.