3 Keys to Reversing Misinformation With Cognitive Dissonance

Confirmation bias is the tendency to accept only information that reinforces what one already believes. For example, vaccine opponents may accept only information suggesting vaccines are unsafe and reject any contrary information or facts.
Confirmation bias is one reason people find it hard to let go of their belief in misinformation. Misinformed people often trust virtual or online sources, such as social media or podcasts. However, misinformation is increasingly coming from state and federal government agencies.
Trusted messengers (particularly people a misinformed person knows) can help overcome confirmation bias and loosen the grip of misinformation by presenting alternative views. Cognitive dissonance occurs when misinformed people consider those alternatives. The cognitive conflict (or dissonance) can be resolved over time as they choose one alternative over another.
A trusted messenger’s goal is to guide a misinformed friend or relative toward factual reality and away from misinformation or disinformation.[i] This process can be assisted and accelerated by bringing in additional trusted messengers who add nuance and new contexts to the reality side of the cognitive dissonance. Trusted messengers can also point out the motivations and incentives of those spreading misinformation, which are often profit, political advantage, or other gains.
An ongoing, high-profile area of misinformation in the United States involves vaccines—how they are developed, approved for use, and recommended by the U.S. government. The current leadership of the federal public health agencies (e.g., the Centers for Disease Control and Prevention and the Food and Drug Administration) is deviating dramatically from previous scientifically established, evidence-based practices.[ii],[iii] Those actions are fueling further belief in misinformation (and mistrust) about vaccines, which expanded dramatically during the COVID-19 pandemic.[iv] Even people who never questioned vaccinations now seem to be experiencing cognitive dissonance because previously trusted sources of information are presenting contrary views and recommendations.
Creating Cognitive Dissonance to Reverse Misinformation Can Happen in 3 Ways
1. Emotional engagement from a trusted messenger
To create cognitive dissonance successfully, trusted messengers need to engage emotionally rather than with counter-facts, which generally lead only to “my facts vs. your facts” arguments. Presenting facts is counterproductive because the misinformed person assumes that you (i.e., a trusted messenger) have been duped by sources they don’t trust, such as mainstream media, big corporations, or the health care system.
A more successful alternative is to listen to what a misinformed person is saying, seek to understand where their mistrust comes from, and engage with them on an emotional level. By listening, you will learn what motivates them, what fears reinforce the misinformation (such as harms from medicines or vaccines), and where those fears come from (e.g., personal experience, family history, or “stories” they’ve heard or read on the internet). That enables you to offer emotion-connected perspectives that align with their motivations and fears.
2. Challenging perspectives from multiple trusted sources
While a trusted messenger (i.e., you) can guide a misinformed person to consider alternative perspectives, that process can be reinforced and accelerated by bringing in additional trusted sources to your conversations. Those sources can be other friends or respected community leaders, like local business people, clergy, coaches, baristas, or barbers/hair stylists. Basically, this is anyone who understands the truth behind the misinformation, is trusted by the misinformed person, and is willing to engage with them on an emotional level rather than trying to correct their misinformation with facts and figures.
The value of bringing additional perspectives into this process has been seen in consumer marketing, where new information typically needs to be presented 5 to 12 times before potential customers will consider changing brands. Presenting that information in different formats—such as magazine ads as well as TV ads—makes the effect more powerful than multiple presentations from the same source or in the same format.
That multiple-sources, multiple-times strategy enhances the cognitive dissonance necessary for someone to reconsider their belief in misinformation.
3. Pre-loading people with skills that can prevent belief in misinformation
Another way cognitive dissonance can help counter misinformation is to pre-load people with perspectives and insights so that there is immediate dissonance when they see or hear misinformation.
This “pre-loading” can be done in several ways:
The first way is called “pre-bunking,” which is showing people how misinformation is typically presented so they recognize it in the future. Pre-bunking gets people to ask themselves who is behind an information source and why it is being distributed (i.e., what the source gains or how it benefits).
A second, more focused way to pre-load people against misinformation is enhancing media literacy: the ability to judge the validity of information sources and to recognize an invalid source that is likely presenting misinformation. A deeper level of media literacy is learning how to research and evaluate questionable sources by looking elsewhere for confirmation.
Media literacy training for school-age children can be integrated into specific subject-matter curricula (such as civics, history, or science), as well as into lessons on using technology. Similar training for adults can be offered through public libraries, community colleges, and continuing-education programs. Adult media literacy training can be combined with financial literacy and fraud-prevention sessions, since both provide people with the skills to recognize and reject false information sources and messages.
A third way to pre-load people with skills and knowledge that keep misinformation from taking hold is better education about the scientific method, which helps people understand why correlation is not the same as causation. This matters because misinformation often starts with a correlation that is then mischaracterized as causation.
Conclusions
Trusted messengers can help reverse misinformation by emotionally engaging with misinformed people to introduce alternative perspectives that create cognitive dissonance.
Pre-loading people with the skills and understanding to recognize invalid information can keep misinformation from taking hold, because the cognitive dissonance they may experience on encountering it stops the misinformation from “infecting” them.
This process is not as simple as giving a vaccine injection, but the long-term benefits to individuals, families, and society may be similar. And with enough resistance to misinformation, herd immunity to our current infodemic of misinformation may be achievable.