MSU expert: How misinformation and disinformation influence elections
Ahead of the 2022 midterm elections, Johannes M. Bauer, the Quello Chair for Media and Information Policy in Michigan State University’s College of Communication Arts and Sciences, explains how to recognize and combat the spread of misinformation and disinformation in today’s complex media landscape.
What is the difference between misinformation and disinformation?
Misinformation is wrong or false information that is spread without the intent to mislead, whether through personal communications or electronic channels. It can consist of incorrect facts or of false interpretations or explanations of correct facts. In contrast, disinformation is false information spread with the intent to mislead, deceive, manipulate, or generate a revenue stream, often by creating attention and inciting outrage. Propaganda, conspiracy theories and the deliberate dissemination of knowingly wrong explanations — for example, that COVID-19 is caused by 5G wireless antennas — are forms of disinformation.
How do misinformation and disinformation spread?
In today’s hyper-connected world, there is an overwhelming amount of information available: some of it is true, some is false, some is opinion, and some is deliberately misleading. While the internet has increased access to useful, factual information, it also enables business models that allow disinformation to be disseminated on a profitable scale. The business model on which many internet companies, social media platforms in particular, rest is one that thrives on anger and outrage. These reactions keep people engaged and create attention that can be used to place ads and other pieces of information to generate revenue.
Both misinformation and disinformation are related to the fact that we know about our world mostly indirectly, not through direct, personal experience. That knowledge is often incomplete: we may not know the ‘truth,’ only a preliminary version of it. It is also not evenly distributed among individuals in a population. For example, a medical doctor typically knows more about diseases than a patient. We often ‘patch’ the parts we do not know by trusting others, trusting media, or using our own problem-solving skills to determine what might be closest to the ‘truth.’ This incompleteness opens a crack through which misinformation and disinformation can enter.
How do misinformation and disinformation influence elections and the American political system?
Misinformation, and especially disinformation, can bias voters’ choices and, hence, affect election outcomes. The strength of such effects is controversial, and the empirical evidence is often ambiguous. For example, although we know that there was strong foreign interference in past presidential elections, detailed statistical studies do not suggest that it had a decisive impact on the outcomes. Nonetheless, the concerns are real, and the strategies used to influence voters are becoming more deceptive and more difficult to discern.
A broader, and more concerning, effect is that misinformation and disinformation undermine trust in elections, their outcomes, the media system reporting on elections and the broader political and governmental institutions that any prosperous, peaceful society needs. This could result in long-term damage to our country’s ability to solve the most challenging economic and social problems, which often require finding common ground. People need to be able to talk to each other and find workable solutions to bridge different opinions about political issues. If disinformation becomes such a corrosive agent that it reduces our ability to talk to each other, then society is at risk.
What can be done to stop misinformation and disinformation from spreading? Who is responsible for doing so?
The only way to deal with misinformation and disinformation is to have sufficient knowledge and practices that allow us to continuously assess and evaluate the information we encounter. We need to train people to verify, check and triangulate different pieces of information. In our fast-paced online and news ecology, given variations in digital literacy and the time constraints most of us face, that is often difficult.
We can also explore technological solutions that can make it easier for people to discern whether a piece of information is from a reliable source. However, the First Amendment makes it difficult for government to impose rules that restrict speech, or have the potential to restrict speech, on private players in the media ecosystem. Currently and in the foreseeable future, it’s largely on the shoulders of media and technology companies to come up with algorithms and tools that help individuals discern whether information is trustworthy. In addition, strong efforts to improve digital literacy by including it across K-12, postsecondary, and continuing education are necessary.