
From COVID to QAnon, U of M researcher Emily Vraga on debunking false information

With today’s technology, we’re afforded a lot of conveniences that wouldn’t have been conceivable just a few decades ago. But those conveniences come with consequences. The digital world provides a platform for virtually anyone with a novel idea to find a community to share it with — fast. 

But it provides that same platform to those spreading misinformation, disinformation or “fake news.” And whether it’s QAnon or COVID conspiracies, the deliberate dissemination of false information can affect everything from politics to public health.

Emily Vraga is a University of Minnesota researcher and an expert on misinformation. She is a co-author of “The Debunking Handbook 2020,” a guide for slowing the spread of misinformation, written by 22 professionals from universities around the world. It advises readers to prevent the spread of misinformation whenever possible, and provides techniques for debunking pieces of misinformation that have already spread.

We talked to Vraga to get her views on the social media landscape as it exists today. This interview has been edited for length and clarity.

MinnPost: How does someone identify something as misinformation?

Emily Vraga: It’s hard work. There are lots of signals you can use to determine whether something is suspicious, but to know whether something is misinformation requires a validation process. If you’re coming across content and you’re not sure whether it’s true or false, the best thing to do is often not to look at the content itself, but to look for other sources to substantiate or refute it. It’s easy for manipulators to mimic the characteristics of good information and make something that’s bad look like it’s from a reputable source.

MP: The handbook advises preventing misinformation from spreading by “explaining misleading or manipulative argumentation strategies to people.” What are some of those misleading and manipulative strategies?

EV: One signal we use to decide whether information is good is the credibility of the source. It’s really easy to manipulate that credibility. We trust somebody with “Dr.” before their name more than we trust somebody without it. But it’s fairly easy to put “Dr.” in front of the name of someone who has no medical or academic credentials, or to take a doctor outside of their expertise. So, I’m Dr. Emily Vraga; I can talk to you with expertise in certain areas, but if you wanted an expert on dental hygiene, my doctorate doesn’t help you at all. A common technique is to either fully misrepresent someone or to take somebody who is a doctor and have them speak on something outside their expertise. Once you recognize that technique, you might be able to say, “OK, I need to make sure this person is a doctor in an area that’s relevant to the topic being discussed.”

Another cheap misinformation technique you might see is conspiracy theories. The hardest part of addressing conspiracy theories is [that] a good one takes the lack of evidence as evidence for the conspiracy: if you can’t find information that proves it, that just means it goes even deeper than you thought. That’s a technique that is incredibly difficult to address, because it undermines your faith in the institutions you would normally trust to help you separate truth from fiction.

MP: What are the main sources of misinformation today?

EV: There are two kinds of bad information we really need to be aware of. There is disinformation, which is spread with a strategic purpose in mind. It could have a monetary incentive: “I want you to click on my website so I get advertising money.” It could have a political incentive: “I’m trying to harm someone else or improve my own chances politically.” One example would be Russia potentially having strategic goals for sowing disinformation. 

The problem is, especially on social media, disinformation can become misinformation quite quickly. If a Russian troll is sharing disinformation with a strategic goal and I retweet it, I’m now spreading misinformation, even though my intent might just be that I genuinely think it’s true, and I want to warn other people. Thinking about the motives is very important as you’re trying to correct somebody who’s misinformed with good intentions. You are going to use a totally different strategy than if you were trying to go after disinformation.

MP: Some social media platforms have responded to false information by removing posts and suspending accounts from the platform. Is this an effective method of reducing misinformation?

EV: I think the platforms need to take some responsibility and take targeted steps toward reducing misinformation. Doing so is part of their responsibility, but we want them to be really cautious and judicious about it. Misinformation creates real-world harms, and recognizing that is important, because it substantiates their ability to take action.

I think that finding serial offenders who are consistently sharing misinformation, and doing so in spite of warnings, is a reasonable step to take. Another step could be demonetizing misinformation, which might take away the incentive to produce it. Another is providing context, so that when misinformation is there, you’re getting the accurate information at the same time. I think these all have to be options that social media platforms can invoke. If the information can cause real harm, that’s something they need to respond to appropriately, including taking it down when warranted.

MP: People are really divided right now — sometimes on what is and isn’t factual. How do you have constructive conversations with people who don’t agree on fundamental sources of truth?

EV: That’s a really tough question. We talk about misinformation like it’s black and white, and on some issues, it is. On other issues, there are a lot more gray areas. There are a lot of issues where it would be really hard to definitively declare something misinformation, especially when knowledge is constantly changing. 

Thinking about the audience you’re reaching is helpful when you’re making a case about misinformation and correcting it. No single source is effective for every audience. If I’m trying to correct misinformation that is political in nature, it’s going to be much easier to do that from within the group than from outside it. A Democrat correcting a Republican is not going to have nearly as much success as a Republican correcting a Republican, and vice versa.

Another thing you can do is think about who the trusted experts are for a particular group, and whether there are ones that are less political. For example, a trusted local health organization might be much more successful than even the CDC right now. A lot of people have personal connections with these more local agencies or with their own doctors, and they’re going to be more effective messengers than a remote agency that has faced partisan criticism.

MP: What are some things anyone can do in their daily lives to prevent the spread of misinformation?

EV: The first is to be really careful about what you’re sharing. Make sure you’ve checked your sources and checked your facts before you click “share.” Just because my aunt shared it doesn’t mean it’s gone through that careful vetting process.

Often, the best way of doing that is the simple act of googling it. Is this something a lot of different news or health organizations are saying, or did I find something that seems to be an outlier? If it’s an outlier and you’ve seen someone sharing it, responding empathetically but with accurate information can work. If my aunt shared something and it’s not true, the best thing I can do is say, “I get that this is confusing. It’s really easy to get wrong information, but the latest information from the CDC is actually …” and then tell them what it is and share a link for more information.

Think about it as something that you’re doing not to criticize but to protect other people, because it’s not just your aunt who might be misinformed. If she’s sharing that, a lot of other people could see it and walk away with the wrong impression. And that’s dangerous. So think about the ways each of us can make our information environments better: by being careful about what we share, by being thoughtful, and, when we see misinformation, by correcting it in ways that are validating but still accurate.
