Google is trying out ‘pre-bunking’ in an effort to counter misinformation

In the days before the 2020 election, social media platforms began experimenting with the idea of “pre-bunking”: pre-emptively debunking misinformation or conspiracy theories by telling people what to watch out for. 

Now, researchers say there’s evidence that the tactic can work — with some help from Homer Simpson and other well-known fictional characters from pop culture. 

In a study published Wednesday, social scientists from Cambridge University and Google reported on experiments in which they showed people 90-second cartoons, both in a lab setting and as advertisements on YouTube, explaining in simple, nonpartisan language some of the most common manipulation techniques. 

The cartoons succeeded in raising people’s awareness about common misinformation tactics such as scapegoating and creating a false choice, at least for a short time, they found. 

The study was published in the journal Science Advances and is part of a broad effort by tech companies, academics and news organizations to find new ways to rebuild media literacy, as other approaches such as traditional fact-checking have failed to make a dent in online misinformation. 

“Words like ‘fact-checking’ themselves are becoming politicized, and that’s a problem, so you need to find a way around that,” said Jon Roozenbeek, lead author of the study and a postdoctoral fellow at Cambridge University’s Social Decision-Making Lab. 

The researchers compared the effects to vaccination, “inoculating” people against the harmful effects of conspiracy theories, propaganda or other misinformation. The study involved nearly 30,000 participants. 

The latest research was persuasive enough that Google is adopting the approach in three European countries — Poland, Slovakia and the Czech Republic — in order to “pre-bunk” anti-refugee sentiment around people fleeing Ukraine. 

The company said it doesn’t have plans to push “pre-bunk” videos in the United States ahead of the midterm elections this fall, but said that could be an option for future election cycles. Or it’s a cause that an advocacy group, nonprofit organization or social media influencer could take up and pay for on their own, Google and the researchers said. (The videos are “freely available for all to use as they wish,” their YouTube page says.) 

To avoid turning off political partisans, the researchers created their five cartoons without using any real political or media figures, choosing instead to illustrate their points with fictional characters. 

One cartoon explains the concept of an ad hominem attack, in which a person attacks someone making an argument rather than addressing the merits of the argument itself. It features a brief clip from “The Simpsons” to illustrate its point, while other cartoons feature characters from the “Star Wars” franchise, “South Park” or “Family Guy.” 

The result is videos that are half rhetoric class, half pop culture deep cut. 

“We can in a very apolitical way help people gain resistance to manipulation online,” said Beth Goldberg, head of research at Jigsaw, a Google subsidiary that does research into misinformation and other subjects. She is a co-author of the study, and Jigsaw provided funding for the study and for the Ukraine-related media campaign. 

Cambridge researchers previously created an online game called “Bad News” to teach people about shady media practices, but it required people to opt in. 

The cartoons, though, ran as ads on YouTube and so were harder to miss. The cost of the ads was about 5 cents per view. And to measure the effect, researchers used the same technology that YouTube has in place for corporate advertising campaigns. 

Within a day of seeing one of the videos, a random subset of participants was given a one-question quiz to test how well they recognized the manipulation technique featured in the ad. Researchers found that a single video ad boosted recognition by about 5% on average. 
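
The headline number here is a difference in quiz accuracy between viewers who saw a pre-bunking ad and a comparison group who did not. As a rough illustration only (the figures and the analysis below are hypothetical, not the study's), here is a minimal sketch of how such a lift in recognition rates might be estimated from one-question quiz results:

```python
import math

# Hypothetical quiz counts (illustrative only; not the study's data):
# each participant answers one question testing whether they recognize
# the manipulation technique featured in the ad.
control_correct, control_n = 6_600, 10_000   # did not see the video
treated_correct, treated_n = 7_100, 10_000   # saw the 90-second video ad

p_control = control_correct / control_n
p_treated = treated_correct / treated_n
lift = p_treated - p_control  # estimated boost in recognition

# Standard error and 95% confidence interval for a difference
# of two independent proportions.
se = math.sqrt(p_treated * (1 - p_treated) / treated_n
               + p_control * (1 - p_control) / control_n)
ci_low, ci_high = lift - 1.96 * se, lift + 1.96 * se

print(f"control: {p_control:.1%}, treated: {p_treated:.1%}")
print(f"estimated lift: {lift:.1%} (95% CI {ci_low:.1%} to {ci_high:.1%})")
```

With these made-up counts the sketch reports a lift of about 5 percentage points, in the same ballpark as the average effect the researchers describe; the actual study used preregistered analyses across multiple experiments rather than this simple two-proportion comparison.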

The researchers acknowledged some drawbacks. For example, they don’t know how long the “inoculation effect” remains — a question that Goldberg said they are now studying. 

Brendan Nyhan, a professor of government at Dartmouth College who was not involved in the study, said the results show that inoculation against false claims has potential. 

“It advances the state of the art by demonstrating these effects across multiple preregistered studies and showing that they can be obtained in the field on YouTube and that the effects seem to persist at least briefly after exposure,” he said in an email. 

A “pre-bunking” campaign might do little to stem the tide of disinformation from prominent sources such as far-right influencers on YouTube, said Shannon McGregor, a senior researcher in communication at the University of North Carolina, Chapel Hill. She was also not involved in the study. 

“In the end, the authors propose that those worried about disinformation on social media (including YouTube) should spend more money on those platforms to run ads to protect against disinformation. In many ways, that is wholly unsatisfying for basically all stakeholders except the platforms,” she said in an email. 

Some attempts to counter misinformation have backfired. In 2017, Facebook scrapped a feature that put a “disputed” flag on certain news posts, after academic research found that the flag could entrench deeply held beliefs. 

Interest in “pre-bunking” misinformation has been percolating for a few years. Twitter used “pre-bunking” on subjects including ballot security in the days leading up to the 2020 election, while Facebook and Snapchat put resources into voter education. Other efforts have focused on Covid misinformation. 

Meanwhile, YouTube has grown in importance as a source of political information and partisan warfare. 

Roozenbeek said he’s optimistic that “pre-bunking” videos could educate social media users about manipulation tactics, even if they won’t solve the problem of misinformation entirely. 

“That isn’t the end-all, be-all of what platforms should be doing in my opinion to combat misinformation,” he said. 

YouTube, which operates separately from Jigsaw as a division of Google, declined to comment on the study. 

Goldberg said that “pre-bunking” videos aren’t designed to replace the content moderation programs that tech companies have set up to detect and take down posts that violate their rules, but she said content moderation hasn’t been enough given the volume of misinformation. 

“It’s hard to go after every viral piece of information,” she said. 

But with the “pre-bunking” videos, she added, “We don’t have to anticipate what a politician is going to say or what the vaccine disinformation campaign is going to say next week. We just have to say, ‘We know there’s always going to be fearmongering.’”

***
This article has been archived for research purposes. The original version was published by NBC News.