Most methods for squashing conspiracy theories don’t work, study finds. Here’s what does.



Debunking conspiracy theories with counterarguments is often a fruitless effort — but according to a new scientific review, there may be alternative strategies that can successfully fend off conspiratorial beliefs.

Having already grown over the past 10 years, interest in conspiracy theories skyrocketed during the pandemic, when failure to comply with public health recommendations was sometimes associated with conspiracy beliefs. For example, proponents of the anti-vax movement may avoid vaccinations for themselves or their children on the basis that some hazardous outcome of vaccination is being covered up. Although increasingly prominent in public discourse, conspiracy theories have proved a difficult mindset to shift.

“I wouldn’t have a Ph.D. in this project if conspiracy theories were easy to counteract,” said Cian O’Mahony, a doctoral candidate in psychology at University College Cork in Ireland who led the systematic review reported in the journal PLOS One. The review doesn’t reveal “a silver bullet” for countering conspiracy theories, he said, but “we have found some interesting avenues for future research that we should follow up.”

The review is the first of its kind, as previous studies have been more concerned with understanding the psychological underpinnings of conspiracy beliefs, O’Mahony told Live Science. Research into designing interventions to combat conspiracy beliefs is still relatively new. “When we did the review, we found that there’s only a handful of papers that are actually published on this topic,” he said.


O’Mahony described a conspiracy theory as “a belief that explains events by invoking malicious groups working in secret.” The role of some underground organization distinguishes conspiracy theories from general misinformation and “fake news.” For instance, the statement “Bigfoot exists” would not be a conspiracy theory unless qualified by adding “and a particular organization is trying to keep it a secret.”

The new review suggested that many methods for changing conspiracy beliefs are ineffective — particularly those that involve straightforwardly arguing against a person’s beliefs after they’re already entrenched. However, the review also highlighted some emerging practices that might be successfully wielded against conspiracy theories.      

The most promising was training that teaches people to critically analyze information and distinguish pseudoscience from the real thing. However, even generic “analytic priming” of a study participant’s mental state to make them more alert, by presenting them with text in a hard-to-read font, for example, was found to reduce the likelihood that they would fall for a conspiracy theory they saw shortly afterward.

Finally, “information inoculation” can also be effective. In this strategy, counterarguments to a conspiracy theory are presented, along with a warning that exposure to misinformation will follow, before the subject is exposed to the theory itself. The approach is likened to the way a vaccine exposes someone to a fragment or weakened form of a virus so that they are resistant to the disease when they encounter it.

(Unfortunately, this same approach can also be used to spread conspiracy theories, if someone “inoculates” with a conspiratorial explanation first, O’Mahony noted.)

“While it is not overly optimistic, this review points out several potentially promising” lines of research, Iris Žeželj, a professor of social psychology at the University of Belgrade who was not involved in the new review, said in an email.

However, she highlighted the need to replicate the studies demonstrating successful intervention, as well as the challenge of scaling them up into policies. O’Mahony noted these same caveats and also pointed out the current lack of evidence that any of these interventions have a lasting impact.

Valerie van Mulukom, a researcher at the Centre for Trust, Peace and Social Relations at Coventry University in the U.K. who was not involved with the review, described it as a “timely endeavour” but emphasized that it is important to consider the spread of conspiracy beliefs as a social process.

“Interventions may decrease belief in certain conspiracy theories by pointing out issues in the information presented, but they do not take away the social causes underlying belief,” she said in an email. Factors like people’s personalities, paranoia, need for closure, financial insecurity and feelings of marginalization may all influence which conspiracy theories they subscribe to and which interventions work on them.

“It is not the case that everyone with lower levels of analytical or scientific reasoning believes in conspiracy theories,” van Mulukom noted.   

As a follow-up to their review, O’Mahony and his colleagues are developing a video game aimed at honing players’ critical thinking skills. Such games have already been shown to be effective in combating fake news.

“This might sound a little avant-garde, but we’re finding that this is a potentially promising avenue for teaching people to apply critical thinking skills to conspiracy theories,” he said.

***
This article has been archived for your research. The original version from Live Science can be found here.