Thursday, November 21, 2024

conspiracy resource

Conspiracy News & Views from all angles, up-to-the-minute and uncensored

This Chatbot Pulls People Away From Conspiracy Theories

In a new study, many people doubted or abandoned false beliefs after a short conversation with the DebunkBot.

Shortly after generative artificial intelligence hit the mainstream, researchers warned that chatbots would create a dire problem: As disinformation became easier to create, conspiracy theories would spread rampantly.

Now, researchers wonder if chatbots might also offer a solution.

DebunkBot, an A.I. chatbot designed by researchers to “very effectively persuade” users to stop believing unfounded conspiracy theories, significantly and durably changed people’s convictions, according to a study published on Thursday in the journal Science.

Such false theories are believed by up to half of the American public and can have damaging consequences, like discouraging vaccinations or fueling discrimination.

The new findings challenge the widely held belief that facts and logic cannot combat conspiracy theories. The DebunkBot, built on the technology that underlies ChatGPT, may offer a practical way to channel facts.

“The work does overturn a lot of how we thought about conspiracies,” said Gordon Pennycook, a psychology professor at Cornell University and an author of the study.

Until now, conventional wisdom held that once someone fell down the conspiratorial rabbit hole, no amount of arguing or explaining would pull that person out.


***
This article has been archived by Conspiracy Resource for your research. The original version from The New York Times can be found here.