
ABC News accidentally broadcast a satanic ritual. Now it’s going viral on conspiracy theory TikTok

The ABC accidentally broadcast footage of a satanic ritual, and videos claiming the mistake legitimises baseless conspiracy theories, including those linked to QAnon, have gained millions of views on TikTok.


The ABC News Channel was running a story on a proposal for new penalties for hurting police animals when the broadcast was briefly interrupted by a clip of a satanic ritual and a man in a black robe saying “Hail Satan”.

The presenter at the time, Yvonne Yong, tweeted that the footage was from a story to do with the church of Satan that had run earlier that evening. The vision was from the Noosa Temple of Satan, an Australian group pushing for Satanism to be taught in schools.

American conspiracy theorists are now reposting the clip on TikTok and the app’s algorithm is making it go viral. Josh Roose, a senior research fellow specialising in extremism at Deakin University, said the clip feeds into false conspiracy narratives perfectly.

“It’s a core element of contemporary conspiracy theories that there exists a satanic cult or cabal of liberal elites, abusing children,” he told Hack.

“And TikTok has become a bit of a vector for conspiracy theories… so it was only a matter of time before it was reproduced on TikTok.”

The most viral iteration of the video on TikTok has more than 2.6 million views and was posted by an American user who promotes several conspiracy theories. The post has over 15,000 comments, with many commenters believing the video proves a baseless conspiracy theory.

In the post, the user, who has more than 18,000 followers, plays the ABC clip and says, “How do you make a mistake like that? How did they cut perfectly at that exact time?” Josh Roose said that’s a deliberate tactic used by conspiracy theorists to evade moderators.

“People who are preparing a lot of these ideas online have become quite skilled at navigating TikTok’s requirements, what they can and can’t say, how they demonstrate it, and so on,” he said.

“A more blatant example [of spreading conspiracy theories], might be reported very quickly and then taken down reasonably quickly… by circumventing that you can actually ensure that your content stays up online for a lot longer, so I imagine this is what this person has chosen to do.”

Hack has previously reported that conspiracy videos about Australia’s COVID lockdowns were going viral on TikTok. On Tuesday, the largest independent study of hate on TikTok was released, finding that anti-Asian and pro-Nazi videos are racking up millions of views, often using pop songs to evade the platform’s auto-moderators.

TikTok has said it removes the vast majority of content that violates its policies within 24 hours, but researchers from the Institute for Strategic Dialogue, who studied extremism on TikTok, found that simple measures, like misspelling a word, were enough to hide a video from moderators.
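
How little it takes to slip past an exact-match filter can be seen in a minimal, hypothetical sketch: the blocklist terms and logic below are invented for illustration and are not TikTok’s actual moderation system, which is far more sophisticated and not public.

```python
# Hypothetical sketch of a naive exact-match keyword filter.
# The blocklisted terms are invented for illustration; this is not
# TikTok's moderation system.

BLOCKLIST = {"qanon", "satanic ritual"}  # assumed example terms

def is_flagged(caption: str) -> bool:
    """Flag a caption only if it contains an exact blocklisted term."""
    text = caption.lower()
    return any(term in text for term in BLOCKLIST)

print(is_flagged("qanon was right all along"))   # True  -> caught
print(is_flagged("q4non was right all along"))   # False -> one misspelling slips past
```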

A pattern of dangerous content going viral

A joint investigation by triple j Hack and Four Corners found the TikTok algorithm is exposing Australians to dangerous content through the app’s For You Page, an endless stream of videos curated for each user, which allows content to go viral. The more a user “likes” videos, follows accounts, or watches a video to the end, the more the algorithm learns, in theory, what that user wants to see.
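
TikTok has not published how the For You Page ranks videos, but the engagement signals described above can be pictured as feeding a simple weighted interest score. The Python sketch below is a hypothetical illustration; the signal names and weights are assumptions, not the platform’s actual algorithm.

```python
# Hypothetical sketch of engagement-weighted scoring for a "For You"-style feed.
# Signal names and weights are assumptions made for illustration only.

from dataclasses import dataclass

@dataclass
class Engagement:
    liked: bool              # user tapped "like"
    followed_creator: bool   # user followed the account
    watch_fraction: float    # 0.0-1.0, how much of the video was watched

def interest_score(signals: Engagement) -> float:
    """Combine engagement signals into a single score a recommender could learn from."""
    score = 0.0
    if signals.liked:
        score += 1.0
    if signals.followed_creator:
        score += 2.0
    score += 3.0 * signals.watch_fraction  # watching to the end is weighted heavily
    return score

# A video that is liked and watched to the end scores highly,
# so similar videos get surfaced more often.
print(interest_score(Engagement(liked=True, followed_creator=False, watch_fraction=1.0)))  # 4.0
```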

Josh Roose said TikTok is struggling to regulate dangerous content, including videos promoting conspiracy theories.

“TikTok, of all platforms, has definitely fallen behind what we’re seeing in terms of Facebook and other social media companies,” he said.

“On the one hand, it’s because the content is so quickly evolving, it’s very difficult for them to regulate that and they can have thousands of posts along a similar vein. It is hard for them to police that.

“However, once it’s actually identified, they do need to act quickly, in terms of being a good corporate citizen, and contributing to the countries in which they actually do business.”

Hack has approached TikTok for comment. The company’s community guidelines state that it does not “permit misinformation that causes harm to individuals, our community, or the larger public regardless of intent”.

The app has said it works with fact checkers to remove content and accounts which violate community guidelines.
