
Shadowy Russian actors spread Princess Kate conspiracies, analysis finds

LONDON — Social media accounts linked to a prominent Russian disinformation campaign were all too happy to capitalize on conspiracy theories about the whereabouts of Kate, Princess of Wales, according to an analysis by British security experts. 

The role played by these shadowy Russian actors may serve as an alarming test case, experts said, in a year when elections in Washington and Europe will be buffeted by the long-standing fake news threat — which is now being supercharged by artificial intelligence.

However, clear as the malicious foreign involvement in the #KateGate conspiracy was, the researchers at the Security, Crime and Intelligence Innovation Institute at Cardiff University in Wales were quick to point out that these actors did not originate the rumors and conspiracy theories that swirled around the princess before she revealed last week that she was being treated for cancer.

“It’s not as though these Russia-linked accounts were driving the story; they were jumping on it,” said Martin Innes, the institute’s director. “It was already being framed in conspiracy terms, so foreign actors don’t need to set that frame — that’s already there to exploit.”

Conjuring these theories was usually the work of Western influencers with high follower counts, along with the regular social media users who engaged with them. While some cracked jokes and posted memes, others took a more sinister tone as people speculated about Kate’s whereabouts. Traditional media played its own role in the feedback loop by amplifying and prolonging the circus.

But Innes and his colleagues said they identified 45 accounts posting about Kate on X that bore the hallmarks of the Russian disinformation campaign known as Doppelgänger. For the researchers who have spent years analyzing this sort of traffic, telltale signs included the accounts’ usernames and the fact that they had apparently been created in batches and were all using the same wording. Some were easy to mark out because they posted pro-Russia or anti-Ukraine content.

The campaign’s aims are twofold, Innes said. First, use the traffic spike associated with Kate to disseminate pro-Russia content, often related to its war in Ukraine. Second, sow discord.

“It’s about destabilization. It’s about undermining trust in institutions: government, monarchy, media — everything,” he said. “These kinds of stories are ideal vehicles by which they do that.”

Doppelgänger was first identified in 2022 by EU DisinfoLab, a nonprofit group of experts based in Europe that investigates the spread of disinformation online. In the past, this “Russia-based influence operation network” has worked by cloning the websites of traditional media companies, posting fake articles and promoting them on social media, EU DisinfoLab says on its website. The technology has likely become more sophisticated since then.

“These are not groups that are part of the state security services, as has happened with other operations,” Innes said. Rather, this campaign is run by “commercial firms who are getting contracts from the Kremlin.”

NBC News has emailed the Kremlin and the Russian Foreign Ministry for comment. 

Britain’s Telegraph newspaper also reported this week that Russia might not be the only country involved. Citing anonymous government sources, it reported that China and Iran were also fueling disinformation related to the princess.

The Russia-linked accounts did not come up with their own conspiracy theories about Kate, but rather replied to existing posts, often, though not exclusively, with pro-Russia, anti-Ukraine content, Innes said. The researchers focused on X because of their ability to collate and analyze its posts quickly. But that may be only the tip of the iceberg.

“For independent researchers, getting a good view into TikTok is really difficult,” he said. “But just to kind of give you some sense of scale, we’ve done a bit of research and the #KateGate story had 14 billion views in one month.” These were overall views and not only those linked to Russian accounts.

The already rich ecosystem of conspiracy theories, hardly dampened by blanket coverage in traditional media, gave the accounts an ample canvas on which to work.

The story was “a perfect cocktail in terms of the things that you need for conspiracy theories to thrive,” said Sander van der Linden, a psychology professor at the University of Cambridge who researches why people are influenced by misinformation.

The royal family has always been the target of conspiracy theories suggesting they are somehow “conspiring behind the scenes and plotting nefarious goals,” van der Linden said. He added that the edited photo of Kate and her three children that Kensington Palace released earlier this month had played right into this mindset.

Added to the mix are declining global trust in institutions such as the media and governments, a “mass panic about AI and manipulated news and imagery online,” and the “newer development” whereby “everyone with a social media account feels that they can be their own sleuth, uncovering details and having fun playing investigator online,” he said.

These factors are all a big worry for experts in a year that will see a presidential election in the United States, as well as votes in the European Union, India and elsewhere.

The Russian actors “are seeing right now that this can be hugely successful,” van der Linden said. “They just wait for a controversial issue, then massively amplify it. So this could be a sort of training phase for them almost, to see how they would do it during an actual election.”
