

How information became a weapon in Russia-Ukraine conflict

Russia’s invasion of Ukraine is the deadliest conflict in Europe since the Second World War, and the first to see algorithms and TikTok videos deployed alongside fighter planes and tanks.

The online fight has played out on computer screens and smartphones around the globe as Russia used disinformation, propaganda and conspiracy theories to justify its invasion, silence domestic opposition and sow discord among its adversaries.

Now in its second year, the war is likely to spawn even more disinformation as Russia looks to break the will of Ukraine and its allies.

Russian invasion of Ukraine (PA Graphics)

Here is a look at Russia’s disinformation war since the conflict began:

– Divide and conquer

The Kremlin’s propaganda efforts against Ukraine began many years ago and increased sharply in the months immediately before the invasion, according to Ksenia Iliuk, a Ukrainian disinformation expert who has tracked Russia’s information operations.

Russia tailored the messages for specific audiences around the world.

Rescue workers clear the rubble of a residential building destroyed by a Russian rocket in Pokrovsk, Ukraine (Evgeniy Maloletka/AP)

In Latin America, Russia’s local embassies spread Spanish-language claims suggesting its invasion of Ukraine was a struggle against western imperialism. Similar messages accusing the US of hypocrisy and belligerence were spread in Asia, Africa and other parts of the world with a history of colonialism.

Russia’s information agencies flooded Ukraine with propaganda, calling its military weak and its leaders ineffective and corrupt. But if the message was intended to reduce resistance to the invaders, it backfired in the face of Ukrainian defiance, Ms Iliuk said.

“Russian propaganda has been failing in Ukraine,” she said. “Russian propaganda and disinformation are indeed a threat and can be very sophisticated. But it’s not always working. It’s not always finding an audience.”

– Blame the victim

Many of Russia’s fabrications try to justify the invasion or blame others for atrocities carried out by its forces.

After Russian soldiers tortured and executed civilians in Bucha last spring, images of charred corpses and people shot at close range horrified the world. Russian state TV, however, claimed the corpses were actors, and that the devastation was faked. Associated Press journalists saw the bodies themselves.

Villagers, reflected in the windows of a van, wait in line to receive humanitarian aid and a medical examination in a mobile clinic in the village of Nechvolodivka, Ukraine (Vadim Ghirda/AP)

“When they realised that civilians were killed and injured, they changed the messaging, trying to promote the idea that it was a Ukrainian missile,” said Roman Osadchuk, a research associate at the Atlantic Council’s Digital Forensic Research Lab, which has tracked Russian disinformation since before the war began.

One of the most popular conspiracy theories about the war also had Russian help. According to the claim, the US runs a series of secret germ warfare labs in Ukraine – labs conducting work dangerous enough to justify Russia’s invasion.

Like many conspiracy theories, the hoax is rooted in some truth. The US has funded biological research in Ukraine, but the labs are not owned by the US, and their existence is far from secret.

The work is part of an initiative called the Biological Threat Reduction Programme, which aims to reduce the likelihood of deadly outbreaks, whether natural or manmade. The US efforts date back to work in the 1990s to dismantle the former Soviet Union’s programme for weapons of mass destruction.

– Extended whack-a-mole

As European governments and US-based tech companies looked for ways to turn off the Kremlin’s propaganda megaphone, Russia found new ways to get its message out.

Early in the war, Russia relied heavily on state media outlets such as RT and Sputnik to spread pro-Russian talking points as well as false claims about the conflict.

Russian President Vladimir Putin, left, and Russian defence minister Sergei Shoigu in Moscow (Mikhail Metzel, Sputnik, Kremlin Pool Photo via AP)

Russia then pivoted again to tap its diplomats, who have used their Twitter and Facebook accounts to spread false narratives about the war and Russian atrocities. Many platforms are reluctant to censor or suspend diplomatic accounts, giving ambassadors an added layer of protection.

After its state media was muzzled, Russia expanded its use of networks of fake social media accounts. It also evaded bans on its accounts by taking identifying features – such as RT’s logo – off videos before reposting them.

Some efforts were sophisticated, such as a sprawling network of fake accounts that linked to websites created to look like real German and British newspapers. Meta identified and removed that network from its platforms last autumn.

Others were far cruder, employing fake accounts that were easily spotted before they could even attract a following.

– Getting ahead of the claims

Ukraine and its allies scored early victories in the information war by predicting Russia’s next moves and by revealing them publicly.

Weeks before the war, US intelligence officials learned that Russia planned to carry out an attack that it would blame on Ukraine as a pretext for invasion. Instead of withholding the information, the government publicised it as a way to disrupt Russia’s plans.

A woman walks in a street in Borodyanka, north of Kyiv, Ukraine (Thibault Camus/AP)

The invasion prompted tech companies to try new strategies, too. Google, the owner of YouTube, launched a pilot programme in eastern Europe designed to help internet users detect and avoid misinformation about refugees fleeing the war. The initiative used short online videos that teach people how misinformation can trick the brain.

The project was so successful that Google now plans to roll out a similar campaign in Germany.

Ms Iliuk said she believes there is a greater awareness now, a year after the invasion, of the dangers posed by Russian disinformation, and a growing optimism that it can be checked.

“It is very hard, especially when you hear the bombs outside of your window,” she said. “There was this huge realisation that this (Russian disinformation) is a threat. That this is something that could literally kill us.”


***
This article has been archived for your research. The original version from Jersey Evening Post can be found here.