
Russian Disinformation Targets Aid Workers in Syria and Ukraine

In April, I was volunteering with World Central Kitchen along Poland’s border with Ukraine. A barbecue chef from North Carolina stood next to me in the warehouse as we unpacked bread for sandwiches. “I heard the International Red Cross is kidnapping Ukrainians and taking them into Russia,” he said confidently.

The chef said that earlier, at a nearby cafe, a man went from table to table telling diners the humanitarian organization was forcibly deporting Ukrainians to Russia. Social media users posted and shared tweets about the purported kidnappings, with mentions of the Red Cross peaking over a three-day period at the end of March. In May, a video circulated on Telegram claiming to show that the Red Cross had collected thousands of Ukrainian children’s medical records and might be involved in organ trafficking. Pro-Russian media outlets broadcast the video, and the Russian government said it would investigate. The claims, of course, were nonsense. The International Committee of the Red Cross says such rumors stem from a massive online campaign of targeted attacks using disinformation to discredit its work.

It’s a dynamic that started back in Syria, where Russian intelligence targeted the White Helmets, a rescue group, and the Red Cross. Humanitarian organizations shine a light on the human toll of war and raise global awareness and understanding of communities’ needs. In doing so, they provide an unvarnished, firsthand look at facts on the ground in conflict zones and are intrinsically threatening to the narratives of autocratic regimes.

When I first spoke to James Le Mesurier, the founder of the White Helmets, in 2018, he told me his biggest challenge was Russian disinformation targeting his work. He had trained his teams of volunteers in urban search and rescue and medical evacuation during the Syrian war. Le Mesurier also provided them with video cameras affixed to helmets so he could watch and provide further instruction on their operations. Through filming their rescue efforts in Syria, the White Helmets became potential witnesses to war crimes, spurring Russia’s efforts to discredit their work through online disinformation.

In Ukraine, humanitarian organizations are struggling to reach vulnerable communities in the east, where the fighting is worst, while simultaneously battling narratives that are spreading widely online and particularly prevalent among the Russian-speaking Ukrainians they are attempting to help.

“For me, it looks very similar to the kinds of narratives and the kinds of websites that were spreading content targeting the White Helmets,” said Kate Starbird, an associate professor at the University of Washington and a co-founder of the Center for an Informed Public. “Probably the Venn diagram is almost a perfect circle between the kinds of entities that were pushing the disinformation campaign against the White Helmets and that are supporting the pro-Russian narratives in Ukraine.”

Russian state news organizations, media outlets presenting themselves as independent and alternative, and messages circulating on Telegram share these narratives. Starbird said that “niche and clickbait sites that traffic in conspiracy theories” then pick up the content and distribute it more widely. This ecosystem has published stories about U.S.-funded biolabs and about Nazis wielding influence in Ukraine, with the aim of discrediting the West and boosting domestic support in Russia for the invasion. Starbird says the verbiage has changed but the tactics are the same: “In Syria they would smear them by saying they are working with terrorists, they are terrorists. [In Ukraine] they would switch Nazis for terrorists there.”

“The information terrain is fundamentally changed for future conflict,” said Jacob Kurtzer, a director and senior fellow with the Humanitarian Agenda at the Center for Strategic and International Studies. Narratives are now driving conflict, and social media has created an imperative to decentralize communications, making press releases antiquated and ineffective. This is having a long-term impact on how humanitarian organizations operate. In Ukraine, they are trying to match the pace of modern warfare, become more proactive and assertive about their public profile, and state more quickly and clearly what they are doing and where.

Civilians are helping efforts at transparency through contributing open-source intelligence from the ground. Inga Trauthig, a research manager and senior fellow in the Propaganda Research Lab at the University of Texas at Austin’s Center for Media Engagement, says that unlike in Syria, Ukrainians were primed for Russian efforts to shape news and information in their communities. “Russia has been such a big player in the disinformation space for years, going back to traditions of the Soviet Union of campaigns of disinformation/propaganda, etc. I don’t think the Ukrainians ever outgrew their sense of being under Russia’s umbrella,” Trauthig said. Literate in English and plugged into the Western internet, many Ukrainians can tell their own stories, and they are aggressively contesting the battle space of information by uploading photos and videos of the conflict to social media, the web, and encrypted messaging apps.

Ukraine is different from Syria because of the greater speed with which disinformation is flowing, the volume of open intelligence available to counter it, and the general failure—so far—of Russian narratives to take hold outside of Russia itself. The cohesive response from Kyiv also helps. As a state actor, Ukraine has more control over the flow of information and is backing civilians and humanitarian organizations in their efforts to counter disinformation. “The Ukrainians are immediately uploading every time they take out a tank. It’s up on Twitter within an hour. This can be calibrated against what was observed on satellite imagery,” Kurtzer said.

In Syria, where Russia was fighting nonstate actors, the power dynamic was very different. Ismail Alabdullah, a volunteer with the White Helmets since 2013, said: “The key difference is that in Ukraine, it’s a matter of one country invading another country, whereas in Syria, it’s a government attacking and killing its own people. Therefore, the Syrian people have had to rely on themselves, on grassroots movements.” The complexities of Syria made Russian narratives discrediting specific groups easier to sell, and limited Western attention on Syria allowed them to flourish.

Russian disinformation campaigns linked humanitarian groups with Islamist extremists and claimed they helped stage chemical weapons attacks in Syria—narratives that achieved wide online circulation, support from some British academics, and amplification by celebrities such as Roger Waters of Pink Floyd. Disinformation campaigns also portrayed Syrian refugees as a threat to European society. In 2016, Russian news outlets reported that refugees from the Middle East had gang-raped a young girl in Germany. A German police investigation later found the girl had not been abducted or raped, but the story had already gained traction on social media, sparking protests in Berlin against then-Chancellor Angela Merkel’s refugee policy.

During the Russia-Ukraine war, tech companies have taken more aggressive action to stem disinformation by banning Russian state media and advertisements from their platforms, offering increased data protection for Ukrainians, and adjusting content moderation guidelines. Emerson Brooking, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab, attributes this to greater awareness. “There weren’t widespread conversations about disinformation in 2015 [during the Syrian war]. There weren’t these huge policy teams focused on the subject,” he said. Brooking also points to deep bias in Western media and the tech industry as a driver of the more energetic response to disinformation this time around. “Ukrainians are white. They are European,” he said. “They have never been otherized the way that Syrians were or Arabs or people of the Middle East.”

It’s possible that some of that response may help with the information fight in Syria, where Alabdullah continues to work in the field and with the White Helmets’ rescue and response operations. He says technology companies should do more: “It is necessary to build a methodology for tracking accounts that deliberately spread false and misleading information on an ongoing basis and in particular that may justify targeting civilians or humanitarian workers.”

But the challenge isn’t limited to these two conflicts. As social media, humanitarian work, and state actors increasingly intersect, future wars are likely to see other attempts to smear aid groups. “Media and everyday people and our political leaders that are making decisions about things need to be more savvy about this content and how it gets to them and even how they are reading content,” Starbird said. “Even our elected leaders need to be more careful about the content they are reading because you can see these narratives come out through statements from certain political leaders in the U.S. and U.K.”

Failure to do so will affect not just humanitarian aid in conflict zones but accountability for the future. Disinformation’s power to erode confidence in reporting and facts on the ground inhibits humanitarian groups from building and informing human rights cases after the war is over. “How do those on-the-ground realities get translated into political, diplomatic, economic outcomes?” Kurtzer said. “That’s where the contestation around information becomes increasingly important.” Reporting from war-torn regions, and from places where the facts of that devastation are contested, will remain a challenge for humanitarian organizations.

Starbird says a more immediate goal of Russia’s influence efforts is to reduce international assistance to Ukraine: “Any Ukrainian allies, they want to demotivate those countries from taking action by distracting and pushing the citizens to take away that mandate.” Russia does this, Starbird says, by creating doubt and eroding citizens’ resolve to help. Russia is also trying to build support for a resolution of the conflict on terms favorable to Moscow.

Russia’s failings may not last. Last month, the cybersecurity firm Recorded Future reported that the Security Service of Ukraine had intercepted an unverified note from Russia’s Federal Security Service on June 5 discussing Russian information operations failures thus far and providing “recommendations for influence efforts moving forward.” The note advises targeting European communities with information linking deterioration in living standards in the European Union to support for the war in Ukraine and proposes storylines including “arming Ukraine at the expense of European taxpayers” and “forecasts about the number of Ukrainian refugees and the created burden on the budget and socio-economic infrastructure” with a goal of provoking internal “public pressure on the governments and political elites of Western countries.”

Brooking believes Russia will gain narrative ground. “We are at a zenith of support for Ukraine,” he said. Media organizations have withdrawn resources and shifted their attention to other parts of the world. As international news coverage wanes, creating an information gap, propaganda is likely to become more powerful, impacting humanitarian aid and shaping global discourse and Ukraine’s future.
