Twitter accounts tied to China lied that COVID came from Maine lobsters
Twitter accounts linked to China were found spreading misinformation about the origins of COVID-19, including the false claim that the virus arrived in Wuhan in a shipment of Maine lobsters.
Oxford researcher Marcel Schliebs first noticed the misinformation campaign when he saw a tweet from Zha Liyou, the Chinese consul general in Kolkata, India.
Schliebs studies disinformation, propaganda and divisive political news content in the U.K. online information ecosystem at the Oxford Internet Institute. He linked the tweet to hundreds of Twitter accounts, some real and some fake, all of them spreading pro-China misinformation.
Zha's tweet read: “Major suspect of covid via cold chain identified: A MU298 of Nov. 11, 2019 carrying food from Maine, US to Huanan Seafood Market, Wuhan, Hubei via Shanghai. During the next few weeks, many workers around moving this batch of seafood got infected.”
These narratives spread by China-linked accounts are nothing new, according to Kathleen Hall Jamieson, director of the Annenberg Public Policy Center at the University of Pennsylvania.
“Early in the pandemic, Chinese sources spread the theory that SARS CoV-2 originated at Fort Detrick and was spread to China by U.S. military,” says Jamieson. “The platforms can remove it, or if they decide against doing so, can downgrade it or flag it and attach fact-checking content.”
Schliebs echoes similar insights from his Oxford research.
“Almost since the beginning of the outbreak, the question of the origin of COVID has been of core importance to the Chinese propaganda apparatus,” Schliebs says. “This coordinated operation was clearly trying to promote narratives in line with Beijing’s general propaganda strategy and geopolitical objectives.”
China’s historical disinformation campaigns: China spread disinformation videos on Uyghur Muslims two years ago. YouTube let them stay up.
Twitter has imposed strict rules around COVID-19 misinformation, stating in its rules and policies that demonstrably false or misleading content may be removed. Certain posts may instead be labeled as misinformation, and repeat offenders face permanent suspension of their accounts.
The violations include attempts to “invoke a deliberate conspiracy by malicious and/or powerful forces,” according to Twitter’s guidelines.
Misinformation can have a powerful effect and impact how people respond to public health guidance.
“Acceptance of misinformation and/or conspiracy theories is associated with a reduced likelihood to mask or vaccinate,” Jamieson says.
COVID-related misinformation: COVID, vaccine misinformation spread by hundreds of websites, analysis finds
Kai Yan, a spokesperson for the Chinese Embassy in the U.K., told NBC News that China urges “all members of the international community to work together in opposing and resisting such disinformation, which will inevitably disrupt global cooperation in fighting the pandemic.”
After Schliebs reported his findings to Twitter, the company suspended the accounts tied to the misinformation campaign.
“We notified Twitter last week, and they were very responsive and suspended the accounts very rapidly within a few hours. Fortunately, we detected the campaign as it was still in its early growth phase and before it could really start to reach and impact real genuine audiences,” Schliebs told USA TODAY.
The fight against myths and misinformation on social media
With disinformation spreading on social media, platforms can take an increasingly active role, according to Schliebs.
“Platforms can and should continuously monitor suspicious behavior particularly around sensitive geopolitical issues like the origin of COVID-19,” Schliebs says. “To do so and detect coordinated networks of fake accounts, they can for example monitor whether there are patterns in the language or timing of tweets that raise red flags of suspicious coordination.”
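Schliebs does not describe a specific implementation, but a minimal sketch of the kind of timing-and-text heuristic he mentions might look like the Python snippet below. The sample records, thresholds and function name are illustrative assumptions, not details from the Oxford analysis.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative (account, timestamp, text) records only -- not data from the Oxford study.
tweets = [
    ("acct_a", "2021-11-20 08:00:03", "Major suspect of covid via cold chain identified."),
    ("acct_b", "2021-11-20 08:00:05", "Major suspect of covid via cold chain identified."),
    ("acct_c", "2021-11-20 08:00:09", "Major suspect of covid via cold chain identified."),
    ("acct_d", "2021-11-21 14:32:10", "Unrelated post about the weather."),
]

def flag_coordination(records, window_seconds=60, min_accounts=3):
    """Flag identical texts posted by several distinct accounts within a short time window."""
    by_text = defaultdict(list)
    for account, ts, text in records:
        by_text[text].append((account, datetime.fromisoformat(ts)))

    flagged = []
    for text, posts in by_text.items():
        accounts = {account for account, _ in posts}
        times = sorted(t for _, t in posts)
        if len(accounts) >= min_accounts and (times[-1] - times[0]).total_seconds() <= window_seconds:
            flagged.append((text, sorted(accounts)))
    return flagged

for text, accounts in flag_coordination(tweets):
    print(f"Possible coordination by {accounts}: {text!r}")
```

In practice, platforms combine many more signals than this toy example captures, but the basic idea is the same: identical or near-identical wording posted by clusters of accounts within seconds of one another is a red flag for coordination.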
Research from the University of Pennsylvania’s Social Policy Lab found that misinformation takes hold far more easily than efforts to undo it. In the data the researchers gathered, misinformation was accepted as fact 99.6% of the time, while attempts to correct it succeeded in only 83% of cases.
The researchers also found that people who trust science can be more susceptible to misinformation framed in scientific-sounding language, because pseudoscience often mimics the terminology of real studies.
The Social Policy Lab recommends succinct corrections to misinformation rather than detailed ones, which were found to be less effective. It also noted that interacting with real people, such as family and friends, tended to reduce vaccine hesitancy.
Michelle Shen is a Money & Tech Digital Reporter for USA TODAY. You can reach her @michelle_shen10 on Twitter.