Anti-vax groups on Facebook are using the carrot emoji to spread misinformation
To dodge crackdowns on the spread of COVID-19 misinformation, anti-vax groups on Facebook have begun coding their messages with the carrot emoji, according to a report from the BBC. The emoji is used in place of the word “vaccine” to avoid the wrath of Facebook’s automated moderation algorithms.
The BBC reports that these groups often share unverified claims of people being killed or injured by the COVID-19 vaccine. One group with over 200,000 members states in its rules that members must “use code words for everything” and that posters cannot “use the c word, v word or b word ever” (COVID, vaccine, booster).
According to the BBC, the algorithm the Meta-owned platform uses for moderation tends to focus on words, not images. This is unsurprising news. Back in July 2021, a Bloomberg report detailed how social media algorithms perform poorly in detecting abuse through emojis.
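To see why, here is a minimal sketch of a word-level filter, written in Python purely for illustration (the blocklist and example posts are assumptions, not Meta’s actual system). A post that swaps “vaccine” for a carrot simply never matches the blocklist:

```python
# Illustrative sketch of a naive word-level moderation filter.
# The blocklist is an assumption for demonstration, not Meta's system.
BLOCKED_TERMS = {"vaccine", "covid", "booster"}

def flags_post(text: str) -> bool:
    """Flag a post if any whitespace-separated token matches a blocked term."""
    tokens = (token.strip(".,!?").lower() for token in text.split())
    return any(token in BLOCKED_TERMS for token in tokens)

print(flags_post("My sister was hospitalized after the vaccine"))  # True
print(flags_post("My sister was hospitalized after the đ„"))        # False: the emoji slips through
```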
Marc Owen Jones, a disinformation researcher and associate professor at Hamad Bin Khalifa University in Qatar, was invited to join one such group and shared the group’s “very odd” attempt to evade censorship. “Initially I was a little confused,” Jones said in a tweet. “And then it clicked – that it was being used as a way of evading, or apparently evading, Facebook’s fake news detection algorithms.”
“My sister, 57, rushed to the hospital with breathing problems. She has two đ„đ„ and the bđ„,” one poster wrote. Another wrote, “My uncle, 55, brain tumor after đ„đ„.” Based on Jones’ screenshots, a common theme in these posts is users blaming health problems associated with getting older on the COVID vaccine.
Users in that thread also pointed out some other emojis that anti-vax groups would use, like the đș emoji (booster / booze-ster) or the đ emoji back when the CDC started allowing kids to be vaccinated.
A cursory search on Twitter for “đ„ covid” brings up hundreds of tweets in French posted by users who have adorned their display names with the carrot. A Google translation of some of these tweets shows users questioning the validity of the COVID-19 vaccine measures put in place by French President Emmanuel Macron.
Social Media’s Auto-Moderation Problem
The use of emojis as code for something more sinister is not new. And not in a darkly funny crab emoji way but in an “I’m trying to be slick about my bigotry” kind of way.
Social media platforms like Facebook and Twitter have come under fire in the past for their lackluster responses to racist abuse directed at Black soccer players. Internet trolls would post monkey and banana emojis, imagery long associated with racist stereotypes of Black people.
Another way bad actors get away with spreading falsehoods and hate across the internet is word camouflage. In a research study, Ana Romero-Vicente, a researcher with EU DisinfoLab, describes how the technique works: keywords are subtly tweaked so that they remain “understandable for users while remaining undetected to social networks’ content moderation systems.” For example, “v4c11ne” would mean “vaccine.”
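Countermeasures exist, though they are brittle. The sketch below, using an assumed substitution map and term list rather than anything from the EU DisinfoLab study, undoes common character swaps and collapses repeated letters before matching:

```python
import re

# Illustrative counter to simple word camouflage: undo common character
# substitutions, then compare against a blocklist. The substitution map
# and terms are assumptions for demonstration, not EU DisinfoLab's method.
SUBSTITUTIONS = str.maketrans({"4": "a", "3": "e", "1": "i", "0": "o", "@": "a", "$": "s"})
BLOCKED_TERMS = {"vaccine", "covid", "booster"}

def collapse_repeats(word: str) -> str:
    """Reduce runs of the same character ("vaciine" -> "vacine")."""
    return re.sub(r"(.)\1+", r"\1", word)

# Pre-collapse the blocklist so "vaccine" and the de-camouflaged "v4c11ne"
# (which becomes "vaciine") both reduce to the same form, "vacine".
COLLAPSED_TERMS = {collapse_repeats(term) for term in BLOCKED_TERMS}

def is_camouflaged_match(token: str) -> bool:
    normalized = collapse_repeats(token.lower().translate(SUBSTITUTIONS))
    return normalized in COLLAPSED_TERMS

print(is_camouflaged_match("v4c11ne"))  # True
print(is_camouflaged_match("vaccine"))  # True
print(is_camouflaged_match("carrots"))  # False
```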
Romero-Vicente states that tackling word camouflage is a complex task: blocklists on social media must be constantly re-evaluated and optimized to strike a delicate balance between catching misinformation and leaving untouched content that doesn’t violate the rules.
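That balance problem is easy to demonstrate. In this hypothetical example, a regex loose enough to catch camouflaged spellings of “vaccine” flags a rule-abiding post just as readily as a misinformation post, since keyword matching can’t see the claim around the word:

```python
import re

# Hypothetical example of the trade-off: a pattern broad enough to catch
# camouflaged spellings of "vaccine" also matches posts that don't violate
# any rules. The pattern and posts are illustrative assumptions.
VACCINE_PATTERN = re.compile(r"v[a4@]c{1,2}[i1!]n[e3]", re.IGNORECASE)

posts = [
    "the v4cc1ne is killing people",       # camouflaged misinformation
    "the vaccine is safe and effective",   # legitimate health info, flagged anyway
]
for post in posts:
    print(f"{post!r} -> flagged: {bool(VACCINE_PATTERN.search(post))}")
```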
Facebook, for its part, is actively attempting to shut down groups that try to spread vaccine misinformation. The platform’s Help Center states that it will remove “Claims that COVID-19 vaccines are experimental if the context of the claim also suggests that vaccinated people are taking part in a medical experiment,” and “Claims that COVID-19 vaccines kill or seriously harm people.”
This talk of moderating COVID content may be moot, however. Nick Clegg, Meta’s president of global affairs, wrote in July that he questioned whether “the time is right for us to seek input from the Oversight Board about our measures to address COVID-19 misinformation, including whether those introduced in the early days of an extraordinary global crisis remains the right approach for the months and years ahead.” In other words, Meta may soon stop trying to take down anti-vax content altogether.