Misinformation And Conspiracy Theories Can Spread In COVID-19 Patient Groups : Shots – Health News
For decades, people struggling with illnesses of all kinds have sought help in online support groups, and during 2020, such groups have been in high demand for COVID-19 patients, who often must recover in isolation.
But the fear and uncertainty regarding the coronavirus have made online groups targets for the spread of false information. And to help fellow patients, some of these groups are making a mission of stamping out misinformation.
Shortly after Matthew Long-Middleton got sick on March 12, he joined a COVID-19 support group run by an organization called Body Politic on the messaging platform Slack.
“I had no idea where this road leads, and so I was looking for support and other theories and some places where people were going through a similar thing, including the uncertainty, and also the thing of like, we have to figure this out for ourselves,” says Long-Middleton, 36, an avid cyclist who lives in Kansas City, Missouri. His illness started with chest discomfort, then muscle weakness, high fever, loss of appetite and digestive problems.
But with the support came misinformation. Group members reported taking massive amounts of vitamins (including vitamin D, which can be harmful in excess) or trying other home remedies not backed by science.
Experts warn that such false or unverified information spread on online support groups can not only mislead patients, but also potentially undermine trust in science and medicine in general.
“Even if we’re not actively seeking information, we encounter these kinds of messages on social media, and because of this repeated exposure, there’s more likelihood that it’s going to seep into our thinking and perhaps even change the way that we view certain issues, even if there’s no real merit or credibility,” says Elizabeth Glowacki, a health communication researcher at Northeastern University.
In an effort to help fellow COVID-19 sufferers, some patients, like Vanessa Cruz, spend most of their days fact-checking their online support groups.
“It’s really become like a second family to me and being able to help everybody is a positive thing that comes out of all this negativity we’re experiencing right now,” Cruz says.
Cruz, a 43-year-old mother of two, moderates the Facebook COVID-19 support group "have it/had it" from her home in the Chicago suburbs. She's also a "long-hauler" who has been dealing with COVID-19 symptoms, including fatigue, fever and confusion, since March.
The worldwide group has more than 30,000 members and has recently been buzzing with reports from India about treating COVID-19 with a common tapeworm medication (it’s not FDA-approved and there’s little evidence it works) — as well as speculation about President Trump’s recent diagnosis.
Other troubling posts include people pushing hydroxychloroquine, which has not been proven effective in treating COVID-19, and sharing the viral video "America's Frontline Doctors," which promotes other unproven treatments and spreads conspiracy theories.
Cruz says supporting fellow patients is a tricky balance between getting the facts right and giving people who are scared the chance to be heard.
“It’s like you really don’t know what to question, what to ask for, how to reach for help,” Cruz says. “Instead of doing that, they just, they write up their story, basically, and they share it with everybody.”
To keep the group evidence-based, it has built up a 17-person fact-checking team, including two nurses and a biologist, that reviews every post that goes up.
However, many online COVID-19 groups don’t have the resources or strategy to address misinformation.
Mel Montano, a 32-year-old writing instructor who lives in New York and has also been sick since March, says she left a large Facebook support group because she was frustrated by the conspiracy theories that filled its posts.
“All of these conflicting theories completely took away from the focal point of it,” Montano explains. “It was a mess. It was [like] being on one of those conspiracy theory pages or channels, and it was just not for me.”
Montano is now a moderator of the Body Politic group on Slack.
Facebook and Twitter have made changes in their approaches toward COVID-19 misinformation, including additional fact checking, removing posts that contain falsehoods and removing users or groups that spread them.
However, critics say more changes are needed.
Fadi Quran, campaign director of Avaaz, a human rights group that focuses on disinformation campaigns, says Facebook needs to revise the way it prioritizes content.
“Facebook’s algorithm prefers misinformation, prefers the sensational stuff that’s going to get clicks and likes and make people angry,” Quran says. “And so the misinformation actors, because of Facebook, will always have the upper hand.”
A study by Avaaz showed that misinformation and disinformation had been viewed on Facebook four times as often as information from official health groups, like the World Health Organization.
Facebook did not respond to inquiries for this story.
COVID-19 patient Matthew Long-Middleton thinks the problem goes deeper than getting the data right. He says a lot of bad information spreads because patients so badly want to find ways to feel better.
After nearly six months of symptoms, Long-Middleton says he’s returned to better health in the past month, though he continues to check in on fellow support group members who are still struggling.
He never tried risky treatments discussed in the group himself, but he understands why someone might.
“You want to find hope, but you don’t want the hope to lead you down a path that hurts you,” he says.
This story came from a reporting partnership between NPR, Kaiser Health News and KCUR.