“I no longer believe 9/11 was a terrorist attack”: Working as a Facebook content moderator messes with people’s minds
Casey Newton’s unprecedented investigation into the contract workers who moderate content on Facebook in America is full of horrifying, dystopian tidbits. Horrifying like: Workers are getting PTSD from repeatedly watching videos of murders in which the victims cry for their mothers as they die.
The Verge has summarized some of Newton’s findings in a TL;DR sidebar (for instance, “Employees can be fired after making just a handful of errors a week, and those who remain live in fear of former colleagues returning to seek vengeance. One man we spoke with started bringing a gun to work to protect himself”). But here I’ll focus on just one aspect: the fact that some of these content moderators are becoming radicalized themselves, beginning to believe the conspiracy theories that they are paid to flag. From the piece:
The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”
And:
One [quality assurance manager] often discussed his belief that the Earth is flat with colleagues, and “was actively trying to recruit other people” into believing, another moderator told me. One of Miguel’s colleagues once referred casually to “the Holohoax,” in what Miguel took as a signal that the man was a Holocaust denier.
Conspiracy theories were often well received on the production floor, six moderators told me. After the Parkland shooting last year, moderators were initially horrified by the attacks. But as more conspiracy content was posted to Facebook and Instagram, some of Chloe’s colleagues began expressing doubts.
“People really started to believe these posts they were supposed to be moderating,” she says. “They were saying, ‘Oh gosh, they weren’t really there. Look at this CNN video of David Hogg — he’s too old to be in school.’ People started Googling things instead of doing their jobs and looking into conspiracy theories about them. We were like, ‘Guys, no, this is the crazy stuff we’re supposed to be moderating. What are you doing?’”
This phenomenon is not limited to the contract workers of Cognizant, the firm Facebook outsources moderation to; it is a known problem for people who spend a lot of time studying conspiracy theories and disinformation online. Kate Starbird, an assistant professor at the University of Washington and director of its Emerging Capacities of Mass Participation (emCOMP) Laboratory, spoke about this at Harvard last year. “When you do this kind of thing, you should keep a rope around your ankle and have someone to pull you back up,” she said. “This is a really disorienting part of the Internet.” People on her team found themselves beginning to believe some of the conspiracy theories that they were reading about; it was something everybody had to be vigilant about.
Starbird spoke about the effects — on her, her research partners, and students — of reading so much of this kind of content. It’s “very effective and extremely disorienting,” she said. “We’ve had trouble even reporting on this data because we’re so confused about what’s going on and it’s so hard to make sense of things…This way of thinking, once you get into it, is very sticky. Once you’re down there, it’s so hard to get out, and it’s hard to think of solutions.”
While Cognizant provides some meager mental health support to its employees — and, uh, nine minutes per day of “wellness time” for when you get too traumatized, which Muslim employees were not allowed to use to pray — neither it nor Facebook appears to be addressing the concern that this work is generating a new crop of conspiracy theorists: the moderators themselves.
“they did not appear to be aware of this issue when i raised it to them, nor did Facebook”
— Casey Newton (@CaseyNewton) February 25, 2019
Read the full piece here.