What Happened When Facebook Employees Warned About Election Misinformation
WHAT HAPPENED
1. From Wednesday through Saturday, a large volume of content circulated implying fraud in the election, amounting to roughly 10% of all civic content and 1-2% of all US VPVs. There was also a fringe of content inciting violence.
2. Dozens of employees were monitoring this, and FB launched ~15 measures prior to the election and another ~15 in the days afterwards. Most of the measures made existing processes more aggressive: e.g., by lowering thresholds, making penalties more severe, or expanding eligibility for existing measures. Some measures were qualitative: reclassifying certain types of content as violating that had not been before.
3. I would guess these measures reduced the prevalence of violating content by at least 2X. However, they had collateral damage (removing and demoting non-violating content), and the episode caused noticeable resentment among Republican Facebook users, who felt they were being unfairly targeted.