YouTube will temporarily ban accounts for posting election fraud videos
After white nationalist rioters stormed the US Capitol yesterday, YouTube removed a video in which President Trump expressed support for the rioters and repeated the false claim that he won the 2020 presidential election. Today, YouTube is escalating its crackdown on election misinformation in light of the chaos: any channel that posts videos spreading false claims about the election will now immediately receive a “strike.” A first strike effectively blocks a channel from YouTube for a week, preventing it from uploading any videos, stories, or live events.
1. Due to the disturbing events that transpired yesterday, and given that the election results have now been certified, starting today *any* channels posting new videos with false claims in violation of our policies will now receive a strike. https://t.co/aq3AVugzL7
— YouTubeInsider (@YouTubeInsider) January 7, 2021
Usually, YouTube gives first-time offenders a warning, but that practice has been suspended in light of yesterday’s events. In December, the platform announced it would remove videos alleging “widespread fraud or errors” in the 2020 election, and the company says it has already taken down thousands of videos under that policy. As it typically does with policy changes, YouTube included a grace period during which rule-breaking content would be removed without a strike; that grace period was set to end on Inauguration Day, January 20th. YouTube has now moved that end date up to today.
*** This article has been archived. The original version is from Engadget. ***