
YouTube will move more quickly to suspend channels posting videos claiming widespread voter fraud

[Image: YouTube CEO Susan Wojcicki speaks during the opening keynote address at the Google I/O 2017 Conference at Shoreline Amphitheater on May 17, 2017 in Mountain View, California. Justin Sullivan | Getty Images]

YouTube says it will now suspend any channel that posts new videos making false claims of widespread voter fraud, rather than first issuing a warning, as its previous policy did.

“Due to the disturbing events that transpired yesterday, and given that the election results have now been certified, starting today any channels posting new videos with false claims in violation of our policies will now receive a strike,” the company said in a statement Thursday.

Under YouTube’s regular policy, channels get one warning for posting false content before receiving a strike. A channel that receives a strike is suspended from posting or live streaming for one week; three strikes in the same 90-day period result in a permanent ban.

YouTube said that over the last month it has removed thousands of videos spreading misinformation that claimed widespread voter fraud changed the result of the 2020 election, including “several” videos President Trump posted to his channel.

YouTube removed a video President Trump posted Wednesday that made false claims about the results of the 2020 U.S. presidential election, but it has not yet issued any formal statement about banning his channel or blocking him from posting. A spokesperson did not respond to requests for comment.

YouTube is not going nearly as far as its competitors in cracking down on Trump. Facebook announced Thursday that it would take the unprecedented step of blocking Trump from posting at least until inauguration day, and maybe longer. Twitter blocked several Trump tweets containing false claims and put a 12-hour moratorium on new posts until he removed those tweets.

However, many have argued these changes are too little, too late; the public has long called for the company to take stronger action against conspiracy theories, which fueled the beliefs that led to this week’s violence.

Wednesday’s pro-Trump riot in Washington, D.C., resulted in four deaths and 50 injured police officers.

YouTube has typically been slow to crack down on troublesome content. In October, Facebook banned all accounts related to the false conspiracy theory QAnon, whose followers spread voter misinformation and vocalized their plans for Wednesday’s events weeks, and even months, beforehand. In response, YouTube issued a carefully worded policy that effectively banned some QAnon content but stopped short of banning the group itself.

In November, Senate Democrats asked YouTube CEO Susan Wojcicki to commit to removing videos that contain false election information. Instead, the company said it would demonetize the videos while admitting they “undermined confidence in elections with demonstrably false information.” The demonetization, however, hasn’t been evenly enforced; for instance, some videos that spread misinformation and called for violence after election day continued to display ads, sometimes until a reporter notified the company.

*** This article has been archived for your research. The original version from CNBC can be found here ***