Trump, QAnon, Russia: Facebook, Twitter, YouTube brace for a turbulent election and post-election cycle
From Facebook to Twitter to YouTube, social media companies whose platforms were used to amplify falsehoods, conspiracy theories and inflammatory rhetoric in 2016 have been preparing for November’s presidential election for years.
Now it’s crunch time.
On Thursday, FBI Director Christopher Wray told a House Homeland Security Committee hearing that Russia remains “very active” in its effort to sow discord and disrupt the vote, primarily by denigrating Democratic presidential nominee Joe Biden.
There’s been an alarming rise in domestic meddling from groups such as conspiracy cult QAnon, which are pumping out falsehoods. And President Trump, who uses social media as his reelection campaign’s bullhorn, has made remarks that threaten to undermine confidence in the election, from questioning the legitimacy of mail-in voting to suggesting that people vote twice.
“I think it’s clear that they’re trying way more than they did in 2016 or even in 2018. But I don’t think anybody can be completely ready for what might happen if we have an extremely close election this year,” said Patrick Warren, associate professor at Clemson University who researches social media disinformation. “A foreign information operation of the size and scope of the 2016 (Internet Research Agency) campaign would be caught this year. But we’re already seeing domestic operations adopt their tactics, which is a much harder problem for the platforms.”
Amid growing concerns about the integrity of the election, Twitter said Thursday that it would take steps to secure the accounts of high-profile users including administration officials, members of Congress, political parties and campaigns, major news outlets and political journalists.
Earlier this week, Twitter announced a voting information hub within the app that will include facts on mail-in ballots and how to register for the Nov. 3 election in English and Spanish. It also has tightened its policies on election-related misinformation.
Facebook has rolled out changes to curb misinformation and interference before the election, including barring any new political ads in the week before Election Day and cracking down on posts that deter people from voting.
The initiatives are part of a Facebook campaign to register 4 million voters and to recruit poll workers for Election Day.
The moves acknowledge the vast influence of social media on American political life and this year’s highly contentious election cycle in which an unprecedented number of Americans are expected to vote by mail due to the coronavirus pandemic.
What tech companies are desperately seeking to avoid: A repeat of the 2016 election in which Russian operatives peppered the American electorate with divisive messages.
Facebook and Twitter recently busted an effort by the Kremlin-backed Internet Research Agency, the same group that interfered in the 2016 election, to sway voters away from Biden through a network of fake accounts and a website camouflaged to look like a left-wing news site.
Last month, YouTube pledged to increase the visibility of authoritative voices and reduce the spread of election-related misinformation. It expanded efforts to display third-party fact-checked articles above search results for claims about the election. And when users search for presidential or federal candidates, an information panel pops up above search results with details about the candidate.
But it’s not just the run-up to Nov. 3. Tech companies are bracing for what could be a turbulent period following the election if it’s not certified right away or if the election is contested.
Twitter has pledged to label or remove posts that include misleading claims about election results.
Facebook has strengthened its policies to crack down on misinformation and steer users to authoritative information about the election, including slapping a label on content that seeks to delegitimize voting methods or the outcome of the election.
“Since the pandemic means that many of us will be voting by mail, and since some states may still be counting valid ballots after election day, many experts are predicting that we may not have a final result on election night. It’s important that we prepare for this possibility in advance and understand that there could be a period of intense claims and counter-claims as the final results are counted. This could be a very heated period,” CEO Mark Zuckerberg said in a Facebook post.