
How Facebook Can Slow QAnon for Real

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

Like other dangerous ideas, the QAnon conspiracy is tricky to root out online. But it’s not impossible.

QAnon is a sprawling and false set of theories that powerful institutions are controlled by pedophile cannibals who are plotting against President Trump. It’s also a chameleon. Supporters use legitimate causes like protecting children or promoting wellness to appeal to newcomers and then draw them into their outlandish ideas.

QAnon adherents tailored their ideas for Facebook, which moved slowly to address the movement at first. Facebook announced in August that it was restricting QAnon activity, but so far its actions haven’t accomplished much, my colleagues Sheera Frenkel and Tiffany Hsu wrote.

I talked with Sheera about how much blame Facebook deserves for the spread of this dangerous conspiracy, and what we can learn from internet companies’ prior crackdown on terrorist recruitment.

Shira: Why haven’t Facebook’s recent actions against the QAnon conspiracy worked?

Sheera: It’s tricky. People don’t want to feel like they’re being harassed, monitored or censored. And false conspiracies attach themselves to genuine activism like protecting children.

Imagine a mom is posting in a parenting Facebook group for help finding a tutor, and someone responds with an article about teachers with criminal records getting hired in schools. Two days later she might be on a QAnon group reading bogus claims about child exploitation. She didn’t seek out QAnon content.

How does Facebook take action on something like that?

What should Facebook do about QAnon?

The one idea we hear again and again is for Facebook to stop its automated recommendation systems from suggesting groups supporting QAnon and other conspiracies.

Other than that, every expert has a different opinion.

One expert I spoke with said Facebook needed to decide whether it would take a hard line against all conspiracies. Another academic said it would be more productive to slowly cut back the circulation of QAnon-related information on Facebook and give people exposed to QAnon material some clear information on why it’s false and dangerous.

Let me play devil’s advocate: If a small but growing number of people believe strongly in this conspiracy, even if we know it’s false, should Facebook try to stop it?

First, QAnon beliefs have been linked to real-world violence.

Plus, Facebook says it wants a “healthy community.” Does it believe these conspiracies are a part of that?

How much blame does Facebook deserve for QAnon’s growth?

When Facebook changed its focus to encourage people to gravitate to smaller, more intimate groups, it inadvertently created safe havens for people to discuss how to spread QAnon theories.

Facebook needs to ask itself if it has a responsibility for fueling QAnon and think through the consequences of that.

Have any internet companies managed to slow the spread of ideas related to QAnon?

Reddit used to be ground zero for QAnon, until it banned a whole section of the site dedicated to the conspiracy in 2018. There is still QAnon stuff on Reddit, but the content largely moved elsewhere — including to Facebook.

Could things have been different for conspiracies on Facebook, too?

I wonder how different our world would look if Facebook, YouTube and Twitter joined Reddit in taking coordinated, effective action against QAnon. That’s what the companies did in 2015 when the Islamic State was using social media to recruit new followers. You could see almost in real time that ISIS lost much of its ability to recruit online.

In my mind, that was the clearest example of the internet companies — when they were motivated to do so — taking action to remove a dangerous group that was pervasive on their sites. This action was supported by the White House, and the internet companies felt empowered to make an overwhelming show of force.

Now, though, tech companies are divided over what to do about QAnon, and they don’t have clear direction from the administration. We’ve not seen condemnation of QAnon from the White House, let alone support for social media companies to restrict its spread.

(Read more on the rise of QAnon.)


Brian X. Chen, The New York Times’s personal technology columnist, has an alternative for when an app we want is no longer available from official app stores. But this option comes with risks.

The Trump administration did not follow through on its threat to ban the Chinese app TikTok from the United States. And another threatened ban against China’s WeChat app is on hold because of a legal challenge.

In August, the Fortnite video game disappeared from app stores because of a business dispute between the owner of the game and Apple and Google.

If the apps we want are pulled from official download channels, there is another option. But it’s not for everyone.

For people with Android phones, there is a process known as “sideloading” to install apps that aren’t available in official app stores. Apple iPhone users can install unauthorized apps as well, but it’s such a tricky process that I recommend against it.

First, a disclaimer: Apps downloaded through the official Google Play store are vetted to screen for security vulnerabilities and to help prevent malware from infecting your phone. Installing apps from outside the store means you are bypassing that security mechanism. Do this at your own risk.

Here’s what Android users need to do to install apps via the Chrome web browser:

  • Open the Settings app and tap on Apps & notifications.

  • Tap on Advanced. Then select Special app access.

  • Tap on Install unknown apps.

  • Tap on Chrome and flip the slider for the “Allow from this source” option.

From here, do a web search for the app’s installation file (on Android, an APK) and download it from the website that hosts it.

Another warning: Make sure you are downloading what you’re looking for. Sometimes bogus and dangerous software is disguised as the official version of an app.

Good luck, and be careful.
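For readers curious about what that “Allow from this source” switch actually controls, here is a minimal Kotlin sketch of how an Android app can check the same “Install unknown apps” permission and hand a downloaded APK to the system installer. The function name and the FileProvider authority string are illustrative assumptions, not part of Brian’s instructions; the permission check and the settings screen it opens are the standard Android APIs behind the toggle described above.

```kotlin
// Minimal sketch (assumed names) of the mechanism behind "Install unknown apps".
// Requires Android 8.0 (API 26) or later, the REQUEST_INSTALL_PACKAGES permission
// declared in the manifest, and a FileProvider configured for the app.

import android.content.Context
import android.content.Intent
import android.net.Uri
import android.provider.Settings
import androidx.core.content.FileProvider
import java.io.File

fun installApk(context: Context, apkFile: File) {
    val pm = context.packageManager

    // Mirrors the "Allow from this source" toggle: returns false until the
    // user has granted this app permission to install unknown apps.
    if (!pm.canRequestPackageInstalls()) {
        // Send the user to the same settings screen described in the steps above.
        val settingsIntent = Intent(
            Settings.ACTION_MANAGE_UNKNOWN_APP_SOURCES,
            Uri.parse("package:${context.packageName}")
        )
        context.startActivity(settingsIntent)
        return
    }

    // Hand the APK to the system package installer, which still shows its own
    // confirmation prompt before anything is installed.
    // The ".fileprovider" authority here is an assumed example value.
    val apkUri = FileProvider.getUriForFile(
        context, "${context.packageName}.fileprovider", apkFile
    )
    val installIntent = Intent(Intent.ACTION_VIEW).apply {
        setDataAndType(apkUri, "application/vnd.android.package-archive")
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    // Call this from an Activity context so the installer screen can open.
    context.startActivity(installIntent)
}
```

Even with the permission granted, Android’s own installer still asks for confirmation before anything is installed, which is why the warning above about downloading only the file you actually want matters so much.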


  • That TikTok drama was rather pointless. The White House threatened to ban TikTok to protect Americans. But what happened instead was a politically expedient business arrangement among Oracle, Walmart, TikTok’s Chinese owner and the Trump administration, my colleagues Erin Griffith and David McCabe wrote. Also, the participants don’t see eye-to-eye on what they agreed to.

    The Times’s David E. Sanger also wrote that this deal neither resolved security concerns about China’s government possibly acting through TikTok, nor answered how the U.S. government should deal with foreign technology.

  • Are you excited about the “ad tech stack”?! Steve Lohr of The Times explained that Google has, over the years, purchased companies that gave it a role in many steps of buying and selling online ads. This Google business is now a focus of antitrust investigations, and it has made some legal experts wonder if there should be more restrictions on big tech corporations buying smaller companies.

  • Keeping tabs on the virus alert apps: MIT Technology Review is compiling a database of coronavirus alert apps introduced by health authorities. It tracks how transparent each app is about the data it collects and how it works.

I have to restrain myself from putting red pandas in this spot every day. They are the best. Here is Lin from the Cincinnati Zoo eating apples and bananas. Did you know red pandas have semi-opposable thumbs?!


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here.
