6 Foolproof Tips to Help You Spot Online Manipulation
Whether you like to admit it or not, we are all susceptible to manipulation and misinformation, especially online, where bad actors often try to take advantage of the brain’s limited ability to discern fact from fiction in a fast-moving social media environment. I’ve spent years researching “the dark arts of manipulation” by looking at how professionals mislead people with bad information. It turns out that most dodgy content leaves fingerprints behind that people can learn to identify.
We’ve termed these “the six degrees of manipulation,” which I cover in my new book Foolproof: Why We Fall for Misinformation and How to Build Immunity.
Here they are in turn:
1. Emotion
Emotions are part of daily life, but bad actors try to influence us by fearmongering, creating outrage, and luring us with emotional content. For example, Facebook has shown that it can manipulate how people feel on the platform by suppressing positive posts in people’s feeds. At a basic psychological level, we know that moral-emotional words such as “hate” and “evil” capture our visual attention more than neutral language. That’s why headlines such as “Hillary Clinton ‘Covered Up’ Elite Pedophile Ring At State Department” get traction.
In research I conducted with colleagues, we looked at millions of social media posts. What we found was clear: posts that use moral and emotional language see a 10 to 20 percent boost in engagement. So, why does this matter? Consider that 76 to 88 percent of anti-vaccination websites use emotional appeals rather than evidence. It’s an old trick. In the 1800s, people fearmongered that you’d turn into a human-cow hybrid if you got the smallpox vaccine! Of course, deciding whether to vaccinate is up to you, but don’t let yourself be influenced by actors who seek to manipulate your feelings.
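To make the “fingerprint” idea concrete, here is a minimal sketch of how one might flag posts that lean heavily on moral-emotional language. It is not the method from our study; the word list and threshold are invented for the example:

```python
# Minimal sketch: flagging moral-emotional language in posts.
# The lexicon and the 10% threshold are illustrative assumptions,
# not the measures used in the actual research.

MORAL_EMOTIONAL_WORDS = {
    "hate", "evil", "shame", "disgust", "destroy",
    "corrupt", "betray", "fear", "outrage", "wicked",
}

def moral_emotional_score(post: str) -> float:
    """Return the fraction of words in a post found in the lexicon."""
    words = [w.strip(".,!?\"'").lower() for w in post.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in MORAL_EMOTIONAL_WORDS)
    return hits / len(words)

def looks_emotionally_charged(post: str, threshold: float = 0.10) -> bool:
    """Crude heuristic: flag posts where over 10% of words are moral-emotional."""
    return moral_emotional_score(post) > threshold

print(looks_emotionally_charged("They hate you and want to destroy everything!"))  # True
print(looks_emotionally_charged("The city council meets on Tuesday at 7 pm."))     # False
```

Real research uses far richer lexicons and statistical models, but the underlying signal is the same: manipulative content over-indexes on words engineered to provoke.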
2. Impersonation
In 2018, someone impersonated Warren Buffett on Twitter (the handle misspelled “Buffett” with a single “t”), and the account quickly gained hundreds of thousands of followers, tweeting out faux advice like “don’t spend what you don’t have.” Although that case was relatively innocent, sophisticated impersonators can cause serious damage.
A classic example is the “fake expert” technique: dressing someone with no relevant expertise up as an expert, or having them use unrelated credentials, to peddle misinformation. Take Lee Dimke, who, during the pandemic, proclaimed that he had found the “Achilles heel” of the coronavirus: high temperatures. Waving scientific papers around in a YouTube video, he advised people to shove a blow-dryer up their nose because heat kills the coronavirus (this is total nonsense).
A background check quickly revealed that Dimke had absolutely no medical qualifications! What about other contested topics? The Global Warming Petition Project claims that global warming is a hoax. To make the petition seem legitimate, its organizers borrowed credibility from the National Academy of Sciences (NAS) by mimicking the format of its official publications. The story went viral on social media, and many people were duped. After Russia invaded Ukraine, many manipulated videos were released on social media to create misleading narratives about the war. The list goes on!
3. Polarization
This technique has a single purpose: to drive people apart. With political polarization running high, people not only disagree on policy issues but also increasingly report being less willing to marry someone from the other side! This tension provides a perfect opportunity for bad actors to stir things up further by aggravating existing grievances, often through extremely polarizing headlines.
In one study, we found that “dunking” on the other side can increase the odds of a post being shared by a whopping 67 percent. So-called “outgroup derogation” was also the best predictor of “angry” and “haha” reactions on Facebook. Examples included “Every American needs to see Joe Biden’s latest brain freeze” and “Republicans refuse to say Trump is a liar. What is going on?”
A key way to polarize people is through “false amplification”: stirring things up on both sides of a debate. For example, Russian bots tweeted that vaccines are “deadly poison” but also noted that “you can’t fix stupidity, let them die” in reference to the unvaccinated. Don’t let polarization techniques toxify online debates.
4. Trolling
A closely related tactic is commonly referred to as “trolling,” a term borrowed from fishing, where a baited line is dragged through the water to catch trout or salmon. Online, trolls use bait, too: inflammatory material designed to manipulate public perception. A widely documented example is Russian interference in the 2016 U.S. presidential election.
We learned a lot from Lyudmila Savchuk, a former undercover troll at Russia’s Internet Research Agency (IRA), who explained how the troll factory had thousands of people posting to social media day and night. A key difference between a bot and a troll is that a troll is a human-operated account that dupes people into thinking they are interacting with an ordinary, politically engaged citizen. The trolls posted on everything from pro-Putin material to attacks on European values and the U.S. election. For example, they employed voter suppression tactics by confusing people about voting rules, and they seeded divisive posts about Black culture, LGBT rights, and gun control.
Yet not all campaigns are organized. For example, in a recent Rolling Stone interview, I pointed out how the “Cat Turd” account on Twitter is a single troll who managed to catch Elon Musk’s attention and gain millions of followers. Don’t feed the trolls.
5. Conspiracy
Who doesn’t love a good conspiracy theory? Conspiracy theories are designed to be psychologically attractive: they offer simple causal explanations and take advantage of the brain’s desire to connect the dots. For example, is it not suspicious that more COVID-19 cases were reported near 5G cell towers? In fact, no: the pattern is readily explained by a third factor, population density (where there are more people, there are both more 5G towers and more COVID-19 cases).
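To see how a lurking third factor can manufacture a “suspicious” correlation, here is a toy simulation (all numbers are invented for illustration). Tower counts and case counts are each generated from population density alone, with no link between them, yet they come out strongly correlated:

```python
# Toy simulation: a confounder (population density) creates a
# spurious correlation between two otherwise unrelated variables.
# All coefficients are invented purely for illustration.
import random

random.seed(42)

regions = []
for _ in range(1000):
    density = random.uniform(10, 5000)            # people per sq km (confounder)
    towers = 0.01 * density + random.gauss(0, 2)  # towers depend only on density
    cases = 0.05 * density + random.gauss(0, 10)  # cases depend only on density
    regions.append((towers, cases))

def pearson(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x, _ in pairs)
    vy = sum((y - my) ** 2 for _, y in pairs)
    return cov / (vx * vy) ** 0.5

# Prints a correlation near 1.0, even though towers never cause cases.
print(f"correlation(towers, cases) = {pearson(regions):.2f}")
```

Control for density (compare only regions of similar size) and the “suspicious” link disappears, which is exactly what happened with the 5G claim.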
The key principle behind any conspiracy theory is that powerful, evil actors secretly plot behind the scenes. In contrast, a scientific theory leaves all plausible options open based on the best available evidence. Sure, some real conspiracies have happened, and it’s good to maintain a healthy level of skepticism, but the fact that real conspiracies exist doesn’t make any particular theory true! A key feature of conspiratorial thinking is that it operates like a multi-level marketing scheme: a few people at the top cook up a false narrative, and others subsequently spread it further.
Importantly, it’s never just one conspiracy. Once you believe one, you’ll likely endorse others. Take Alex Jones, who, until recently, peddled the false conspiracy theory that the Sandy Hook shooting was a false flag operation conducted with staged crisis actors! The truth is out there, but it’s likely not a conspiracy.
6. Discrediting
A famous example of discrediting came from former President Trump, who claimed that the mainstream media was “fake news” and “the enemy of the American people.” This was an attempt to discredit media outlets that were critical of his conduct. As part of a research study, we wondered whether this effect generalizes, so we asked Democrats and Republicans the first thing that came to mind when they heard the term “fake news.” Guess what? For Democrats, it was “Fox News”; for Republicans, “CNN.”
The term “fake news” has become a vehicle to dismiss and discredit uncongenial viewpoints. The ad hominem fallacy is often deployed in the service of discrediting: you attack a person’s character instead of their actual argument. It’s easy to get on board with ad hominem attacks when you don’t like someone, but it’s a misleading rhetorical tactic all the same!
In short, being on the lookout for these predictable tactics will help you spot manipulation and misinformation.