What Donald Trump, Vladimir Putin and Viktor Orbán Understand About Your Brain
Why do people believe some politicians’ lies even when they have been proven false? And why do so many of the same people peddle conspiracy theories?
Lying and conspiratorial thinking might seem to be two different problems, but they turn out to be related. I study political rhetoric and have tried to understand how populist politicians use language to develop a cult-like following, divide nations, create culture wars and instill hatred. This pattern goes back to antiquity and is seen today in leaders including former President Donald Trump, Hungary’s Viktor Orbán and Russia’s Vladimir Putin. These leaders are capable of using words and speeches to whip people into such an emotional tempest that they will do things like march on the seat of Congress or invade a neighboring country.
What makes this kind of speech worrying is that politicians can manipulate more than emotions like aggression; they can also use rhetoric to influence the public’s thoughts and beliefs, and to spread lies and conspiracy theories. Those lies and conspiracy theories are stubbornly resistant to countervailing facts and can sow divisions that destabilize their societies.
My research analyzes real speeches made by politicians past and present, including those of Trump, Orbán and Putin, using cognitive linguistics — a branch of linguistics that examines the relationship between language and the mind. What I have found is that throughout history, speeches by dictators and autocrats have one thing in common: they use dehumanizing metaphors to instill and propagate hatred of others.
It is well documented, for example, that the Nazi regime used words like “reptiles” and “parasites” to compare outsiders and minorities to animals. Strongmen throughout history have referred to targeted social groups as “rats” or “pests” or “a plague.” And this language works even on people who are not otherwise predisposed to jump to extreme conclusions. Once someone is tuned into these metaphors, their brain actually changes in ways that make them more likely to believe bigger lies, even conspiracy theories.
These metaphors are part of a cognitive process that entraps some people in this kind of thinking while others are unaffected. Here’s how it works.
The first step to manipulating the minds of the public, or really the precondition, is that listeners need to be in the right emotional state.
For this kind of hack on the minds of the public to work, people need to feel fear or uncertainty. That could be caused by economic instability or pre-existing cultural prejudices, but the emotional basis is fear. The brain is designed to respond to fear with its own built-in defense mechanisms, which release chemicals such as cortisol and adrenaline. These chemical responses, which zip straight past our logical brains to our fight-or-flight reactions, are also activated by forms of language that instill fear, either directly (as in a vocal threat) or, more insidiously, through twisted facts, lies and deceptive statements that play on those fears.
In this state, dehumanizing metaphors are very effective. My research shows that this language taps into and “switches on” existing circuits in the brain that link together important and salient images and ideas. In effect, metaphors bypass higher cognitive reasoning centers to make linkages that may not have a basis in reality. And when that happens, a person is less likely to notice the lie, because it “feels” right.
This pattern becomes more effective the more it is used. According to studies, the more these circuits are activated, the more hardwired they become, until they are almost impossible to switch off. This means that repeated dehumanizing metaphors are incredibly powerful for brains already willing to hear them: they direct listeners’ thoughts, making it easy to focus on certain things and ignore others.
The same is true of conspiracy theories. Neuroscientific research shows that people who believe them develop more rigid neural pathways, meaning they find it difficult to rethink situations once this pattern of thinking is established.
This also means that if someone is already susceptible to lies framed in dehumanizing metaphors, and that person then comes across a big lie or a conspiracy theory that fits into the same well-trodden neural pathway, they are more likely to believe it and be influenced by it.
This is how language that might seem like harmless hyperbole winds up literally changing the way people think. And once they think differently, they can act in ways that they might not have before.
With the rise of populist and far-right political movements in the 2010s, the use of dehumanizing metaphors to engender hatred of foreigners or of those who are different in some way has spread worldwide.
In 2016, during a state-orchestrated public campaign against refugees and migrants in Hungary, Orbán characterized them as a “poison.” In August 2017, when groups of white supremacists arrived in the college town of Charlottesville, Va., to participate in a “Unite the Right” rally, the rally-goers used both animal and dirt metaphors when they claimed they were fighting against the “parasitic class of anti-white vermin” and the “anti-white, anti-American filth.”
Putin’s labeling of the Ukrainian leadership as “Nazi” falls into this category: a powerful slur against Zelensky, who is Jewish and whom Putin called a “disgrace to the Jewish people.” Significantly, Putin uses this slur alongside dehumanizing language to justify the invasion of Ukraine, framing it as a mission of “denazification” that will, by innuendo, rid Ukraine of its “Nazi filth.” The “dirt” and “filth” metaphor, coupled with the historically loaded terminology, is a persuasive linguistic tool.
These dehumanizing metaphors have been used consistently to tap into the neural pathways of fearful or anxious people ready and waiting to believe. This helps explain why so many Trump supporters were influenced by the QAnon conspiracy hoax in the lead-up to the 2020 presidential election. Trump’s “Big Lie” refers to the false claim that the election was “rigged” and “stolen” from him through massive electoral fraud — even though that assertion has been repeatedly debunked.
Significantly, Trump also supported his Big Lie with the same pattern of conspiracy theories and fake news circulating on far-right social media, including QAnon content, that spurred his supporters to attack the U.S. Capitol on Jan. 6, 2021. The sustained use of QAnon’s central metaphor, a cabal of satanic, cannibalistic abusers of children conspiring against Trump, fits easily into the entrenched neural pathways of someone who is already willing to believe.
The tricky thing about all this is that some people are more susceptible to this type of rhetorical manipulation than others. The difference comes down to critical thinking and brain training. If someone wants or needs to believe, the language does its manipulative work and the neural pathways are reinforced. If we are not fearful or primed to believe, the brain has mechanisms that alert us to deceit. Simply put, the more consistently we scrutinize lies, the better trained our brains are at noticing them.
Unfortunately, research into this brain wiring also shows that once people begin to believe lies, they are unlikely to change their minds even when confronted with evidence that contradicts their beliefs. It is a form of brainwashing. Once the brain has carved out a well-worn path of believing deceit, it is even harder to step off that path, which is how fanatics are born. Instead, these people seek out information that confirms their beliefs, avoid anything that conflicts with them, or even turn contradictory information on its head so that it fits their beliefs.
People with strong convictions will have a hard time changing their minds, given how deeply embedded a lie becomes. In fact, scientists and scholars are still studying the best tools for combating lies with some combination of brain training and linguistic awareness.
Not all hope is lost, however. History has shown that disruptive events, such as the toppling of a regime or the loss of a war, can force a new perspective, and the brain is able to recalibrate. So it is at least possible to change this pattern. Once the critical mind is engaged, away from the frenzy of fear and manipulation, the lie can become clear. This is the uplifting moral to be gleaned from history: all the great liars, from dictators to autocrats, were eventually defeated by truth, which wins out in the end.
But the bad news is that you need that kind of disruption. Without these jarring events to bring a dose of reality, it is unlikely that people with strong convictions will ever change their minds — something that benefits the autocrat and endangers their society.