Trump’s Lies vs. Your Brain
February 10, 2017, 3:35pm

Unfortunately, it’s no contest. Here’s what psychology tells us about life under a leader totally indifferent to the truth.

All presidents lie. Richard Nixon said he was not a crook, yet he orchestrated the most shamelessly crooked act in the modern presidency. Ronald Reagan said he wasn’t aware of the Iran-Contra deal; there’s evidence he was. Bill Clinton said he did not have sex with that woman; he did, or close enough. Lying in politics transcends political party and era. It is, in some ways, an inherent part of the profession of politicking.

But Donald Trump is in a different category. The sheer frequency, spontaneity and seeming irrelevance of his lies have no precedent. Nixon, Reagan and Clinton were protecting their reputations; Trump seems to lie for the pure joy of it. A whopping 70 percent of Trump’s statements that PolitiFact checked during the campaign were false, while only 4 percent were completely true, and 11 percent mostly true. (Compare that to the politician Trump dubbed “crooked,” Hillary Clinton: Just 26 percent of her statements were deemed false.)

Those who have followed Trump’s career say his lying isn’t just a tactic, but an ingrained habit. New York tabloid writers who covered Trump as a mogul on the rise in the 1980s and ’90s found him categorically different from the other self-promoting celebrities in just how often, and pointlessly, he would lie to them. In his own autobiography, Trump used the phrase “truthful hyperbole,” a term coined by his ghostwriter referring to the flagrant truth-stretching that Trump employed, over and over, to help close sales. Trump apparently loved the wording, and went on to adopt it as his own.

On January 20, Trump’s truthful hyperboles will no longer be relegated to the world of dealmaking or campaigning. Donald Trump will become the chief executive of the most powerful nation in the world, the man charged with representing that nation globally—and, most importantly, telling the story of America back to Americans. He has the megaphone of the White House press office, his popular Twitter account and a loyal new right-wing media army that will not just parrot his version of the truth but actively argue against attempts to knock it down with verifiable facts. Unless Trump dramatically transforms himself, Americans are going to start living in a new reality, one in which their leader is a manifestly unreliable source.

What does this mean for the country—and for the Americans on the receiving end of Trump’s constantly twisting version of reality? It’s both a cultural question and a psychological one. For decades, researchers have been wrestling with the nature of falsehood: How does it arise? How does it affect our brains? Can we choose to combat it? The answers aren’t encouraging for those who worry about the national impact of a reign of untruth over the next four, or eight, years. Lies are exhausting to fight, pernicious in their effects and, perhaps worst of all, almost impossible to correct if their content resonates strongly enough with people’s sense of themselves, which Trump’s clearly do.

***

What happens when a lie hits your brain? The now-standard model was first proposed by Harvard University psychologist Daniel Gilbert more than 20 years ago. Gilbert argues that people see the world in two steps. First, even just briefly, we hold the lie as true: We must accept something in order to understand it. For instance, if someone were to tell us—hypothetically, of course—that there had been serious voter fraud in Virginia during the presidential election, we must for a fraction of a second accept that fraud did, in fact, take place. Only then do we take the second step, either completing the mental certification process (yes, fraud!) or rejecting it (what? no way). Unfortunately, while the first step is a natural part of thinking—it happens automatically and effortlessly—the second step can be easily disrupted. It takes work: We must actively choose to accept or reject each statement we hear. In certain circumstances, that verification simply fails to take place. As Gilbert writes, human minds, “when faced with shortages of time, energy, or conclusive evidence, may fail to unaccept the ideas that they involuntarily accept during comprehension.”

Our brains are particularly ill-equipped to deal with lies when they come not singly but in a constant stream, and Trump, we know, lies constantly, about matters as serious as the election results and as trivial as the tiles at Mar-a-Lago. (According to his butler, Anthony Senecal, Trump once said the tiles in a nursery at the West Palm Beach club had been made by Walt Disney himself; when Senecal protested, Trump had a single response: “Who cares?”) When we are overwhelmed with false, or potentially false, statements, our brains pretty quickly become so overworked that we stop trying to sift through everything. It’s called cognitive load—our limited cognitive resources are overburdened. It doesn’t matter how implausible the statements are; throw out enough of them, and people will inevitably absorb some. Eventually, without quite realizing it, our brains just give up trying to figure out what is true.

But Trump goes a step further. If he has a particular untruth he wants to propagate—not just an undifferentiated barrage—he simply states it, over and over. As it turns out, sheer repetition of the same lie can eventually mark it as true in our heads. It’s an effect known as illusory truth, first discovered in the ’70s and most recently demonstrated with the rise of fake news. In its original demonstration, a group of psychologists had people rate statements as true or false on three different occasions over a two-week period. Some of the statements appeared only once, while others were repeated. The repeated statements were far more likely to be judged as true the second and third time they appeared—regardless of their actual validity. Keep repeating that there was serious voter fraud, and the idea begins to seep into people’s heads. Repeat enough times that you were against the war in Iraq, and your actual record on it somehow disappears.

Here’s the really bad news for all of those fact-checkers and publications hoping to counter Trump’s false claims: Repetition of any kind—even to refute the statement in question—only serves to solidify it. For instance, if you say, “It is not true that there was voter fraud,” or try to refute the claim with evidence, you often perversely accomplish the opposite of what you want. Later on, when the brain goes to recall the information, the first part of the sentence often gets lost, leaving only the second. In a 2002 study, Colleen Seifert, a psychologist at the University of Michigan, found that even retracted information—that we acknowledge has been retracted—can continue to influence our judgments and decisions. Even after people were told that a fire was not caused by paint and gas cylinders left in a closet, they continued to use that information—for instance, saying the fire was particularly intense because of the volatile materials present—even as they acknowledged that the correction had taken place. When presented with the contradictions in their responses, they said things like, “At first, the cylinders and cans were in the closet and then they weren’t”—in effect creating a new fact to explain their continued reliance on false information. This means that when the New York Times, or any other publication, runs a headline like “Trump Claims, With No Evidence, That ‘Millions of People Voted Illegally,’” it perversely reinforces the very claim it means to debunk.

In politics, false information has a special power. If false information comports with preexisting beliefs—something that is often true in partisan arguments—attempts to refute it can actually backfire, planting it even more firmly in a person’s mind. Trump won over Republican voters, along with alienated Democrats, by declaring himself opposed to “Washington,” “the establishment” and “political correctness,” and by stoking fears about the Islamic State, immigrants and crime. Leda Cosmides at the University of California, Santa Barbara, points to her work with her colleague John Tooby on the use of outrage to mobilize people: “The campaign was more about outrage than about policies,” she says. And when a politician can create a sense of moral outrage, truth ceases to matter. People will go along with the emotion, support the cause and retrench into their own core group identities. The actual substance stops being of any relevance.

Brendan Nyhan, a political scientist at Dartmouth College who studies false beliefs, has found that when false information is specifically political in nature, part of our political identity, it becomes almost impossible to correct. When people read an article that opened with George W. Bush’s assertion that Iraq might pass weapons to terrorist networks, and that later noted Iraq did not actually possess any WMDs at the time of the U.S. invasion, the initial misperception persisted among Republicans—and, indeed, was frequently strengthened. In the face of a seeming assault on their identity, they didn’t change their minds to conform with the truth: Instead, amazingly, they doubled down on the very views they had just been shown were wrong.

With regard to Trump specifically, Nyhan points out that claims related to ethno-nationalism—Trump’s declaration early in the campaign that Mexico was sending “rapists” across the border, for instance—get at the very core of who we are as humans, which “may make people less willing or able to evaluate the statement empirically.” If you already believe immigrants put your job at risk, who’s to say the chastity of your daughters isn’t in danger, too? Or as Harvard University psychologist Steven Pinker puts it, once Trump makes that emotional connection, “He could say what he wants, and they’ll follow him.”

So what can we do in the face of a flagrant liar-in-chief? Here, alas, the news is not particularly promising. Consider a 2013 paper aimed specifically at correcting political misperceptions. In the study, a group of people around the country were first asked about their knowledge of several government policies: For instance, how familiar were they with how electronic health records were handled? They were also asked about their attitudes toward the issues: Were they in favor, or opposed? Everyone next read a news article crafted specifically for the study that described the policy: how electronic health records work, what the objectives of using them are and how widely they are, in fact, used. Next, each participant saw a correction to the article, stating that it contained a number of factual errors, alongside an explanation of what was wrong. But the only people who actually changed their incorrect beliefs as a result were those whose political ideology was already aligned with the correct information. Those whose beliefs ran counter to the correction? They didn’t revise the facts in their heads; instead, they revised their estimate of the accuracy of any publication that could print such an obviously bogus correction. It’s easy enough to correct minor false facts, the color of a label, say, if they aren’t crucial to your sense of self. Alas, nothing political fits into that bucket.

***

Scarier still for those who have never supported Trump is that he just might colonize their brains, too. When we are in an environment headed by someone who lies so often, something frightening happens: We stop reacting to the liar as a liar. His lying becomes normalized. We might even become more likely to lie ourselves. Trump is creating a highly politicized landscape where everyone is on the defensive: You’re either for me or against me; if you win, I lose, and vice versa. Fiery Cushman, a moral psychologist at Harvard University, put it this way when I asked him about Trump: “Our moral intuitions are warped by the games we play.” Place us in an environment where it’s zero-sum, dog-eat-dog, party-eats-party, and we become, in game theory terms, “intuitive defectors,” meaning our first instinct is not to cooperate with others but to act in our own self-interest—which could mean disseminating lies ourselves.

The dynamic now unfolding in the United States is not merely hypothetical. We already have a model of this process, a country that regressed as its leadership went from progressive to deceptive: Russia under Vladimir Putin. “This worldview”—a zero-sum, I-win-you-lose one—“is relatively more prevalent in Russia and other cultures with weak rule of law, high corruption and low generalized trust, as compared with Western democracies,” Cushman says. But when Western democracies start looking like those cultures, the norms can quickly shift.

The distressing reality is that our sense of truth is far more fragile than we would like to think it is—especially in the political arena, and especially when that sense of truth is twisted by a figure in power. As the 19th-century Scottish philosopher Alexander Bain put it, “The great master fallacy of the human mind is believing too much.” False beliefs, once established, are incredibly tricky to correct. A leader who lies constantly creates a new landscape, and citizens whose sense of reality may end up shifting far more than they think possible. It’s little wonder that authoritarian regimes with sophisticated propaganda operations can warp the worldviews of entire populations. “You are annihilated, exhausted, you can’t control yourself or remember what you said two minutes before. You feel that all is lost,” as one man who had been subjected to Mao Zedong’s “reeducation” campaign in China put it to the psychiatrist Robert Lifton. “You accept anything he says.”

Maria Konnikova is a contributing writer at the New Yorker and author, most recently, of The Confidence Game: Why We Fall for It … Every Time.