This summer, I was speaking to a relative of mine about the upcoming presidential debate in the United States. “Well,” she said, “I think what will happen is that, you know, Trump has been spreading this narrative that Biden has dementia, and the debate will happen, and people will see he doesn’t have dementia, and that will help.”
This, of course, is not what happened.
Afterward, this relative of mine told me, she only let herself think about it in moments here and there, never for too long. It was simply too painful.
I was thinking about this over the last few weeks because of the predictable turbulence on social media after Joe Biden pardoned his son Hunter for basically every crime in the book, even though he had previously promised not to. (He then lied further about the matter afterwards.) Many people (including Biden supporters and other liberals) were upset with this reversal and claimed it would degrade socio-political norms around truthfulness.
At first, I thought this response was ridiculous. Surely the dementia debate had made things clear; all norms about truthfulness left the building long ago. Biden’s administration, like all administrations, lies. Then I realised much of the outrage wasn’t because people were upset they’d been lied to. No; it was something less obvious, more curious: people were angry, embarrassed, even ashamed, because they’d believed a lie.
And they’re not entirely wrong to feel embarrassed about their role in the matter. For lying is often a two-way street. In many cases, the lie-believers are in on the whole endeavour, even if only unconsciously. Which means when lie-believers are forced to admit something was a lie, they also have to admit, to themselves and others, that some of the responsibility lies with them.
There is data on why people lie in everyday life, and that data suggests that people lie when they don’t have the social status or self-image that they want. People tend to start lying as soon as they are feeling bad about themselves: it seems to be triggered by low self-esteem. Often, they don’t even really realise they’re doing it. (Some test subjects, when asked to watch back conversations in which they’d lied routinely, seemed genuinely shocked by how much they’d twisted the truth.) A lot of lying, in other words, is simply someone shaping reality as they go along in a conversation, so as to make it run smoothly for them, or even for the other person, while preserving everyone’s self-image.
Interestingly, on average men seem to lie to make themselves feel better, while women tend to lie to make the other person feel better. In fact, we can easily imagine a conversation based on this research in which a man lies to big himself up, and a woman lies to make it seem like she believes him. And then both people quickly assimilate those lies into a version of the truth that is bearable, which is to say, they come to believe their own lies.
When people lie, other people will often lie back in order to keep a shared fiction going. And then everyone might come to believe the fiction, at least for a little while.
That might sound too strange to be true. But I discovered this the hard way a few months ago, when I had the rare experience of someone in my social world lying right to my (digital) face. The full details are too boring, and I don’t wish to overexpose this unfortunate person. But the very short version is that an acquaintance began to send me weird messages online, after having met me just about twice in person. I noticed the strangeness of this right away, but I put in a great deal of work to try to make the situation feel normal, or at least ok. She would write me deeply personal and frankly inappropriate things; she would try to ask me for information about other people. Keen to placate her and keep the shared social world in order, I would mirror back to her as if all was well, sending her the occasional meme as an offering. She did not seem to have many friends; I felt sorry for her.
But of course, the placation only kept the whole thing going. This person would make other people in our social world uncomfortable, then invite me for coffee, and I would deflect, and say I was travelling (which was true, but not the point; it was, in a way, my own version of lying). The problem, of course, is that if someone wants something from you, even if they don’t know what it is, they’ll find ways to get it by stimulating greater drama. Eventually, somehow, she was trying to convince me that I had done something wrong to her in a social situation many months ago. I was pretty sure this was a lie: just a straightforward factual inaccuracy, an event that had literally not occurred. But I couldn’t quite manage to say so.
(Honestly, I feel the horror genre could do more with this setup, and make more of the discomfort of dealing with people who pursue others and pick conflicts with them.)
That’s where things get interesting. For as she began to try to tell me that I had done something wrong to her, I did remarkable mental gymnastics to not call her out. In fact, I sort of tried to convince myself she was right. Okay, I said to her. I must have misremembered. I genuinely came to doubt myself. I wanted to like her. She had many charming attributes. I wanted to preserve my sense of myself as a peaceable person, too, rather than confront her. And so, in order to keep trying to like her, and to like the mutual friend who had introduced us, and to like myself despite somehow ending up in this weird, screwed-up scenario, I tried to believe her.
Of course, as is often the case with people who lie, things fell apart. The lying person in my life more or less harassed a bunch of other people in a similar manner, doing far more outrageous things to them, alienating people until she eventually slowly peeled off from our social world. Only at this point did I finally accept that she might simply be unstable, unreliable, willing to lie to make a conversation go the way she needed. (And only after that did I read the research on just how many people, especially insecure people, are willing to do this routinely).
But what’s interesting is how sad I was to realise she had lied to me. I longed naively and stupidly for a way of somehow making things right, a way of talking to her that could reach her, even though once someone is intentionally lying to you this is functionally nearly impossible, almost definitionally so. I wanted the social world to make sense, to not be scary. I wanted to be liked, I wanted to learn how to like her. I was willing to bend reality to get there. This is not because I am golden-hearted. It’s because I did not want to live in the world where the lie was real. I wanted the social fabric to work; I wanted to believe we could attend the same parties in the future. It’s because I wanted my self-image to be one of amicability and calm. It’s also because I felt ashamed as I slowly began to realise that I’d let myself be half-charmed, half-bullied into believing her lies.
Eventually, I discovered a friend had been at the party where the supposed event at the centre of the lie had occurred. My friend remembered right away what had really happened. She explained: the other woman was lying, and my original memory was correct. I felt a weird, crushing sadness, with a touch of relief.
But I suspect no amount of evidence would have done anything for this lying person, for she had likely come to believe herself; people often do.
Lies work not because they are clever, well constructed, or well defended, but because people want to believe them. They work because a chain of people relay them and keep them going, because each person in that chain thinks there will be a social reward for buying into the alternate reality the lie presents. Biden was, it seems, lying to himself (because he wanted to believe he was fine), and his inner circle was lying to themselves (because they wanted to access the social reward of doing so, so they could stay in power), lying to each other (ditto), and lying to the media (ditto). The liberal media was lying in some way to the public (because they wanted to stay in power, and wanted to believe things would be okay), and the public, frankly, were lying to themselves, and to one another (for all these reasons). My relative was simply at the very end of that chain, also thinking wishfully.
That is how lies survive: we collectively sustain them, often by deceiving ourselves. We hate having to consciously recognise we’re being lied to, so we often keep believing the lie as long as we possibly can.
On a related note, research shows that our desire to believe lies is also a big part of why “fake news” spreads so much more quickly: “Falsehoods were 70 per cent more likely to be retweeted than the truth, and it took true tweets six times as long as lies, on average, to reach 1,500 people.” Fake news just tends to be more attractive, thanks to both its novelty and its heightened emotional valence. (It wasn’t the bots or the algorithm creating this particular effect, in other words; it’s us. We like new drama.) We like not just liars but the lies themselves, perhaps because they allow us to feel tied into an interesting world, as protagonists, or at least interested bystanders, in something compelling.
All this means we need to reconsider the type of people who might lie. It’s tempting to imagine that those who lie are cold, ruthless manipulators. But in many important cases that isn’t the truth at all. Instead, people who lie are those who want to be liked - and we should therefore be most suspicious not of the cruel but of the insecure, and of those who stay in power by currying our favour.
We also then have to turn the lens around on ourselves. We might like to think that we are already deeply critical people, highly alert to lies. But if the research is to be believed, none of us are any good at spotting them in the wild. It’s far more likely that we mainly want to believe others, especially if doing so lets us sustain our relationships or feel excited about the world we live in.
Most studies suggest human beings only do slightly better than random guessing when it comes to spotting lies. We may think we’re great at it - we’re wrong.
We are bad at telling when people are lying to us in part because most people think they can tell from facial expressions or tone of voice, when in fact these are so variable from person to person that we cannot spot lying this way at all. A slightly more reliable technique (still highly flawed) is to look at the level of detail in the statements people make, as liars tend to put in less detail, since they did not, in fact, actually have the experience they are describing.
That research is interesting enough, but like a lot of important things in social psychology, the fact that we’re bad at spotting lies is, I suspect, not just a “natural” fact, something about the brain’s intrinsic ability or biology. Though you will not generally get this analysis in psychology papers, or even pop psychology books, our willingness to believe lies is also, and more so, a social fact, constructed by the way we relate to one another and to power in the social world. In real life, outside the psychology lab, we’re bad at spotting lies not only because there are no reliable tone-of-voice or facial cues, but because there are many convenient reasons for us to lie to ourselves, to lie to others, and to choose to believe lies. It is much easier to believe that our spouse really was working late, or that our politicians really have our best interests at heart, or at least aren’t quite that bad. It is easier to believe (at least for a lot of people) that Joe Biden and his team have their s*&^ together, even if you don’t really like them very much, because to believe otherwise is to face another four years of Trump head on.
It is simply less scary to believe that people are not lying to us. It gets us through the day.
Ironically, nowhere is this tendency to cling to comforting falsehoods more obvious than in the history of psychological research on lying. Perhaps the most famous research in recent years on the topic has been conducted by two researchers, Dan Ariely and Francesca Gino, who are now widely regarded as frauds. Ariely is particularly famous as a popular psychologist; he became, in the words of the New Yorker, the “enigmatic swami of the but-actually circuit.”
Notice how their lies were sustained: everyone had an incentive, a social reward, for believing them. Graduate students had serious doubts, but for the most part they wanted to believe, because it would benefit them to do so, especially when they were listed as co-authors, and even when they weren’t. In fact, Ariely was remarkably generous to everyone around him, which probably prolonged their grace towards him:
“he gave holiday gifts and paid for extravagant ski trips and beach retreats; he provided a BMW and a twenty-thousand-dollar coffee machine for his lab members to use. When a prospective student told him that Harvard was willing to provide a better financial package, he offered a personal loan for the difference. He created what was, by all accounts, a compelling and fun work environment—foosball and office Segways—where people felt free to indulge their wildest experimental notions.”
Ariely wasn’t necessarily trying to consciously fuck with people. It’s far more basic, if we consider his other actions: he wanted folks to like him. And other people wanted to believe him, so that he would like them, so that they could like him and keep working with him. Professors wanted to believe the pair because it made the field look good to have these rockstars. Also, they liked being friends. Lawrence Lessig, a well-known professor at the same university as Gino, was sure she was innocent “because I know her…that’s the strongest reason why I can’t believe this has happened.” Lawrence - I get it, I’ve been there too.
And so, for a long time, the two lying professors were believed. Their research on lying (where they argued that people usually only fudge things a little bit) became extremely well known. It was only when a team of rogue researchers, Data Colada, finally took a look at their research that the whole thing fell apart.
We can’t spot liars because we usually don’t want to, even when others are being difficult. We don’t want to because once we begin to call out individual lies, it all might unravel: the relationship, the fantasy, the trust in our neighbors, the cool job in academia, the whole damn system.
We should feel sorry for everyday liars, and perhaps even people like Ariely and Gino. Psychology research (if the rest of it is to be trusted at all, anyway) also shows that people who lie more assume those around them must also lie this much, and they live in a world of distrust, anxiety, and most of all, loneliness.
“Individuals who report using deception for both vindictive and relational reasons also report experiencing greater loneliness in their lives, even when controlling for their social network size and diversity. It is particularly interesting that relational lies show this pattern, given that they are told with the express purpose of protecting social relationships. That is, even when lies are told to escape conflicts or spare others’ feelings, they are associated with feelings of loneliness. These findings build on previous research which found that people overestimated the benefits of kindness and underestimated the costs of honesty with respect to social connection.”
Notice, again, how much lying is done to spare other people’s feelings; how much it is about being liked and also liking other people.
And notice that the cost of lying is ultimately disconnection. Which makes sense, sadly. For trust and connection are extremely closely tied. To flip this all around for a moment, there’s also a sea of fascinating research on what happens when people don’t feel they have to lie. Those who don’t feel they have to lie do much, much better. ‘Social trust’ cures a variety of national, local, and economic ills, and a lack of it seems to predict less democratic forms of life. Relationships of all kinds seem to work as long as trust is involved; I recently enjoyed reading research about how friends-with-benefits remain friends just fine after the sexual relationship ceases, but only as long as they really were friends in the first place. The level of trust says more than the shape of the relationship. So many forms of relating can work just fine; but only if there is trust. If there isn’t, forget it. Trust is, in this sense, a kind of magic, a near-panacea, one that is predictive of whether friendships, relationships, and even nations go well or not.
Indeed I am always amazed by the incredible flexibility of high-trust relationships, the way that, with the right person, any number of difficult chats can be had and only strengthen the relationship. In high-trust relationships, we can say hard truths, we can screw up, and all can be well, shockingly well. In low-trust relationships, every little wrongdoing multiplies the existing history of hurt, and so every small thing is a crisis. Once one is stuck in this kind of relating to others, lying is often a way to avoid the next blowup.
When I first read bell hooks’s insistence on truth-telling as a form of love, I was honestly annoyed by it. It felt glib, naive, almost “woo”, as in pseudo-spiritual. I was in my twenties, navigating a number of difficult relationships, convinced people wouldn’t like the real me, the me who is (if I’m honest - am I?) painfully driven, somewhat bitchy, promiscuous and blunt. So I hid those aspects of myself as best I could (which was poorly), and then felt baffled when various relationships went wrong.
But it turns out hooks is right, not just philosophically but empirically. Love and closeness require telling the truth; in fact, that is what love is about.
If you want to spot a liar, find someone who is insecure and wants social status, who has a lot to gain if they can make someone believe the lie. And if you want to spot a successful liar, find someone who has a pool of people with motivations, however perverse, to believe them.
Most politicians and political figures are, unfortunately, exactly of this description. They want to be liked, and they have a lot to gain from being liked. And they usually have cultivated a pool of people who think they have a lot to gain from believing them.
We should feel sorry for the lady who lied to me in my social world; she clearly was having a bad, lonely, paranoid time and managed to make that even worse for herself by lying to a number of people who later compared notes. But it’s another matter when it comes to those in positions of power. We don’t need to feel sorry for them, because doing so is a form of self-harm when we need to be engaged in self-defense. And we shouldn’t be surprised by the lies in politics, because a politician or political actor is more or less always in exactly the position that incentivises lying: they need to be liked, they have a lot to lose if they’re not, and a lot to gain with a successful lie. Politicians are going to be lying to us all the time from now on, and we need to get used to this. Or, to put it more bluntly: no one should be surprised that Joe Biden is lying. I do not say this as a matter of personal anger or dissatisfaction; I do not particularly dislike Joe Biden compared to other members of his party. I say this because it is always a naïve move to believe politicians. Politicians have huge motivations to want to be liked and to want to look good, and they have a whole chain of people who are extremely motivated to believe them, so that the lies can be sustained.
To do well at political life, unlike other areas of life, we have to get used to looking for lies and calling them out. This is hard work, because it means we first have to acknowledge, at some level, our own propensity to believe the lie, and to want to believe the lie. I have discovered, through the aforementioned social adventure and many others besides, that I hate scrutinising people in this way. I hate assuming others might be lying. I want to believe other people; despite my decade of training in “critical theory”, despite my grumpy demeanour, I want to like people, even at the cost of the truth. I don’t want to live in a world that is full of people trying to deceive me.
But when it comes to people in power, we have to.
It’s worth saying that once lying is common, the value of speech goes down. And that is simply true right now. Words weigh little, and we should discount them. Liberalism, with its myths about the effectiveness of a “marketplace of ideas” and its reliance on discourse rather than an analysis of power, has baseline assumptions of truthfulness that we can’t afford anymore. A good deal of liberal culture assumes that, with enough sunlight and chat, we can get to the bottom of things, and that will right everything.
This doesn’t work if enough people are lying, which they will be if the incentives are high enough. It especially doesn’t work if we follow our instinct to believe rather than to question. And in our political world, a world of highly unequal social relationships, the incentives are high. In other words: in the average social world we only have so many reasons to lie. It seems that people lie when they feel they don’t have enough status or respect from others. (Again, the woman who lied to me seemed to have few friends, for a variety of reasons.) Nevertheless, most people in my social world are probably only lying sometimes.
But in politics there are far more unequal power relations and far more reasons to lie. (There is also none of the magic of social trust, nor the true benefits of intimacy.) So when it comes to politics, we have to look for lies, expect lies, and call them out, rather than give in to our tendency to believe people. We can’t cave to the normal human gravitational impulse of wanting to be liked, or, more dangerously in this case, wanting to like.
We also can’t shy away from the “creepiness” of being lied to. When the woman in my social world lied to me, I felt a sense of deep unease. Recognising the lie forced me to live in a world where I then had to consider that all kinds of other people might be lying to me. It made me pull away from the social world this person was part of, distrusting our mutual friends. It fucked me up, at least for a little. And it forced me to live my life differently. And that wasn’t fun - but it was right, because it was a true response to the truth. We have to do the same when it comes to politics. We need to be able to recognise this uneasy feeling, this wariness that comes when people lie to us, and bear it, so that we can also bear the reality of situations where people are lying.
As for that real trust, and all the magic of true intimacy: that’s what we reserve for the people who tell us what we don’t want to hear.