A new approach to lie detection

Researchers from the University of Amsterdam's Leugenlab (Lie Lab) have developed a new approach to lie detection through a series of lab experiments.

Participants were free to use all possible signals—from looking people in the eye to looking for nervous behavior or a particularly emotional story—to assess whether someone was lying.

In this situation, they found it difficult to distinguish lies from truths and scarcely performed above chance level. When instructed to rely only on the amount of detail (place, person, time, location) in the story, they were consistently able to discern lies from truths.

Bachelor's students from the UvA and Master's students from the UvA and the UM carried out data collection, control experiments and replication studies for the research in the context of their theses. 

Read more online at the University of Amsterdam

Your Inner Voice Can Mislead You

It’s very disturbing when you realize that our brains are fiction-making machines. We make up all kinds of crazy things to help us feel better and to justify the decisions that we’ve made. The inner voice is the one who arbitrates a lot of that maneuvering around the truth, so we have to be very careful. It’s a master storyteller and far more important than you may realize.

Jim Loehr, performance psychologist and cofounder of the Human Performance Institute, quoted in Fast Company

Imagination inflation can lead to false memories

Imagination inflation refers to the tendency of people who are asked to imagine an event vividly to sometimes begin to believe, when asked about it later, that the event actually occurred. Adults who were asked "Did you ever break a window with your hand?" were more likely on a later life inventory to report that they believed this event occurred during their lifetimes. It seems that asking the question led them to imagine the event, and the act of having imagined it had the effect, later, of making them more likely to think it had occurred (relative to another group who answered the question without having previously imagined it occurring).

Accounts that sound familiar can create the feeling of knowing and be mistaken for true. This is one reason that political or advertising claims that are not factual but are repeated can gain traction with the public, particularly if they have emotional resonance. Something you once heard that you hear again later carries a warmth of familiarity that can be mistaken for memory, a shred of something you once knew and cannot quite place but are inclined to believe. In the world of propaganda, this is called "the big lie" technique—even a big lie told repeatedly can come to be accepted as truth.

Peter C. Brown and Henry L. Roediger III, Make It Stick: The Science of Successful Learning

What does it mean for a human being to possess the truth

Kierkegaard’s concern is really not with the adequacy of a philosophical theory of truth, but with the question of what it means for a human being to possess the truth. To grasp the significance of this, we must not think of truth in the way characteristic of contemporary philosophy, focusing on the properties of propositions, but in the way ancient thinkers conceived of truth. For Socrates and Plato, at least as Kierkegaard understood them, having the truth meant having the key to human life, possessing that which makes it possible to live life as it was intended to be lived.

C. Stephen Evans, Introduction: Kierkegaard’s life and works

Availability bias

People give their own memories and experiences more credence than they deserve, making it hard to accept new ideas and theories. Psychologists call this quirk the availability bias. It’s a useful built-in shortcut when you need to make quick decisions and don’t have time to critically analyze lots of data, but it messes with your fact-checking skills.

Marc Zimmer writing in The Conversation

Why are conspiracy theories popular?

People who feel powerless or vulnerable are more likely to endorse and spread conspiracy theories. This is seen in online forums where people’s perceived level of threat is strongly linked to proposing conspiracy theories. Conspiracy theories allow people to cope with threatening events by focusing blame on a set of conspirators. People find it difficult to accept that “big” events (e.g., the death of Princess Diana) can have an ordinary cause (driving while intoxicated). A conspiracy theory satisfies the need for a “big” event to have a big cause, such as a conspiracy involving MI5 to assassinate Princess Diana. For the same reason, people tend to propose conspiratorial explanations for events that are highly unlikely. Conspiracy theories act as a coping mechanism to help people handle uncertainty.

Stephan Lewandowsky & John Cook, The Conspiracy Theory Handbook

How does this information make me feel?

We don’t need to become emotionless processors of numerical information – just noticing our emotions and taking them into account may often be enough to improve our judgment. Rather than requiring superhuman control of our emotions, we need simply to develop good habits. Ask yourself: how does this information make me feel? Do I feel vindicated or smug? Anxious, angry or afraid? Am I in denial, scrambling to find a reason to dismiss the claim?

Before I repeat any statistical claim, I first try to take note of how it makes me feel. It’s not a foolproof method against tricking myself, but it’s a habit that does little harm, and is sometimes a great deal of help. Our emotions are powerful. We can’t make them vanish, and nor should we want to. But we can, and should, try to notice when they are clouding our judgment.

Tim Harford, How to Make the World Add Up

The Backfire Effect

Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.

David McRaney  

You Have Your Truth, I Have Mine

Some people... maintain that morality is not dependent on society but rather on the individual. “Morality is in the eye of the beholder.” They treat morality like taste or aesthetic judgments: person-relative.

On the basis of (moral) subjectivism, Adolf Hitler and serial murderer Ted Bundy could be considered as moral as Gandhi, as long as each lived by his own standards, whatever those might be.

Although many students say they espouse subjectivism, there is evidence that it conflicts with others of their moral views. They typically condemn Hitler as an evil man for his genocidal policies. A contradiction seems to exist between subjectivism and the very concept of morality.

Louis Pojman, Ethical Theory

tell me a story

We naturally avoid ambiguity. We want black and white, right or left, up or down. The greys of life are so distasteful that when a cause is attached to any set of facts, we assume the "facts" are more likely to have really happened.

Nassim Taleb in his book The Black Swan points out that if you ask someone, "How many people are likely to have lung cancer in the U.S.?" you might get a response like "half a million." But if you make one change to the question and ask, "How many people are likely to have lung cancer in the U.S. because of smoking cigarettes?" you would get a much higher number. Why is that? Taleb suggests we tend to believe an idea is more likely to be true when a cause is attached to it.

Joey seemed happily married but killed his wife.

Joey seemed happily married but killed his wife to get her inheritance. 

The first is broader and accommodates more possibilities. The second statement is more specific and less likely to be true. But if you ask people which is more likely, more of them would say the second statement. Why? The second statement tells us a story.
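(An editorial aside, not part of Goforth's text: the formal version of this point is the conjunction rule of probability, which says a joint claim can never be more probable than either of its parts. Written out for the two statements above, with A = "Joey killed his wife" and B = "he did it for her inheritance":

P(A and B) ≤ P(A)

However much more convincing the added motive makes the story feel, attaching it can only leave the probability unchanged or lower it.)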

The narrative misguides us. We want an explanation, a back story. That's why it’s hard for us to look at a series of facts without weaving an explanation into them and tying the facts to the because. We like a good story, even when it misleads us about what is true. That's why you should be careful whenever you come across a because. Connecting causes to particular events must be handled with care.

Stephen Goforth

Casting Doubt

A lot of people still think of propaganda as the art of making lies sound truthful, but...they want to make truthfulness an irrelevant category. It’s not about proving something, it’s about casting doubt. Most political ideologies have not been about casting doubt — they’ve claimed to be telling the truth about the way the world is or should be. But this new propaganda is different. Putin isn’t selling a wonderful communist future. He’s saying, we live in a dark world, the truth is unknowable, the truth is always subjective, you never know what it is, and you, the little guy, will never be able to make sense of it all — so you need to follow a strong leader.

Sean Illing & Peter Pomerantsev in a Vox interview

When the facts change

According to David Perkins of Harvard University, the brighter people are, the more deftly they can conjure up post-hoc justifications for arguments that back their own side. Brainboxes are as likely as anyone else to ignore facts which support their foes. John Maynard Keynes, a (famously intelligent) British economist, is said to have asked someone: “When the facts change, I change my mind. What do you do, sir?” If they were honest, most would reply: “I stick to my guns.”

from The Economist 

Few people can detect a liar

In daily life, without the particular pressures of politics, people find it hard to spot liars. Tim Levine of the University of Alabama at Birmingham has spent decades running tests that allow participants (apparently unobserved) to cheat. He then asks them on camera if they have played fair. He asks others to look at the recordings and decide who is being forthright about cheating and who is covering it up. In 300 such tests people got it wrong about half of the time, no better than a random coin toss. Few people can detect a liar. Even those whose job is to conduct interviews to dig out hidden truths, such as police officers or intelligence agents, are no better than ordinary folk.

The Economist 

Conspiracy Theories

A conspiracy theory is an attempt to force a story on a set of disparate, though often distantly related facts and observations. But the real world is not a narrative, not a clever mystery to be unraveled by amateur detectives. Every baroque edifice of conspiracy rests upon a foundational belief that there is a singular truth that diligent investigation will reveal, even if the shape of that truth branches and swirls in an infinite fractal. What this mindset cannot accept is that there may be many simple truths for many disturbing facts.

Jacob Bacharach writing in The Outline

Why We Lie

A life of total dedication to the truth means... a life of total honesty. It means a continuous and never-ending process of self-monitoring to assure that our communications – not only the words that we say but also the way we say them – invariably reflect as accurately as humanly possible the truth or reality as we know it. Such honesty does not come painlessly. The reason people lie is to avoid the pain of challenge and its consequences.

M Scott Peck, The Road Less Traveled

The Madman’s Narrative

Consider that two people can hold incompatible beliefs based on the exact same data. Does this mean that there are possible families of explanations and that each of these can be equally perfect and sound? Certainly not. One may have a million ways to explain things, but the true explanation is unique, whether or not it is within our reach. 

In a famous argument, the logician W. V. Quine showed that there exist families of logically consistent interpretations and theories that can match a given series of facts. Such insight should warn us that mere absence of nonsense may not be sufficient to make something true.

Nassim Taleb, The Black Swan

What Makes People Susceptible to Fake News

Susceptibility to fake news is driven more by lazy thinking than by partisan bias. Which on one hand sounds—let's be honest—pretty bad. But it also implies that getting people to be more discerning isn't a lost cause. Changing people's ideologies, which are closely bound to their sense of identity and self, is notoriously difficult. Getting people to think more critically about what they're reading could be a lot easier, by comparison.

Then again, maybe not. 

Anyone who has sat and stared vacantly at their phone while thumb-thumb-thumbing to refresh their Twitter feed, or closed out of Instagram only to re-open it reflexively, has experienced firsthand what it means to browse in such a brain-dead, ouroboric state. Default settings like push notifications, autoplaying videos, algorithmic news feeds—they all cater to humans' inclination to consume things passively instead of actively, to be swept up by momentum rather than resist it. 

This isn't baseless philosophizing; most folks just tend not to use social media to engage critically with whatever news, video, or sound bite is flying past. As one recent study shows, most people browse Twitter and Facebook to unwind and defrag—hardly the mindset you want to adopt when engaging in cognitively demanding tasks.

David Rand—a behavioral scientist at MIT—says he has experiments in the works that investigate whether nudging people to think about the concept of accuracy can make them more discerning about what they believe and share. In the meantime, he suggests confronting fake news espoused by other people not necessarily by lambasting it as fake, but by casually bringing up the notion of truthfulness in a non-political context. You know: just planting the seed. It won't be enough to turn the tide of misinformation. But if our susceptibility to fake news really does boil down to intellectual laziness, it could make for a good start.

Robbie Gonzalez writing in Wired Magazine 

Spotting Liars

We’re bad at accurately interpreting behavior and speech patterns, said James Alcock, professor of psychology at Canada’s York University. Learning is based on getting regular feedback, he told me. Try to add 2 + 2 and someone will tell you whether you got it right or wrong. Over time, that feedback allows you to know when you’re right. But there’s no systematic un-blinding to tell you when you correctly guessed whether you were being lied to. The feedback we get on this is spotty. Often there is none. Sometimes the feedback itself is incorrect. There’s never a chance to really learn and get better, Alcock said. “So why should we be good at it?”

Take people whose job it is to professionally detect lies — judges, police officers, customs agents. Studies show they believe themselves to be better than chance at spotting liars. But the same studies show they aren’t, Alcock said. And that makes sense, he told me, because the feedback they get misleads them. Customs agents, for instance, correctly pull aside smugglers for searches just often enough to reinforce their sense of their own accuracy. But “they have no idea about the ones they didn’t search who got away,” Alcock said.

Maggie Koerth-Baker, writing in FiveThirtyEight