What Makes People Susceptible to Fake News

Susceptibility to fake news is driven more by lazy thinking than by partisan bias. Which on one hand sounds—let's be honest—pretty bad. But it also implies that getting people to be more discerning isn't a lost cause. Changing people's ideologies, which are closely bound to their sense of identity and self, is notoriously difficult. Getting people to think more critically about what they're reading could be a lot easier, by comparison.

Then again, maybe not. 

Anyone who has sat and stared vacantly at their phone while thumb-thumb-thumbing to refresh their Twitter feed, or closed out of Instagram only to re-open it reflexively, has experienced firsthand what it means to browse in such a brain-dead, ouroboric state. Default settings like push notifications, autoplaying videos, algorithmic news feeds—they all cater to humans' inclination to consume things passively instead of actively, to be swept up by momentum rather than resist it. 

This isn't baseless philosophizing; most folks just tend not to use social media to engage critically with whatever news, video, or sound bite is flying past. As one recent study shows, most people browse Twitter and Facebook to unwind and defrag—hardly the mindset you want to adopt when engaging in cognitively demanding tasks.

David Rand—a behavioral scientist at MIT—says he has experiments in the works that investigate whether nudging people to think about the concept of accuracy can make them more discerning about what they believe and share. In the meantime, he suggests confronting fake news espoused by other people not necessarily by lambasting it as fake, but by casually bringing up the notion of truthfulness in a non-political context. You know: just planting the seed. It won't be enough to turn the tide of misinformation. But if our susceptibility to fake news really does boil down to intellectual laziness, it could make for a good start.

Robbie Gonzalez writing in Wired Magazine 

Spotting Liars

We’re bad at accurately interpreting behavior and speech patterns, said James Alcock, professor of psychology at Canada’s York University. Learning is based on getting regular feedback, he told me. Try to add 2 + 2 and someone will tell you whether you got it right or wrong. Over time, that feedback allows you to know when you’re right. But there’s no systematic un-blinding to tell you when you correctly guessed whether you were being lied to. The feedback we get on this is spotty. Often there is none. Sometimes the feedback itself is incorrect. There’s never a chance to really learn and get better, Alcock said. “So why should we be good at it?”

Take people whose job it is to professionally detect lies — judges, police officers, customs agents. Studies show they believe themselves to be better than chance at spotting liars. But the same studies show they aren’t, Alcock said. And that makes sense, he told me, because the feedback they get misleads them. Customs agents, for instance, correctly pull aside smugglers for searches just often enough to reinforce their sense of their own accuracy. But “they have no idea about the ones they didn’t search who got away,” Alcock said.

Maggie Koerth-Baker, writing in FiveThirtyEight


How to Spot a Liar

When an international team of researchers asked some 2,300 people in 58 countries to respond to a single question — “How can you tell when people are lying?” — one sign stood out: In two-thirds of responses, people listed gaze aversion. A liar doesn’t look you in the eye. Twenty-eight percent reported that liars seemed nervous, a quarter reported incoherence, and another quarter that liars exhibited certain little giveaway motions.

It just so happens that the common wisdom is false.

Why do we think we know how liars behave? Liars should divert their eyes. They should feel ashamed and guilty and show the signs of discomfort that such feelings engender. And because they should, we think they do.

The desire for the world to be what it ought to be and not what it is permeates experimental psychology as much as writing, though. There’s experimental bias and the problem known in the field as “demand characteristics” — when researchers end up finding what they want to find by cuing participants to act a certain way. It’s also visible when psychologists choose to study one thing rather than another, or dismiss evidence that doesn’t mesh with their worldview while embracing that which does.

Maria Konnikova writing in the New York Times

Desensitized to Lying

When you’re exposed to a strong smell, at first the smell is extremely noticeable, but eventually you stop noticing it as much. With time, any stimulus — a loud noise, a strong perfume, etc. — is likely to provoke a smaller response. The same goes with lying.

We get desensitized to our own lying as the areas of our brain that correlate with negativity become less active. This makes it easier for us to lie in the future.

“The first time you cheat — let’s say you’re cheating on your taxes — you feel quite bad about it,” said Tali Sharot, a University College London neuroscientist. But then the next time you cheat, you’re less likely to get that negative feeling. That makes it easier to lie again. And the cycle escalates from there.

Brian Resnick writing in Vox


The Backfire Effect

The backfire effect happens when the myth ends up becoming more memorable than the fact. One of the most striking examples of this was seen in a study evaluating a “Myths and Facts” flyer about flu vaccines. Immediately after reading the flyer, participants accurately remembered the facts as facts and the myths as myths. But just 30 minutes later this had been completely turned on its head, with the myths being much more likely to be remembered as “facts”.  The thinking is that merely mentioning the myths actually helps to reinforce them. And then as time passes you forget the context in which you heard the myth – in this case during a debunking – and are left with just the memory of the myth itself.

Mark Lorch writing in Business Insider

tell me a story

We naturally avoid ambiguity. We want black and white, right or left, up or down. The greys of life are so distasteful that when a cause is attached to any set of facts, we assume the "facts" are more likely to have really happened.

Nassim Taleb in his book The Black Swan points out that if you ask someone, "How many people are likely to have lung cancer in the U.S.?" you might get a response like "half a million." But if you make one change to the question and ask, "How many people are likely to have lung cancer in the U.S. because of smoking cigarettes?" you would get a much higher number. Why is that? Taleb suggests we tend to believe an idea is more likely to be true when a cause is attached to it.

Joey seemed happily married but killed his wife.

Joey seemed happily married but killed his wife to get her inheritance.

The first is broader and accommodates more possibilities. The second statement is more specific and less likely to be true. But if you ask people which is more likely, more of them would say the second statement. Why? The second statement tells us a story.

The narrative misguides us. We want an explanation, a back story. That's why it’s hard for us to look at a series of facts without weaving an explanation into them and tying the facts to the because. We like a good story, even when it misleads us about what is true. That's why you should be careful whenever you come across a because. Connecting causes to particular events must be handled with care.

Stephen Goforth

Black Swans

We have a natural tendency to look for instances that confirm our story and our vision of the world.

Seeing white swans does not confirm the nonexistence of black swans. There is an exception, however: I know what statement is wrong, but not necessarily what statement is correct. If I see a black swan I can certify that all swans are not white! If I see someone kill, then I can be practically certain that he is a criminal. If I don’t see him kill, I cannot be certain that he is innocent. The same applies to cancer detection: the finding of a single malignant tumor proves that you have cancer, but the absence of such a finding cannot allow you to say with certainty that you are cancer-free.

We can get closer to the truth by negative instances, not by verification.

Nassim Taleb, The Black Swan

The Madman's Narrative

Consider that two people can hold incompatible beliefs based on the exact same data. Does this mean that there are possible families of explanations and that each of these can be equally perfect and sound? Certainly not. One may have a million ways to explain things, but the true explanation is unique, whether or not it is within our reach.

In a famous argument, the logician W.V. Quine showed that there exist families of logically consistent interpretations and theories that can match a given series of facts. Such insight should warn us that mere absence of nonsense may not be sufficient to make something true.

Nassim Taleb, The Black Swan

How would you answer Peter's question?

Whenever I interview someone for a job, I like to ask this question: “What important truth do very few people agree with you on?” This question sounds easy because it’s straightforward. Actually, it’s very hard to answer. It’s intellectually difficult because the knowledge that everyone is taught in school is by definition agreed upon. And it’s psychologically difficult because anyone trying to answer must say something she knows to be unpopular. Brilliant thinking is rare, but courage is in even shorter supply than genius.

Most commonly, I hear answers like the following:

“Our educational system is broken and urgently needs to be fixed.”

“America is exceptional.”

“There is no God.”

Those are bad answers. The first and the second statements might be true, but many people already agree with them. The third statement simply takes one side in a familiar debate. A good answer takes the following form: “Most people believe in x, but the truth is the opposite of x.”

Peter Thiel

Zero to One: Notes on Startups, or How to Build the Future

building on sand

People who circle the wagons when questioned about their religious creed are usually afraid that what they profess might not be true.

Seldom (if ever?) will you run across a 100% false belief system. There are scattered nuggets of truth in each one. That’s why there are people in every religious, political, and philosophical system who simply accept the group’s views at face value. They grew up in it, they gave in to social pressure, and they joined. Or else they simply are unwilling to come to terms with the fact that they have been walking on the wrong road. Admitting that you’ve invested yourself in something that’s been a waste of your time is not easy. Going back and starting over again is not very appealing. Ultimately, it’s a choice about whether to maintain a comfort level or pursue truth.

If you surround yourself only with things and people who reinforce your belief system, you don't have to worry about your worldview being knocked out from under you (although circumstances have a way of eventually doing this, anyway). The choice ultimately becomes denying reality--or reassessing cherished ideas on which we’ve built our lives.

Stephen Goforth

The Standard

The moment you say that one set of moral ideas can be better than another, you are, in fact, measuring them both by a standard, saying that one of them conforms to that standard more nearly than the other. But the standard that measures two things is something different from either. You are, in fact, comparing them both with some Real Morality, admitting that there is such a thing as a real Right, independent of what people think, and that some people's ideas get nearer to that real Right than others.

Or put it this way. If your moral ideas can be truer, and those of the Nazis less true, there must be something--some Real Morality--for them to be true about.

If the Rule of Decent Behaviour meant simply 'whatever each nation happens to approve,' there would be no sense in saying that any one nation had ever been more correct in its approval than any other; no sense in saying that the world could ever grow morally better or morally worse.

CS Lewis, Mere Christianity

Bent to the Illusion

The Müller-Lyer illusion, a famous optical illusion, (involves) two sets of arrows. The arrows are exactly the same length. But in one case, the ends of the arrows point outward, seeming to signify expansion and boundless potential. In the other case, they point inward, making them seem self-contained and limited. The first case is analogous to how investors see the stock market when returns have been increasing; the second case is how they see it after a crash.

“There’s no way that you can control yourself not to have that illusion,” (Nobel prize winner) Daniel Kahneman told me. “You look at them, and one of the arrows is going to look longer than the other. But you can train yourself to recognize that this is a pattern that causes an illusion, and in that situation, I can’t trust my impressions; I’ve got to use a ruler.”

Nate Silver, The Signal and the Noise

stomping of the foot (before storming out of class)

I'll never forget the student who charged out of one of my first philosophy classes. The professor had challenged the student's view of religion and the young man stomped his foot, turned red, yelled, and left the room.

Why such an emotional outburst? Perhaps his beliefs were built on a weak foundation. A little rhetoric from an authority figure threatened to topple the structure. When we accept the conclusions of other people, never figuring out the "why" for ourselves, we lay a weak foundation. Should we intentionally avoid opposing viewpoints? It turns out we naturally steer clear of conflict.

Researchers at the University of Illinois at Urbana-Champaign found that the less certain you are about what you believe, the more likely you are to stay away from opposing viewpoints (and freak out when you run across an opposing opinion). After reviewing nearly 100 studies, they came to the conclusion that people tend to minimize their exposure when they are less certain and less confident in their own position. In fact, we're nearly twice as likely to completely avoid differing opinions as we are to give consideration to different ideas. For those who are close-minded, the percentage jumps even higher: three out of four times, the close-minded person will stick to what supports their own conclusions. Details of the study appear in the Psychological Bulletin.

Stephen Goforth

Each time You Lie

Each time you lie, even if you’re not caught, you “become a little more of this ugly thing: a liar. Character is always in the making, with each morally valenced action, whether right or wrong, affecting our characters, the people who we are. You become the person who could commit such an act, and how you are known in the world is irrelevant to this state of being.” In the end, who we are inside matters more than what others think of us.

Michael Dirda in a Washington Post review of Plato at the Googleplex by Rebecca Newberger Goldstein

Here are the Rules

When someone gives you rules for your relationship, whether explicitly or implicitly (“We can only talk about these subjects and not those subjects over there” or “We will only go to these places together” or “Only contact me in this particular way”), you have to decide whether this comes out of a legitimate concern to keep the relationship in a healthy place or whether it’s an attempt to control you, prompted by insecurity and fear. In other words, is this a request that you become co-conspirators in hiding from painful truths about the person making the request?

Stephen Goforth

Bumper Sticker Catch Phrases

We need to be careful about staking the important ethical decisions in our lives on bumper sticker catch phrases. The problem is that the ideas expressed in these bite-sized pronouncements have broader implications.

While the ethical aspect that is explicit in the bumper sticker may look good at first glance, other ideas that follow from it may not be so attractive. Most of us have heard or used the cliché “When in Rome, do as the Romans do,” and it can sound like worthwhile advice. But what if the standard practices of the “Romans” stand in direct conflict with your moral or religious convictions? That is why we need to get behind the cliché itself.

Before we commit ourselves to any bumper sticker, we want to make certain that we can accept all that is implied in the slogan.

Steve Wilkens, Beyond Bumper Sticker Ethics

The strange case of the surveillance cameras

This is the story of a statistic; it is sort of a detective story.

The mystery stat was sitting on one of our Times blogs and read “the average Brit is caught on security cameras some 300 times a day” and, God knows why, I just decided to chase the number down and find out where it came from.

The source was given in a footnote as coming from a book The Maximum Surveillance Society, published in 1999, by two academics, including a C. Norris.

So I set to work trying to find the book. In the meantime I mused on two things. First was how the “300 times” had become viral. It now occurs all over the place, and is the standard statistic used for the number of times Britons may or will be captured by cameras daily.

I managed to find a copy of the Norris book online. The footnoted page was towards the back of a chapter detailing a day in the life of a man called Thomas Reams, as he did various things in and around London. The authors wrote “While this contrived account is, of course, a fictional construction, it is a fiction that increasingly mirrors the reality of routine surveillance.”

What? A fiction!

Imagine, for a moment, that the original paragraph had read “and one hypothetical construction managed to have its fictional hero captured 300 times in a single day”. It wouldn't be quite the same, would it? So I began to wonder how the report's authors had failed to notice that an important factual source was fictional. Had they not checked it?

Professor Norris reminded me that the viral 300 - still based on his book - entered the bloodstream long before 2006. And he was right. For example the BBC website carries a 2002 story stating that “the average citizen in the UK is caught on cameras 300 times a day”.

So, that was the story of one statistic in one study.

Every day we hear of several statistics, and every week of several studies. I have no idea whether the “300 times” case is typical, but I fear that it might be, and that, if only there were more time to scrutinise all the claims made in such “reports” - whatever side they take - we would discover many “truths” that just aren't.

David Aaronovitch
(London) Times

grasping truth

Kierkegaard’s concern is really not with the adequacy of a philosophical theory of truth, but with the question of what it means for a human being to possess the truth. To grasp the significance of this, we must not think of truth in the way characteristic of contemporary philosophy, focusing on the properties of propositions, but in the way ancient thinkers conceived of truth. For Socrates and Plato, at least as Kierkegaard understood them, having the truth meant having the key to human life, possessing that which makes it possible to live life as it was intended to be lived.

C Stephen Evans, Introduction: Kierkegaard’s life and works

How we create our own false memories

Some witnesses to crimes who are struggling to recall them are instructed to let their minds roam freely, to generate whatever comes to mind, even if it is a guess. However, the act of guessing about possible events causes people to provide their own misinformation, which, if left uncorrected, they may later come to retrieve as memories.

Suppose the police interview a witness shortly after a crime, showing pictures of possible suspects. Time passes, but eventually the police nab a suspect, one whose picture has been viewed by the witness. If the witness is now asked to view a lineup, he may mistakenly remember one of the suspects whose photo he saw as having been present at the crime.

We cannot remember every aspect of an event, so we remember those elements that have the greatest emotional significance for us, and we fill in the gaps with details of our own that are consistent with our narrative but may be wrong.

Imagination inflation refers to the tendency of people who, when asked to imagine an event vividly, sometimes begin to believe, when asked about it later, that the event actually occurred. Adults who were asked "Did you ever break a window with your hand?" were more likely on a later life inventory to report that they believed this event had occurred during their lifetimes. It seems that asking the question led them to imagine the event, and the act of having imagined it had the effect, later, of making them more likely to think it had occurred (relative to a control group that answered the question without having previously imagined it occurring).

Accounts that sound familiar can create the feeling of knowing and be mistaken for true. This is one reason that political or advertising claims that are not factual but are repeated can gain traction with the public, particularly if they have emotional resonance. Something you once heard and then hear again later carries a warmth of familiarity that can be mistaken for memory, a shred of something you once knew that you cannot quite place but are inclined to believe. In the world of propaganda, this is called "the big lie" technique: even a big lie told repeatedly can come to be accepted as truth.

Peter C. Brown and Henry L. Roediger III, Make It Stick: The Science of Successful Learning