We’re hardwired to delude ourselves

When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. 

If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view. Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.

Ben Yagoda writing in The Atlantic 

a mental short-cut that can lead us away from truth

Imagine I tell you that a group of 30 engineers and 70 lawyers have applied for a job. I show you a single application that reveals a person who is great at math and bad with people, a person who loves Star Wars and hates public speaking, and then I ask whether it is more likely that this person is an engineer or a lawyer. What is your initial, gut reaction? What seems like the right answer?

Statistically speaking, it is more likely the applicant is a lawyer. But if you are like most of the people in the original studies, you ignored the odds when checking your gut. You tossed the numbers out the window. So what if there is a 70 percent chance this person is a lawyer? That doesn’t feel like the right answer.

That’s what a heuristic is: a simple rule that, in the currency of mental processes, trades accuracy for speed. A heuristic can lead to a bias, and your biases, though often correct and harmless, can be dangerous when in error, resulting in a wide variety of bad outcomes, from foggy-morning car crashes to unconscious prejudices in job interviews.

David McRaney writing in BoingBoing
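
To make the base rate in McRaney’s example explicit, here is a minimal sketch of the arithmetic, assuming the personality description carries no reliable information about the applicant’s profession:

P(lawyer) = 70 / (70 + 30) = 0.7        P(engineer) = 30 / (70 + 30) = 0.3

If the description really were diagnostic, a careful judgment would weigh it against this 70/30 prior; the bias lies in discarding the prior altogether and going only by how “engineer-like” the description feels.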

the halo effect

If you like the president’s politics, you probably like his voice and his appearance as well. The tendency to like (or dislike) everything about a person–including things you have not observed–is known as the halo effect. The term has been in use in psychology for a century, but it has not come into wide use in everyday language. This is a pity, because the halo effect is a good name for a common bias that plays a large role in shaping our view of people and situations. It is one of the ways the representation of the world that System 1 generates is simpler and more coherent than the real thing.

You meet a woman named Joan at a party and find her personable and easy to talk to. Now her name comes up as someone who could be asked to contribute to a charity. What do you know about Joan's generosity? The correct answer is that you know virtually nothing, because there is little reason to believe that people who are agreeable in social situations are also generous contributors to charities. But you like Joan, and you will retrieve the feeling of liking her when you think of her. You also like generosity and generous people. By association, you are now predisposed to believe that Joan is generous. And because you believe she is generous, you probably like Joan even better than you did earlier, because you have added generosity to her pleasant attributes.

The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.

Daniel Kahneman, Thinking, Fast and Slow

the strongest political bias of all

The strongest bias in American politics is not a liberal bias or a conservative bias; it is a confirmation bias, or the urge to believe only things that confirm what you already believe to be true. Not only do we tend to seek out and remember information that reaffirms what we already believe, but there is also a “backfire effect,” which sees people doubling down on their beliefs after being presented with evidence that contradicts them. So, where do we go from here? There’s no simple answer, but the only way people will start rejecting falsehoods being fed to them is by confronting uncomfortable truths.

Emma Roller writing in the New York Times

wired to blunder

We're hardwired to make blunders, and avoiding them requires nearly superhuman discipline. Four tendencies conspire to sabotage our decisions at critical moments:

OVERCONFIDENCE.

We think we're smarter than we are, so we think the stocks we've chosen will deliver even when the market doesn't. When evidence contradicts us, we're blinded by...

CONFIRMATION BIAS.

We seek information that supports our actions and avoid information that doesn't. We interpret ambiguous evidence in our favor. We can cite an article that confirms our view but can't recall one that challenges it. Even when troubling evidence becomes unavoidable, we come up against...

STATUS QUO BIAS.

We like leaving things the way they are, even when doing so is objectively not the best course. Plenty of theories attempt to explain why, but the phenomenon is beyond dispute. And supposing we could somehow fight past these crippling biases, we'd still face the mother of all irrationalities in behavioral finance...

LOSS AVERSION (and its cousin, regret avoidance).

We hate losing more than we like winning, and we're terrified of doing something we'll regret. So we don't buy and sell when we should because maybe we'll realize a loss or miss out on a gain.

These tendencies are so deep-rooted that knowing all about them isn't nearly enough to extinguish them. The best we can do is wage lifelong war against them and hope to gain some ground.

Geoff Colvin (from his Fortune Magazine article "Investor March Madness: We're Wired for Blunders, but can Improve Odds")