Cunningham’s Law

Cunningham’s Law is the observation that the best way to get the right answer is not to ask a question; it’s to post a wrong answer. So, if you want to know the leading causes of World War I, go to a history forum and post, “World War One was entirely caused by the British.” Sit back, grab some popcorn, and wait for the angry — yet probably informed — corrections to come flying in.

Socrates did a lot of this. He would sit on a public bench and talk to whoever happened to sit next to him, often opening his dialogues by presenting a false or deeply flawed argument. He would feign agreement with whatever his partner said, then raise a seemingly innocuous question to challenge that position.

“Socratic irony” is the practice of pretending to be ignorant of something in order to gain greater clarity about it. In short, it’s a lot like Cunningham’s Law.

Here are two ways you can use Cunningham’s Law:

The Bad Option: Have you ever been in a group where no one can decide what to do, so everyone hovers in an awkward, polite limbo? “What restaurant shall we go to?” gets met with total silence. Instead, try saying, “Let’s go to McDonald’s” and watch how others object and go on to offer other ideas.

The Coin Toss: If you’re unsure about a life decision — like “Should I read this book or that book next?” or “Should I leave my job or not?” — do a coin toss. Heads you do X, tails you do Y. You are not actually going to live by the coin’s decision, but you should note your reaction to whatever it lands on. Were you upset by the result? Are you secretly relieved? It’s a good way to elicit your true thoughts on a topic.

Jonny Thomson writing in BigThink
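
The coin toss translates neatly into a few lines of code. Here is a minimal sketch in Python (the option names and prompts are illustrative, not from Thomson’s piece):

```python
import random

def coin_toss_elicitation(heads_option, tails_option):
    """Flip a coin between two options, then probe your gut reaction.
    The point is not to obey the coin but to notice how you feel about it."""
    result = heads_option if random.random() < 0.5 else tails_option
    print(f"The coin says: {result}")
    reaction = input("Gut check: relieved or disappointed? ")
    print(f"Your reaction ({reaction!r}) is the real answer, not the coin.")

coin_toss_elicitation("leave the job", "stay in the job")
```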

Thoughtful discourse on college campuses

The capacity to entertain different views is vital not only on a college campus but also in a pluralistic and democratic society. With shouting matches replacing thoughtful debate everywhere, from the halls of Congress to school-board meetings, a college campus might be the last, best place where students can learn to converse, cooperate, and coexist with people who see the world differently. 

The University of Chicago famously enshrined this principle in a 2014 report by a faculty committee charged with articulating the university’s commitment to uninhibited debate. “It is not the proper role of the university,” the Chicago Principles read, “to attempt to shield individuals from ideas and opinions they find unwelcome, disagreeable, or even deeply offensive.” 

Daniel Diermeier writing in the Chronicle of Higher Education

Managing Yourself

If you understand how you think and work, you have more control over who you will become. Abilities can improve as you understand how your mind works.

Creative and critically thinking people open a conversation with themselves that allows them to understand, control, and improve their own minds and work.

Ken Bain, What the Best College Students Do

A healthy balance between model building and data gathering

Too much theory without data, and speculations run amok. We get lost in a fog of models and idealizations that seldom have much to say about the world we live in; the maps invent all sorts of worlds, leaving us lost in fantasy. With too much data and no theory, though, we drown in confusion. We don’t know how to tell the story we are supposed to tell. We hear all sorts of tales about what is out there in the wilderness, but we don’t know how to chart the best path to reach our destination. The better the balance between speculative thinking and data gathering, the healthier the science that comes out.

Marcelo Gleiser writing in BigThink

How Feelings Help You Think

If you’re in a grocery store, and you’re hungry, everyone knows you’re going to buy more stuff. You go into the store, you have certain data. If you go when you’re in a non-hungry state, you have all that data in front of you, and all those choices to make, and you make a series of choices. If you go when you’re in a hungry state, same data, same information, and you make totally different decisions. That’s a good illustration of what emotions do. The emotions are a framework for your logical processing. It affects how you evaluate data, how skeptical you are of certain ideas versus how accepting you are of those same ideas. Your brain doesn’t process in a vacuum.

Leonard Mlodinow, quoted in GQ

The Best Teachers

The best teachers … sometimes discard or place less emphasis on traditional goals in favor of the capacity to comprehend, to use evidence to draw conclusions, to raise important questions, and to understand one’s thinking. In most disciplines, that means they emphasize comprehension, reasoning, and brilliant insights over memory, order, punctuality, or the spick-and-span. Spelling, the size of margins or fonts, and the style of footnotes and bibliographies are trivial in comparison to the power to think on paper; conceptual understanding of chemistry is more important than remembering individual details; the capacity to think about one’s thinking — to ponder metacognitively — and to correct it in progress is far more worthy than remembering any name, date, or number.

The ability to understand the principles and concepts in thinking critically through a problem outranks any capacity to reach the correct answer on any particular question. These teachers want their students to learn to use a wide range of information, ideas, and concepts logically and consistently to draw meaningful conclusions. They help their students achieve those levels by providing meaningful directions and exemplary feedback that quietly yet forcefully couple lofty ideals with firm confidence in what students can do — without making any judgments of their worth as human beings. Most significantly, they help students shift their focus from making the grade to thinking about personal goals of development.

Ken Bain, What the Best College Teachers Do

False equivalence

False equivalence means that you think (or are told) two things should have equal weight in your decision-making. If one opinion has solid data supporting it but the other is conjecture, they are not equivalent in quality.

False equivalence leads people to believe two separate things are equally bad, or equally good. A look into how damaging this thought process is can be found in Isaac Asimov’s article, “The Relativity of Wrong.” Asimov wrote, “When people thought the Earth was flat, they were wrong. When people thought the Earth was spherical, they were wrong. But if you think that thinking the Earth is spherical is just as wrong as thinking the Earth is flat, then your view is wronger than both of them put together. The basic trouble, you see, is that people think that ‘right’ and ‘wrong’ are absolute; that everything that isn’t perfectly and completely right is totally and equally wrong.”

Stephanie Sarkis writing in Forbes

Teaching Critical Thinking

Give (children) many opportunities to use their reasoning abilities as they tackle fascinating problems and receive challenges to their thinking. Ask them to consider the implications of their reasoning, implications for themselves, for the way they view the world, for policy debates, for significant philosophical questions, or even for moral or religious issues. (Help them determine) what intellectual standards (can) test proposed answers and (how) to weigh conflicting claims about the “truth”. Help (them) learn to assess their own work using those standards. Ask them about their assumptions and about the concepts and evidence they employ in their reasoning.

Ken Bain, What the Best College Teachers Do

Inconvenient Conclusions

Conspiracy theories may be deployed as a rhetorical tool to escape inconvenient conclusions. People selectively appeal to a conspiracy among scientists to explain away a scientific consensus when their political ideology compels them to do so—but not when the scientific consensus is of no relevance to their politics.

Stephan Lewandowsky & John Cook, The Conspiracy Theory Handbook

The persecuted victim

Conspiracy theorists perceive and present themselves as the victim of organized persecution. At the same time, they see themselves as brave antagonists taking on the villainous conspirators. Conspiratorial thinking involves a self-perception of simultaneously being a victim and a hero.

Stephan Lewandowsky & John Cook, The Conspiracy Theory Handbook

Sunk Cost Fallacy

We all have seen the sunk cost fallacy in action at some point, whether it be sitting through that bad movie because we have already paid for it or finishing that awful book because we were already halfway through. Everyone has been in a situation where they ended up wasting more time because they were trying to salvage the time they had already invested. A sunk cost, also known as a retrospective cost, is one that has already been incurred and cannot be recovered by any additional action. The sunk cost fallacy refers to the tendency of human beings to make decisions based on how much of an investment they have already made, which leads to even more investment but no returns whatsoever. Sometimes, hard as it is, the best thing to do is to let go.

A way to save yourself from this cognitive bias is to focus on future benefits and costs rather than on past costs that are already lost. You have to develop the habit, hard as it is, of ignoring sunk costs.

Rahul Agarwal writing in Built In
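
Agarwal’s advice amounts to a simple design rule: a decision should take only future quantities as inputs. A minimal Python sketch (the dollar figures are hypothetical):

```python
def should_continue(future_benefit, future_cost):
    """Decide based only on what lies ahead; past spending is not an input."""
    return future_benefit > future_cost

# Hypothetical numbers: $500 already spent on a project (sunk),
# finishing would cost $300 more and deliver $200 in value.
sunk_cost = 500  # deliberately NOT passed to should_continue
print(should_continue(future_benefit=200, future_cost=300))  # False: let go
```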

How to Identify Adaptable People

How can you determine whether a job candidate is willing to constantly revise their understanding and reconsider problems they thought they’d already solved? Ask: “Tell me about a goal you didn’t manage to achieve. What happened? What did you do as a result?”

Most candidates will take responsibility for failing. (People who don't are people you definitely don't want to hire.) Good candidates don't place the blame on other people or on outside factors. They recognize that few things go perfectly, and a key ingredient of success is having the ability to adjust.   

Smart people take responsibility. And they also learn key lessons from the experience, especially about themselves. They see failure as training. That means they can describe, in detail, what perspectives, skills, and expertise they gained from that training. And they can admit where they were wrong — and how they were willing and even eager to change their minds.

Jeff Haden writing in Inc.

Availability Bias

Have you ever said something like, “I know that [insert a generic statement here] because [insert one single example].” For example, someone might say, “You can’t get fat from drinking beer, because Bob drinks a lot of it, and he’s thin.” If you have, then you’ve suffered from availability bias. You are trying to make sense of the world with limited data.

We naturally tend to base decisions on information that is already available to us, or on things we hear about often, without looking at alternatives that might be useful. As a result, we limit ourselves to a very specific subset of information.

This happens often in the data science world. Data scientists tend to work with data that’s easier to obtain rather than looking for data that is harder to gather but might be more useful. We make do with models that we understand and that are available to us in a neat package rather than something more suitable for the problem at hand but much more difficult to come by.

A way to overcome availability bias in data science is to broaden our horizons. Commit to lifelong learning. Read. A lot. About everything. Then read some more. Meet new people. Discuss your work with other data scientists at work or in online forums. Be more open to suggestions about changes you may have to make in your approach. By opening yourself up to new information and ideas, you can make sure that you’re less likely to work with incomplete information.

Rahul Agarwal writing in Built In

We can be too clever for our own good

Unthinking is the ability to apply years of learning at the crucial moment by removing your thinking self from the equation. Its power is not confined to sport: actors and musicians know about it too, and are apt to say that their best work happens in a kind of trance. Thinking too much can kill not just physical performance but mental inspiration. Bob Dylan, wistfully recalling his youthful ability to write songs without even trying, described the making of “Like a Rolling Stone” as a “piece of vomit, 20 pages long”. It hasn’t stopped the song being voted the best of all time.

In less dramatic ways the same principle applies to all of us. A fundamental paradox of human psychology is that thinking can be bad for us. When we follow our own thoughts too closely, we can lose our bearings, as our inner chatter drowns out common sense. A study of shopping behaviour found that the less information people were given about a brand of jam, the better the choice they made. When offered details of ingredients, they got befuddled by their options and ended up choosing a jam they didn’t like.

If a rat is faced with a puzzle in which food is placed on its left 60% of the time and on the right 40% of the time, it will quickly deduce that the left side is more rewarding, and head there every time, thus achieving a 60% success rate. Young children adopt the same strategy. When Yale undergraduates play the game, they try to figure out some underlying pattern, and end up doing worse than the rat or the child. We really can be too clever for our own good.

Ian Leslie, writing in The Economist
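
Leslie’s numbers are easy to verify. The rat’s always-pick-left strategy wins 60% of the time, while guessing left at the same 60% frequency (one common account, called probability matching, of how the pattern-seeking undergraduates behave) wins only 0.6 × 0.6 + 0.4 × 0.4 = 52%. A quick Python simulation:

```python
import random

TRIALS = 100_000
P_LEFT = 0.6  # food appears on the left 60% of the time

maximize_hits = match_hits = 0
for _ in range(TRIALS):
    food_left = random.random() < P_LEFT
    # Rat's strategy: always choose the more rewarding side.
    maximize_hits += food_left
    # Pattern-seeking strategy: guess left at the same 60% rate.
    guess_left = random.random() < P_LEFT
    match_hits += (guess_left == food_left)

print(f"Always left:       {maximize_hits / TRIALS:.1%}")  # about 60%
print(f"Probability match: {match_hits / TRIALS:.1%}")     # about 52%
```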

Take a deep breath

A famous classical musician slipped on jeans and a baseball cap. He then took his million-dollar Stradivarius violin into the Washington, DC metro and played for passengers. They were not impressed because he didn't look the part of a professional. He wasn’t wearing concert attire or playing in a concert hall. 

A mother brought her toddler into the emergency room of a hospital for three straight days. But doctors dismissed the girl’s symptoms because of the mother’s behavior. On the third visit, the girl died.

In their book Sway: The Irresistible Pull of Irrational Behavior, Ori Brafman and Rom Brafman say these are examples of premature labeling. 

Before you determine the inherent worth of someone or something, take a deep breath and make sure your first impression isn’t keeping you from seeing what's actually going on. 

Stephen Goforth

Extraordinary claims (require extraordinary evidence)

For some people, the less likely an explanation, the more likely they are to believe it. Take flat-Earth believers. Their claim rests on the idea that all the pilots, astronomers, geologists, physicists, and GPS engineers in the world are intentionally coordinating to mislead the public about the shape of the planet. From a prior odds perspective, the likelihood of a plot so enormous and intricate coming together out of all other conceivable possibilities is vanishingly small. But bizarrely, any demonstration of counterevidence, no matter how strong, just seems to cement their worldview further.

Liv Boeree writing in Vox   
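
Boeree’s “prior odds perspective” is Bayes’ rule in odds form: posterior odds = likelihood ratio × prior odds. A small Python sketch with purely illustrative numbers (not from her article):

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form: posterior = likelihood ratio * prior."""
    return likelihood_ratio * prior_odds

# Illustrative prior: a worldwide flat-Earth conspiracy among pilots,
# astronomers, geologists, physicists, and GPS engineers gets odds of
# one in a billion.
prior = 1e-9
# Even evidence 1,000 times likelier under the conspiracy hypothesis...
print(posterior_odds(prior, 1_000))  # 1e-06: still a one-in-a-million claim
```

Extraordinary claims start so far behind that ordinary evidence cannot catch them up; only extraordinary evidence moves the needle.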

Motivated Reasoning 

When we identify too strongly with a deeply held belief, idea, or outcome, a plethora of cognitive biases can rear their ugly heads. Take confirmation bias, for example. This is our inclination to eagerly accept any information that confirms our opinion, and undervalue anything that contradicts it. It’s remarkably easy to spot in other people (especially those you don’t agree with politically), but extremely hard to spot in ourselves because the biasing happens unconsciously. But it’s always there. 

This bias shows up everywhere: criminal cases where jurors unconsciously ignore exonerating evidence and send an innocent person to jail because of a bad experience with someone of the defendant’s demographic; the growing inability to hear alternative arguments in good faith from other parts of the political spectrum; conspiracy theorists swallowing any unconventional belief they can get their hands on.

We all have some deeply held belief that immediately puts us on the defensive. Defensiveness doesn’t mean that belief is actually incorrect. But it does mean we’re vulnerable to bad reasoning around it. And if you can learn to identify the emotional warning signs in yourself, you stand a better chance of evaluating the other side’s evidence or arguments more objectively.

Liv Boeree writing in Vox    

The Madman’s Narrative

Consider that two people can hold incompatible beliefs based on the exact same data. Does this mean that there are possible families of explanations and that each of these can be equally perfect and sound? Certainly not. One may have a million ways to explain things, but the true explanation is unique, whether or not it is within our reach. 

In a famous argument, the logician W.V. Quine showed that there exist families of logically consistent interpretations and theories that can match a given series of facts. Such insight should warn us that the mere absence of nonsense may not be sufficient to make something true.

Nassim Taleb, The Black Swan

No one is completely immune

Psychological research shows that misinformation is cleverly designed to bypass careful analytical reasoning, meaning that it can easily slip under the radar of even the most intelligent and educated people. No one is completely immune. Indeed, there is now evidence that smarter people may sometimes be even more vulnerable to certain ideas, since their greater brainpower simply allows them to rationalise their (incorrect) beliefs.

David Robson writing in The Guardian 

Be a Poet

In 2016, educational psychologists Denis Dumas and Kevin Dunbar found that people who try to solve creative problems are more successful if they behave like an eccentric poet than like a rigid librarian. Given a test in which they have to come up with as many uses as possible for an object (e.g., a brick), those who behave like eccentric poets show superior creative performance. The finding holds even when the same person adopts each identity in turn. When in a creative deadlock, try this exercise of embodying a different identity. It will likely get you out of your own head and allow you to think from another person’s perspective. I call this psychological halloweenism.

Srini Pillay writing in the Harvard Business Review