Thinking, Fast and Slow: How We Process and Respond to the World

Remember your Econ teacher asking you to assume that people acted rationally? There are obvious limitations to that assumption. It’s not that people are irrational; it’s that their decisions are influenced by a complex mixture of memories and emotions, both conscious and unconscious. In Thinking, Fast and Slow, Daniel Kahneman unpacks the research into how human beings process and respond to the world.

Kahneman draws a distinction between our automatic and involuntary responses, System 1, and the controlled and reasoned operations of System 2, which “allocates attention to the effortful mental activities that demand it, including complex computations.” System 2 is “associated with the subjective experience of agency, choice, and concentration”; it requires attention and is disrupted when attention is drawn away.

His Nobel Prize-winning exploration of human decision making yielded several core insights:

“We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events. Overconfidence is fed by the illusory certainty of hindsight.

“We can be blind to the obvious, and we are also blind to our blindness.

“The premise of this book is that it is easier to recognize other people’s mistakes than our own.”

System 1 runs ahead of the facts in constructing a rich image on the basis of scraps of evidence. Kahneman writes:

“Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information.

“When System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.

“A lazy System 2 often follows the path of least effort and endorses a heuristic answer without much scrutiny of whether it is truly appropriate.”

System 1 is also affected by mood: “When we are uncomfortable and unhappy, we lose touch with our intuition.”

Our typical automatic response:

  • Infers and invents causes and intentions
  • Neglects ambiguity and suppresses doubt
  • Is biased to believe and confirm
  • Exaggerates emotional consistency
  • Computes more than intended
  • Sometimes substitutes an easier question for a difficult one
  • Is more sensitive to changes than to states
  • Overweights low probabilities
  • Shows diminishing sensitivity to quantity
  • Responds more strongly to losses than to gains
  • Frames decision problems narrowly, in isolation from one another
  • Is influenced by fluency, vividness, and the ease of imagining.

Kahneman warns against common thinking errors:

  • Illusion. “We pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify.”
  • Causality. “Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.”
  • Availability. “The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.”
  • Feedback. “The feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.”
  • Past. “The illusion that one has understood the past feeds the further illusion that one can predict and control the future.”
  • Mind needs. “A simple message of triumph and failure that identifies clear causes and ignores the determinative power of luck and the inevitability of regression. These stories induce and maintain an illusion of understanding, imparting lessons of little enduring value to readers who are all too eager to believe them.”
  • Confidence. “Subjective confidence in a judgment is not a reasoned evaluation of the probability that this judgment is correct. Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it.”
  • Optimism. “The evidence suggests that optimism is widespread, stubborn, and costly.”
  • Keeping score. “The ultimate currency that rewards or punishes is often emotional, a form of mental self-dealing that inevitably creates conflicts of interest when the individual acts as an agent on behalf of an organization.”

Kahneman destroyed the credibility of experts in many fields by demonstrating that, in low-validity environments, “the accuracy of experts was matched or exceeded by a simple algorithm.” He added, “An algorithm that is constructed on the back of an envelope is often good enough to compete with an optimally weighted formula, and certainly good enough to outdo expert judgment.”
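To make that concrete, here is a minimal sketch of the kind of back-of-the-envelope formula Kahneman describes: standardize a handful of relevant cues and add them up with equal weights, no expert tuning required. The loan-screening scenario, cues, and numbers below are hypothetical illustrations, not examples from the book.

```python
# A minimal sketch of a "back-of-the-envelope" scoring rule: standardize a few
# relevant cues and add them up with equal (unit) weights. The cues, signs, and
# applicants are hypothetical illustrations, not data from the book.

def standardize(values):
    """Rescale a list of numbers to mean 0 and standard deviation 1."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5 or 1.0
    return [(v - mean) / sd for v in values]

def equal_weight_score(candidates, cues):
    """Sum each candidate's standardized cues, signed +1 (helpful) or -1 (harmful)."""
    columns = {cue: standardize([c[cue] for c in candidates]) for cue in cues}
    return [
        sum(sign * columns[cue][i] for cue, sign in cues.items())
        for i in range(len(candidates))
    ]

# Hypothetical example: rank three loan applicants on three simple cues.
applicants = [
    {"income": 52000, "years_employed": 6, "prior_defaults": 0},
    {"income": 38000, "years_employed": 1, "prior_defaults": 2},
    {"income": 61000, "years_employed": 9, "prior_defaults": 1},
]
cues = {"income": +1, "years_employed": +1, "prior_defaults": -1}
print(equal_weight_score(applicants, cues))  # higher score = stronger applicant
```

The crudeness is the point: in low-validity environments, Kahneman’s claim is that even an untuned, equal-weight combination of a few valid cues is hard for unaided expert judgment to beat.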

He cautions, “Do not trust anyone—including yourself—to tell you how much you should trust their judgment,” and adds, “Intuition cannot be trusted in the absence of stable regularities in the environment.”

In many cases experts are simply overconfident. In other cases, he writes:

Errors in the initial budget are not always innocent. The authors of unrealistic plans are often driven by the desire to get the plan approved—whether by their superiors or by a client—supported by the knowledge that projects are rarely abandoned unfinished merely because of overruns in costs or completion times.

To avoid the planning fallacy, he advises leaders reviewing plans to take the outside view, checking the plan against the actual outcomes of similar past projects. Kahneman adds, “The people who have the greatest influence on the lives of others are likely to be optimistic and overconfident, and to take more risks than they realize.”

On the appeal of lottery tickets, Kahneman notes that “people overestimate the probabilities of unlikely events” and, beyond that, “people overweight unlikely events in their decisions.”

Loss aversion is one of human nature’s departures from rationality: “We are driven more strongly to avoid losses than to achieve gains.” It is loss aversion that “favors minimal changes from the status quo in the lives of both institutions and individuals.” Combined with a narrow framing of a situation, loss aversion is a “costly curse.”
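To put a rough number on that asymmetry, here is a minimal sketch using the prospect-theory value function with the median parameter estimates Tversky and Kahneman reported in 1992 (curvature of about 0.88 and a loss-aversion coefficient of about 2.25). The coin-flip gamble is a hypothetical illustration, not an example from the review.

```python
# A minimal sketch of the prospect-theory value function. Parameters are the
# median estimates from Tversky and Kahneman (1992): alpha = beta = 0.88 for
# diminishing sensitivity, lambda = 2.25 for loss aversion.

ALPHA = 0.88      # curvature for gains
BETA = 0.88       # curvature for losses
LAMBDA = 2.25     # losses loom roughly twice as large as equivalent gains

def subjective_value(outcome):
    """Subjective value of a gain or loss measured from the reference point (zero)."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * ((-outcome) ** BETA)

# A 50/50 flip to win or lose $100 has an expected monetary value of zero,
# but it feels like a bad bet because the possible loss looms larger.
gain, loss = subjective_value(100), subjective_value(-100)
print(round(gain, 1), round(loss, 1), round(0.5 * gain + 0.5 * loss, 1))
# Approximately 57.5, -129.5, and -36.0
```

Weighting losses roughly twice as heavily as gains is what makes a fair coin flip feel like a losing proposition, which is the asymmetry behind the status quo bias Kahneman describes.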

In addition to bias, inconsistency plagues human decision making: “We have neither the inclination nor the mental resources to enforce consistency on our preferences, and our preferences are not magically set to be coherent.”

It’s how we remember things that matters most. He writes:

“The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions.” While the experiencing self does the living, it is the remembering self that keeps score and makes the choices.

Kahneman concludes, “There is much to be done to improve decision making.” At one point in his career he decided that decision making should be taught in school, and he launched a curriculum project but never completed it.

For educators, Kahneman’s work should expand our conception of ‘critical thinking skills.’ We shouldn’t teach economics or statistics without discussing how human beings interpret and use data to manage their affairs.

 


Tom Vander Ark

Tom Vander Ark is the CEO of Getting Smart. He has written or co-authored more than 50 books and papers including Getting Smart, Smart Cities, Smart Parents, Better Together, The Power of Place and Difference Making. He served as a public school superintendent and the first Executive Director of Education for the Bill & Melinda Gates Foundation.
