Summary and Analysis of Thinking, Fast and Slow: Based on the Book by Daniel Kahneman

by Worth Books

Overview

So much to read, so little time? This brief overview of Thinking, Fast and Slow tells you what you need to know—before or after you read Daniel Kahneman’s book.

Crafted and edited with care, Worth Books’ summaries set the standard for quality and give you the tools you need to be a well-informed reader.
 
This short summary of Thinking, Fast and Slow by Daniel Kahneman includes:
 
  • Historical context
  • Part-by-part summaries
  • Detailed timeline of key events
  • Important quotes
  • Fascinating trivia
  • Glossary of terms
  • Supporting material to enhance your understanding of the source work
 
About Thinking, Fast and Slow by Daniel Kahneman:
 
Nobel Prize–winning psychologist Daniel Kahneman explores the mysteries of intuition, judgment, bias, and logic in the international bestseller Thinking, Fast and Slow. His award-winning book explains the different ways people think, whether they’re deciding how to invest their money or how to make friends.
 
Kahneman’s experiments in behavioral economics, in collaboration with cognitive psychologist Amos Tversky, led to a theory of two systems of thought: the fast thinking used when ducking a blow, and slow thinking that’s better employed for making major life decisions.
 
Applying these psychological concepts to different facets of our lives, Kahneman demonstrates how to better understand your own decision-making and the choices made by others.
 
The summary and analysis in this ebook are intended to complement your reading experience and bring you closer to a great work of nonfiction.

Product Details

ISBN-13: 9781504043717
Publisher: Worth Books
Publication date: 01/24/2017
Series: Smart Summaries
Sold by: Barnes & Noble
Format: eBook
Pages: 30
File size: 2 MB

About the Author

So much to read, so little time? Each volume in the Worth Books catalog presents a summary and analysis to help you stay informed in a busy world, whether you’re managing your to-read list for work or school, brushing up on business strategies on your commute, preparing to wow at the next book club, or continuing to satisfy your thirst for knowledge. Get ready to be edified, enlightened, and entertained—all in about 30 minutes or less!
Worth Books’ smart summaries get straight to the point. Available for fiction and nonfiction titles, these are the book summaries that are worth your time.
 

Read an Excerpt

Summary and Analysis of Thinking, Fast and Slow


Based on the Book by Daniel Kahneman

OPEN ROAD INTEGRATED MEDIA

Copyright © 2017 Open Road Integrated Media, Inc.
All rights reserved.
ISBN: 978-1-5040-4371-7



CHAPTER 1

Summary

Part 1: Two Systems

System 1 of your brain operates speedily and automatically, while System 2 is slow and requires a great deal of mental energy to process complex mental activities. System 1 and System 2 work together.

Here are some examples of automatic activities attributed to System 1:

• Detect if one object is more distant than another

• Complete simple math equations, like 1 + 1 = ?

• Smile when shown an image of puppies

• Complete the phrase, "salt and ..."

• Orient to the source of a sound

• Swat a mosquito

• Drive on an empty road

• Recognize stereotypes


All of these mental calculations occur automatically and require little or no effort. System 1 also includes innate skills we share with other animals, such as being prepared to perceive the world, avoiding losses, recognizing objects, and fearing spiders. Many others we acquire through practice, as learned associations and abilities: knowing the capital of New York, reading, and interpreting social situations. Additionally, many of the mental actions attributed to System 1 are involuntary. Your mind cannot help but solve 1 + 1.

Here are some examples of the mental events attributed to System 2:

• Focus on the voice of one person in a noisy room

• Look for a woman with red hair in a crowd

• Focus your attention on only the elephants in a circus

• Maintain a faster walking speed than is comfortable for you

• Fill out a tax form

• Search your memory for a surprising sound

• Brace for a punch

• Monitor how you act in a social situation


Unlike the mental events attributed to System 1, System 2 events require you to pay attention. Paying attention requires you to spend mental energy. Usually, you can only process one System 2 mental event at a time. For example, it is impossible for most people to make a left turn in heavy traffic while calculating 19 × 168. Focusing intently on something effectively makes you blind, even to stimuli that normally attract attention.


System 1:

System 1 responses are immediate, which means they can easily be erroneous and are often based on inadequate information; they are essential to human survival, however, because they make instant judgments in potentially dangerous situations. For example, System 1 tells you to move from the path of an oncoming cyclist, swat a mosquito, or duck to avoid a projectile.

System 1's main job is to assess normality, the evolutionary function that helps us survive — life is more manageable when there are patterns to follow. It seeks out causes and intentions, and can attribute effects to them even when there is no actual causality. The "halo effect," for instance, occurs when the brain uses a small amount of information to form broad, sweeping conclusions about someone or something without considering what information might be missing.

This system pulls from the many ideas we unconsciously group together at any given time to instantly make sense of situations and stimuli around us. Priming effects are an expression of this. When your brain has been primed by exposure to an idea, theme, or even by your own physiological needs, you more readily pull related associations from your existing networks. Primes guide our behaviors, making us more or less likely to act in certain ways. For example, research shows that voters from both ends of the political spectrum are more likely to vote to increase school budgets when their polling place is inside a school. This is a priming effect.


System 2:

System 2 involves exerting a high level of effort, which causes the brain to "not see" other stimuli, even when those stimuli would normally be remarkable. In fact, tasks handled by this second system result in physical changes to our bodies: pupil dilation and increased heart rate — the two signs Kahneman used to research how and when people switched back and forth between the mental processes.

This system operates at a slower speed. Pushing it to work much faster takes considerable effort and depletes the brain's resources quickly. The mind is cognitively busy when it is intensely focused on tasks that demand System 2's energies, such as calculating numbers. But being cognitively busy also renders one more likely to make superficial judgments and impulsive decisions. This is because System 2, the home of good judgment, is occupied, so System 1 is forced to take over.

Real cognitive aptitude is the heart of System 2, and it's what gives us the ability to think and consider our options before acting, to employ self-control, and to come to rational decisions.


Clashes Between System 1 and System 2:

System 1 monitors what is going on around us by constantly engaging in basic assessments; it believes and confirms what it sees, while System 2 doubts and challenges. System 1 is unable to focus only on the task assigned to it by System 2; inevitably, it performs other basic assessments as well.

When an intuitive answer to a question isn’t readily available, System 1 generates an easier question, substitutes it, and answers it. Kahneman defines this as a heuristic: a simple process for finding answers to hard questions, even though those answers may not be perfect. Issues arise when the heuristics generated are inadequate substitutes. The mood heuristic is the way the mind substitutes an assessment of current mood for the more complicated question of general happiness or overall emotional state. The affect heuristic refers to the way we tend to allow our likes and dislikes to control our beliefs about the world.

Cognitive ease is the feeling you have when things are generally going well and there is no need for System 2 to intervene. Cognitive strain is the feeling that there are unmet needs that require work from System 2. Mere familiarity can be enough to trick the brain into thinking something is true. Things that are easier to read and understand feel truer to us. Things that are tougher to read and understand induce cognitive strain and therefore engage System 2.

Mood affects intuition, because good mood and cognitive ease go together. Bad mood and cognitive strain are associated because tougher times demand System 2. When you are feeling happy you may be more creative, but you may also be more prone to logical errors.

Need to Know: System 1 works automatically and looks for patterns around you; it makes snap judgments, which may be erroneous if the problem at hand is complex. System 2 doubts and challenges, but is lazy. Sometimes the latter should take over, but instead it relies on what the former tells it.


Part 2: Heuristics and Biases

Humans are not intuitively good at statistics; in fact, System 1 finds relationships and causality where none exist, leading to a bias of confidence over doubt. This is a problem even for trained experts who understand statistics, because System 1 works automatically.


The Anchoring Effect:

The anchoring effect occurs when people consider some value for an unknown number before estimating that number. Even when the considered value is completely random and unrelated, subsequent estimates tend to be "anchored" to it. For example, if you are hoping to buy a car that is listed at $40,000 and then get 10 percent off, you feel you've gotten a deal in paying $36,000. However, that may be because you've "anchored" to that initial $40,000. In fact, $40,000 may be totally irrelevant to the car's actual value.
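
To see the arithmetic of the example in one place, here is a tiny sketch using the prices from the paragraph above; the point is that the "deal" is computed against the anchor, not against the car's actual value.

anchor_price = 40_000          # the listed price acts as the anchor
discount = 0.10
paid = anchor_price * (1 - discount)
print(paid)  # 36000.0: a "bargain" only relative to the $40,000 anchor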


Availability:

The availability heuristic means that our estimate of how prevalent something is or how frequently it happens is heavily influenced by how easily we can call up examples of the phenomenon and how recently we were exposed to it. For instance, if in your recent memory several famous musicians died young, you may be inclined to think that the majority of musicians die young.

Availability causes a bias in how we assess risk. Trends in reporting cause people to "see" more of some kinds of risk than others, and therefore to attach more importance to them, when in reality one type of risk may be far less significant than another. Because its job is to sense danger, it is System 1 that responds to these trends, and thanks to the availability bias, the resulting actions can be inappropriate or ineffective responses to the actual risk.


Representativeness:

Humans tend to predict probability based on representativeness rather than base rates. This happens when System 1 substitutes simple-to-answer heuristic questions in place of actual base-rate calculations of probability — we guess how likely things are to happen based on intuitive judgments. Instead, we should anchor our estimates to reliable diagnostic evidence and base rates.

Statistical base rates characterize the populations involved in a case, but not the facts of the specific case itself. As we try to form judgments and predictions, we tend to undervalue statistical base rates and overvalue causal base rates, even though the two are equally relevant. The stereotypes of System 1 shape this process. Bayesian analysis is a statistical tool that can mitigate these effects because it works with base rates and removes intuition from the process: the theorem is an algorithm that lets the user plug in the known data and base rates and arrive at a logical prediction.
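
To make that concrete, here is a minimal sketch of Bayes's theorem applied to a base-rate problem. The scenario and numbers (a witness identifying cab colors, in the spirit of Kahneman and Tversky's well-known examples) are illustrative assumptions, not figures from this summary.

def bayes_posterior(prior, likelihood, false_positive_rate):
    # Bayes's theorem: P(H | E) = P(E | H) * P(H) / P(E)
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: 15% of cabs are Blue (the base rate), and a
# witness correctly identifies cab color 80% of the time.
posterior = bayes_posterior(prior=0.15, likelihood=0.80, false_positive_rate=0.20)
print(round(posterior, 2))  # 0.41: far lower than the witness's 80% accuracy suggests

The base rate drags the answer down because Blue cabs are rare to begin with; intuition, which ignores the base rate, lands near 80%.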


The Linda Problem:

Another common System 1 error is mistaking plausibility for probability. Kahneman illustrates this with a well-known experiment he and Tversky conducted, called the Linda problem: Linda is a social activist. Is she more likely to be a bank teller or a feminist bank teller? Statistically she must be more likely to be a bank teller, because all feminist bank tellers are bank tellers; by choosing "feminist bank teller" we limit our own odds of being right. Many people get this wrong, however, because the scenario fools System 1, whose job is simply to assess plausibility, and feminist bank teller "feels" more plausible.
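
The conjunction rule behind the problem can be checked mechanically: in any population, the feminist bank tellers are a subset of the bank tellers, so the conjunction can never be the more probable description. A minimal sketch with made-up counts:

population = 1000
bank_tellers = 50            # everyone who is a bank teller
feminist_bank_tellers = 10   # necessarily a subset of the bank tellers

p_teller = bank_tellers / population
p_feminist_teller = feminist_bank_tellers / population
assert p_feminist_teller <= p_teller   # the conjunction rule always holds
print(p_teller, p_feminist_teller)     # 0.05 0.01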


Regression to the Mean:

Random fluctuations in the quality of human performance mean that particularly bad or good outcomes are more likely to be followed by less spectacular outcomes in the other direction. This is called regression to the mean. For example, if a fighter pilot in a training program has a particularly bad flight, he or she is highly likely to have a better flight the next time, no matter whether praise or punishment follows that bad flight.

Most psychologists have found that, over time, praise is more effective in training, but many military trainers believe that criticism and punishment work better. Why? Because they see a very bad flight, followed immediately by punishment, followed by a better flight, and they assume cause and effect. Statistically speaking, however, the better flight was highly likely to occur anyway; the improvement was simply regression to the mean.
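
A quick simulation makes the trainers' illusion visible. Each flight score below is stable skill plus random noise, with no feedback effect at all; flights that follow an unusually bad one still average out near the pilot's true level. The model and numbers are illustrative assumptions.

import random

random.seed(42)
skill = 70  # a pilot's stable underlying ability
flights = [skill + random.gauss(0, 10) for _ in range(10_000)]

# Flights immediately following an unusually bad one (score < 55):
followups = [flights[i + 1] for i in range(len(flights) - 1) if flights[i] < 55]

print(round(sum(flights) / len(flights), 1))      # ~70.0: overall mean
print(round(sum(followups) / len(followups), 1))  # also ~70: the "improvement" is regression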

Need to Know: A trait of System 1 is a willingness to base extreme and rare predictions on weak evidence. System 1 will often substitute heuristic, or easier, questions and answers into contexts that are too complex, thereby avoiding analysis and data-driven decisions.


Part 3: Overconfidence

Our expectations for the future are strongly shaped by the stories we tell about the past; unfortunately, the brain retells incomplete versions that are based on hindsight. This narrative fallacy leads to outcome bias: the mistaken idea that a poor decision that happened to turn out well was actually a good decision. These are called cognitive illusions. For example, investors and pundits believe that they have the skill and talent to make successful predictions, even though there is no concrete data to support that contention. In fact, the numbers indicate that most success in these areas is based on luck.

Algorithms are significantly more accurate in predicting outcomes than intuition is, even the intuitions of experts. The same experts, given the exact same case materials in different settings, contradict their own evaluations about 20% of the time. This is because System 1 is dependent on context.

For true expertise to develop without the illusion of validity, there must be two conditions present: a predictable environment and the chance to learn about the factors that make the environment predictable. In other words, enough practice and exposure can allow a level of expertise that makes even complicated assessments automatic. For example, a chess master can glance at a board and instantly "know" the right moves and the outcome of the game, while for most people, that series of calculations would demand the employment of System 2.

The planning fallacy describes predictions that are unrealistic best-case scenarios and that could be made more accurate by reference-class forecasting, a way of anchoring predictions to statistics from similar cases. The sunk-cost fallacy is a related issue in which a plan that should be abandoned is pursued because no one wants to admit defeat. While the stated reason for pursuing the plan may be unwillingness to lose the money already invested, careful analysis of the plan and its probable outcome can reveal a high likelihood of losing even more, which shows the reasoning to be faulty.

Entrepreneurial delusions are a kind of optimism in which business owners do not realistically assess their chances of success. They also have an illusion of being in control of the entire situation and therefore cognitively neglect the impact of competitors.

Need to Know: Look for ways to adopt algorithms in decision making to improve your outcomes. This may mean learning to use Bayes's theorem when making predictions, for example, or learning to anchor your assessments to realistic base rates and other data sources, rather than unrelated anchors or intuitions.


Part 4: Choices

The choices dealt with in this section have to do with wealth, risk, and loss. Kahneman is concerned with how System 1 and System 2 shape our perception of risk and loss, and how we accumulate (and lose) wealth.


Utility Theory:

According to Bernoulli's utility theory, risk-averse people choose a sure thing that is lesser in value, essentially paying to avoid uncertainty. Bernoulli's model states that it is the utility of wealth — not the money itself, but what can be done with money — that makes people happy. However, Kahneman argues that this ignores reference points; a person who is faced only with bad outcomes is more likely to take a risk, even when the practical numerical outcome is the same as the one faced by the risk-averse person.

In other words, if utility theory is true, and you and I each have one million dollars, we should both be equally happy because we have the same ability to accomplish things with the same amount of wealth. However, this ignores our individual reference points. If you had ten million dollars yesterday and I had one thousand, chances are you are far less happy than I am now.
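
A sketch of the contrast, assuming Bernoulli's classic logarithmic utility of wealth. Under pure utility theory the two people below are equally happy; adding a reference point (yesterday's wealth) reverses the picture. The reference-dependent line is a deliberately crude illustration, not a formula from the book.

import math

def bernoulli_utility(wealth):
    return math.log(wealth)  # utility grows with the logarithm of wealth

def change_from_reference(wealth, reference):
    return wealth - reference  # crude stand-in for a reference point

you, me = 1_000_000, 1_000_000
print(bernoulli_utility(you) == bernoulli_utility(me))   # True: same wealth, same utility

print(change_from_reference(you, reference=10_000_000))  # -9000000: felt as a huge loss
print(change_from_reference(me, reference=1_000))        # 999000: felt as a huge gain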


Prospect Theory:

In contrast to utility theory, prospect theory is shaped by three cognitive features: evaluation is relative to a neutral reference point, there is diminishing sensitivity to changes in wealth, and we all share an aversion to loss. It also considers the endowment effect — the feeling that something is worth more once you already have it, which is itself based on a reference point.

For Kahneman, the most important takeaways from prospect theory are that reference points exist, and losses always loom larger than corresponding gains in our perception. Therefore, because existing terms provide reference points for negotiations, people tend to fight harder to prevent losses than they do to achieve gains, and they have a sense of unfairness or entitlement which pushes them to feel that their losses are unjust.
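
A minimal sketch of a prospect-theory-style value function. The functional form and parameters (a power curve with loss-aversion coefficient lam of about 2.25, standard in Tversky and Kahneman's later work) are assumptions for illustration, not values given in this summary.

def prospect_value(x, alpha=0.88, lam=2.25):
    # Value of a gain or loss x relative to the reference point (x = 0).
    # alpha < 1 gives diminishing sensitivity; lam > 1 gives loss aversion.
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

print(round(prospect_value(100), 1))   # 57.5: a $100 gain
print(round(prospect_value(-100), 1))  # -129.5: the same-sized loss looms larger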


The Fourfold Pattern:

Kahneman and Tversky came up with a way to explain a system of preferences called the fourfold pattern. Two main ideas are at its heart: 1) people attach values not to wealth but to losses and gains, and 2) people attach decision weights to outcomes that differ from the actual probabilities. Together these give rise to four scenarios (a numerical sketch follows the list):

• When facing a gain with a low probability, the possibility effect — mistaking something that is merely possible for something that is probable — induces risk-seeking behavior: We buy a lottery ticket even though it is a waste of money.

• When facing a loss with a low probability, the possibility effect induces risk-averse behavior: We pay for insurance so we don't lose everything in a fire.

• When facing a gain with a high probability, the certainty effect induces risk-averse behavior: We take a "sure thing," $80 today, instead of a highly "likely" $100 tomorrow.

• When facing a loss with a high probability, the certainty effect induces risk-seeking behavior: We go ahead and gamble instead of cutting our losses.

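As a rough numerical sketch of two cells of the pattern, compare expected values with the choices people actually make. The probabilities and prices here are illustrative assumptions.

def expected_value(probability, payoff):
    return probability * payoff

# Low-probability gain: a $2 ticket for a one-in-a-million $1,000,000 prize.
print(expected_value(1e-6, 1_000_000))  # 1.0: worth $1 on average, yet we pay $2

# High-probability gain: a sure $80 versus a 90% chance of $100 tomorrow.
print(expected_value(0.90, 100))        # 90.0: yet many still take the sure $80
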

Rare Events:

In the decision-making process, people overweight and overestimate the probability of unlikely events. Vivid events, obsessive concerns, explicit reminders, and concrete representations are given more importance and are seen as more probable. For example, most people in the United States today fear a terrorist act more than a car accident or a heart attack, even though statistically the latter two pose far greater risk. Terror attacks, however, are vivid events with explicit reminders in the media every day; therefore, we give them more weight as we decide what is probable.


(Continues...)

Excerpted from Summary and Analysis of Thinking, Fast and Slow: Based on the Book by Daniel Kahneman. Copyright © 2017 Open Road Integrated Media, Inc. Excerpted by permission of OPEN ROAD INTEGRATED MEDIA.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Context
Overview
Summary
Timeline
Direct Quotes and Analysis
Trivia
What's That Word?
Critical Response
About Daniel Kahneman
For Your Information
Bibliography
Copyright
