28 results
Success and Luck: Good Fortune and the Myth of Meritocracy by Robert H. Frank
2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, Amazon Mechanical Turk, American Society of Civil Engineers: Report Card, attribution theory, availability heuristic, Branko Milanovic, Capital in the Twenty-First Century by Thomas Piketty, carried interest, Daniel Kahneman / Amos Tversky, David Brooks, deliberate practice, en.wikipedia.org, endowment effect, experimental subject, framing effect, full employment, hindsight bias, If something cannot go on forever, it will stop - Herbert Stein's Law, income inequality, invisible hand, labor-force participation, labour mobility, lake wobegon effect, loss aversion, minimum wage unemployment, Network effects, Paul Samuelson, Report Card for America’s Infrastructure, Richard Thaler, Rod Stewart played at Stephen Schwarzman birthday party, Ronald Reagan, Rory Sutherland, selection bias, side project, sovereign wealth fund, Steve Jobs, The Wealth of Nations by Adam Smith, Tim Cook: Apple, ultimatum game, Vincenzo Peruggia: Mona Lisa, winner-take-all economy
One of the rules of thumb people often use when making judgments is the so-called availability heuristic. Suppose you’re asked, “Which are more frequent: English words that start with the letter ‘R,’ or those that have ‘R’ as their third letter?” Using the availability heuristic, most people react by trying to think of examples in each category. That approach usually works well, since examples of things that occur more frequently are in fact generally easier to summon from memory. And since most people find it easier to think of examples of words starting with “R,” the availability heuristic leads them to answer that such words occur more frequently. Yet English words with “R” in the third slot are actually far more numerous. The availability heuristic fails here because frequency isn’t the only thing that governs ease of recall.
We store words in our memories in multiple ways—by their meanings, by the sounds they make and the images they evoke, by their first letters, and by numerous other features. But virtually no one stores words in memory by the identity of their third letter. The availability heuristic suggests that when we construct narratives about how the world works, we rely more heavily on information that happens to be more accessible from memory. But that almost guarantees that our accounts will be biased, since some types of information are far more readily accessible than others. Information about things we’ve experienced repeatedly, for example, is far more salient than information about things we’ve only heard or read about infrequently. Information in the latter category has a much harder time breaking through.
Most of them, after all, are vividly aware of how hard they’ve worked and how talented they are. They’ve been working hard and solving difficult problems every day for many years! They probably also know, in some abstract sense, that they might not have done as well in some other environments. Yet their day-to-day experiences provide only infrequent reminders to reflect on how fortunate they were not to have been born in, say, a war-torn country like Zimbabwe. The availability heuristic biases our personal narratives in a second way, because events that work to our disadvantage are systematically easier to recall than those that affect us positively. My Cornell colleague Tom Gilovich invokes a metaphor involving headwinds and tailwinds to describe this asymmetry. If any of you go running or ride a bike, you’ll know that when you’re running or bicycling into the wind, you’re very aware of it.
Infotopia: How Many Minds Produce Knowledge by Cass R. Sunstein
affirmative action, Andrei Shleifer, availability heuristic, Build a better mousetrap, c2.com, Cass Sunstein, cognitive bias, cuban missile crisis, Daniel Kahneman / Amos Tversky, Edward Glaeser, en.wikipedia.org, feminist movement, framing effect, hindsight bias, information asymmetry, Isaac Newton, Jean Tirole, jimmy wales, market bubble, market design, minimum wage unemployment, prediction markets, profit motive, rent control, Richard Stallman, Richard Thaler, Robert Shiller, Ronald Reagan, slashdot, stem cell, The Wisdom of Crowds, winner-take-all economy
We use heuristics, or rules of thumb, that lead us to make predictable errors. We are also subject to identifiable biases, which can produce big mistakes.1 A growing literature explores the role of these heuristics and biases and their relationship to law and policy. For example, people err because they use the availability heuristic to answer difficult questions about probability. How likely is a terrorist attack, a hurricane, a traffic jam, an accident from a nuclear power plant, a case of venereal disease? When people use the availability heuristic, they answer a question of probability by asking whether examples come readily to mind.2 The point very much bears on private and public responses to risks—suggesting, for example, that people will be especially responsive to the dangers of AIDS, crime, earthquakes, and nuclear power plant accidents if examples are easy to recall.
A terrorist attack on television will be highly salient to viewers and will have a greater impact than a report about the attack in the newspaper.3 Similarly, earlier events will have a smaller impact than more recent ones. The point helps explain much behavior. For example, whether people will buy insurance for natural disasters is greatly affected by recent experiences.4 In the aftermath of an earthquake, people become far readier to buy insurance for earthquakes, but their readiness to do so declines steadily from that point, as vivid memories recede. Use of the availability heuristic is not irrational, but it can easily lead to serious errors of fact. After the 2005 disaster produced by Hurricane Katrina in the United States, it was predictable that significant steps would be taken to prepare for hurricanes—and also predictable that before that disaster, such steps would be quite inadequate. Most people are also strikingly vulnerable to framing effects, making different decisions depending on the wording of the problem.
If so, the many minds on the jury are likely to amplify rather than to correct those biases.9 Deliberating groups have also been found to amplify, rather than to attenuate, reliance on the representativeness heuristic.10 Such groups fall prey to even larger framing effects than individuals, so that when the same situation is described in different terms, groups are especially likely to be affected by the redescriptions.11 Groups show more overconfidence than group members;12 they are even more affected by the biasing effect of bad arguments from lawyers.13 In an especially revealing finding, groups have been found to make more, rather than fewer, conjunction errors (believing that A and B are more likely to be true than A alone) than individuals when individual error rates are high—though fewer when individual error rates are low.14 Groups do demonstrate a decreased level of reliance on the availability heuristic, but the decrease is slight, even when use of that heuristic leads to clear errors.15 Here’s a disturbing finding, one with great relevance to group behavior in both politics and business: Groups are more likely than individuals to escalate their commitment to a course of action that is failing—and all the more so if members identify strongly with the groups of which they are a part.16 There is a clue here about why companies, states, and even nations often continue with projects and plans that are clearly going awry.
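The conjunction-error pattern above (groups err more than individuals when individual error rates are high, less when they are low) is what simple aggregation arithmetic predicts. The sketch below is a minimal illustration, not the studies' actual deliberation protocol: it models the group as an independent majority vote, which is a strong simplifying assumption.

```python
from math import comb

def majority_error_rate(p: float, n: int) -> float:
    """Probability that a majority of n independent members errs,
    given each errs with probability p (n odd)."""
    majority = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(majority, n + 1))

# When individuals err more often than not, aggregation amplifies error;
# when individuals are usually right, aggregation suppresses it.
high = majority_error_rate(0.6, 11)  # exceeds the individual rate of 0.6
low = majority_error_rate(0.3, 11)   # falls below the individual rate of 0.3
```

Real deliberating groups share information and influence one another, so the independence assumption understates effects like group polarization; the point here is only that the high-error/low-error asymmetry falls out of even the simplest aggregation model.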
Why Nudge?: The Politics of Libertarian Paternalism by Cass R. Sunstein
Affordable Care Act / Obamacare, Andrei Shleifer, availability heuristic, Cass Sunstein, choice architecture, clean water, Daniel Kahneman / Amos Tversky, Edward Glaeser, endowment effect, energy security, framing effect, invisible hand, late fees, libertarian paternalism, loss aversion, nudge unit, randomized controlled trial, Richard Thaler
This possibility suggests that we need a kind of behavioral science for judgments of morality, and not merely judgments of fact.11 Moral heuristics are pervasive, and they can go wrong, no less than heuristics of other kinds. Consider an analogy: The availability heuristic helps people to come up with estimates of probability, and it generally works well. When we learn of an incident in which certain actions produced serious harm, we update our probability judgments. The updating is perfectly sensible. Use of the availability heuristic can be seen as a kind of rough-and-ready statistical analysis—and perhaps it is even better than that. The problem is that use of the availability heuristic can also go badly wrong, leading to wildly exaggerated fears (or to unhealthy complacency). The same problems arise for many moral heuristics, which generally work well but can lead us in bad directions.
Graphic warnings, grabbing the attention of System 1, are a possibility here. PROBLEMS WITH PROBABILITY System 1 does not handle probability well. One problem is the availability heuristic. When people use that heuristic, they make judgments about probability by asking whether a recent event comes readily to mind.63 If an event is cognitively “available,” people might well overestimate the risk. If an event is not cognitively available, they might underestimate the risk.64 In deciding whether it is dangerous to walk in a city at night, to text while driving, or to smoke, people often ask about incidents of which they are aware. While System 2 might be willing to do some calculations, System 1 works quickly, and it is pretty simple to use the availability heuristic. Instead of asking hard questions about statistics, System 1 asks easy questions about what comes to mind.
If government ensures that people have accurate information about cost, it is not revisiting their ends in any way. It is not even acting paternalistically, in the sense that it is informing people’s choices rather than (independently) influencing them. The same can be said if people underestimate the risks of distracted driving or of smoking. If the government corrects people’s unrealistic optimism, or counteracts their use of the availability heuristic in order to produce an accurate judgment about probability, it is respecting their ends. We might not want to characterize this action as paternalistic at all. So too if, for example, people are ignoring certain product attributes because those attributes are shrouded. If those attributes would matter to people who attended to them, then efforts to promote disclosure do not question people’s ends.
The Paradox of Choice: Why More Is Less by Barry Schwartz
accounting loophole / creative accounting, attribution theory, Atul Gawande, availability heuristic, Cass Sunstein, Daniel Kahneman / Amos Tversky, endowment effect, framing effect, income per capita, job satisfaction, loss aversion, medical residency, mental accounting, Own Your Own Home, Pareto efficiency, positional goods, price anchoring, psychological pricing, RAND corporation, Richard Thaler, science of happiness, The Wealth of Nations by Adam Smith
Unfortunately, most people give substantial weight to this kind of anecdotal “evidence,” perhaps so much so that it will cancel out the positive recommendation found in Consumer Reports. Most of us give weight to these kinds of stories because they are extremely vivid and based on a personal, detailed, face-to-face account. Kahneman and Tversky discovered and reported on people’s tendency to give undue weight to some types of information in contrast to others. They called it the availability heuristic. This needs a little explaining. A heuristic is a rule of thumb, a mental shortcut. The availability heuristic works like this: suppose someone asked you a silly question like “What’s more common in English, words that begin with the letter t or words that have t as the third letter?” How would you answer this question? What you probably would do is try to call to mind words that start with t and words that have t as the third letter. You would then discover that you had a much easier time generating words that start with t.
You would then reason roughly as follows: “In general, the more often we encounter something, the easier it is for us to recall it in the future. Because I had an easier time recalling words that start with t than recalling words with t as the third letter, I must have encountered them more often in the past. So there must be more words in English that start with t than have it as the third letter.” But your conclusion would be wrong. The availability heuristic says that we assume that the more available some piece of information is to memory, the more frequently we must have encountered it in the past. This heuristic is partly true. In general, the frequency of experience does affect its availability to memory. But frequency of experience is not the only thing that affects availability to memory. Salience or vividness matters as well. Because starting letters of words are much more salient than third letters, they are much more useful as cues for retrieving words from memory.
So it’s the salience of starting letters that makes t-words come easily to mind, while people mistakenly think it’s the frequency of starting letters that makes them come easily to mind. In addition to affecting the ease with which we retrieve information from memory, salience or vividness will influence the weight we give any particular piece of information. There are many examples of the availability heuristic in operation. When college students who are deciding what courses to take next semester are presented with summaries of course evaluations from several hundred students that point in one direction, and a videotaped interview with a single student that points in the other direction, they are more influenced by the vivid interview than by the summary judgments of hundreds. Vivid interviews with people have profound effects on judgment even when people are told, in advance of seeing the interviews, that the subjects of the interview are atypical.
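The first-letter versus third-letter comparison is easy to check mechanically against any word list. A minimal sketch (the tiny corpus here is invented for illustration; a real check would scan a full dictionary file):

```python
def letter_position_counts(words, letter, position):
    """Count words whose letter at the given 1-indexed position is `letter`."""
    letter = letter.lower()
    return sum(1 for w in words
               if len(w) >= position and w[position - 1].lower() == letter)

# Invented mini-corpus: t-initial words come to mind easily,
# but third-position-t words can still be more numerous.
corpus = ["toast", "tiger", "them", "city", "note", "water", "article", "stone"]
first_t = letter_position_counts(corpus, "t", 1)   # toast, tiger, them -> 3
third_t = letter_position_counts(corpus, "t", 3)   # city, note, water, article -> 4
```

Run over a real dictionary, the same tally reproduces Kahneman and Tversky's point: ease of recall tracks how the lexicon is indexed in memory, not the actual counts.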
Thinking, Fast and Slow by Daniel Kahneman
Albert Einstein, Atul Gawande, availability heuristic, Bayesian statistics, Black Swan, Cass Sunstein, Checklist Manifesto, choice architecture, cognitive bias, complexity theory, correlation coefficient, correlation does not imply causation, Daniel Kahneman / Amos Tversky, delayed gratification, demand response, endowment effect, experimental economics, experimental subject, Exxon Valdez, feminist movement, framing effect, hindsight bias, index card, information asymmetry, job satisfaction, John von Neumann, Kenneth Arrow, libertarian paternalism, loss aversion, medical residency, mental accounting, meta-analysis, nudge unit, pattern recognition, Paul Samuelson, pre–internet, price anchoring, quantitative trading / quantitative finance, random walk, Richard Thaler, risk tolerance, Robert Metcalfe, Ronald Reagan, The Chicago School, The Wisdom of Crowds, Thomas Bayes, transaction costs, union organizing, Walter Mischel, Yom Kippur War
We also worked very hard, running dozens of experiments and writing our articles on judgment heuristics. At night I wrote Attention and Effort. It was a busy year. One of our projects was the study of what we called the availability heuristic. We thought of that heuristic when we asked ourselves what people actually do when they wish to estimate the frequency of a category, such as “people who divorce after the age of 60” or “dangerous plants.” The answer was straightforward: instances of the class will be retrieved from memory, and if retrieval is easy and fluent, the category will be judged to be large. We defined the availability heuristic as the process of judging frequency by “the ease with which instances come to mind.” The statement seemed clear when we formulated it, but the concept of availability has been refined since then.
The reliance on the heuristic caused predictable biases (systematic errors) in their predictions. On another occasion, Amos and I wondered about the rate of divorce among professors in our university. We noticed that the question triggered a search of memory for divorced professors we knew or knew about, and that we judged the size of categories by the ease with which instances came to mind. We called this reliance on the ease of memory search the availability heuristic. In one of our studies, we asked participants to answer a simple question about words in a typical English text: Consider the letter K. Is K more likely to appear as the first letter in a word OR as the third letter? As any Scrabble player knows, it is much easier to come up with words that begin with a particular letter than to find words that have the same letter in the third position.
Mindware: Tools for Smart Thinking by Richard E. Nisbett
affirmative action, Albert Einstein, availability heuristic, big-box store, Cass Sunstein, choice architecture, cognitive dissonance, correlation coefficient, correlation does not imply causation, cosmological constant, Daniel Kahneman / Amos Tversky, dark matter, endowment effect, experimental subject, feminist movement, fixed income, fundamental attribution error, glass ceiling, Henri Poincaré, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, job satisfaction, lake wobegon effect, libertarian paternalism, loss aversion, low skilled workers, Menlo Park, meta-analysis, quantitative easing, Richard Thaler, Ronald Reagan, selection bias, Socratic dialogue, Steve Jobs, Steven Levy, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, William of Occam, Zipcar
It’s easier to come up with words beginning with r than words having an r in the third position—because we “file” words in our minds by their initial letters and so they’re more available as we rummage through memory. But in fact there are more words with r in the third position. One problem with using the availability heuristic for judgments of frequency or plausibility is that availability is tangled up with salience. Deaths by earthquake are easier to recall than deaths by asthma, so people overestimate the frequency of earthquake deaths in their country (by a lot) and underestimate the frequency of asthma deaths (hugely). Heuristics, including the representativeness heuristic and the availability heuristic, operate quite automatically and often unconsciously. This means it’s going to be hard to know just how influential they can be. But knowing about them allows us to reflect on the possibility that we’ve been led astray by them in a particular instance.
Honesty in the future is best predicted by honesty in the past, not by whether a person looks you steadily in the eye or claims a recent religious conversion. Competence as an editor is best predicted by prior performance as an editor, or at least by competence as a writer, and not by how verbally clever a person seems or how large the person’s vocabulary is. Another important heuristic Tversky and Kahneman identified is the availability heuristic. This is a rule of thumb we use to judge the frequency or plausibility of a given type of event. The more easily examples of the event come to mind, the more frequent or plausible they seem. It’s a perfectly helpful rule most of the time. It’s easier to come up with the names of great Russian novelists than great Swedish novelists, and there are indeed more of the former than the latter.
The next two chapters do that with anecdotes and realistic problems that can crop up in everyday life. The chapters are intended to help you build statistical heuristics—rules of thumb that will suggest correct answers for an indefinitely large number of everyday life events. These heuristics will shrink the range of events to which you will apply only intuitive heuristics, such as the representativeness and availability heuristics. Such heuristics invade the space of events for which only statistical heuristics are appropriate. Two years of thinking about rats or brains or memory for nonsense syllables produces little improvement in ability to apply statistical principles to everyday life events. Students in the hard areas of psychology may learn scarcely more than students in chemistry and law. I found that students in those fields gain literally nothing over two years in the way of ability to apply statistics to the everyday world.
Albert Einstein, Andrew Wiles, asset allocation, availability heuristic, backtesting, Black Swan, capital asset pricing model, cognitive dissonance, compound rate of return, computerized trading, Daniel Kahneman / Amos Tversky, distributed generation, Elliott wave, en.wikipedia.org, feminist movement, hindsight bias, index fund, invention of the telescope, invisible hand, Long Term Capital Management, mental accounting, meta-analysis, p-value, pattern recognition, Paul Samuelson, Ponzi scheme, price anchoring, price stability, quantitative trading / quantitative finance, Ralph Nelson Elliott, random walk, retrograde motion, revision control, risk tolerance, risk-adjusted returns, riskless arbitrage, Robert Shiller, Sharpe ratio, short selling, source of truth, statistical model, systematic trading, the scientific method, transfer pricing, unbiased observer, yield curve, Yogi Berra
That is to say, a faulty application of the generally useful representativeness rule biases us toward the perception of order where it does not exist. Heuristic Bias and the Availability Heuristic To recap, heuristics help us make complex decisions rapidly in spite of the limitations of human intelligence, but they can cause those decisions to be biased. The notion of heuristic bias is easily explained by considering the availability heuristic. We rely on the availability heuristic to estimate the likelihood of future events. It is based on the reasonable notion that the more easily we can bring to mind a particular class of events, the more likely it is that such events will occur in the future. Events that are easily brought to mind are said to be cognitively available. For example, plane crashes are a class of events with high cognitive availability. The availability heuristic makes a certain amount of sense. The ability to recall a class of events is indeed related to how frequently they have occurred in the past, and it is also true that events that have happened frequently in the past are generally more likely to occur in the future.
This is in keeping with one theory of probability that asserts that the future likelihood of an event is related to its historical frequency.143 Taken as a class, thunderstorms have been more frequent in the past than asteroid impacts, and they do indeed have a higher future likelihood. The problem with the availability heuristic is that there are factors that can enhance an event’s cognitive availability that have nothing to do with its historical frequency and are, therefore, irrelevant to estimating its future likelihood. Consequently, our judgments of likelihood are sometimes falsely inflated by the intrusion of these irrelevant factors. Two factors that have no relevance to an event’s likelihood but that do increase its cognitive availability are recency and vividness.
That is to say, how recently144 an event of the type in question took place and how vivid the event was both affect how easily we can bring such events to mind. Consider plane crashes as a class of events. A plane crash that just occurred is both vivid and recent. As a result, in the period right after a well-publicized plane crash, many people tend to overestimate the likelihood of future plane crashes and are inordinately fearful of flying. Note that the bias in this case is one of overestimating a probability; conversely, classes of events with low cognitive availability tend to have their likelihood underestimated. The Representativeness Heuristic: Reasoning by Similarity The representativeness heuristic, which was first identified by Tversky and Kahneman,145 is of particular relevance to subjective TA. We use this rule to make intuitive classification judgments. In other words, it is used to estimate the probability that a particular object, for example the dog before me, belongs to a particular class of objects, for example the class of poodles.
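The recency-and-vividness account above can be expressed as a toy scoring model. Everything here is a hypothetical sketch: the exponential memory decay, the 90-day half-life, and the vividness multipliers are illustrative assumptions, not parameters from the source.

```python
from dataclasses import dataclass
from math import exp, log

@dataclass
class Event:
    days_ago: float
    vividness: float  # 1.0 = ordinary report; higher = vivid, well-publicized

def availability_score(events, half_life_days=90.0):
    """Toy 'ease of recall' score: each remembered event counts for more
    when it is recent (exponential decay) and vivid."""
    return sum(e.vividness * exp(-e.days_ago * log(2) / half_life_days)
               for e in events)

# Two event classes with the SAME true frequency (three instances each):
routine = [Event(days_ago=d, vividness=1.0) for d in (300, 400, 500)]
dramatic = [Event(days_ago=d, vividness=5.0) for d in (10, 40, 80)]
# `dramatic` scores far higher, so anyone judging likelihood by ease of
# recall would overestimate it relative to `routine`.
```

The gap between the two scores, despite identical true frequencies, is exactly the intrusion of the "irrelevant factors" the text describes.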
The Undoing Project: A Friendship That Changed Our Minds by Michael Lewis
Albert Einstein, availability heuristic, Cass Sunstein, choice architecture, complexity theory, Daniel Kahneman / Amos Tversky, Donald Trump, Douglas Hofstadter, endowment effect, feminist movement, framing effect, hindsight bias, John von Neumann, Kenneth Arrow, loss aversion, medical residency, Menlo Park, Murray Gell-Mann, Nate Silver, New Journalism, Paul Samuelson, Richard Thaler, Saturday Night Live, statistical model, the new new thing, Thomas Bayes, Walter Mischel, Yom Kippur War
“Each of the problems had an objectively correct answer,” Amos and Danny wrote, after they were done with their strange mini-experiments. “This is not the case in many real-life situations where probabilities are judged. Each occurrence of an economic recession, a successful medical operation, or a divorce, is essentially unique, and its probability cannot be evaluated by a simple tally of instances. Nevertheless, the availability heuristic may be applied to evaluate the likelihood of such events. “In judging the likelihood that a particular couple will be divorced, for example, one may scan one’s memory for similar couples which this question brings to mind. Divorces will appear probable if divorces are prevalent among the instances that are retrieved in this manner.” The point, once again, wasn’t that people were stupid.
This particular rule they used to judge probabilities (the easier it is for me to retrieve from my memory, the more likely it is) often worked well. But if you presented people with situations in which the evidence they needed to judge them accurately was hard for them to retrieve from their memories, and misleading evidence came easily to mind, they made mistakes. “Consequently,” Amos and Danny wrote, “the use of the availability heuristic leads to systematic biases.” Human judgment was distorted by . . . the memorable. Having identified what they took to be two of the mind’s mechanisms for coping with uncertainty, they naturally asked: Are there others? Apparently they were unsure. Before they left Eugene, they jotted down some notes about other possibilities. “The conditionality heuristic,” they called one of these.
What were they to make of systematic errors for which there was no apparent mechanism? “We really couldn’t think of others,” said Danny. “There seemed to be very few mechanisms.” Just as they never tried to explain how the mind forms the models that underpinned the representativeness heuristic, they left mostly to one side the question of why human memory worked in such a way that the availability heuristic had such power to mislead us. They focused entirely on the various tricks it could play. The more complicated and lifelike the situation a person was asked to judge, they suggested, the more insidious the role of availability. What people did in many complicated real-life problems—when trying to decide if Egypt might invade Israel, say, or their husband might leave them for another woman—was to construct scenarios.
Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler, Cass R. Sunstein
Al Roth, Albert Einstein, asset allocation, availability heuristic, call centre, Cass Sunstein, choice architecture, continuous integration, Daniel Kahneman / Amos Tversky, desegregation, diversification, diversified portfolio, endowment effect, equity premium, feminist movement, fixed income, framing effect, full employment, George Akerlof, index fund, invisible hand, late fees, libertarian paternalism, loss aversion, Mahatma Gandhi, Mason jar, medical malpractice, medical residency, mental accounting, meta-analysis, Milgram experiment, money market fund, pension reform, presumed consent, profit maximization, rent-seeking, Richard Thaler, Right to Buy, risk tolerance, Robert Shiller, Saturday Night Live, school choice, school vouchers, transaction costs, Vanguard fund, Zipcar
Availability How much should you worry about hurricanes, nuclear power, terrorism, mad cow disease, alligator attacks, or avian flu? And how much care should you take in avoiding risks associated with each? What, exactly, should you do to prevent the kinds of dangers that you face in ordinary life? In answering questions of this kind, most people use what is called the availability heuristic. They assess the likelihood of risks by asking how readily examples come to mind. If people can easily think of relevant examples, they are far more likely to be frightened and concerned than if they cannot. A risk that is familiar, like that associated with terrorism in the aftermath of 9/11, will be seen as more serious than a risk that is less familiar, like that associated with sunbathing or hotter summers.
Thus vivid and easily imagined causes of death (for example, tornadoes) often receive inflated estimates of probability, and less-vivid causes (for example, asthma attacks) receive low estimates, even if they occur with a far greater frequency (here a factor of twenty). So, too, recent events have a greater impact on our behavior, and on our fears, than earlier ones. In all these highly available examples, the Automatic System is keenly aware of the risk (perhaps too keenly), without having to resort to any tables of boring statistics. The availability heuristic helps to explain much risk-related behavior, including both public and private decisions to take precautions. Whether people buy insurance for natural disasters is greatly affected by recent experiences.6 In the aftermath of an earthquake, purchases of new earthquake insurance policies rise sharply—but purchases decline steadily from that point, as vivid memories recede. If floods have not occurred in the immediate past, people who live on floodplains are far less likely to purchase insurance.
A survey by the Harvard School of Public Health found that about 44 percent of college students engaged in binge drinking in the two-week period preceding the survey.16 This is, of course, a problem, but a clue to how to correct it lies in the fact that most students believe that alcohol abuse is far more pervasive than it actually is.17 Misperceptions of this kind result in part from the availability heuristic. Incidents of alcohol abuse are easily recalled, and the consequence is to inflate perceptions. College students are influenced by their beliefs about what other college students do, and hence alcohol abuse will inevitably increase if students have an exaggerated sense of how much other students are drinking. Alert to the possibility of changing behavior by emphasizing the statistical reality, many public officials have tried to nudge people in better directions.
Superforecasting: The Art and Science of Prediction by Philip Tetlock, Dan Gardner
Affordable Care Act / Obamacare, Any sufficiently advanced technology is indistinguishable from magic, availability heuristic, Black Swan, butterfly effect, cloud computing, cuban missile crisis, Daniel Kahneman / Amos Tversky, desegregation, drone strike, Edward Lorenz: Chaos theory, forward guidance, Freestyle chess, fundamental attribution error, germ theory of disease, hindsight bias, index fund, Jane Jacobs, Jeff Bezos, Kenneth Arrow, Mikhail Gorbachev, Mohammed Bouazizi, Nash equilibrium, Nate Silver, obamacare, pattern recognition, performance metric, Pierre-Simon Laplace, place-making, placebo effect, prediction markets, quantitative easing, random walk, randomized controlled trial, Richard Feynman, Richard Thaler, Robert Shiller, Ronald Reagan, Saturday Night Live, Silicon Valley, Skype, statistical model, stem cell, Steve Ballmer, Steve Jobs, Steven Pinker, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Watson beat the top human players on Jeopardy!
“Should I worry about the shadow in the long grass?” is a hard question. Without more data, it may be unanswerable. So we substitute an easier question: “Can I easily recall a lion attacking someone from the long grass?” That question becomes a proxy for the original question and if the answer is yes to the second question, the answer to the first also becomes yes. So the availability heuristic—like Kahneman’s other heuristics—is essentially a bait-and-switch maneuver. And just as the availability heuristic is usually an unconscious System 1 activity, so too is bait and switch.18 Of course we aren’t always oblivious to the machinations of our minds. If someone asks about climate change, we may say, “I have no training in climatology and haven’t read any of the science. If I tried to answer based on what I know I’d make a mess of it.
If that memory comes to you easily—it is not the sort of thing people tend to forget—you will conclude lion attacks are common. And then start to worry. Spelling out this process makes it sound ponderous, slow, and calculating but it can happen entirely within System 1—making it automatic, fast, and complete within a few tenths of a second. You see the shadow. Snap! You are frightened—and running. That’s the “availability heuristic,” one of many System 1 operations—or heuristics—discovered by Daniel Kahneman, his collaborator Amos Tversky, and other researchers in the fast-growing science of judgment and choice. A defining feature of intuitive judgment is its insensitivity to the quality of the evidence on which the judgment is based. It has to be that way. System 1 can only do its job of delivering strong conclusions at lightning speed if it never pauses to wonder whether the evidence at hand is flawed or inadequate, or if there is better evidence elsewhere.
Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Nicholas Taleb
Antoine Gombaud: Chevalier de Méré, availability heuristic, backtesting, Benoit Mandelbrot, Black Swan, commoditize, complexity theory, corporate governance, corporate raider, currency peg, Daniel Kahneman / Amos Tversky, discounted cash flows, diversified portfolio, endowment effect, equity premium, fixed income, global village, hindsight bias, Kenneth Arrow, Long Term Capital Management, loss aversion, mandelbrot fractal, mental accounting, meta-analysis, Myron Scholes, Paul Samuelson, quantitative trading / quantitative finance, QWERTY keyboard, random walk, Richard Feynman, road to serfdom, Robert Shiller, selection bias, shareholder value, Sharpe ratio, Steven Pinker, stochastic process, survivorship bias, too big to fail, Turing test, Yogi Berra
When Tversky and Kahneman sampled mathematical psychologists, some of whom were authors of statistical textbooks, they were puzzled by their errors. “Respondents put too much confidence in the result of small samples and their statistical judgment showed little sensitivity to sample size.” The puzzling aspect is that not only should they have known better, “they did know better.” And yet . . . I will next list a few more heuristics. (1) The availability heuristic, which we saw in Chapter 3 with the earthquake in California deemed more likely than catastrophe in the entire country, or death from terrorism being more “likely” than death from all possible sources (including terrorism). It corresponds to the practice of estimating the frequency of an event according to the ease with which instances of the event can be recalled. (2) The representativeness heuristic: gauging the probability that a person belongs to a particular social group by assessing how similar the person’s characteristics are to the “typical” group member’s.
Simpson had 1/500,000 chance of not being the killer from the blood standpoint (remember the lawyers used the sophistry that there were four people with such blood types walking around Los Angeles) and adding to it the fact that he was the husband of the person and that there was additional evidence, then (owing to the compounding effect) the odds against him rise to several trillion trillion. “Sophisticated” people make worse mistakes. I can surprise people by saying that the probability of the joint event is lower than either. Recall the availability heuristic: with the Linda problem rational and educated people finding the likelihood of an event greater than that of a larger one that encompasses it. I am glad to be a trader taking advantage of people’s biases but I am scared of living in such a society. An Absurd World Kafka’s prophetic book, The Trial, about the plight of a man, Joseph K., who is arrested for a mysterious and unexplained reason, hit a spot as it was written before we heard of the methods of the “scientific” totalitarian regimes.
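The conjunction rule the excerpt invokes is easy to verify numerically: a joint event can never be more probable than either of its components. A minimal sketch for the Linda problem, with made-up illustrative probabilities (none of these numbers come from the text):

```python
# Conjunction rule: P(A and B) <= min(P(A), P(B)), always.
# The probabilities below are invented purely for illustration.
p_teller = 0.05                 # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.2   # assumed P(feminist | bank teller)

# Probability of the conjunction "bank teller AND feminist"
p_both = p_teller * p_feminist_given_teller

# The conjunction cannot exceed the single event, so ranking
# "feminist bank teller" above "bank teller" is incoherent.
assert p_both <= p_teller
```

Whatever conditional probability one assumes for the second attribute, multiplying by it can only shrink the number, which is exactly why judging the conjunction as more likely than the event that contains it is an error.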
Risk and emotions: Given the growing recent interest in the emotional role in behavior, there has been a growing literature on the role of emotions in both risk bearing and risk avoidance: The “risk as feeling” theory: See Loewenstein, Weber, Hsee and Welch (2001), and Slovic, Finucane, Peters and MacGregor (2003a). For a survey, see Slovic, Finucane, Peters and MacGregor (2003b). See also Slovic (1987). For a discussion of the affect heuristic: See Finucane, Alhakami, Slovic and Johnson (2000). Emotions and cognition: For the effect of emotions on cognition, see LeDoux (2002). Availability heuristic (how easily things come to mind): Tversky and Kahneman (1973). Real incidence of catastrophes: For an insightful discussion, see Albouy (2002). On sayings and proverbs: Psychologists have long examined the gullibility of people in social settings facing well-sounding proverbs. For instance, experiments since the 1960s have been made where people are asked whether they believed that a proverb is right, while another cohort is presented the opposite meaning.
Affordable Care Act / Obamacare, algorithmic trading, Andrei Shleifer, asset-backed security, availability heuristic, bank run, banking crisis, Black-Scholes formula, bonus culture, break the buck, Bretton Woods, call centre, Carmen Reinhart, cloud computing, collapse of Lehman Brothers, collateralized debt obligation, computerized trading, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, Daniel Kahneman / Amos Tversky, David Graeber, diversification, diversified portfolio, Edmond Halley, Edward Glaeser, endogenous growth, Eugene Fama: efficient market hypothesis, eurozone crisis, family office, financial deregulation, financial innovation, fixed income, Flash crash, Google Glasses, Gordon Gekko, high net worth, housing crisis, Hyman Minsky, implied volatility, income inequality, index fund, information asymmetry, Innovator's Dilemma, interest rate swap, Kenneth Rogoff, Kickstarter, late fees, London Interbank Offered Rate, Long Term Capital Management, loss aversion, margin call, Mark Zuckerberg, McMansion, money market fund, mortgage debt, mortgage tax deduction, Myron Scholes, negative equity, Network effects, Northern Rock, obamacare, payday loans, peer-to-peer lending, Peter Thiel, principal–agent problem, profit maximization, quantitative trading / quantitative finance, railway mania, randomized controlled trial, Richard Feynman, Richard Thaler, risk tolerance, risk-adjusted returns, Robert Shiller, short selling, Silicon Valley, Silicon Valley startup, Skype, South Sea Bubble, sovereign wealth fund, statistical model, transaction costs, Tunguska event, unbanked and underbanked, Vanguard fund, web application
Even the rational buyer has an incentive to get into the market when prices are on a tear. The amount of space that people need increases predictably over time as they find partners and have children; it makes sense to buy early in order to protect themselves against the risk of future price increases that would make houses unaffordable. When prices start going up, another behavioral bias starts to kick in. The “availability heuristic” captures the propensity of people to assess situations by referring to examples that come readily to mind. A 2008 paper by Hugo Benitez-Silva, Selcuk Eren, Frank Heiland, and Sergi Jiménez-Martín used the Health and Retirement Study, a biennial survey of Americans over the age of fifty, to compare people’s estimates of the value of their homes with actual values when a sale took place.
Index AAA credit ratings, 49–51, 233–236 AARP Public Policy Institute, report on home ownership by, 139 Abacus, 235 Accenture, 54, 56 Adaptive-market hypothesis, 115–116 Adelino, Manuel, 49 Adoption, SIB program for, 97 Adverse selection, 21, 174, 175, 182 AIG (American International Group), 65 AIR Worldwide, 222, 225 Alabama, land boom in, 74–75 Algorithms, 53–54, 56–57, 62–63, 113, 202, 216–217 Alibaba.com, 219 Allia, 108 Alzheimer’s disease, megafund for, 122 Amazon, 162, 216–217, 219 American Diabetes Association, 102 American Dream Downpayment Act of 2003, 78 American International Group (AIG), 65 American Railroad Journal, 24 American Research and Development Corporation, 150 Amsterdam Stock Exchange, 14–15, 24, 38 Anchoring effect, 137–138 Annuities, 20–22, 139 Apax Partners, 91 Aristotle, 10 Asian debt crisis (1990s), x, 30 Asian Development Bank, 27 Auto-enrollment in pension schemes, 135 Auto-escalation, 135–136 Availability heuristic, 73 Baby boomers, retirement rate of, 125 Bailouts, xi, 35, 65 Bank, derivation of word, 12 Bank deregulation, effect of on college enrollments, 171 Bank for International Settlements (BIS), 224, 226 Bank of America, 98 Banks advantages of, 192–193 bailouts of, xi commercial paper, use of, 185 competition, response to, 193–194 crisis, episodes of, 35–36 in Dark Ages, 11 deposits, xiv, 12–13 equity, 186–187 innovator’s dilemma, 189 leverage, 50, 70–71, 80, 186, 188 liquidity and, 12–14, 185–186, 193 operating expenses, 188 profits of, ix property and, xiv, 69, 75–80 public attitudes toward, ix, xi purpose of, 11–14 raising returns in, 51 repurchase “repo” markets, 15, 185 runs on, x, 13, 185 secured lending, xiv unbanked households, 200 Barbon, Nicholas, 16–17 Basel accords, 77 Basildon, England, 52–53, 58 Bass, Oren, 166, 168 Behavioral finance, 132–138, 208–214 Belinsky, Michael, 103 Benartzi, Shlomo, 136 Benitez-Silva, Hugo, 73 Bernoulli, Jacob, 18 Betting on Lives (Clark), 144 Bid-ask spreads, 55 Big data, xviii, 22, 47, 
199, 201, 218, 236 Big Society Capital, 95 Biotechnology, decline in investment in, xii-xiii, 114–115 Black, Fischer, 31, 123–124 Black Monday, October, 1987, 62 BlackRock, 132 Black-Scholes equation, 31, 32, 124 Blackstone, 85 Blood donation, experiment with, 110 Bloomberg, Michael, 98 Bonds attractiveness to investors, 120 catastrophe, 224–227 income, 25 inflation protected, 26 samurai, 27 Book of Calculation (Fibonacci), 19 Bottomry, 8 Brain, reaction of to monetary rewards, 116 Brazil, financial liberalization of, 34 Breslow, Noah, 216, 219 Bretton Woods system, 30 Bridges Ventures, 93 Britain average age of first-time home buyer, 84 average house price, 74 banking crisis, 69 equity-crowdfunding, 154 government spending, 99 life expectancy, 125 peer-to-peer lending, 181 social-impact bonds (SIB), 95–97 student indebtedness, 171 total residential property value, 69–70 Brown, Gordon, 93 Bucket price (okenedan), 40 Bullae (early financial contracts), 5 Bush, George W., 78 Byng, John, 143 Call options, 9–10, 131 Calment, Jeanne, 144 Cameron, David, 95 Cancer megafund.
The Science of Fear: How the Culture of Fear Manipulates Your Brain by Daniel Gardner
Atul Gawande, availability heuristic, Black Swan, Cass Sunstein, citizen journalism, cognitive bias, cognitive dissonance, Columbine, correlation does not imply causation, Daniel Kahneman / Amos Tversky, David Brooks, Doomsday Clock, feminist movement, haute couture, hindsight bias, illegal immigration, Intergovernmental Panel on Climate Change (IPCC), mandatory minimum, medical residency, Mikhail Gorbachev, millennium bug, moral panic, mutually assured destruction, nuclear winter, placebo effect, Ralph Nader, RAND corporation, Ronald Reagan, Stephen Hawking, Steven Levy, Steven Pinker, the scientific method, Tunguska event, uranium enrichment, Y2K, young professional
Be afraid! And you will be. You won’t know why, really, because System One’s operations are unconscious. You’ll just have an uneasy feeling that taking a walk is dangerous—a feeling you would have trouble explaining to someone else. What System One did is apply a simple rule of thumb: If examples of something can be recalled easily, that thing must be common. Psychologists call this the “availability heuristic.” Obviously, System One is both brilliant and flawed. It is brilliant because the simple rules of thumb System One uses allow it to assess a situation and render a judgment in an instant—which is exactly what you need when you see a shadow move at the back of an alley and you don’t have the latest crime statistics handy. But System One is also flawed because the same rules of thumb can generate irrational conclusions.
There was no grand theorizing, only research so solid it would withstand countless challenges in the years ahead. Like the paper itself, the three rules of thumb it revealed were admirably simple and clear. The first—the Anchoring Rule—we’ve already discussed. The second is what psychologists call the “representativeness heuristic,” which I’ll call the Rule of Typical Things. And finally, there is the “availability heuristic,” or the Example Rule, which is by far the most important of the three in shaping our perceptions and reactions to risk. THE RULE OF TYPICAL THINGS Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
But it’s strange that people let their insurance lapse as time passes. And it’s downright bizarre that people don’t rush to get insurance when scientists issue warnings. At least, it makes no sense to Head. To Gut, it makes perfect sense. One of Gut’s simplest rules of thumb is that the easier it is to recall examples of something, the more common that something must be. This is the “availability heuristic,” which I call the Example Rule. Kahneman and Tversky demonstrated the influence of the Example Rule in a typically elegant way. First, they asked a group of students to list as many words as they could think of that fit the form _ _ _ _ _ n _. The students had 60 seconds to work on the problem. The average number of words they came up with was 2.9. Then another group of students was asked to do the same, with the same time limit, for words that fit the form _ _ _ _ ing.
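The trap in this demonstration is a containment relation: every seven-letter word ending in "ing" necessarily has "n" as its sixth letter, so the second category contains the first and cannot be smaller — yet "-ing" words are far easier to bring to mind. A minimal sketch over a small, hand-picked word list (the list is illustrative, not the one used in the experiment):

```python
# Every 7-letter word ending in "ing" also has "n" as its sixth letter,
# so the "_ _ _ _ _ n _" set is a superset of the "_ _ _ _ ing" set.
# The word list here is a small, invented sample for illustration.
words = ["running", "jumping", "looking", "payment", "instant",
         "burning", "evident", "descent", "raiment", "singing",
         "teacher", "picture"]

ends_in_ing = {w for w in words if len(w) == 7 and w.endswith("ing")}
n_in_sixth = {w for w in words if len(w) == 7 and w[5] == "n"}

# Containment: the -ing words are a subset of the sixth-letter-n words.
assert ends_in_ing <= n_in_sixth
assert len(n_in_sixth) >= len(ends_in_ing)

# Words like "payment" satisfy the sixth-letter-n form without ending
# in -ing, but they are much harder to retrieve from memory.
assert n_in_sixth - ends_in_ing == {"payment", "instant", "evident",
                                    "descent", "raiment"}
```

Recall works by word beginnings and endings, not by interior letters, so the easier-to-generate "-ing" form feels more numerous even though it is provably a subset.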
How Markets Fail: The Logic of Economic Calamities by John Cassidy
Albert Einstein, Andrei Shleifer, anti-communist, asset allocation, asset-backed security, availability heuristic, bank run, banking crisis, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, Black-Scholes formula, Bretton Woods, British Empire, capital asset pricing model, centralized clearinghouse, collateralized debt obligation, Columbine, conceptual framework, Corn Laws, corporate raider, correlation coefficient, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, crony capitalism, Daniel Kahneman / Amos Tversky, debt deflation, diversification, Elliott wave, Eugene Fama: efficient market hypothesis, financial deregulation, financial innovation, Financial Instability Hypothesis, financial intermediation, full employment, George Akerlof, global supply chain, Gunnar Myrdal, Haight Ashbury, hiring and firing, Hyman Minsky, income per capita, incomplete markets, index fund, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), invisible hand, John Nash: game theory, John von Neumann, Joseph Schumpeter, Kenneth Arrow, laissez-faire capitalism, Landlord’s Game, liquidity trap, London Interbank Offered Rate, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, market bubble, market clearing, mental accounting, Mikhail Gorbachev, money market fund, Mont Pelerin Society, moral hazard, mortgage debt, Myron Scholes, Naomi Klein, negative equity, Network effects, Nick Leeson, Northern Rock, paradox of thrift, Pareto efficiency, Paul Samuelson, Ponzi scheme, price discrimination, price stability, principal–agent problem, profit maximization, quantitative trading / quantitative finance, race to the bottom, Ralph Nader, RAND corporation, random walk, Renaissance Technologies, rent control, Richard Thaler, risk tolerance, risk-adjusted returns, road to serfdom, Robert Shiller, Ronald Coase, Ronald Reagan, shareholder value, short selling, Silicon Valley, South Sea Bubble, sovereign wealth fund, statistical model, technology bubble, The Chicago School, The Great Moderation, The Market for Lemons, The Wealth of Nations by Adam Smith, too big to fail, transaction costs, unorthodox policies, value at risk, Vanguard fund, Vilfredo Pareto, wealth creators, zero-sum game
And it isn’t just their personal histories that cloud people’s judgment. Dramatic and salient events of any kind stick in their minds, whereas they are apt to downplay everyday happenings. Since 9/11, for example, many Americans fret more about being killed in a terrorist attack than in a road accident, even though in advanced countries the latter outcome is roughly four hundred times more likely. Kahneman and Tversky refer to this sort of thing as the “availability heuristic.” When thinking about the dangers we face, getting blown up comes to mind more readily than getting run over: it is more available. Kahneman and Tversky also pointed out that people have a general inclination to judge things relative to arbitrary reference points. When the status quo is their reference point, people tend to assume that things won’t change very much—the bias of conservatism.
“He may convince himself that the valuation is right and that the market does not reflect the full economic value of the combined firm.” Hubris comes in many forms. Especially when the economy is doing well, businesses and individuals find it increasingly difficult to imagine that anything very bad could happen—a phenomenon known as “disaster myopia.” The representativeness heuristic obviously plays a role here, but so does the availability heuristic. By definition, low-probability events such as stock market crashes and credit crunches occur rarely, which means that many people don’t have any personal experience of them to draw on. After the 1981–82 recession, it was almost twenty years before the stock market underwent another lengthy downturn. (After the 1987 crash, the market rebounded rapidly.) With the market going up, a lot of money was made.
They know how to track these defaults, and they do a reasonable job of setting aside reserves to meet them. But Guttentag and Herring pointed out that the banks tend to underestimate the chances of a systemic shock that could render many of their lenders simultaneously unable to repay their loans, such as the American economy plunging into a deep recession, or a sovereign government defaulting on its loans. When the economy is growing strongly, such a possibility is difficult to imagine (the availability heuristic), and bankers downplay it. Eventually, the danger comes to be seen as so remote that it is ignored (the threshold heuristic), and banks take on too much lending exposure relative to their capital. Myopia is another mental trait that behavioral economists have examined. In the late 1990s and early 2000s, the U.S. personal savings rate fell sharply. Eventually, it turned negative, meaning Americans were spending more than they were earning, and were running up debts.
Pedigree: How Elite Students Get Elite Jobs by Lauren A. Rivera
affirmative action, availability heuristic, barriers to entry, Donald Trump, fundamental attribution error, glass ceiling, income inequality, job satisfaction, knowledge economy, meta-analysis, new economy, performance metric, profit maximization, profit motive, school choice, Silicon Valley, Silicon Valley startup, The Wisdom of Crowds, unpaid internship, women in the workforce, young professional
Moreover, Americans often conflate the idea of an economic class structure with that of a caste structure; if there is any mobility, it is taken as evidence of a lack of the former.4 And indeed there is some minor fluidity between strata. Most people can think of at least one person they know (often from one or more generations ago) who rose to riches despite modest means or fell from positions of affluence. In line with what behavioral economists call the availability heuristic, we tend to overgeneralize from these familiar cases to believe that mobility is far more common (and possible) than it is.5 The reality is that in any class system, including ours, mobility happens.6 But the deck is stacked against it. Chances are that children will end up in the same economic quintile as their parents.7 Furthermore, studying economic elites tends to be problematic in the United States not only because of the types of ideological barriers noted above but also because, unlike the United Kingdom, for example, the United States lacks historic metrics for measuring relative class position.
It is not interviewers’ heart rates, cortisol levels, or bodily distance, nor the objective facts of an interaction that they record on interview forms and refer to during group deliberations that provide the basis for a final decision. Rather, decisions turn on interviewers’ subjective interpretations of interviews and candidates. Moreover, people typically draw from cognitive attributions and interpretations when making complex, high-stakes decisions and those where there is a sense of personal accountability (Leach and Tiedens 2004), such as in the hiring decisions analyzed here. 11. This is a subset of the availability heuristic. For a review of this and other cognitive biases in decision making, see Kahneman 2011. 12. Chen and Miller 2012; Duckworth et al. 2007. 13. Durkheim (1912) 1995. See also Collins 2004. 14. Granfield 1992; Lubrano 2005. 15. Phillips, Rothbard, and Dumas 2009. 16. Bourdieu 1984. 17. Lamont (1992) finds a similar aversion to individuals who are motivated by money rather than personal enjoyment and moral character among American, upper-middle-class professionals and managers. 18.
See candidates Armstrong, Elizabeth, 13 arrogance, 178–79, 199 autobiographical narratives, 147–82, 259, 269; assessment of drive in, 147, 149–61, 333–34nn10–11; assessment of interest in firm in, 147, 161–69, 255–56, 335n17; assessment of polish in, 147, 170–81, 335n19; compelling qualities of, 148–55, 334n10; cultural dimensions of, 148; evidence of growth and maturation in, 153–54; evidence of knowledge of firm in, 165–68; resonance with interviewers of, 149, 155–56, 165–66, 182, 257–58, 335n11; vivid images of obstacles in, 156–61 availability heuristic, 287, 335n11 baller lifestyle, 59–62, 272 “better match” hypothesis, 49 Biglaw. See law firms Blake Thomas (fictitious candidate), 89, 92; extracurricular activities of, 94–95, 97; résumé of, 303f boasting, 179 bounded excitement, 176–78, 336n27 Bourdieu, Pierre: on class-based linguistic and interaction styles, 180; on class-specific cultural values, 7, 268; on distance from necessity, 163–64, 269, 330n16; on embodied cultural capital, 7, 316–17n31; on habitus, 330n29; on institutionalized cultural capital, 110, 330n32, 335n21; on objectified cultural capital, 316–17n31; on the power of consecration, 318n45 brainpower, 87–90 breaking the ice.
The Irrational Economist: Making Decisions in a Dangerous World by Erwann Michel-Kerjan, Paul Slovic
Andrei Shleifer, availability heuristic, bank run, Black Swan, Cass Sunstein, clean water, cognitive dissonance, collateralized debt obligation, complexity theory, conceptual framework, corporate social responsibility, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-subsidies, Daniel Kahneman / Amos Tversky, endowment effect, experimental economics, financial innovation, Fractional reserve banking, George Akerlof, hindsight bias, incomplete markets, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Isaac Newton, iterative process, Kenneth Arrow, Loma Prieta earthquake, London Interbank Offered Rate, market bubble, market clearing, money market fund, moral hazard, mortgage debt, Pareto efficiency, Paul Samuelson, placebo effect, price discrimination, price stability, RAND corporation, Richard Thaler, Robert Shiller, Ronald Reagan, source of truth, statistical model, stochastic process, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, ultimatum game, University of East Anglia, urban planning, Vilfredo Pareto
Third, the expected consequence of an event represents a combination of its likelihood and its severity. The floods in 1993 were much more severe than anything previously experienced. There should have been an update in the belief of residents there about the severity of future floods, a factor that would certainly affect housing values.5 Two subject areas in behavioral economics, prospect theory and the availability heuristic, help explain the over-updating of virgin risks and the under-updating of experienced risks after an extreme event. A finding of prospect theory is that individuals place excess weight on zero. The Russian Roulette problem illustrates this phenomenon. Most people are willing to pay more to remove one bullet from a six-cylinder gun when it is the only bullet than if there are two (or more) bullets in the gun.
The enormous change in perception when a probability goes from 0 to positive is consistent with evidence from other areas. The theory of just noticeable differences explores such phenomena. For instance, as a noise gets louder, a greater change in volume is needed to make the change perceptible. This is similar to our argument that as base probabilities get larger, small changes in probability are not perceived as well. The availability heuristic also supports our conjecture. It asserts that individuals assess the probability of an event as higher when examples come to mind more readily. Once an event has occurred, it is much more salient, leading individuals to overestimate its probability. While the first occurrence of a risk makes it suddenly salient, the third occurrence, say, does not add much to its availability. This would explain the substantial updating for virgin risks and the relatively little updating for previously experienced risks.
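The excess weight placed on the move away from zero can be made concrete with prospect theory's probability-weighting function. The parametric form below, w(p) = p^γ / (p^γ + (1−p)^γ)^(1/γ) with γ ≈ 0.61 for gains, is the one estimated by Tversky and Kahneman (1992); the specific increment comparison is a sketch added here for illustration, not taken from this text:

```python
# Prospect-theory probability weighting (Tversky & Kahneman, 1992):
#   w(p) = p**g / (p**g + (1 - p)**g) ** (1 / g)
# With g < 1 the curve overweights small probabilities, so the step
# from p = 0 to p = 0.01 looms far larger than the same-sized step
# from p = 0.10 to p = 0.11.
def weight(p, g=0.61):  # g = 0.61 is their median estimate for gains
    return p**g / (p**g + (1 - p)**g) ** (1 / g)

# Sanity checks: the endpoints are not distorted.
assert weight(0.0) == 0.0
assert abs(weight(1.0) - 1.0) < 1e-12

jump_from_zero = weight(0.01) - weight(0.00)   # roughly 0.055
jump_mid_range = weight(0.11) - weight(0.10)   # roughly 0.009

# The same 1-point change in probability feels several times larger
# when it removes the possibility entirely.
assert jump_from_zero > 5 * jump_mid_range
```

This is the same asymmetry as the Russian Roulette example above: removing the last bullet eliminates the risk, and the weighting curve says that final step is worth disproportionately more.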
Mastermind: How to Think Like Sherlock Holmes by Maria Konnikova
Albert Einstein, Alfred Russel Wallace, availability heuristic, Daniel Kahneman / Amos Tversky, dark matter, delayed gratification, fear of failure, feminist movement, functional fixedness, Lao Tzu, pre–internet, Richard Feynman, Steve Jobs, Steven Pinker, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Walter Mischel
Watson is conveniently forgetting how long it took to get to know his past companions—assuming he ever got to know them at all. (Consider also that Watson is a bachelor, just returned from war, wounded, and largely friendless. What would his chronic motivational state likely be? Now, imagine he’d been instead married, successful, the toast of the town. Replay his evaluation of Mary accordingly.) This tendency is a common and powerful one, known as the availability heuristic: we use what is available to the mind at any given point in time. And the easier it is to recall, the more confident we are in its applicability and truth. In one of the classic demonstrations of the effect, individuals who had read unfamiliar names in the context of a passage later judged those names as famous—based simply on the ease with which they could recall them—and were subsequently more confident in the accuracy of their judgments.
INDEX activation, ref1, ref2, ref3, ref4, ref5, ref6, ref7, ref8, ref9, ref10 activation spread, ref1, ref2 active perception, compared with passive perception, ref1 Adams, Richard, ref1 adaptability, ref1 ADHD, ref1 “The Adventure of the Abbey Grange,” ref1, ref2, ref3, ref4, ref5 “The Adventure of the Blue Carbuncle,” ref1, ref2 “The Adventure of the Bruce-Partington Plans,” ref1, ref2, ref3, ref4 “The Adventure of the Copper Beeches,” ref1, ref2, ref3 “The Adventure of the Creeping Man,” ref1 “The Adventure of the Devil’s Foot,” ref1, ref2 “The Adventure of the Dying Detective,” ref1 “The Adventure of the Mazarin Stone,” ref1 “The Adventure of the Norwood Builder,” ref1, ref2, ref3, ref4, ref5, ref6 “The Adventure of the Priory School,” ref1, ref2, ref3 “The Adventure of the Red Circle,” ref1, ref2, ref3, ref4, ref5 “The Adventure of the Second Stain,” ref1 “The Adventure of the Veiled Lodger,” ref1 “The Adventure of Wisteria Lodge,” ref1, ref2 affect heuristic, ref1 Anson, George, ref1 associative activation, ref1, ref2, ref3, ref4 astronomy, and Sherlock Holmes, ref1 Atari, ref1 attention, paying, ref1, ref2, ref3, ref4, ref5, ref6, ref7 attentional blindness, ref1 Auden, W. H., ref1 availability heuristic, ref1 Bacon, Francis, ref1 Barrie, J. 
M., ref1 base rates, ref1, ref2 Baumeister, Roy, ref1 Bavelier, Daphné, ref1 Bell, Joseph, ref1, ref2, ref3, ref4, ref5 Bem, Daryl, ref1, ref2 bias, implicit, ref1, ref2, ref3 BlackBerry, ref1 brain and aging process, ref1 baseline, ref1 cerebellum, ref1 cingulate cortex, ref1, ref2, ref3 corpus callosum, ref1 frontal cortex, ref1 hippocampus, ref1, ref2, ref3 parietal cortex, ref1 precuneus, ref1 prefrontal cortex, ref1 split, ref1, ref2, ref3 temporo-parietal junction (TPJ), ref1 temporal gyrus, ref1 temporal lobes, ref1 wandering, ref1, ref2 Watson’s compared with Holmes’, ref1 brain attic contents, ref1, ref2 defined, ref1 levels of storage, ref1 and memory, ref1 structure, ref1, ref2 System Watson compared with System Holmes, ref1, ref2 Watson’s compared with Holmes’s, ref1, ref2 Brett, Jeremy, ref1 capital punishment, ref1 Carpenter, William B., ref1 “The Case of the Crooked Lip,” ref1 cell phone information experiment, ref1 cerebellum, ref1 childhood, mindfulness in, ref1 cingulate cortex, ref1, ref2, ref3 cocaine, ref1 Cognitive Reflection Test (CRT), ref1, ref2 common sense, systematized, ref1, ref2 compound remote associates, ref1 Conan Doyle, Arthur becomes spiritualist, ref1 creation of Sherlock Holmes character, ref1 and fairy photos, ref1, ref2, ref3, ref4, ref5 and Great Wyrley sheep murders, ref1, ref2, ref3 and Joseph Bell, ref1, ref2, ref3, ref4, ref5 confidence, ref1, ref2.
Inside the Nudge Unit: How Small Changes Can Make a Big Difference by David Halpern
Affordable Care Act / Obamacare, availability heuristic, carbon footprint, Cass Sunstein, centre right, choice architecture, cognitive dissonance, collaborative consumption, correlation does not imply causation, Daniel Kahneman / Amos Tversky, endowment effect, happiness index / gross national happiness, hindsight bias, illegal immigration, job satisfaction, Kickstarter, libertarian paternalism, light touch regulation, market design, meta-analysis, Milgram experiment, nudge unit, peer-to-peer lending, pension reform, presumed consent, QR code, quantitative easing, randomized controlled trial, Richard Feynman, Richard Thaler, Right to Buy, Ronald Reagan, Rory Sutherland, Simon Kuznets, skunkworks, the built environment, theory of mind, traffic fines, World Values Survey
For example, people generally don’t estimate the safety of air versus car travel by dividing the number of crashes over the last year by the number of planes versus cars travelling in the world over that time. Rather, most people use a mental shortcut based on how easily they can recall examples of planes versus cars crashing – what Tversky and Kahneman called an ‘availability’ heuristic. The more easily a person can call an example to mind, the more likely or common they infer it to be. It’s generally not a bad heuristic: it gives you a pretty good idea of how many tigers versus pigeons you might meet walking around the streets of London or New York. But when it comes to aeroplane versus car safety, the availability heuristic can lead us badly astray. Rare but devastating air crashes make the news and stick in our minds, while the daily death toll on our roads passes largely without comment or lasting attention. As such, most people ‘feel’ that flying is much more dangerous than travelling by car, even though the statistics – certainly when expressed per mile travelled – suggest the reverse.12 These erroneous estimates can have enormous consequences.
Progress: Ten Reasons to Look Forward to the Future by Johan Norberg
agricultural Revolution, anti-communist, availability heuristic, Bartolomé de las Casas, Berlin Wall, British Empire, business climate, clean water, continuation of politics by other means, Daniel Kahneman / Amos Tversky, demographic transition, desegregation, Donald Trump, Flynn Effect, germ theory of disease, Gini coefficient, Gunnar Myrdal, Haber-Bosch Process, Hans Island, Hans Rosling, Ignaz Semmelweis: hand washing, income inequality, income per capita, indoor plumbing, Isaac Newton, Jane Jacobs, John Snow's cholera map, Kibera, Louis Pasteur, Mahatma Gandhi, meta-analysis, Mikhail Gorbachev, more computing power than Apollo, moveable type in China, Naomi Klein, open economy, place-making, Rosa Parks, sexual politics, special economic zone, Steven Pinker, telerobotics, The Wealth of Nations by Adam Smith, transatlantic slave trade, very high income, working poor, Xiaogang Anhui farmers, zero-sum game
When news reporters do not have access to a spectacular event, the gaps often get filled with rumours and horror stories. When something bad happens anywhere, two billion smartphones nowadays make sure that we find out, even if no reporters are on the scene. The psychologists Daniel Kahneman and Amos Tversky have shown that people do not base their estimates of how frequent something is on data, but on how easy it is to recall examples from memory.16 This ‘availability heuristic’ means that the more memorable an incident is, the more probable we think it is, so we imagine that horrible and shocking things, which stay in our thoughts, are more frequent than they really are. We are probably built to be worried. We are interested in exceptions. We notice the new things, the strange and unexpected. It’s natural. We don’t have to explain and understand normal, everyday things, but we do need to understand the exceptions.
airport security, availability heuristic, Bayesian statistics, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, big-box store, Black Swan, Broken windows theory, Carmen Reinhart, Claude Shannon: information theory, Climategate, Climatic Research Unit, cognitive dissonance, collapse of Lehman Brothers, collateralized debt obligation, complexity theory, computer age, correlation does not imply causation, Credit Default Swap, credit default swaps / collateralized debt obligations, cuban missile crisis, Daniel Kahneman / Amos Tversky, diversification, Donald Trump, Edmond Halley, Edward Lorenz: Chaos theory, en.wikipedia.org, equity premium, Eugene Fama: efficient market hypothesis, everywhere but in the productivity statistics, fear of failure, Fellow of the Royal Society, Freestyle chess, fudge factor, George Akerlof, haute cuisine, Henri Poincaré, high batting average, housing crisis, income per capita, index fund, Intergovernmental Panel on Climate Change (IPCC), Internet Archive, invention of the printing press, invisible hand, Isaac Newton, James Watt: steam engine, John Nash: game theory, John von Neumann, Kenneth Rogoff, knowledge economy, locking in a profit, Loma Prieta earthquake, market bubble, Mikhail Gorbachev, Moneyball by Michael Lewis explains big data, Monroe Doctrine, mortgage debt, Nate Silver, negative equity, new economy, Norbert Wiener, PageRank, pattern recognition, pets.com, Pierre-Simon Laplace, prediction markets, random walk, Richard Thaler, Robert Shiller, Rodney Brooks, Ronald Reagan, Saturday Night Live, savings glut, security theater, short selling, Skype, statistical model, Steven Pinker, The Great Moderation, The Market for Lemons, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, transfer pricing, University of East Anglia, Watson beat the top human players on 
Jeopardy!, wikimedia commons
Thus, cockpit doors were not tightly sealed and were often left entirely unlocked in practice.35 Yet suicide attacks had a rich history36—including, of course, the Japanese kamikaze pilots in World War II.37 Moreover, suicide attacks had become much more common in the years immediately preceding September 11; one database of terrorist incidents38 documented thirty-nine of them in 2000 alone, including the bombing of the USS Cole in Yemen, up from thirty-one in the 1980s. However, World War II was a distant memory, and most of the suicide attacks had occurred in the Middle East or in Third World countries. The mental shortcut that Daniel Kahneman calls the availability heuristic39—we tend to overrate the likelihood of events that are nearer to us in time and space and underpredict the ones that aren’t—may have clouded our judgment. “You can reasonably predict behavior if people would prefer not to die,” Rumsfeld told me. “But if people are just as happy dying, or feel that it’s a privilege or that it achieves their goal, then they’re going to behave in a very different way.”
Murrah Federal Building, 425 algorithms, 265, 426 all-in bet, 306 Allison, Graham, 433–35 Al Qaeda, 422, 424, 425, 426, 433, 435–36, 440, 444 Alzheimer’s, 420 Amazon.com, 352–53, 500 American exceptionalism, 10 American Football League (AFL), 185–86, 480 American League, 79 American Stock Exchange, 334 Amsterdam, 228 Anchorage, Alaska, 149 Anderson, Chris, 9 Angelo, Tommy, 324–26, 328 animals, earthquake prediction and, 147–48 Annals of Applied Statistics, 511–12 ANSS catalog, 478 Antarctic, 401 anthropology, 228 antiretroviral therapy, 221 Apple, 264 Archilochus, 53 Arctic, 397, 398 Arianism, 490 Aristotle, 2, 112 Armstrong, Scott, 380–82, 381, 388, 402–3, 405, 505, 508 Arrhenius, Svante, 376 artificial intelligence, 263, 293 Asia, 210 asset-price bubble, 190 asymmetrical information, 35 Augustine, Saint, 112 Australia, 379 autism, 218, 218, 487 availability heuristic, 424 avian flu, see bird flu A/Victoria flu strain, 205–6, 208, 483 Babbage, Charles, 263, 283 Babyak, Michael, 167–68 baby boom, 31 Babylonians, 112 Bachmann, Michele, 217 bailout bills, 19, 461 Bak, Per, 172 Baker, Dean, 22 Bane, Eddie, 87 Bank of England, 35 Barbour, Haley, 140 baseball, 9, 10, 16, 74–106, 128, 426, 446, 447, 451n aging curve in, 79, 81–83, 81, 83, 99, 164 betting on, 286 luck vs. 
skill in, 322 minor league system in, 92–93 results in, 327 rich data in, 79–80, 84 Baseball America, 75, 87, 89, 90, 90, 91 Baseball Encyclopedia, 94 Baseball Prospectus, 75, 78, 88, 297 basic reproduction number (R0), 214–15, 215, 224, 225, 486 basketball, 80n, 92–93, 233–37, 243, 246, 256, 258, 489 batting average, 86, 91, 95, 100, 314, 321, 321, 339 Bayer Laboratories, 11–12, 249 Bayes, Thomas, 240–43, 251, 253, 254, 255, 490 Bayesian reasoning, 240, 241–42, 259, 349, 444 biases and beliefs in, 258–59 chess computers’ use of, 291 Christianity and, 490 in climatology, 371, 377–78, 403, 406–7, 407, 410–11 consensus opinion and, 367 Fisher’s opposition to, 252 gambling esteemed in, 255–56, 362 priors in, 244, 245, 246, 252, 255, 258–59, 260, 403, 406–7, 433n, 444, 451, 490, 497 stock market and, 259–60 Bayes’s theorem, 15, 16, 242, 243–49, 246, 247, 248, 249, 250, 258, 266, 331, 331, 448–49, 450–51 in poker, 299, 301, 304, 306, 307, 322–23 Beane, Billy, 77, 92, 93–94, 99–100, 103, 105–7, 314 Bear Stearns, 37 beauty, complexity and, 173 beer, 387, 459 behavioral economics, 227–28 Belgium, 459 Bellagio, 298–99, 300, 318, 495 bell-curve distribution, 368n, 496 Bengkulu, Indonesia, 161 Benjamin, Joel, 281 Berlin, Isaiah, 53 Berners-Lee, Tim, 448, 514 BetOnSports PLC, 319 bets, see gambling Betsy, Hurricane, 140 betting markets, 201–3, 332–33 see also Intrade biases, 12–13, 16, 293 Bayesian theory’s acknowledgment of, 258–59 in chess, 273 and errors in published research, 250 favorite-longshot, 497 of Fisher, 255 objectivity and, 72–73 toward overconfidence, 179–83, 191, 203, 454 in polls, 252–53 as rational, 197–99, 200 of scouts, 91–93, 102 of statheads, 91–93 of weather forecasts, 134–38 Bible, 2 Wicked, 3, 13 Biden, Joseph, 48 Big Data, 9–12, 197, 249–50, 253, 264, 289, 447, 452 Big Short, The (Lewis), 355 Billings, Darse, 324 Bill James Baseball Abstract, The, 77, 78, 84 bin Laden, Osama, 432, 433, 434, 440, 509 binomial distribution, 479 biological weapons, 437, 
438, 443 biomedical research, 11–12, 183 bird flu, 209, 216, 229 Black, Fisher, 362, 367, 369 “Black Friday,” 320 Black Swan, The (Taleb), 368n Black Tuesday, 349 Blanco, Kathleen, 140 Blankley, Tony, 50 Blodget, Henry, 352–54, 356, 364–65, 500 Blue Chip Economic Indicators survey, 199, 335–36 Bluefire, 110–11, 116, 118, 127, 131 bluffing, 301, 303, 306, 310, 311, 328 Bonus Baby rule, 94 books, 2–4 cost of producing, 2 forecasting and, 5 number of, 2–3, 3, 459 boom, dot-com, 346–48, 361 Boston, 77 Boston Red Sox, 63, 74–77, 87, 102, 103–5 Bowman, David, 161–62, 167 Box, George E.
A Declaration of the Independence of Cyberspace, Albert Einstein, AltaVista, Amazon Mechanical Turk, Asperger Syndrome, availability heuristic, Benoit Mandelbrot, biofilm, Black Swan, British Empire, conceptual framework, corporate governance, Danny Hillis, Douglas Engelbart, Emanuel Derman, epigenetics, Flynn Effect, Frank Gehry, Google Earth, hive mind, Howard Rheingold, index card, information retrieval, Internet Archive, invention of writing, Jane Jacobs, Jaron Lanier, John Markoff, Kevin Kelly, lifelogging, lone genius, loss aversion, mandelbrot fractal, Marc Andreessen, Marshall McLuhan, Menlo Park, meta-analysis, New Journalism, Nicholas Carr, out of africa, Paul Samuelson, peer-to-peer, Ponzi scheme, pre–internet, Richard Feynman, Rodney Brooks, Ronald Reagan, Schrödinger's Cat, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, Skype, slashdot, smart grid, social graph, social software, social web, Stephen Hawking, Steve Wozniak, Steven Pinker, Stewart Brand, Ted Nelson, telepresence, the medium is the message, the scientific method, The Wealth of Nations by Adam Smith, theory of mind, trade route, upwardly mobile, Vernor Vinge, Whole Earth Catalog, X Prize
For more than sixty years, psychologists have been reporting the human tendency to mistake repetition for truth. This is called the illusion-of-truth effect: you believe to be true what you hear often. The same applies to whatever comes to mind first or most easily. People, including you, believe the examples they can think of right away to be most representative, and therefore indicative of the truth. This is called the availability heuristic. A famous example: in English, what is the relative proportion of words that start with the letter K versus words that have K in the third position? Most people believe the former to be more common because they can easily recall many words that start with a K but few that have a K in the third position. In fact, there are three times as many words with K in the third position as in the first.
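The first-versus-third-letter claim is mechanical to check against a word list. A minimal sketch (the helper name and the tiny `sample` list are illustrative; a real check would feed in a full dictionary, e.g. the contents of a system word file):

```python
def count_letter_at(words, letter, position):
    """Count words having `letter` (case-insensitive) at a 1-based `position`."""
    letter = letter.lower()
    return sum(
        1 for w in words
        if len(w) >= position and w[position - 1].lower() == letter
    )

# Tiny illustrative sample, not real frequency data.
sample = ["kite", "king", "ask", "bike", "lake", "oak", "acknowledge"]
k_first = count_letter_at(sample, "k", 1)  # kite, king -> 2
k_third = count_letter_at(sample, "k", 3)  # ask, bike, lake, oak, acknowledge -> 5
```

Running the same two counts over an actual dictionary is the honest version of the exercise the excerpt describes: the tally, not the ease of recall, settles which position is more common.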
The Weather of the Future by Heidi Cullen
2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, air freight, American Society of Civil Engineers: Report Card, availability heuristic, back-to-the-land, bank run, California gold rush, carbon footprint, clean water, colonial rule, energy security, illegal immigration, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, mass immigration, megacity, millennium bug, out of africa, Silicon Valley, smart cities, trade route, urban planning, Y2K
Climatic Change 77 (1–2), 103–120 (2006). See also Marx, S. M. et al., Communication and Mental Processes: Experiential and Analytic Processing of Uncertain Climate Information. Global Environmental Change 17 (1), 47–58 (2007). 4. Hirshleifer, D., and Shumway, T., Good Day Sunshine: Stock Returns and the Weather. Journal of Finance 58 (3), 1009–1032 (2003). 5. Sunstein, C. R., The Availability Heuristic, Intuitive Cost-Benefit Analysis, and Climate Change. Climatic Change 77 (1–2), 195–210 (2006). 1. Imbrie, J., and Imbrie, K. P., Ice Ages: Solving the Mystery (Enslow Publishers, Short Hills, NJ, 1979). 2. Weart, S. R. ed., The Discovery of Global Warming (Harvard University Press, Cambridge, MA, 2003). 3. Agassiz, L., Etudes sur les glaciers (privately published, Neuchâtel, 1840). 4.
Made to Stick: Why Some Ideas Survive and Others Die by Chip Heath, Dan Heath
affirmative action, availability heuristic, Barry Marshall: ulcers, correlation does not imply causation, desegregation, Menlo Park, Ronald Reagan, Rosa Parks, shareholder value, Silicon Valley, Stephen Hawking, telemarketer
When the vivid details support the core message, it is more memorable and convincing, but irrelevant vivid details can also distract people from the core and make a message less memorable and convincing (thus the concern, in educational psychology, about “seductive details”). A good summary of the issues can be found in Ernest T. Goetz and Mark Sadoski, “Commentary: The Perils of Seduction: Distracting Details or Incomprehensible Abstractions?” Reading Research Quarterly 30 (1995), 500–11. In 1986, Jonathan Shedler and Melvin Manis: Jonathan Shedler and Melvin Manis, “Can the Availability Heuristic Explain Vividness Effects?” Journal of Personality and Social Psychology 51 (1986), 26–36. “If, say, a soccer team”: The Covey example is from an excerpt from his book reprinted in Fortune, November 29, 2004, 162. A SHARK A DEER: We thank Tim O’Hara for the idea for the comparison in Message 2 of the Shark Attack Hysteria Clinic. Edible Fabrics: William McDonough, 2003 Conradin Von Gugelberg Memorial Lecture on the Environment, Stanford University, February 11, 2003; www.gsb.stanford.edu/news/headlines/2003_vongugelberg.shtml.
Priceless: The Myth of Fair Value (And How to Take Advantage of It) by William Poundstone
availability heuristic, Cass Sunstein, collective bargaining, Daniel Kahneman / Amos Tversky, delayed gratification, Donald Trump, East Village, en.wikipedia.org, endowment effect, equal pay for equal work, experimental economics, experimental subject, feminist movement, game design, German hyperinflation, Henri Poincaré, high net worth, index card, invisible hand, John von Neumann, Kenneth Arrow, laissez-faire capitalism, Landlord’s Game, loss aversion, market bubble, mental accounting, meta-analysis, Nash equilibrium, new economy, Paul Samuelson, payday loans, Philip Mirowski, Potemkin village, price anchoring, price discrimination, psychological pricing, Ralph Waldo Emerson, RAND corporation, random walk, RFID, Richard Thaler, risk tolerance, Robert Shiller, rolodex, Steve Jobs, The Chicago School, The Wealth of Nations by Adam Smith, ultimatum game, working poor
Said one in defense of his answer, “I thought you only asked for my opinion.” • • • Which is more common, words that begin with r (like “road”) or words with r as the third letter (like “car”)? Most say that words beginning with r are more common. It’s easy to rattle off words beginning with r; harder and slower to free-associate words with r in third place. This is an example of the availability heuristic, and here it leads us astray. Words with r in third place happen to be more common. But because words beginning with r are more mentally available, we overrate how common they are. A familiar example of availability is the way we all assume that the tastes, politics, education level, and TV viewing habits of our social set are widely shared. We marvel when such-and-such show is a hit or so-and-so gets elected.
Originals: How Non-Conformists Move the World by Adam Grant
Albert Einstein, Apple's 1984 Super Bowl advert, availability heuristic, barriers to entry, business process, business process outsourcing, Cass Sunstein, clean water, cognitive dissonance, creative destruction, cuban missile crisis, Daniel Kahneman / Amos Tversky, Dean Kamen, double helix, Elon Musk, fear of failure, Firefox, George Santayana, Ignaz Semmelweis: hand washing, Jeff Bezos, job satisfaction, job-hopping, Joseph Schumpeter, Kickstarter, Lean Startup, Louis Pasteur, Mahatma Gandhi, Mark Zuckerberg, meta-analysis, minimum viable product, Network effects, pattern recognition, Paul Graham, Peter Thiel, Ralph Waldo Emerson, random walk, risk tolerance, Rosa Parks, Saturday Night Live, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, Steven Pinker, The Wisdom of Crowds, women in the workforce
it makes you more trustworthy: See R. Glen Hass and Darwyn Linder, “Counterargument Availability and the Effects of Message Structure on Persuasion,” Journal of Personality and Social Psychology 23 (1972): 219–33. We use ease of retrieval: Norbert Schwarz, Herbert Bless, Fritz Strack, Gisela Klumpp, Helga Rittenauer-Schatka, and Annette Simons, “Ease of Retrieval as Information: Another Look at the Availability Heuristic,” Journal of Personality and Social Psychology 61 (1991): 195–202. they actually liked him more: Geoffrey Haddock, “It’s Easy to Like or Dislike Tony Blair: Accessibility Experiences and the Favourability of Attitude Judgments,” British Journal of Psychology 93 (2002): 257–67. tap out the rhythm of a song: Elizabeth L. Newton, “Overconfidence in the Communication of Intent: Heard and Unheard Melodies,” Ph.D. dissertation, Stanford University (1990); Chip Heath and Dan Heath, Made to Stick: Why Some Ideas Survive and Others Die (New York: Random House, 2007).
The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker
1960s counterculture, affirmative action, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, availability heuristic, Berlin Wall, Bonfire of the Vanities, British Empire, Broken windows theory, California gold rush, Cass Sunstein, citation needed, clean water, cognitive dissonance, colonial rule, Columbine, computer age, conceptual framework, correlation coefficient, correlation does not imply causation, crack epidemic, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Brooks, delayed gratification, demographic transition, desegregation, Doomsday Clock, Douglas Hofstadter, Edward Glaeser, en.wikipedia.org, European colonialism, experimental subject, facts on the ground, failed state, first-past-the-post, Flynn Effect, food miles, Francis Fukuyama: the end of history, fudge factor, full employment, George Santayana, ghettoisation, Gini coefficient, global village, Henri Poincaré, Hobbesian trap, humanitarian revolution, impulse control, income inequality, informal economy, Intergovernmental Panel on Climate Change (IPCC), invention of the printing press, Isaac Newton, lake wobegon effect, libertarian paternalism, long peace, loss aversion, Marshall McLuhan, mass incarceration, McMansion, means of production, mental accounting, meta-analysis, Mikhail Gorbachev, moral panic, mutually assured destruction, open economy, Peace of Westphalia, Peter Singer: altruism, QWERTY keyboard, race to the bottom, Ralph Waldo Emerson, random walk, Republic of Letters, Richard Thaler, Ronald Reagan, Rosa Parks, Saturday Night Live, security theater, Skype, Slavoj Žižek, South China Sea, statistical model, stem cell, Steven Levy, Steven Pinker, The Bell Curve by Richard Herrnstein and Charles Murray, The Wealth of Nations by Adam Smith, theory of mind, transatlantic slave trade, Turing machine, ultimatum game, uranium enrichment, V2 rocket, Vilfredo Pareto, Walter Mischel, WikiLeaks, 
women in the workforce, zero-sum game
So the death count of a war in 1600, for instance, would have to be multiplied by 4.5 for us to compare its destructiveness to those in the middle of the 20th century.9 The second illusion is historical myopia: the closer an era is to our vantage point in the present, the more details we can make out. Historical myopia can afflict both common sense and professional history. The cognitive psychologists Amos Tversky and Daniel Kahneman have shown that people intuitively estimate relative frequency using a shortcut called the availability heuristic: the easier it is to recall examples of an event, the more probable people think it is.10 People, for example, overestimate the likelihoods of the kinds of accidents that make headlines, such as plane crashes, shark attacks, and terrorist bombings, and they underestimate those that pile up unremarked, like electrocutions, falls, and drownings.11 When we are judging the density of killings in different centuries, anyone who doesn’t consult the numbers is apt to overweight the conflicts that are most recent, most studied, or most sermonized.
Dionysian cultures apologies Aquinas, Saint Thomas Arafat, Yasir archaeology Archer, John Archimedes Ardipithecus ramidus Arendt, Hannah Argentina aristocrats deaths from violence Aristophanes, Lysistrata Aristotle armed forces: as band of brothers conscription of effectiveness of Ethical Marine Warrior file closers mercenary military revolution reluctance to shoot size of willingness to die Armenians, genocide of Aronson, Elliot Asal, Victor Asch, Solomon Ash-Sheikh, Abdulaziz Asia: abortion in female infanticide in historiography in homicide rates in hunter-gatherers in legal discrimination in massacres in New Peace in spanking in violence against animals in violence against women in wars in assassinations; see also regicide Astell, Mary Athens, democracy in Atlas, Charles Atran, Scott atrocities, twenty worst in history attention deficit hyperactivity disorder Atwood, Brian Augustine, Saint Australia domestic violence in homicide in peace imposed in New Guinea penal colony warfare among aborigines australopithecines Austria-Hungary autarky; see also trade, international Authority Ranking autocracy and Age of Nationalism and democide Islamic punishment in and the social dilemma autonomic nervous system Autonomy, ethic of availability heuristic Axelrod, Robert Aztecs baby boomers: crime among influence of television on and 1960s counterculture Bacon, Francis balance of power balance of terror Bales, Kevin Bandura, Albert Bangladesh Barbara, Saint Barth, Karl Batson, Daniel Baumeister, Roy Evil and self-control Bays, Paul Beatles Beccaria, Cesare On Crimes and Punishments Beirut, U.S. 
servicemen bombed in Belarus Belgium Bell, David Bell, Derrick bell curve, see normal distribution Belloc, Hilaire Benedict, Ruth, Patterns of Culture Bentham, Jeremy Berlin, Isaiah Berlin Wall Bethmann-Hollweg, Theobald von Betzig, Laura Bhagavad-Gita Bhutto, Benazir Bible: capital punishment in debt bondage in historical analysis of and homophobia human sacrifice in and legislation New Testament Old Testament popularity of slavery in Big Parade, The (film) Bill of Rights, U.S.
death toll in Clark, Gregory clash of civilizations Clauset, Aaron Clausewitz, Karl von Clay, Henry Cleaver, Eldridge Cleveland, Robert Nasruk climate change Clinton, Bill Clockwork Orange, A (film) cluster illusion Cobden, Richard Cochran, Gregory Cockburn, J. S. code of the streets; see also honor cognitive dissonance cognitive illusions, xxiii; see also availability heuristic; cluster illusion; conjunction fallacy; loss aversion; overconfidence; positive illusions; sunk-cost fallacy Cohen, Dov Cohen, Jonathan Cold War end of interstate wars in Europe mutually assured destruction proxy wars superpower confrontations Cole, Michael Collier, Paul Collins, Randall Colombia Columbine High School commerce: Christian ideology vs.
Rationality: From AI to Zombies by Eliezer Yudkowsky
Albert Einstein, Alfred Russel Wallace, anthropic principle, anti-pattern, anti-work, Arthur Eddington, artificial general intelligence, availability heuristic, Bayesian statistics, Berlin Wall, Build a better mousetrap, Cass Sunstein, cellular automata, cognitive bias, cognitive dissonance, correlation does not imply causation, cosmological constant, creative destruction, Daniel Kahneman / Amos Tversky, dematerialisation, discovery of DNA, Douglas Hofstadter, Drosophila, effective altruism, experimental subject, Extropian, friendly AI, fundamental attribution error, Gödel, Escher, Bach, hindsight bias, index card, index fund, Isaac Newton, John Conway, John von Neumann, Long Term Capital Management, Louis Pasteur, mental accounting, meta-analysis, money market fund, Nash equilibrium, Necker cube, NP-complete, P = NP, pattern recognition, Paul Graham, Peter Thiel, Pierre-Simon Laplace, placebo effect, planetary scale, prediction markets, random walk, Ray Kurzweil, reversible computing, Richard Feynman, risk tolerance, Rubik’s Cube, Saturday Night Live, Schrödinger's Cat, scientific mainstream, sensible shoes, Silicon Valley, Silicon Valley startup, Singularitarianism, Solar eclipse in 1919, speech recognition, statistical model, Steven Pinker, strong AI, technological singularity, The Bell Curve by Richard Herrnstein and Charles Murray, the map is not the territory, the scientific method, Turing complete, Turing machine, ultimatum game, X Prize, Y Combinator, zero-sum game
Perhaps the machinery is evolutionarily optimized to purposes that actively oppose epistemic accuracy; for example, the machinery to win arguments in adaptive political contexts. Or the selection pressure ran skew to epistemic accuracy; for example, believing what others believe, to get along socially. Or, in the classic heuristics-and-biases pattern, the machinery operates by an identifiable algorithm that does some useful work but also produces systematic errors: the availability heuristic is not itself a bias, but it gives rise to identifiable, compactly describable biases. Our brains are doing something wrong, and after a lot of experimentation and/or heavy thinking, someone identifies the problem in a fashion that System 2 can comprehend; then we call it a “bias.” Even if we can do no better for knowing, it is still a failure that arises, in an identifiable fashion, from a particular kind of cognitive machinery—not from having too little machinery, but from the machinery’s shape.
Success-space is narrower, and therefore more can be said about it. While I am not averse (as you can see) to discussing definitions, we should remember that this is not our primary goal. We are here to pursue the great human quest for truth: for we have desperate need of the knowledge, and besides, we’re curious. To this end let us strive to overcome whatever obstacles lie in our way, whether we call them “biases” or not. * 5 Availability The availability heuristic is judging the frequency or probability of an event by the ease with which examples of the event come to mind. A famous 1978 study by Lichtenstein, Slovic, Fischhoff, Layman, and Combs, “Judged Frequency of Lethal Events,” studied errors in quantifying the severity of risks, or judging which of two dangers occurred more frequently.1 Subjects thought that accidents caused about as many deaths as disease, and that homicide was a more frequent cause of death than suicide.
But The Matrix is not an example! A neighboring flaw is the logical fallacy of arguing from imaginary evidence: “Well, if you did go to the end of the rainbow, you would find a pot of gold—which just proves my point!” (Updating on evidence predicted, but not observed, is the mathematical mirror image of hindsight bias.) The brain has many mechanisms for generalizing from observation, not just the availability heuristic. You see three zebras, you form the category “zebra,” and this category embodies an automatic perceptual inference. Horse-shaped creatures with white and black stripes are classified as “zebras,” therefore they are fast and good to eat; they are expected to be similar to other zebras observed. So people see (moving pictures of) three Borg, their brain automatically creates the category “Borg,” and they infer automatically that humans with brain-computer interfaces are of class “Borg” and will be similar to other Borg observed: cold, uncompassionate, dressing in black leather, walking with heavy mechanical steps.
Expected Returns: An Investor's Guide to Harvesting Market Rewards by Antti Ilmanen
Andrei Shleifer, asset allocation, asset-backed security, availability heuristic, backtesting, balance sheet recession, bank run, banking crisis, barriers to entry, Bernie Madoff, Black Swan, Bretton Woods, buy low sell high, capital asset pricing model, capital controls, Carmen Reinhart, central bank independence, collateralized debt obligation, commoditize, commodity trading advisor, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, debt deflation, deglobalization, delta neutral, demand response, discounted cash flows, disintermediation, diversification, diversified portfolio, dividend-yielding stocks, equity premium, Eugene Fama: efficient market hypothesis, fiat currency, financial deregulation, financial innovation, financial intermediation, fixed income, Flash crash, framing effect, frictionless, frictionless market, George Akerlof, global reserve currency, Google Earth, high net worth, hindsight bias, Hyman Minsky, implied volatility, income inequality, incomplete markets, index fund, inflation targeting, information asymmetry, interest rate swap, invisible hand, Kenneth Rogoff, laissez-faire capitalism, law of one price, Long Term Capital Management, loss aversion, margin call, market bubble, market clearing, market friction, market fundamentalism, market microstructure, mental accounting, merger arbitrage, mittelstand, moral hazard, Myron Scholes, negative equity, New Journalism, oil shock, p-value, passive investing, Paul Samuelson, performance metric, Ponzi scheme, prediction markets, price anchoring, price stability, principal–agent problem, private sector deleveraging, purchasing power parity, quantitative easing, quantitative trading / quantitative finance, random walk, reserve currency, Richard Thaler, risk tolerance, risk-adjusted returns, risk/return, riskless arbitrage, Robert Shiller, savings glut, selection bias, Sharpe ratio, short selling, sovereign wealth fund, statistical arbitrage, statistical model, stochastic volatility, survivorship bias, systematic trading, The Great Moderation, The Myth of the Rational Market, too big to fail, transaction costs, tulip mania, value at risk, volatility arbitrage, volatility smile, working-age population, Y2K, yield curve, zero-coupon bond, zero-sum game
Memory biases. Empirically we observe that various risk factors’ ex ante excess returns seem to be higher for a long while after a large negative factor shock. This pattern could reflect time-varying risk premia (the perceived amount and/or price of risk is high) or rational learning from the past, but it is also suggestive of behavioral memory biases. Our probability assessments are distorted by the availability heuristic in that we tend to overweight strong signals, salient (vivid, emotion-triggering) events, and recent events. Memories decay only gradually after a major event; this fact might cause attractive reward-to-risk ratios to linger for surprisingly long periods. Famously, the ex ante equity premium remained high for over 20 years after the Great Depression (whereas investor memories were shorter after the 2000–2002 and 2007–2009 bear markets).
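The lingering effect Ilmanen describes can be sketched with an assumed exponential-decay salience model (a construction for illustration, not the book’s own model): perceived crash probability is a salience-weighted average of past observations, with weights that fade with the age of each memory.

```python
import math

def perceived_crash_prob(history, t_now, half_life=10.0, base_rate=0.02):
    """Availability-distorted crash probability (assumed model).

    history: list of (year, crashed) observations; the weight of each
    memory decays exponentially with its age, so a vivid old shock keeps
    inflating perceived risk long after calm returns.
    """
    decay = math.log(2) / half_life
    num = den = 0.0
    for year, crashed in history:
        w = math.exp(-decay * (t_now - year))
        num += w * (1.0 if crashed else 0.0)
        den += w
    return num / den if den else base_rate

# One crash in year 0, then thirty calm years: perceived risk stays
# elevated for a long while, then fades only gradually.
history = [(0, True)] + [(y, False) for y in range(1, 31)]
for t in (1, 10, 30):
    print(t, round(perceived_crash_prob(history[: t + 1], t), 3))
```

With a ten-year half-life, the single crash still carries an eighth of its original weight after thirty years, which is one stylized way the “memories decay only gradually” mechanism could keep reward-to-risk ratios attractive for decades.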