systematic bias

50 results


pages: 321 words: 97,661

How to Read a Paper: The Basics of Evidence-Based Medicine by Trisha Greenhalgh

call centre, complexity theory, conceptual framework, confounding variable, correlation coefficient, correlation does not imply causation, deskilling, knowledge worker, longitudinal study, meta-analysis, microbiome, New Journalism, p-value, personalized medicine, placebo effect, publication bias, randomized controlled trial, selection bias, systematic bias, systems thinking, the scientific method

An important source of difficulty (and potential bias) in a case–control study is the precise definition of who counts as a ‘case’, because one misallocated individual may substantially influence the results (see section ‘Was systematic bias avoided or minimised?’). In addition, such a design cannot demonstrate causality—in other words, the association of A with B in a case–control study does not prove that A has caused B. Clinical questions that should be addressed by a case–control study are listed here. Does the prone sleeping position increase the risk of cot death (sudden infant death syndrome)? Does whooping cough vaccine cause brain damage? (see section ‘Was systematic bias avoided or minimised?’). Do overhead power cables cause leukaemia?
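
The sensitivity to a single misallocated 'case' can be made concrete with a toy odds-ratio calculation. The numbers below are invented for illustration and are not taken from the book:

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio for a 2x2 case-control table: (a*d) / (b*c)."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical small study: 30 of 40 cases exposed, 20 of 40 controls exposed.
print(round(odds_ratio(30, 10, 20, 20), 2))  # 3.0
# Re-count with one unexposed control misallocated as a case:
print(round(odds_ratio(30, 11, 20, 19), 2))  # 2.59 -- one person shifts the estimate noticeably
```

And, as the excerpt stresses, even a correctly computed odds ratio only measures association; it cannot by itself show that the exposure caused the outcome.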

Remember that what is important in the eyes of the doctor may not be valued so highly by the patient, and vice versa. One of the most exciting developments in evidence-based medicine (EBM) in recent years is the emerging science of patient-reported outcome measures, which I cover in the section ‘PROMs’ on page 223. Was systematic bias avoided or minimised? Systematic bias is defined by epidemiologists as anything that erroneously influences the conclusions about groups and distorts comparisons [4]. Whether the design of a study is a randomised controlled trial (RCT), a non-randomised comparative trial, a cohort study or a case–control study, the aim should be for the groups being compared to be as like one another as possible except for the particular difference being examined.

They should, as far as possible, receive the same explanations, have the same contacts with health professionals, and be assessed the same number of times by the same assessors, using the same outcome measures [5] [6]. Different study designs call for different steps to reduce systematic bias. Randomised controlled trials: In an RCT, systematic bias is (in theory) avoided by selecting a sample of participants from a particular population and allocating them randomly to the different groups. Section ‘Randomised controlled trials’ describes some ways in which bias can creep into even this gold standard of clinical trial design, and Figure 4.1 summarises particular sources to check for.
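
The random allocation step itself is mechanically simple; a minimal sketch (hypothetical participant IDs, not a production randomisation service) is just a shuffle followed by a split:

```python
import random

def randomise(participants, seed=None):
    """Allocate participants to two arms at random, so nothing about an
    individual can systematically influence which arm they end up in."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

treatment, control = randomise(range(100), seed=1)
print(len(treatment), len(control))  # 50 50
```

Real trials also conceal the upcoming allocation sequence from the people recruiting participants, which this sketch does not attempt to address.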


Infotopia: How Many Minds Produce Knowledge by Cass R. Sunstein

affirmative action, Andrei Shleifer, availability heuristic, behavioural economics, Build a better mousetrap, c2.com, Cass Sunstein, cognitive bias, cuban missile crisis, Daniel Kahneman / Amos Tversky, Edward Glaeser, en.wikipedia.org, feminist movement, framing effect, Free Software Foundation, hindsight bias, information asymmetry, Isaac Newton, Jean Tirole, jimmy wales, market bubble, market design, minimum wage unemployment, prediction markets, profit motive, rent control, Richard Stallman, Richard Thaler, Robert Shiller, Ronald Reagan, Savings and loan crisis, slashdot, stem cell, systematic bias, Ted Sorensen, the Cathedral and the Bazaar, The Wisdom of Crowds, winner-take-all economy

The first are those in which group members show a systematic bias. The second, a generalization of the first, are those in which their answers are worse than random. The failures of statistical judgments in these circumstances have strong implications for other social failures as well—as individual blunders, with respect to actual or likely facts, are transformed into blunders by private and public institutions. Often statistical groups will be wrong. Sometimes they will be disastrously wrong. Bias: A systematic bias in one or another direction will create serious problems for the group’s answers.

The resulting judgments of these “statistical groups” can be remarkably accurate.18 If we have access to many minds, we might trust the average response, a point that bears on the foundations of democracy itself. But accuracy is likely only under identifiable conditions, in which people do not suffer from a systematic bias that makes their answers worse than random. If we asked everyone in the world to estimate the population of Egypt, or to say how many people have served on the U.S. Supreme Court, or to guess the distance between Mars and Venus, the average answer is likely to be wildly off. Chapters 2 and 3 turn to deliberation.
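
Sunstein's condition, accuracy only in the absence of a shared bias, is easy to simulate. In this sketch (invented numbers, purely illustrative) two crowds guess a true value of 100: one with individual noise only, one with a shared +40 offset that averaging cannot remove:

```python
import random
import statistics

rng = random.Random(0)
truth = 100.0

# Individually noisy but unbiased: errors centre on zero and cancel out.
unbiased = [truth + rng.gauss(0, 30) for _ in range(10_000)]
# Systematically biased: everyone shares the same +40 offset, so the
# average inherits the whole offset no matter how many guessers there are.
biased = [truth + 40 + rng.gauss(0, 30) for _ in range(10_000)]

print(round(statistics.mean(unbiased), 1))  # close to 100
print(round(statistics.mean(biased), 1))    # close to 140
```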

As it happens, climate change was lowest on the list, and addressing communicable diseases, reducing hunger and malnutrition, and free trade were at the top. I do not mean to say that the results of this particular exercise are correct; everything depends on whether the relevant experts were in a position to offer good answers on the questions at hand. If the experts suffer from a systematic bias, or if their answers are worse than random, any effort to aggregate expert judgments will produce blunders. Maybe we shouldn’t trust the people who participated in the Copenhagen Consensus. But if statistical averages are a good way to aggregate knowledge when ordinary people know something of relevance, then they are also a good way to aggregate knowledge from experts.43 No Magic Here: At first glance, the accuracy of statistical judgments looks like a parlor trick or even a kind of magic.


pages: 442 words: 94,734

The Art of Statistics: Learning From Data by David Spiegelhalter

Abraham Wald, algorithmic bias, Anthropocene, Antoine Gombaud: Chevalier de Méré, Bayesian statistics, Brexit referendum, Carmen Reinhart, Charles Babbage, complexity theory, computer vision, confounding variable, correlation coefficient, correlation does not imply causation, dark matter, data science, deep learning, DeepMind, Edmond Halley, Estimating the Reproducibility of Psychological Science, government statistician, Gregor Mendel, Hans Rosling, Higgs boson, Kenneth Rogoff, meta-analysis, Nate Silver, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, randomized controlled trial, recommendation engine, replication crisis, self-driving car, seminal paper, sparse data, speech recognition, statistical model, sugar pill, systematic bias, TED Talk, The Design of Experiments, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus, Two Sigma

Far from freeing us from the need for statistical skills, bigger data and the rise in the number and complexity of scientific studies make it even more difficult to draw appropriate conclusions. More data means that we need to be even more aware of what the evidence is actually worth. For example, intensive analysis of data sets derived from routine data can increase the possibility of false discoveries, both due to systematic bias inherent in the data sources and from carrying out many analyses and only reporting whatever looks most interesting, a practice sometimes known as ‘data-dredging’. In order to be able to critique published scientific work, and even more the media reports which we all encounter on a daily basis, we should have an acute awareness of the dangers of selective reporting, the need for scientific claims to be replicated by independent researchers, and the danger of over-interpreting a single study out of context.
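
The "many analyses, report the most interesting" problem can be quantified with one line of probability: if each analysis of pure noise has a 5% chance of a 'significant' result, the chance that at least one of k analyses looks interesting grows quickly (independence between analyses is assumed here purely for illustration):

```python
def false_discovery_chance(k, alpha=0.05):
    """Chance that at least one of k independent analyses of pure noise
    reaches p < alpha: the complement of all k staying non-significant."""
    return 1 - (1 - alpha) ** k

for k in (1, 20, 100):
    print(k, round(false_discovery_chance(k), 3))
# 1 0.05
# 20 0.642
# 100 0.994
```

With a hundred dredged analyses, a "discovery" from noise alone is nearly guaranteed.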

Going from data (Stage 1) to the sample (Stage 2): these are problems of measurement: is what we record in our data an accurate reflection of what we are interested in? We want our data to be:
• Reliable, in the sense of having low variability from occasion to occasion, and so being a precise or repeatable number.
• Valid, in the sense of measuring what you really want to measure, and not having a systematic bias.
Figure 3.1 Process of inductive inference: each arrow can be interpreted as ‘tells us something about’1
For example, the adequacy of the sex survey depends on people giving the same or very similar answers to the same question each time they are asked, and this should not depend on the style of the interviewer or the vagaries of the respondent’s mood or memory.
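
The reliable/valid distinction maps directly onto spread versus systematic bias, which a few invented readings make concrete (two hypothetical instruments measuring a true value of 70):

```python
import statistics

truth = 70.0
# Precise but systematically off-target vs. scattered but on-target.
reliable_not_valid = [75.1, 74.9, 75.0, 75.2, 74.8]
valid_not_reliable = [62.0, 78.0, 66.0, 74.0, 70.0]

for name, readings in [("reliable, not valid", reliable_not_valid),
                       ("valid, not reliable", valid_not_reliable)]:
    bias = statistics.mean(readings) - truth   # systematic bias
    spread = statistics.stdev(readings)        # occasion-to-occasion variability
    print(f"{name}: bias={bias:+.2f}, spread={spread:.2f}")
```

The first instrument is repeatable to a tenth of a unit yet always five units high; the second is unbiased on average but no single reading can be trusted.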

Whereas general AI needs a causal model for how the world actually works, which would allow it to answer human-level questions concerning the effect of interventions (‘What if we do X?’), and counterfactuals (‘What if we hadn’t done X?’). We are a long way from AI having this ability. This book emphasizes the classic statistical problems of small samples, systematic bias (in the statistical sense) and lack of generalizability to new situations. The list of challenges for algorithms shows that although having masses of data may reduce the concern about sample size, the other problems tend to get worse, and we are faced with the additional problem of explaining the reasoning of an algorithm.


pages: 309 words: 78,361

Plenitude: The New Economics of True Wealth by Juliet B. Schor

Asian financial crisis, behavioural economics, big-box store, business climate, business cycle, carbon footprint, carbon tax, clean tech, Community Supported Agriculture, creative destruction, credit crunch, Daniel Kahneman / Amos Tversky, decarbonisation, degrowth, dematerialisation, demographic transition, deskilling, Edward Glaeser, en.wikipedia.org, Gini coefficient, global village, Herman Kahn, IKEA effect, income inequality, income per capita, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, Jevons paradox, Joseph Schumpeter, Kenneth Arrow, knowledge economy, life extension, McMansion, new economy, ocean acidification, off-the-grid, peak oil, pink-collar, post-industrial society, prediction markets, purchasing power parity, radical decentralization, ride hailing / ride sharing, Robert Shiller, sharing economy, Simon Kuznets, single-payer health, smart grid, systematic bias, systems thinking, The Chicago School, Thomas L Friedman, Thomas Malthus, too big to fail, transaction costs, Yochai Benkler, Zipcar

Over time, the study of environmental impacts moved out of general economics and into a subfield that sought ways to internalize these effects, that is, bring them inside the market calculus. Outside the subfield, most economists have practiced their craft as if nature did not exist. They do not incorporate natural resources into basic accounting categories and data collection. In addition, market prices do not include ecological costs, an omission that introduces a systematic bias into the analysis and evaluation of virtually all market outcomes, albeit one that has been largely ignored. Goods or activities that degrade the environment (without paying for that degradation) are priced too low. Those that are particularly dirty are especially underpriced and overproduced.

This work has the potential to revolutionize the cost-benefit calculations of environmental economics, and is beginning to make headway. For example, when climate models include ecosystem services, the case for urgent action is much stronger. By contrast, trade-off economics, particularly standard cost-benefit analysis, has tended to use a partial accounting, and is more tied to the short run. It also appears to have a systematic bias. A series of studies has found that economic calculations done to assess the potential impacts of environmental projects tend to overestimate costs and underestimate benefits. There are many cases of environmental protections that have been far less burdensome than opponents expected. Trade-off thinking has typically left technological change outside its purview, so that policies to protect the environment aren’t credited with spurring nature-saving innovation.

See Daily (1997), Daily et al. (2000), and Costanza et al. (1997). 81 when climate models include ecosystem services, the case for urgent action is much stronger: Sterner and Persson (2008). 81 standard cost-benefit analysis . . . has tended to use a partial accounting: For a critique of cost-benefit analysis, see Ackerman and Heinzerling (2004). 81 systematic bias . . . to overestimate costs and underestimate benefits: For a discussion of the literature on the accuracy of cost-benefit studies, see Ackerman (2006). 81 environmental protections that have been far less burdensome than opponents expected: For a discussion of a major chemical protection law and its light costs, as well as a more general discussion of this point, see Ackerman (2006). 82 innovation is more rapid and less costly than initially assumed: See Edenhofer et al. (2006) for a discussion of technological change in climate models. 82 Nicholas Stern . . . game-changing report: Stern (2006). 82 2 percent price tag: Jowit and Wintour (2008). 82 it would cost a mere $1.8 trillion a year: Sachs’s estimates of $1.8 trillion a year are based on a combination of plug-in hybrids and carbon-capture-and-sequestration technology and assume a cost of thirty dollars per ton of avoided emissions.


pages: 338 words: 104,815

Nobody's Fool: Why We Get Taken in and What We Can Do About It by Daniel Simons, Christopher Chabris

Abraham Wald, Airbnb, artificial general intelligence, Bernie Madoff, bitcoin, Bitcoin "FTX", blockchain, Boston Dynamics, butterfly effect, call centre, Carmen Reinhart, Cass Sunstein, ChatGPT, Checklist Manifesto, choice architecture, computer vision, contact tracing, coronavirus, COVID-19, cryptocurrency, DALL-E, data science, disinformation, Donald Trump, Elon Musk, en.wikipedia.org, fake news, false flag, financial thriller, forensic accounting, framing effect, George Akerlof, global pandemic, index fund, information asymmetry, information security, Internet Archive, Jeffrey Epstein, Jim Simons, John von Neumann, Keith Raniere, Kenneth Rogoff, London Whale, lone genius, longitudinal study, loss aversion, Mark Zuckerberg, meta-analysis, moral panic, multilevel marketing, Nelson Mandela, pattern recognition, Pershing Square Capital Management, pets.com, placebo effect, Ponzi scheme, power law, publication bias, randomized controlled trial, replication crisis, risk tolerance, Robert Shiller, Ronald Reagan, Rubik’s Cube, Sam Bankman-Fried, Satoshi Nakamoto, Saturday Night Live, Sharpe ratio, short selling, side hustle, Silicon Valley, Silicon Valley startup, Skype, smart transportation, sovereign wealth fund, statistical model, stem cell, Steve Jobs, sunk-cost fallacy, survivorship bias, systematic bias, TED Talk, transcontinental railway, WikiLeaks, Y2K

That random assignment is intended to ensure that the people in one group are comparable to those in the other group in all aspects that aren’t directly manipulated in the study. Or, more precisely, random assignment ensures that there is no systematic bias in who ends up in which group.34 Imagine we’re picking teams for a basketball game; let’s call them the Reds and the Blues. It would be unfair to assign all the jocks to the Red team and all the nerds to Blue—that would be a systematic bias. If instead we flipped a coin to assign each person to a team, then each nerd and each jock would be equally likely to end up on each team. One team might still be better, but that advantage would be due to chance, not bias.

Each person is equally likely to be in the treatment or control group, so individual differences in factors like education or age, or, more importantly, disease severity, health behaviors, and other predictors of how well a person might respond to a treatment (including ones that were not or could not be measured), will be evenly distributed on average. That is, there won’t be a systematic bias favoring the treatment group or the control group. But in any given study, random assignment won’t guarantee that the treatment and control groups will look exactly the same in every respect. In fact, it ensures that they shouldn’t. If you measure enough things in a study, the treatment and control groups are bound to differ on some of them before anyone starts receiving a drug, a placebo, or anything else.
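
A quick simulation (invented severity scores, coin-flip assignment) shows both halves of this claim: no systematic bias on average across many trials, yet nonzero chance differences within any single trial:

```python
import random
import statistics

rng = random.Random(7)

def trial_gap(n=200):
    """Difference in mean severity between arms after coin-flip assignment."""
    severity = [rng.gauss(50, 10) for _ in range(n)]   # hypothetical covariate
    arm = [rng.random() < 0.5 for _ in range(n)]       # True = treatment
    treat = [s for s, a in zip(severity, arm) if a]
    ctrl = [s for s, a in zip(severity, arm) if not a]
    return statistics.mean(treat) - statistics.mean(ctrl)

gaps = [trial_gap() for _ in range(1_000)]
print(round(statistics.mean(gaps), 2))       # near 0: no systematic bias
print(round(max(abs(g) for g in gaps), 2))   # but any one trial can differ by chance
```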


pages: 284 words: 79,265

The Half-Life of Facts: Why Everything We Know Has an Expiration Date by Samuel Arbesman

Albert Einstein, Alfred Russel Wallace, Amazon Mechanical Turk, Andrew Wiles, Apollo 11, bioinformatics, British Empire, Cesare Marchetti: Marchetti’s constant, Charles Babbage, Chelsea Manning, Clayton Christensen, cognitive bias, cognitive dissonance, conceptual framework, data science, David Brooks, demographic transition, double entry bookkeeping, double helix, Galaxy Zoo, Gregor Mendel, guest worker program, Gödel, Escher, Bach, Ignaz Semmelweis: hand washing, index fund, invention of movable type, Isaac Newton, John Harrison: Longitude, Kevin Kelly, language acquisition, Large Hadron Collider, life extension, Marc Andreessen, meta-analysis, Milgram experiment, National Debt Clock, Nicholas Carr, P = NP, p-value, Paul Erdős, Pluto: dwarf planet, power law, publication bias, randomized controlled trial, Richard Feynman, Rodney Brooks, scientific worldview, SimCity, social contagion, social graph, social web, systematic bias, text mining, the long tail, the scientific method, the strength of weak ties, Thomas Kuhn: the structure of scientific revolutions, Thomas Malthus, Tyler Cowen, Tyler Cowen: Great Stagnation

The more papers in the field, the smaller the fraction of previous papers that were quoted in a new study. Astonishingly, no matter how many trials had been done before in that area, half the time only two or fewer studies were cited. Not only are a small fraction of the relevant studies being cited, there’s a systematic bias: The newer ones are far more likely to be mentioned. This shouldn’t be surprising after our discussion of citation decay and obsolescence in chapter 3. And it is hardly surprising that scientists might use the literature quite selectively, perhaps to bolster their own research. But when it comes to papers that are current, relevant, and necessary for the complete picture of the current state of a scientific question, this is unfortunate.

But the decline effect is not only due to measurement. One other factor involves the dissemination of measurements, and it is known as publication bias. Publication bias is the idea that the collective scientific community and the community at large only know what has been published. If there is any sort of systematic bias in what is being published (and therefore publicly measured), then we might only be seeing some of the picture. The clearest example of this is in the world of negative results. If you recall, John Maynard Smith noted that “statistics is the science that lets you do twenty experiments a year and publish one false result in Nature.”
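
The mechanism in this excerpt, a biased filter on what gets published distorting the visible picture, can be sketched with a toy simulation. The numbers are invented, and a crude "only striking results appear" threshold stands in for significance testing:

```python
import random
import statistics

rng = random.Random(3)
true_effect = 0.2

all_estimates, published = [], []
for _ in range(2_000):
    estimate = true_effect + rng.gauss(0, 0.5)  # one small, noisy study
    all_estimates.append(estimate)
    if estimate > 0.8:                          # the filter: only impressive results surface
        published.append(estimate)

print(round(statistics.mean(all_estimates), 2))  # near the true effect of 0.2
print(round(statistics.mean(published), 2))      # far larger: the literature overstates it
```

The studies themselves are unbiased; it is the filter on dissemination that manufactures the systematic bias.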


pages: 302 words: 86,614

The Alpha Masters: Unlocking the Genius of the World's Top Hedge Funds by Maneet Ahuja, Myron Scholes, Mohamed El-Erian

"World Economic Forum" Davos, activist fund / activist shareholder / activist investor, Alan Greenspan, Asian financial crisis, asset allocation, asset-backed security, backtesting, Bear Stearns, Bernie Madoff, book value, Bretton Woods, business process, call centre, Carl Icahn, collapse of Lehman Brothers, collateralized debt obligation, computerized trading, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, diversification, Donald Trump, en.wikipedia.org, family office, financial engineering, fixed income, global macro, high net worth, high-speed rail, impact investing, interest rate derivative, Isaac Newton, Jim Simons, junk bonds, Long Term Capital Management, managed futures, Marc Andreessen, Mark Zuckerberg, merger arbitrage, Michael Milken, Myron Scholes, NetJets, oil shock, pattern recognition, Pershing Square Capital Management, Ponzi scheme, proprietary trading, quantitative easing, quantitative trading / quantitative finance, Renaissance Technologies, risk-adjusted returns, risk/return, rolodex, Savings and loan crisis, short selling, Silicon Valley, South Sea Bubble, statistical model, Steve Jobs, stock buybacks, systematic bias, systematic trading, tail risk, two and twenty, zero-sum game

“By eliminating the beta in their portfolios, hedge funds would inevitably become more attractive to large pools of institutional capital.” Deemed the world’s first institutionalized hedge fund, with 300 clients, Bridgewater is known for accepting capital only from large pension funds, endowments, central banks, and governments. Dalio believes that the issue of not having any systematic bias is a big thing. In other words, there’s no good reason there should be a bad or good environment for hedge funds—they shouldn’t have any beta—period. “There’s an equal opportunity up or down in any kind of environment,” says Dalio. “There should be just the alpha, and that is important in terms of what the role of hedge funds is and for portfolio diversification.”
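
"Eliminating the beta" has a concrete mechanical meaning: regress the fund's returns on the market and subtract the market-driven component. A minimal sketch with invented return series (not Bridgewater's actual method or data):

```python
def cov(xs, ys):
    """Sample covariance of two equal-length return series."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

# Hypothetical monthly returns.
market = [0.02, -0.01, 0.03, -0.02, 0.01, 0.04, -0.03, 0.02]
fund   = [0.03,  0.00, 0.04, -0.01, 0.02, 0.05, -0.02, 0.03]

beta = cov(fund, market) / cov(market, market)
residual = [f - beta * m for f, m in zip(fund, market)]  # the "alpha" stream

print(round(beta, 2))                    # 1.0: this fund is pure market exposure...
print(round(cov(residual, market), 10))  # ~0: ...and hedging it out leaves no beta
```

What remains after the hedge is, in Dalio's terms, the part of performance that has no good or bad environment.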

But, in my view, the classic definition of generating alphas is returns that are earned by those who can forecast future cash flows or the beta factor returns (macro factors) more accurately than other market participants, which, as alluded to in the book, is a zero-sum game. Not all can outperform—those that do are paid by those who don’t—and it is extremely difficult for those that do to replicate their successes over many periods. This is not at all the story I read in this book. There is a systematic bias that favors them. They don’t believe that they are investing in a zero-sum game; they are paid for their expertise. I have defined the true earning power of these hedge fund managers as not alpha but “omega” after Ohm’s law, where omega is the varying amounts of resistance in the market. As resistance increases (decreases), they are willing to step in (step out by short-selling or exiting positions) and reduce (increase) the resistance and earn a profit by so doing as other market participants change their holdings of securities over time.


pages: 321

Finding Alphas: A Quantitative Approach to Building Trading Strategies by Igor Tulchinsky

algorithmic trading, asset allocation, automated trading system, backpropagation, backtesting, barriers to entry, behavioural economics, book value, business cycle, buy and hold, capital asset pricing model, constrained optimization, corporate governance, correlation coefficient, credit crunch, Credit Default Swap, currency risk, data science, deep learning, discounted cash flows, discrete time, diversification, diversified portfolio, Eugene Fama: efficient market hypothesis, financial engineering, financial intermediation, Flash crash, Geoffrey Hinton, implied volatility, index arbitrage, index fund, intangible asset, iterative process, Long Term Capital Management, loss aversion, low interest rates, machine readable, market design, market microstructure, merger arbitrage, natural language processing, passive investing, pattern recognition, performance metric, Performance of Mutual Funds in the Period, popular capitalism, prediction markets, price discovery process, profit motive, proprietary trading, quantitative trading / quantitative finance, random walk, Reminiscences of a Stock Operator, Renaissance Technologies, risk free rate, risk tolerance, risk-adjusted returns, risk/return, selection bias, sentiment analysis, shareholder value, Sharpe ratio, short selling, Silicon Valley, speech recognition, statistical arbitrage, statistical model, stochastic process, survivorship bias, systematic bias, systematic trading, text mining, transaction costs, Vanguard fund, yield curve

We conclude with some practical suggestions for quantitative practitioners and firms. CATEGORIES OF BIAS We broadly categorize bias as systematic or behavioral. Investors introduce systematic bias by inadvertently coding it into their quantitative processes. By contrast, investors introduce behavioral bias by making ad hoc decisions rooted in their own human behavior. Over a period of time, both systematic and behavioral bias yield suboptimal investment outcomes. SYSTEMATIC BIASES There are two important sources of systematic bias: look-ahead bias and data mining.
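
Of the two sources named here, look-ahead bias is the easier to demonstrate: let the backtest peek at the very return it is trying to predict. The price series below is invented purely for illustration:

```python
# Hypothetical daily closes.
prices = [100, 101, 103, 102, 105, 104, 107, 108, 106, 110]
returns = [(b - a) / a for a, b in zip(prices, prices[1:])]

# Look-ahead bias: "decide" to hold on day t using day t's own return,
# information unavailable when the trade would actually be placed.
lookahead_pnl = sum(r for r in returns if r > 0)

# Correct: decide with yesterday's return, earn today's.
lagged_pnl = sum(today for yesterday, today in zip(returns, returns[1:])
                 if yesterday > 0)

print(round(lookahead_pnl, 3), round(lagged_pnl, 3))
# the peeking backtest looks strongly profitable; the honest one does not
```

The inflated number is not a discovery about markets; it is the bias the code inadvertently built in.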


pages: 340 words: 91,416

Lost in Math: How Beauty Leads Physics Astray by Sabine Hossenfelder

Adam Curtis, Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, clockwork universe, cognitive bias, cosmic microwave background, cosmological constant, cosmological principle, crowdsourcing, dark matter, data science, deep learning, double helix, game design, Henri Poincaré, Higgs boson, income inequality, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, Johannes Kepler, Large Hadron Collider, Murray Gell-Mann, Nick Bostrom, random walk, Richard Feynman, Schrödinger's Cat, Skype, Stephen Hawking, sunk-cost fallacy, systematic bias, TED Talk, the scientific method

Now the time it takes to test a new fundamental law of nature can be longer than a scientist’s full career. This forces theorists to draw upon criteria other than empirical adequacy to decide which research avenues to pursue. Aesthetic appeal is one of them. In our search for new ideas, beauty plays many roles. It’s a guide, a reward, a motivation. It is also a systematic bias. Invisible Friends The movers have picked up my boxes, most of which I never bothered to unpack, knowing I wouldn’t stay here. Echoes of past moves return from empty cabinets. I call my friend and colleague Michael Krämer, professor of physics in Aachen, Germany. Michael works on supersymmetry, “susy” for short.

A Beautiful Question (Wilczek), 27, 146 beauty anthropic argument and, 152 of chaos, 157 components of, 95 danger of, 27 of E8 theory, 165 economy and, 147 experience and, 97–98 faith in, 23, 26 of God, 19 as insightful guide, 27 is ill-defined, 180 justification for, 208–209 laws of nature and, 3–4, 20–22 misrepresentation of, 68–69 origin of arguments from, 18 in particle physics, 147 planetary orbits and, 18–19 as a promising guide, 27 pursuit of, 223–224 quantum mechanics and, 29 in quark model, 24–26 revolution and, 128–130, 152 of rigidity, 98 rigidity and, 73–74 simplicity and, 147–148 of string theory, 181–182 subjectivity of, 26 of supersymmetry, 145, 180 symmetry and, 147 as systematic bias, 10 in theoretical physics, 147 as a treacherous guide, 28 ugliness and, 19 universal recognition of, 2–3 Beauty and Revolution in Science (McAllister), 128 bet, between Lisi and Wilczek, 165–166, 235 bias, 228–231, 245 big bang aesthetic bias against, 30 multiple, 100 repugnancy of, 29–30 black holes evaporation of, 183–185, 228–229 firewall of, 184–185, 228–229 formation of, 182 microstates of, 184 multiverses and, 107 stellar-mass, 182–183 string theory and, 175, 182, 184–185 supermassive, 182–183 Bohr, Niels, 6, 67 Boltzmann, Ludwig, 32 Bondi, Hermann, 30 Bose, Satyendra, 11 bosons, 11, 13, 239 fermion formation of, 159 gauge, 52–53, 53 (fig.)


pages: 428 words: 103,544

The Data Detective: Ten Easy Rules to Make Sense of Statistics by Tim Harford

Abraham Wald, access to a mobile phone, Ada Lovelace, affirmative action, algorithmic bias, Automated Insights, banking crisis, basic income, behavioural economics, Black Lives Matter, Black Swan, Bretton Woods, British Empire, business cycle, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, Charles Babbage, clean water, collapse of Lehman Brothers, contact tracing, coronavirus, correlation does not imply causation, COVID-19, cuban missile crisis, Daniel Kahneman / Amos Tversky, data science, David Attenborough, Diane Coyle, disinformation, Donald Trump, Estimating the Reproducibility of Psychological Science, experimental subject, fake news, financial innovation, Florence Nightingale: pie chart, Gini coefficient, Great Leap Forward, Hans Rosling, high-speed rail, income inequality, Isaac Newton, Jeremy Corbyn, job automation, Kickstarter, life extension, meta-analysis, microcredit, Milgram experiment, moral panic, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, opioid epidemic / opioid crisis, Paul Samuelson, Phillips curve, publication bias, publish or perish, random walk, randomized controlled trial, recommendation engine, replication crisis, Richard Feynman, Richard Thaler, rolodex, Ronald Reagan, selection bias, sentiment analysis, Silicon Valley, sorting algorithm, sparse data, statistical model, stem cell, Stephen Hawking, Steve Bannon, Steven Pinker, survivorship bias, systematic bias, TED Talk, universal basic income, W. E. B. Du Bois, When a measure becomes a target

See also official statistics affirmative action, 33–34 age data and childhood mortality trends, 66–67, 91 and criminal justice data, 178 and defining “children,” 70n and divorce rates, 253n and evaluation of statistical claims, 133 and incidence of strokes, 98 and misuse of statistical methods, 117–19 and “priming” research, 121 and scale of numerical comparisons, 93–94 and self-harm statistics, 75 and smoking research, 4–5 and teen pregnancy, 55 and vaccination research, 53–54 aggressive behavior, defining, 70–71 AIDS, 28 alchemy, 171, 173–75, 173n, 179, 207 algorithms alchemy analogy, 175 and consumer data, 159–64 criminal justice applications, 176–79 and excessive credulity of data, 164–67 and found data, 149, 151, 154 and Google Flu Trends, 153–57 vs. human judgment, 167–71 pattern-recognizing, 183 and proliferation of big data, 157–59 and systematic bias, 166 and teacher evaluations, 163–64 trustworthiness of, 179–82 See also big data Allegory of Faith, The (Vermeer), 29–30 Allied Art Commission, 21 alternative sanctions, 176–79 altimeters, 172–73 Amazon, 148, 175, 181 American Economic Association, 186n American Statistical Association, 194 Anderson, Chris, 156 Angwin, Julia, 176 anorexia, 75 antidepressants, 126 anti-vaccination sentiment, 53–54 Apple, 175 Argentina, 194–95, 211 Army Medical Board (UK), 215 art forgeries, 19–23, 29–32, 42–45 Asch, Solomon, 135–38, 141–42, 260 assessable decisions, 180 astrology, 124 Atkinson, Tony, 83 atomic weapons, 90, 249–50 Attenborough, David, 277–78 authority, deference to, 138–39 autism, 53–54 automation, 128 Avogadro’s number, 246 Babbage, Charles, 219 Babcock, Linda, 27 Babson, Roger, 263 backfire effect, 129 Bad Science (Goldacre), 2 bail recommendations, 158, 169, 180 Bakelite, 32 Bank for International Settlements, 100–101 Bank of England, 256, 258 Bannon, Steve, 13 bar charts, 228, 234, 235 Bargh, John, 121, 122 barometric pressure, 172–73 Barrack Hospital, Scutari (Istanbul), 213–14, 220, 225, 233, 235 base 
rates, 253–54, 253n Battistella, Annabelle (Fanne Foxe), 185–86, 212 battlefield awareness, 58–59 Baumeister, Roy, 121, 122 BBC, 10, 276 behavioral data, 69–70 behavioral economics, 25, 27, 41, 96, 271.

See also choice research Bell, Vanessa, 256–57 Bem, Daryl, 111, 113–14, 119–23 benefits of statistical analysis, 9 Berti, Gasparo, 172 Bevacqua, Graciela, 194–95, 212 Beyth, Ruth, 248–49, 251, 254 biases biased assimilation, 35–36 confirmation bias, 33 current offense bias, 169 and motivated reasoning, 27–29, 32–36, 38, 131, 268 negativity bias, 95–99 non-response bias, 146–47 novelty bias, 95–99, 113, 114, 122 optimism bias, 96 and power of doubt, 13 publication bias, 113–16, 118–23, 125–27 racial bias in criminal justice, 176–79 in sampling, 135–38, 142–45, 147–51 selection bias, 2, 245–46 survivorship bias, 109–10, 112–13, 122–26 systematic bias in algorithms, 166 and value of statistical knowledge, 17 big data and certification of researchers, 182 and criminal justice, 176–79 and excessive credulity in data, 164–67 and found data, 149, 151, 152, 154 and Google Flu Trends, 153–57 historical perspective on, 171–75 influence in today’s world, 183 limitations and misuse of, 159–63, 170–71 proliferation of, 157–59 and teacher evaluations, 163–64 See also algorithms Big Data (Cukier and Mayer-Schönberger), 148, 157 “Big Duck” graphics, 216–18, 217, 229–30 Big Issue, The, 226n “Billion Pound-O-Gram, The” (infographic), 223 billionaires, 78–80 binge drinking, 75 Bird, Sheila, 68 bird’s-eye view of data, 61–64, 203, 221, 265 BizzFit, 108 Black Swan, The (Taleb), 101 Blastland, Michael, 10, 68, 93 blogs, 76 Bloomberg TV, 89 body count metrics, 58 Boijmans Museum, 20 Boon, Gerard, 19, 30–31 border wall debate, 93–94 Borges, Jorge Luis, 118 Boyle, Robert, 172–75 brain physiology, 270 Bredius, Abraham, 19–23, 29–32, 35, 43–45, 78, 242, 262 Bretton Woods conference, 262 Brettschneider, Brian, 224 Brexit, 71, 277 British Army, 213–14, 220–21 British Election Study, 145–46 British Medical Journal, 6, 67 British Treasury, 256–57 Broward County Sheriff’s Office, 176 Brown, Derren, 115 Brown, Zack “Danger,” 108 Buchanan, Larry, 229, 232 budget deficits, 188, 192–93, 195 Buffett, 
Warren, 259 Bureau of Economic Analysis, 190, 205 Bureau of Labor Statistics, 190, 205, 212 business-cycle forecasting, 258–59 business writing, 123–24 Butoyi, Imelda, 62–63 Cairo, Alberto, 227 Cambridge Analytica, 158 Cambridge University, 162.


pages: 755 words: 121,290

Statistics hacks by Bruce Frey

Bayesian statistics, Berlin Wall, correlation coefficient, Daniel Kahneman / Amos Tversky, distributed generation, en.wikipedia.org, feminist movement, G4S, game design, Hacker Ethic, index card, Linda problem, Milgram experiment, Monty Hall problem, p-value, place-making, reshoring, RFID, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, statistical model, sugar pill, systematic bias, Thomas Bayes

Random assignment of participants to an experimental group and a control group solves this problem nicely.

Selection. There might be systematic bias in assigning subjects to groups. The solution is to assign subjects randomly.

Testing. Just taking a pretest might affect the level of the research variable. Create a comparison group and give both groups the pretest, so any changes will be equal between the groups. And assign subjects to the two groups randomly (are you starting to see a pattern here?).

Instrumentation. There might be systematic bias in the measurement. The solution is to use valid, standardized, objectively scored tests.
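The random-assignment remedy the excerpt keeps returning to can be sketched in a few lines of Python. This is an invented illustration (subject pool, trait, and group sizes are all assumptions, not from the book): shuffling before splitting balances even an unmeasured trait across the groups.

```python
import random
import statistics

random.seed(0)

# Hypothetical pool of subjects, each with an unmeasured trait
# (say, motivation) that could bias results if one group ended up
# with systematically more of it.
subjects = [{"id": i, "motivation": random.gauss(50, 10)} for i in range(200)]

# Random assignment: shuffle the pool, then split it in half. Any
# trait, measured or not, is expected to balance between the groups.
random.shuffle(subjects)
treatment, control = subjects[:100], subjects[100:]

t_mean = statistics.mean(s["motivation"] for s in treatment)
c_mean = statistics.mean(s["motivation"] for s in control)
print(round(t_mean, 1), round(c_mean, 1))  # the two group means land close together
```

The same shuffle-and-split move addresses all three threats the excerpt lists, which is the pattern the author is hinting at.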


pages: 287 words: 69,655

Don't Trust Your Gut: Using Data to Get What You Really Want in Life by Seth Stephens-Davidowitz

affirmative action, Airbnb, cognitive bias, commoditize, correlation does not imply causation, COVID-19, Daniel Kahneman / Amos Tversky, data science, deep learning, digital map, Donald Trump, en.wikipedia.org, Erik Brynjolfsson, General Magic, global pandemic, Mark Zuckerberg, meta-analysis, Moneyball by Michael Lewis explains big data, Paul Graham, peak-end rule, randomized controlled trial, Renaissance Technologies, Sam Altman, science of happiness, selection bias, side hustle, Silicon Valley, Steve Jobs, Steve Wozniak, systematic bias, Tony Fadell, twin studies, Tyler Cowen, urban planning, Y Combinator

Here are the biggest misjudged activities:

Underrated Activities: These Tend to Make People Happier Than We Think*

Exhibition/Museum/Library
Sports/Running/Exercise
Drinking Alcohol
Gardening
Shopping/Errands

Overrated Activities: These Tend to Make People Less Happy Than We Think

Sleeping/Resting/Relaxing
Computer Games/iPhone Games
Watching TV/Film
Eating/Snacking
Browsing the Internet

So, what should we make of those two lists? “Drinking alcohol” is obviously a complicated route to happiness, due to its addictive nature; I will talk more about the relationship between alcohol and happiness in the next chapter. But one systematic bias people have is they seem to overestimate the happiness effect of many passive activities. Think of the activities on the “Overrated Activities” list. Sleeping. Relaxing. Playing games. Watching TV. Snacking. Browsing the internet. These are not exactly activities that require a lot of energy.


pages: 263 words: 75,610

Delete: The Virtue of Forgetting in the Digital Age by Viktor Mayer-Schönberger

digital divide, en.wikipedia.org, Erik Brynjolfsson, Firefox, full text search, George Akerlof, information asymmetry, information retrieval, information security, information trail, Internet Archive, invention of movable type, invention of the printing press, John Markoff, Joi Ito, lifelogging, moveable type in China, Network effects, packet switching, Panopticon Jeremy Bentham, pattern recognition, power law, RFID, slashdot, Steve Jobs, Steven Levy, systematic bias, The Market for Lemons, The Structural Transformation of the Public Sphere, Vannevar Bush, Yochai Benkler

Of course, the information we have to ground our decisions in is almost always incomplete—by necessity. But in the analog world, random pieces of information are missing. With digital memory, the exclusion is biased against information that is not captured in digital form and not fed into digital memory. That is a systematic bias, and one that not only falsifies our understanding of events but that can also be gamed. In short, because digital memory amplifies only digitized information, humans like Jane trusting digital memory may find themselves worse off than if they’d relied solely on their human memory, with its tendency to forget information that is no longer important or relevant.


pages: 269 words: 77,042

Sex, Lies, and Pharmaceuticals: How Drug Companies Plan to Profit From Female Sexual Dysfunction by Ray Moynihan, Barbara Mintzes

business intelligence, clean water, meta-analysis, moral panic, Naomi Klein, New Journalism, placebo effect, profit motive, Ralph Nader, systematic bias

While individual links may not necessarily influence a researcher, there is growing evidence that, looked at in its totality, this web of influence may well be distorting medical science in the most profound way. At a research level, trials funded by drug companies are more likely to find favourable results for sponsors’ products, leading to a ‘systematic bias’ in the medical literature that overstates the benefits of drugs and underplays their harms.7 At the level of education, in some nations drug and device companies fund at least half of the seminars where our doctors undertake their ongoing professional development, with strong anecdotal evidence that sponsors sometimes influence these activities in important but often hidden ways.8 At the level of practice, studies have shown that doctors who accept gifts, and expose themselves to marketing in its many forms, tend to more often prescribe the latest and most expensive drugs, which may not always be in the interests of their patients or the public purse.9 So strong is the accumulating evidence that the calls for fundamental reform are no longer coming just from grass-roots activists like the New View, No Free Lunch and Healthy Scepticism.10 Powerful voices from within the heart of mainstream medicine are now calling for a much greater transparency in the relationship, and much greater independence between health professionals and the industries whose products those professionals prescribe.


pages: 275 words: 82,640

Money Mischief: Episodes in Monetary History by Milton Friedman

Bretton Woods, British Empire, business cycle, classic study, currency peg, double entry bookkeeping, fiat currency, financial innovation, fixed income, floating exchange rates, foreign exchange controls, full employment, German hyperinflation, income per capita, law of one price, Money creation, money market fund, oil shock, price anchoring, price stability, Savings and loan crisis, systematic bias, Tax Reform Act of 1986, transaction costs

For 1875–79 and 1901–14, it approximates the actual pattern. The U.S. hypothetical annual monetary demand for silver is simply the increment in the U.S. hypothetical silver stock: The possible errors in this approach are numerous. Some simply affect the year-to-year movements as a result of the use of a trend for k1. Any systematic bias arises primarily from the assumption that the same specie reserves would have been maintained under a silver standard in the early and late years of the period as those maintained under a gold standard. The possible sources of error are different for the specie reserve ratio and the real stock of money.
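The equation that followed the colon has dropped out of this excerpt. From the sentence itself the relation is just a first difference; a plausible reconstruction, in assumed notation (not necessarily Friedman's own symbols), is:

```latex
D^{S}_{t} = S^{H}_{t} - S^{H}_{t-1}
```

where \(D^{S}_{t}\) is the hypothetical annual monetary demand for silver in year \(t\) and \(S^{H}_{t}\) is the hypothetical U.S. silver stock.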


pages: 296 words: 78,631

Hello World: Being Human in the Age of Algorithms by Hannah Fry

23andMe, 3D printing, Air France Flight 447, Airbnb, airport security, algorithmic bias, algorithmic management, augmented reality, autonomous vehicles, backpropagation, Brixton riot, Cambridge Analytica, chief data officer, computer vision, crowdsourcing, DARPA: Urban Challenge, data science, deep learning, DeepMind, Douglas Hofstadter, driverless car, Elon Musk, fake news, Firefox, Geoffrey Hinton, Google Chrome, Gödel, Escher, Bach, Ignaz Semmelweis: hand washing, John Markoff, Mark Zuckerberg, meta-analysis, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, pattern recognition, Peter Thiel, RAND corporation, ransomware, recommendation engine, ride hailing / ride sharing, selection bias, self-driving car, Shai Danziger, Silicon Valley, Silicon Valley startup, Snapchat, sparse data, speech recognition, Stanislav Petrov, statistical model, Stephen Hawking, Steven Levy, systematic bias, TED Talk, Tesla Model S, The Wisdom of Crowds, Thomas Bayes, trolley problem, Watson beat the top human players on Jeopardy!, web of trust, William Langewiesche, you are the product

For every Christopher Drew Brooks, treated unfairly by an algorithm, there are countless cases like that of Nicholas Robinson, where a judge errs on their own. Having an algorithm – even an imperfect algorithm – working with judges to support their often faulty cognition is, I think, a step in the right direction. At least a well-designed and properly regulated algorithm can help get rid of systematic bias and random error. You can’t change a whole cohort of judges, especially if they’re not able to tell you how they make their decisions in the first place. Designing an algorithm for use in the criminal justice system demands that we sit down and think hard about exactly what the justice system is for.


pages: 220 words: 73,451

Democratizing innovation by Eric von Hippel

additive manufacturing, correlation coefficient, Debian, disruptive innovation, Free Software Foundation, hacker house, informal economy, information asymmetry, inventory management, iterative process, James Watt: steam engine, knowledge economy, longitudinal study, machine readable, meta-analysis, Network effects, placebo effect, principal–agent problem, Richard Stallman, software patent, systematic bias, the Cathedral and the Bazaar, tragedy of the anticommons, transaction costs, vertical integration, Vickrey auction

In the general literature, Armstrong’s (2001) review on forecast bias for new product introduction indicates that sales forecasts are generally optimistic, but that that upward bias decreases as the magnitude of the sales forecast increases. Coller and Yohn (1998) review the literature on bias in accuracy of management earnings forecasts and find that little systematic bias occurs. Tull’s (1967) model calculates $15 million in revenue as a level above which forecasts actually become pessimistic on average. We think it reasonable to apply the same deflator to LU vs. non-LU project sales projections. Even if LU project personnel were for some reason more likely to be optimistic with respect to such projections than non-LU project personnel, that would not significantly affect our findings.


pages: 250 words: 79,360

Escape From Model Land: How Mathematical Models Can Lead Us Astray and What We Can Do About It by Erica Thompson

Alan Greenspan, Bayesian statistics, behavioural economics, Big Tech, Black Swan, butterfly effect, carbon tax, coronavirus, correlation does not imply causation, COVID-19, data is the new oil, data science, decarbonisation, DeepMind, Donald Trump, Drosophila, Emanuel Derman, Financial Modelers Manifesto, fudge factor, germ theory of disease, global pandemic, hindcast, I will remember that I didn’t make the world, and it doesn’t satisfy my equations, implied volatility, Intergovernmental Panel on Climate Change (IPCC), John von Neumann, junk bonds, Kim Stanley Robinson, lockdown, Long Term Capital Management, moral hazard, mouse model, Myron Scholes, Nate Silver, Neal Stephenson, negative emissions, paperclip maximiser, precautionary principle, RAND corporation, random walk, risk tolerance, selection bias, self-driving car, social distancing, Stanford marshmallow experiment, statistical model, systematic bias, tacit knowledge, tail risk, TED Talk, The Great Moderation, The Great Resignation, the scientific method, too big to fail, trolley problem, value at risk, volatility smile, Y2K

Methods of mathematical inference from models, by contrast, typically do assign some kind of truth value to the models. Statistical methods may assume that model outcomes are related to the truth in some consistent and discoverable way, such as a random error that will converge on zero given enough data, or a systematic bias that can be estimated and corrected. Other statistical methods assume that from a set of candidate models one can identify a single best model or construct a new best model by weighting the candidates according to agreement with other data. Essentially, they assume that the observations are generated by a process that has rules, that those rules can be written formally, that they are sufficiently close to the candidate set of models we are examining and that the only limit to our discovery of the rules is our observation of further data.
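The "systematic bias that can be estimated and corrected" idea in the excerpt amounts to: measure the mean model-minus-observation discrepancy over a calibration period, then subtract it from fresh model output. A minimal sketch, with invented numbers and variable names:

```python
import statistics

# Hypothetical calibration data: model hindcasts vs. observations.
model_hindcast = [20.1, 21.3, 19.8, 22.0, 20.6]
observations   = [18.9, 20.0, 18.5, 20.8, 19.4]

# Estimate the systematic bias as the mean model-minus-observation error.
bias = statistics.mean(m - o for m, o in zip(model_hindcast, observations))

# Correct a new model forecast by subtracting the estimated bias.
raw_forecast = 21.5
corrected = raw_forecast - bias
print(round(bias, 2), round(corrected, 2))  # → 1.24 20.26
```

Note the assumption this method makes, which is exactly the excerpt's point: the model's error must be consistent and discoverable, or the estimated correction is meaningless.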


Statistics in a Nutshell by Sarah Boslaugh

Antoine Gombaud: Chevalier de Méré, Bayesian statistics, business climate, computer age, confounding variable, correlation coefficient, experimental subject, Florence Nightingale: pie chart, income per capita, iterative process, job satisfaction, labor-force participation, linear programming, longitudinal study, meta-analysis, p-value, pattern recognition, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, purchasing power parity, randomized controlled trial, selection bias, six sigma, sparse data, statistical model, systematic bias, The Design of Experiments, the scientific method, Thomas Bayes, Two Sigma, Vilfredo Pareto

Random measurement error is the result of chance circumstances such as room temperature, variance in administrative procedure, or fluctuation in the individual’s mood or alertness. We do not expect random error to affect an individual’s score consistently in one direction or the other. Random error makes measurement less precise but does not systematically bias results because it can be expected to have a positive effect on one occasion and a negative effect on another, thus canceling itself out over the long run. Because there are so many potential sources for random error, we have no expectation that it can be completely eliminated, but we desire to reduce it as much as possible to increase the precision of our measurements.
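A quick simulation makes the distinction concrete (a sketch with invented magnitudes): zero-mean random error cancels itself out over many measurements, while a constant systematic shift survives any amount of averaging.

```python
import random
import statistics

random.seed(42)
true_score = 100.0

# Random error: zero-mean noise, positive on some occasions and negative
# on others, so it averages out over many measurements.
random_only = [true_score + random.gauss(0, 5) for _ in range(10_000)]

# Systematic bias: a constant +3 shift that no amount of averaging removes.
biased = [true_score + 3 + random.gauss(0, 5) for _ in range(10_000)]

print(round(statistics.mean(random_only), 1))  # close to 100
print(round(statistics.mean(biased), 1))       # close to 103
```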

This is not a problem if you are clear about where and how your sample was obtained. Imagine that you are a microbiologist interested in examining bacteria present in hospitals. If you use a filter with pores of diameter one µm (micrometer), any bacteria smaller than this will not be part of the population that you are observing. This sampling limitation will introduce systematic bias into the study; however, as long as you are clear that the population about which you can make inferences is bacteria of diameter greater than one µm, and nothing else, your results will be valid. In reality, we often want to generalize to a larger population than we sampled from, and whether we can do this depends on a number of factors.


pages: 306 words: 82,765

Skin in the Game: Hidden Asymmetries in Daily Life by Nassim Nicholas Taleb

anti-fragile, availability heuristic, behavioural economics, Benoit Mandelbrot, Bernie Madoff, Black Swan, Brownian motion, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, cellular automata, Claude Shannon: information theory, cognitive dissonance, complexity theory, data science, David Graeber, disintermediation, Donald Trump, Edward Thorp, equity premium, fake news, financial independence, information asymmetry, invisible hand, knowledge economy, loss aversion, mandelbrot fractal, Mark Spitznagel, mental accounting, microbiome, mirror neurons, moral hazard, Murray Gell-Mann, offshore financial centre, p-value, Paradox of Choice, Paul Samuelson, Ponzi scheme, power law, precautionary principle, price mechanism, principal–agent problem, public intellectual, Ralph Nader, random walk, rent-seeking, Richard Feynman, Richard Thaler, Ronald Coase, Ronald Reagan, Rory Sutherland, Rupert Read, Silicon Valley, Social Justice Warrior, Steven Pinker, stochastic process, survivorship bias, systematic bias, tail risk, TED Talk, The Nature of the Firm, Tragedy of the Commons, transaction costs, urban planning, Yogi Berra

Hence the more “systemic” things are, the more important survival becomes. FIGURE 3. An illustration of the bias-variance tradeoff. Assume two people (sober) shooting at a target in, say, Texas. The left shooter has a bias, a systematic “error,” but on balance gets closer to the target than the right shooter, who has no systematic bias but a high variance. Typically, you cannot reduce one without increasing the other. When fragile, the strategy at the left is the best: maintain a distance from ruin, that is, from hitting a point in the periphery should it be dangerous. This schema explains why if you want to minimize the probability of the plane crashing, you may make mistakes with impunity provided you lower your dispersion.
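The figure's two shooters can be simulated directly. In this sketch (the offsets, spreads, and the "periphery" radius are invented for illustration), the biased low-variance shooter is closer to the target on average and, crucially, never strays into the dangerous periphery, while the unbiased high-variance shooter does so regularly.

```python
import random
import math
import statistics

random.seed(1)
N = 10_000

def shots(bias, spread, n=N):
    # Each shot lands at (bias + noise, noise).
    return [(bias + random.gauss(0, spread), random.gauss(0, spread))
            for _ in range(n)]

# Left shooter: systematic bias (aim is 2 units off-centre) but low variance.
left = shots(bias=2.0, spread=0.5)
# Right shooter: no systematic bias but high variance.
right = shots(bias=0.0, spread=3.0)

def mean_dist(points):
    return statistics.mean(math.hypot(x, y) for x, y in points)

def frac_beyond(points, r=6.0):
    # Fraction of shots landing in the dangerous periphery (distance > r).
    return sum(math.hypot(x, y) > r for x, y in points) / len(points)

print(round(mean_dist(left), 2), round(mean_dist(right), 2))
print(frac_beyond(left), frac_beyond(right))
```

This is the fragile-case logic of the passage: when hitting the periphery means ruin, the left-hand strategy's bias is the price of keeping dispersion away from it.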


pages: 290 words: 83,248

The Greed Merchants: How the Investment Banks Exploited the System by Philip Augar

Alan Greenspan, Andy Kessler, AOL-Time Warner, barriers to entry, Bear Stearns, Berlin Wall, Big bang: deregulation of the City of London, Bonfire of the Vanities, business cycle, buttonwood tree, buy and hold, capital asset pricing model, Carl Icahn, commoditize, corporate governance, corporate raider, crony capitalism, cross-subsidies, deal flow, equity risk premium, financial deregulation, financial engineering, financial innovation, fixed income, Glass-Steagall Act, Gordon Gekko, high net worth, information retrieval, interest rate derivative, invisible hand, John Meriwether, junk bonds, Long Term Capital Management, low interest rates, Martin Wolf, Michael Milken, new economy, Nick Leeson, offshore financial centre, pensions crisis, proprietary trading, regulatory arbitrage, risk free rate, Sand Hill Road, shareholder value, short selling, Silicon Valley, South Sea Bubble, statistical model, systematic bias, Telecommunications Act of 1996, The Chicago School, The Predators' Ball, The Wealth of Nations by Adam Smith, transaction costs, tulip mania, value at risk, yield curve

They paid the price in the end with horrendous collapses in their share prices.14 Hector Sants, who worked for a full spectrum of British-, European- and American-owned investment banks before joining the FSA as Managing Director, points out that London’s ‘culture of separation in the wholesale area, harking back to the traditional divide between stock broking and merchant banking’ spared London the worst of the investment banks’ excesses. But not by much: ‘Even in the UK there was evidence of systematic bias in analyst recommendations, poor management of conflicts of interest and a feeling that if it happened there it could happen here. I’m not sure the UK industry could put hand on heart and say in 2002 there was an inherently better conflict management culture in London than in New York.’15 It was clear that the City had to change, but did it have to swallow the American bait hook, line and sinker?


pages: 288 words: 81,253

Thinking in Bets by Annie Duke

banking crisis, behavioural economics, Bernie Madoff, Cass Sunstein, cognitive bias, cognitive dissonance, cognitive load, Daniel Kahneman / Amos Tversky, delayed gratification, Demis Hassabis, disinformation, Donald Trump, Dr. Strangelove, en.wikipedia.org, endowment effect, Estimating the Reproducibility of Psychological Science, fake news, Filter Bubble, Herman Kahn, hindsight bias, Jean Tirole, John Nash: game theory, John von Neumann, loss aversion, market design, mutually assured destruction, Nate Silver, p-value, phenotype, prediction markets, Richard Feynman, ride hailing / ride sharing, Stanford marshmallow experiment, Stephen Hawking, Steven Pinker, systematic bias, TED Talk, the scientific method, The Signal and the Noise by Nate Silver, urban planning, Walter Mischel, Yogi Berra, zero-sum game

“Thus, the conventional view that natural selection favors nervous systems which produce ever more accurate images of the world must be a very naïve view of mental evolution.” Dawkins, in turn, considered Trivers, for his work, one of the heroes of his groundbreaking book, devoting four chapters of The Selfish Gene to developing Trivers’s ideas. * This is a systematic bias, not a guarantee that we always grab credit or always deflect blame. There are some people, to be sure, who exhibit the opposite of self-serving bias, treating everything bad that happens as their fault and attributing anything good in their lives to luck. That pattern is much rarer (and more likely in women).


pages: 304 words: 82,395

Big Data: A Revolution That Will Transform How We Live, Work, and Think by Viktor Mayer-Schonberger, Kenneth Cukier

23andMe, Affordable Care Act / Obamacare, airport security, Apollo 11, barriers to entry, Berlin Wall, big data - Walmart - Pop Tarts, Black Swan, book scanning, book value, business intelligence, business process, call centre, cloud computing, computer age, correlation does not imply causation, dark matter, data science, double entry bookkeeping, Eratosthenes, Erik Brynjolfsson, game design, hype cycle, IBM and the Holocaust, index card, informal economy, intangible asset, Internet of things, invention of the printing press, Jeff Bezos, Joi Ito, lifelogging, Louis Pasteur, machine readable, machine translation, Marc Benioff, Mark Zuckerberg, Max Levchin, Menlo Park, Moneyball by Michael Lewis explains big data, Nate Silver, natural language processing, Netflix Prize, Network effects, obamacare, optical character recognition, PageRank, paypal mafia, performance metric, Peter Thiel, Plato's cave, post-materialism, random walk, recommendation engine, Salesforce, self-driving car, sentiment analysis, Silicon Valley, Silicon Valley startup, smart grid, smart meter, social graph, sparse data, speech recognition, Steve Jobs, Steven Levy, systematic bias, the scientific method, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, Thomas Davenport, Turing test, vertical integration, Watson beat the top human players on Jeopardy!

If we have only one temperature sensor for the whole plot of land, we must make sure it’s accurate and working at all times: no messiness allowed. In contrast, if we have a sensor for every one of the hundreds of vines, we can use cheaper, less sophisticated sensors (as long as they do not introduce a systematic bias). Chances are that at some points a few sensors may report incorrect data, creating a less exact, or “messier,” dataset than the one from a single precise sensor. Any particular reading may be incorrect, but the aggregate of many readings will provide a more comprehensive picture. Because this dataset consists of more data points, it offers far greater value that likely offsets its messiness.
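The vineyard-sensor argument is easy to check with a simulation (a sketch; the error magnitudes are invented): individually messy readings, so long as none carries a systematic bias, average into an estimate that rivals a single precise sensor, even though any one cheap reading may be far off.

```python
import random
import statistics

random.seed(7)
true_temp = 15.0

# One precise, well-maintained sensor: small random error (sd 0.2).
precise_reading = true_temp + random.gauss(0, 0.2)

# Hundreds of cheap sensors: individually messy (sd 2.0) but with no
# systematic bias, so their errors cancel in aggregate.
cheap_readings = [true_temp + random.gauss(0, 2.0) for _ in range(500)]
aggregate = statistics.mean(cheap_readings)

# Any particular cheap reading may be badly wrong...
worst_single = max(abs(r - true_temp) for r in cheap_readings)
# ...yet the aggregate tracks the truth closely.
print(round(precise_reading, 2), round(aggregate, 2), round(worst_single, 2))
```

The caveat in the excerpt is the load-bearing one: this only works "as long as they do not introduce a systematic bias", since a shared bias would survive the averaging intact.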


pages: 337 words: 89,075

Understanding Asset Allocation: An Intuitive Approach to Maximizing Your Portfolio by Victor A. Canto

accounting loophole / creative accounting, airline deregulation, Alan Greenspan, Andrei Shleifer, asset allocation, Bretton Woods, business cycle, buy and hold, buy low sell high, California energy crisis, capital asset pricing model, commodity trading advisor, corporate governance, discounted cash flows, diversification, diversified portfolio, equity risk premium, financial engineering, fixed income, frictionless, global macro, high net worth, index fund, inflation targeting, invisible hand, John Meriwether, junk bonds, law of one price, liquidity trap, London Interbank Offered Rate, Long Term Capital Management, low cost airline, low interest rates, market bubble, merger arbitrage, money market fund, new economy, passive investing, Paul Samuelson, Performance of Mutual Funds in the Period, Phillips curve, price mechanism, purchasing power parity, risk free rate, risk tolerance, risk-adjusted returns, risk/return, rolling blackouts, Ronald Reagan, Savings and loan crisis, selection bias, seminal paper, shareholder value, Sharpe ratio, short selling, statistical arbitrage, stocks for the long run, survivorship bias, systematic bias, Tax Reform Act of 1986, the market place, transaction costs, Y2K, yield curve, zero-sum game

Consequently, the model is overly pessimistic during high and/or rising growth periods, and overly optimistic during low and/or declining growth periods. An investor can do better than the CEM, but that doesn’t mean the model should be thrown out entirely. Rather, let’s build upon the CEM. For the CEM to be a useful investment tool, we need to correct for the systematic bias produced by its failure to account for earnings growth. Modifying the valuation formula to account for sustainable growth is a trivial adjustment in the formulation. It turns out earnings growth acts to reduce the discount rate on a one-for-one basis. In other words, a $1 income stream in perpetuity discounted at a 5 percent rate has the same value as $1 that grows at 1 percent per year and is discounted at a 6 percent rate.
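The arithmetic in the passage can be checked with the growing-perpetuity (Gordon growth) formula, PV = C / (r − g), in which growth offsets the discount rate one-for-one, exactly as the author states:

```python
# Growing perpetuity: PV = C / (r - g).
def pv_perpetuity(cash_flow, discount_rate, growth_rate=0.0):
    return cash_flow / (discount_rate - growth_rate)

# $1 forever, discounted at 5% ...
flat = pv_perpetuity(1.0, 0.05)
# ... has the same value as $1 growing 1%/year, discounted at 6%.
growing = pv_perpetuity(1.0, 0.06, growth_rate=0.01)
print(round(flat, 6), round(growing, 6))  # → 20.0 20.0
```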


Corbyn by Richard Seymour

anti-communist, banking crisis, battle of ideas, Bernie Sanders, Boris Johnson, Brexit referendum, British Empire, call centre, capital controls, capitalist realism, centre right, collective bargaining, credit crunch, Donald Trump, eurozone crisis, fake news, first-past-the-post, full employment, gender pay gap, gentrification, housing crisis, income inequality, Jeremy Corbyn, knowledge economy, land value tax, liberal world order, mass immigration, means of production, moral panic, Naomi Klein, negative equity, Neil Kinnock, new economy, non-tariff barriers, Northern Rock, Occupy movement, offshore financial centre, pension reform, Philip Mirowski, post-war consensus, precariat, quantitative easing, race to the bottom, rent control, Snapchat, stakhanovite, systematic bias, Washington Consensus, wealth creators, Winter of Discontent, Wolfgang Streeck, working-age population, éminence grise

In terms of evidence, I know of no study which has said the BBC tends to reflect marginal or critical perspectives or ignores powerful interests. There is no evidence on that side. The problem is that bias gets discussed as if it reflected a hidden personal political agenda.’ So what, if not a hidden agenda, would explain systematic bias? Partly, Mills argues, it is a matter of the circulation of powerful milieus between media organisations, consultancies, political parties, and the state. Partly it is a matter of the BBC’s dependency on the government not just to ensure continued funding, uphold its charter and appoint its board members, but also for a great deal of its news content.


The Knowledge Machine: How Irrationality Created Modern Science by Michael Strevens

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Atul Gawande, coronavirus, COVID-19, dark matter, data science, Eddington experiment, Edmond Halley, Fellow of the Royal Society, fudge factor, germ theory of disease, Great Leap Forward, Gregor Mendel, heat death of the universe, Higgs boson, Intergovernmental Panel on Climate Change (IPCC), invention of movable type, invention of the telescope, Isaac Newton, Islamic Golden Age, Johannes Kepler, Large Hadron Collider, longitudinal study, Louis Pasteur, military-industrial complex, Murray Gell-Mann, Peace of Westphalia, Richard Feynman, Stephen Hawking, Steven Pinker, systematic bias, Thales of Miletus, the scientific method, Thomas Bayes, William of Occam

Kennefick clearly explains how a “change of scale” would create a systematic error, but he does not address the curious fact that Eddington and his coauthors make no attempt to convince their readers that such a change had occurred, rather than there having been a simple loss of focus that would not create any systematic bias in the astrographic measurements. As far as we can tell, Eddington simply chose the explanation for the blurriness of the astrographic’s photos that best suited his goals. I will take up the question of Eddington’s omission again in Chapters 3 and 7. Matthew Stanley’s “An Expedition to Heal the Wounds of War” is also largely sympathetic to Eddington’s treatment of the data and provides much fascinating historical background.


Calling Bullshit: The Art of Scepticism in a Data-Driven World by Jevin D. West, Carl T. Bergstrom

airport security, algorithmic bias, AlphaGo, Amazon Mechanical Turk, Andrew Wiles, Anthropocene, autism spectrum disorder, bitcoin, Charles Babbage, cloud computing, computer vision, content marketing, correlation coefficient, correlation does not imply causation, crowdsourcing, cryptocurrency, data science, deep learning, deepfake, delayed gratification, disinformation, Dmitri Mendeleev, Donald Trump, Elon Musk, epigenetics, Estimating the Reproducibility of Psychological Science, experimental economics, fake news, Ford Model T, Goodhart's law, Helicobacter pylori, Higgs boson, invention of the printing press, John Markoff, Large Hadron Collider, longitudinal study, Lyft, machine translation, meta-analysis, new economy, nowcasting, opioid epidemic / opioid crisis, p-value, Pluto: dwarf planet, publication bias, RAND corporation, randomized controlled trial, replication crisis, ride hailing / ride sharing, Ronald Reagan, selection bias, self-driving car, Silicon Valley, Silicon Valley startup, social graph, Socratic dialogue, Stanford marshmallow experiment, statistical model, stem cell, superintelligent machines, systematic bias, tech bro, TED Talk, the long tail, the scientific method, theory of mind, Tim Cook: Apple, twin studies, Uber and Lyft, Uber for X, uber lyft, When a measure becomes a target

These gender differences in recommendation letters could be driving some of the gender inequality in the academic and corporate worlds. In this context, a friend of ours posted this message on Twitter, describing a research study in which the authors analyzed the text from nearly nine hundred letters of recommendation for faculty positions in chemistry and in biochemistry looking for systematic bias: [image: two columns headed "male associated words" and "female associated words"] The implication of our friend's tweet was that this study had found large and systematic differences in how letter writers describe men and women as candidates. From the image that he shared, it appears that writers use words associated with exceptionalism and research ability when describing men, and words associated with diligence, teamwork, and teaching when describing women.
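The kind of analysis the excerpt describes can be sketched in a few lines. This is a hedged illustration, not the study's actual method: the word lists and the sample sentence below are invented for demonstration.

```python
# Invented word lists for illustration only; the real study used its own lexicons.
MALE_ASSOCIATED = {"brilliant", "outstanding", "exceptional", "research"}
FEMALE_ASSOCIATED = {"hardworking", "diligent", "dependable", "teaching"}

def tally(letter: str) -> dict:
    """Count male- vs female-associated words in one recommendation letter."""
    words = letter.lower().split()
    return {
        "male_associated": sum(w.strip(".,") in MALE_ASSOCIATED for w in words),
        "female_associated": sum(w.strip(".,") in FEMALE_ASSOCIATED for w in words),
    }

counts = tally("She is hardworking, dependable, and excellent at teaching.")
print(counts)
```

Aggregating such counts over hundreds of letters, split by candidate gender, is what would reveal (or fail to reveal) a systematic difference.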


pages: 367 words: 97,136

Beyond Diversification: What Every Investor Needs to Know About Asset Allocation by Sebastien Page

Andrei Shleifer, asset allocation, backtesting, Bernie Madoff, bitcoin, Black Swan, Bob Litterman, book value, business cycle, buy and hold, Cal Newport, capital asset pricing model, commodity super cycle, coronavirus, corporate governance, COVID-19, cryptocurrency, currency risk, discounted cash flows, diversification, diversified portfolio, en.wikipedia.org, equity risk premium, Eugene Fama: efficient market hypothesis, fixed income, future of work, Future Shock, G4S, global macro, implied volatility, index fund, information asymmetry, iterative process, loss aversion, low interest rates, market friction, mental accounting, merger arbitrage, oil shock, passive investing, prediction markets, publication bias, quantitative easing, quantitative trading / quantitative finance, random walk, reserve currency, Richard Feynman, Richard Thaler, risk free rate, risk tolerance, risk-adjusted returns, risk/return, Robert Shiller, robo advisor, seminal paper, shareholder value, Sharpe ratio, sovereign wealth fund, stochastic process, stochastic volatility, stocks for the long run, systematic bias, systematic trading, tail risk, transaction costs, TSMC, value at risk, yield curve, zero-coupon bond, zero-sum game

Consequently, after adjusting for the exposure to the market in the 40 post-event days, we could see a reduction in the alpha generated by our strategy of about 20 bps—a fraction of our 300 bps of precost alpha. We left it to the reader to interpret whether this slight excess beta constitutes a systematic bias, but if so, the impact remains small relative to the magnitude of the net alphas. Regarding liquidity, our outsiders have a similar liquidity profile, on average, to that of their peer constituents. The distribution is symmetrical: roughly half the low-ETF-beta stocks have above-average liquidity, and half have below-average liquidity. 17 Sample Portfolios and Something About Gunslingers Asset allocation is simply about seeking the highest possible return given our risk tolerance.


Forward: Notes on the Future of Our Democracy by Andrew Yang

2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, Affordable Care Act / Obamacare, Amazon Web Services, American Society of Civil Engineers: Report Card, basic income, benefit corporation, Bernie Sanders, blockchain, blue-collar work, call centre, centre right, clean water, contact tracing, coronavirus, correlation does not imply causation, COVID-19, data is the new oil, data science, deepfake, disinformation, Donald Trump, facts on the ground, fake news, forensic accounting, future of work, George Floyd, gig economy, global pandemic, income inequality, independent contractor, Jaron Lanier, Jeff Bezos, job automation, Kevin Roose, labor-force participation, Marc Benioff, Mark Zuckerberg, medical bankruptcy, new economy, obamacare, opioid epidemic / opioid crisis, pez dispenser, QAnon, recommendation engine, risk tolerance, rolodex, Ronald Reagan, Rutger Bregman, Sam Altman, Saturday Night Live, shareholder value, Shoshana Zuboff, Silicon Valley, Simon Kuznets, single-payer health, Snapchat, social distancing, SoftBank, surveillance capitalism, systematic bias, tech billionaire, TED Talk, The Day the Music Died, the long tail, TikTok, universal basic income, winner-take-all economy, working poor

We take seriously any complaints about the accuracy of our coverage, and any errors in our graphics were unintended. We apologize for any mistakes and omissions and look forward to working with Mr. Yang in the future.” They could apologize for the graphics—one of which they had already apologized for—without copping to any systematic bias, and everyone would move on. I figured my boycott would last a few days or so while they issued a press release and sent a message. Instead, they took my public complaints as an affront. At first, network sources told reporters that they had called and apologized to us when they had not.


pages: 322 words: 107,576

Bad Science by Ben Goldacre

Asperger Syndrome, classic study, confounding variable, correlation does not imply causation, disinformation, Edward Jenner, experimental subject, food desert, hygiene hypothesis, Ignaz Semmelweis: hand washing, John Snow's cholera map, Louis Pasteur, meta-analysis, Nelson Mandela, nocebo, offshore financial centre, p-value, placebo effect, public intellectual, publication bias, Richard Feynman, risk tolerance, Ronald Reagan, selection bias, selective serotonin reuptake inhibitor (SSRI), sugar pill, systematic bias, the scientific method, urban planning

In some inept trials, in all areas of medicine, patients are ‘randomised’ into the treatment or placebo group by the order in which they are recruited onto the study—the first patient in gets the real treatment, the second gets the placebo, the third the real treatment, the fourth the placebo, and so on. This sounds fair enough, but in fact it’s a glaring hole that opens your trial up to possible systematic bias. Let’s imagine there is a patient who the homeopath believes to be a no-hoper, a heart-sink patient who’ll never really get better, no matter what treatment he or she gets, and the next place available on the study is for someone going into the ‘homeopathy’ arm of the trial. It’s not inconceivable that the homeopath might just decide—again, consciously or unconsciously—that this particular patient ‘probably wouldn’t really be interested’ in the trial.
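The flaw in allocating patients by order of recruitment can be made concrete. The sketch below (not from the book) contrasts alternation, where every future assignment is predictable to anyone who knows the scheme, with a shuffled, balanced allocation list of the kind a properly randomised trial would prepare in advance:

```python
import random

def alternating_allocation(n):
    # "First in gets the real treatment, second the placebo, ...": predictable.
    return ["treatment" if i % 2 == 0 else "placebo" for i in range(n)]

def randomised_allocation(n, seed=None):
    # A shuffled, balanced list; in a real trial this would also be
    # concealed from recruiters so foreknowledge cannot steer patients.
    arms = ["treatment"] * (n // 2) + ["placebo"] * (n - n // 2)
    random.Random(seed).shuffle(arms)
    return arms

alt = alternating_allocation(10)
rnd = randomised_allocation(10, seed=1)

# With alternation, an insider can predict every slot exactly:
predictable = all(
    arm == ("treatment" if i % 2 == 0 else "placebo") for i, arm in enumerate(alt)
)
print(predictable, rnd)
```

The point is not the code but the information leak: a recruiter who knows the next slot is "homeopathy" can, consciously or not, decide which patients get offered the trial.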


pages: 417 words: 109,367

The End of Doom: Environmental Renewal in the Twenty-First Century by Ronald Bailey

3D printing, additive manufacturing, agricultural Revolution, Albert Einstein, Anthropocene, Asilomar, autonomous vehicles, biodiversity loss, business cycle, carbon tax, Cass Sunstein, Climatic Research Unit, commodity super cycle, conceptual framework, corporate governance, creative destruction, credit crunch, David Attenborough, decarbonisation, dematerialisation, demographic transition, disinformation, disruptive innovation, diversified portfolio, double helix, energy security, failed state, financial independence, Ford Model T, Garrett Hardin, Gary Taubes, Great Leap Forward, hydraulic fracturing, income inequality, Induced demand, Intergovernmental Panel on Climate Change (IPCC), invisible hand, knowledge economy, meta-analysis, Naomi Klein, negative emissions, Neolithic agricultural revolution, ocean acidification, oil shale / tar sands, oil shock, pattern recognition, peak oil, Peter Calthorpe, phenotype, planetary scale, precautionary principle, price stability, profit motive, purchasing power parity, race to the bottom, RAND corporation, Recombinant DNA, rent-seeking, rewilding, Stewart Brand, synthetic biology, systematic bias, Tesla Model S, trade liberalization, Tragedy of the Commons, two and twenty, University of East Anglia, uranium enrichment, women in the workforce, yield curve

They find that the models actually do simulate similar lengthy hiatuses during that period; they just don’t happen to coincide with the current observational hiatus. They find that due to natural variation, the observed warming might be at the upper or lower limit of simulated rates, but there is no indication of a systematic bias in model process. “Our conclusion is that climate models are fundamentally doing the right thing,” University of Leeds researcher Piers Forster explained. “They [climate models] do in fact correctly represent these 15-year short-term fluctuations but because they are inherently chaotic they don’t get them at the right time.”


pages: 336 words: 113,519

The Undoing Project: A Friendship That Changed Our Minds by Michael Lewis

Albert Einstein, availability heuristic, behavioural economics, Cass Sunstein, choice architecture, complexity theory, Daniel Kahneman / Amos Tversky, Donald Trump, Douglas Hofstadter, endowment effect, feminist movement, framing effect, hindsight bias, John von Neumann, Kenneth Arrow, Linda problem, loss aversion, medical residency, Menlo Park, Murray Gell-Mann, Nate Silver, New Journalism, Paul Samuelson, peak-end rule, Richard Thaler, Saturday Night Live, Skinner box, Stanford marshmallow experiment, statistical model, systematic bias, the new new thing, Thomas Bayes, Walter Mischel, Yom Kippur War

He went and pulled all the other articles in other publications written by Kahneman and Tversky. “I have vivid memories of running from one article to another,” says Thaler. “As if I have discovered the secret pot of gold. For a while I wasn’t sure why I was so excited. Then I realized: They had one idea. Which was systematic bias.” If people could be systematically wrong, their mistakes couldn’t be ignored. The irrational behavior of the few would not be offset by the rational behavior of the many. People could be systematically wrong, and so markets could be systematically wrong, too. Thaler got someone to send him a draft of “Value Theory.”


Fortunes of Change: The Rise of the Liberal Rich and the Remaking of America by David Callahan

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, affirmative action, Albert Einstein, American Legislative Exchange Council, An Inconvenient Truth, automated trading system, benefit corporation, Bernie Sanders, Big Tech, Bonfire of the Vanities, book value, carbon credits, carbon footprint, carbon tax, Carl Icahn, carried interest, clean water, corporate social responsibility, David Brooks, demographic transition, desegregation, don't be evil, Donald Trump, Douglas Engelbart, Douglas Engelbart, Edward Thorp, financial deregulation, financial engineering, financial independence, global village, Gordon Gekko, greed is good, Herbert Marcuse, high net worth, income inequality, Irwin Jacobs: Qualcomm, Jeff Bezos, John Bogle, John Markoff, Kickstarter, knowledge economy, knowledge worker, Larry Ellison, Marc Andreessen, Mark Zuckerberg, market fundamentalism, medical malpractice, mega-rich, Mitch Kapor, Naomi Klein, NetJets, new economy, offshore financial centre, Peter Thiel, plutocrats, power law, profit maximization, quantitative trading / quantitative finance, Ralph Nader, Renaissance Technologies, Richard Florida, Robert Bork, rolodex, Ronald Reagan, school vouchers, short selling, Silicon Valley, Social Responsibility of Business Is to Increase Its Profits, stem cell, Steve Ballmer, Steve Jobs, systematic bias, systems thinking, unpaid internship, Upton Sinclair, Vanguard fund, War on Poverty, working poor, World Values Survey

Moreover, America’s best-endowed universities didn’t get that way by accident; they built their wealth by giving preference to alumni kids, many of whom come from money, and to so-called development cases—the children of wealthy donors or potential donors. In The Price of Admission, reporter Daniel Golden documents a systematic bias on the part of elite universities to admit rich kids. In that sense, these universities may do more to entrench today’s inequality than to challenge such patterns. That said, the progressive values now being inculcated at elite universities matter. Universities have a long track record of turning rich kids into critics of the existing order.


pages: 320 words: 33,385

Market Risk Analysis, Quantitative Methods in Finance by Carol Alexander

asset allocation, backtesting, barriers to entry, Brownian motion, capital asset pricing model, constrained optimization, credit crunch, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, financial engineering, fixed income, implied volatility, interest rate swap, low interest rates, market friction, market microstructure, p-value, performance metric, power law, proprietary trading, quantitative trading / quantitative finance, random walk, risk free rate, risk tolerance, risk-adjusted returns, risk/return, seminal paper, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, systematic bias, Thomas Bayes, transaction costs, two and twenty, value at risk, volatility smile, Wiener process, yield curve, zero-sum game

Later we shall prove that the OLS estimation method – i.e. to minimize the residual sum of squares – is optimal when the error is generated by an independent and identically distributed (i.i.d.) process.7 So for the moment we shall assume that εt ∼ i.i.d.(0, σ²) (I.4.12). It makes sense of course to assume that the expectation of the error is zero. If it were not zero we would not have a random error, we would have a systematic bias in the error and the regression line would not pass through the middle of the scatter plot. So the definition (I.4.12) is introducing the variance of the error process, σ², into our notation. This is the third and final parameter of the simple linear model. The OLS estimate of σ² is s² = RSS/(T − 2) (I.4.13), where the numerator in (I.4.13) is understood to be the residual sum of squares that has been minimized by the choice of the OLS estimates.
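The variance estimator in the excerpt is easy to verify numerically. A minimal sketch, not from the book, with simulated data and invented parameter values: fit the simple linear model by closed-form OLS, then compute s² = RSS/(T − 2) and check it recovers the error variance.

```python
import random

random.seed(7)
T = 500
a_true, b_true, sigma = 1.0, 2.0, 0.5   # invented for illustration
x = [random.gauss(0, 1) for _ in range(T)]
# y = a + b*x + e, with i.i.d. zero-mean errors as in (I.4.12)
y = [a_true + b_true * xi + random.gauss(0, sigma) for xi in x]

# Closed-form OLS estimates minimise the residual sum of squares
xbar, ybar = sum(x) / T, sum(y) / T
b_hat = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sum(
    (xi - xbar) ** 2 for xi in x
)
a_hat = ybar - b_hat * xbar

# Residual sum of squares and the variance estimate s^2 = RSS / (T - 2)
RSS = sum((yi - a_hat - b_hat * xi) ** 2 for xi, yi in zip(x, y))
s2 = RSS / (T - 2)
print(round(a_hat, 2), round(b_hat, 2), round(s2, 3))
```

With sigma = 0.5 the true error variance is 0.25, and s² should land close to it for any reasonable sample size.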


pages: 402 words: 129,876

Bad Pharma: How Medicine Is Broken, and How We Can Fix It by Ben Goldacre

behavioural economics, classic study, data acquisition, framing effect, if you build it, they will come, illegal immigration, income per capita, meta-analysis, placebo effect, publication bias, randomized controlled trial, Ronald Reagan, selective serotonin reuptake inhibitor (SSRI), Simon Singh, sugar pill, systematic bias, WikiLeaks

One way to restrict the harm that can come from early stopping is to set up ‘stopping rules’, specified before the trial begins, and carefully calculated to be extreme enough that they are unlikely to be triggered by the chance variation you’d expect to see, over time, in any trial. Such rules are useful because they restrict the intrusion of human judgement, which can introduce systematic bias. But whatever we do about early stopping in medicine, it will probably pollute the data. A review from 2010 took around a hundred truncated trials, and four hundred matched trials that ran their natural course to the end: the truncated trials reported much bigger benefits, overstating the usefulness of the treatments they were testing by about a quarter.13 Another recent review found that the number of trials stopped early has doubled since 1990,14 which is probably not good news.
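The inflation the review found can be reproduced with a toy Monte Carlo. The sketch below is illustrative only (effect size, look schedule, and stopping threshold are all invented): trials that cross an interim threshold stop early, and those stopped trials systematically overstate the effect.

```python
import random
import statistics

def run_trial(true_effect, n_max=400, look_every=50, z_stop=2.5, rng=random):
    """Accrue noisy observations; stop early if an interim look shows a big 'win'."""
    diffs = []
    for i in range(1, n_max + 1):
        diffs.append(true_effect + rng.gauss(0, 1))  # unit-variance observations
        if i % look_every == 0 and i < n_max:
            mean = statistics.fmean(diffs)
            z = mean * len(diffs) ** 0.5             # z-score of the running mean
            if z > z_stop:
                return mean, True                    # truncated: stopped early
    return statistics.fmean(diffs), False            # ran to completion

rng = random.Random(0)
truncated, completed = [], []
for _ in range(2000):
    est, stopped = run_trial(0.1, rng=rng)
    (truncated if stopped else completed).append(est)

print(statistics.fmean(truncated), statistics.fmean(completed))
```

Even with a fixed stopping rule and a genuine (small) effect, the trials selected by early stopping are exactly those whose chance fluctuations ran high, so their average estimate exceeds both the truth and the completed trials' average.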


pages: 500 words: 145,005

Misbehaving: The Making of Behavioral Economics by Richard H. Thaler

3Com Palm IPO, Alan Greenspan, Albert Einstein, Alvin Roth, Amazon Mechanical Turk, Andrei Shleifer, Apple's 1984 Super Bowl advert, Atul Gawande, behavioural economics, Berlin Wall, Bernie Madoff, Black-Scholes formula, book value, business cycle, capital asset pricing model, Cass Sunstein, Checklist Manifesto, choice architecture, clean water, cognitive dissonance, conceptual framework, constrained optimization, Daniel Kahneman / Amos Tversky, delayed gratification, diversification, diversified portfolio, Edward Glaeser, endowment effect, equity premium, equity risk premium, Eugene Fama: efficient market hypothesis, experimental economics, Fall of the Berlin Wall, George Akerlof, hindsight bias, Home mortgage interest deduction, impulse control, index fund, information asymmetry, invisible hand, Jean Tirole, John Nash: game theory, John von Neumann, Kenneth Arrow, Kickstarter, late fees, law of one price, libertarian paternalism, Long Term Capital Management, loss aversion, low interest rates, market clearing, Mason jar, mental accounting, meta-analysis, money market fund, More Guns, Less Crime, mortgage debt, Myron Scholes, Nash equilibrium, Nate Silver, New Journalism, nudge unit, PalmPilot, Paul Samuelson, payday loans, Ponzi scheme, Post-Keynesian economics, presumed consent, pre–internet, principal–agent problem, prisoner's dilemma, profit maximization, random walk, randomized controlled trial, Richard Thaler, risk free rate, Robert Shiller, Robert Solow, Ronald Coase, Silicon Valley, South Sea Bubble, Stanford marshmallow experiment, statistical model, Steve Jobs, sunk-cost fallacy, Supply of New York City Cabdrivers, systematic bias, technology bubble, The Chicago School, The Myth of the Rational Market, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, transaction costs, ultimatum game, Vilfredo Pareto, Walter Mischel, zero-sum game

Even across many people, the errors will not average out to zero. Although I did not appreciate it fully at the time, Kahneman and Tversky’s insights had inched me forward so that I was just one step away from doing something serious with my list. Each of the items on the List was an example of a systematic bias. The items on the List had another noteworthy feature. In every case, economic theory had a highly specific prediction about some key factor—such as the presence of the cashews or the amount paid for the basketball game tickets—that the theory said should not influence decisions. They were all supposedly irrelevant factors, or SIFs.


pages: 494 words: 142,285

The Future of Ideas: The Fate of the Commons in a Connected World by Lawrence Lessig

AltaVista, Andy Kessler, AOL-Time Warner, barriers to entry, Bill Atkinson, business process, Cass Sunstein, commoditize, computer age, creative destruction, dark matter, decentralized internet, Dennis Ritchie, disintermediation, disruptive innovation, Donald Davies, Erik Brynjolfsson, Free Software Foundation, Garrett Hardin, George Gilder, Hacker Ethic, Hedy Lamarr / George Antheil, history of Unix, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, Innovator's Dilemma, invention of hypertext, inventory management, invisible hand, Jean Tirole, Jeff Bezos, John Gilmore, John Perry Barlow, Joseph Schumpeter, Ken Thompson, Kenneth Arrow, Larry Wall, Leonard Kleinrock, linked data, Marc Andreessen, Menlo Park, Mitch Kapor, Network effects, new economy, OSI model, packet switching, peer-to-peer, peer-to-peer model, price mechanism, profit maximization, RAND corporation, rent control, rent-seeking, RFC: Request For Comment, Richard Stallman, Richard Thaler, Robert Bork, Ronald Coase, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, smart grid, software patent, spectrum auction, Steve Crocker, Steven Levy, Stewart Brand, systematic bias, Ted Nelson, Telecommunications Act of 1996, the Cathedral and the Bazaar, The Chicago School, tragedy of the anticommons, Tragedy of the Commons, transaction costs, vertical integration, Yochai Benkler, zero-sum game

Allison and Mark A. Lemley, “Who's Patenting What? An Empirical Exploration of Patent Prosecution,” Vanderbilt Law Review 53 (2000): 2099, 2146; John R. Allison and Mark A. Lemley, “How Federal Circuit Judges Vote in Patent Validity Cases,” Florida State University Law Review 27 (2000): 745, 765 (concluding no systematic bias in judges' votes). 71 Jaffe, 46. 72 Ibid., 47. Jaffe's argument here is narrower than the point I am making in this section. His concern is the social costs from too much effort being devoted to the pursuit of patented innovation. My concern is the cost of patents on the innovation process generally. 73 “Patently Absurd?”


pages: 577 words: 149,554

The Problem of Political Authority: An Examination of the Right to Coerce and the Duty to Obey by Michael Huemer

Cass Sunstein, Chelsea Manning, cognitive dissonance, cuban missile crisis, Daniel Kahneman / Amos Tversky, en.wikipedia.org, Eratosthenes, experimental subject, framing effect, Garrett Hardin, Gini coefficient, illegal immigration, impulse control, Isaac Newton, Julian Assange, laissez-faire capitalism, land bank, Machinery of Freedom by David Friedman, Milgram experiment, moral hazard, Phillip Zimbardo, profit maximization, profit motive, Ralph Nader, RAND corporation, rent-seeking, Ronald Coase, Stanford prison experiment, systematic bias, The Wealth of Nations by Adam Smith, Tyler Cowen, unbiased observer, uranium enrichment, WikiLeaks

And if we felt this requirement to obey, it is likely that this would lead us to think and say that we were obliged to obey and then – in the case of the more philosophically minded among us – to devise theories to explain why we have this obligation. Thus, the widespread belief in political authority does not provide strong evidence for the reality of political authority, since that belief can be explained as the product of systematic bias. 6.3 Cognitive dissonance According to the widely accepted theory of cognitive dissonance, we experience an uncomfortable state, known as ‘cognitive dissonance’, when we have two or more cognitions that stand in conflict or tension with one another – and particularly when our behavior or other reactions appear to conflict with our self-image.15 We then tend to alter our beliefs or reactions to reduce the dissonance.


Beautiful Data: The Stories Behind Elegant Data Solutions by Toby Segaran, Jeff Hammerbacher

23andMe, airport security, Amazon Mechanical Turk, bioinformatics, Black Swan, business intelligence, card file, cloud computing, computer vision, correlation coefficient, correlation does not imply causation, crowdsourcing, Daniel Kahneman / Amos Tversky, DARPA: Urban Challenge, data acquisition, data science, database schema, double helix, en.wikipedia.org, epigenetics, fault tolerance, Firefox, Gregor Mendel, Hans Rosling, housing crisis, information retrieval, lake wobegon effect, Large Hadron Collider, longitudinal study, machine readable, machine translation, Mars Rover, natural language processing, openstreetmap, Paradox of Choice, power law, prediction markets, profit motive, semantic web, sentiment analysis, Simon Singh, social bookmarking, social graph, SPARQL, sparse data, speech recognition, statistical model, supply-chain management, systematic bias, TED Talk, text mining, the long tail, Vernor Vinge, web application

Spatial treemap of terms occurring in geograph titles and comments for selected element descriptors in the beach base level. Displacement vectors show absolute locations of leaf nodes in this enlarged section of Figure 6-9. (See Color Plate 20.) Our graphics and our exploration are incomplete. We are investigating the effects of systematic bias in community-contributed geographic information and developing strategies to mitigate this. We are developing notations to describe the visual design space and interactive applications through which this can be explored. We are yet to consider whether the geographically varying relationships that we are able to identify in Geograph are consistent over time.


pages: 470 words: 148,730

Good Economics for Hard Times: Better Answers to Our Biggest Problems by Abhijit V. Banerjee, Esther Duflo

3D printing, accelerated depreciation, affirmative action, Affordable Care Act / Obamacare, air traffic controllers' union, Airbnb, basic income, behavioural economics, Bernie Sanders, Big Tech, business cycle, call centre, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, carbon credits, carbon tax, Cass Sunstein, charter city, company town, congestion pricing, correlation does not imply causation, creative destruction, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, decarbonisation, Deng Xiaoping, Donald Trump, Edward Glaeser, en.wikipedia.org, endowment effect, energy transition, Erik Brynjolfsson, experimental economics, experimental subject, facts on the ground, fake news, fear of failure, financial innovation, flying shuttle, gentrification, George Akerlof, Great Leap Forward, green new deal, high net worth, immigration reform, income inequality, Indoor air pollution, industrial cluster, industrial robot, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), Jane Jacobs, Jean Tirole, Jeff Bezos, job automation, Joseph Schumpeter, junk bonds, Kevin Roose, labor-force participation, land reform, Les Trente Glorieuses, loss aversion, low skilled workers, manufacturing employment, Mark Zuckerberg, mass immigration, middle-income trap, Network effects, new economy, New Urbanism, no-fly zone, non-tariff barriers, obamacare, off-the-grid, offshore financial centre, One Laptop per Child (OLPC), open economy, Paul Samuelson, place-making, post-truth, price stability, profit maximization, purchasing power parity, race to the bottom, RAND corporation, randomized controlled trial, restrictive zoning, Richard Thaler, ride hailing / ride sharing, Robert Gordon, Robert Solow, Ronald Reagan, Savings and loan crisis, school choice, Second Machine Age, secular stagnation, self-driving car, shareholder value, short selling, Silicon Valley, smart meter, social graph, spinning jenny, Steve Jobs, systematic bias, Tax Reform Act of 1986, tech worker, technology bubble, The Chicago School, The Future of Employment, The Market for Lemons, The Rise and Fall of American Growth, The Wealth of Nations by Adam Smith, total factor productivity, trade liberalization, transaction costs, trickle-down economics, Twitter Arab Spring, universal basic income, urban sprawl, very high income, War on Poverty, women in the workforce, working-age population, Y2K

If people don’t have the right information in Nepal, with its many employment agencies, vast flows of workers in and out, and a government genuinely concerned about the welfare of its international migrants, one can only guess at how confused most potential migrants are elsewhere. Confusion could of course go either way, dampening migration, like in Nepal, or boosting it if people are overoptimistic. Why then is there a systematic bias against going? RISK VERSUS UNCERTAINTY Perhaps the exaggerated sense of mortality Maheshwor’s respondents reported should be read as a metaphor for a general sense of foreboding. Migration, after all, is leaving the familiar to embrace the unknown, and the unknown is more than just a list of different potential outcomes with associated probabilities, as economists would like to describe it.


pages: 592 words: 161,798

The Future of War by Lawrence Freedman

Albert Einstein, autonomous vehicles, Berlin Wall, Black Swan, Boeing 747, British Empire, colonial rule, conceptual framework, crowdsourcing, cuban missile crisis, currency manipulation / currency intervention, disinformation, Donald Trump, Dr. Strangelove, driverless car, drone strike, en.wikipedia.org, energy security, Ernest Rutherford, failed state, Fall of the Berlin Wall, Francis Fukuyama: the end of history, global village, Google Glasses, Herman Kahn, Intergovernmental Panel on Climate Change (IPCC), John Markoff, long peace, megacity, Mikhail Gorbachev, military-industrial complex, moral hazard, mutually assured destruction, New Journalism, Norbert Wiener, nuclear taboo, open economy, pattern recognition, Peace of Westphalia, RAND corporation, Ronald Reagan, South China Sea, speech recognition, Steven Pinker, Strategic Defense Initiative, Stuxnet, Suez canal 1869, Suez crisis 1956, systematic bias, the scientific method, uranium enrichment, urban sprawl, Valery Gerasimov, Wargames Reagan, WarGames: Global Thermonuclear War, WikiLeaks, zero day

Much of the MID was put together before the availability of modern search engines, and so used whatever material was then available in libraries. In the 2010s, a team of researchers going through the individual cases meticulously found the MID database to be unreliable, although that was not a word they used. They praised the effort and the utility of the database, insisted that they found no evidence of systematic bias, and offered detailed proposals to rectify the problems they encountered.37 Nonetheless, their investigations identified problems with almost 70 per cent of the MID cases, leading to proposals to drop 240, merge another 72 with similar cases, revise substantially a further 234, and make minor changes to another 1009.


The Washington Connection and Third World Fascism by Noam Chomsky

anti-communist, business climate, colonial rule, death from overwork, declining real wages, deliberate practice, disinformation, European colonialism, friendly fire, Gini coefficient, guns versus butter model, income inequality, income per capita, land bank, land reform, land tenure, low interest rates, military-industrial complex, new economy, RAND corporation, Seymour Hersh, strikebreaker, systematic bias, union organizing

Many of them were directly installed by us or are the beneficiaries of our direct intervention, and most of the others came into existence with our tacit support, using military equipment and training supplied by the United States. Our massive intervention and subversion over the past 25 years has been confined almost exclusively to overthrowing reformers, democrats, and radicals—we have rarely “destabilized” right-wing military regimes no matter how corrupt or terroristic.50 This systematic bias in intervention is only part of the larger system of connections—military, economic, and political—that have allowed the dominant power to shape the primary characteristics of the other states in its domains in accordance with its interests. The Brazilian counterrevolution, as we have noted (cf. note 6), took place with the connivance of the United States and was followed by immediate recognition and consistent support, just as in Guatemala ten years earlier and elsewhere, repeatedly.


pages: 578 words: 168,350

Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies by Geoffrey West

"World Economic Forum" Davos, Alfred Russel Wallace, Anthropocene, Anton Chekhov, Benoit Mandelbrot, Black Swan, British Empire, butterfly effect, caloric restriction, caloric restriction, carbon footprint, Cesare Marchetti: Marchetti’s constant, clean water, coastline paradox / Richardson effect, complexity theory, computer age, conceptual framework, continuous integration, corporate social responsibility, correlation does not imply causation, cotton gin, creative destruction, dark matter, Deng Xiaoping, double helix, driverless car, Dunbar number, Edward Glaeser, endogenous growth, Ernest Rutherford, first square of the chessboard, first square of the chessboard / second half of the chessboard, Frank Gehry, Geoffrey West, Santa Fe Institute, Great Leap Forward, Guggenheim Bilbao, housing crisis, Index librorum prohibitorum, invention of agriculture, invention of the telephone, Isaac Newton, Jane Jacobs, Jeff Bezos, Johann Wolfgang von Goethe, John von Neumann, Kenneth Arrow, laissez-faire capitalism, Large Hadron Collider, Larry Ellison, Lewis Mumford, life extension, Mahatma Gandhi, mandelbrot fractal, Marc Benioff, Marchetti’s constant, Masdar, megacity, Murano, Venice glass, Murray Gell-Mann, New Urbanism, Oklahoma City bombing, Peter Thiel, power law, profit motive, publish or perish, Ray Kurzweil, Richard Feynman, Richard Florida, Salesforce, seminal paper, Silicon Valley, smart cities, Stephen Hawking, Steve Jobs, Stewart Brand, Suez canal 1869, systematic bias, systems thinking, technological singularity, The Coming Technological Singularity, The Death and Life of Great American Cities, the scientific method, the strength of weak ties, time dilation, too big to fail, transaction costs, urban planning, urban renewal, Vernor Vinge, Vilfredo Pareto, Von Neumann architecture, Whole Earth Catalog, Whole Earth Review, wikimedia commons, working poor

A lingering issue of possible concern is that the data cover only sixty years, so companies older than this are automatically excluded. Actually, it’s worse than this because the analysis includes only those companies that were born and died in the time window between 1950 and 2009, thereby excluding all those that were born before 1950 and/or were still alive in 2009. This could clearly lead to a systematic bias in the estimates of life expectancy. A more complete analysis therefore needs to include these so-called censored companies, whose life spans are at least as long as and likely longer than the period over which they appear in the data set. This actually involves a sizable number of companies: in the sixty years covered, 6,873 firms were still alive at the end of the window in 2009.
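The censoring effect West describes can be sketched with a toy simulation (the parameters below are hypothetical, not West's data or method): firms born late in the window can only be fully observed if they die young, so averaging only the born-and-died-inside-the-window lifespans biases the estimate downward.

```python
import random

random.seed(42)

TRUE_MEAN = 10.0  # assumed true mean company lifespan in years (hypothetical)
WINDOW = 60.0     # observation window, like 1950-2009 in the text

# Companies are born uniformly through the window; lifespans are exponential.
births = [random.uniform(0, WINDOW) for _ in range(100_000)]
spans = [random.expovariate(1 / TRUE_MEAN) for _ in births]

# Naive analysis: keep only firms that were born AND died inside the window,
# discarding the right-censored firms still alive at its end.
uncensored = [s for b, s in zip(births, spans) if b + s <= WINDOW]
naive_mean = sum(uncensored) / len(uncensored)

print(f"true mean lifespan: {TRUE_MEAN:.1f} years")
print(f"naive estimate from uncensored firms only: {naive_mean:.1f} years")
```

The naive estimate comes out below the true mean, which is why a more complete analysis has to account for the censored companies rather than drop them.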


Basic Income: A Radical Proposal for a Free Society and a Sane Economy by Philippe van Parijs, Yannick Vanderborght

Airbnb, Albert Einstein, basic income, Berlin Wall, Bertrand Russell: In Praise of Idleness, carbon tax, centre right, collective bargaining, cryptocurrency, David Graeber, declining real wages, degrowth, diversified portfolio, Edward Snowden, eurozone crisis, Fall of the Berlin Wall, feminist movement, full employment, future of work, George Akerlof, Herbert Marcuse, illegal immigration, income per capita, informal economy, Jeremy Corbyn, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Kickstarter, Marshall McLuhan, means of production, minimum wage unemployment, Money creation, open borders, Paul Samuelson, pension reform, Post-Keynesian economics, precariat, price mechanism, profit motive, purchasing power parity, quantitative easing, race to the bottom, road to serfdom, Robert Solow, Rutger Bregman, Second Machine Age, secular stagnation, selection bias, sharing economy, sovereign wealth fund, systematic bias, The Spirit Level, The Wealth of Nations by Adam Smith, Thomas Malthus, Tobin tax, universal basic income, urban planning, urban renewal, War on Poverty, working poor

It also helps to prevent unemployed workers from sinking into unemployability through the mutual reinforcement of the obsolescence of their productive skills and the lowering of their professional aspirations. Second, the combination of the last two unconditionalities—universality and freedom from obligation—generates a systematic bias in favor of the creation and survival of jobs with high training content. One aspect of this is that a basic income helps give all young people access to unpaid or low-paid internships, otherwise monopolized by the privileged whose parents are able and willing to provide them with what amounts to privately funded basic incomes.


pages: 733 words: 179,391

Adaptive Markets: Financial Evolution at the Speed of Thought by Andrew W. Lo

Alan Greenspan, Albert Einstein, Alfred Russel Wallace, algorithmic trading, Andrei Shleifer, Arthur Eddington, Asian financial crisis, asset allocation, asset-backed security, backtesting, bank run, barriers to entry, Bear Stearns, behavioural economics, Berlin Wall, Bernie Madoff, bitcoin, Bob Litterman, Bonfire of the Vanities, bonus culture, break the buck, Brexit referendum, Brownian motion, business cycle, business process, butterfly effect, buy and hold, capital asset pricing model, Captain Sullenberger Hudson, carbon tax, Carmen Reinhart, collapse of Lehman Brothers, collateralized debt obligation, commoditize, computerized trading, confounding variable, corporate governance, creative destruction, Credit Default Swap, credit default swaps / collateralized debt obligations, cryptocurrency, Daniel Kahneman / Amos Tversky, delayed gratification, democratizing finance, Diane Coyle, diversification, diversified portfolio, do well by doing good, double helix, easy for humans, difficult for computers, equity risk premium, Ernest Rutherford, Eugene Fama: efficient market hypothesis, experimental economics, experimental subject, Fall of the Berlin Wall, financial deregulation, financial engineering, financial innovation, financial intermediation, fixed income, Flash crash, Fractional reserve banking, framing effect, Glass-Steagall Act, global macro, Gordon Gekko, greed is good, Hans Rosling, Henri Poincaré, high net worth, housing crisis, incomplete markets, index fund, information security, interest rate derivative, invention of the telegraph, Isaac Newton, it's over 9,000, James Watt: steam engine, Jeff Hawkins, Jim Simons, job satisfaction, John Bogle, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Meriwether, Joseph Schumpeter, Kenneth Rogoff, language acquisition, London Interbank Offered Rate, Long Term Capital Management, longitudinal study, loss aversion, Louis Pasteur, mandelbrot fractal, margin call, Mark Zuckerberg, market 
fundamentalism, martingale, megaproject, merger arbitrage, meta-analysis, Milgram experiment, mirror neurons, money market fund, moral hazard, Myron Scholes, Neil Armstrong, Nick Leeson, old-boy network, One Laptop per Child (OLPC), out of africa, p-value, PalmPilot, paper trading, passive investing, Paul Lévy, Paul Samuelson, Paul Volcker talking about ATMs, Phillips curve, Ponzi scheme, predatory finance, prediction markets, price discovery process, profit maximization, profit motive, proprietary trading, public intellectual, quantitative hedge fund, quantitative trading / quantitative finance, RAND corporation, random walk, randomized controlled trial, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, risk tolerance, Robert Shiller, Robert Solow, Sam Peltzman, Savings and loan crisis, seminal paper, Shai Danziger, short selling, sovereign wealth fund, Stanford marshmallow experiment, Stanford prison experiment, statistical arbitrage, Steven Pinker, stochastic process, stocks for the long run, subprime mortgage crisis, survivorship bias, systematic bias, Thales and the olive presses, The Great Moderation, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Malthus, Thorstein Veblen, Tobin tax, too big to fail, transaction costs, Triangle Shirtwaist Factory, ultimatum game, uptick rule, Upton Sinclair, US Airways Flight 1549, Walter Mischel, Watson beat the top human players on Jeopardy!, WikiLeaks, Yogi Berra, zero-sum game

., 100 Sobel, Russell, 206 social Darwinism, 215 social exclusion, 85–86 social media, 55, 270, 405 Société Générale, 60–61 Society of Mind, The (Minsky), 132–133 sociobiology, 170–174, 216–217 Sociobiology (Wilson), 170–171 Solow, Herbert, 395 Soros, George, 6, 219, 222–223, 224, 227, 234, 244, 277 sovereign wealth funds, 230, 299, 409–410 Soviet Union, 411 Space Shuttle Challenger, 12–16, 24, 38 specialization, 217 speech synthesis, 132 Sperry, Roger, 113–114 “spoofing,” 360 Springer, James, 159 SR-52 programmable calculator, 357 stagflation, 37 Standard Portfolio Analysis of Risk (SPAN), 369–370 Stanton, Angela, 338 starfish, 192, 242 Star Trek, 395–397, 411, 414 stationarity, 253–255, 279, 282 statistical arbitrage (“statarb”), 284, 286, 288–291, 292–293, 362 statistical tests, 47 Steenbarger, Brett, 94 Stein, Carolyn, 69 sterilization, 171, 174 Stiglitz, Joseph, 224, 278, 310 Stocks for the Long Run (Siegel), 253 stock splits, 24, 47 Stone, Oliver, 346 Stone Age, 150, 163, 165 stone tools, 150–151, 153 stop-loss orders, 359 Strasberg, Lee, 105 stress, 3, 75, 93, 101, 122, 160–161, 346, 413–415 strong connectedness, 374 Strong Story Hypothesis, 133 Strumpf, Koleman, 39 “stub quotes,” 360 subjective value, 100 sublenticular extended amygdala, 89 subprime mortgages, 290, 292, 293, 297, 321, 327, 376, 377, 410 Sugihara, George, 366 suicide, 160 Sullenberger, Chesley, 381 Summers, Lawrence (Larry), 50, 315–316, 319–320, 379 sunlight, 108 SuperDot (trading system), 236 supply and demand curves, 29, 30, 31–33, 34 Surowiecki, James, 5, 16 survey research, 40 Sussman, Donald, 237–238 swaps, 243, 298, 300 Swedish Twin Registry, 161 systematic bias, 56 systematic risk, 194, 199–203, 204, 205, 250–251, 348, 389 systemic risk, 319; Bank of England’s measurement of, 366–367; government as source of, 361; in hedge fund industry, 291, 317; of large vs. 
small shocks, 315; managing, 370–371, 376–378, 387; transparency of, 384–385; trust linked to, 344 Takahashi, Hidehiko, 86 Tanner, Carmen, 353 Tanzania, 150 Tartaglia, Nunzio, 236 Tattersall, Ian, 150, 154 Tech Bubble, 40 telegraphy, 356 Tennyson, Alfred, Baron, 144 testosterone, 108, 337–338 Texas hold ’em, 59–60 Texas Instruments, 357, 384 Thackray, John, 234 Thales, 16 Théorie de la Spéculation (Bachelier), 19 theory of mind, 109–111 thermal homeostasis, 367–368, 370 This Time Is Different (Reinhart and Rogoff), 310 Thompson, Robert, 1, 81–82, 83, 103–104 three-body problem, 214 ticker tape machine, 356 tight coupling, 321, 322, 361, 372 Tiger Fund, 234 Tinker, Grant, 395 Tobin tax, 245 Tokugawa era, 17 Tooby, John, 173, 174 tool use, 150–151, 153, 162, 165 “toxic assets,” 299 trade execution, 257, 356 trade secrets, 284–285, 384 trading volume, 257, 359 transactions tax, 245 Treynor, Jack, 263 trial and error, 133, 141, 142, 182, 183, 188, 198, 265 Triangle Shirtwaist Fire, 378–379 tribbles, 190–205, 216 Trivers, Robert, 172 trolley dilemma, 339 Trusty, Jessica, 120 Tversky, Amos, 55, 58, 66–67, 68–69, 70–71, 90, 106, 113, 388 TWA Flight 800, 84–85 twins, 159, 161, 348 “two-legged goat effect,” 155 UBS, 61 Ultimatum Game, 336–338 uncertainty, 212, 218; risk vs., 53–55, 415 unemployment, 36–37 unintended consequences, 7, 248, 269, 330, 358, 375 United Kingdom, 222–223, 242, 377 University of Chicago, 22 uptick rule, 233 Urbach-Wiethe disease, 82–83 U.S.


pages: 654 words: 191,864

Thinking, Fast and Slow by Daniel Kahneman

Albert Einstein, Atul Gawande, availability heuristic, Bayesian statistics, behavioural economics, Black Swan, book value, Cass Sunstein, Checklist Manifesto, choice architecture, classic study, cognitive bias, cognitive load, complexity theory, correlation coefficient, correlation does not imply causation, Daniel Kahneman / Amos Tversky, delayed gratification, demand response, endowment effect, experimental economics, experimental subject, Exxon Valdez, feminist movement, framing effect, hedonic treadmill, hindsight bias, index card, information asymmetry, job satisfaction, John Bogle, John von Neumann, Kenneth Arrow, libertarian paternalism, Linda problem, loss aversion, medical residency, mental accounting, meta-analysis, nudge unit, pattern recognition, Paul Samuelson, peak-end rule, precautionary principle, pre–internet, price anchoring, quantitative trading / quantitative finance, random walk, Richard Thaler, risk tolerance, Robert Metcalfe, Ronald Reagan, Shai Danziger, sunk-cost fallacy, Supply of New York City Cabdrivers, systematic bias, TED Talk, The Chicago School, The Wisdom of Crowds, Thomas Bayes, transaction costs, union organizing, Walter Mischel, Yom Kippur War

Some individuals greatly overestimate the true number, others underestimate it, but when many judgments are averaged, the average tends to be quite accurate. The mechanism is straightforward: all individuals look at the same jar, and all their judgments have a common basis. On the other hand, the errors that individuals make are independent of the errors made by others, and (in the absence of a systematic bias) they tend to average to zero. However, the magic of error reduction works well only when the observations are independent and their errors uncorrelated. If the observers share a bias, the aggregation of judgments will not reduce it. Allowing the observers to influence each other effectively reduces the size of the sample, and with it the precision of the group estimate.
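Kahneman's distinction between independent and shared errors can be illustrated with a small simulation (the jar size, noise level, and 20% shared bias below are hypothetical numbers, not from the book): averaging cancels independent noise but leaves a common bias fully intact.

```python
import random

random.seed(0)

TRUE_VALUE = 500  # the true number of coins in the jar (hypothetical)
N_JUDGES = 10_000

# Independent, unbiased judges: noisy individually, accurate on average.
unbiased = [TRUE_VALUE + random.gauss(0, 100) for _ in range(N_JUDGES)]

# Judges who share a systematic bias: everyone underestimates by 20%
# before adding their own independent noise.
biased = [0.8 * TRUE_VALUE + random.gauss(0, 100) for _ in range(N_JUDGES)]

avg_unbiased = sum(unbiased) / N_JUDGES
avg_biased = sum(biased) / N_JUDGES

print(f"average of unbiased judges: {avg_unbiased:.0f}")  # close to 500
print(f"average of biased judges:   {avg_biased:.0f}")    # close to 400
```

No number of additional judges fixes the second average: aggregation reduces variance, not bias.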


pages: 1,237 words: 227,370

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems by Martin Kleppmann

active measures, Amazon Web Services, billion-dollar mistake, bitcoin, blockchain, business intelligence, business logic, business process, c2.com, cloud computing, collaborative editing, commoditize, conceptual framework, cryptocurrency, data science, database schema, deep learning, DevOps, distributed ledger, Donald Knuth, Edward Snowden, end-to-end encryption, Ethereum, ethereum blockchain, exponential backoff, fake news, fault tolerance, finite state, Flash crash, Free Software Foundation, full text search, functional programming, general-purpose programming language, Hacker News, informal economy, information retrieval, Infrastructure as a Service, Internet of things, iterative process, John von Neumann, Ken Thompson, Kubernetes, Large Hadron Collider, level 1 cache, loose coupling, machine readable, machine translation, Marc Andreessen, microservices, natural language processing, Network effects, no silver bullet, operational security, packet switching, peer-to-peer, performance metric, place-making, premature optimization, recommendation engine, Richard Feynman, self-driving car, semantic web, Shoshana Zuboff, social graph, social web, software as a service, software is eating the world, sorting algorithm, source of truth, SPARQL, speech recognition, SQL injection, statistical model, surveillance capitalism, systematic bias, systems thinking, Tragedy of the Commons, undersea cable, web application, WebSocket, wikimedia commons

When we develop predictive analytics systems, we are not merely automating a human’s decision by using software to specify the rules for when to say yes or no; we are even leaving the rules themselves to be inferred from data. However, the patterns learned by these systems are opaque: even if there is some correlation in the data, we may not know why. If there is a systematic bias in the input to an algorithm, the system will most likely learn and amplify that bias in its output [84]. In many countries, anti-discrimination laws prohibit treating people differently depending on protected traits such as ethnicity, age, gender, sexuality, disability, or beliefs. Other features of a person’s data may be analyzed, but what happens if they are correlated with protected traits?
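A toy sketch of the mechanism Kleppmann describes (entirely hypothetical data and a deliberately trivial "model", not an example from the book): when the historical decisions used as training labels were biased against one group, a system that learns from them reproduces that bias, even though it was never told about the protected trait itself.

```python
import random
from collections import defaultdict

random.seed(1)

# Hypothetical loan records: 'district' stands in for a feature correlated
# with a protected trait; the historical approvals (the training labels)
# were biased against district B applicants.
def make_record():
    district = random.choice(["A", "B"])
    qualified = random.random() < 0.5  # true creditworthiness
    approved = qualified and (district == "A" or random.random() < 0.3)
    return district, qualified, approved

train = [make_record() for _ in range(50_000)]

# A trivial "model": approve whenever the historical approval rate for the
# applicant's (district, qualified) cell exceeds 50%.
cells = defaultdict(lambda: [0, 0])
for district, qualified, approved in train:
    cells[(district, qualified)][0] += approved
    cells[(district, qualified)][1] += 1

def predict(district, qualified):
    approvals, total = cells[(district, qualified)]
    return approvals / total > 0.5

# Two equally qualified applicants get different decisions by district alone.
print(predict("A", True), predict("B", True))
```

The model never saw the protected trait, only its proxy, which is exactly the correlated-features problem raised in the excerpt.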


pages: 1,351 words: 404,177

Nixonland: The Rise of a President and the Fracturing of America by Rick Perlstein

Aaron Swartz, affirmative action, Alistair Cooke, Alvin Toffler, American ideology, Apollo 11, Apollo 13, Bay Area Rapid Transit, Berlin Wall, Bretton Woods, cognitive dissonance, company town, cuban missile crisis, delayed gratification, desegregation, Dr. Strangelove, East Village, European colonialism, false flag, full employment, Future Shock, Golden Gate Park, guns versus butter model, Haight Ashbury, Herbert Marcuse, immigration reform, In Cold Blood by Truman Capote, index card, indoor plumbing, Joan Didion, Kitchen Debate, liberal capitalism, Mahatma Gandhi, Marshall McLuhan, military-industrial complex, Monroe Doctrine, moral panic, Neil Armstrong, New Urbanism, Norman Mailer, Own Your Own Home, Paul Samuelson, plutocrats, price mechanism, Ralph Nader, RAND corporation, rolodex, Ronald Reagan, sexual politics, Seymour Hersh, systematic bias, the medium is the message, traveling salesman, upwardly mobile, urban planning, urban renewal, W. E. B. Du Bois, walking around money, War on Poverty, white picket fence, Whole Earth Catalog

… “The answer, I think, is that Mayor Daley and his supporters have a point. Most of us in what is called the communication field are not rooted in the great mass of ordinary Americans—in Middle America. And the results show up not merely in occasional episodes such as the Chicago violence but more importantly in the systematic bias toward young people, minority groups, and the kind of presidential candidate who appeals to them. “To get a feel of this bias it is first necessary to understand the antagonism that divides the middle class of this country. On the one hand there are highly educated upper-income whites sure of themselves and brimming with ideas for doing things differently.


Principles of Corporate Finance by Richard A. Brealey, Stewart C. Myers, Franklin Allen

3Com Palm IPO, accelerated depreciation, accounting loophole / creative accounting, Airbus A320, Alan Greenspan, AOL-Time Warner, Asian financial crisis, asset allocation, asset-backed security, banking crisis, Bear Stearns, Bernie Madoff, big-box store, Black Monday: stock market crash in 1987, Black-Scholes formula, Boeing 747, book value, break the buck, Brownian motion, business cycle, buy and hold, buy low sell high, California energy crisis, capital asset pricing model, capital controls, Carl Icahn, Carmen Reinhart, carried interest, collateralized debt obligation, compound rate of return, computerized trading, conceptual framework, corporate governance, correlation coefficient, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-border payments, cross-subsidies, currency risk, discounted cash flows, disintermediation, diversified portfolio, Dutch auction, equity premium, equity risk premium, eurozone crisis, fear index, financial engineering, financial innovation, financial intermediation, fixed income, frictionless, fudge factor, German hyperinflation, implied volatility, index fund, information asymmetry, intangible asset, interest rate swap, inventory management, Iridium satellite, James Webb Space Telescope, junk bonds, Kenneth Rogoff, Larry Ellison, law of one price, linear programming, Livingstone, I presume, London Interbank Offered Rate, Long Term Capital Management, loss aversion, Louis Bachelier, low interest rates, market bubble, market friction, money market fund, moral hazard, Myron Scholes, new economy, Nick Leeson, Northern Rock, offshore financial centre, PalmPilot, Ponzi scheme, prediction markets, price discrimination, principal–agent problem, profit maximization, purchasing power parity, QR code, quantitative trading / quantitative finance, random walk, Real Time Gross Settlement, risk free rate, risk tolerance, risk/return, Robert Shiller, Scaled Composites, shareholder value, Sharpe ratio, 
short selling, short squeeze, Silicon Valley, Skype, SpaceShipOne, Steve Jobs, subprime mortgage crisis, sunk-cost fallacy, systematic bias, Tax Reform Act of 1986, The Nature of the Firm, the payments system, the rule of 72, time value of money, too big to fail, transaction costs, University of East Anglia, urban renewal, VA Linux, value at risk, Vanguard fund, vertical integration, yield curve, zero-coupon bond, zero-sum game, Zipcar

The investor may not stop to reflect on how little one can learn about expected returns from three years’ experience. Most individuals are also too conservative, that is, too slow to update their beliefs in the face of new evidence. People tend to update their beliefs in the correct direction but the magnitude of the change is less than rationality would require. Another systematic bias is overconfidence. For example, an American small business has just a 35% chance of surviving for five years. Yet the great majority of entrepreneurs think that they have a better than 70% chance of success.22 Similarly, most investors think they are better-than-average stock pickers. Two speculators who trade with each other cannot both make money, but nevertheless they may be prepared to continue trading because each is confident that the other is the patsy.
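The conservatism the authors describe, updating in the right direction but by too little, can be made concrete with odds arithmetic (the halfway damping factor below is an illustrative assumption, not from the text):

```python
def to_prob(odds):
    """Convert odds in favour of a hypothesis to a probability."""
    return odds / (1 + odds)

prior_odds = 1.0        # P(hypothesis) = 0.5 before seeing the evidence
likelihood_ratio = 4.0  # new evidence favours the hypothesis 4:1

# A rational (Bayesian) update multiplies the odds by the likelihood ratio.
bayes_odds = prior_odds * likelihood_ratio

# Conservatism: move in the right direction, but only partway
# (here, halfway in log-odds; the damping factor is a stylized assumption).
conservative_odds = prior_odds * likelihood_ratio ** 0.5

p_prior = to_prob(prior_odds)
p_bayes = to_prob(bayes_odds)
p_conservative = to_prob(conservative_odds)

print(f"prior {p_prior:.2f} -> Bayes {p_bayes:.2f}, conservative {p_conservative:.2f}")
```

The conservative believer ends up between the prior and the rational posterior: the direction is right, the magnitude too small.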