The Design of Experiments

14 results


pages: 219 words: 74,775

Liquid: The Delightful and Dangerous Substances That Flow Through Our Lives by Mark Miodownik

3D printing, airport security, clean water, Ignaz Semmelweis: hand washing, mass immigration, megacity, The Design of Experiments

But, for others, the very notion of pouring the milk in first is anathema. In their perfect cup of tea, you put the tea in first, and then the milk. George Orwell was in this camp, arguing that this allows you to add exactly the right amount of milk for your preferred level of creaminess. You might doubt whether adding the milk before or after the tea makes any difference to the taste – the distinction being so subtle. But in his book The Design of Experiments, Ronald Fisher investigated this question rigorously, inventing new statistical methods to do so. In his randomized tasting experiments, he found that, yes, people can taste the difference between adding the milk before or after the tea. Fisher’s methods revolutionized the mathematical discipline of statistics. Unfortunately, they did not revolutionize tea making in Britain: even now, if you order a cup of tea in a café, you will rarely find any acknowledgement that the order of milk and tea makes a difference.

Perhaps it will no longer involve the props of life jackets, oxygen masks and seatbelts – but we will always need ceremonies to celebrate the dangerous and delightful power of liquids.

Further Reading

Ball, Philip, Bright Earth: Art and the Invention of Colour, Vintage Books (2001)
Faraday, Michael, The Chemical History of a Candle, Oxford University Press (2011)
Fisher, Ronald, The Design of Experiments, Oliver and Boyd (1951)
Jha, Alok, The Water Book, Headline (2016)
Melville, Herman, Moby-Dick, Penguin Books (2001)
Mitov, Michel, Sensitive Matter: Foams, Gels, Liquid Crystals, and Other Miracles, Harvard University Press (2012)
Pretor-Pinney, Gavin, The Cloudspotter’s Guide, Sceptre (2007)
Roach, Mary, Gulp: Adventures on the Alimentary Canal, Oneworld (2013)
Rogers, Adam, Proof: The Science of Booze, Mariner Books (2015)
Salsburg, David, The Lady Tasting Tea: How Statistics Revolutionized Science in the Twentieth Century, Holt McDougal (2012)
Spence, Charles, and Betina Piqueras-Fiszman, The Perfect Meal: The Multisensory Science of Food and Dining, Wiley–Blackwell (2014)
Standage, Tom, A History of the World in Six Glasses, Walker (2005)
Vanhoenacker, Mark, Skyfaring: A Journey with a Pilot, Chatto & Windus (2015)


The Book of Why: The New Science of Cause and Effect by Judea Pearl, Dana Mackenzie

affirmative action, Albert Einstein, Asilomar, Bayesian statistics, computer age, computer vision, correlation coefficient, correlation does not imply causation, Daniel Kahneman / Amos Tversky, Edmond Halley, Elon Musk, en.wikipedia.org, experimental subject, Isaac Newton, iterative process, John Snow's cholera map, Loebner Prize, loose coupling, Louis Pasteur, Menlo Park, pattern recognition, Paul Erdős, personalized medicine, Pierre-Simon Laplace, placebo effect, prisoner's dilemma, probability theory / Blaise Pascal / Pierre de Fermat, randomized controlled trial, selection bias, self-driving car, Silicon Valley, speech recognition, statistical model, Stephen Hawking, Steve Jobs, strong AI, The Design of Experiments, the scientific method, Thomas Bayes, Turing test

Statisticians very often control for proxies when the actual causal variable can’t be measured; for instance, party affiliation might be used as a proxy for political beliefs. Because Z isn’t a perfect measure of M, some of the influence of X on Y might “leak through” if you control for Z. Nevertheless, controlling for Z is still a mistake. While the bias might be less than if you controlled for M, it is still there. For this reason later statisticians, notably David Cox in his textbook The Design of Experiments (1958), warned that you should only control for Z if you have a “strong prior reason” to believe that it is not affected by X. This “strong prior reason” is nothing more or less than a causal assumption. He adds, “Such hypotheses may be perfectly in order, but the scientist should always be aware when they are being appealed to.” Remember that it’s 1958, in the midst of the great prohibition on causality.

., on, 87–88; reduction of, 85; in science, 6, 84–85. See also Big Data
David, Richard, 187
Dawid, Phillip, 237, 350
de Fermat, Pierre, 4–5
de Moivre, Abraham, 5
death, proximate cause of, 288
decision problem, 238–239
decoding, 125–126, 127 (fig.), 128
deconfounders, 139–140; back-door paths for, 158–159; in intervention, 220
deconfounding games, 159–165
deduction, induction and, 93
deep learning, 3, 30, 359, 362
Democritus, 34
The Design of Experiments (Cox), 154
developmental factors, of guinea pigs, 74–76, 75 (fig.)
Dewar, James, 53
Diaconis, Persi, 196
difference, in coefficients, 327
direct effect, 297, 300–301, 317–318; in causal diagram, 320–321; of intervention, 323–324; in mediation formula, 333; mediators and, 326, 332. See also indirect effects; natural direct effect
The Direction of Time (Reichenbach), 199
discrimination, 311–312, 315–316
DNA test, 94–95, 122, 123 (fig.), 124, 342
do-calculus, 241–242; backdoor criterion in, 234; completeness of, 243–244; decision problem in, 238–239; elimination procedure in, 231–232; front-door adjustment in, 235–237, 236 (fig.)


pages: 239 words: 70,206

Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else by Steve Lohr

"Robert Solow", 23andMe, Affordable Care Act / Obamacare, Albert Einstein, big data - Walmart - Pop Tarts, bioinformatics, business cycle, business intelligence, call centre, cloud computing, computer age, conceptual framework, Credit Default Swap, crowdsourcing, Daniel Kahneman / Amos Tversky, Danny Hillis, data is the new oil, David Brooks, East Village, Edward Snowden, Emanuel Derman, Erik Brynjolfsson, everywhere but in the productivity statistics, Frederick Winslow Taylor, Google Glasses, impulse control, income inequality, indoor plumbing, industrial robot, informal economy, Internet of things, invention of writing, Johannes Kepler, John Markoff, John von Neumann, lifelogging, Mark Zuckerberg, market bubble, meta analysis, meta-analysis, money market fund, natural language processing, obamacare, pattern recognition, payday loans, personalized medicine, precision agriculture, pre–internet, Productivity paradox, RAND corporation, rising living standards, Robert Gordon, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley startup, six sigma, skunkworks, speech recognition, statistical model, Steve Jobs, Steven Levy, The Design of Experiments, the scientific method, Thomas Kuhn: the structure of scientific revolutions, unbanked and underbanked, underbanked, Von Neumann architecture, Watson beat the top human players on Jeopardy!

As soon as he accepted the job at Facebook, Hammerbacher began poring through technical papers and books that provided clues to the evolution of data science. In the spring of 2012, he taught a course in data science at the University of California at Berkeley. His first talk included a brisk yet comprehensive tour of the pertinent literature. His survey stretched from the first half of the twentieth century and the English statistician and biologist Ronald A. Fisher, who did pioneering work on the design of experiments using agricultural and gene data; to Hans Peter Luhn, the IBM scientist whose paper in the late 1950s imagined a computerized “business intelligence” system for mining information to improve business decisions; to contemporary computer scientists who have authored important works on data and discovery—Jim Gray, Tom Mitchell, Randy Bryant, and Peter Norvig. For each, Hammerbacher cited that person’s contribution, and what he had learned from reading each one’s books or papers.


pages: 360 words: 85,321

The Perfect Bet: How Science and Math Are Taking the Luck Out of Gambling by Adam Kucharski

Ada Lovelace, Albert Einstein, Antoine Gombaud: Chevalier de Méré, beat the dealer, Benoit Mandelbrot, butterfly effect, call centre, Chance favours the prepared mind, Claude Shannon: information theory, collateralized debt obligation, correlation does not imply causation, diversification, Edward Lorenz: Chaos theory, Edward Thorp, Everything should be made as simple as possible, Flash crash, Gerolamo Cardano, Henri Poincaré, Hibernia Atlantic: Project Express, if you build it, they will come, invention of the telegraph, Isaac Newton, Johannes Kepler, John Nash: game theory, John von Neumann, locking in a profit, Louis Pasteur, Nash equilibrium, Norbert Wiener, p-value, performance metric, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Richard Feynman, Ronald Reagan, Rubik’s Cube, statistical model, The Design of Experiments, Watson beat the top human players on Jeopardy!, zero-sum game

But if we distribute them by picking locations at random, there is a chance that we will repeatedly pick similar locations – in which case a treatment ends up concentrated in only one area, and we have a pretty lousy experiment. Suppose we want to test four treatments across sixteen trial sites, arranged in a four-by-four grid. How can we scatter the treatments across the area without risking all of them ending up in the same place? In his landmark book The Design of Experiments, Fisher suggested that the four treatments be distributed so that each appears in every row and column exactly once. If the field had good soil at one end and poor land at the other, all treatments would therefore be exposed to both conditions. As it happened, the pattern Fisher proposed had already found popularity elsewhere. It was common in classical architecture, where it was known as a Latin square, as shown in Figure 2.1.
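The constraint – each treatment exactly once in every row and every column – can be sketched in a few lines of Python. The cyclic construction below is one standard way to build such a square; it is an illustration, not necessarily the particular layout Fisher drew.

```python
# Four treatments scattered over a four-by-four grid as a Latin square.
treatments = ["A", "B", "C", "D"]
n = len(treatments)

# Cyclic construction: row i is the treatment list rotated left by i.
square = [[treatments[(i + j) % n] for j in range(n)] for i in range(n)]

for row in square:
    print(" ".join(row))

# Verify the Latin-square property: every row and every column
# contains each treatment exactly once.
assert all(set(row) == set(treatments) for row in square)
assert all({square[i][j] for i in range(n)} == set(treatments)
           for j in range(n))
```

With this layout, any gradient running across the rows or down the columns of the field touches all four treatments equally.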


pages: 315 words: 93,628

Is God a Mathematician? by Mario Livio

Albert Einstein, Antoine Gombaud: Chevalier de Méré, Brownian motion, cellular automata, correlation coefficient, correlation does not imply causation, cosmological constant, Dava Sobel, double helix, Edmond Halley, Eratosthenes, Georg Cantor, Gerolamo Cardano, Gödel, Escher, Bach, Henri Poincaré, Isaac Newton, Johannes Kepler, John von Neumann, music of the spheres, Myron Scholes, probability theory / Blaise Pascal / Pierre de Fermat, Russell's paradox, Thales of Miletus, The Design of Experiments, the scientific method, traveling salesman

The person who brought probability: For an excellent biography, see Orel 1996.
Mendel published his paper: Mendel 1865. An English translation can be found on the Web page created by R. B. Blumberg at http://www.mendelweb.org.
While some questions related to the accuracy: See Fisher 1936, for example.
the influential British statistician: For a brief description of some of his work see Tabak 2004. Fisher wrote an extremely original, nontechnical article about the design of experiments entitled “Mathematics of a Lady Tasting Tea” (see Fisher 1956).
in his book Ars Conjectandi: For a superb translation see Bernoulli 1713b.
He then proceeded to explain: Reprinted in Newman 1956.
Shaw once wrote an insightful article: The article “The Vice of Gambling and the Virtue of Insurance” appears in Newman 1956.
In a pamphlet entitled The Analyst: The pamphlet was written by George Berkeley in 1734.


pages: 442 words: 94,734

The Art of Statistics: Learning From Data by David Spiegelhalter

Antoine Gombaud: Chevalier de Méré, Bayesian statistics, Carmen Reinhart, complexity theory, computer vision, correlation coefficient, correlation does not imply causation, dark matter, Edmond Halley, Estimating the Reproducibility of Psychological Science, Hans Rosling, Kenneth Rogoff, meta analysis, meta-analysis, Nate Silver, Netflix Prize, p-value, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, randomized controlled trial, recommendation engine, replication crisis, self-driving car, speech recognition, statistical model, The Design of Experiments, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus

Margins of error for homicide statistics: https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/compendium/focusonviolentcrimeandsexualoffences/yearendingmarch2016/homicide#statistical-interpretation-of-trends-in-homicides.

CHAPTER 10: ANSWERING QUESTIONS AND CLAIMING DISCOVERIES

1. J. Arbuthnot, ‘An Argument for Divine Providence …’, Philosophical Transactions 27 (1710), 186–90.
2. R. A. Fisher, The Design of Experiments (Oliver and Boyd, 1935), p. 19.
3. There are 54 × 53 × 52 … × 2 × 1 permutations, which is termed ‘54 factorial’ and denoted 54!. This is roughly 2, with 71 zeros following it. Note that the number of possible ways a deck of 52 cards can be dealt is 52!, and so even if we dealt a million million hands a second, the number of years it would take to work through all possible permutations has 48 zeros after it, whereas the age of the universe is only 14,000,000,000 years.
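The arithmetic in note 3 is easy to check directly. The sketch below assumes ‘a million million’ means 10^12 hands per second and a 365.25-day year; these assumptions are mine, not the book’s.

```python
import math

# 54! is "roughly 2, with 71 zeros following it": 72 digits, leading 2.
perms_54 = math.factorial(54)
print(len(str(perms_54)))   # 72
print(str(perms_54)[:2])    # 23

# Years needed to deal all 52! orderings of a deck at 1e12 hands/second.
perms_52 = math.factorial(52)
seconds = perms_52 / 1e12
years = seconds / (60 * 60 * 24 * 365.25)
print(f"{years:.1e}")       # about 2.6e48 -- a number with 48 zeros
```

Both figures in the note check out: 54! ≈ 2.3 × 10^71, and the dealing time is on the order of 10^48 years.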


pages: 314 words: 91,652

The Structure of Scientific Revolutions by Thomas S. Kuhn, Ian Hacking

Albert Einstein, Arthur Eddington, business cycle, cuban missile crisis, experimental subject, Isaac Newton, The Design of Experiments, Thomas Kuhn: the structure of scientific revolutions

If the vacuum had a heat capacity, for example, heating by compression could be explained as the result of mixing gas with void. Or it might be due to a change in the specific heat of gases with changing pressure. And there were several other explanations besides. Many experiments were undertaken to elaborate these various possibilities and to distinguish between them; all these experiments arose from the caloric theory as paradigm, and all exploited it in the design of experiments and in the interpretation of results.8 Once the phenomenon of heating by compression had been established, all further experiments in the area were paradigm-dependent in this way. Given the phenomenon, how else could an experiment to elucidate it have been chosen? Turn now to the theoretical problems of normal science, which fall into very nearly the same classes as the experimental and observational.


pages: 550 words: 89,316

The Sum of Small Things: A Theory of the Aspirational Class by Elizabeth Currid-Halkett

assortative mating, back-to-the-land, barriers to entry, Bernie Sanders, BRICs, Capital in the Twenty-First Century by Thomas Piketty, clean water, cognitive dissonance, David Brooks, deindustrialization, Deng Xiaoping, discrete time, disruptive innovation, Downton Abbey, East Village, Edward Glaeser, en.wikipedia.org, Etonian, Geoffrey West, Santa Fe Institute, income inequality, iterative process, knowledge economy, longitudinal study, Mason jar, means of production, NetJets, new economy, New Urbanism, plutocrats, Plutocrats, post scarcity, post-industrial society, profit maximization, Richard Florida, selection bias, Silicon Valley, The Design of Experiments, the High Line, The inhabitant of London could order by telephone, sipping his morning tea in bed, the various products of the whole earth, the market place, Thorstein Veblen, Tony Hsieh, Tyler Cowen: Great Stagnation, upwardly mobile, Veblen good, women in the workforce

On this particular occasion, the host poured Bristol a cup of tea and poured in the milk thereafter. Bristol protested, explaining that she liked her “milk in first,” as the tea tasted better that way. Despite skeptical resistance from those in attendance, Bristol insisted she could tell the difference. Ronald Aylmer Fisher, one of those present, who would later be knighted and become the godfather of modern empirical statistics with his famous book The Design of Experiments, had an idea. Surely, if eight cups of tea were poured, four with “milk in first” and the other four with tea in first, and the lady identified them all correctly, then she would be proven right (the chance of her doing so by guessing alone would be 1 in 70). Fisher, like everyone else present, believed Bristol would likely fail the test. In other words, they believed Bristol’s belief in her tea acumen was embedded in a false sense of aesthetics and taste rather than reality.
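The 1-in-70 figure follows from counting combinations: a guesser who knows that four of the eight cups are milk-first must choose which four, and only one of the possible selections is entirely correct. A quick check:

```python
import math

# Number of ways to pick which 4 of 8 cups are "milk in first".
ways = math.comb(8, 4)
print(ways)        # 70

# Chance of a completely correct identification by pure guessing.
print(1 / ways)    # about 0.0143, i.e. 1 in 70
```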


pages: 357 words: 110,072

Trick or Treatment: The Undeniable Facts About Alternative Medicine by Edzard Ernst, Simon Singh

animal electricity, Barry Marshall: ulcers, Berlin Wall, correlation does not imply causation, false memory syndrome, Florence Nightingale: pie chart, germ theory of disease, John Snow's cholera map, Louis Pasteur, meta analysis, meta-analysis, placebo effect, profit motive, publication bias, randomized controlled trial, Ronald Reagan, Simon Singh, The Design of Experiments, the scientific method

The trial had shown that there was a difference, that the woman was right and that the scientists were wrong. In fact, there is a good scientific reason why the two forms of tea should taste different. Milk added to hot tea leads to a less satisfying cup, because the milk becomes superheated and this causes proteins in the milk to deteriorate – these proteins then taste slightly sour. Fisher used this simple example as the basis for an entire book on scientific testing, The Design of Experiments, which went into great detail about the subtleties of trials. Despite its sheer simplicity and powerful ability to get to the truth, some alternative therapists argue that the clinical trial is a harsh test, which is somehow biased against their treatments. But that sort of attitude betrays a skewed understanding of the clinical trial, which merely seeks to establish the truth, regardless of the type of treatment being examined.


pages: 385 words: 101,761

Creative Intelligence: Harnessing the Power to Create, Connect, and Inspire by Bruce Nussbaum

3D printing, Airbnb, Albert Einstein, Berlin Wall, Black Swan, Chuck Templeton: OpenTable:, clean water, collapse of Lehman Brothers, creative destruction, Credit Default Swap, crony capitalism, crowdsourcing, Danny Hillis, declining real wages, demographic dividend, disruptive innovation, Elon Musk, en.wikipedia.org, Eugene Fama: efficient market hypothesis, Fall of the Berlin Wall, follow your passion, game design, housing crisis, Hyman Minsky, industrial robot, invisible hand, James Dyson, Jane Jacobs, Jeff Bezos, jimmy wales, John Gruber, John Markoff, Joseph Schumpeter, Kickstarter, lone genius, longitudinal study, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Martin Wolf, new economy, Paul Graham, Peter Thiel, QR code, race to the bottom, reshoring, Richard Florida, Ronald Reagan, shareholder value, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, six sigma, Skype, Steve Ballmer, Steve Jobs, Steve Wozniak, supply-chain management, Tesla Model S, The Chicago School, The Design of Experiments, the High Line, The Myth of the Rational Market, thinkpad, Tim Cook: Apple, too big to fail, tulip mania, We are the 99%, Y Combinator, young professional, Zipcar

David Kelley, Tim Brown, and Bill Moggridge have been key tutors in my learning about design, innovation, and creativity. David is not only a cofounder of IDEO; he founded the extraordinary d.school at Stanford. He and Tim are on the cover of what is probably my most impactful cover story, “The Power of Design.” David, Tim, and Bill believed that design should go beyond the physical to include the design of experiences, services, and even social systems, such as health and education. They codified that approach into the concept of design thinking, the scaffolding upon which Creative Intelligence is built. When Bill Moggridge took over the leadership of the Cooper Hewitt National Design Museum, I got to know the humor and grace of the man who designed one of the first laptop computers and fathered the field of interaction design.


pages: 523 words: 111,615

The Economics of Enough: How to Run the Economy as if the Future Matters by Diane Coyle

"Robert Solow", accounting loophole / creative accounting, affirmative action, bank run, banking crisis, Berlin Wall, bonus culture, Branko Milanovic, BRICs, business cycle, call centre, Cass Sunstein, central bank independence, collapse of Lehman Brothers, conceptual framework, corporate governance, correlation does not imply causation, Credit Default Swap, deindustrialization, demographic transition, Diane Coyle, different worldview, disintermediation, Edward Glaeser, endogenous growth, Eugene Fama: efficient market hypothesis, experimental economics, Fall of the Berlin Wall, Financial Instability Hypothesis, Francis Fukuyama: the end of history, George Akerlof, Gini coefficient, global supply chain, Gordon Gekko, greed is good, happiness index / gross national happiness, hedonic treadmill, Hyman Minsky, If something cannot go on forever, it will stop - Herbert Stein's Law, illegal immigration, income inequality, income per capita, industrial cluster, information asymmetry, intangible asset, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Jane Jacobs, Joseph Schumpeter, Kenneth Arrow, Kenneth Rogoff, knowledge economy, light touch regulation, low skilled workers, market bubble, market design, market fundamentalism, megacity, Network effects, new economy, night-watchman state, Northern Rock, oil shock, Pareto efficiency, principal–agent problem, profit motive, purchasing power parity, railway mania, rising living standards, Ronald Reagan, selective serotonin reuptake inhibitor (SSRI), Silicon Valley, South Sea Bubble, Steven Pinker, The Design of Experiments, The Fortune at the Bottom of the Pyramid, The Market for Lemons, The Myth of the Rational Market, The Spirit Level, transaction costs, transfer pricing, tulip mania, ultimatum game, University of East Anglia, web application, web of trust, winner-take-all economy, World Values Survey, zero-sum game

Behavioral economics has become so fashionable partly because the experimental results are fascinating but also partly because so many of its new aficionados are delighted that it seems to overturn a key assumption in economics. Their delight is misplaced. For one thing, other experimental evidence indicates that humans behave selfishly in other contexts. In some experiments, economists have shown that markets operate exactly as conventional economic models based on rational self-interest would predict.3 For another thing, subtle changes in the design of experiments can change the outcomes dramatically, as economist John List has documented.4 List cautions against drawing hard and fast conclusions about human nature from the results available to us now: “A first lesson that I take from this body of research is that what we do not know dwarfs what we know.”5 Having admired this modesty in an economist, however, there is a good amount of experimental evidence from a wide range of contexts that people do have an innate sense of fairness.


Enriching the Earth: Fritz Haber, Carl Bosch, and the Transformation of World Food Production by Vaclav Smil

agricultural Revolution, Albert Einstein, demographic transition, Deng Xiaoping, Haber-Bosch Process, invention of gunpowder, Louis Pasteur, Pearl River Delta, precision agriculture, recommendation engine, The Design of Experiments

Even those mixtures that contained some ammonium sulfate had too little nitrogen to become commercially successful, and the enterprise, launched jointly with Sheridan Muspratt (1793–1886), an alkali manufacturer in Liverpool, failed after only about three years.44 Liebig’s incorrect but aggressively held opinion that plants receive all of their nitrogen from the atmospheric deposition and benefit only from mineral fertilizer had a very beneficial effect: it inspired a series of field trials that still continue as the world’s longest-running crop experiment. In 1843 John Bennet Lawes (fig. 1.5), an Eton- and Oxford-educated landowner, began experimenting with unfertilized and variously fertilized crops on Broadbalk field in Rothamsted, a family estate in Hertfordshire he inherited in 1834. In the same year he invited Joseph Henry Gilbert, a young chemist with a doctorate from Liebig’s laboratory, to help with the design of experiments and to perform the necessary chemical analyses, a cooperative effort that ended only with Lawes’s death.45
Figure 1.5 John Bennet Lawes (1814–1900). Courtesy of Rothamsted Experimental Station, Harpenden, Hertfordshire, England.
In order to test the validity of Liebig’s mineral theory, Lawes and Gilbert began continuous cultivation of wheat on plots receiving either no fertilizer or the following combinations of nutrients: minerals only (P, K, Na, Mg), minerals with nitrogen, or farmyard manure.


pages: 564 words: 163,106

The Rise and Fall of Modern Medicine by James Le Fanu, M.D.

Barry Marshall: ulcers, clean water, cuban missile crisis, discovery of penicillin, double helix, experimental subject, Gary Taubes, Isaac Newton, lateral thinking, meta analysis, meta-analysis, rising living standards, selective serotonin reuptake inhibitor (SSRI), stem cell, telerobotics, The Design of Experiments, the scientific method, V2 rocket

She refused, remarking that she preferred milk to be in the cup before the tea was added. Fisher could not believe that there would be any difference in the taste, and when the woman suggested an experiment be performed, he was enthusiastic. An immediate trial was organised, and the woman confidently identified more than enough of the cups of tea into which the tea had been poured first to prove her case. In his classic book The Design of Experiments, published in 1935, Fisher used this example ‘to state the terms of the experiment minutely and distinctly; predicted all possible results, ascertaining by sensible reasoning, what probability should be assigned to each possible result under the assumption that the woman was guessing’.5 Thus Greenwood’s main intellectual legacy, which he was to pass on to Bradford Hill, was essentially two-fold: the historical contribution of statistical methods to elucidating the cause of substantial public health problems; and the importance of conducting properly designed experiments to test whether a new treatment was effective.
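The ‘probability assigned to each possible result under the assumption that the woman was guessing’ can be sketched as a hypergeometric calculation: if she must name four of the eight cups as milk-first, the number she gets right is determined by which four she picks. This is an illustration of the reasoning, not Fisher’s own notation.

```python
import math
from fractions import Fraction

# 70 equally likely ways to pick 4 cups out of 8 as "milk first".
total = math.comb(8, 4)

# k = number of the four true milk-first cups she correctly names.
for k in range(5):
    ways = math.comb(4, k) * math.comb(4, 4 - k)
    print(k, Fraction(ways, total))
```

Only 1 selection in 70 is perfectly correct, which is why a complete success in the trial is strong evidence against pure guessing.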


Statistics in a Nutshell by Sarah Boslaugh

Antoine Gombaud: Chevalier de Méré, Bayesian statistics, business climate, computer age, correlation coefficient, experimental subject, Florence Nightingale: pie chart, income per capita, iterative process, job satisfaction, labor-force participation, linear programming, longitudinal study, meta analysis, meta-analysis, p-value, pattern recognition, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, purchasing power parity, randomized controlled trial, selection bias, six sigma, statistical model, The Design of Experiments, the scientific method, Thomas Bayes, Vilfredo Pareto

Chapter 18

Christensen, Larry B. 2006. Experimental Methodology, 10th ed. Boston: Allyn & Bacon. This is a very readable and comprehensive introduction to research and experimental design with a focus on educational and psychological topics.
Fisher, R.A. 1990. Statistical Methods, Experimental Design, and Scientific Inference: A Re-issue of Statistical Methods for Research Workers, The Design of Experiments, and Statistical Methods and Scientific Inference. Oxford: Oxford University Press. If you want to read the original rationale for many of the designs and issues described in this chapter, there is no better place than the original source.
The Framingham Heart Study. http://www.framinghamheartstudy.org/. This is the official website of one of the largest, longest, and most famous prospective cohort studies in the history of medicine.