9 results
23andMe, Affordable Care Act / Obamacare, Albert Einstein, big data - Walmart - Pop Tarts, bioinformatics, business intelligence, call centre, cloud computing, computer age, conceptual framework, Credit Default Swap, crowdsourcing, Daniel Kahneman / Amos Tversky, Danny Hillis, data is the new oil, David Brooks, East Village, Edward Snowden, Emanuel Derman, Erik Brynjolfsson, everywhere but in the productivity statistics, Frederick Winslow Taylor, Google Glasses, impulse control, income inequality, indoor plumbing, industrial robot, informal economy, Internet of things, invention of writing, John Markoff, John von Neumann, lifelogging, Mark Zuckerberg, market bubble, meta analysis, meta-analysis, money market fund, natural language processing, obamacare, pattern recognition, payday loans, personalized medicine, precision agriculture, pre–internet, Productivity paradox, RAND corporation, rising living standards, Robert Gordon, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley startup, six sigma, skunkworks, speech recognition, statistical model, Steve Jobs, Steven Levy, The Design of Experiments, the scientific method, Thomas Kuhn: the structure of scientific revolutions, unbanked and underbanked, underbanked, Von Neumann architecture, Watson beat the top human players on Jeopardy!
As soon as he accepted the job at Facebook, Hammerbacher began poring over technical papers and books that provided clues to the evolution of data science. In the spring of 2012, he taught a course in data science at the University of California at Berkeley. His first talk included a brisk yet comprehensive tour of the pertinent literature. His survey stretched from the first half of the twentieth century and the English statistician and biologist Ronald A. Fisher, who did pioneering work on the design of experiments using agricultural and gene data; to Hans Peter Luhn, the IBM scientist whose paper in the late 1950s imagined a computerized “business intelligence” system for mining information to improve business decisions; to contemporary computer scientists who have authored important works on data and discovery—Jim Gray, Tom Mitchell, Randy Bryant, and Peter Norvig. For each, Hammerbacher cited that person’s contribution and what he had learned from reading their books or papers.
Is God a Mathematician? by Mario Livio
Albert Einstein, Antoine Gombaud: Chevalier de Méré, Brownian motion, cellular automata, correlation coefficient, correlation does not imply causation, cosmological constant, Dava Sobel, double helix, Edmond Halley, Eratosthenes, Georg Cantor, Gerolamo Cardano, Gödel, Escher, Bach, Henri Poincaré, Isaac Newton, John von Neumann, music of the spheres, Myron Scholes, probability theory / Blaise Pascal / Pierre de Fermat, Russell's paradox, The Design of Experiments, the scientific method, traveling salesman
The person who brought probability: For an excellent biography, see Orel 1996. Mendel published his paper: Mendel 1865. An English translation can be found on the Web page created by R. B. Blumberg at http://www.mendelweb.org. While some questions related to the accuracy: See Fisher 1936, for example. the influential British statistician: For a brief description of some of his work see Tabak 2004. Fisher wrote an extremely original, nontechnical article about the design of experiments entitled “Mathematics of a Lady Tasting Tea” (see Fisher 1956). in his book Ars Conjectandi: For a superb translation see Bernoulli 1713b. He then proceeded to explain: Reprinted in Newman 1956. Shaw once wrote an insightful article: The article “The Vice of Gambling and the Virtue of Insurance” appears in Newman 1956. In a pamphlet entitled The Analyst: The pamphlet was written by George Berkeley in 1734.
Trick or Treatment: The Undeniable Facts About Alternative Medicine by Edzard Ernst, Simon Singh
Barry Marshall: ulcers, Berlin Wall, correlation does not imply causation, false memory syndrome, Florence Nightingale: pie chart, germ theory of disease, John Snow's cholera map, Louis Pasteur, meta analysis, meta-analysis, placebo effect, profit motive, publication bias, randomized controlled trial, Ronald Reagan, Simon Singh, The Design of Experiments, the scientific method
The trial had shown that there was a difference, that the woman was right and that the scientists were wrong. In fact, there is a good scientific reason why the two forms of tea should taste different. Milk added to tea leads to a less satisfying cup, because the milk becomes superheated and this causes proteins in the milk to deteriorate – these proteins then taste slightly sour. Fisher used this simple example as the basis for an entire book on scientific testing, The Design of Experiments, which went into great detail about the subtleties of trials. Despite its sheer simplicity and powerful ability to get to the truth, some alternative therapists argue that the clinical trial is a harsh test, which is somehow biased against their treatments. But that sort of attitude betrays a skewed understanding of the clinical trial, which merely seeks to establish the truth, regardless of the type of treatment being examined.
accounting loophole / creative accounting, affirmative action, bank run, banking crisis, Berlin Wall, bonus culture, Branko Milanovic, BRICs, call centre, Cass Sunstein, central bank independence, collapse of Lehman Brothers, conceptual framework, corporate governance, correlation does not imply causation, Credit Default Swap, deindustrialization, demographic transition, Diane Coyle, disintermediation, Edward Glaeser, endogenous growth, Eugene Fama: efficient market hypothesis, experimental economics, Fall of the Berlin Wall, Financial Instability Hypothesis, Francis Fukuyama: the end of history, George Akerlof, Gini coefficient, global supply chain, Gordon Gekko, greed is good, happiness index / gross national happiness, Hyman Minsky, If something cannot go on forever, it will stop - Herbert Stein's Law, illegal immigration, income inequality, income per capita, industrial cluster, information asymmetry, intangible asset, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Jane Jacobs, Joseph Schumpeter, Kenneth Arrow, Kenneth Rogoff, knowledge economy, labour market flexibility, light touch regulation, low skilled workers, market bubble, market design, market fundamentalism, megacity, Network effects, new economy, night-watchman state, Northern Rock, oil shock, Pareto efficiency, principal–agent problem, profit motive, purchasing power parity, railway mania, rising living standards, Ronald Reagan, selective serotonin reuptake inhibitor (SSRI), Silicon Valley, South Sea Bubble, Steven Pinker, The Design of Experiments, The Fortune at the Bottom of the Pyramid, The Market for Lemons, The Myth of the Rational Market, The Spirit Level, transaction costs, transfer pricing, tulip mania, ultimatum game, University of East Anglia, web application, web of trust, winner-take-all economy, World Values Survey, zero-sum game
Behavioral economics has become so fashionable partly because the experimental results are fascinating, but also partly because so many of its new aficionados are delighted that it seems to overturn a key assumption in economics. Their delight is misplaced. For one thing, other experimental evidence indicates that humans behave selfishly in other contexts. In some experiments, economists have shown that markets operate exactly as conventional economic models based on rational self-interest would predict.3 For another, subtle changes in the design of experiments can change the outcomes dramatically, as economist John List has documented.4 List cautions against drawing hard and fast conclusions about human nature from the results available to us now: “A first lesson that I take from this body of research is that what we do not know dwarfs what we know.”5 Admirable as such modesty is in an economist, there is nonetheless a good amount of experimental evidence from a wide range of contexts that people do have an innate sense of fairness.
3D printing, Airbnb, Albert Einstein, Berlin Wall, Black Swan, Chuck Templeton: OpenTable, clean water, collapse of Lehman Brothers, creative destruction, Credit Default Swap, crony capitalism, crowdsourcing, Danny Hillis, declining real wages, demographic dividend, Elon Musk, en.wikipedia.org, Eugene Fama: efficient market hypothesis, Fall of the Berlin Wall, follow your passion, game design, housing crisis, Hyman Minsky, industrial robot, invisible hand, James Dyson, Jane Jacobs, Jeff Bezos, jimmy wales, John Gruber, John Markoff, Joseph Schumpeter, Kickstarter, lone genius, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Martin Wolf, new economy, Paul Graham, Peter Thiel, QR code, race to the bottom, reshoring, Richard Florida, Ronald Reagan, shareholder value, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, six sigma, Skype, Steve Ballmer, Steve Jobs, Steve Wozniak, supply-chain management, Tesla Model S, The Chicago School, The Design of Experiments, the High Line, The Myth of the Rational Market, thinkpad, Tim Cook: Apple, too big to fail, tulip mania, We are the 99%, Y Combinator, young professional, Zipcar
David Kelley, Tim Brown, and Bill Moggridge have been key tutors in my learning about design, innovation, and creativity. David is not only a cofounder of IDEO; he founded the extraordinary d.school at Stanford. He and Tim are on the cover of what is probably my most impactful cover story, “The Power of Design.” David, Tim, and Bill believed that design should go beyond the physical to include the design of experiences, services, and even social systems, such as health and education. They codified that approach into the concept of design thinking, the scaffolding upon which Creative Intelligence is built. When Bill Moggridge took over the leadership of the Cooper Hewitt National Design Museum, I got to know the humor and grace of the man who designed one of the first laptop computers and fathered the field of interaction design.
The Structure of Scientific Revolutions by Thomas S. Kuhn, Ian Hacking
If the vacuum had a heat capacity, for example, heating by compression could be explained as the result of mixing gas with void. Or it might be due to a change in the specific heat of gases with changing pressure. And there were several other explanations besides. Many experiments were undertaken to elaborate these various possibilities and to distinguish between them; all these experiments arose from the caloric theory as paradigm, and all exploited it in the design of experiments and in the interpretation of results.8 Once the phenomenon of heating by compression had been established, all further experiments in the area were paradigm-dependent in this way. Given the phenomenon, how else could an experiment to elucidate it have been chosen? Turn now to the theoretical problems of normal science, which fall into very nearly the same classes as the experimental and observational.
The Perfect Bet: How Science and Math Are Taking the Luck Out of Gambling by Adam Kucharski
Ada Lovelace, Albert Einstein, Antoine Gombaud: Chevalier de Méré, beat the dealer, Benoit Mandelbrot, butterfly effect, call centre, Chance favours the prepared mind, Claude Shannon: information theory, collateralized debt obligation, correlation does not imply causation, diversification, Edward Lorenz: Chaos theory, Edward Thorp, Everything should be made as simple as possible, Flash crash, Gerolamo Cardano, Henri Poincaré, Hibernia Atlantic: Project Express, if you build it, they will come, invention of the telegraph, Isaac Newton, John Nash: game theory, John von Neumann, locking in a profit, Louis Pasteur, Nash equilibrium, Norbert Wiener, p-value, performance metric, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Richard Feynman, Ronald Reagan, Rubik’s Cube, statistical model, The Design of Experiments, Watson beat the top human players on Jeopardy!, zero-sum game
But if we distribute them by picking locations at random, there is a chance that we will repeatedly pick similar locations; in that case a treatment ends up concentrated in one area, and we have a pretty lousy experiment. Suppose we want to test four treatments across sixteen trial sites, arranged in a four-by-four grid. How can we scatter the treatments across the area without risking all of them ending up in the same place? In his landmark book The Design of Experiments, Fisher suggested that the four treatments be distributed so that each appears in every row and column exactly once. If the field had good soil at one end and poor land at the other, all treatments would therefore be exposed to both conditions. As it happened, the pattern Fisher proposed had already found popularity elsewhere. It was common in classical architecture, where it was known as a Latin square, as shown in Figure 2.1.
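Fisher's arrangement can be sketched in a few lines of code. The snippet below is an illustrative construction, not taken from the book: it builds a four-by-four Latin square for four hypothetical treatments (labelled A–D) by cyclic shifting, then checks the defining property that no treatment repeats within any row or column.

```python
# Illustrative sketch: a 4x4 Latin square for four treatments (A-D),
# built by cyclically shifting the treatment list one step per row.
TREATMENTS = ["A", "B", "C", "D"]

def latin_square(symbols):
    """Return an n x n Latin square via the cyclic-shift construction."""
    n = len(symbols)
    return [[symbols[(row + col) % n] for col in range(n)] for row in range(n)]

square = latin_square(TREATMENTS)

# The defining property: each treatment appears exactly once per row
# and exactly once per column, so every treatment sees every strip of soil.
for row in square:
    assert len(set(row)) == len(TREATMENTS)
for col in zip(*square):
    assert len(set(col)) == len(TREATMENTS)

for row in square:
    print(" ".join(row))
```

The cyclic shift is only one of many valid Latin squares; any arrangement satisfying the row-and-column property would serve Fisher's purpose equally well.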
The Rise and Fall of Modern Medicine by James Le Fanu, M.D.
Barry Marshall: ulcers, clean water, cuban missile crisis, discovery of penicillin, double helix, experimental subject, Gary Taubes, Isaac Newton, meta analysis, meta-analysis, rising living standards, selective serotonin reuptake inhibitor (SSRI), stem cell, telerobotics, The Design of Experiments, the scientific method, V2 rocket
She refused, remarking that she preferred milk to be in the cup before the tea was added. Fisher could not believe that there would be any difference in the taste, and when the woman suggested an experiment be performed, he was enthusiastic. An immediate trial was organised, and the woman confidently identified more than enough of the cups of tea into which the tea had been poured first to prove her case. In his classic book The Design of Experiments, published in 1935, Fisher used this example ‘to state the terms of the experiment minutely and distinctly; predicted all possible results, ascertaining by sensible reasoning, what probability should be assigned to each possible result under the assumption that the woman was guessing’.5 Thus Greenwood’s main intellectual legacy, which he was to pass on to Bradford Hill, was essentially two-fold: the historical contribution of statistical methods to elucidating the causes of substantial public health problems; and the importance of conducting properly designed experiments to test whether a new treatment was effective.
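The probabilities Fisher assigned under the guessing assumption can be reproduced directly. In his setup there were eight cups, four with milk poured first and four with tea poured first, and the taster had to nominate four of each; the sketch below (an illustration of that calculation, not Fisher's own notation) enumerates the hypergeometric chances of getting k of the four milk-first cups right by luck alone.

```python
# Sketch of Fisher's guessing calculation: 8 cups, 4 milk-first,
# taster must label exactly 4 cups as milk-first.
from math import comb

n_cups, n_milk_first = 8, 4

# Number of equally likely ways to guess which 4 cups were milk-first.
total = comb(n_cups, n_milk_first)  # C(8,4) = 70

# P(k correct) follows a hypergeometric count:
# C(4,k) ways to pick k true milk-first cups times
# C(4,4-k) ways to fill the rest with tea-first cups.
for k in range(n_milk_first + 1):
    ways = comb(n_milk_first, k) * comb(n_cups - n_milk_first, n_milk_first - k)
    print(f"{k} correct: {ways}/{total} = {ways / total:.4f}")
```

Only 1 of the 70 possible guesses gets all four cups right, so a perfect score has probability 1/70, roughly 1.4 percent, which is why Fisher could treat complete success as strong evidence against mere guessing.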
agricultural Revolution, Albert Einstein, demographic transition, Deng Xiaoping, Haber-Bosch Process, invention of gunpowder, Louis Pasteur, Pearl River Delta, precision agriculture, recommendation engine, The Design of Experiments
Even those mixtures that contained some ammonium sulfate had too little nitrogen to become commercially successful, and the enterprise, launched jointly with Sheridan Muspratt (1793–1886), an alkali manufacturer in Liverpool, failed after only about three years.44 Liebig’s incorrect but aggressively held opinion that plants receive all of their nitrogen from atmospheric deposition and benefit only from mineral fertilizer had a very beneficial effect: it inspired a series of field trials that still continue as the world’s longest-running crop experiment. In 1843 John Bennet Lawes (fig. 1.5), an Eton- and Oxford-educated landowner, began experimenting with unfertilized and variously fertilized crops on Broadbalk field in Rothamsted, a family estate in Hertfordshire he inherited in 1834. In the same year he invited Joseph Henry Gilbert, a young chemist with a doctorate from Liebig’s laboratory, to help with the design of experiments and to perform the necessary chemical analyses, a cooperative effort that ended only with Lawes’s death.45 [Figure 1.5: John Bennet Lawes (1814–1900). Courtesy of Rothamsted Experimental Station, Harpenden, Hertfordshire, England.] In order to test the validity of Liebig’s mineral theory, Lawes and Gilbert began continuous cultivation of wheat on plots receiving either no fertilizer or the following combinations of nutrients: minerals only (P, K, Na, Mg), minerals with nitrogen, or farmyard manure.