
**Darwin Among the Machines** by George Dyson


Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, British Empire, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, Danny Hillis, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture

., 23. 48. Turing, “Computing Machinery,” 456. 49. Turing, “Intelligent Machinery,” 21–22. 50. Turing, “Systems of Logic Based on Ordinals,” Proceedings of the London Mathematical Society, 2d ser. 45 (1939); reprinted in Davis, The Undecidable, 209. 51. John von Neumann, 1948, “The General and Logical Theory of Automata,” in Lloyd A. Jeffress, ed., Cerebral Mechanisms in Behavior: The Hixon Symposium (New York: Hafner, 1951), 26. 52. Leibniz, 1714, The Monadology, in George R. Montgomery, trans., Basic Writings: Discourse on Metaphysics; Correspondence with Arnauld; Monadology (La Salle, Ill.: Open Court, 1902), 253. CHAPTER 5 1. John von Neumann to Gleb Wataghin, ca. 1946, as reported by Freeman J. Dyson, Disturbing the Universe (New York: Harper & Row, 1979), 194. 2. Stanislaw Ulam, Adventures of a Mathematician (New York: Scribner’s, 1976), 231. 3. Nicholas Vonneumann, “John von Neumann: Formative Years,” Annals of the History of Computing 11, no. 3 (1989): 172. 4. Eugene P. Wigner, “John von Neumann—A Case Study of Scientific Creativity,” Annals of the History of Computing 11, no. 3 (1989): 168. 5. Edward Teller, in Jean R.

…

Brink and Roland Haden, “Interviews with Edward Teller and Eugene P. Wigner,” Annals of the History of Computing 11, no. 3 (1989): 177. 6. Stanislaw Ulam, “John von Neumann, 1903–1957,” Bulletin of the American Mathematical Society 64, no. 3 (May 1958): 1. 7. Eugene Wigner, “Two Kinds of Reality,” The Monist 49, no. 2 (April 1964); reprinted in Symmetries and Reflections (Cambridge: MIT Press, 1967), 198. 8. John von Neumann, statement on nomination to membership in the AEC, 8 March 1955, von Neumann Papers, Library of Congress; in William Aspray, John von Neumann and the Origins of Modern Computing (Cambridge: MIT Press, 1990), 247. 9. John von Neumann, as quoted by J. Robert Oppenheimer in testimony before the AEC Personnel Security Board, 16 April 1954, In the Matter of J. Robert Oppenheimer (Washington, D.C.: Government Printing Office, 1954; reprint, Cambridge: MIT Press, 1970), 246 (page citation is to the reprint edition). 10. Nicholas Metropolis, “The MANIAC,” in Nicholas Metropolis, J.

…

.: Mathematical Association of America, 1977), 119. 43. Martin Schwarzschild, interview by William Aspray, 18 November 1986, OH 124, Charles Babbage Institute, University of Minnesota, Minneapolis. 44. Edmund C. Berkeley, Giant Brains (New York: John Wiley, 1949), 5. 45. John von Neumann, 1948, “The General and Logical Theory of Automata,” in Lloyd A. Jeffress, ed., Cerebral Mechanisms in Behavior: The Hixon Symposium (New York: Hafner, 1951), 31. 46. Stanislaw Ulam, Adventures of a Mathematician (New York: Scribner’s, 1976), 242. 47. John von Neumann, 1948, response to W. S. McCulloch’s paper “Why the Mind Is in the Head,” Hixon Symposium, September 1948, in Jeffress, Cerebral Mechanisms, 109–111. 48. John von Neumann to Oswald Veblen, memorandum, 26 March 1945, “On the Use of Variational Methods in Hydrodynamics,” reprinted in John von Neumann, Theory of Games, Astrophysics, Hydrodynamics and Meteorology, vol. 6 of Collected Works, ed. Abraham Taub (Oxford: Pergamon Press, 1963), 357.

**The Man Who Invented the Computer** by Jane Smiley


1919 Motor Transport Corps convoy, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Arthur Eddington, British Empire, c2.com, computer age, Fellow of the Royal Society, Henri Poincaré, IBM and the Holocaust, Isaac Newton, John von Neumann, Karl Jansky, Norbert Wiener, RAND corporation, Turing machine, V2 rocket, Vannevar Bush, Von Neumann architecture

In the book Colossus by Jack Copeland, photograph 50 is a picture of John von Neumann, standing beside the Princeton IAS computer. The picture is undated, but the IAS computer began to operate in the summer of 1951 and was officially operational on June 10, 1952. Along the bottom of the wall of hardware runs a row of shiny metal cylinders, their ends pointing upward at about a forty-five-degree angle (fifteen are visible in the photo). These cylinders are Williams tubes, and they constituted the memory of the IAS computer. At this point, von Neumann had been organizing his computer project for at least seven years. Back in the summer of 1946, when Atanasoff was told that the navy computer project was off, he was not told why, but part of the reason was that in late 1945, the very well connected John von Neumann had entertained letters of interest from the University of Chicago and MIT, with further feelers from Harvard and Columbia.

…

John Mauchly, aged thirty, was teaching at Ursinus College in Pennsylvania—his passion was weather prediction, and he had his students attempting to find mathematical correlations between U.S. rainfall and patterns of solar rotation. J. Presper Eckert, only eighteen, was applying to college at MIT, though in the end he went to business school at the University of Pennsylvania. Konrad Zuse, in Berlin, had already built one computer (the Z1) in his parents’ apartment. He later said that if the building had not been bombed, he would not have been able to get his machine out of the apartment. John von Neumann, born in Hungary but living in Princeton, New Jersey, had become so convinced that war in Europe was inevitable that he had applied for U.S. citizenship. He received his naturalization papers in December 1937. Von Neumann was one of the most talented mathematicians of his day, but he wasn’t yet involved with computers. It is the weaving of these individual stories that makes up the whole story and causes it to become not merely the tale of an invention, but a saga of how the mind works, and of how the world works.

…

In some ways, Alan Turing was Atanasoff’s precise opposite, drawn to pure mathematics rather than practical physics, educated to think rather than to tinker, disorganized in his approach rather than systematic, never a family man and required by his affections and his war work to be utterly secretive. His figure is now so mysterious and tragically evocative that he has become the most famous of our inventors. The man who was best known in his own lifetime, John von Neumann, has retreated into history, more associated with the atomic bomb and the memory of the cold war than with the history of the computer, but it was von Neumann who made himself the architect of that history without, in some sense, ever lifting a screwdriver (in fact, his wife said that he was not really capable of lifting a screwdriver). It is von Neumann for whom partisans of John Mauchly and J.

**Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World** by James D. Miller


23andMe, affirmative action, Albert Einstein, artificial general intelligence, Asperger Syndrome, barriers to entry, brain emulation, cloud computing, cognitive bias, correlation does not imply causation, crowdsourcing, Daniel Kahneman / Amos Tversky, David Brooks, David Ricardo: comparative advantage, Deng Xiaoping, en.wikipedia.org, feminist movement, Flynn Effect, friendly AI, hive mind, impulse control, indoor plumbing, invention of agriculture, Isaac Newton, John von Neumann, knowledge worker, Long Term Capital Management, low skilled workers, Netflix Prize, neurotypical, pattern recognition, Peter Thiel, phenotype, placebo effect, prisoner's dilemma, profit maximization, Ray Kurzweil, recommendation engine, reversible computing, Richard Feynman, Rodney Brooks, Silicon Valley, Singularitarianism, Skype, statistical model, Stephen Hawking, Steve Jobs, supervolcano, technological singularity, The Coming Technological Singularity, the scientific method, Thomas Malthus, transaction costs, Turing test, Vernor Vinge, Von Neumann architecture

Von Neumann made Stalin unwilling to risk war because von Neumann shaped U.S. weapons policy—in part by pushing the United States to develop hydrogen bombs—to let Stalin know that the only human life Stalin actually valued would almost certainly perish in World War III. 20 Johnny helped develop a superweapon, played a key role in integrating it into his nation’s military, advocated that it be used, and then made sure that his nation’s enemies knew that in a nuclear war they would be personally struck by this superweapon. John von Neumann could himself reasonably be considered the most powerful weapon ever to rest on American soil. Now consider the strategic implications if the Chinese high-tech sector and military acquired a million computers with the brilliance of John von Neumann, or if, through genetic manipulation, they produced a few thousand von Neumann-ish minds every year. Contemplate the magnitude of the resources the US military would pour into artificial intelligence if it thought that a multitude of digital or biological von Neumanns would someday power the Chinese economy and military.

…

The economic and martial advantages of having a von Neumann-or-above-level intellect are so enormous that if it proves practical to mass-produce them, they will be mass-produced. A biographer of John von Neumann wrote, “The cheapest way to make the world richer would be to get lots of his like.”21 A world with a million Johnnies, cooperating and competing with each other, has a reasonable chance of giving us something spectacular, beyond what even science fiction authors can imagine—at least if mankind survives the experience. Von Neumann’s existence highlights the tremendous variance in human intelligence, and so illuminates the minimum potential gains of simply raising a new generation’s intelligence to the maximum of what our species’ current phenotype can sustain. John von Neumann and a few other Hungarian scientists who immigrated to the United States were jokingly called “Martians” because of their strange accents and seemingly superhuman intelligence.22 If von Neumann really did have an extraterrestrial parent, whose genes arose, say, out of an advanced eugenics program that Earth couldn’t hope to replicate for a million years, then I wouldn’t infer from his existence that we could get many of him.

…

But whole brain emulation is still a path to the Singularity that could work, even if a Kurzweilian merger proves beyond the capacity of bioengineers. If we had whole brain emulations, Moore’s Law would eventually give us some kind of Singularity. Imagine we just simulated the brain of John von Neumann. If the (software-adjusted) speed of computers doubled every year, then in twenty years we could run this software on computers that were a million times faster and in forty years on computers that were a trillion times faster. The innovations that a trillion John von Neumanns could discover in one year would change the world beyond our current ability to imagine.

3. Clues from the Brain

Even if we never figure out how to emulate our brains or merge them with machines, clues to how our brains work could help scientists figure out how to create other kinds of human-level artificial intelligence.
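The doubling arithmetic in this excerpt (a million-fold speedup in twenty years, a trillion-fold in forty) can be checked directly, since 2^20 ≈ 10^6 and 2^40 ≈ 10^12. A minimal sketch, with the one-doubling-per-year rate taken from the excerpt as an assumption and `speedup` a hypothetical helper name:

```python
def speedup(years: int, doublings_per_year: float = 1.0) -> float:
    """Relative speed after `years`, assuming the excerpt's annual doubling."""
    return 2.0 ** (years * doublings_per_year)

print(f"{speedup(20):,.0f}")  # 1,048,576 — about a million
print(f"{speedup(40):,.0f}")  # 1,099,511,627,776 — about a trillion
```

So the book's round numbers are slight understatements of the exact powers of two.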

**A Beautiful Mind** by Sylvia Nasar


Al Roth, Albert Einstein, Andrew Wiles, Brownian motion, cognitive dissonance, Columbine, experimental economics, fear of failure, Henri Poincaré, invisible hand, Isaac Newton, John Conway, John Nash: game theory, John von Neumann, Kenneth Rogoff, linear programming, lone genius, market design, medical residency, Nash equilibrium, Norbert Wiener, Paul Erdős, prisoner's dilemma, RAND corporation, Ronald Coase, second-price auction, Silicon Valley, Simon Singh, spectrum auction, The Wealth of Nations by Adam Smith, Thorstein Veblen, upwardly mobile

Nash shared von Neumann’s interest in game theory, quantum mechanics, real algebraic varieties, hydrodynamic turbulence, and computer architecture. 6. See, for example, Ulam, “John von Neumann,” op. cit. 7. Norman Macrae, John von Neumann (New York: Pantheon Books, 1992), pp. 350–56. 8. John von Neumann, The Computer and the Brain (New Haven: Yale University Press, 1959). 9. See, for example, G. H. Hardy, A Mathematician’s Apology (Cambridge, U.K.: Cambridge University Press, 1967), with a foreword by C. P. Snow. 10. Ulam, “John von Neumann,” op. cit. 11. Poundstone, op. cit. 12. Poundstone, Prisoner’s Dilemma, p. 190. 13. Clay Blair, Jr., “Passing of a Great Mind,” Life (February 1957), pp. 89–90, as quoted by Poundstone, op. cit., p. 143. 14. Poundstone, op. cit. 15. Ulam, “John von Neumann,” op. cit. 16. Harold Kuhn, interview, 3.97. 17. Paul R. Halmos, “The Legend of John von Neumann,” op. cit. 18. Ibid. 19. Poundstone, op. cit. 20.

…

John Milnor, “A Nobel Prize for John Nash,” op. cit. 22. Ibid.; Gardner, op. cit. 23. Gale, interview. 24. Ibid. 25. Ibid. 26. Kuhn, interview. 27. Ibid. 28. Milnor, interview, 9.26.95. 7: John von Neumann 1. See, for example, Stanislaw Ulam, “John von Neumann, 1903–1957,” Bulletin of the American Mathematical Society, vol. 64, no. 3, part 2 (May 1958); Stanislaw Ulam, Adventures of a Mathematician (New York: Scribner’s, 1983); Paul R. Halmos, “The Legend of John von Neumann,” American Mathematical Monthly, vol. 80 (1973); William Poundstone, Prisoner’s Dilemma, op. cit.; Ed Regis, Who Got Einstein’s Office?, op. cit. 2. Poundstone, op. cit. 3. Ulam, “John von Neumann,” op. cit.; Poundstone, op. cit., pp. 94–96. 4. Harold Kuhn, interview, 1.10.96. 5. In remarks at a Nobel luncheon at the American Economics Association meeting on 1.5.96, Nash traced a lineage from Newton to von Neumann to himself.

…

Halmos, op. cit. 21. Ibid. 22. Poundstone, op. cit. 23. Ulam, Adventures of a Mathematician, op. cit. 24. Ulam, “John von Neumann,” op. cit. 25. Ibid. 26. Ibid., p. 10; Robert J. Leonard, “From Parlor Games to Social Science,” op. cit. 27. Richard Duffin, interview, 10.94. 28. Halmos, op. cit. 29. Ulam, “John von Neumann,” op. cit., pp. 35–39. 30. Interviews with Donald Spencer, 11.18.95; David Gale, 9.20.95; and Harold Kuhn, 9.23.95. 31. Poundstone, op. cit. 32. Herman H. Goldstine, “A Brief History of the Computer,” A Century of Mathematics in America, Part I, op. cit. 33. John von Neumann, as quoted in ibid. 8: The Theory of Games 1. John von Neumann and Oskar Morgenstern, The Theory of Games and Economic Behavior (Princeton: Princeton University Press, 1944, 1947, 1953). 2.

**The Rise of the Quants: Marschak, Sharpe, Black, Scholes and Merton** by Colin Read


Albert Einstein, Black-Scholes formula, Bretton Woods, Brownian motion, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, David Ricardo: comparative advantage, discovery of penicillin, discrete time, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, floating exchange rates, full employment, Henri Poincaré, implied volatility, index fund, Isaac Newton, John von Neumann, Joseph Schumpeter, Long Term Capital Management, Louis Bachelier, margin call, market clearing, martingale, means of production, moral hazard, naked short selling, price stability, principal–agent problem, quantitative trading / quantitative ﬁnance, RAND corporation, random walk, risk tolerance, risk/return, Ronald Reagan, shareholder value, Sharpe ratio, short selling, stochastic process, The Chicago School, the scientific method, too big to fail, transaction costs, tulip mania, Works Progress Administration, yield curve

Nevertheless, the necessity for action and for decision compels us as practical men to do our best to overlook this awkward fact and to behave exactly as we should if we had behind us a good Benthamite calculation of a series of prospective advantages and disadvantages, each multiplied by its appropriate probability waiting to be summed.1 The finance literature further clarified that there are calculable risks and that there are uncertainties that cannot be quantified. In the 1930s, John von Neumann set about producing a model of expected utility that permitted the inclusion of risk. Then, Leonard Jimmie Savage described how our individual perceptions affect the probability of uncertainty, and Kenneth Arrow was able to include these probabilities of uncertainty in a model that established the existence of equilibrium in a market for financial securities. With the existence of equilibrium and a better understanding of the meaning and significance of probability at hand, Harry Markowitz then packaged up these intuitions into a tidy set of insights we now call Modern Portfolio Theory.

…

These are the questions that the pricing analysts sought to resolve.

2 A Roadmap to Resolve the Big Questions

In the first half of the twentieth century, Irving Fisher described why people save. John Maynard Keynes then showed how individuals adjust their portfolios between cash and less liquid assets, while Franco Modigliani demonstrated how all these personal financial decisions evolve over one’s lifetime. John von Neumann, Leonard Jimmie Savage, and Kenneth Arrow then incorporated uncertainty into the mix, and Harry Markowitz packaged the state of financial science into Modern Portfolio Theory. However, none of these great minds provided a satisfactory explanation for how the prices of individual securities evolve over time. By the 1960s, the finance discipline was begging for a revolution that could turn the theoretical into the quantitative and practical.

…

When we discover that Marschak made discoveries that were subtle and humble but were so timely and related to the essence of the work of William Sharpe, Fischer Black, and Myron Scholes, we must conclude that he was more than a mentor of other great minds – he was a great mind himself. We will begin with his story.

3 The Early Years

Jacob Marschak was not at all unusual among the cadre of great minds that formed the discipline of finance in the first half of the twentieth century. Like the families of Milton Friedman, Franco Modigliani, Leonard Jimmie Savage, Kenneth Arrow, John von Neumann, and Harry Markowitz, Marschak’s family tree was originally rooted in the Jewish culture and derived from the intellectually stimulating region of Eastern, Central and Southern Europe at the beginning of the twentieth century. This region, comprising what is now Ukraine, Hungary, Poland, Romania, and parts of Italy, was under the influence of the Austro-Hungarian Empire in the late nineteenth and early twentieth centuries.

**How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy)** by Benjamin Peters


Albert Einstein, Andrei Shleifer, Benoit Mandelbrot, bitcoin, Brownian motion, Claude Shannon: information theory, cloud computing, cognitive dissonance, computer age, conceptual framework, crony capitalism, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, Dissolution of the Soviet Union, double helix, Drosophila, Francis Fukuyama: the end of history, From Mathematics to the Technologies of Life and Death, hive mind, index card, informal economy, invisible hand, Jacquard loom, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, linear programming, mandelbrot fractal, Marshall McLuhan, means of production, Menlo Park, Mikhail Gorbachev, mutually assured destruction, Network effects, Norbert Wiener, packet switching, pattern recognition, Paul Erdős, Peter Thiel, RAND corporation, rent-seeking, road to serfdom, Ronald Coase, scientific mainstream, Steve Jobs, Stewart Brand, stochastic process, technoutopianism, The Structural Transformation of the Public Sphere, transaction costs, Turing machine

.: A Comment on Certain Points Where Cybernetics Impinges on Religion (Cambridge: MIT Press, 1964). 7. Wiener, Cybernetics, 1–25, 155–168. 8. Ibid., 16. 9. Dupuy, Mechanization of the Mind. See also John von Neumann, The Computer and the Brain, 2nd ed. (New Haven: Yale University Press, [1958] 2000). 10. Quoted in Claus Pias, “Analog, Digital, and the Cybernetic Illusion,” Kybernetes 34 (3–4) (2005): 544. 11. Claus Pias, ed., Cybernetics-Kybernetik 2: The Macy-Conferences 1946–1953 (Berlin: Diaphanes, 2004). 12. Steve J. Heims, The Cybernetics Group (Cambridge: MIT Press, 1991). 13. Ibid., 52–53, 207. 14. William Aspray, John von Neumann and the Origins of Modern Computing (Cambridge: MIT Press, 1990). 15. David Lipset, Gregory Bateson: The Legacy of a Scientist (New York: Prentice Hall, 1980). See also Fred Turner, From Counterculture to Cyberculture (Chicago: University of Chicago Press, 2006), 121–125. 16.

…

Borrowing from the language of Hannah Arendt, it recasts the Soviet network experience in light of other national network projects in the latter half of the twentieth century, suggesting the ways that the Soviet experience may appear uncomfortably close to our modern network situation. A few other summary observations for scholar and general-interest reader are offered in close.

1 A Global History of Cybernetics

I am thinking about something much more important than bombs. I am thinking about computers. —John von Neumann, 1946

Cybernetics nursed early national computer network projects on both sides of the cold war. Cybernetics was a postwar systems science concerned with communication and control—and although its significance has been well documented in the history of science and technology, its implications as a carrier of early ideas about and language for computational communication have been largely neglected by communication and media scholars.1 This chapter discusses how cybernetics became global early in the cold war, coalescing first in postwar America before diffusing to other parts of the world, especially the Soviet Union after Stalin’s death in 1953, as well as how Soviet cybernetics shaped the scientific regime for governing economics that eventually led to the nationwide network projects imagined in the late 1950s and early 1960s.

…

If cybernetics in the United States sprang from the teams of researchers channeling Wiener and McCulloch, it took disciplinary shape at the Macy Conferences on Cybernetics, a series of semiannual (1946–1947) and then annual (1948–1953) interdisciplinary gatherings chaired by Warren McCulloch and organized by the Josiah Macy, Jr. Foundation in New York City. The Macy Conferences, as they were informally known, staked out a spacious interdisciplinary purview for cybernetic research.11 In addition to McCulloch, who directed the conferences, a few noted participants included Wiener himself, the mathematician and game theorist John von Neumann, leading anthropologist Margaret Mead and her then husband Gregory Bateson, founding information theorist and engineer Claude Shannon, sociologist-statistician and communication theorist Paul Lazarsfeld, psychologist and computer scientist J.C.R. Licklider, as well as influential psychiatrists, psychoanalysts, and philosophers such as Kurt Lewin, F.S.C. Northrop, Molly Harrower, and Lawrence Kubie, among others.

**A Fiery Peace in a Cold War: Bernard Schriever and the Ultimate Weapon** by Neil Sheehan


Albert Einstein, anti-communist, Berlin Wall, Bretton Woods, British Empire, cuban missile crisis, double helix, European colonialism, John von Neumann, Menlo Park, Mikhail Gorbachev, mutually assured destruction, nuclear winter, operation paperclip, RAND corporation, Ronald Reagan, uranium enrichment

BOOK IV STARTING A RACE Chapters 29–31: Schriever interviews; also interviews with Marina von Neumann Whitman and Françoise Ulam and their reminiscences at Hofstra University conference on von Neumann, May 29-June 3, 1988; interviews with Foster Evans and Jacob Wechsler; also Evans’s lecture, “Early Super Work,” published in the Los Alamos Historical Society’s 1996 Behind Tall Fences; interview with Nicholas Vonneuman and his unpublished biography of his brother, “The Legacy of John von Neumann”; John von Neumann Papers in the Manuscript Division of the Library of Congress; Rhodes’s The Making of the Atomic Bomb and Dark Sun; Herman Goldstine’s 1972 The Computer from Pascal to von Neumann; Stanislaw Ulam’s 1976 Adventures of a Mathematician; William Poundstone’s 1992 Prisoner’s Dilemma; Norman Macrae’s 1992 John von Neumann; and Kati Marton’s 2006 The Great Escape: Nine Jews Who Fled Hitler and Changed the World. Chapter 32: Interviews with General Schriever, Col. Vincent Ford, and Trevor Gardner, Jr.; Colonel Ford’s unpublished memoir on the building of the ICBM; Air Force Space and Missile Pioneers biography of Gardner.

…

Both Schriever and Gardner knew Ramo was indispensable for assembling the array of engineering and scientific talent needed to overcome the technological obstacles. COURTESY OF GENERAL BERNARD SCHRIEVER Cold War forgiveness: John von Neumann (right), a Jewish exile from Hitler’s Europe, conferring with Wernher von Braun, a former SS officer, Nazi Party member, and the führer’s V-2 missile man, during a visit to the Army’s Redstone Arsenal in Alabama. A mathematician and mathematical physicist with a mind second only to Albert Einstein’s, von Neumann headed the scientific advisory committee for the ICBM and lent the project his prestige. JOHN VON NEUMANN PAPERS, MANUSCRIPT DIVISION, LIBRARY OF CONGRESS The heartlessness of an early end: Seven months after immensely impressing Eisenhower at the July 28, 1955, White House briefing on the missile project, “Johnny” von Neumann had been driven to a wheelchair by the ravages of his cancer.

…

Chapters 72–77: Neufeld, Ballistic Missiles in the United States Air Force, 1945–1960; Heppenheimer’s Countdown; Zubok and Pleshakov, Inside the Kremlin’s Cold War; Taubman’s Khrushchev; Robert Kennedy’s 1968 Thirteen Days: A Memoir of the Cuban Missile Crisis; Fred Kaplan’s 1983 The Wizards of Armageddon; Anatoly Dobrynin’s 1995 In Confidence; Aleksandr Fursenko and Timothy Naftali’s 1997 One Hell of a Gamble: Khrushchev, Castro, and Kennedy, 1958–1964; The Kennedy Tapes: Inside the White House During the Cuban Missile Crisis, Ernest May and Philip Zelikow’s 1997 editing of the tapes of the White House meetings during the crisis; Max Frankel’s 2004 High Noon in the Cold War: Kennedy, Khrushchev and the Cuban Missile Crisis; Fursenko and Naftali’s 2006 Khrushchev’s Cold War; Michael Dobbs’s 2008 One Minute to Midnight: Kennedy, Khrushchev, and Castro on the Brink of Nuclear War; the official SAC history, The Development of Strategic Air Command; Wynn’s RAF Nuclear Deterrent Forces. Chapter 78: Leonid Brezhnev’s cynical remark to his brother is recounted in the 1995 memoir by his niece, Luba Brezhneva’s The World I Left Behind: Pieces of a Past. EPILOGUE THE SCHRIEVER LUCK Chapter 79: The John von Neumann Papers, Manuscript Division of the Library of Congress; Col. Vincent Ford’s memoir; Macrae’s John von Neumann; Poundstone’s Prisoner’s Dilemma. Chapter 80: Schriever interviews; Col. Vincent Ford’s memoir; interview with Trevor Gardner, Jr. Chapter 81: November 1, 1968, historical monograph on Army Ballistic Missile Agency; Edward Hall interview. Chapter 82: Schriever interviews; personal attendance at annual Oldtimers Reunions as an invitee; interviews with Joni James Schriever.

**The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal** by M. Mitchell Waldrop


Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, double helix, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process

Wiesner, "The Communications Sciences - Those Early Days," in R.L.E.: 1946+20 (Cambridge, Mass.: Research Laboratory of Electronics, MIT, 1966), 12. 4. Steve J. Heims, John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death (Cambridge, Mass.: MIT Press, 1980), 206. 5. Norbert Wiener, Cybernetics, or Control and Communication in the Animal and the Machine, 2d ed. (Cambridge, Mass.: MIT Press, 1961), 23. 6. Heims, Von Neumann/Wiener, 189. 7. Norbert Wiener, "A Scientist Rebels," Atlantic Monthly, January 1947, and Bulletin of the Atomic Scientists, January 1947. 8. Heims, Von Neumann/Wiener, 334–35. 9. John von Neumann and Oskar Morgenstern, Theory of Games and Economic Behavior (Princeton, N.J.: Princeton University Press, 1944). 10. Heims, Von Neumann/Wiener, 359. 11. Richard Rhodes, Dark Sun: The Making of the Hydrogen Bomb (New York: Simon & Schuster, 1995), 389. 12.

…

Winston, OH 196. BIBLIOGRAPHY BOOKS AND ARTICLES Again, the written materials listed below are only a tiny fraction of what's available on the history of computing, but they were particularly helpful to me in telling the story of Lick and the ARPA community. Aspray, William. "The Scientific Conceptualization of Information: A Survey." Annals of the History of Computing 7 (1985). -. "John von Neumann's Contributions to Computing and Computer Science." Annals of the History of Computing 11, no. 3 (1989). -. John von Neumann and the Origins of Modern Computing. Cambridge, Mass.: MIT Press, 1990. -. "The Intel 4004 Microprocessor: What Constituted Invention?" IEEE Annals of the History of Computing 19, no. 3 (1997). Augarten, Stan. Bit by Bit: An Illustrated History of Computers. New York: Ticknor & Fields, 1984. Baars, Bernard J. The Cognitive Revolution in Psychology.

…

happened to catch sight of another man stepping onto the platform—a short, plump, rather ordinary-looking fellow with an expansive forehead, a round, cheerful face, and the air of a meticulously dressed banker. Von Neumann! Goldstine was awestruck. Before his current incarnation—he was liaison officer for the army's computing substation at the University of Pennsylvania's Moore School of Engineering—Goldstine had been a Ph.D. mathematics instructor at the University of Michigan. So he already knew the legends. At age forty, John von Neumann (pronounced fon NOY-man) held a place in mathematics that could be compared only to that of Albert Einstein in physics. In the single year of 1927, for example, while still a mere instructor at the University of Berlin, von Neumann had put the newly emerging theory of quantum mechanics on a rigorous mathematical footing; established new links between formal logical systems and the foundations of mathematics; and created a whole new branch of mathematics known as game theory, a way of analyzing how people make decisions when they are competing with each other (among other things, this field gave us the term "zero-sum game").

**
Computer: A History of the Information Machine
** by
Martin Campbell-Kelly,
William Aspray,
Nathan L. Ensmenger,
Jeffrey R. Yost

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, borderless world, Buckminster Fuller, Build a better mousetrap, Byte Shop, card file, cashless society, cloud computing, combinatorial explosion, computer age, deskilling, don't be evil, Douglas Engelbart, Dynabook, fault tolerance, Fellow of the Royal Society, financial independence, Frederick Winslow Taylor, game design, garden city movement, Grace Hopper, informal economy, interchangeable parts, invention of the wheel, Jacquard loom, Jacquard loom, Jeff Bezos, jimmy wales, John von Neumann, linked data, Mark Zuckerberg, Marshall McLuhan, Menlo Park, natural language processing, Network effects, New Journalism, Norbert Wiener, Occupy movement, optical character recognition, packet switching, PageRank, pattern recognition, pirate software, popular electronics, prediction markets, pre–internet, QWERTY keyboard, RAND corporation, Robert X Cringely, Silicon Valley, Silicon Valley startup, Steve Jobs, Steven Levy, Stewart Brand, Ted Nelson, the market place, Turing machine, Vannevar Bush, Von Neumann architecture, Whole Earth Catalog, William Shockley: the traitorous eight, women in the workforce, young professional

Licklider was a consummate political operator who motivated a generation of computer scientists and obtained government funding for them to work in the fields of human-computer interaction and networked computing. COURTESY OF MIT MUSEUM. Working at the Institute for Advanced Study, Princeton, Herman Goldstine and John von Neumann introduced the “flow diagram” (above) as a way of managing the complexity of programs and communicating them to others. COURTESY OF INSTITUTE FOR ADVANCED STUDY, PRINCETON: Herman H. Goldstine and John von Neumann, Planning and Coding of Problems for an Electronic Computing Instrument, Part II, Volume 2 (1948), p. 28. Programming languages, such as FORTRAN, COBOL, and BASIC, improved the productivity of programmers and enabled nonexperts to write programs. FORTRAN, designed by IBM’s John Backus (left), was released in 1957 and for decades was the most widely used scientific programming language.

…

The developers of the most important wartime computer, the ENIAC, left their university posts to start a business building computers for the scientific and business markets. Other electrical manufacturers and business-machine companies, including IBM, also turned to this enterprise. The computer makers found a ready market in government agencies, insurance companies, and large manufacturers. The basic functional specifications of the computer were set out in a report written by John von Neumann in 1945, and these specifications are still largely followed today. However, decades of continuous innovation have followed the original conception. These innovations are of two types. One is the improvement in components, leading to faster processing speed, greater information-storage capacity, improved price/performance, better reliability, less required maintenance, and the like: today’s computers are literally millions of times better than the first computers on almost all measures of this kind.

…

Turing went on to illustrate with examples of sliding block puzzles, metal disentanglement puzzles, and knot tying. It was typical of Turing to be able to express a complex mathematical argument in terms that a nonmathematician could understand. The computability of mathematical functions would later become a cornerstone of computer science theory. Turing’s growing reputation earned him a research studentship at Princeton University to study under Alonzo Church in 1937. There he encountered John von Neumann, a founding professor of the Institute for Advanced Study at Princeton, who a few years later would play a pivotal role in the invention of the modern computer. Von Neumann was deeply interested in Turing’s work and invited him to stay on at the Institute. Turing, however, decided to return to Britain. World War II was looming, and he was enlisted in the code-breaking operation at Bletchley Park, a former mansion some fifty miles northwest of London.

**
Turing's Vision: The Birth of Computer Science
** by
Chris Bernhardt

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, British Empire, cellular automata, Claude Shannon: information theory, complexity theory, Conway's Game of Life, discrete time, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Internet Archive, Jacquard loom, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Norbert Wiener, Paul Erdős, Turing complete, Turing machine, Turing test, Von Neumann architecture

We also explain what it means to be effectively enumerable. We conclude by giving Turing’s proof that computable numbers are not effectively enumerable. Chapter 9 The final chapter describes both what happened to Turing and the computer in the years after his paper was published. It begins with Turing going to Princeton to obtain his Ph.D. under Church. This is where he gets to know John von Neumann. It then describes Turing’s move back to England and his work during the Second World War on code breaking. After this, we briefly look at how the modern computer came into existence during the forties. The progression from sophisticated calculator, to universal computer, to stored-program universal computer is outlined. In particular, we note that the stored-program concept originates with Turing’s paper.

…

This collection of cells and their connections they called a neural net. McCulloch and Pitts realized that this was a simplified model of how brains actually worked, but studied neural nets to see how logic could be handled by them. Since their nets had basic features in common with neurons and the human brain, their work, they hoped, would shed some light on logical reasoning in people. Their paper caught the attention of both John von Neumann and Norbert Wiener. Both were very impressed. Wiener, the famous American mathematician and philosopher, saw the power of feedback loops. He realized that they were widely applicable and used this idea to develop the theory of cybernetics.1 Cybernetics naturally led to the idea of machines that could learn and, in turn, led to the birth of artificial intelligence. Von Neumann recognized that McCulloch and Pitts's description of cells and their connections could also be applied to electrical components and computation.
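The cells described in the passage can be sketched in a few lines. This is an illustrative simplification (the names `mp_neuron`, `AND`, and `OR` are my own, and the original McCulloch-Pitts formalism also includes inhibitory inputs): a unit fires when enough of its binary inputs are active, which is already enough to implement logic gates, the link to computation that von Neumann saw.

```python
# A sketch of a McCulloch-Pitts-style threshold unit (simplified;
# the 1943 paper's formalism is richer, e.g. inhibitory inputs).

def mp_neuron(inputs, threshold):
    """Binary threshold unit: fire (1) iff enough inputs are active."""
    return 1 if sum(inputs) >= threshold else 0

def AND(a, b):
    # Fires only when both inputs fire.
    return mp_neuron([a, b], threshold=2)

def OR(a, b):
    # Fires when at least one input fires.
    return mp_neuron([a, b], threshold=1)

# Logic emerges from simple cells:
assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
```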

…

That we can simulate Turing machines on modern computers is not surprising. What is surprising is that we can design a Turing machine to simulate a modern computer, showing that Turing machines are equivalent in computing power to modern computers. We will sketch how this is done. The first step is to get a concrete description of the modern computer. Von Neumann Architecture Later we will talk more about John von Neumann, but it is important to know a few facts before we proceed. The First Draft of a Report on the EDVAC is probably the most important paper on the design of modern computers. It was written in 1945, as the first electronic computers were being built. It described the basic outline of how a computer should be designed, incorporating what had been learned from the design of earlier machines. This paper has a very different point of view to that of Turing’s paper.
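The unsurprising direction of the equivalence noted in the passage (a modern computer simulating a Turing machine) takes only a few lines of code. The generic simulator and the tiny binary-increment machine below are my own illustrative sketch, not the book's construction.

```python
# Illustrative Turing-machine simulator; run_tm and INC are my own
# example names, not from the book.

def run_tm(transitions, tape, state, head, blank="_", max_steps=10_000):
    """Run a Turing machine. transitions maps
    (state, symbol) -> (new_state, written_symbol, move in {-1, 0, +1})."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# A machine that increments a binary number, head starting on the
# rightmost bit and carrying to the left:
INC = {
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, keep carrying
    ("carry", "0"): ("halt",  "1",  0),  # 0 + carry = 1, done
    ("carry", "_"): ("halt",  "1",  0),  # ran off the left edge: new digit
}

print(run_tm(INC, "1011", "carry", head=3))  # 1100
```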

**
Rise of the Machines: A Cybernetic History
** by
Thomas Rid

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

1960s counterculture, A Declaration of the Independence of Cyberspace, agricultural Revolution, Albert Einstein, Alistair Cooke, Apple II, Apple's 1984 Super Bowl advert, back-to-the-land, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, business intelligence, Claude Shannon: information theory, conceptual framework, connected car, domain-specific language, Douglas Engelbart, dumpster diving, Extropian, full employment, game design, global village, Haight Ashbury, Howard Rheingold, Jaron Lanier, job automation, John von Neumann, Kevin Kelly, Marshall McLuhan, Menlo Park, Mother of all demos, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shale / tar sands, pattern recognition, RAND corporation, Silicon Valley, Simon Singh, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Telecommunications Act of 1996, telepresence, V2 rocket, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, Y2K, Yom Kippur War, Zimmermann PGP

Woirol, The Technological Unemployment and Structural Unemployment Debates (Westport, CT: Greenwood, 1996), 78. 75.David Fouquet, “Automation Held Threat to US Value Code,” Washington Post, May 12, 1964, A24. 76.Diebold, Beyond Automation, 10. 77.Fouquet, “Automation Held Threat.” 78.Diebold, Automation, 170. 79.Diebold, Beyond Automation, 206. 80.Kahn left Rand before publishing The Year 2000. 81.Herman Kahn and Anthony Wiener, The Year 2000: A Framework for Speculation on the Next Thirty-Three Years (London: Macmillan, 1967), 350. 4. ORGANISMS 1.Pesi Masani, Norbert Wiener, 1894–1964 (Basel: Birkhäuser, 1990), 225. 2.Paul E. Ceruzzi, A History of Modern Computing (Cambridge, MA: MIT Press, 2003), 21. 3.Masani, Norbert Wiener, 184. 4.John von Neumann to Norbert Wiener, November 29, 1946, in ibid., 243. 5.John von Neumann, Theory of Self-Reproducing Automata, ed. Arthur W. Burks (Urbana: University of Illinois Press, 1966), fifth lecture, 78. 6.Ibid., 79. 7.Ibid., 86. 8.Ibid., 87. 9.Edward F. Moore, “Artificial Living Plants,” Scientific American, October 1956, 118–26. 10.Ibid., 118. 11.Ibid., 119. 12.Ibid., 121. 13.Ibid., 122. 14.Ibid., 126. 15.David R. Francis, “Self-Producing Machines,” Christian Science Monitor, June 2, 1961, 16. 16.Ibid. 17.Norbert Wiener, “Some Moral and Technical Consequences of Automation,” Science 131, no. 3410 (May 6, 1960): 1355. 18.Arthur C.

…

Licklider, “Topics for Discussion at the Forthcoming Meeting,” Memorandum for Affiliates of the Intergalactic Computer Network, Advanced Research Projects Agency, Washington, DC, April 25, 1963. 86.Dan van der Vat, “Jack Good,” Guardian, April 29, 2009. 87.Irving J. Good, “Speculations concerning the First Ultraintelligent Machine,” Advances in Computers 6 (1965): 31–88. 88.John von Neumann discussed the effects of ever-accelerating technological progress with colleagues. In one such discussion, he allegedly said that humankind is approaching an essential “singularity” after which human affairs will be altered forever. See the recollection of Stanislaw Ulam: “Tribute to John von Neumann,” Bulletin of the American Mathematical Society 64, no. 3 (1958): 5. 89.Vernor Vinge, “First Word,” Omni 5, no. 1 (January 1983): 10. For Vinge’s weak scientific output, see his Google Scholar profile, http://bit.ly/vinge-scholar+. 90.Jürgen Kraus, “Selbstreproduktion bei Programmen” (master’s thesis, Universität Dortmund, Abteilung Informatik, 1980). 91.Ibid., 2. 92.Ibid., 154. 93.Ibid., 161. 94.Ibid., 160. 95.Ronald R.

…

Constructing and operating more and more complex machines would help scientists understand how the brain itself operated. And there was no reason why the theory could not be applied to all complex systems. Many leading minds in engineering, mathematics, biology, and psychology, but also sociology, philosophy, anthropology, and political science, would initially be drawn to the new thinking of adaptive systems. The best-known early cyberneticists were the mathematician John von Neumann, a fellow polyglot, computer pioneer, and prominent professor at the Institute for Advanced Study in Princeton still in his early forties, nine years younger than Wiener; the American neurophysiologist and neural networks pioneer Warren McCulloch; the Austrian American physicist Heinz von Foerster; and the Mexican physician Arturo Rosenblueth, one of Wiener’s closest friends and collaborators.

**
The Information: A History, a Theory, a Flood
** by
James Gleick

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, bank run, bioinformatics, Brownian motion, butterfly effect, citation needed, Claude Shannon: information theory, clockwork universe, computer age, conceptual framework, crowdsourcing, death of newspapers, discovery of DNA, double helix, Douglas Hofstadter, en.wikipedia.org, Eratosthenes, Fellow of the Royal Society, Gödel, Escher, Bach, Henri Poincaré, Honoré de Balzac, index card, informal economy, information retrieval, invention of the printing press, invention of writing, Isaac Newton, Jacquard loom, Jacquard loom, Jaron Lanier, jimmy wales, John von Neumann, Joseph-Marie Jacquard, Louis Daguerre, Marshall McLuhan, Menlo Park, microbiome, Milgram experiment, Network effects, New Journalism, Norbert Wiener, On the Economy of Machinery and Manufactures, PageRank, pattern recognition, phenotype, pre–internet, Ralph Waldo Emerson, RAND corporation, reversible computing, Richard Feynman, Richard Feynman, Simon Singh, Socratic dialogue, Stephen Hawking, Steven Pinker, stochastic process, talking drums, the High Line, The Wisdom of Crowds, transcontinental railway, Turing machine, Turing test, women in the workforce

♦ “ONE CAN PROVE ANY THEOREM”: Kurt Gödel, “On Formally Undecidable Propositions of Principia Mathematica and Related Systems I” (1931), 145. ♦ “CONTRARY TO APPEARANCES, SUCH A PROPOSITION”: Ibid., 151 n15. ♦ “AMAZING FACT”—“THAT OUR LOGICAL INTUITIONS”: Kurt Gödel, “Russell’s Mathematical Logic” (1944), 124. ♦ “A SUDDEN THUNDERBOLT FROM THE BLUEST OF SKIES”: Douglas R. Hofstadter, I Am a Strange Loop, 166. ♦ “THE IMPORTANT POINT”: John von Neumann, “Tribute to Dr. Gödel” (1951), quoted in Steve J. Heims, John von Neumann and Norbert Wiener (Cambridge, Mass.: MIT Press, 1980), 133. ♦ “IT MADE ME GLAD”: Russell to Leon Henkin, 1 April 1963. ♦ “MATHEMATICS CANNOT BE INCOMPLETE”: Ludwig Wittgenstein, Remarks on the Foundations of Mathematics (Cambridge, Mass.: MIT Press, 1967), 158. ♦ “RUSSELL EVIDENTLY MISINTERPRETS MY RESULT”: Gödel to Abraham Robinson, 2 July 1973, in Kurt Gödel: Collected Works, vol. 5, 201

…

They could represent any form of knowledge. Gödel’s first public mention of his discovery, on the third and last day of a philosophical conference in Königsberg in 1930, drew no response; only one person seems to have heard him at all, a Hungarian named Neumann János. This young mathematician was in the process of moving to the United States, where he would soon and for the rest of his life be called John von Neumann. He understood Gödel’s import at once; it stunned him, but he studied it and was persuaded. No sooner did Gödel’s paper appear than von Neumann was presenting it to the mathematics colloquium at Princeton. Incompleteness was real. It meant that mathematics could never be proved free of self-contradiction. And “the important point,” von Neumann said, “is that this is not a philosophical principle or a plausible intellectual attitude, but the result of a rigorous mathematical proof of an extremely sophisticated kind.”♦ Either you believed in mathematics or you did not.

…

.”♦ On the other hand, Vienna’s most famous philosopher, Ludwig Wittgenstein (who, fundamentally, did not), dismissed the incompleteness theorem as trickery (“Kunststücken”) and boasted that rather than try to refute it, he would simply pass it by: Mathematics cannot be incomplete; any more than a sense can be incomplete. Whatever I can understand, I must completely understand.♦ Gödel’s retort took care of them both. “Russell evidently misinterprets my result; however, he does so in a very interesting manner,” he wrote. “In contradistinction Wittgenstein … advances a completely trivial and uninteresting misinterpretation.”♦ In 1933 the newly formed Institute for Advanced Study, with John von Neumann and Albert Einstein among its first faculty members, invited Gödel to Princeton for the year. He crossed the Atlantic several more times that decade, as fascism rose and the brief glory of Vienna began to fade. Gödel, ignorant of politics and naïve about history, suffered depressive breakdowns and bouts of hypochondria that forced him into sanatoria. Princeton beckoned but Gödel vacillated.

**
How to Create a Mind: The Secret of Human Thought Revealed
** by
Ray Kurzweil

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Albert Michelson, anesthesia awareness, anthropic principle, brain emulation, cellular automata, Claude Shannon: information theory, cloud computing, computer age, Dean Kamen, discovery of DNA, double helix, en.wikipedia.org, epigenetics, George Gilder, Google Earth, Isaac Newton, iterative process, Jacquard loom, Jacquard loom, John von Neumann, Law of Accelerating Returns, linear programming, Loebner Prize, mandelbrot fractal, Norbert Wiener, optical character recognition, pattern recognition, Peter Thiel, Ralph Waldo Emerson, random walk, Ray Kurzweil, reversible computing, self-driving car, speech recognition, Steven Pinker, strong AI, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Wall-E, Watson beat the top human players on Jeopardy!, X Prize

Chapter 8: The Mind as Computer 1. Salomon Bochner, A Biographical Memoir of John von Neumann (Washington, DC: National Academy of Sciences, 1958). 2. A. M. Turing, “On Computable Numbers, with an Application to the Entscheidungsproblem,” Proceedings of the London Mathematical Society Series 2, vol. 42 (1936–37): 230–65, http://www.comlab.ox.ac.uk/activities/ieg/e-library/sources/tp2-ie.pdf. A. M. Turing, “On Computable Numbers, with an Application to the Entscheidungsproblem: A Correction,” Proceedings of the London Mathematical Society 43 (1938): 544–46. 3. John von Neumann, “First Draft of a Report on the EDVAC,” Moore School of Electrical Engineering, University of Pennsylvania, June 30, 1945. Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, July and October 1948. 4.

…

—Diane Ackerman

Brains exist because the distribution of resources necessary for survival and the hazards that threaten survival vary in space and time. —John M. Allman

The modern geography of the brain has a deliciously antiquated feel to it—rather like a medieval map with the known world encircled by terra incognita where monsters roam. —David Bainbridge

In mathematics you don’t understand things. You just get used to them. —John von Neumann

Ever since the emergence of the computer in the middle of the twentieth century, there has been ongoing debate not only about the ultimate extent of its abilities but about whether the human brain itself could be considered a form of computer. As far as the latter question was concerned, the consensus has veered from viewing these two kinds of information-processing entities as being essentially the same to their being fundamentally different.

…

The basic idea is that the human brain is likewise subject to natural law, and thus its information-processing ability cannot exceed that of a machine (and therefore of a Turing machine). We can properly credit Turing with establishing the theoretical foundation of computation with his 1936 paper, but it is important to note that he was deeply influenced by a lecture that Hungarian American mathematician John von Neumann (1903–1957) gave in Cambridge in 1935 on his stored program concept, a concept enshrined in the Turing machine.1 In turn, von Neumann was influenced by Turing’s 1936 paper, which elegantly laid out the principles of computation, and made it required reading for his colleagues in the late 1930s and early 1940s.2 In the same paper Turing reports another unexpected discovery: that of unsolvable problems.

**
In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence
** by
George Zarkadakis

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, British Empire, business process, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, complexity theory, continuous integration, Conway's Game of Life, cosmological principle, dark matter, dematerialisation, double helix, Douglas Hofstadter, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, income inequality, index card, industrial robot, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, millennium bug, natural language processing, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, post-industrial society, prediction markets, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, speech recognition, stem cell, Stephen Hawking, Steven Pinker, strong AI, technological singularity, The Coming Technological Singularity, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

Cybernetics as a field grew out of these interdisciplinary meetings, held from 1946 until 1953, which brought together a number of notable post-war intellectuals, including Norbert Wiener, John von Neumann, Warren McCulloch, Claude Shannon, Heinz von Foerster and W. Ross Ashby. From its original focus on machines and animals, cybernetics quickly broadened in scope to encompass the workings of the mind (e.g. in the work of Bateson and Ashby) as well as social systems (e.g. Stafford Beer’s management cybernetics), thus rediscovering Plato’s original focus on the control relations in society. I will return to the very interesting connection of cybernetics, Plato and global governance later in the book. For now, I want to focus on four individuals who took part in the Macy Conferences, and whose work laid the foundations for Artificial Intelligence: Norbert Wiener, Claude Shannon, Warren McCulloch and John von Neumann. We have already met the first two.

…

At each of these bifurcations, new functionalities emerge. Finally, there comes a tipping point, where global change happens and the artificial, agent-based system becomes intelligent, in a similar fashion to the neuron-based human brain.6 The fourth cybernetician godfather of Artificial Intelligence, who also took part in the Macy Conferences, was the legendary Hungarian-American mathematician John von Neumann (1903–1957). He was the modern equivalent of Gottfried Leibniz, a polymath who made fundamental contributions to several sciences including mathematics, computing, cybernetics, logic, economics and quantum physics – to name but a few! His last work, before his untimely death at the age of fifty-three, was an unfinished manuscript entitled ‘The Computer and the Brain’, which shows how deeply interested von Neumann had become in the nascent science of Artificial Intelligence.7 During the time he participated in the Macy Conferences, von Neumann expanded on his theory of self-replicating automata.

…

Meanwhile, on the other side of the Atlantic, the Americans were also developing their own electronic computational machines, culminating in the design and construction of ENIAC (1946),6 an electronic, programmable and Turing-complete machine that was used for artillery-timing tables for the US Army’s Ballistic Research Laboratory. ENIAC represented a watershed in computer history. Its architecture would form the foundation of all modern computing. The key figure in generalising ENIAC’s architecture was John von Neumann, who at the time was involved in the Manhattan Project at the Los Alamos National Laboratory in New Mexico. Von Neumann was fascinated by the design of ENIAC, and wondered how the computer might be easily reprogrammed to perform a different set of operations – not involving artillery ballistics this time, but to predict the results of a hydrogen bomb explosion. Invited by the team that developed ENIAC to advise them, von Neumann produced a landmark report,7 which described a machine that could store both data and programs.8 The ‘von Neumann architecture’ – as it has been known ever since – demonstrated how computers could be reprogrammed easily.
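The stored-program idea described in the passage can be illustrated with a toy machine (my own sketch, an assumption for illustration, not von Neumann's actual EDVAC design): instructions and data occupy one memory, so "reprogramming" just means writing different values into that memory, with no rewiring.

```python
# Toy stored-program accumulator machine (illustrative only).
# Each instruction is a (opcode, address) pair stored in the same
# memory list as the data it operates on.

def run(memory):
    """Execute until HALT; opcodes: LOAD, ADD, STORE, HALT."""
    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[addr]      # fetch a data cell into the accumulator
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc      # write the accumulator back to memory
        elif op == "HALT":
            return memory

# Program (cells 0-3) and data (cells 4-6) share one memory:
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(mem)[6])  # 5

# Changing the task needs no rewiring, only different memory contents:
mem2 = [("LOAD", 4), ("ADD", 4), ("STORE", 6), ("HALT", 0), 10, 0, 0]
print(run(mem2)[6])  # 20
```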

**
From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism
** by
Fred Turner

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

1960s counterculture, A Declaration of the Independence of Cyberspace, Apple's 1984 Super Bowl advert, back-to-the-land, bioinformatics, Buckminster Fuller, Claude Shannon: information theory, complexity theory, computer age, conceptual framework, Danny Hillis, dematerialisation, distributed generation, Douglas Engelbart, Dynabook, From Mathematics to the Technologies of Life and Death, future of work, game design, George Gilder, global village, Golden Gate Park, Hacker Ethic, Haight Ashbury, hive mind, Howard Rheingold, informal economy, invisible hand, Jaron Lanier, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, market bubble, Marshall McLuhan, means of production, Menlo Park, Mother of all demos, new economy, Norbert Wiener, post-industrial society, postindustrial economy, Productivity paradox, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, Richard Stallman, Robert Shiller, Robert Shiller, Ronald Reagan, Silicon Valley, Silicon Valley ideology, South of Market, San Francisco, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Ted Nelson, Telecommunications Act of 1996, theory of mind, urban renewal, Vannevar Bush, Whole Earth Catalog, Whole Earth Review, Yom Kippur War

As several historians have pointed out, the “systems” approach taken by cybernetics predated the invention of the term itself by a little more than a decade. In 1928, for instance, John Von Neumann published his “Theory of Parlor Games,” thus inventing game theory. Heims, John Von Neumann and Norbert Wiener, 84. In the 1930s in England, Robert Lilienfeld has argued, the invention of radar led to the need for the coordination of machines and thus the invention of the “total point of view” characteristic of systems thinking. Lilienfeld, Rise of Systems Theory, 103. Cybernetics emerged as a self-consciously comprehensive field of thought, however, with the work of Norbert Wiener. For a fuller account of Wiener’s career and the emergence of his cybernetics, see also Galison, “Ontology of the Enemy”; and Hayles, How We Became Posthuman. 29. Wiener, Cybernetics, 8. 30. Ibid., 9. 31. Heims, John Von Neumann and Norbert Wiener, 182–88. For a chronicle of Wiener’s shifting relationship to the Rad Lab, see Conway and Siegelman, Dark Hero of the Information Age, 115–25. 32.

…

As in its profile of the Electronic Frontier Foundation, Wired had offered the freelance lifestyle of a high-profile consultant as a model of the independent lifestyle ostensibly becoming available to the digital generation as a whole. A month after Kelly’s first interview with Gilder appeared in Wired, Paulina Borsook published a similar profile of Dyson. The story moved swiftly through Dyson’s biography—child of physicist Freeman Dyson, childhood friend of Alice Bigelow (daughter of Julian Bigelow, John Von Neumann’s engineer), former Forbes reporter and Wall Street stock analyst, later editor of the newsletter Release 1.0 and hostess of the annual PC Forum conference. When it came to Dyson’s present career, however, the story slipped into information-system metaphors like those that appeared in Bronson’s profile of Gilder. Like Bronson’s Gilder, Borsook’s Dyson moved from company to company, “assimilating new ideas and retelling them to those who can turn them into wealth.”

…

33. For a critical analysis of this choice, and especially its relationship to conceptions of the Other in contemporary cultural theory, see Galison, “Ontology of the Enemy.” 34. Wiener, I Am a Mathematician, 252. 35. For Wiener, as Peter Galison put it, “Servomechanical theory would become the measure of man.” Galison, “Ontology of the Enemy,” 240. 36. Heims, John Von Neumann and Norbert Wiener, 184. 37. Rosenblueth, Wiener, and Bigelow, “Behavior, Purpose, and Teleology”; Galison, “Ontology of the Enemy,” 247; Wiener, Cybernetics, 15, 21. Shannon published his theory in a 1948 article, “Mathematical Theory of Communication.” Shannon’s theories rose to public prominence through his 1949 collaboration with Warren Weaver in The Mathematical Theory of Communication.

**
The Myth of the Rational Market: A History of Risk, Reward, and Delusion on Wall Street
** by
Justin Fox

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andrei Shleifer, asset allocation, asset-backed security, bank run, Benoit Mandelbrot, Black-Scholes formula, Bretton Woods, Brownian motion, capital asset pricing model, card file, Cass Sunstein, collateralized debt obligation, complexity theory, corporate governance, Credit Default Swap, credit default swaps / collateralized debt obligations, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, discovery of the americas, diversification, diversified portfolio, Edward Glaeser, endowment effect, Eugene Fama: efficient market hypothesis, experimental economics, financial innovation, Financial Instability Hypothesis, floating exchange rates, George Akerlof, Henri Poincaré, Hyman Minsky, implied volatility, impulse control, index arbitrage, index card, index fund, invisible hand, Isaac Newton, John Nash: game theory, John von Neumann, joint-stock company, Joseph Schumpeter, libertarian paternalism, linear programming, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, market bubble, market design, New Journalism, Nikolai Kondratiev, Paul Lévy, pension reform, performance metric, Ponzi scheme, prediction markets, pushing on a string, quantitative trading / quantitative ﬁnance, Ralph Nader, RAND corporation, random walk, Richard Thaler, risk/return, road to serfdom, Robert Shiller, Robert Shiller, rolodex, Ronald Reagan, shareholder value, Sharpe ratio, short selling, side project, Silicon Valley, South Sea Bubble, statistical model, The Chicago School, The Myth of the Rational Market, The Predators' Ball, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Thomas L Friedman, Thorstein Veblen, Tobin tax, transaction costs, tulip mania, value at risk, Vanguard fund, volatility smile, Yogi Berra

He saw no hint of it in the work of his fellow economists, of whom he grew increasingly disdainful. The developments of the late 1930s, in which young Keynesians grafted a few kludgy imperfect-foresight formulas onto the body of perfect-foresight mathematical economics, aggravated him. He began consorting with the scientists and mathematicians of Vienna, one of whom steered him toward a 1928 paper about poker written by Hungarian mathematician John von Neumann.4 After emigrating to the United States in 1930, von Neumann became the brightest intellectual light at Princeton’s Institute for Advanced Study, a place that also employed Albert Einstein. He helped plan the Battle of the Atlantic, design the atomic bomb, and invent the computer. In the late 1950s, dying of bone cancer likely brought on by witnessing one too many atomic test blasts, he peddled his doctrine of nuclear brinksmanship while rolling his wheelchair down the halls of power in Washington—providing at least part of the inspiration for Stanley Kubrick’s Dr.

…

“The diagram had nothing to do with income distribution; it represented eight years of cotton prices.”11 Enthralled, Mandelbrot began sharing his discovery in visits to other universities. At the University of Chicago, he found an enthusiastic follower in Eugene Fama, another student of Harry Roberts studying market movements. Holbrook Working rejoined the fray with a paper showing that Alfred Cowles’s 1937 finding of patterns in stock movements was largely the result of a statistical error.12 Oskar Morgenstern chipped in, too. His friend John von Neumann had suggested before he died in 1957 that Morgenstern use a statistical technique called spectral analysis, helpful in distinguishing between true cycles and randomly generated ones, to examine economic data. Morgenstern wasn’t enough of a mathematician to do this himself, but he hired young British statistician Clive Granger and put him to work examining stock prices. In 1963, Morgenstern and Granger published a paper confirming that, according to their tests, stock prices moved in a short-term random walk (over the longer run, the movements didn’t look quite so random).13 Morgenstern had connections at Fortune that dated back to the magazine’s coverage of game theory fifteen years before, and his was thus the first of the random walk papers to receive attention in the mainstream press.

…

Even before that happened, scholars on multiple campuses were making it clear that, in theory, it would be awfully convenient if speculative markets functioned perfectly.

CHAPTER 5
MODIGLIANI AND MILLER ARRIVE AT A SIMPLIFYING ASSUMPTION

Finance, the business school version of economics, is transformed from a field of empirical research and rules of thumb to one ruled by theory.

Four years after John von Neumann and Oskar Morgenstern published their equation-filled guide to weighing potential rewards and losses in an uncertain future, economist Milton Friedman and statistician Jimmie Savage made a startling proposal. With just a few tweaks, they wrote, the von Neumann-Morgenstern utility theory could describe the way real people made economic decisions. At the very least, they argued, “individuals behave as if they calculated and compared expected utility and as if they knew the odds.”
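The "as if" calculation Friedman and Savage describe can be sketched with von Neumann-Morgenstern expected utility. The lottery payoffs and the logarithmic utility function below are my illustrative assumptions, not anything from the text:

```python
import math

def expected_utility(lottery, utility):
    """Weigh each possible outcome by its odds: sum of p * u(payoff)."""
    return sum(p * utility(payoff) for p, payoff in lottery)

# A risk-averse utility function: diminishing marginal utility of wealth.
u = math.log

sure_thing = [(1.0, 100)]             # $100 for certain
gamble     = [(0.5, 200), (0.5, 40)]  # coin flip with a HIGHER mean: $120

# Log utility prefers the certain $100, even though the gamble's
# expected value is $120: risk aversion falls out of the curvature of u.
prefers_certainty = expected_utility(sure_thing, u) > expected_utility(gamble, u)
```

A decision maker who "knows the odds" simply picks whichever lottery has the larger expected utility; changing the curvature of `u` changes the choice.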

**
The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise
** by
Nathan L. Ensmenger

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

barriers to entry, business process, Claude Shannon: information theory, computer age, deskilling, Firefox, Frederick Winslow Taylor, future of work, Grace Hopper, informal economy, information retrieval, interchangeable parts, Isaac Newton, Jacquard loom, job satisfaction, John von Neumann, knowledge worker, loose coupling, new economy, Norbert Wiener, pattern recognition, performance metric, post-industrial society, Productivity paradox, RAND corporation, Robert Gordon, sorting algorithm, Steve Jobs, Steven Levy, the market place, Thomas Kuhn: the structure of scientific revolutions, Thorstein Veblen, Turing machine, Von Neumann architecture, Y2K

The Computer Boys Take Over

History of Computing
William Aspray, editor

Jon Agar, The Government Machine: A Revolutionary History of the Computer
William Aspray, John von Neumann and the Origins of Modern Computing
William Aspray and Paul E. Ceruzzi, editors, The Internet and American Business
Charles J. Bashe, Lyle R. Johnson, John H. Palmer, and Emerson W. Pugh, IBM’s Early Computers
Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry
Paul E. Ceruzzi, A History of Modern Computing
I. Bernard Cohen, Howard Aiken: Portrait of a Computer Pioneer
I. Bernard Cohen and Gregory W. Welch, editors, Makin’ Numbers: Howard Aiken and the Computer
Nathan Ensmenger, The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise
John Hendry, Innovating for Failure: Government Policy and the Early British Computer Industry
Michael Lindgren, Glory and Failure: The Difference Engines of Johann Müller, Charles Babbage, and Georg and Edvard Scheutz
David E.

…

The ENIAC women would simply set up the machine to perform these predetermined plans; that this work would turn out to be difficult and require radically innovative thinking was completely unanticipated.32 The telephone switchboardlike appearance of the ENIAC programming cable-and-plug panels reinforced the notion that programmers were mere machine operators, that programming was more handicraft than science, more feminine than masculine, more mechanical than intellectual. The idea that the development of hardware was the real business of computing, and that software was at best secondary, persisted throughout the 1940s and early 1950s. In the first textbooks on computing published in the United States, for example, John von Neumann and Herman Goldstine outlined a clear division of labor in computing—presumably based on their experience with the ENIAC project—that clearly distinguished between the headwork of the (male) scientist or “planner,” and the handwork of the (largely female) “coder.” In the von Neumann and Goldstine schema, the planner did the intellectual work of analysis and the coder simply translated this work into a form that a computer could understand.

…

The struggle between theory and practice would become a major challenge for academics and practitioners alike, and would reflect itself in the structure of programming languages, professional societies, and academic curricula. Conventional histories of computer programming tend to conflate programming as a vocational activity with computer science as an academic discipline. In many of these accounts, programming is represented as a subdiscipline of formal logic and mathematics, and its origins are identified in the writings of early computer theorists Alan Turing and John von Neumann. The development of the discipline is evaluated in terms of advances in programming languages, formal methods, and generally applicable theoretical research. This purely intellectual approach to the history of programming, however, conceals the essentially craftlike nature of early programming practice. The first computer programmers were not scientists or mathematicians; they were low-status, female clerical workers and desktop calculator operators.

**
Algorithms to Live By: The Computer Science of Human Decisions
** by
Brian Christian,
Tom Griffiths

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, algorithmic trading, anthropic principle, asset allocation, autonomous vehicles, Berlin Wall, Bill Duvall, bitcoin, Community Supported Agriculture, complexity theory, constrained optimization, cosmological principle, cryptocurrency, Danny Hillis, delayed gratification, dematerialisation, diversification, double helix, Elon Musk, fault tolerance, Fellow of the Royal Society, Firefox, first-price auction, Flash crash, Frederick Winslow Taylor, George Akerlof, global supply chain, Google Chrome, Henri Poincaré, information retrieval, Internet Archive, Jeff Bezos, John Nash: game theory, John von Neumann, knapsack problem, Lao Tzu, linear programming, martingale, Nash equilibrium, natural language processing, NP-complete, P = NP, packet switching, prediction markets, race to the bottom, RAND corporation, RFC: Request For Comment, Robert X Cringely, sealed-bid auction, second-price auction, self-driving car, Silicon Valley, Skype, sorting algorithm, spectrum auction, Steve Jobs, stochastic process, Thomas Malthus, traveling salesman, Turing machine, urban planning, Vickrey auction, Walter Mischel, Y Combinator

In 1936, IBM began producing a line of machines called “collators” that could merge two separately ordered stacks of cards into one. As long as the two stacks were themselves sorted, the procedure of merging them into a single sorted stack was incredibly straightforward and took linear time: simply compare the two top cards to each other, move the smaller of them to the new stack you’re creating, and repeat until finished. The program that John von Neumann wrote in 1945 to demonstrate the power of the stored-program computer took the idea of collating to its beautiful and ultimate conclusion. Sorting two cards is simple: just put the smaller one on top. And given a pair of two-card stacks, both of them sorted, you can easily collate them into an ordered stack of four. Repeating this trick a few times, you’d build bigger and bigger stacks, each one of them already sorted.
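The collating procedure the passage describes fits in a few lines. This is a minimal sketch of the idea, not von Neumann's actual 1945 program; the function names are mine:

```python
def merge(left, right):
    """Collate two already-sorted stacks into one sorted stack, in linear time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        # Compare the two top cards; move the smaller onto the new stack.
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # one stack is exhausted;
    out.extend(right[j:])  # the remainder of the other is already in order
    return out

def mergesort(cards):
    """Collating taken to its conclusion: split, sort each half, then merge."""
    if len(cards) <= 1:
        return list(cards)  # a stack of 0 or 1 cards is already sorted
    mid = len(cards) // 2
    return merge(mergesort(cards[:mid]), mergesort(cards[mid:]))
```

Repeating the trick doubles the size of the sorted stacks at each pass, which is why the whole sort takes only about n log n comparisons; the merging step itself is what we now call mergesort's heart.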

…

The computer industry is currently in transition from hard disk drives to solid-state drives; at the same price point, a hard disk will offer dramatically greater capacity, but a solid-state drive will offer dramatically better performance—as most consumers now know, or soon discover when they begin to shop. What casual consumers may not know is that this exact tradeoff is being made within the machine itself at a number of scales—to the point where it’s considered one of the fundamental principles of computing. In 1946, Arthur Burks, Herman Goldstine, and John von Neumann, working at the Institute for Advanced Study in Princeton, laid out a design proposal for what they called an electrical “memory organ.” In an ideal world, they wrote, the machine would of course have limitless quantities of lightning-fast storage, but in practice this wasn’t possible. (It still isn’t.) Instead, the trio proposed what they believed to be the next best thing: “a hierarchy of memories, each of which has greater capacity than the preceding but which is less quickly accessible.”
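The hierarchy principle can be illustrated with a toy two-level "memory organ": a small, fast cache sitting in front of a large, slow backing store. Everything here (class name, sizes, eviction policy) is an illustrative assumption, not the Burks-Goldstine-von Neumann design:

```python
from collections import OrderedDict

class TwoLevelStore:
    """Toy memory hierarchy: a small fast cache over a large slow store."""

    def __init__(self, backing, cache_size=4):
        self.backing = backing         # large capacity, "slow" access
        self.cache = OrderedDict()     # small capacity, "fast" access
        self.cache_size = cache_size
        self.hits = self.misses = 0

    def read(self, key):
        if key in self.cache:
            self.hits += 1
            self.cache.move_to_end(key)      # mark as most recently used
            return self.cache[key]
        self.misses += 1                     # fall through to the slow level
        value = self.backing[key]
        self.cache[key] = value
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)   # evict the least recently used
        return value
```

Reads that hit the small level are cheap; only misses pay the cost of the big level, which is exactly the capacity-versus-speed trade the 1946 proposal describes.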

…

A nuclear reaction is a branching process, where possibilities multiply just as wildly as they do in cards: one particle splits in two, each of which may go on to strike others, causing them to split in turn, and so on. Exactly calculating the chances of some particular outcome of that process, with many, many particles interacting, is hard to the point of impossibility. But simulating it, with each interaction being like turning over a new card, provides an alternative. Ulam developed the idea further with John von Neumann, and worked with Nicholas Metropolis, another of the physicists from the Manhattan Project, on implementing the method on the Los Alamos computer. Metropolis named this approach—replacing exhaustive probability calculations with sample simulations—the Monte Carlo Method, after the Monte Carlo casino in Monaco, a place equally dependent on the vagaries of chance. The Los Alamos team was able to use it to solve key problems in nuclear physics.
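The contrast between exact calculation and sampling can be shown on a toy branching process: each particle either splits into two (probability 1/2) or is absorbed (probability 1/2). This is my illustration of the Monte Carlo idea, not the Los Alamos computation:

```python
import random

def simulate_extinct_by(gen, rng):
    """One Monte Carlo trial: does the population die out within `gen` generations?"""
    population = 1
    for _ in range(gen):
        # Each particle independently splits into 2 with probability 0.5.
        population = sum(2 for _ in range(population) if rng.random() < 0.5)
        if population == 0:
            return True
    return False

def exact_extinct_by(gen):
    """Exact answer, iterating the offspring generating function f(s) = 0.5 + 0.5 s^2."""
    q = 0.0
    for _ in range(gen):
        q = 0.5 + 0.5 * q * q
    return q

rng = random.Random(0)              # fixed seed for reproducibility
trials = 40000
estimate = sum(simulate_extinct_by(3, rng) for _ in range(trials)) / trials
exact = exact_extinct_by(3)         # 0.6953125
```

For three generations the exact recursion is still easy; with many generations and many interacting particles it is not, while the simulation's cost grows only with the number of trials, each one "like turning over a new card."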

**
The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age
** by
Paul J. Nahin

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Claude Shannon: information theory, conceptual framework, Fellow of the Royal Society, finite state, four colour theorem, Georg Cantor, Grace Hopper, Isaac Newton, John von Neumann, knapsack problem, New Journalism, reversible computing, Richard Feynman, Schrödinger's Cat, Steve Jobs, Steve Wozniak, thinkpad, Turing machine, Turing test, V2 rocket

One could calculate, if one wanted to … the kind of population you would have after a number of generations.”10 Both Burks and Bush strongly urged Shannon to publish, but he had lost interest in the topic, and, besides, he had other, more urgent matters that demanded his attention. With his PhD in hand, and after spending the summer of 1940 back at Bell Labs, Shannon used a National Research Council Fellowship for a year’s stay at the Institute for Advanced Study in Princeton, New Jersey, where he worked under the great mathematician Hermann Weyl. Also there were such luminaries as John von Neumann and Albert Einstein. He might even have bumped into Richard Feynman, who was working on his PhD in physics at Princeton. Also there with Shannon was his first wife, Norma Levor (born 1920), whom he had married in 1939. Theirs was an intense, passionate, but ultimately doomed brief marriage, and Norma left him in June 1941. With all that going on in his life, it isn’t surprising that writing up his doctoral dissertation wasn’t high on Shannon’s list of things to do.

…

So, the answer is that the additional relay E does not result in improved reliability.8

6.5 MAJORITY LOGIC

In this final section I’ll comment just a bit on what sparked Shannon’s interest in building more reliable circuits out of less reliable components. In 1956 Shannon was coeditor of an anthology of technical papers, one of which was authored by the great Hungarian-born American mathematician John von Neumann (1903–1957). Shannon read that paper, titled “Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components,” as an editor long before the anthology appeared, and in his 1956 “crummy relay” paper he specifically cited von Neumann’s earlier work.9 Von Neumann’s paper is heavily oriented toward mimicking the fundamental component of the human brain, the neuron cell that “fires” (that is, produces an output) when its multiple inputs (the outputs of other neurons) satisfy certain requirements.
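Von Neumann's redundancy idea can be shown in miniature: triplicate an unreliable gate and take a majority vote of the three outputs. This is a toy sketch of the argument, not his full multiplexing construction:

```python
def majority(a, b, c):
    """Majority vote of three bits: output 1 iff at least two inputs are 1."""
    return (a & b) | (a & c) | (b & c)

def failure_after_voting(p):
    """Probability the voted output is wrong, given that each of the three
    independent copies is wrong with probability p: at least 2 of 3 must fail."""
    return 3 * p**2 * (1 - p) + p**3

# Redundancy helps only when each component is better than a coin flip:
improved = failure_after_voting(0.1)   # 0.028, down from 0.1
worsened = failure_after_voting(0.6)   # 0.648, up from 0.6
```

With components that fail 10% of the time, the voted circuit fails only 2.8% of the time; repeat the trick on the voted outputs and the error rate can be driven down further, which is the core of building "reliable organisms from unreliable components."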

…

That is, the outputs of a Fredkin gate always have the same number of 1s and 0s as do the inputs.6

10.4 THERMODYNAMICS OF LOGIC

The reason for our interest in logically reversible computation becomes clear once we ask the following question: for the logically irreversible gates, where does the destroyed information “go”? It appears as heat! An implicit recognition of this can be found as long ago as 1929, in an important thermodynamics paper by the Hungarian physicist Leo Szilard (1898–1964).7 The explicit tying together of information, energy, and computation in analysts’ minds is, however, almost certainly due to a remark made by the Institute for Advanced Study mathematician John von Neumann (1903–1957) in a December 1949 lecture at the University of Illinois.8 In that lecture he asserted the minimum energy Emin associated with manipulating a bit to be kT ln(2) joules (J), where T is the temperature on the Kelvin scale and k is Boltzmann’s constant (k = 1.38 · 10−23 J/K).9 (Power is energy per unit time and so, just to keep the scale of this in mind, 1 joule per second = 1 watt.) At “room temperature,” that is, at T = 300 K, this minimum energy is 2.87 · 10−21 J, a very tiny amount of energy.
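Von Neumann's room-temperature figure is easy to recheck (using the modern SI value of Boltzmann's constant):

```python
import math

# Minimum energy to manipulate one bit: E_min = k * T * ln(2)
k = 1.380649e-23          # Boltzmann's constant in J/K (exact SI value since 2019)
T = 300.0                 # "room temperature" in kelvin
E_min = k * T * math.log(2)

print(E_min)              # ≈ 2.87e-21 J, matching the figure in the text
```

At a billion bit operations per second this still amounts to only a few billionths of a watt, which is why real circuits, dissipating vastly more than the kT ln(2) floor, have so much room for improvement.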

**
In Pursuit of the Traveling Salesman: Mathematics at the Limits of Computation
** by
William J. Cook

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

complexity theory, computer age, four colour theorem, index card, John von Neumann, linear programming, NP-complete, p-value, RAND corporation, Richard Feynman, traveling salesman, Turing machine

Nonetheless, plots such as the one displayed in figure 2.21 strongly suggest a steady decrease in the average tour length divided by √n as n increases, pointing toward an ultimate value of approximately 0.712 for β.29

3: The Salesman in Action

Because my mathematics has its origin in a real problem doesn’t make it less interesting to me—just the other way around. —George Dantzig, 1986.1

The name itself announces the applied nature of the traveling salesman problem. This has surely contributed to a focus on computational issues, keeping the research topic well away from perils famously described in John von Neumann’s essay “The Mathematician”: “In other words, at a great distance from its empirical source, or after much ‘abstract’ inbreeding, a mathematical subject is in danger of degeneration.” Indeed, a strength of TSP research is the steady stream of practical applications that breathe new life into the area.

Road Trips

In our roundup of TSP applications, let’s begin with a sample of tours taken by humans, including the namesake of the problem.

…

Chairman,” he said, “if the speaker does not mind, I would like to reply for him.” Naturally I agreed. Von Neumann said: “The speaker titled his talk ‘linear programming’ and carefully stated his axioms. If you have an application that satisfies the axioms, well, use it. If it does not, then don’t.” Fortunately for the world, many of its complexities can in fact be described in sufficient detail by linear models. The episode with Dantzig, Hotelling, and John von Neumann is summed up nicely by a cartoon Dantzig’s Stanford colleagues reported as hanging outside his office.9 It featured the Peanuts character Linus in his traditional pose, sucking his thumb and holding a blanket. The caption read, “Happiness is assuming the world is linear.”

Applications

George Dantzig’s classic book Linear Programming and Extensions begins with the following statement.

…

It is remarkable that there always exists such a simple and elegant proof of optimality: the simplex algorithm constructs multipliers that can be used to combine the primal LP constraints into a convincing statement that no solution gives an objective value greater than that supplied by the final dictionary. Moreover, the multipliers are themselves an optimal solution to the dual LP problem and the optimal primal and dual objective values 107 108 Chapter 5 are equal. This beautiful result is known as the strong duality theorem, first stated and proved by John von Neumann.22 Strong duality gets top billing in LP theory, but in our TSP discussion we really only need the much easier statement that any dual LP solution provides a bound on the primal objective; this is called the weak duality theorem. And don’t worry if you missed a few details in our rush through material in the past few pages: in the special case of the TSP we provide an intuitive explanation of duality, showing how to trap the salesman with linear inequalities.
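Weak and strong duality can be seen concretely on a tiny primal/dual pair. The numbers below are my own illustration, not an instance from the book:

```python
# Toy primal:  maximize  x + 2y
#              subject to x + y  <= 4
#                         x + 3y <= 6,   x, y >= 0
# Its dual:    minimize  4u + 6v
#              subject to u + v  >= 1
#                         u + 3v >= 2,   u, v >= 0

def primal_obj(x, y): return x + 2 * y
def dual_obj(u, v):   return 4 * u + 6 * v

def primal_feasible(x, y):
    return x >= 0 and y >= 0 and x + y <= 4 and x + 3 * y <= 6

def dual_feasible(u, v):
    return u >= 0 and v >= 0 and u + v >= 1 and u + 3 * v >= 2

# Weak duality: ANY feasible dual point bounds ANY feasible primal point.
assert primal_feasible(1, 1) and dual_feasible(1, 1)
assert primal_obj(1, 1) <= dual_obj(1, 1)          # 3 <= 10

# Strong duality: at the optima the bound is tight.
x, y = 3, 1        # optimal primal vertex (both constraints hold with equality)
u, v = 0.5, 0.5    # optimal dual multipliers
assert primal_feasible(x, y) and dual_feasible(u, v)
assert primal_obj(x, y) == dual_obj(u, v) == 5
```

The dual multipliers (0.5, 0.5) combine the two primal constraints into the inequality x + 2y ≤ 5, the "convincing statement" that no primal solution can beat the value 5.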

**
Blindside: How to Anticipate Forcing Events and Wild Cards in Global Politics
** by
Francis Fukuyama

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Asian financial crisis, banking crisis, Berlin Wall, Bretton Woods, British Empire, capital controls, Carmen Reinhart, cognitive bias, cuban missile crisis, energy security, flex fuel, income per capita, informal economy, invisible hand, John von Neumann, Menlo Park, Mikhail Gorbachev, moral hazard, Norbert Wiener, oil rush, oil shale / tar sands, oil shock, packet switching, RAND corporation, Ray Kurzweil, reserve currency, Ronald Reagan, The Wisdom of Crowds, trade route, Vannevar Bush, Vernor Vinge, Yom Kippur War

Howlett, and Gian-Carlo Rota (New York: Academic Press, 1980). 6. J. Presper Eckert Jr., “The ENIAC,” A History of Computing in the Twentieth Century, edited by Metropolis, Howlett, and Rota, p. 525; and John W. Mauchly, “The ENIAC,” A History of Computing in the Twentieth Century, edited by Metropolis, Howlett, and Rota, p. 541. 7. William Aspray, John von Neumann and the Origins of Modern Computing (MIT Press, 1990); William Aspray, “John von Neumann’s Contributions to Computing and Computer Science,” Annals of the History of Computing 11, no. 3: 189–95 (1989). 8. Paul E. Ceruzzi, A History of Modern Computing (MIT Press, 1998), chap. 7; Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996), chap. 10. 9. Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor (San Francisco: Harper, 1999). 10.

…

The war, of course, created any number of desperate demands for computation, which in turn led to two of the most famous of the pioneering computers: the digital, all-electronic Colossus, which was actually a series of machines created at the British code-breaking center, Bletchley Park, as a tool for cracking the most difficult German ciphers;5 and the digital, all-electronic ENIAC, which was constructed by engineers at the University of Pennsylvania to calculate artillery trajectories.6 Starting in mid-1944, moreover, the ENIAC team was joined by the world-renowned, Hungarian-born mathematician John von Neumann, who was also a participant in the super-secret Manhattan Project—and who was looking for computing machines that could help out with the horrendous calculations needed in that effort. Although ENIAC was too late to help in designing the atomic bomb—the machine did not become operational until 1946—von Neumann was inspired nonetheless. After the war, he went on to pioneer what would now be called scientific supercomputing, designing machines and algorithms for weather forecasting and many other types of simulations.

…

Or to put it another way, the act of computation had become an abstraction embodied in what is now known as software. The history of information technology offers many other examples of invention-by-convergence. Among them:

—The modern concept of information and information processing was a synthesis of insights developed in the 1930s and 1940s by Alan Turing, Claude Shannon, Norbert Wiener, Warren McCulloch, Walter Pitts, and John von Neumann.12

—The hobbyists who sparked the personal computer revolution in the late 1970s were operating (consciously or not) in the context of ideas that had been around for a decade or more. There was the notion of interactive computing, for example, in which a computer would respond to the user’s input immediately (as opposed to generating a stack of fanfold printout hours later); this idea dated back to the Whirlwind project, an experiment in real-time computing that began at MIT in the 1940s.13 There were the twin notions of individually controlled computing (having a computer apparently under the control of a single user) and home computing (having a computer in your own house); both emerged in the 1960s from MIT’s Project MAC, an early experiment in time-sharing.14 And then there was the notion of a computer as an open system, meaning that a user could modify it, add to it, and upgrade it however he or she wanted; that practice was already standard in the minicomputer market, which was pioneered by the Digital Equipment Corporation in the 1960s.15

—The Internet as we know it today represents the convergence of (among other ideas) the notion of packet-switched networking from the 1960s;16 the notion of internetworking (as embodied in the TCP/IP protocol), which was developed in the 1970s to allow packets to pass between different networks;17 and the notion of hypertext—which, of course, goes back to Vannevar Bush’s article on the memex in 1945.
Part IV: What Could Be

12. Cassandra versus Pollyanna: A Debate between James Kurth and Gregg Easterbrook

James Kurth: I am an optimist about the current pessimism, but a pessimist overall.

**
Superintelligence: Paths, Dangers, Strategies
** by
Nick Bostrom

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

agricultural Revolution, AI winter, Albert Einstein, algorithmic trading, anthropic principle, anti-communist, artificial general intelligence, autonomous vehicles, barriers to entry, bioinformatics, brain emulation, cloud computing, combinatorial explosion, computer vision, cosmological constant, dark matter, DARPA: Urban Challenge, data acquisition, delayed gratification, demographic transition, Douglas Hofstadter, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, fear of failure, Flash crash, Flynn Effect, friendly AI, Gödel, Escher, Bach, income inequality, industrial robot, informal economy, information retrieval, interchangeable parts, iterative process, job automation, John von Neumann, knowledge worker, Menlo Park, meta analysis, meta-analysis, mutually assured destruction, Nash equilibrium, Netflix Prize, new economy, Norbert Wiener, NP-complete, nuclear winter, optical character recognition, pattern recognition, performance metric, phenotype, prediction markets, price stability, principal–agent problem, race to the bottom, random walk, Ray Kurzweil, recommendation engine, reversible computing, social graph, speech recognition, Stanislav Petrov, statistical model, stem cell, Stephen Hawking, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, The Coming Technological Singularity, The Nature of the Firm, Thomas Kuhn: the structure of scientific revolutions, transaction costs, Turing machine, Vernor Vinge, Watson beat the top human players on Jeopardy!, World Values Survey

However, even comparatively moderate enhancements of biological cognition could have important consequences. In particular, cognitive enhancement could accelerate science and technology, including progress toward more potent forms of biological intelligence amplification and machine intelligence. Consider how the rate of progress in the field of artificial intelligence would change in a world where Average Joe is an intellectual peer of Alan Turing or John von Neumann, and where millions of people tower far above any intellectual giant of the past.63 A discussion of the strategic implications of cognitive enhancement will have to await a later chapter. But we can summarize this section by noting three conclusions: (1) at least weak forms of superintelligence are achievable by means of biotechnological enhancements; (2) the feasibility of cognitively enhanced humans adds to the plausibility that advanced forms of machine intelligence are feasible—because even if we were fundamentally unable to create machine intelligence (which there is no reason to suppose), machine intelligence might still be within reach of cognitively enhanced humans; and (3) when we consider scenarios stretching significantly into the second half of this century and beyond, we must take into account the probable emergence of a generation of genetically enhanced populations—voters, inventors, scientists—with the magnitude of enhancement escalating rapidly over subsequent decades.

…

A more benign course of action, which might also have had a chance of working, would have been to use its nuclear arsenal as a bargaining chip to negotiate a strong international government—a veto-less United Nations with a nuclear monopoly and a mandate to take all necessary actions to prevent any country from developing its own nuclear weapons. Both of these approaches were proposed at the time. The hardline approach of launching or threatening a first strike was advocated by some prominent intellectuals such as Bertrand Russell (who had long been active in anti-war movements and who would later spend decades campaigning against nuclear weapons) and John von Neumann (co-creator of game theory and one of the architects of US nuclear strategy).34 Perhaps it is a sign of civilizational progress that the very idea of threatening a nuclear first strike today seems borderline silly or morally obscene. A version of the benign approach was tried in 1946 by the United States in the form of the Baruch plan. The proposal involved the USA giving up its temporary nuclear monopoly.

…

By 5000 BC, following the Agricultural Revolution, the world population was growing at a rate of about 1 million per 200 years—a great acceleration since the rate of perhaps 1 million per million years in early humanoid prehistory—so a great deal of acceleration had already occurred by then. Still, it is impressive that an amount of economic growth that took 200 years seven thousand years ago takes just ninety minutes now, and that the world population growth that took two centuries then takes one and a half weeks now. See also Maddison (2005). 2. Such dramatic growth and acceleration might suggest one notion of a possible coming “singularity,” as adumbrated by John von Neumann in a conversation with the mathematician Stanislaw Ulam: Our conversation centred on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue. (Ulam 1958) 3. Hanson (2000). 4. Vinge (1993); Kurzweil (2005). 5.

**
How Markets Fail: The Logic of Economic Calamities
** by
John Cassidy

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andrei Shleifer, anti-communist, asset allocation, asset-backed security, availability heuristic, bank run, banking crisis, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, Black-Scholes formula, Bretton Woods, British Empire, capital asset pricing model, centralized clearinghouse, collateralized debt obligation, Columbine, conceptual framework, Corn Laws, correlation coefficient, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, crony capitalism, Daniel Kahneman / Amos Tversky, debt deflation, diversification, Elliott wave, Eugene Fama: efficient market hypothesis, financial deregulation, financial innovation, Financial Instability Hypothesis, financial intermediation, full employment, George Akerlof, global supply chain, Haight Ashbury, hiring and firing, Hyman Minsky, income per capita, incomplete markets, index fund, invisible hand, John Nash: game theory, John von Neumann, Joseph Schumpeter, laissez-faire capitalism, liquidity trap, London Interbank Offered Rate, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, market bubble, market clearing, mental accounting, Mikhail Gorbachev, Mont Pelerin Society, moral hazard, mortgage debt, Naomi Klein, Network effects, Nick Leeson, Northern Rock, paradox of thrift, Ponzi scheme, price discrimination, price stability, principal–agent problem, profit maximization, quantitative trading / quantitative finance, race to the bottom, Ralph Nader, RAND corporation, random walk, Renaissance Technologies, rent control, Richard Thaler, risk tolerance, risk-adjusted returns, road to serfdom, Robert Shiller, Ronald Coase, Ronald Reagan, shareholder value, short selling, Silicon Valley, South Sea Bubble, sovereign wealth fund, statistical model, technology bubble, The Chicago School, The Great Moderation, The Market for Lemons, The Wealth of Nations by Adam Smith, too big to fail, transaction costs, unorthodox policies, value at risk, Vanguard fund

To do this, though, he was forced to introduce some simplifications, such as assuming that the price of each commodity depended only on the quantities it was produced in, and not on the quantities of competing goods. Since the main point of Walrasian economics was to explore the connections between different markets, this wasn’t entirely satisfactory, but it was the best Wald could do. There things rested until 1937, when John von Neumann, a Princeton mathematician who had taught in Berlin before moving to the United States, visited Vienna and presented a paper to Menger’s Mathematical Colloquium that one leading historian of economic ideas has called “the single most important article in mathematical economics.” That judgment is debatable, but von Neumann, who was born in Budapest in 1903, was undoubtedly some sort of genius.

…

In many cases, it is the basis of what I call rational irrationality, by which I mean a situation in which the application of rational self-interest in the marketplace leads to an inferior and socially irrational outcome. When the prisoner’s dilemma was first introduced, back in the early 1950s, many people refused to accept that the two firms wouldn’t be able to reach the cooperative solution. Economists and mathematicians were excitedly exploring the new science of game theory that John von Neumann and Oskar Morgenstern had invented in their 1944 treatise Theory of Games and Economic Behavior. Many smart people held out great hope for game theory, imagining it could solve many of the outstanding problems in the social sciences. The key to this process was thought to lie in extending the solution methods that von Neumann and Morgenstern had introduced, most of which applied to zero-sum games, such as coin-tossing and poker.
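The dilemma itself fits in a few lines. The payoff numbers below are the standard textbook illustration, not figures from the text; each firm's best response is to defect no matter what the other does, so rational self-interest locks in the inferior outcome:

```python
# payoff[(my_move, their_move)] = my profit (illustrative numbers).
# "C" = cooperate (e.g., keep prices high), "D" = defect (undercut).
payoff = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def best_response(their_move):
    """The move that maximizes my payoff against a fixed move of theirs."""
    return max(("C", "D"), key=lambda mine: payoff[(mine, their_move)])

# Defection is a dominant strategy: the best reply to cooperation AND to defection...
assert best_response("C") == "D" and best_response("D") == "D"
# ...so (D, D) is the only equilibrium, even though both firms would
# prefer the cooperative outcome: 3 > 1 for each.
assert payoff[("C", "C")] > payoff[("D", "D")]
```

This is exactly the "rational irrationality" the passage names: each player's individually rational calculation produces an outcome both would happily trade away.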

…

Rather than putting money to work on the basis of what they perceive as the fundamentals, many professional investors concentrate on “foreseeing changes in the conventional basis of valuation a short time ahead of the general public” in order to make a quick profit, Keynes wrote. “They are concerned, not with what an investment is really worth to a man who buys it ‘for keeps,’ but with what the market will value it at, under the influence of mass psychology, three months or a year hence.” (If he had been writing in today’s world of day traders and momentum funds, Keynes might well have written “three hours or a day hence.”) Like John von Neumann, the Hungarian genius who invented game theory, Keynes believed that simple parlor games have much to teach economists: they feature the sort of strategic interactions that are largely absent from orthodox economics, but that play such an important role in reality. On Wall Street, Keynes pointed out, investing is a “battle of wits,” the primary aim being “to outwit the crowd, and to pass the bad, or depreciating, half-crown to the other fellow.”

**
The Singularity Is Near: When Humans Transcend Biology
** by
Ray Kurzweil

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

additive manufacturing, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, autonomous vehicles, Benoit Mandelbrot, Bill Joy: nanobots, bioinformatics, brain emulation, Brewster Kahle, Brownian motion, business intelligence, c2.com, call centre, carbon-based life, cellular automata, Claude Shannon: information theory, complexity theory, conceptual framework, Conway's Game of Life, cosmological constant, cosmological principle, cuban missile crisis, data acquisition, Dava Sobel, David Brooks, Dean Kamen, disintermediation, double helix, Douglas Hofstadter, en.wikipedia.org, epigenetics, factory automation, friendly AI, George Gilder, Gödel, Escher, Bach, informal economy, information retrieval, invention of the telephone, invention of the telescope, invention of writing, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, linked data, Loebner Prize, Louis Pasteur, mandelbrot fractal, Mikhail Gorbachev, mouse model, Murray Gell-Mann, mutually assured destruction, natural language processing, Network effects, new economy, Norbert Wiener, oil shale / tar sands, optical character recognition, pattern recognition, phenotype, premature optimization, randomized controlled trial, Ray Kurzweil, remote working, reversible computing, Richard Feynman, Richard Feynman, Rodney Brooks, Search for Extraterrestrial Intelligence, semantic web, Silicon Valley, Singularitarianism, speech recognition, statistical model, stem cell, Stephen Hawking, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Kaczynski, telepresence, The Coming Technological Singularity, transaction costs, Turing machine, Turing test, Vernor Vinge, Y2K, Yogi Berra

Although the number of transistors per unit cost has doubled every two years, transistors have been getting progressively faster, and there have been many other levels of innovation and improvement. The overall power of computation per unit cost has recently been doubling every year. In particular, the amount of computation (in computations per second) that can be brought to bear on a computer chess machine doubled every year during the 1990s. 3. John von Neumann, paraphrased by Stanislaw Ulam, "Tribute to John von Neumann," Bulletin of the American Mathematical Society 64.3, pt. 2 (May 1958): 1–49. Von Neumann (1903–1957) was born in Budapest into a Jewish banking family and came to Princeton University to teach mathematics in 1930. In 1933 he became one of the six original professors in the new Institute for Advanced Study in Princeton, where he stayed until the end of his life.

…

Although the Singularity has many faces, its most important implication is this: our technology will match and then vastly exceed the refinement and suppleness of what we regard as the best of human traits.

The Intuitive Linear View Versus the Historical Exponential View

When the first transhuman intelligence is created and launches itself into recursive self-improvement, a fundamental discontinuity is likely to occur, the likes of which I can't even begin to predict. —MICHAEL ANISSIMOV

In the 1950s John von Neumann, the legendary information theorist, was quoted as saying that "the ever-accelerating progress of technology ... gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."3 Von Neumann makes two important observations here: acceleration and singularity. The first idea is that human progress is exponential (that is, it expands by repeatedly multiplying by a constant) rather than linear (that is, expanding by repeatedly adding a constant).
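The distinction between the two growth regimes fits in two one-line functions (a sketch with arbitrary starting values):

```python
# Linear growth adds a constant each step; exponential growth
# multiplies by a constant each step.
def linear(start, step, n):
    return start + step * n

def exponential(start, factor, n):
    return start * factor ** n

# After 30 steps, adding 1 each step gives 31, while doubling each
# step gives more than a billion.
linear_30 = linear(1, 1, 30)            # 31
exponential_30 = exponential(1, 2, 30)  # 1073741824
```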

…

One theory speculates that the universe itself began with such a Singularity.18 Interestingly, however, the event horizon (surface) of a black hole is of finite size, and gravitational force is only theoretically infinite at the zero-size center of the black hole. At any location that could actually be measured, the forces are finite, although extremely large. The first reference to the Singularity as an event capable of rupturing the fabric of human history is John von Neumann's statement quoted above. In the 1960s, I. J. Good wrote of an "intelligence explosion" resulting from intelligent machines' designing their next generation without human intervention. Vernor Vinge, a mathematician and computer scientist at San Diego State University, wrote about a rapidly approaching "technological singularity" in an article for Omni magazine in 1983 and in a science-fiction novel, Marooned in Realtime, in 1986.19 My 1989 book, The Age of Intelligent Machines, presented a future headed inevitably toward machines greatly exceeding human intelligence in the first half of the twenty-first century.20 Hans Moravec's 1988 book Mind Children came to a similar conclusion by analyzing the progression of robotics.21 In 1993 Vinge presented a paper to a NASA-organized symposium that described the Singularity as an impending event resulting primarily from the advent of "entities with greater than human intelligence," which Vinge saw as the harbinger of a runaway phenomenon.22 My 1999 book, The Age of Spiritual Machines: When Computers Exceed Human Intelligence, described the increasingly intimate connection between our biological intelligence and the artificial intelligence we are creating.23 Hans Moravec's book Robot: Mere Machine to Transcendent Mind, also published in 1999, described the robots of the 2040s as our "evolutionary heirs," machines that will "grow from us, learn our skills, and share our goals and values, ... 
children of our minds."24 Australian scholar Damien Broderick's 1997 and 2001 books, both titled The Spike, analyzed the pervasive impact of the extreme phase of technology acceleration anticipated within several decades.25 In an extensive series of writings, John Smart has described the Singularity as the inevitable result of what he calls "MEST" (matter, energy, space, and time) compression.26 From my perspective, the Singularity has many faces.

**
Chaos
** by
James Gleick

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Benoit Mandelbrot, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Murray Gell-Mann, Norbert Wiener, pattern recognition, Richard Feynman, Richard Feynman, Stephen Hawking, stochastic process, trade route

Implicitly, the mission of many twentieth-century scientists—biologists, neurologists, economists—has been to break their universes down into the simplest atoms that will obey scientific rules. In all these sciences, a kind of Newtonian determinism has been brought to bear. The fathers of modern computing always had Laplace in mind, and the history of computing and the history of forecasting were intermingled ever since John von Neumann designed his first machines at the Institute for Advanced Study in Princeton, New Jersey, in the 1950s. Von Neumann recognized that weather modeling could be an ideal task for a computer. There was always one small compromise, so small that working scientists usually forgot it was there, lurking in a corner of their philosophies like an unpaid bill. Measurements could never be perfect. Scientists marching under Newton’s banner actually waved another flag that said something like this: Given an approximate knowledge of a system’s initial conditions and an understanding of natural law, one can calculate the approximate behavior of the system.
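That "small compromise," the assumption that approximate initial conditions yield approximate behavior, is exactly what chaotic systems violate. A minimal sketch using the logistic map, a standard stand-in for chaotic dynamics (not the weather equations themselves):

```python
# Iterate the chaotic logistic map x -> r*x*(1-x) with r = 4 from two
# starting points that differ by one part in a billion.
def logistic_orbit(x0, steps, r=4.0):
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_orbit(0.4, 50)
b = logistic_orbit(0.4 + 1e-9, 50)
gap = abs(a - b)
# The initial error of 1e-9 roughly doubles each iteration, so after
# 50 steps the two trajectories bear no resemblance to each other.
```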

…

As one of his friends said, he was oscillating.

Inner Rhythms

The sciences do not try to explain, they hardly even try to interpret, they mainly make models. By a model is meant a mathematical construct which, with the addition of certain verbal interpretations, describes observed phenomena. The justification of such a mathematical construct is solely and precisely that it is expected to work. —JOHN VON NEUMANN

BERNARDO HUBERMAN LOOKED OUT over his audience of assorted theoretical and experimental biologists, pure mathematicians and physicians and psychiatrists, and he realized that he had a communication problem. He had just finished an unusual talk at an unusual gathering in 1986, the first major conference on chaos in biology and medicine, under the various auspices of the New York Academy of Sciences, the National Institute of Mental Health, and the Office of Naval Research.

…

FOR WANT OF A NAIL George Herbert; cited in this context by Norbert Wiener, “Nonlinear Prediction and Dynamics,” in Collected Works with Commentaries, ed. P. Masani (Cambridge, Mass.: The M.I.T. Press, 1981), 3:371. Wiener anticipated Lorenz in seeing at least the possibility of “self-amplification of small details of the weather map.” He noted, “A tornado is a highly local phenomenon, and apparent trifles of no great extent may determine its exact track.”
“THE CHARACTER OF THE EQUATION” John von Neumann, “Recent Theories of Turbulence” (1949), in Collected Works, ed. A. H. Taub (Oxford: Pergamon Press, 1963), 6:437.
CUP OF HOT COFFEE “The predictability of hydrodynamic flow,” in Transactions of the New York Academy of Sciences II:25:4 (1963), pp. 409–32.
“WE MIGHT HAVE TROUBLE” Ibid., p. 410.
LORENZ TOOK A SET This set of seven equations to model convection was devised by Barry Saltzman of Yale University, whom Lorenz was visiting.

**
The Logic of Life: The Rational Economics of an Irrational World
** by
Tim Harford

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

affirmative action, Albert Einstein, Andrei Shleifer, barriers to entry, Berlin Wall, colonial rule, Daniel Kahneman / Amos Tversky, double entry bookkeeping, Edward Glaeser, en.wikipedia.org, endowment effect, European colonialism, experimental economics, experimental subject, George Akerlof, income per capita, invention of the telephone, Jane Jacobs, John von Neumann, law of one price, Martin Wolf, mutually assured destruction, New Economic Geography, new economy, Plutocrats, plutocrats, Richard Florida, Richard Thaler, Ronald Reagan, Silicon Valley, spinning jenny, Steve Jobs, The Death and Life of Great American Cities, the market place, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Malthus, women in the workforce

Both were cold war intellectuals, advising the U.S. government at the highest levels and using game theory to try to understand the riskiest of all games, nuclear war. Game theory emerged from the sparkling mind of John von Neumann, a celebrated mathematical prodigy, when he decided to create a theory of poker. Von Neumann’s academic brilliance offered penetrating insights but the cold force of his logic could have led us to Armageddon. It was tempered by the earthier wisdom—usually expressed in witty prose rather than equations—of Thomas Schelling. Tormented by a tobacco addiction he could not kick, Schelling nudged game theory into a direction that now offers us surprising insights into the minds of hapless slot machine addicts. LATE IN THE 1920s, the most ostentatiously brilliant man in the world decided to work out the correct way to play poker. John von Neumann, a mathematician who helped to develop both the computer and the atomic bomb, had been struck by an engaging new conceit.

…

He crashed and burned on national television while I berated him for getting the game theory tangled up. It was not a high spot for Andy, nor for my project of using economics as a tool for self-improvement. You might think that was the first and last time any economist has dared to show his face at a speed date, but not at all. We can’t get enough of them. Economists at Columbia University even went to the trouble of organizing one. Ever since John von Neumann’s game theory promised to help us understand love and marriage, economists have been interested in how people choose their partners and how relationships work. And if you want to understand the way people choose their partners, a speed date is a great place to start. At a speed date you can get information about how each person responded to dozens of potential partners, something that would be impossible to collect in more traditional dating situations without binoculars, snooping devices, and a good private investigator.

…

Even more bizarre is that the way they ended up behaving was rational. Although the initial disparity was purely a matter of chance, and although there was no fundamental difference between the greens and the purples, the students playing the role of employers were absolutely correct in their view that green workers were more likely to be educated. (If we were to split hairs about their rationality, they jumped to the conclusion too quickly: John von Neumann or Chris Ferguson would have realized that what looked like an early sign of a pattern might easily have been random.) The employers’ view became self-fulfilling as purple workers rationally abandoned hope of getting hired and stopped paying for education. And once the downward spiral set in, a determinedly color-blind employer would actually have lost money compared with one who took note of the color of the applicants.
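The parenthetical point, that an early "pattern" may be nothing but chance, is a one-line binomial calculation; the sample numbers below are hypothetical:

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more successes."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Suppose 7 of the first 10 educated workers happened to be green.
# Under pure chance (p = 0.5) that is unremarkable: it occurs about
# 17% of the time, far too often to justify jumping to a conclusion.
p_value = binom_tail(10, 7)   # 0.171875
```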

**
Our Final Invention: Artificial Intelligence and the End of the Human Era
** by
James Barrat

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, AI winter, Amazon Web Services, artificial general intelligence, Automated Insights, Bernie Madoff, Bill Joy: nanobots, brain emulation, cellular automata, cloud computing, cognitive bias, computer vision, cuban missile crisis, Daniel Kahneman / Amos Tversky, Danny Hillis, data acquisition, don't be evil, Extropian, finite state, Flash crash, friendly AI, friendly fire, Google Glasses, Google X / Alphabet X, Isaac Newton, Jaron Lanier, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, Loebner Prize, lone genius, mutually assured destruction, natural language processing, Nicholas Carr, optical character recognition, PageRank, pattern recognition, Peter Thiel, prisoner's dilemma, Ray Kurzweil, Rodney Brooks, Search for Extraterrestrial Intelligence, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, smart grid, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, superintelligent machines, technological singularity, The Coming Technological Singularity, traveling salesman, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, zero day

Rational has a specific meaning in microeconomics. It means that an individual or “agent” will have goals and also preferences (called a utility function in economics). He will have beliefs about the world and the best way to achieve his goals and preferences. As conditions change, he will update his beliefs. He is a rational economic agent when he pursues his goals with actions based on up-to-date beliefs about the world. Mathematician John von Neumann (1903–1957) codeveloped the idea connecting rationality and utility functions. As we’ll see, von Neumann laid the groundwork for many ideas in computer science, AI, and economics. Yet social scientists argue that a “rational economic agent” is a load of hogwash. Humans are not rational—we don’t specify our goals or our beliefs, and we don’t always update our beliefs as conditions change. Our goals and preferences shift with the wind, gas prices, when we last ate, and our attention spans.
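The definition in this paragraph translates almost directly into code; the umbrella-versus-rain setup below is invented for illustration, not taken from the text:

```python
# A rational economic agent: goals and preferences as a utility
# function, beliefs as probabilities over states of the world,
# actions chosen to maximize expected utility, and beliefs updated
# as conditions change.
def expected_utility(action, beliefs, utility):
    """Average utility of an action over the possible world states."""
    return sum(p * utility(action, state) for state, p in beliefs.items())

def choose(actions, beliefs, utility):
    """Pick the action with the highest expected utility."""
    return max(actions, key=lambda a: expected_utility(a, beliefs, utility))

# Illustrative setup: carry an umbrella or not, given beliefs about rain.
payoff = {("umbrella", "rain"): 1, ("umbrella", "dry"): -1,
          ("none", "rain"): -5, ("none", "dry"): 2}
utility = lambda action, state: payoff[(action, state)]

beliefs = {"rain": 0.1, "dry": 0.9}
choice_before = choose(["umbrella", "none"], beliefs, utility)  # "none"
beliefs = {"rain": 0.8, "dry": 0.2}    # conditions change; beliefs update
choice_after = choose(["umbrella", "none"], beliefs, utility)   # "umbrella"
```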

…

Vinge told me he’d never read Good’s self-penned biographical paragraphs, or learned about his late-in-life change of heart about the intelligence explosion. Probably only Good, and Leslie Pendleton, knew about it. Vernor Vinge was the first person to formally use the word “singularity” when describing the technological future—he did it in a 1993 address to NASA, entitled “The Coming Technological Singularity.” Mathematician Stanislaw Ulam reported that he and polymath John von Neumann had used “singularity” in a conversation about technological change thirty-five years earlier, in 1958. But Vinge’s coinage was public, deliberate, and set the singularity ball rolling into the hands of Ray Kurzweil and what is today a Singularity movement. With that street cred, why doesn’t Vinge work the lecture and conference circuits as the ultimate Singularity pundit? Well, singularity has several meanings, and Vinge’s usage is more precise than others.

…

And that is an extraordinarily powerful force. So, there’s hundreds of thousands of people in the world, very smart people, who are working on things that lead to superhuman intelligence. And probably most of them don’t even look at it that way. They look at it as faster, cheaper, better, more profitable.” Vinge compares it to the Cold War strategy called MAD—mutually assured destruction. Coined by acronym-loving John von Neumann (also the creator of an early computer with the winning initials, MANIAC), MAD maintained Cold War peace through the promise of mutual obliteration. Like MAD, superintelligence boasts a lot of researchers secretly working to develop technologies with catastrophic potential. But it’s like mutually assured destruction without any commonsense brakes. No one will know who is ahead, so everyone will assume someone else is.

**
Against the Gods: The Remarkable Story of Risk
** by
Peter L. Bernstein

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, Big bang: deregulation of the City of London, Bretton Woods, buttonwood tree, capital asset pricing model, cognitive dissonance, Daniel Kahneman / Amos Tversky, diversified portfolio, double entry bookkeeping, Edmond Halley, Edward Lloyd's coffeehouse, endowment effect, experimental economics, fear of failure, Fellow of the Royal Society, Fermat's Last Theorem, financial deregulation, financial innovation, full employment, index fund, invention of movable type, Isaac Newton, John Nash: game theory, John von Neumann, linear programming, loss aversion, Louis Bachelier, mental accounting, moral hazard, Nash equilibrium, probability theory / Blaise Pascal / Pierre de Fermat, random walk, Richard Thaler, Robert Shiller, Robert Shiller, spectrum auction, statistical model, The Bell Curve by Richard Herrnstein and Charles Murray, The Wealth of Nations by Adam Smith, trade route, transaction costs, tulip mania, Vanguard fund

So we usually settle for compromise alternatives, which may require us to make the best of a bad bargain; game theory uses terms like "maximin" and "minimax" to describe such decisions. Think of seller-buyer, landlord-tenant, husband-wife, lender-borrower, GM-Ford, parent-child, President-Congress, driver-pedestrian, boss-employee, pitcher-batter, soloist-accompanist. Game theory was invented by John von Neumann (1903-1957), a physicist of immense intellectual accomplishment. Von Neumann was instrumental in the discovery of quantum mechanics in Berlin during the 1920s, and he played a major role in the creation of the first American atomic bomb and, later, the hydrogen bomb. He also invented the digital computer, was an accomplished meteorologist and mathematician, could multiply eight digits by eight digits in his head, and loved telling ribald jokes and reciting off-color limericks.
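"Maximin" has a compact definition: for each of your options, assume the other side does its worst to you, then take the option whose worst case is best. A sketch with an invented payoff matrix:

```python
# Maximin for the row player in a zero-sum game: pick the row whose
# worst-case payoff (over the opponent's columns) is largest.
def maximin(matrix):
    """Return (best row index, guaranteed payoff) for the row player."""
    worst_cases = [min(row) for row in matrix]
    best_row = max(range(len(matrix)), key=lambda i: worst_cases[i])
    return best_row, worst_cases[best_row]

# Rows are your strategies, columns the opponent's; entries are your payoff.
payoff = [[ 3, -1],
          [ 2,  1],
          [-4,  5]]
row, value = maximin(payoff)   # row 1 guarantees a payoff of at least 1
```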

…

To do that, he suggested, "Calculate the weight of a grand piano with six men huddling over it to sing. Then triple that weight." That will guarantee certainty. Von Neumann was born in Budapest to a well-to-do, cultured, jolly family. Budapest at the time was the sixth-largest city in Europe, prosperous and growing, with the world's first underground subway. Its literacy rate was over 90%. More than 25% of the population was Jewish, including the von Neumanns, although John von Neumann paid little attention to his Jewishness except as a source of jokes. He was by no means the only famous product of pre-World War I Budapest. Among his contemporaries were famous physicists like himself, Leo Szilard and Edward Teller, as well as celebrities from the world of entertainment: George Solti, Paul Lukas, Leslie Howard (born Laszlo Steiner), Adolph Zukor, Alexander Korda, and, perhaps most famous of all, Zsa Zsa Gabor.

…

His approach reflects the spirit of the early years after the Second World War, when many social scientists set about reviving the Victorian faith in measurement and the belief that the world's problems could be solved. Strangely, Markowitz had no interest in equity investment when he first turned his attention to the ideas dealt with in "Portfolio Selection." He knew nothing about the stock market. A self-styled "nerd" as a student, he was working in what was then the relatively young field of linear programming. Linear programming, which happened to be an innovation to which John von Neumann had made significant contributions, is a means of developing mathematical models for minimizing costs while holding outputs constant, or for maximizing outputs while holding costs constant. The technique is essential for dealing with problems like those faced by an airline that aims to keep a limited number of aircraft as busy as possible while flying to as many destinations as possible. One day, while waiting to see his professor to discuss a topic for his doctoral dissertation, Markowitz struck up a conversation with a stock broker sharing the waiting room who urged him to apply linear programming to the problems investors face in the stock market.
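The airline example can be sketched as a tiny optimization. With only two decision variables we can simply enumerate the feasible points rather than run a real linear-programming solver, and every number below is invented for illustration:

```python
# Toy airline problem: choose daily counts of short-haul and long-haul
# flights to maximize destinations served within a fixed budget of
# aircraft-hours (a brute-force stand-in for linear programming).
AIRCRAFT_HOURS = 24
HOURS_PER_SHORT, HOURS_PER_LONG = 2, 6   # hours consumed per flight
DESTS_PER_SHORT, DESTS_PER_LONG = 1, 2   # destinations served per flight

best = None
for short in range(AIRCRAFT_HOURS // HOURS_PER_SHORT + 1):
    for long_haul in range(AIRCRAFT_HOURS // HOURS_PER_LONG + 1):
        hours = short * HOURS_PER_SHORT + long_haul * HOURS_PER_LONG
        if hours > AIRCRAFT_HOURS:
            continue   # infeasible: exceeds the aircraft-hour budget
        dests = short * DESTS_PER_SHORT + long_haul * DESTS_PER_LONG
        if best is None or dests > best[0]:
            best = (dests, short, long_haul)

dests, short, long_haul = best
```

With these particular numbers short-haul flights serve more destinations per aircraft-hour, so the search settles on all short-haul flights; change the coefficients and the optimum shifts, which is the whole point of the technique.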

**
Code: The Hidden Language of Computer Hardware and Software
** by
Charles Petzold

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Bill Gates: Altair 8800, Claude Shannon: information theory, computer age, Douglas Engelbart, Dynabook, Eratosthenes, Grace Hopper, invention of the telegraph, Isaac Newton, Jacquard loom, Jacquard loom, James Watt: steam engine, John von Neumann, Joseph-Marie Jacquard, Louis Daguerre, millennium bug, Norbert Wiener, optical character recognition, popular electronics, Richard Feynman, Richard Feynman, Richard Stallman, Silicon Valley, Steve Jobs, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

Eckert and Mauchly's attempt to patent the computer was, however, thwarted by a competing claim of John V. Atanasoff (1903–1995), who earlier designed an electronic computer that never worked quite right. The ENIAC attracted the interest of mathematician John von Neumann (1903–1957). Since 1930, the Hungarian-born von Neumann (whose last name is pronounced noy mahn) had been living in the United States. A flamboyant man who had a reputation for doing complex arithmetic in his head, von Neumann was a mathematics professor at the Princeton Institute for Advanced Study, and he did research in everything from quantum mechanics to the application of game theory to economics. John von Neumann helped design the successor to the ENIAC, the EDVAC (Electronic Discrete Variable Automatic Computer). Particularly in the 1946 paper "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument," coauthored with Arthur W.

…

Macintosh is a registered trademark of Apple Computer, Inc. Microsoft, MS-DOS, and Windows are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries. Other product and company names mentioned herein may be the trademarks of their respective owners. Images of Charles Babbage, George Boole, Louis Braille, Herman Hollerith, Samuel Morse, and John von Neumann appear courtesy of Corbis Images and were modified for this book by Joel Panchot. The January 1975 cover of Popular Electronics is reprinted by permission of Ziff-Davis and the Ziff family. All other illustrations in the book were produced by Joel Panchot. Unless otherwise noted, the example companies, organizations, products, people, and events depicted herein are fictitious. No association with any real company, organization, product, person, or event is intended or should be inferred

…

These pulses took about a millisecond to reach the other end (where they were detected like sound waves and routed back to the beginning), and hence each tube of mercury could store about 1024 bits of information. It wasn't until the mid-1950s that magnetic core memory was developed. Such memory consisted of large arrays of little magnetized metal rings strung with wires. Each little ring could store a bit of information. Long after core memory had been replaced by other technologies, it was common to hear older programmers refer to the memory that the processor accessed as core. John von Neumann wasn't the only person doing some major conceptual thinking about the nature of computers in the 1940s. Claude Shannon (born 1916) was another influential thinker. In Chapter 11, I discussed his 1938 master's thesis, which established the relationship between switches, relays, and Boolean algebra. In 1948, while working for Bell Telephone Laboratories, he published a paper in the Bell System Technical Journal entitled "A Mathematical Theory of Communication" that not only introduced the word bit in print but established a field of study today known as information theory.
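The two figures quoted here, about a millisecond of delay and about 1024 bits per tube, are consistent with a pulse clock of roughly 1 MHz; the clock rate below is an assumption made for the arithmetic, not a figure from the text:

```python
# Bits stored in a delay line = pulses "in flight" at any instant,
# i.e., pulse rate times end-to-end delay.
pulse_rate_hz = 1_000_000   # assumed clock of roughly 1 MHz
delay_s = 0.001             # about a millisecond through the mercury
bits_stored = pulse_rate_hz * delay_s   # about 1000 bits, near the ~1024 quoted
```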

**
How We Got Here: A Slightly Irreverent History of Technology and Markets
** by
Andy Kessler

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andy Kessler, automated trading system, bank run, Big bang: deregulation of the City of London, Bretton Woods, British Empire, buttonwood tree, Claude Shannon: information theory, Corn Laws, Edward Lloyd's coffeehouse, fiat currency, floating exchange rates, Fractional reserve banking, full employment, Grace Hopper, invention of the steam engine, invention of the telephone, invisible hand, Isaac Newton, Jacquard loom, Jacquard loom, James Hargreaves, James Watt: steam engine, John von Neumann, joint-stock company, joint-stock limited liability company, Joseph-Marie Jacquard, Maui Hawaii, Menlo Park, Metcalfe's law, packet switching, price mechanism, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, railway mania, RAND corporation, Silicon Valley, Small Order Execution System, South Sea Bubble, spice trade, spinning jenny, Steve Jobs, supply-chain management, supply-chain management software, trade route, transatlantic slave trade, transatlantic slave trade, tulip mania, Turing machine, Turing test, William Shockley: the traitorous eight

The need for precision weapons would both directly and indirectly launch the digital revolution: Transistors in 1948, lasers and integrated circuits in 1958, packet switching in 1964 and microprocessors in 1970, and that was just the easy stuff. Using Edison effect tubes and relays and other forms of logic and memory, scientists and engineers invented electronic computers to help win World War II. John von Neumann at the Moore School at the University of Pennsylvania designed the ENIAC digital computer, the birth mother of the U.S. computer industry, to speed up calculations for artillery firing tables for Navy guns. At the same time, Alan Turing and the British at Bletchley Park designed the Colossus computer to decipher Enigma codes. A host of electronic devices at Los Alamos helped speed up difficult calculations to control the reaction of uranium-235 for the atomic bomb.

…

It also might take hours or even several days to change the algorithm or program that the ENIAC worked on. It had very little internal memory. Of course, the biggest problem with the ENIAC was that it was still a decimal machine working with 10 digits instead of the two of Boolean binary math. That increased its complexity, probably 100-fold. One of the folks working on ENIAC was John von Neumann, who had come over in June 1944 from Princeton’s Institute for Advanced Study, where Turing had studied. Von Neumann reengineered the ENIAC to store the algorithm/program inside it along with the data to be processed, and also added a “conditional control transfer.” For memory, von Neumann noticed that mercury delay lines, used in radar systems to store aircraft location information, stored a pulse or wave in a vial of slow-moving mercury.

…

The commercial acceptance of GPS is quite telling, and not only to the DoD, but also the DoT. The Department of Transportation now oversees its implementation. *** The need for precision weapons would both directly and indirectly launch the digital revolution: transistors in 1948, lasers and integrated circuits in 1958, packet switching in 1964 and microprocessors in 1970, and that was just the easy stuff. Computers were invented to help win World War II. John von Neumann and the Moore School at the University of Pennsylvania designed the ENIAC digital computer, the birth mother of the U.S. computer industry, to speed up calculations for artillery firing tables for Navy guns. At the same time, Alan Turing and the British at Bletchley Park designed the Colossus computer to decipher Enigma codes. A host of electronic devices at Los Alamos helped speed up difficult calculations controlling the reaction of uranium-235 for the atomic bomb.

**
The Transhumanist Reader
** by
Max More,
Natasha Vita-More

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Drosophila, en.wikipedia.org, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, Louis Pasteur, Menlo Park, meta analysis, meta-analysis, moral hazard, Network effects, Norbert Wiener, P = NP, pattern recognition, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Richard Feynman, Ronald Reagan, silicon-based life, Singularitarianism, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce

In 1970, Marvin Minsky made what turned out to be highly optimistic forecasts of the advent of super-intelligent artificial intelligence (AI), and in a 1994 Scientific American article he explained why vastly extended lives will require replacing our biological brains with superior computational devices. The idea of accelerating technological progress driven by machine super-intelligence dates back several decades. This idea, now frequently referred to as “the singularity,” was explicitly pondered in a conversation between Stanislaw Ulam and John von Neumann, recounted by Ulam in 1958, during which they discussed “the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue” (Ulam 1958). In 1965, I. J. Good argued that AI development would lead to an intelligence explosion. These ideas were taken up, elaborated, and extended by several other influential writers (Bostrom 1998; Broderick 2001; Kurzweil 1990, 1999; Moravec 1989; Vinge 1993).

…

Sandberg, Anders (2001) “Morphological Freedom – Why We Not Just Want It, but Need It.” http://www.nada.kth.se/~asa/Texts/MorphologicalFreedom.htm. Retrieved November 21, 2011.
Stambler, Ilia (2010) “Life Extension: A Conservative Enterprise? Some Fin-de-Siècle and Early Twentieth-Century Precursors of Transhumanism.” Journal of Evolution and Technology 21/1 (March), pp. 13–26.
Ulam, Stanislaw (1958) “John von Neumann 1903–1957.” Bulletin of the American Mathematical Society (May), part 2, pp. 1–49.
Various (2002) “The Transhumanist Declaration.” http://humanityplus.org/philosophy/transhumanist-declaration/.
Various (2003) “The Transhumanist FAQ: v 2.1.” World Transhumanist Association. http://humanityplus.org/philosophy/transhumanist-faq/.
Verdoux, Philippe (2009) “Transhumanism, Progress and the Future.”

…

The problem is that while we can imagine, for example, a robot arm that can screw, bolt, solder, and weld enough to assemble a robot arm from parts, it needs a sequence of instructions to obey in this process. And there is more than one instruction per part. But the instructions must be embodied in some physical form, so to finish the process we need instructions to build the instructions, and so on, in an infinite regress. The answer to this seeming conundrum was given mathematically by John von Neumann, and at roughly the same time (the 1950s) was teased out of the naturally occurring self-reproducing machines we find all around us, living cells. It turns out to be the same answer in both cases. First, design a machine that can build machines, like the robot arm above. (In a cell, there is such a thing, called a ribosome.) Next, we need another machine which is special-purpose, and does nothing but copy instructions.
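The regress-avoiding trick described here, treating the instructions both as a program to execute and as data to copy, is the same trick a software quine uses. A minimal Python sketch (an illustrative analogy, not von Neumann's actual construction):

```python
# A minimal quine: the string s is used twice, once as instructions
# that are executed (print) and once as data that is copied (format).
# This dual role of the "tape" is what breaks the infinite regress,
# just as DNA is both read out by the ribosome and copied verbatim.
s = 's = {!r}\nprint(s.format(s))'
print(s.format(s))
```

Running the program prints its own source text, so the "instructions to build the instructions" are simply the instructions themselves, copied rather than interpreted.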

**
Darwin's Dangerous Idea: Evolution and the Meanings of Life
** by
Daniel C. Dennett

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Alfred Russel Wallace, anthropic principle, buy low sell high, cellular automata, combinatorial explosion, complexity theory, computer age, conceptual framework, Conway's Game of Life, Danny Hillis, double helix, Douglas Hofstadter, Drosophila, finite state, Gödel, Escher, Bach, In Cold Blood by Truman Capote, invention of writing, Isaac Newton, Johann Wolfgang von Goethe, John von Neumann, Murray Gell-Mann, New Journalism, non-fiction novel, Peter Singer: altruism, phenotype, price mechanism, prisoner's dilemma, QWERTY keyboard, random walk, Richard Feynman, Rodney Brooks, Schrödinger's Cat, Stephen Hawking, Steven Pinker, strong AI, the scientific method, theory of mind, Thomas Malthus, Turing machine, Turing test

If it exhibits peculiarities that could only have arisen in the course of solving the subproblems in some apparently remote branch of the Tree that grows in Design Space, then barring a miracle or a coincidence too Cosmic to credit, there must be a copying event of some kind that moved that completed design work to its new location. There is no single summit in Design Space, nor a single staircase or ladder with calibrated steps, so we cannot expect to find a scale for comparing amounts of design work across distant developing branches. Thanks to the vagaries and digressions of different "methods adopted," something that is in some sense just one problem can have both hard and easy solutions, requiring more or less work. There is a famous story about the mathematician and physicist (and coinventor of the computer) John von Neumann, who was legendary for his lightning capacity to do prodigious calculations in his head. (Like most famous stories, this one has many versions, of which I choose the one that best makes the point I am pursuing.) One day a colleague approached him with a puzzle that had two paths to solution, a laborious, complicated calculation and an elegant, Aha!-type solution. This colleague had a theory: in such a case, mathematicians work out the laborious solution while the (lazier, but smarter) physicists pause and find the quick and easy solution.

…

They designed, and proved the viability of the design of, a self-reproducing entity composed entirely of Life cells that was also (for good measure) a Universal Turing machine — it was a two-dimensional computer that in principle can compute any computable function! What on Earth inspired Conway and his students to create first this world and then this amazing denizen of that world? They were trying to answer at a very abstract level one of the central questions we have been considering in this chapter: what is the minimal complexity required for a self-reproducing thing? They were following up the brilliant early speculations of John von Neumann, who had been working on the question at the time of his death in 1957. Francis Crick and James Watson had discovered DNA in 1953, but how it worked was a mystery for many years. Von Neumann had imagined in some detail a sort of floating robot that picked up pieces of flotsam and jetsam that could be used to build a duplicate of itself that would then be able to repeat the process. His description (posthumously published, 1966) of how an automaton would read its own blueprint and then copy it into its new creation anticipated in impressive detail many of the later discoveries about the mechanisms of DNA expression and replication, but in order to make his proof of the possibility of a self-reproducing automaton mathematically rigorous and tractable, von Neumann had switched to simple, two-dimensional abstractions, now known as cellular automata.
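The Life world Conway's group used is simple enough to write down in a few lines. A minimal sketch of one update step, representing the board as a set of live-cell coordinates (the self-reproducing pattern described above is, of course, astronomically larger than this toy):

```python
from collections import Counter

def step(live):
    """One Game of Life generation: a dead cell with exactly 3 live
    neighbors is born; a live cell with 2 or 3 live neighbors survives."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a horizontal and a vertical bar:
blinker = {(0, 1), (1, 1), (2, 1)}
assert step(step(blinker)) == blinker
```

These three rules, applied to every cell at once, are the entire physics of the world in which Conway's self-reproducing computer lives.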

…

Together they strike a fundamental blow at the last refuge to which people have retreated in the face of the Copernican Revolution: the mind as an inner sanctum that science cannot reach. (See Mazlish 1993) It is a long and winding road from molecules to minds, with many diverting spectacles along the way — and we will tarry over the most interesting of these in subsequent chapters — but now is the time to look more closely than usual at the Darwinian beginnings of Artificial Intelligence. 5. THE COMPUTER THAT LEARNED TO PLAY CHECKERS Alan Turing and John von Neumann were two of the greatest scientists of the century. If anybody could be said to have invented the computer, they did, and their brainchild has come to be recognized as both a triumph of engineering and an intellectual vehicle for exploring the most abstract realms of pure science. Both thinkers were at one and the same time awesome theorists and deeply practical, epitomizing an intellectual style that has been playing a growing role in science since the Second World War.

**
The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy
** by
Sharon Bertsch McGrayne

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

bioinformatics, British Empire, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, double helix, Edmond Halley, Fellow of the Royal Society, full text search, Henri Poincaré, Isaac Newton, John Nash: game theory, John von Neumann, linear programming, meta analysis, meta-analysis, Nate Silver, p-value, placebo effect, prediction markets, RAND corporation, recommendation engine, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, Ronald Reagan, speech recognition, statistical model, stochastic process, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, uranium enrichment, Yom Kippur War

Those connected with Colossus, the epitome of the British decryption effort, received little or no credit. Turing was given an Order of the British Empire (OBE), a routine award given to high civil servants. Newman was so angry at the government’s “derisory” lack of gratitude to Turing that he refused his own OBE. Britain’s science, technology, and economy were losers, too. The Colossi were built and operational years before the ENIAC in Pennsylvania and before John von Neumann’s computer at the Institute for Advanced Study in Princeton, but for the next half century the world assumed that U.S. computers had come first. Obliterating all information about the decryption campaign distorted Cold War attitudes about the value of cryptanalysis and about antisubmarine warfare. The war replaced human spies with machines. Decryption was faster than spying and provided unfiltered knowledge of the enemy’s thinking in real time, yet the Cold War glamorized military hardware and the derring-do of spydom.

…

Roughly 3 nuclear bombs were dropped accidentally or jettisoned on purpose per 1,000 flights that carried these weapons. Given that 80% of aircraft crashes occurred within 3 miles of an air force base, the likelihood of public exposure was growing. And so it went. None of these studies involved a nuclear explosion, but to a Bayesian they suggested ominous possibilities. Computationally, Madansky was confident that RAND’s two powerful computers, a 700 series IBM and the Johnniac, designed by and named for John von Neumann, could handle the job. But he hoped to avoid using them by solving the problem with pencil and paper. Given the power and availability of computers in the 1950s, many Bayesians were searching for ways to make calculations manageable. Madansky latched onto the fact that many types of priors and posteriors share the same probability curves. Bailey had used the same technique in the late 1940s, and later it would be known as Howard Raiffa and Robert Schlaifer’s conjugate priors.
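The conjugate-prior trick that made pencil-and-paper Bayesian work feasible can be sketched in a few lines: with a Beta prior on an unknown probability and binomial data, the posterior is again a Beta, so updating is simple arithmetic rather than a machine computation (the numbers below are hypothetical, for illustration only):

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(alpha, beta) prior + binomial data
    gives a Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

# Start from a Beta(2, 2) prior and observe 7 successes in 10 trials.
a, b = beta_binomial_update(2, 2, 7, 3)
posterior_mean = a / (a + b)   # (2 + 7) / (2 + 2 + 7 + 3) = 9/14
```

Because the posterior has the same functional form as the prior, each new batch of data just increments two counters, which is exactly the kind of shortcut a 1950s Bayesian without computer time needed.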

…

To Raiffa’s great surprise, he became a superb and “deliriously happy” student who raced through a bachelor’s degree in mathematics, a master’s degree in statistics, and a doctorate in mathematics in six years between 1946 and 1952. “In the year I studied statistics, I don’t think I heard the word ‘Bayes.’ As a way of inference, it was nonexistent. It was all strictly Neyman-Pearson, classical, objectivistic (frequency-based) statistics.”10 Although Schlaifer had embraced Bayes in one fell swoop, Raiffa inched grudgingly toward its subjectivity. But reading John von Neumann and Oskar Morgenstern’s book Theory of Games and Economic Behavior (1944), he instinctively assessed how others would play in order to determine how he himself should compete: “In my naiveté, without any theory or anything like that. . . . [I began] assessing judgmental probability distributions. I slipped into being a subjectivist without realizing how radically I was behaving. That was the natural thing to do. No big deal.”11 When Raiffa gave a series of seminars on Abraham Wald’s new book Statistical Decision Functions, he discovered it was full of Bayesian decision-making rules for use in a frequentist framework.

**
The Undercover Economist: Exposing Why the Rich Are Rich, the Poor Are Poor, and Why You Can Never Buy a Decent Used Car
** by
Tim Harford

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, barriers to entry, Berlin Wall, collective bargaining, congestion charging, Corn Laws, David Ricardo: comparative advantage, decarbonisation, Deng Xiaoping, Fall of the Berlin Wall, George Akerlof, invention of movable type, John Nash: game theory, John von Neumann, market design, Martin Wolf, moral hazard, new economy, price discrimination, Productivity paradox, race to the bottom, random walk, rent-seeking, Robert Gordon, Robert Shiller, Ronald Reagan, sealed-bid auction, second-price auction, second-price sealed-bid, Shenzhen was a fishing village, special economic zone, spectrum auction, The Market for Lemons, Thomas Malthus, trade liberalization, Vickrey auction

Unfortunately, not all of the economists who were hired as consultants knew how to set up the auction so that it was likely to produce a good price. One auction really did raise less than 1 percent of what was hoped for, while another raised ten times as much as expected. This wasn’t down to luck but to cleverness in some cases and blundering in others. Auctioning air, like playing poker, is a game of great skill—and one that was played for very high stakes indeed. Love, war, and poker Many of those who knew the mathematician John von Neumann regarded him as the “best brain in the world,” and they had a chance to compare him with some stiff competition, given that one of von Neumann’s colleagues at Princeton was Albert Einstein. Von Neumann was a genius around whom grew a mythology of almost superhuman intelligence. According to one story, von Neumann was asked to assist with the design of a new supercomputer, required to solve a new and important mathematical problem, which was beyond the capacities of existing supercomputers.

…

Chris “Jesus” Ferguson, a disciple of von Neumann, proved it when he won the World Series of Poker in 2000. But poker with your buddies in the garage is not the World Series; what can game theory say about players who get drunk and bluff badly? This is not a knockdown objection to game theory. It is possible to model mistakes, forgetfulness, misinformation, and any other kind of failure on the part of the players to live up to the impossibly high standards of John von Neumann. The trouble is that the more mistakes that need to be taken into account, the more complicated and the less useful game theory becomes. It is always useful for the game theorist to draw on experience as well as pure theory: if the game becomes too complex for the players to understand, the theory tells us nothing about what they will actually do and becomes nearly useless for practical purposes.

…

Telecom executives may curse the British auctions since 3G remains commercially unproven and threatened by competitors like Wi-Fi, but the public should celebrate them. All the companies involved were convinced that the 3G licenses offered tremendous scarcity value, and these auctions successfully secured a fair price for that apparent value. John von Neumann’s successors used game theory to achieve one of the most spectacular, if controversial, policy triumphs that economics had ever seen. The men who knew the “value of nothing” had shown that economists, like dentists, could finally earn their keep. Why Poor Countries Are Poor: They call Douala “the Armpit of Africa.”

**
Beyond: Our Future in Space
** by
Chris Impey

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, Admiral Zheng, Albert Einstein, Alfred Russel Wallace, Berlin Wall, Buckminster Fuller, butterfly effect, California gold rush, carbon-based life, Colonization of Mars, cosmic abundance, crowdsourcing, cuban missile crisis, dark matter, discovery of DNA, Doomsday Clock, Edward Snowden, Elon Musk, Eratosthenes, Haight Ashbury, Hyperloop, I think there is a world market for maybe five computers, Isaac Newton, Jeff Bezos, John von Neumann, Kickstarter, life extension, Mahatma Gandhi, Mars Rover, mutually assured destruction, Oculus Rift, operation paperclip, out of africa, Peter H. Diamandis: Planetary Resources, phenotype, purchasing power parity, RAND corporation, Ray Kurzweil, RFID, Richard Feynman, Richard Feynman: Challenger O-ring, risk tolerance, Search for Extraterrestrial Intelligence, Searching for Interstellar Communications, Silicon Valley, skunkworks, Skype, Stephen Hawking, Steven Pinker, supervolcano, technological singularity, telepresence, telerobotics, the medium is the message, the scientific method, theory of mind, V2 rocket, wikimedia commons, X Prize, Yogi Berra

This is a spacecraft that could go to a neighboring star system, mine materials to create replicas of itself, and send those out to other star systems. Using fairly conventional forms of propulsion, these probes could spread through a galaxy the size of the Milky Way in less than a few million years. The probes could investigate planetary systems and send information back to us on the home planet.23 The concept is named after the Hungarian mathematician and physicist John von Neumann. He was one of the major intellectual figures of the twentieth century, making important contributions to mathematics, physics, computer science, and economics. Noted physicist Eugene Wigner recalled that von Neumann’s unusual mind was like a “. . . perfect instrument whose gears were machined to mesh accurately within a thousandth of an inch.” But he was less perfect in the real world. As a driver, he had numerous accidents and a few arrests, usually because he was distracted or reading.

…

Exponential gains in processing power lead to the idea of the technological singularity. This is the time, projected to be in the middle of the twenty-first century, when civilization and human nature itself are fundamentally transformed. One variant of the singularity is when artificial intelligence surpasses human intelligence. Software-based synthetic minds begin to program themselves and a runaway reaction of self-improvement occurs. This event was foreshadowed by John von Neumann and Alan Turing in the 1950s. Turing wrote that “. . . at some stage therefore we should have to expect the machines to take control . . . ,” and von Neumann described “. . . an ever-accelerating progress and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”17 A dystopian version of this event permeates the popular culture, from science fiction novels to movies such as Blade Runner and The Terminator.

…

“X-Tech and the Search for Infra Particle Intelligence” by H. de Garis 2014, from Best of H+, online at http://hplusmagazine.com/2014/02/20/x-tech-and-the-search-for-infra-particle-intelligence/. 17. Intelligent Machinery, A Heretical Theory by A. Turing 1951, reprinted in Philosophia Mathematica 1996, vol. 4, no. 3, pp. 256–60. The von Neumann quote comes from Stanislaw Ulam’s “Tribute to John von Neumann” in the May 1958 Bulletin of the American Mathematical Society, p. 5. 18. “Are You Living in a Computer Simulation?” by N. Bostrom 2003. Philosophical Quarterly, vol. 53, no. 211, pp. 243–55. The views of Kurzweil and Moravec are represented in their popular books, in particular The Singularity Is Near: When Humans Transcend Biology by R. Kurzweil 2006. New York: Penguin; and Robot: Mere Machine to Transcendent Mind by H.

**
Priceless: The Myth of Fair Value (And How to Take Advantage of It)
** by
William Poundstone

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

availability heuristic, Cass Sunstein, collective bargaining, Daniel Kahneman / Amos Tversky, delayed gratification, Donald Trump, East Village, en.wikipedia.org, endowment effect, equal pay for equal work, experimental economics, experimental subject, feminist movement, game design, German hyperinflation, Henri Poincaré, high net worth, index card, invisible hand, John von Neumann, laissez-faire capitalism, loss aversion, market bubble, mental accounting, meta analysis, meta-analysis, Nash equilibrium, new economy, payday loans, Potemkin village, price anchoring, price discrimination, psychological pricing, Ralph Waldo Emerson, RAND corporation, random walk, RFID, Richard Thaler, risk tolerance, Robert Shiller, rolodex, Steve Jobs, The Chicago School, The Wealth of Nations by Adam Smith, ultimatum game, working poor

• • • Ward Edwards (1927–2005) spent his career asking difficult questions. Born in Morristown, New Jersey, he was the son of an economist and grew up hearing the table talk of his father’s colleagues. This instilled in him a rebellious skepticism toward economics. Ward decided on psychology as a career, studying at Swarthmore and Harvard. It was at Harvard that he read the work of John von Neumann and Oskar Morgenstern, and he wasn’t crazy about all he read. Hungarian-born John von Neumann was one of the great mathematicians of the twentieth century. At the urging of Princeton economist Oskar Morgenstern, von Neumann turned his brilliant mind to the problems of economics. The result was a 1944 book, Theory of Games and Economic Behavior. Von Neumann’s running metaphor was that economic conflicts were “games,” something like poker and equally amenable to mathematical analysis.

…

Between the trick haircut and the tight smile that might be a frown, Allais’ face evoked one of those odd pictures that becomes a different face when turned upside down. Allais had told Savage he had something to show him. It was a little test he wanted him to take. The important thing is that Savage failed the test. Savage was a brash statistician, then at the University of Chicago. He had gone into statistics on the advice of John von Neumann himself. Visually, the most remarkable thing about him was his eyeglasses. Their lenses packed enough diopters to reveal the space behind his head. At Chicago, Savage had acquired a second mentor, Milton Friedman—founding father of the Chicago school of economics, future Nobel laureate, and veritable saint to Reagan-era capitalists. Friedman knew quite a bit of statistics for an economist.

**
The Golden Ticket: P, NP, and the Search for the Impossible
** by
Lance Fortnow

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, Claude Shannon: information theory, cloud computing, complexity theory, Erdős number, four colour theorem, Gerolamo Cardano, Isaac Newton, John von Neumann, linear programming, new economy, NP-complete, Occam's razor, P = NP, Paul Erdős, Richard Feynman, smart grid, Stephen Hawking, traveling salesman, Turing machine, Turing test, Watson beat the top human players on Jeopardy!, William of Occam

We don’t think that P = NP, but the possibility of such a beautiful world is tantalizing. P versus NP P versus NP is about all the problems described above and thousands more of a similar flavor: How fast can we search through a huge number of possibilities? How easily can we find that “golden ticket,” that one best answer? The P versus NP problem was first mentioned in a 1956 letter from Kurt Gödel to John von Neumann, two of the greatest mathematical minds of the twentieth century. That letter was unfortunately lost until the 1980s. The P versus NP problem was first publicly announced in the early 1970s by Steve Cook and Leonid Levin, working separately in countries on opposite sides of the Cold War. Richard Karp followed up with a list of twenty-one important problems that capture P versus NP, including the traveling salesman problem and the partition puzzle mentioned earlier.

…

Alexander Razborov, a Russian student, played a major role in the development of circuit complexity as an approach to proving P ≠ NP, a story we tell in chapter 7. After the collapse of the Soviet Union and the rise of the Internet, Russian mathematical researchers no longer worked in isolation. The world is now a truly global research environment. The Gödel Letter In 1956 Kurt Gödel wrote a letter to John von Neumann, one of the pioneers of computer science and many other fields. In this letter (written in German), Gödel talks about the satisfiability problem and formulates the P versus NP question in different terminology. He suggests that if we lived in a world where P = NP, “the mental work of a mathematician concerning Yes-or-No questions could be completely replaced by a machine. … Now it seems to me, however, to be completely within the realm of possibility.”
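Gödel's question can be made concrete with a toy brute-force satisfiability search, which shows why the naive approach doubles its work with every added variable (a minimal sketch; the formula at the end is made up for illustration):

```python
from itertools import product

def brute_force_sat(n_vars, clauses):
    """Try all 2**n_vars truth assignments. A clause is a list of
    literals: +i means variable i is true, -i means variable i is false.
    Returns a satisfying assignment as a tuple of booleans, or None."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return bits
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(brute_force_sat(3, [[1, 2], [-1, 3], [-2, -3]]))
```

P = NP would mean this exponential enumeration can always be replaced by something fast, which is exactly the possibility Gödel raises about mechanizing the "mental work of a mathematician."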

**
When Things Start to Think
** by
Neil A. Gershenfeld

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, Ada Lovelace, Bretton Woods, cellular automata, Claude Shannon: information theory, Dynabook, Hedy Lamarr / George Antheil, I think there is a world market for maybe five computers, invention of movable type, Iridium satellite, Isaac Newton, Jacquard loom, John von Neumann, means of production, new economy, Nick Leeson, packet switching, RFID, speech recognition, Stephen Hawking, Steve Jobs, telemarketer, the medium is the message, Turing machine, Turing test, Vannevar Bush

There is a disconnect between the breathless pronouncements of cyber gurus and the experience of ordinary people left perpetually upgrading hardware to meet the demands of new software, or wondering where their files have gone, or trying to understand why they can't connect to the network. The revolution so far has been for the computers, not the people. Digital data of all kinds, whether an e-mail message or a movie, is encoded as a string of 0's and 1's because of a remarkable discovery by Claude Shannon and John von Neumann in the 1940s. Prior to their work, it was obvious that engineered systems degraded with time and use. A tape recording sounds worse after it is duplicated, a photocopy is less satisfactory than an original, a telephone call becomes more garbled the farther it has to travel. They showed that this is not so for a digital representation. Errors still occur in digital systems, but instead of continuously degrading the performance there is a threshold below which errors can be corrected with near certainty.
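The threshold idea can be sketched with the simplest possible error-correcting scheme, a threefold repetition code with majority voting; this is an illustrative stand-in, not the specific codes Shannon or von Neumann analyzed:

```python
def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote within each triple; corrects any single flipped
    bit per triple, so mild channel noise leaves the message intact."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                  # one bit corrupted in transit
assert decode(sent) == msg    # the copy is still perfect
```

Unlike a tape duplicate or a photocopy, the decoded message is identical to the original as long as the error rate stays below the code's threshold, which is exactly the difference between analog degradation and digital copying.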

…

When Babbage started building machines to evaluate not just arithmetic but more complex functions he likewise used discrete values. This required approximating the continuous changes by small differences, hence the name of the Difference Engine. These approximations have been used ever since in electronic digital computers to allow them to manipulate models of the continuous world. Starting in the 1940s with John von Neumann, people realized that this practice was needlessly circular. Most physical phenomena start out discrete at some level. A fluid is not actually continuous; it is just made up of so many molecules that it appears to be continuous. The equations of calculus for a fluid are themselves an approximation of the rules for how the molecules behave. Instead of approximating discrete molecules with continuous equations that then get approximated with discrete variables on a computer, it's possible to go directly to a computer model that uses discrete values for time and space.
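A minimal example of such a fully discrete model is an elementary cellular automaton, where space, time, and state are all discrete from the outset (a toy sketch in the spirit of the passage, not a fluid simulation):

```python
def rule90_step(cells):
    """One step of Wolfram's rule 90 on a ring of 0/1 cells: each
    cell becomes the XOR of its two neighbors. No continuous
    quantities are approximated; the update rule IS the model."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

row = [0] * 7
row[3] = 1                    # a single "on" cell in the middle
for _ in range(3):
    row = rule90_step(row)    # the pattern spreads discretely
```

Lattice-based models of fluids work on the same principle at much larger scale: simple local rules on a discrete grid, iterated in discrete time steps, rather than discretized differential equations.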

**
Why Information Grows: The Evolution of Order, From Atoms to Economies
** by
Cesar Hidalgo

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, Albert Einstein, Arthur Eddington, Claude Shannon: information theory, David Ricardo: comparative advantage, Douglas Hofstadter, frictionless, frictionless market, George Akerlof, Gödel, Escher, Bach, income inequality, income per capita, invention of the telegraph, invisible hand, Isaac Newton, James Watt: steam engine, Jane Jacobs, job satisfaction, John von Neumann, New Economic Geography, Norbert Wiener, p-value, phenotype, price mechanism, Richard Florida, Ronald Coase, Silicon Valley, Simon Kuznets, Skype, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, The Market for Lemons, The Nature of the Firm, The Wealth of Nations by Adam Smith, total factor productivity, transaction costs, working-age population

Finally, since there are 140 characters in a tweet, Brian will need 140 × 5 = 700 yes-or-no questions, or bits, to uniquely identify Abby’s tweet.7 Shannon’s theory tells us that we need 700 bits, or yes-or-no questions, to communicate a tweet written using a thirty-two-character alphabet. Shannon’s theory is also the basis of modern communication systems. By quantifying the number of bits we need to encode messages, he helped develop digital communication technologies. Yet what Shannon did not know when he developed his formula was that it was identical to the formula discovered by Boltzmann nearly half a century earlier. At the suggestion of John von Neumann, the famous Hungarian mathematician, Shannon decided to call his measure “entropy,” since Shannon’s formula was equivalent to the formula for entropy used by statistical physicists. (Also—as the legend goes—von Neumann told Shannon that calling his measure entropy would guarantee Shannon’s victory in every argument, since nobody really knew what entropy was.) But the interpretation of entropy and information that emerged from Shannon’s work was hard to reconcile both with the traditional use of the word information and with the interpretation of entropy that emerged from Boltzmann’s work.
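The arithmetic above is easy to verify: each character drawn from a thirty-two-character alphabet costs log2(32) = 5 yes-or-no questions, so 140 characters cost 700 bits. A quick sketch:

```python
import math

alphabet_size = 32
tweet_length = 140

bits_per_char = math.log2(alphabet_size)       # 5 questions per character
bits_per_tweet = tweet_length * bits_per_char  # 140 * 5 = 700 bits
```

This is Shannon's entropy for the special case of equally likely symbols; when some characters are more probable than others, the average number of questions needed drops below log2 of the alphabet size.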

…

The fact that the genetic variation between individuals is much larger than the genetic variation between groups is a key argument to fend off racist and eugenic arguments. This explanation is key to the line of argumentation advanced in Pinker, The Blank Slate. 14. Speculating about the knowledge- and information-carrying capacity of the human brain is an interesting exercise. Among the first ones to perform this exercise was John von Neumann, the Hungarian polymath who became interested in computers while working on the Manhattan Project. Some of his speculations on the topic are presented in his The Computer and the Brain (New Haven, CT: Yale University Press, 1958). There, von Neumann notes that the architecture of the brain is fundamentally different from that of computers. Computers are built on transistors, which take two inputs to produce one output, while brains are built on neurons, which can take up to tens of thousands of inputs to produce a single output.

**
Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots
** by
John Markoff

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

A Declaration of the Independence of Cyberspace, AI winter, airport security, Apple II, artificial general intelligence, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Duvall, bioinformatics, Brewster Kahle, Burning Man, call centre, cellular automata, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, collective bargaining, computer age, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deskilling, don't be evil, Douglas Engelbart, Douglas Hofstadter, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, factory automation, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, Google Glasses, Google X / Alphabet X, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, haute couture, hive mind, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, Mother of all demos, natural language processing, new economy, Norbert Wiener, PageRank, pattern recognition, pre–internet, RAND corporation, Ray Kurzweil, Richard Stallman, Robert Gordon, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Nelson, telemarketer, telepresence, 
telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Turing test, Vannevar Bush, Vernor Vinge, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, William Shockley: the traitorous eight

A decade and a half after he had issued his first warnings about the consequences of automated machines, Wiener turned his thoughts to religion and technology while remaining a committed humanist. In his final book, God & Golem, Inc., he explored the future human relationship with machines through the prism of religion. Invoking the parable of the golem, he pointed out that despite best intentions, humans are incapable of understanding the ultimate consequences of their inventions.16 In his 1980 dual biography of John von Neumann and Wiener, Steven Heims notes that in the late 1960s he had asked a range of mathematicians and scientists about Wiener’s philosophy of technology. The general reaction of the scientists was as follows: “Wiener was a great mathematician, but he was also eccentric. When he began talking about society and the responsibility of scientists, a topic outside of his area of expertise, well, I just couldn’t take him seriously.”17 Heims concludes that Wiener’s social philosophy hit a nerve with the scientific community.

…

He jumped straight to advanced calculus and simultaneously a range of other courses including aeronautical engineering. He was drafted relatively late in the war, so his army career was more about serving as a cog in the bureaucracy than combat. Stationed close to home at Fort MacArthur in the port city of San Pedro, California, he began as a clerk, preparing discharges, then promotions for soldiers leaving the military. He made his way to Princeton for graduate school and promptly paid a visit to John von Neumann, the applied mathematician and physicist who would become instrumental in defining the basic design of the modern computer. At this point the notion of “artificial intelligence” was fermenting in McCarthy’s mind, but the coinage had not yet come to him. That wouldn’t happen for another half decade in conjunction with the summer 1956 Dartmouth conference. He had first come to the concept in grad school when attending the Hixon Symposium on Cerebral Mechanisms in Behavior at Caltech.10 At that point there weren’t programmable computers, but the idea was in the air.

…

pagewanted=all. 12.Ibid. 13.Ibid. 14.Carew, Walter Reuther, 144. 15.The Ad Hoc Committee on the Triple Revolution, “The Triple Revolution,” Liberation, April 1964, http://www.educationanddemocracy.org/FSCfiles/C_CC2a_TripleRevolution.htm. 16.Mark D. Stahlman, “Wiener’s Genius Project” (invited paper, IEEE 2014 Conference on Norbert Wiener in the 21st Century, 2014). 17.Steve J. Heims, John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death (Cambridge, MA: MIT Press, 1980), 343. 18.Norbert Wiener, God and Golem, Inc.: A Comment on Certain Points where Cybernetics Impinges on Religion (Cambridge, MA: MIT Press, 1964), 29. 19.“Machines Smarter Than Men? Interview with Dr. Norbert Wiener, Noted Scientist,” U.S. News & World Report, February 24, 1964, http://21stcenturywiener.org/wp-content/uploads/2013/11/Machines-Smarter-Than-Man-Interview-with-Norbert-Wiener.pdf. 20.Defense Science Board, “The Role of Autonomy in DoD Systems,” U.S.

**
Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing (Writing Science)
** by
Thierry Bardini

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Apple II, augmented reality, Bill Duvall, conceptual framework, Douglas Engelbart, Dynabook, experimental subject, Grace Hopper, hiring and firing, hypertext link, index card, information retrieval, invention of hypertext, Jaron Lanier, Jeff Rulifson, John von Neumann, knowledge worker, Menlo Park, Mother of all demos, new economy, Norbert Wiener, packet switching, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, RFC: Request For Comment, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, stochastic process, Ted Nelson, the medium is the message, theory of mind, Turing test, unbiased observer, Vannevar Bush, Whole Earth Catalog

(Licklider 1965, 94–95) Efforts to develop "equipment in which the user and the computer make their marks on different screens," that is, equipment in which, as on a typewriter keyboard, what the hand does and what the eye sees were again unlinked, in fact had started in the mid-1950s, with the design and operation of JOHNNIAC, a Princeton-class computer built at RAND between 1950 and 1953 and named after John von Neumann, and the work of Allen Newell, Herbert Simon, and Cliff Shaw at RAND on JOSS, the JOHNNIAC Open Shop System, between 1960 and 1964. JOSS's main application was a "helpful assistant" in the Artificial Intelligence tradition designed for mathematicians, an "open-shop" experiment in on-line communication.8 "Open shop" in this context meant that JOSS was directly available to its users, who for the first time in computing history were not programmers or computer scientists: they were mathematicians at RAND.

…

The notion that their conceptual system better fit scientific reality—that we could learn from them—bordered on the unthinkable" (1987, 330, emphasis in the original). 12. The excerpt of Engelbart's 1962 report surely conveys the impression that Engelbart directly quotes Whorf. It is obvious that it is not the case, but that Engelbart is giving here his own translation of the hypothesis. 13. During their training years as young mathematicians, from 1924 to 1926, both Norbert Wiener and John von Neumann spent some time in Göttingen, Germany, where their paths first crossed when they attended lectures by Heisenberg (Heims 1980, 51–52). 14. Korzybski summarized these three Aristotelian laws of thought as follows: "(1) The law of identity. (Whatever is, is. {A thing is what it is}); (2) The law of contradiction. (Nothing can both be, and not be); and (3) The law of excluded third {middle}. (Everything must either be, or not be)" (quoted in Paulson 1983, 47). 15.
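The three laws Korzybski summarizes have standard renderings in modern logical notation; as an illustrative gloss (not taken from the book), they can be written:

```latex
\begin{align*}
\text{Identity:}        &\quad \forall x\, (x = x) \\
\text{Contradiction:}   &\quad \neg\,(P \land \neg P) \\
\text{Excluded middle:} &\quad P \lor \neg P
\end{align*}
```

The third is the one non-classical logics typically reject, which is why Korzybski singles these out as specifically "Aristotelian."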

…

.: Spartan Books. Hauben, M. 1996. "Behind the Net: The Untold History of the ARPANET." Available on-line at http://picasso.dei.isep.ipp.pt/docs/arpa.html. Hayles, N. K. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press. Heim, M. 1993. The Metaphysics of Virtual Reality. Oxford: Oxford University Press. Heims, S. J. 1980. John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death. Cambridge, Mass.: MIT Press. ———. 1991. The Cybernetics Group. Cambridge, Mass.: MIT Press. Herkimer County Historical Society. 1923. The Story of the Typewriter, 1873–1923. Herkimer, N.Y. Hodges, A. 1992 [1983]. Alan Turing: The Enigma. London: Vintage. Hofstadter, R. 1962. Anti-Intellectualism in American Life. New York: Vintage Books.

**
I Am a Strange Loop
** by
Douglas R. Hofstadter

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andrew Wiles, Benoit Mandelbrot, Brownian motion, double helix, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Isaac Newton, James Watt: steam engine, John Conway, John von Neumann, mandelbrot fractal, pattern recognition, Paul Erdős, place-making, probability theory / Blaise Pascal / Pierre de Fermat, publish or perish, random walk, Ronald Reagan, self-driving car, Silicon Valley, telepresence, Turing machine

They thought they were building machines of very limited, and purely military, scopes — for instance, machines to calculate the trajectories of ballistic missiles, taking wind and air resistance into account, or machines to break very specific types of enemy codes. They envisioned their computers as being specialized, single-purpose machines — a little like wind-up music boxes that could play just one tune each. But at some point, when Alan Turing’s abstract theory of computation, based in large part on Gödel’s 1931 paper, collided with the concrete engineering realities, some of the more perceptive people (Turing himself and John von Neumann especially) put two and two together and realized that their machines, incorporating the richness of integer arithmetic that Gödel had shown was so potent, were thereby universal. All at once, these machines were like music boxes that could read arbitrary paper scrolls with holes in them, and thus could play any tune. From then on, it was simply a matter of time until cell phones started being able to don many personas other than just the plain old cell-phone persona.

…

SL #641: When we do look down at our fine-grained substrates through scientific experiments, we find small miracles just as Gödelian as is “I”. SL #642: Ah, yes, to be sure — little microgödelinos! But… such as? SL #641: I mean the self-reproduction of the double helix of DNA. The mechanism behind it all involves just the same abstract ideas as are implicated in Gödel’s type of self-reference. This is what John von Neumann unwittingly revealed when he designed a self-reproducing machine in the early 1950s, and it had exactly the same abstract structure as Gödel’s self-referential trick did. SL #642: Are you saying microgödelinos are self-replicating machines? SL #641: Yes! It’s a subtle but beautiful analogy. The analogue of the Gödel number k is a particular blueprint. The “parent” machine examines this blueprint and follows its instructions exactly — that is, it builds what the blueprint depicts.
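The blueprint-examining trick has a well-known software analogue: a quine, a program whose output is its own source. This minimal sketch (not from the book) mirrors the von Neumann structure — the string `s` is the "blueprint," and the program both copies it literally and interprets it as instructions:

```python
# The blueprint: a template that contains a slot (%r) for its own description.
s = 's = %r\nprint(s %% s)'
# "Build what the blueprint depicts": fill the slot with the blueprint itself.
print(s % s)
```

Running these two lines prints exactly those two lines, just as the parent machine, given a blueprint that depicts "a machine holding this blueprint," constructs a copy of itself.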

…

Page 139 an elegant linguistic analogy… See [Quine] for the original idea (which is actually a variation of Gödel’s idea (which is itself a variation of an idea of Jules Richard (which is a variation of an idea of Georg Cantor (which is a variation of an idea of Euclid (with help from Epimenides))))), and [Hofstadter 1979] for a variation on Quine’s theme. Page 147 “…and Related Systems (I)”… Gödel put a roman numeral at the end of the title of his article because he feared he had not spelled out sufficiently clearly some of his ideas, and expected he would have to produce a sequel. However, his paper quickly received high praise from John von Neumann and other respected figures, catapulting the unknown Gödel to a position of great fame in a short time, even though it took most of the mathematical community decades to absorb the meaning of his results. Page 150 respect for …the most mundane of analogies… See [Hofstadter 2001] and [Sander], as well as Chapter 24 in [Hofstadter 1985] and [Hofstadter and FARG]. Page 159 X’s play is so mega-inconsistent… This should be heard as “X’s play is omega-inconsistent”, which makes a phonetic hat-tip to the metamathematical concepts of omega-inconsistency and omega-incompleteness, discussed in many books in the Bibliography, such as [DeLong], [Nagel and Newman], [Hofstadter 1979], [Smullyan 1992], [Boolos and Jeffrey], and others.

**
The Age of Radiance: The Epic Rise and Dramatic Fall of the Atomic Era
** by
Craig Nelson

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Brownian motion, cognitive dissonance, Columbine, corporate governance, cuban missile crisis, dark matter, Doomsday Clock, El Camino Real, Ernest Rutherford, failed state, Henri Poincaré, hive mind, Isaac Newton, John von Neumann, Louis Pasteur, Menlo Park, Mikhail Gorbachev, music of the spheres, mutually assured destruction, nuclear winter, oil shale / tar sands, Project Plowshare, Ralph Nader, Richard Feynman, Richard Feynman, Ronald Reagan, Skype, Stuxnet, technoutopianism, too big to fail, uranium enrichment, V2 rocket, éminence grise

Like all the rest of the Creeps, Fermi’s bodyguard and chauffeur, the Italian-speaking American, halfback-size lawyer John Baudino, filed weekly reports on his subject with army intelligence. To keep Fermi from talking about his work to others, Baudino got him to talk about it to him. So Fermi started saying, “Soon Johnny will know so much about the project he will need a bodyguard, too.” When agent Charles Campbell, who hated physics but pretended to like it as part of his job, mentioned to John von Neumann that he was too busy to study, von Neumann got upset: “It is my fault! You will come with me and together we will study theoretical physics in New Mexico!” FBI surveillance teams used walkie-talkies disguised as hearing aids, and any assembled together looked conspicuously like an outing of deaf people. The same month that Los Alamos opened, the FBI suspended its surveillance of Oppenheimer at the insistence of the army.

…

By the summer of 1948, Edward Teller’s Chicago idyll was upended by news of the Soviet invasions of Hungary, his birthplace, and Czechoslovakia, with its uranium mother lode at St. Joachimsthal. Communists were victorious in China, and soon enough, they would successfully blockade Berlin. It appeared to many that America’s foes were taking over the world, that the United States was in real danger. “Russia was traditionally the enemy,” John von Neumann said of his countrymen. “I think you will find, generally speaking, among Hungarians an emotional fear and dislike of Russia.” Had Edward Teller been certain that a hydrogen bomb was impossible, that nobody could make it, he would have set his sights elsewhere. But like Leo Szilard’s thinking of Hitler, Ed was tormented by what might happen if the Americans failed to create such a mighty weapon, and the totalitarians succeeded.

…

The very least we can conclude is that our twenty-thousandth bomb, useful as it may be in filling the vast munitions pipelines of a great war, will not in any deep strategic sense offset their two-thousandth.” Hearing this, physicist John Wheeler complained to a congressman, “Anybody who says twenty thousand weapons are no better than two thousand ought to read the history of wars.” Sharing Wheeler’s perspective was Hungarian mathematician John von Neumann, who announced in 1950, “If you say why not bomb them tomorrow, I say why not today? If you say today at five o’clock, I say why not one o’clock?” Von Neumann’s promotion of a preemptory nuclear strike was one of the many oddities that inspired Einstein to nickname his Princeton colleague Denktier, “think animal.” Washington military and civilian policymakers during the Cold War would be baffled by the conundrum of “What is the atom good for in war?”

**
Richard Dawkins: How a Scientist Changed the Way We Think
** by
Alan Grafen; Mark Ridley

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alfred Russel Wallace, Arthur Eddington, bioinformatics, cognitive bias, computer age, conceptual framework, Dava Sobel, double helix, Douglas Hofstadter, epigenetics, Fellow of the Royal Society, Haight Ashbury, interchangeable parts, Isaac Newton, Johann Wolfgang von Goethe, John von Neumann, loose coupling, Murray Gell-Mann, Necker cube, phenotype, profit maximization, Ronald Reagan, Stephen Hawking, Steven Pinker, the scientific method, theory of mind, Thomas Kuhn: the structure of scientific revolutions, Yogi Berra

I am not concerned here with the psychology of motives.’14 Dawkins’ brilliant application of mentalistic behaviorism—what I call the intentional stance—to evolutionary biology was, like my own coinage, an articulation of ideas that were already proving themselves in the work of many other theorists. We are both clarifiers and unifiers of practices and attitudes pioneered by others, and we share a pantheon: Alan Turing and John von Neumann on the one hand, and Bill Hamilton, John Maynard Smith, George Williams, and Bob Trivers on the other. We see computer science and evolutionary theory fitting together in excellent harmony; it’s algorithms all the way down. Dawkins and I have both had to defend our perspective against those who cannot fathom—or abide—this strategic approach to such deep matters. Mary Midgley15 was incredulous—how on earth could a gene be selfish?

…

In the 1950s, Alan Turing, the ‘father of artificial intelligence’ and a man fundamentally associated with codes, logic, chess, and other mechanico-mathematical arcana, developed influential models of biological morphogenesis:1 the processes involved in the development of biological patterns as an organism grows from a single cell. He was particularly interested in accounting for the tendency of spiral patterns in many plant structures to obey the Fibonacci sequence (e.g. if you count the number of whirls running clockwise on a pine cone and the number running anticlockwise, the two numbers will be consecutive terms in Fibonacci’s famous sequence of integers: 0, 1, 1, 2, 3, 5, 8, 13, ...). At the same time, John von Neumann, one of history’s great polymaths and the man responsible for game theory and the architecture of the modern computer among many other things typically considered to lie far from the muddy field of biology, worked on the problem of self-replication:2 over evolutionary time, simple life-forms have given rise to more complicated creatures, but how, von Neumann asked, could a machine (like a dog or an amoeba or a robot) make a more complex version of itself?
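The pine-cone observation is easy to check computationally. A small sketch (illustrative, not from the book; the function names are my own) generates the sequence and tests whether two spiral counts are consecutive Fibonacci terms:

```python
def fib(n):
    """Return the first n Fibonacci numbers: 0, 1, 1, 2, 3, 5, 8, 13, ..."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def consecutive_fibonacci(a, b, limit=50):
    """True if a, b occur as consecutive terms of the Fibonacci sequence,
    as the clockwise and anticlockwise spiral counts on a pine cone do."""
    seq = fib(limit)
    return any(seq[i] == a and seq[i + 1] == b for i in range(len(seq) - 1))
```

For a typical pine cone with 8 spirals one way and 13 the other, `consecutive_fibonacci(8, 13)` returns `True`, while a non-Fibonacci pairing such as `(8, 12)` returns `False`.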

**
The Misbehavior of Markets
** by
Benoit Mandelbrot

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, asset allocation, Augustin-Louis Cauchy, Benoit Mandelbrot, Big bang: deregulation of the City of London, Black-Scholes formula, British Empire, Brownian motion, buy low sell high, capital asset pricing model, carbon-based life, discounted cash flows, diversification, double helix, Edward Lorenz: Chaos theory, Elliott wave, equity premium, Eugene Fama: efficient market hypothesis, Fellow of the Royal Society, full employment, Georg Cantor, Henri Poincaré, implied volatility, index fund, informal economy, invisible hand, John von Neumann, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, market bubble, market microstructure, new economy, paper trading, passive investing, Paul Lévy, Plutocrats, plutocrats, price mechanism, quantitative trading / quantitative ﬁnance, Ralph Nelson Elliott, RAND corporation, random walk, risk tolerance, Robert Shiller, Robert Shiller, short selling, statistical arbitrage, statistical model, Steve Ballmer, stochastic volatility, transfer pricing, value at risk, volatility smile

Mandelbrot’s career has taken a jagged path. In 1945, he dropped out of France’s most prestigious school, the École Normale Supérieure, on the second day, to enroll at the less-exalted but more appropriate École Polytechnique. He proceeded to Caltech; then—after a Ph.D. in Paris—to MIT; then to the Institute for Advanced Study in Princeton, as the last post-doc to study with the great Hungarian-born mathematician, John von Neumann; then to Geneva and back to Paris for a time. Atypically for a scientist in those days, Mandelbrot ended up working, not in a university lecture hall, but in an industrial laboratory, IBM Research, up the Hudson River from Manhattan. At that time IBM’s bosses were drawing into that lab and its branches a number of brainy, unpredictable people, not doubting they would do something brilliant for the company.

…

He was suffered to give occasional lecture series at the University of Paris; it was feared he would in some way disrupt the standard curriculum. I recall that by the end of one such series, I was his sole auditor; we could as easily have quit the auditorium and adjourned to his office for a chat. At seventy-eight, he received belated recognition by election to France’s Académie des Sciences. But he was ever an anomaly. As a later teacher of mine, John von Neumann, told me: “I think I understand how every other mathematician operates, but Lévy is like a visitor from a strange planet. He seems to have his own private methods of arriving at the truth, which leave me ill at ease.” Lévy did not “arrive at” probability theory until he was nearly forty, when he was asked shortly after World War I to lecture on targeting errors in gunnery. He was soon doing original work, beginning with what he—most unfortunately—called “stable” probability distributions.

**
Is God a Mathematician?
** by
Mario Livio

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Antoine Gombaud: Chevalier de Méré, Brownian motion, cellular automata, correlation coefficient, correlation does not imply causation, cosmological constant, Dava Sobel, double helix, Edmond Halley, Eratosthenes, Georg Cantor, Gerolamo Cardano, Gödel, Escher, Bach, Henri Poincaré, Isaac Newton, John von Neumann, music of the spheres, probability theory / Blaise Pascal / Pierre de Fermat, The Design of Experiments, the scientific method, traveling salesman

Zermelo—who discovered Russell’s paradox independently as early as 1900—proposed a way to build set theory on a corresponding axiomatic foundation. Russell’s paradox was bypassed in this theory by a careful choice of construction principles that eliminated contradictory ideas such as “the set of all sets.” Zermelo’s scheme was further augmented in 1922 by the Israeli mathematician Abraham Fraenkel (1891–1965) to form what has become known as the Zermelo-Fraenkel set theory (other important changes were added by John von Neumann in 1925). Things would have been nearly perfect (consistency was yet to be demonstrated) were it not for some nagging suspicions. There was one axiom—the axiom of choice—that just like Euclid’s famous “fifth” was causing mathematicians serious heartburn. Put simply, the axiom of choice states: If X is a collection (set) of nonempty sets, then we can choose a single member from each and every set in X to form a new set Y.
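The informal statement of the axiom of choice just given has a standard formal rendering; as an illustrative sketch in first-order notation (one common form, not quoted from the book):

```latex
\forall X \,\Big[\, \varnothing \notin X \;\Longrightarrow\;
  \exists f : X \to \textstyle\bigcup X \;\;
  \forall A \in X \;\; f(A) \in A \,\Big]
```

The function $f$ is the "choice function" that picks the single member from each set in $X$; the set $Y$ in the prose is its image.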

…

In an article published in 1947 he wrote: But, despite their remoteness from sense experience, we do have something like a perception of the objects of set theory, as is seen from the fact that the axioms force themselves upon us as being true. I don’t see any reason why we should have less confidence in this kind of perception, i.e., in mathematical intuition, than in sense perception. By an ironic twist of fate, just as the formalists were getting ready for their victory march, Kurt Gödel—an avowed Platonist—came and rained on the parade of the formalist program. The famous mathematician John von Neumann (1903–57), who was lecturing on Hilbert’s work at the time, canceled the rest of his planned course and devoted the remaining time to Gödel’s findings. Gödel the man was every bit as complex as his theorems. In 1940, he and his wife Adele fled Nazi Austria so he could take up a position at the Institute for Advanced Study in Princeton, New Jersey. There he became a good friend and walking partner of Albert Einstein.

**
Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking
** by
Charles Seife

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, anti-communist, Brownian motion, correlation does not imply causation, Dmitri Mendeleev, Ernest Rutherford, Fellow of the Royal Society, Gary Taubes, Isaac Newton, John von Neumann, Mikhail Gorbachev, Project Plowshare, Richard Feynman, Richard Feynman, Ronald Reagan, the scientific method, Yom Kippur War

Los Alamos, perched on a mesa in the New Mexico desert, was the intellectual heart of the Manhattan Project. Other facilities, such as one at Oak Ridge in Tennessee and another at Hanford in Washington, were crucial to figuring out the best way to separate bombworthy uranium-235 from the much more common uranium-238 and how to manufacture plutonium-239.2 However, the big minds roamed at Los Alamos: Oppenheimer, Hans Bethe, Richard Feynman, Stanislaw Ulam, John von Neumann, Enrico Fermi, and Edward Teller. Teller, a Hungarian émigré and, arguably, a better theoretician than Oppenheimer, was brought to the University of Chicago in mid-1942 by the Manhattan Project just as it was getting under way. When Teller arrived, nobody assigned him a task, so he set to work trying to design the ultimate weapon, more powerful even than the one the project’s scientists were trying to build.

…

New York Times, 7 May 1958. ———. “British Modifying Fusion Apparatus.” New York Times, 29 January 1958. ———. “British-U.S. Data on Hydrogen Due.” New York Times, 13 January 1958. ———. “Briton 90% Sure Fusion Occurred.” New York Times, 25 January 1958. ———. “Butler Affirms Atom Fusion Lead.” New York Times, 31 January 1958. ———. “H-Bomb Untamed, Britain Admits.” New York Times, 17 May 1958. Macrae, Norman. John von Neumann. New York: Pantheon, 1992. Maddox, John. “What to Say about Cold Fusion.” Nature 338 (27 April 1989): 701. Magnetic Fusion Energy Engineering Act of 1980. Public Law 96-386 (7 October 1980). Malakoff, David. “DOE Slams Livermore for Hiding NIF Problems.” Science 285 (10 September 1999): 1647. Mallove, Eugene. “MIT and Cold Fusion: A Special Report.” Infinite Energy, 1999, issue 24, 1-57.

**
E=mc2: A Biography of the World's Most Famous Equation
** by
David Bodanis

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Arthur Eddington, Berlin Wall, British Empire, dark matter, Ernest Rutherford, Erwin Freundlich, Fellow of the Royal Society, Henri Poincaré, Isaac Newton, John von Neumann, Mercator projection, pre–internet, Richard Feynman, Richard Feynman, Silicon Valley, Silicon Valley startup, Stephen Hawking, Thorstein Veblen, V2 rocket

He nurtured the first theorists who proposed implosion; he assembled the right explosives experts; as the project grew to a level that under anyone else’s supervision it might have fallen apart in a mess of squabbling egos, he deftly manipulated the participants so that all the different groups involved worked together in parallel. At one point he had the top U.S. explosives expert, and the top UK explosives expert, and the Hungarian John von Neumann—the quickest mathematician anyone had met, who would also help create the computer in his long career—and a host of other nationalities all working on it. He even had Feynman joining in! The one prima donna who might have destroyed the effort was the embarrassingly egocentric Hungarian physicist Edward Teller. Oppenheimer neatly led him away, and granted him his own office and work team, even amid the shortages of skilled manpower, to concentrate on his own prize ideas.

…

Some of those radio signals were absorbed in the hospital’s walls, but most were bounced back skyward. Sticking out of the bomb’s back, near the spinning fins, were a number of whiplike thin radio antennae. Those collected the returning radio signals, and used the time lag each took to return as a way of measuring the height remaining to the ground. At 1,900 feet the last rebounded radio signal arrived. John von Neumann and others had calculated that a bomb exploding much higher would dissipate much of its heat in the open air; exploding much lower, it would dig a huge crater in the ground. At just under 2,000 feet the height would be ideal. An electric impulse lit cordite sacs, producing a conventional artillery blast. A small part of the total purified uranium was now pushed forward down a gun barrel that was actually inside the bomb.

**
Rise of the Robots: Technology and the Threat of a Jobless Future
** by
Martin Ford

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, additive manufacturing, Affordable Care Act / Obamacare, AI winter, algorithmic trading, Amazon Mechanical Turk, artificial general intelligence, autonomous vehicles, banking crisis, Baxter: Rethink Robotics, Bernie Madoff, Bill Joy: nanobots, call centre, Capital in the Twenty-First Century by Thomas Piketty, Chris Urmson, Clayton Christensen, clean water, cloud computing, collateralized debt obligation, computer age, debt deflation, deskilling, diversified portfolio, Erik Brynjolfsson, factory automation, financial innovation, Flash crash, Fractional reserve banking, Freestyle chess, full employment, Goldman Sachs: Vampire Squid, High speed trading, income inequality, indoor plumbing, industrial robot, informal economy, iterative process, Jaron Lanier, job automation, John Maynard Keynes: technological unemployment, John von Neumann, Khan Academy, knowledge worker, labor-force participation, labour mobility, liquidity trap, low skilled workers, low-wage service sector, Lyft, manufacturing employment, McJob, moral hazard, Narrative Science, Network effects, new economy, Nicholas Carr, Norbert Wiener, obamacare, optical character recognition, passive income, performance metric, Peter Thiel, Plutocrats, plutocrats, post scarcity, precision agriculture, price mechanism, Ray Kurzweil, rent control, rent-seeking, reshoring, RFID, Richard Feynman, Richard Feynman, Rodney Brooks, secular stagnation, self-driving car, Silicon Valley, Silicon Valley startup, single-payer health, software is eating the world, sovereign wealth fund, speech recognition, Spread Networks laid a new fibre optics cable between New York and Chicago, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steven Levy, Steven Pinker, strong AI, Stuxnet, technological singularity, telepresence, telepresence robot, The Bell Curve by Richard Herrnstein and Charles Murray, The Coming Technological Singularity, Thomas L Friedman, too big to fail, Tyler Cowen: Great Stagnation, union 
organizing, Vernor Vinge, very high income, Watson beat the top human players on Jeopardy!, women in the workforce

A Moral Question If we think again in terms of doubling a penny as a proxy for the exponential advance of digital technology, it’s clear that today’s enormous technological account balance results from the efforts of countless individuals and organizations over the course of decades. Indeed, the arc of progress can be traced back in time at least as far as Charles Babbage’s mechanical difference engine in the early nineteenth century. The innovations that have resulted in fantastic wealth and influence in today’s information economy, while certainly significant, do not really compare in importance to the groundbreaking work done by pioneers like Alan Turing or John von Neumann. The difference is that even incremental advances are now able to leverage that extraordinary accumulated account balance. In a sense, the successful innovators of today are a bit like the Boston Marathon runner who in 1980 famously snuck into the race only half a mile from the finish line. Of course, all innovators stand on the shoulders of those who came before them. This was certainly true when Henry Ford introduced the Model T.

…

Indeed, it might well spawn a wave of disruption that would scale across our entire civilization, let alone our economy. In the words of futurist and inventor Ray Kurzweil, it would “rupture the fabric of history” and usher in an event—or perhaps an era—that has come to be called “the Singularity.” The Singularity The first application of the term “singularity” to a future technology-driven event is usually credited to computer pioneer John von Neumann, who reportedly said sometime in the 1950s that “ever accelerating progress . . . gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”5 The theme was fleshed out in 1993 by San Diego State University mathematician Vernor Vinge, who wrote a paper entitled “The Coming Technological Singularity.”

**
Einstein's Dice and Schrödinger's Cat: How Two Great Minds Battled Quantum Randomness to Create a Unified Theory of Physics
** by
Paul Halpern

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Albert Michelson, Arthur Eddington, Brownian motion, clockwork universe, cosmological constant, dark matter, double helix, Ernest Rutherford, Fellow of the Royal Society, Isaac Newton, John von Neumann, lone genius, Murray Gell-Mann, New Journalism, Richard Feynman, Richard Feynman, Schrödinger's Cat, Solar eclipse in 1919, The Present Situation in Quantum Mechanics

It teeters in a superposition of north, south, east, and west. Now picture a strong breeze coming along from a wholly random direction. By touching the structure, it is in some sense taking a measurement. The house of cards topples over in one of the directions, collapsing into one of its constituent eigenstates. The process of measurement has triggered a collapse from the superposition into a single position. Hungarian mathematician John von Neumann would later show that all quantum processes obeyed one of two types of dynamics: the continuous, deterministic evolution governed by a wave equation (either the Schrödinger equation or a relativistic version such as the Dirac equation) and the discrete, probabilistic repositioning associated with wavefunction collapse. Schrödinger himself would continue to believe in the former process, arguing vehemently against the latter.
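The collapse picture described here can be caricatured in a few lines of Python: a superposition is a list of amplitudes, and measurement selects one eigenstate with probability equal to the amplitude's squared magnitude. The equal amplitudes and direction labels below are illustrative assumptions, not from the text:

```python
import random

def measure(amplitudes, outcomes):
    """Collapse a superposition: select one eigenstate with probability |a|^2."""
    weights = [abs(a) ** 2 for a in amplitudes]
    return random.choices(outcomes, weights=weights)[0]

# An equal superposition over the four directions of the house of cards;
# each amplitude of 0.5 gives a 25% chance per outcome.
state = [0.5, 0.5, 0.5, 0.5]
print(measure(state, ["north", "south", "east", "west"]))
```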

…

Causality, he argued, was a local process involving interactions between adjacent entities, spreading through space from one point to the next at the speed of light or slower. Distant things must be treated as physically distinct, not as a linked system. Otherwise a kind of “telepathy” could exist between an electron on Earth and one on, say, Mars. How could each immediately “know” what the other is doing? By then, John von Neumann had formalized the notion of wavefunction collapse, originally suggested by Heisenberg. In that formalism, a particle’s wavefunction can be expressed in terms of either position eigenstates or momentum eigenstates, but not both at once. It is something like slicing an egg. You could slice it along its length or across its width, but unless you want it to be diced instead of sliced, you’d only do one or the other.

**
Hacker's Delight
** by
Henry S. Warren

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

[Aus1] Found in a REXX interpreter subroutine written by Marc A. Auslander. [Aus2] Auslander, Marc A. Private communication. [Bern] Bernstein, Robert. "Multiplication by Integer Constants." Software—Practice and Experience 16, 7 (July 1986), 641-652. [BGN] Burks, Arthur W., Goldstine, Herman H., and von Neumann, John. "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument, Second Edition" (1947). In Papers of John von Neumann on Computing and Computing Theory, Volume 12 in the Charles Babbage Institute Reprint Series for the History of Computing, MIT Press, 1987. [CJS] Stephenson, Christopher J. Private communication. [Cohen] These rules were pointed out by Norman H. Cohen. [Cut] Cutland, Nigel J. Computability: An Introduction to Recursive Function Theory. Cambridge University Press, 1980. [CWG] Hoxey, Karim, Hay, and Warren (Editors).

…

[H&W] Hardy, G. H. and Wright, E. M. An Introduction to the Theory of Numbers, Fourth Edition. Oxford University Press, 1960. [IBM] From an IBM programming course, 1961. [Irvine] Irvine, M. M. "Early Digital Computers at Bell Telephone Laboratories." IEEE Annals of the History of Computing 23, 3 (July-September 2001), 22-42. [JVN] von Neumann, John. "First Draft of a Report on the EDVAC." In Papers of John von Neumann on Computing and Computing Theory, Volume 12 in the Charles Babbage Institute Reprint Series for the History of Computing, MIT Press, 1987. [Ken] Found in a GNU C compiler for the RS/6000 that was ported by Richard Kenner. He attributes this to a 1992 PLDI conference paper by him and Torbjörn Granlund. [Knu1] Knuth, Donald E. The Art of Computer Programming, Volume 1, Third Edition: Fundamental Algorithms.

**
Massive: The Missing Particle That Sparked the Greatest Hunt in Science
** by
Ian Sample

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Arthur Eddington, cuban missile crisis, dark matter, Donald Trump, double helix, Ernest Rutherford, Gary Taubes, Isaac Newton, John Conway, John von Neumann, Menlo Park, Murray Gell-Mann, Richard Feynman, Richard Feynman, Ronald Reagan, Stephen Hawking, uranium enrichment, Yogi Berra

Its most famous resident, Albert Einstein, who had died in 1955, had spent the last twenty-five years of his life there, trying to explain how the forces of nature were born. The Austrian-American logician Kurt Gödel was still there, redefining the limits of human knowledge. He and Einstein had been friends, though he had vexed Einstein by pointing out that his famous theories allowed time travel to be possible.21 The father of modern computing, John von Neumann, was also at the institute, turning the mathematics of poker into a political strategy to win the Cold War.22 Robert Oppenheimer, the towering figure who had led the Manhattan Project to build the atomic bomb, had become head of the institute in 1946, only adding to the intimidating aura of the place. Oppenheimer was renowned for his short temper and sharp tongue and could be at his worst when he turned up for the weekly seminars that were held on the campus.

…

The two charged components give mass to the positively and negatively charged W bosons. One neutral component gives mass to the Z boson. The Higgs boson is the quantum of the remaining neutral component field. 21 For more on Gödel’s work, see Thinking about Gödel and Turing: Essays on Complexity, 1970-2007, by Gregory J. Chaitin, World Scientific, 2007. 22 For more on von Neumann’s work on game theory, see Prisoner’s Dilemma: John von Neumann, Game Theory and the Puzzle of the Bomb, by William Poundstone, Anchor Books, 1993. 23 Interview with the author, August 2008. 24 Interview with the author, August 2007. 25 As recalled in an interview with the author, August 2008. 26 See “Conserved Currents and Associated Symmetries: Goldstone’s Theorem,” by Daniel Kastler, Derek W. Robinson, and André Swieca, Communications in Mathematical Physics 2, no. 2 (1966): 108-120. 27 Oppenheimer stood down from his directorship in 1966 after being diagnosed with throat cancer.

**
Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street
** by
William Poundstone

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, anti-communist, asset allocation, Benoit Mandelbrot, Black-Scholes formula, Brownian motion, buy low sell high, capital asset pricing model, Claude Shannon: information theory, computer age, correlation coefficient, diversified portfolio, en.wikipedia.org, Eugene Fama: efficient market hypothesis, high net worth, index fund, interest rate swap, Isaac Newton, Johann Wolfgang von Goethe, John von Neumann, Long Term Capital Management, Louis Bachelier, margin call, market bubble, market fundamentalism, Marshall McLuhan, New Journalism, Norbert Wiener, offshore financial centre, publish or perish, quantitative trading / quantitative ﬁnance, random walk, risk tolerance, risk-adjusted returns, Robert Shiller, Robert Shiller, Ronald Reagan, short selling, speech recognition, statistical arbitrage, The Predators' Ball, The Wealth of Nations by Adam Smith, transaction costs, traveling salesman, value at risk, zero-coupon bond

As he developed these ideas, Shannon needed a name for the incompressible stuff of messages. Nyquist had used intelligence, and Hartley had used information. In his earliest writings, Shannon favored Nyquist’s term. The military connotation of “intelligence” was fitting for the cryptographic work. “Intelligence” also implies meaning, however, which Shannon’s theory is pointedly not about. John von Neumann of Princeton’s Institute for Advanced Study advised Shannon to use the word entropy. Entropy is a physics term loosely described as a measure of randomness, disorder, or uncertainty. The concept of entropy grew out of the study of steam engines. It was learned that it is impossible to convert all the random energy of heat into useful work. A steam engine requires a temperature difference to run (hot steam pushing a piston against cooler air).
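The quantity von Neumann urged Shannon to call entropy has a compact definition, H = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal Python sketch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit
print(shannon_entropy([0.9, 0.1]))  # a biased coin: about 0.47 bits
```

The biased coin carries less entropy because its outcome is more predictable, which is exactly the sense in which entropy measures uncertainty rather than meaning.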

…

This incident appears to be the grain of truth behind an MIT legend of piles of uncashed checks languishing in Shannon’s office. In the late 1950s, Shannon began an intensive study of the stock market that was motivated both by intellectual curiosity and desire for gain. He filled three library shelves with something like a hundred books on economics and investing. The titles included Adam Smith’s The Wealth of Nations, John von Neumann and Oskar Morgenstern’s Theory of Games and Economic Behavior, and Paul Samuelson’s Economics, as well as books with a more practical focus on investment. One book Shannon singled out as a favorite was Fred Schwed’s wry classic, Where Are the Customers’ Yachts? At the time he was designing the roulette computer with Thorp, Shannon kept notes in an MIT notebook. Part of the notebook is devoted to the roulette device and part to a wildly disconnected set of stock market musings.

**
Fermat’s Last Theorem
** by
Simon Singh

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, Arthur Eddington, Augustin-Louis Cauchy, Fellow of the Royal Society, Georg Cantor, Henri Poincaré, Isaac Newton, John Conway, John von Neumann, kremlinology, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, Simon Singh, Wolfskehl Prize

It was during this period that Gödel developed the ideas that would devastate the foundations of mathematics. In 1931 Gödel published his book Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme (On Formally Undecidable Propositions in Principia Mathematica and Related Systems), which contained his so-called theorems of undecidability. When news of the theorems reached America the great mathematician John von Neumann immediately cancelled a lecture series he was giving on Hilbert’s programme and replaced the remainder of the course with a discussion of Gödel’s revolutionary work. Gödel had proved that trying to create a complete and consistent mathematical system was an impossible task. His ideas could be encapsulated in two statements. First theorem of undecidability If axiomatic set theory is consistent, there exist theorems which can neither be proved nor disproved.

…

The Second World War was to provide just what was required – the greatest leap in calculating power since the invention of the slide-rule. The Brute Force Approach When in 1940 G.H. Hardy declared that the best mathematics is largely useless, he was quick to add that this was not necessarily a bad thing: ‘Real mathematics has no effects on war. No one has yet discovered any warlike purpose to be served by the theory of numbers.’ Hardy was soon to be proved wrong. In 1944 John von Neumann co-wrote the book The Theory of Games and Economic Behavior, in which he coined the term game theory. Game theory was von Neumann’s attempt to use mathematics to describe the structure of games and how humans play them. He began by studying chess and poker, and then went on to try and model more sophisticated games such as economics. After the Second World War the RAND corporation realised the potential of von Neumann’s ideas and hired him to work on developing Cold War strategies.

**
Information: A Very Short Introduction
** by
Luciano Floridi

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

agricultural Revolution, Albert Einstein, bioinformatics, carbon footprint, Claude Shannon: information theory, conceptual framework, double helix, Douglas Engelbart, George Akerlof, Gordon Gekko, industrial robot, Internet of things, invention of writing, John Nash: game theory, John von Neumann, moral hazard, Nash equilibrium, Norbert Wiener, phenotype, prisoner's dilemma, RAND corporation, RFID, Turing machine

MTC deals with messages comprising uninterpreted symbols encoded in well-formed strings of signals. These are mere data that constitute, but are not yet, semantic information. So MTC is commonly described as a study of information at the syntactic level. And since computers are syntactical devices, this is why MTC can be applied so successfully in ICT. Entropy and randomness Information in Shannon's sense is also known as entropy. It seems we owe this confusing label to John von Neumann (1903-1957), one of the most brilliant scientists of the 20th century, who recommended it to Shannon: You should call it entropy for two reasons: first, the function is already in use in thermodynamics under the same name; second, and more importantly, most people don't know what entropy really is, and if you use the word entropy in an argument you will win every time. Von Neumann proved to be right on both accounts, unfortunately.

**
Smart Machines: IBM's Watson and the Era of Cognitive Computing (Columbia Business School Publishing)
** by
John E. Kelly Iii

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

AI winter, call centre, carbon footprint, crowdsourcing, demand response, discovery of DNA, Erik Brynjolfsson, future of work, Geoffrey West, Santa Fe Institute, global supply chain, Internet of things, John von Neumann, Mars Rover, natural language processing, optical character recognition, pattern recognition, planetary scale, RAND corporation, RFID, Richard Feynman, Richard Feynman, smart grid, smart meter, speech recognition, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!

They were essentially elaborate mechanical abacuses. People used them to organize data and make calculations that were helpful in everything from conducting a national population census to tracking the performance of a company’s sales force. The programmable computing era—today’s technologies—emerged in the 1940s. Programmable machines are still based on a design laid out by the Hungarian American mathematician John von Neumann. Electronic devices governed by software programs perform calculations, execute logical sequences of steps, and store information using millions of zeros and ones. Scientists built the first such computers for use in decrypting encoded messages in wartime. Successive generations of computing technology have enabled everything from space exploration to global manufacturing-supply chains to the Internet.

**
Fifty Challenging Problems in Probability With Solutions
** by
Frederick Mosteller

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Isaac Newton, John von Neumann, prisoner's dilemma, RAND corporation, stochastic process

As is not hard to see, this can occur only if y = x, in which case he can win y with a single bold gamble with the probability w given by (2). 37 The problem of an exact upper bound and optimum strategies for the gambler in Red-and-Black who wants to win an amount different from x is more difficult and will not be entered into here. 38. The Thick Coin How thick should a coin be to have a 1/3 chance of landing on edge? Solution for The Thick Coin On first hearing this question, the late great mathematician John von Neumann was unkind enough to solve it, including a 3-decimal answer, in his head in 20 seconds in the presence of some unfortunates who had labored much longer. This problem has no definite answer without some simplifying conditions. The elasticity of the coin, the intensity with which it is tossed, and the properties of the surface on which it lands combine to make the real-life question an empirical one.
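Mosteller's idealized solution inscribes the coin in a sphere and invokes Archimedes' theorem that the area of a spherical band is proportional to its height; the edge outcome then gets probability 1/3 when the central band has height 2R/3. The 3-decimal answer von Neumann produced follows directly:

```python
import math

# Inscribe the coin in a sphere of radius R. By Archimedes' theorem the area
# of a spherical band is proportional to its height, so the "edge" band gets
# 1/3 of the surface area when its height is 2R/3.
R = 1.0
thickness = 2 * R / 3
face_radius = math.sqrt(R**2 - (thickness / 2) ** 2)
ratio = thickness / (2 * face_radius)  # thickness as a fraction of diameter

print(round(ratio, 3))  # 0.354 -- the thickness is about 35% of the diameter
```

The exact value is 1/(2√2), roughly a third of the coin's diameter, far thicker than any real coin.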

**
The Art of Computer Programming
** by
Donald Ervin Knuth

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Brownian motion, complexity theory, correlation coefficient, Eratosthenes, Georg Cantor, information retrieval, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Menlo Park, NP-complete, P = NP, Paul Erdős, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, random walk, sorting algorithm, Turing machine, Y2K

CHAPTER THREE RANDOM NUMBERS Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. — JOHN VON NEUMANN (1951) Lest men suspect your tale untrue, Keep probability in view. — JOHN GAY (1727) There wanted not some beams of light to guide men in the exercise of their Stocastick faculty. — JOHN OWEN (1662) 3.1. INTRODUCTION Numbers that are "chosen at random" are useful in many different kinds of applications. For example: a) Simulation. When a computer is being used to simulate natural phenomena, random numbers are required to make things realistic.

…

George Marsaglia helped resuscitate random tables in 1995 by preparing a demonstration disk that contained 650 random megabytes, generated by combining the output of a noise-diode circuit with deterministically scrambled rap music. (He called it "white and black noise.") The inadequacy of mechanical methods in the early days led to an interest in the production of random numbers using a computer's ordinary arithmetic operations. John von Neumann first suggested this approach in about 1946; his idea was to take the square of the previous random number and to extract the middle digits. For example, if we are generating 10-digit numbers and the previous value was 5772156649, we square it to get 33317792380594909201; the next number is therefore 7923805949. There is a fairly obvious objection to this technique: How can a sequence generated in such a way be random, since each number is completely determined by its predecessor?
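Knuth's description of von Neumann's middle-square method translates directly into code (a minimal sketch; as the passage's closing objection suggests, it is a poor generator in practice):

```python
def middle_square(seed, digits=10):
    """One step of von Neumann's middle-square method: square the previous
    value and extract the middle `digits` digits."""
    squared = str(seed * seed).zfill(2 * digits)
    start = (len(squared) - digits) // 2
    return int(squared[start:start + digits])

# Knuth's worked example: 5772156649**2 = 33317792380594909201,
# whose middle ten digits are 7923805949.
print(middle_square(5772156649))  # 7923805949
```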

…

The discussion in the following section assumes the existence of a random sequence of uniformly distributed real numbers between zero and one. A new uniform deviate U is generated whenever we need it. These numbers are usually represented in a computer word with the radix point assumed at the left. 3.4.1. Numerical Distributions This section summarizes the best techniques known for producing numbers from various important distributions. Many of the methods were originally suggested by John von Neumann in the early 1950s, and they have gradually been improved upon by other people, notably George Marsaglia, J. H. Ahrens, and U. Dieter. A. Random choices from a finite set. The simplest and most common type of distribution required in practice is a random integer. An integer between 0 and 7 can be extracted from three bits of U on a binary computer; in such a case, these bits should be extracted from the most significant (left-hand) part of the computer word, since the least significant bits produced by many random number generators are not sufficiently random.
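The advice about preferring the most significant bits can be illustrated as follows; the 32-bit word size and the use of Python's random module as a stand-in source are assumptions for this sketch:

```python
import random

WORD_BITS = 32

def random_int_0_to_7(word):
    """Extract an integer in 0..7 from the three MOST significant bits of a
    random word, since the low-order bits of many generators are the least
    random."""
    return word >> (WORD_BITS - 3)

word = random.getrandbits(WORD_BITS)  # stand-in for a generator's raw output
print(random_int_0_to_7(word))
```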

**
From eternity to here: the quest for the ultimate theory of time
** by
Sean M. Carroll

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, cellular automata, Claude Shannon: information theory, Columbine, cosmic microwave background, cosmological constant, cosmological principle, dark matter, dematerialisation, double helix, en.wikipedia.org, gravity well, Harlow Shapley and Heber Curtis, Henri Poincaré, Isaac Newton, John von Neumann, Lao Tzu, lone genius, New Journalism, Norbert Wiener, pets.com, Richard Feynman, Richard Feynman, Richard Stallman, Schrödinger's Cat, Slavoj Žižek, Stephen Hawking, stochastic process, the scientific method, wikimedia commons

And that’s not the end of it; there are several other ways of thinking about entropy, and new ones are frequently being proposed in the literature. There’s nothing wrong with that; after all, Boltzmann and Gibbs were proposing definitions to supersede Clausius’s perfectly good definition of entropy, which is still used today under the rubric of “thermodynamic” entropy. After quantum mechanics came on the scene, John von Neumann proposed a formula for entropy that is specifically adapted to the quantum context. As we’ll discuss in the next chapter, Claude Shannon suggested a definition of entropy that was very similar in spirit to Gibbs’s, but in the framework of information theory rather than physics. The point is not to find the one true definition of entropy; it’s to come up with concepts that serve useful functions in the appropriate contexts.

…

But the question remains with us, only in a more specific form: Why did the universe have a low entropy near the Big Bang? 9 INFORMATION AND LIFE You should call it entropy, for two reasons. In the first place, your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage. —John von Neumann, to Claude Shannon144 In a celebrated episode in Swann’s Way, Marcel Proust’s narrator is feeling cold and somewhat depressed. His mother offers him tea, which he reluctantly accepts. He is then pulled into an involuntary recollection of his childhood by the taste of a traditional French teatime cake, the madeleine. And suddenly the memory appeared. That taste was the taste of the little piece of madeleine which on Sunday mornings at Combray . . . when I went to say good morning to her in her bedroom, my aunt Léonie would give me after dipping it in her infusion of tea or lime blossom . . .

…

We cannot, for example, exchange the roles of time and space. As a general rule, the more symmetries you have, the simpler things become. 110 This whole checkerboard-worlds idea sometimes goes by the name of cellular automata. A cellular automaton is just some discrete grid that follows a rule for determining the next row from the state of the previous row. They were first investigated in the 1940s, by John von Neumann, who is also the guy who figured out how entropy works in quantum mechanics. Cellular automata are fascinating for many reasons having little to do with the arrow of time; they can exhibit great complexity and can function as universal computers. See Poundstone (1984) or Shalizi (2009). Not only are we disrespecting cellular automata by pulling them out only to illustrate a few simple features of time reversal and information conservation, but we are also not speaking the usual language of cellular-automaton cognoscenti.
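The "next row from the previous row" rule is easy to see in an elementary one-dimensional automaton. This is a minimal sketch using Wolfram's Rule 110, a far simpler system than von Neumann's original 29-state self-reproducing automaton:

```python
def step(row, rule=110):
    """Next row of an elementary cellular automaton: each cell's new state is
    the rule-table bit for its (left, self, right) neighborhood, wrapping at
    the edges."""
    n = len(row)
    return [
        (rule >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Evolve a single live cell for a few generations.
row = [0] * 7 + [1] + [0] * 7
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Despite the trivial update rule, Rule 110 is one of the automata known to be computationally universal.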

**
Debunking Economics - Revised, Expanded and Integrated Edition: The Naked Emperor Dethroned?
** by
Steve Keen

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

accounting loophole / creative accounting, banking crisis, banks create money, barriers to entry, Benoit Mandelbrot, Big bang: deregulation of the City of London, Black Swan, Bonfire of the Vanities, butterfly effect, capital asset pricing model, cellular automata, central bank independence, citizen journalism, clockwork universe, collective bargaining, complexity theory, correlation coefficient, credit crunch, David Ricardo: comparative advantage, debt deflation, diversification, double entry bookkeeping, en.wikipedia.org, Eugene Fama: efficient market hypothesis, experimental subject, Financial Instability Hypothesis, Fractional reserve banking, full employment, Henri Poincaré, housing crisis, Hyman Minsky, income inequality, invisible hand, iterative process, John von Neumann, laissez-faire capitalism, liquidity trap, Long Term Capital Management, mandelbrot fractal, margin call, market bubble, market clearing, market microstructure, means of production, minimum wage unemployment, open economy, place-making, Ponzi scheme, profit maximization, quantitative easing, RAND corporation, random walk, risk tolerance, risk/return, Robert Shiller, Robert Shiller, Ronald Coase, Schrödinger's Cat, scientific mainstream, seigniorage, six sigma, South Sea Bubble, stochastic process, The Great Moderation, The Wealth of Nations by Adam Smith, Thorstein Veblen, time value of money, total factor productivity, tulip mania, wage slave

The concept of expected value is thus not a good arbiter for rational behavior in the way it is normally presented in Behavioral Economics and Finance experiments – why, then, is it used? If you’ve read this far into this book, you won’t be surprised to learn that it’s because economists have misread the foundation research on this topic by the mathematician John von Neumann, and his economist collaborator Oskar Morgenstern, The Theory of Games and Economic Behavior (Von Neumann and Morgenstern 1953). Misunderstanding von Neumann John von Neumann was one of the greatest intellects of all time, a child prodigy who went on to make numerous pivotal contributions to a vast range of fields in mathematics, physics, and computer science. He was a polymath at a time when it was far more difficult to make contributions across a range of fields than it had been in earlier centuries.

…

These will remain to us as outstanding and disturbing forces; they must be treated, if at all, by other appropriate branches of knowledge’ (Jevons 1866). However, subsequent economists have applied this theory to all behavior, including interpersonal relations. 4 Cardinal refers to the ability to attach a precise quantity, whereas ordinal refers to the ability to rank things in size order, without necessarily being able to ascribe a numeric value to each. 5 As I point out later, the mathematician John von Neumann developed a way that a cardinal measure of utility could be derived, but this was ignored by neoclassical economists (Von Neumann and Morgenstern 1953: 17–29). 6 At its base (where, using my ‘bananas and biscuits’ example, zero bananas and zero biscuits were consumed), its height was zero. Then as you walked in the bananas direction only (eating bananas but no biscuits), the mountain rose, but at an ever-diminishing rate – it was its steepest at its base, because the very first units consumed gave the greatest ‘marginal utility.’

**
Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100
** by
Michio Kaku

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

agricultural Revolution, AI winter, Albert Einstein, augmented reality, Bill Joy: nanobots, bioinformatics, blue-collar work, British Empire, Brownian motion, cloud computing, Colonization of Mars, DARPA: Urban Challenge, delayed gratification, double helix, Douglas Hofstadter, en.wikipedia.org, friendly AI, Gödel, Escher, Bach, hydrogen economy, I think there is a world market for maybe five computers, industrial robot, invention of movable type, invention of the telescope, Isaac Newton, John von Neumann, life extension, Louis Pasteur, Mahatma Gandhi, Mars Rover, megacity, Murray Gell-Mann, new economy, oil shale / tar sands, optical character recognition, pattern recognition, planetary scale, postindustrial economy, Ray Kurzweil, refrigerator car, Richard Feynman, Richard Feynman, Rodney Brooks, Ronald Reagan, Search for Extraterrestrial Intelligence, Silicon Valley, Simon Singh, speech recognition, stem cell, Stephen Hawking, Steve Jobs, telepresence, The Wealth of Nations by Adam Smith, Thomas L Friedman, Thomas Malthus, trade route, Turing machine, uranium enrichment, Vernor Vinge, Wall-E, Walter Mischel, Whole Earth Review, X Prize

The word originally came from the world of relativistic physics, my personal specialty, where a singularity represents a point of infinite gravity, from which nothing can escape, such as a black hole. Because light itself cannot escape, it is a horizon beyond which we cannot see. The idea of an AI singularity was first mentioned in 1958, in a conversation between two mathematicians, Stanislaw Ulam (who made the key breakthrough in the design of the hydrogen bomb) and John von Neumann. Ulam wrote, “One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the human race beyond which human affairs, as we know them, could not continue.” Versions of the idea have been kicking around for decades. But it was then amplified and popularized by science fiction writer and mathematician Vernor Vinge in his novels and essays.

…

Even if computers begin to match the computing speed of the brain, they will still lack the necessary software and programming to make everything work. Matching the computing speed of the brain is just the humble beginning. Third, even if intelligent robots are possible, it is not clear if a robot can make a copy of itself that is smarter than the original. The mathematics behind self-replicating robots was first developed by the mathematician John von Neumann, who invented game theory and helped to develop the electronic computer. He pioneered the question of determining the minimum number of assumptions before a machine could create a copy of itself. However, he never addressed the question of whether a robot can make a copy of itself that is smarter than it. In fact, the very definition of “smart” is problematic, since there is no universally accepted definition of “smart.”

**
The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom?
** by
David Brin

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

affirmative action, airport security, Ayatollah Khomeini, clean water, cognitive dissonance, corporate governance, data acquisition, death of newspapers, Extropian, Howard Rheingold, illegal immigration, informal economy, Iridium satellite, Jaron Lanier, John von Neumann, Kevin Kelly, means of production, mutually assured destruction, offshore financial centre, open economy, packet switching, pattern recognition, pirate software, placebo effect, Plutocrats, plutocrats, prediction markets, Ralph Nader, RAND corporation, Saturday Night Live, Search for Extraterrestrial Intelligence, Steve Jobs, Steven Levy, Stewart Brand, telepresence, trade route, Vannevar Bush, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, Yogi Berra, Zimmermann PGP

In those days, long-distance call routing was a laborious task of negotiation, planned well in advance by human operators arranging connections from one zone to the next. But this drudgery might be avoided in a dispersed computer network if the messages themselves could navigate, finding their own way from node to node, carrying destination information in their lead bits like the address on the front of an envelope. Early theoretical work by Alan Turing and John Von Neumann hinted this to be possible by allowing each part of a network to guess the best way to route a message past any damaged area and eventually reach its goal. In theory, such a system might keep operating even when others lay in tatters. In retrospect, the advantages of Baranʼs insight seem obvious. Still, it remains a wonder that the Pentagon actually went ahead with experiments in decentralized, autonomous message processing.
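The "route around damage" idea can be sketched with a breadth-first search over a toy network; the graph shape and node names below are invented purely for illustration:

```python
from collections import deque

def route(network, src, dst, failed=frozenset()):
    """Breadth-first search for any path from src to dst that avoids failed
    nodes, mimicking a message finding its own way past damaged areas."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in network[path[-1]]:
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

net = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(route(net, "A", "D"))               # a path via B or C
print(route(net, "A", "D", failed={"B"})) # reroutes around the damaged node
```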

…

Some may object in principle that we should not be coddled by bureaucrats, or that the free market ( caveat emptor) could protect us better still. But contentious issues of paternalism will be dealt with later. (See the section on “public feedback regulation” in chapter 8.) Right now we should focus on the evolving way in which researchers have come to view the concept of risk and how people respond to it. Until recently, most models were based on classical decision theory, supplemented by the later game theory that John Von Neumann developed after World War II. These are essentially mathematical approaches to betting—calculating odds for success or failure when contributing factors are either well known, or partly unknown. For instance, a problem called “the prisoners’ dilemma” explores how two parties might behave when each can make a quick, temporary score by betraying the other, or else both might prosper, moderately but indefinitely, by deciding to cooperate.
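The dilemma's payoff structure can be written down in a few lines; the specific payoff numbers below are conventional textbook values, assumed for illustration rather than taken from the text:

```python
# Hypothetical payoff matrix for the prisoner's dilemma (higher is better).
# Each entry is (row player's payoff, column player's payoff).
COOPERATE, DEFECT = 0, 1
PAYOFFS = [
    [(3, 3), (0, 5)],  # row cooperates
    [(5, 0), (1, 1)],  # row defects
]

def best_response(opponent_move):
    """The move maximizing the row player's payoff against a fixed opponent."""
    return max((COOPERATE, DEFECT), key=lambda m: PAYOFFS[m][opponent_move][0])

# Defection is the best response to either move, so mutual defection is the
# equilibrium, even though mutual cooperation pays both players more.
assert best_response(COOPERATE) == DEFECT
assert best_response(DEFECT) == DEFECT
```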

**
What Technology Wants
** by
Kevin Kelly

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Alfred Russel Wallace, Buckminster Fuller, c2.com, carbon-based life, Cass Sunstein, charter city, Clayton Christensen, cloud computing, computer vision, Danny Hillis, dematerialisation, demographic transition, double entry bookkeeping, en.wikipedia.org, Exxon Valdez, George Gilder, gravity well, hive mind, Howard Rheingold, interchangeable parts, invention of air conditioning, invention of writing, Isaac Newton, Jaron Lanier, John Conway, John von Neumann, Kevin Kelly, knowledge economy, Lao Tzu, life extension, Louis Daguerre, Marshall McLuhan, megacity, meta analysis, meta-analysis, new economy, out of africa, performance metric, personalized medicine, phenotype, Picturephone, planetary scale, RAND corporation, random walk, Ray Kurzweil, recommendation engine, refrigerator car, Richard Florida, Silicon Valley, silicon-based life, Skype, speech recognition, Stephen Hawking, Steve Jobs, Stewart Brand, Ted Kaczynski, the built environment, the scientific method, Thomas Malthus, Vernor Vinge, Whole Earth Catalog, Y2K

When two world wars unleashed the full killing power of this inventiveness, it cemented the reputation of technology as a beguiling satan. As we refined this stuff through generations of technological evolution, it lost much of its hardness. We began to see through technology’s disguise as material and began to see it primarily as action. While it inhabited a body, its heart was something softer. In 1949, John von Neumann, the brainy genius behind the first useful computer, realized what computers were teaching us about technology: “Technology will in the near and in the farther future increasingly turn from problems of intensity, substance, and energy, to problems of structure, organization, information, and control.” No longer a noun, technology was becoming a force—a vital spirit that throws us forward or pushes against us.

…

Long gone is the era when 10 years could elapse between the public announcement of an invention or discovery and the date the last researcher would hear about it. Synchronicity is not just a phenomenon of the past, when communication was poor, but very much part of the present. Scientists at AT&T Bell Labs won a Nobel Prize for inventing the transistor in 1948, but two German physicists independently invented a transistor two months later at a Westinghouse laboratory in Paris. Popular accounts credit John von Neumann with the invention of a programmable binary computer during the last years of World War II, but the idea and a working punched-tape prototype were developed quite separately in Germany a few years earlier, in 1941, by Konrad Zuse. In a verifiable case of modern parallelism, Zuse’s pioneering binary computer went completely unnoticed in the United States and the UK until many decades later.

**
Accelerando
** by
Stross, Charles

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

call centre, carbon-based life, cellular automata, cognitive dissonance, Conway's Game of Life, dark matter, dumpster diving, Extropian, finite state, Flynn Effect, glass ceiling, gravity well, John von Neumann, knapsack problem, Kuiper Belt, Magellanic Cloud, mandelbrot fractal, market bubble, means of production, packet switching, performance metric, phenotype, planetary scale, Pluto: dwarf planet, reversible computing, Richard Stallman, SETI@home, Silicon Valley, Singularitarianism, slashdot, South China Sea, stem cell, technological singularity, telepresence, The Chicago School, theory of mind, Turing complete, Turing machine, Turing test, upwardly mobile, Vernor Vinge, Von Neumann architecture, web of trust, Y2K

Manfred asks, pointing at the renderer, which is whining to itself and slowly sintering together something that resembles a carriage clockmaker's fever dream of a spring-powered hard disk drive. "Oh, one of Johnny's toys – a micromechanical digital phonograph player," Gianni says dismissively. "He used to design Babbage engines for the Pentagon – stealth computers. (No van Eck radiation, you know.) Look." He carefully pulls a fabric-bound document out of the obsolescent data wall and shows the spine to Manfred: "On the Theory of Games, by John von Neumann. Signed first edition." Aineko meeps and dumps a slew of confusing purple finite state automata into Manfred's left eye. The hardback is dusty and dry beneath his fingertips as he remembers to turn the pages gently. "This copy belonged to the personal library of Oleg Kordiovsky. A lucky man is Oleg: He bought it in 1952, while on a visit to New York, and the MVD let him keep it." "He must be –" Manfred pauses.

…

Another deeply buried thread starts up, and Aineko analyses the package from a perspective no human being has yet established. Presently a braid of processes running on an abstract virtual machine asks him a question that cannot be encoded in any human grammar. Watch and wait, he replies to his passenger. They'll figure out what we are sooner or later.

Part 2: Point of Inflexion

Life is a process which may be abstracted from other media. – John von Neumann

Chapter 1: Halo

The asteroid is running Barney: it sings of love on the high frontier, of the passion of matter for replicators, and its friendship for the needy billions of the Pacific Rim. "I love you," it croons in Amber's ears as she seeks a precise fix on it: "Let me give you a big hug … " A fraction of a light-second away, Amber locks a cluster of cursors together on the signal, trains them to track its Doppler shift, and reads off the orbital elements.

**
The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World
** by
Pedro Domingos

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, Albert Einstein, Amazon Mechanical Turk, Arthur Eddington, Benoit Mandelbrot, bioinformatics, Black Swan, Brownian motion, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer vision, constrained optimization, correlation does not imply causation, crowdsourcing, Danny Hillis, data is the new oil, double helix, Douglas Hofstadter, Erik Brynjolfsson, experimental subject, Filter Bubble, future of work, global village, Google Glasses, Gödel, Escher, Bach, information retrieval, job automation, John Snow's cholera map, John von Neumann, Joseph Schumpeter, Kevin Kelly, lone genius, mandelbrot fractal, Mark Zuckerberg, Moneyball by Michael Lewis explains big data, Narrative Science, Nate Silver, natural language processing, Netflix Prize, Network effects, NP-complete, P = NP, PageRank, pattern recognition, phenotype, planetary scale, pre–internet, random walk, Ray Kurzweil, recommendation engine, Richard Feynman, Second Machine Age, self-driving car, Silicon Valley, speech recognition, statistical model, Stephen Hawking, Steven Levy, Steven Pinker, superintelligent machines, the scientific method, The Signal and the Noise by Nate Silver, theory of mind, transaction costs, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, white flight

The Bible Code, a 1998 bestseller, claimed that the Bible contains predictions of future events that you can find by skipping letters at regular intervals and assembling words from the letters you land on. Unfortunately, there are so many ways to do this that you’re guaranteed to find “predictions” in any sufficiently long text. Skeptics replied by finding them in Moby Dick and Supreme Court rulings, along with mentions of Roswell and UFOs in Genesis. John von Neumann, one of the founding fathers of computer science, famously said that “with four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” Today we routinely learn models with millions of parameters, enough to give each elephant in the world his own distinctive wiggle. It’s even been said that data mining means “torturing the data until it confesses.” Overfitting is seriously exacerbated by noise.
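Von Neumann's elephant quip is easy to demonstrate numerically: give a model as many parameters as data points and it will reproduce the noise perfectly, which is precisely the overfitting Domingos warns about. A minimal NumPy sketch, with the data and polynomial degrees invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 8)
y = 2.0 * x + rng.normal(scale=0.3, size=x.size)  # a straight line plus noise

# Degree-7 polynomial: 8 parameters for 8 points -> interpolates the noise exactly.
overfit = np.polynomial.Polynomial.fit(x, y, deg=7)
# Degree-1 polynomial: 2 parameters -> recovers only the underlying trend.
line = np.polynomial.Polynomial.fit(x, y, deg=1)

train_err_overfit = np.max(np.abs(overfit(x) - y))
train_err_line = np.max(np.abs(line(x) - y))

assert train_err_overfit < 1e-6          # "perfect" fit, noise and all
assert train_err_line > train_err_overfit  # the honest model leaves residuals
```

The many-parameter model's zero training error is a symptom, not a virtue: it has memorized the noise, so it will generalize worse than the simple line.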

…

Prudently, he picked a more conservative topic for his dissertation—Boolean circuits with cycles—and in 1959 he earned the world’s first PhD in computer science. His PhD advisor, Arthur Burks, nevertheless encouraged Holland’s interest in evolutionary computation and was instrumental in getting him a faculty job at Michigan and shielding him from senior colleagues who didn’t think that stuff was computer science. Burks himself was so open-minded because he had been a close collaborator of John von Neumann, who had proved the possibility of self-reproducing machines. Indeed, it had fallen to Burks to complete the work when von Neumann died of cancer in 1957. That von Neumann could prove that such machines are possible was quite remarkable, given the primitive state of genetics and computer science at the time. But his automaton just made exact copies of itself; evolving automata had to wait for Holland.

**
Why Stock Markets Crash: Critical Events in Complex Financial Systems
** by
Didier Sornette

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Asian financial crisis, asset allocation, Berlin Wall, Bretton Woods, Brownian motion, capital asset pricing model, capital controls, continuous double auction, currency peg, Deng Xiaoping, discrete time, diversified portfolio, Elliott wave, Erdős number, experimental economics, financial innovation, floating exchange rates, frictionless, frictionless market, full employment, global village, implied volatility, index fund, invisible hand, John von Neumann, joint-stock company, law of one price, Louis Bachelier, mandelbrot fractal, margin call, market bubble, market clearing, market design, market fundamentalism, mental accounting, moral hazard, Network effects, new economy, oil shock, open economy, pattern recognition, Paul Erdős, quantitative trading / quantitative finance, random walk, risk/return, Ronald Reagan, Schrödinger's Cat, short selling, Silicon Valley, South Sea Bubble, statistical model, stochastic process, Tacoma Narrows Bridge, technological singularity, The Coming Technological Singularity, The Wealth of Nations by Adam Smith, Tobin tax, total factor productivity, transaction costs, tulip mania, VA Linux, Y2K, yield curve

This “general equilibrium” proof, which relies on a set of very restricted assumptions of an idealized world, has been a mainstay of graduate-level economics training ever since. The most important tool in this analysis was game theory: the study of situations, like poker or chess games, in which players have to make their decisions based on guesses about what the other player is going to do next. Game theory was first adapted to economics in the 1940s by mathematician John von Neumann (the same von Neumann whose theoretical insights made the computer possible) and economist O. Morgenstern. Since then, the standard economics and social science model of a human agent is that it is like a general-purpose logic machine. All decision tasks, regardless of context, constitute optimization problems subject to external constraints whether from the physical environment or from the reaction functions of other agents.

…

Gambling with the house money and trying to break even: The effects of prior outcomes on risky choice, Management Science 36, 643–660. 426. Toner, J. and Tu, Y. H. (1998). Flocks, herds, and schools: A quantitative theory of flocking, Physical Review E 58, 4828–4858. 427. Trueman, B. (1994). Analyst forecasts and herding behavior, The Review of Financial Studies 7, 97–124. 428. Ulam, S. (1959). Tribute to John von Neumann, Bulletin of the American Mathematical Society 64, 1–49. 429. U.S. Committee of the Global Atmospheric Research Program (1975). Understanding Climatic Change—A Program for Action (National Research Council, National Academy of Sciences, Washington, D.C.). 430. U.S. Postage Release No. 99-045, May 21, 1999. 431. Van Norden, S. (1996). Regime switching as a test for exchange rate bubbles, Journal of Applied Econometrics 11, 219–251. 432.

**
Misbehaving: The Making of Behavioral Economics
** by
Richard H. Thaler

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Amazon Mechanical Turk, Andrei Shleifer, Apple's 1984 Super Bowl advert, Atul Gawande, Berlin Wall, Bernie Madoff, Black-Scholes formula, capital asset pricing model, Cass Sunstein, Checklist Manifesto, choice architecture, clean water, cognitive dissonance, conceptual framework, constrained optimization, Daniel Kahneman / Amos Tversky, delayed gratification, diversification, diversified portfolio, Edward Glaeser, endowment effect, equity premium, Eugene Fama: efficient market hypothesis, experimental economics, Fall of the Berlin Wall, George Akerlof, hindsight bias, Home mortgage interest deduction, impulse control, index fund, invisible hand, Jean Tirole, John Nash: game theory, John von Neumann, late fees, law of one price, libertarian paternalism, Long Term Capital Management, loss aversion, market clearing, Mason jar, mental accounting, meta analysis, meta-analysis, More Guns, Less Crime, mortgage debt, Nash equilibrium, Nate Silver, New Journalism, nudge unit, payday loans, Ponzi scheme, presumed consent, pre–internet, principal–agent problem, prisoner's dilemma, profit maximization, random walk, randomized controlled trial, Richard Thaler, Robert Shiller, Ronald Coase, Silicon Valley, South Sea Bubble, statistical model, Steve Jobs, technology bubble, The Chicago School, The Myth of the Rational Market, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, transaction costs, ultimatum game, Walter Mischel

This implies that if your wealth is $100,000 and I offer you a choice between an additional $1,000 for sure or a 50% chance to win $2,000, you will take the sure thing because you value the second thousand you would win less than the first thousand, so you are not willing to risk losing that first $1,000 prize in an attempt to get $2,000. The full treatment of the formal theory of how to make decisions in risky situations—called expected utility theory—was published in 1944 by the mathematician John von Neumann and the economist Oskar Morgenstern. John von Neumann, one of the greatest mathematicians of the twentieth century, was a contemporary of Albert Einstein at the Institute for Advanced Study in Princeton, and during World War II he decided to devote himself to practical problems. The result was the 600-plus-page opus The Theory of Games and Economic Behavior, in which the development of expected utility theory was just a sideline.
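The choice Thaler describes follows directly from expected utility with any diminishing-returns utility function. A small sketch, assuming log utility purely for illustration (von Neumann and Morgenstern's framework permits any such concave curve):

```python
import math

wealth = 100_000

def u(w):
    """Log utility: each extra dollar is worth a little less (assumed curve)."""
    return math.log(w)

# Option A: an additional $1,000 for sure.
sure_thing = u(wealth + 1_000)
# Option B: a 50% chance to win $2,000, 50% chance of nothing.
gamble = 0.5 * u(wealth + 2_000) + 0.5 * u(wealth)

# Both options have identical expected *money* ($101,000)...
assert abs((0.5 * (wealth + 2_000) + 0.5 * wealth) - (wealth + 1_000)) < 1e-9
# ...but the concave utility makes the sure thing preferable.
assert sure_thing > gamble
```

Because the second thousand adds less utility than the first, the certain gain wins even though the gamble's expected dollar value is the same.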

**
From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry
** by
Martin Campbell-Kelly

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Apple II, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, business process, card file, computer age, computer vision, continuous integration, deskilling, Grace Hopper, inventory management, John von Neumann, linear programming, Menlo Park, Network effects, popular electronics, RAND corporation, Robert X Cringely, Ronald Reagan, Silicon Valley, software patent, Steve Jobs, Steve Wozniak, Steven Levy, Thomas Kuhn: the structure of scientific revolutions

From Airline Reservations to Sonic the Hedgehog

History of Computing
I. Bernard Cohen and William Aspray, editors

William Aspray, John von Neumann and the Origins of Modern Computing
Charles J. Bashe, Lyle R. Johnson, John H. Palmer, and Emerson W. Pugh, IBM’s Early Computers
Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry
Paul E. Ceruzzi, A History of Modern Computing
I. Bernard Cohen, Howard Aiken: Portrait of a Computer Pioneer
I. Bernard Cohen and Gregory W. Welch, editors, Makin’ Numbers: Howard Aiken and the Computer
John Hendry, Innovating for Failure: Government Policy and the Early British Computer Industry
Michael Lindgren, Glory and Failure: The Difference Engines of Johann Müller, Charles Babbage, and Georg and Edvard Scheutz
David E.

…

Hurd assigned to the job the Technical Computing Bureau, headed by John Sheldon.7 (Sheldon was soon to be a cofounder of the Computer Usage Company, probably the world’s first software contracting firm, and a few years later Hurd would become its chairman.) The Technical Computing Bureau was allocated the first production 701, which was located in IBM’s world headquarters on Madison Avenue in New York. The machine was inaugurated in April 1953 with enormous publicity to herald IBM’s arrival on the computer scene. Among the 200 guests were J. Robert Oppenheimer (the former director of the Manhattan Project), John von Neumann (one of the inventors of the stored-program computer), and William Shockley (coinventor of the transistor). The Technical Computing Bureau’s machine was in place about 6 months before the first deliveries of 701s to customers, so customers’ programmers were able to become familiar with the 701 before their machines arrived. Most programmers in 701 installations took a basic course in programming organized by IBM (a course from which hundreds eventually graduated).

**
Infinite Ascent: A Short History of Mathematics
** by
David Berlinski

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, Benoit Mandelbrot, Douglas Hofstadter, Eratosthenes, four colour theorem, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Isaac Newton, John von Neumann, Murray Gell-Mann, Stephen Hawking, Turing machine, William of Occam

It follows that neither the formula designated by [R(q), q] nor its negation is provable. We ourselves may allow Pedro or Fedro to suffer a cut all his own, restoring to prominence in Kurt Gödel the twenty-three-year-old director of record. The unpurged images of this spectacular argument recede; so, too, the details of Gödel’s first theorem. Directly, the second theorem appears, this one dealing directly with the issue of consistency. It is a theorem that John von Neumann noticed after Gödel had communicated his first theorem to various mathematicians; but when he wrote eagerly to Gödel to convey his discovery, he learned that Gödel had already discovered the same thing. The import of Gödel’s second theorem can be conveyed by means of only a few strokes. The first incompleteness theorem affirms that if the system of the Principia is consistent, then there exists an undecidable proposition, one that may be expressed from within the cage of its symbols.

**
GDP: A Brief but Affectionate History
** by
Diane Coyle

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Asian financial crisis, Berlin Wall, big-box store, Bretton Woods, BRICs, clean water, computer age, conceptual framework, crowdsourcing, Diane Coyle, double entry bookkeeping, en.wikipedia.org, Erik Brynjolfsson, Fall of the Berlin Wall, falling living standards, financial intermediation, global supply chain, happiness index / gross national happiness, income inequality, income per capita, informal economy, John von Neumann, Kevin Kelly, Long Term Capital Management, mutually assured destruction, Nathan Meyer Rothschild: antibiotics, new economy, Occupy movement, purchasing power parity, Robert Shiller, Ronald Reagan, shareholder value, Silicon Valley, Simon Kuznets, The Wealth of Nations by Adam Smith, Thorstein Veblen, University of East Anglia, working-age population

This was, of course, the computer and Internet revolution. This provides a good example of the kind of time lags Paul David described. The electronic programmable computer was one of the basic innovations of World War II. It emerged from the wartime code-breaking work at Bletchley Park in the United Kingdom and the brilliant conceptual leaps made by Alan Turing, and, across the Atlantic during and after the war, from the work of John von Neumann and others involved in the development of nuclear weapons. Computers began as military and academic machines, then came into use in big businesses, and in the 1980s finally became small and cheap enough to spread to all offices and gradually individual homes. Separately, the communications protocols between computers were developed in the United States in the 1970s, by DARPA (the Defense Advanced Research Projects Agency) among other groups.

**
Information Doesn't Want to Be Free: Laws for the Internet Age
** by
Cory Doctorow,
Amanda Palmer,
Neil Gaiman

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Airbnb, barriers to entry, Brewster Kahle, cloud computing, Dean Kamen, Edward Snowden, game design, Internet Archive, John von Neumann, Kickstarter, optical character recognition, Plutocrats, plutocrats, pre–internet, profit maximization, recommendation engine, rent-seeking, Saturday Night Live, Skype, Steve Jobs, Steve Wozniak, Stewart Brand, transfer pricing, Whole Earth Catalog, winner-take-all economy

They figured out how to do it with ease. 1.5 Understanding General-Purpose Computers BACK AT THE dawn of mechanical computation, computers were “special-purpose.” One computer would solve one kind of mathematical problem, and if you had a different problem, you’d build a different computer. But during World War II, thanks to the government-funded advancements made by such scientific luminaries as Alan Turing and John von Neumann, a new kind of computer came into existence: the “general-purpose” digital computer. These machines arose from a theory of general-purpose computation that showed that, with a simple set of “logic gates” and enough memory and time, you could “compute” any program that could be represented symbolically. That is, general-purpose computers are machines capable of running every valid program we can write.
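The claim that a "simple set of logic gates" suffices for general-purpose computation can be shown in miniature: every Boolean function can be composed from a single gate type. A sketch (not from the book) building familiar gates out of NAND, with an exhaustive check over all inputs:

```python
# NAND is functionally complete: AND, OR, NOT, XOR all reduce to it.

def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), nand(a, b))

# Exhaustively verify each composed gate against its truth table.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
```

Stack enough such gates with memory and sequencing, and you get the "compute any symbolically representable program" result the passage describes.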

**
Surviving AI: The Promise and Peril of Artificial Intelligence
** by
Calum Chace

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, Ada Lovelace, AI winter, Airbnb, artificial general intelligence, augmented reality, barriers to entry, bitcoin, blockchain, brain emulation, Buckminster Fuller, cloud computing, computer age, computer vision, correlation does not imply causation, credit crunch, cryptocurrency, cuban missile crisis, dematerialisation, discovery of the americas, disintermediation, don't be evil, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, everywhere but in the productivity statistics, Flash crash, friendly AI, Google Glasses, industrial robot, Internet of things, invention of agriculture, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, life extension, low skilled workers, Mahatma Gandhi, means of production, mutually assured destruction, Nicholas Carr, pattern recognition, Peter Thiel, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley ideology, Skype, South Sea Bubble, speech recognition, Stanislav Petrov, Stephen Hawking, Steve Jobs, strong AI, technological singularity, theory of mind, Turing machine, Turing test, universal basic income, Vernor Vinge, wage slave, Wall-E

But the first general-purpose computer to be completed was ENIAC (Electronic Numerical Integrator And Computer), built at the Moore School of Electrical Engineering in Philadelphia, and unveiled in 1946. Like so many technological advances, it was funded by the military, and one of its first assignments was a feasibility study of the hydrogen bomb. While working on ENIAC’s successor, EDVAC (Electronic Discrete Variable Automatic Computer), the brilliant mathematician and polymath John von Neumann wrote a paper describing an architecture for computers which remains the basis for today’s machines. The Dartmouth Conference The arrival of computers combined with a series of ideas about thinking by Turing and others led to “the conjecture that every . . . feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.” This was the claim of the organisers of a month-long conference at Dartmouth College in New Hampshire in the summer of 1956, which quickly came to be seen as the founding event of the science of artificial intelligence.

**
The Internet of Us: Knowing More and Understanding Less in the Age of Big Data
** by
Michael P. Lynch

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Affordable Care Act / Obamacare, Amazon Mechanical Turk, big data - Walmart - Pop Tarts, bitcoin, Cass Sunstein, Claude Shannon: information theory, crowdsourcing, Edward Snowden, Firefox, Google Glasses, hive mind, income inequality, Internet of things, John von Neumann, meta analysis, meta-analysis, Nate Silver, new economy, patient HM, prediction markets, RFID, sharing economy, Steve Jobs, Steven Levy, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, WikiLeaks

And our cognitive processes are increasingly entangled with those of other people. This raises an obvious question. Is it possible that the smartest guy in the room is the room? That is, can networks themselves know? There are a few different ways to approach this question. One way has to do with what those in the AI (artificial intelligence) biz call “the singularity”—a term usually credited to the mathematician John von Neumann. The basic idea is that at some point machines—particularly computer networks—will become intelligent enough to become self-aware, and powerful enough to take control. The possibility of the singularity raises a host of interesting philosophical questions, but I want to focus on one issue that is already with us. As we’ve discussed, there are reasons to think that we digital humans are, in a very real sense, components of a network already.

**
The Secret War Between Downloading and Uploading: Tales of the Computer as Culture Machine
** by
Peter Lunenfeld

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Andrew Keen, Apple II, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, Burning Man, butterfly effect, computer age, crowdsourcing, cuban missile crisis, Dissolution of the Soviet Union, don't be evil, Douglas Engelbart, Dynabook, East Village, Edward Lorenz: Chaos theory, Fall of the Berlin Wall, Francis Fukuyama: the end of history, Frank Gehry, Grace Hopper, gravity well, Guggenheim Bilbao, Honoré de Balzac, Howard Rheingold, invention of movable type, Isaac Newton, Jacquard loom, Jane Jacobs, Jeff Bezos, John von Neumann, Mark Zuckerberg, Marshall McLuhan, Mercator projection, Mother of all demos, mutually assured destruction, Network effects, new economy, Norbert Wiener, PageRank, pattern recognition, planetary scale, Plutocrats, plutocrats, Post-materialism, post-materialism, Potemkin village, RFID, Richard Feynman, Richard Stallman, Robert X Cringely, Schrödinger's Cat, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, Skype, social software, spaced repetition, Steve Ballmer, Steve Jobs, Steve Wozniak, Ted Nelson, the built environment, The Death and Life of Great American Cities, the medium is the message, Thomas L Friedman, Turing machine, Turing test, urban planning, urban renewal, Vannevar Bush, walkable city, Watson beat the top human players on Jeopardy!, William Shockley: the traitorous eight

—Vannevar Bush

People tend to overestimate what can be done in one year and underestimate what can be done in five to ten years. —J.C.R. Licklider

GENERATIONS

There are many mathematicians, early computer scientists, and engineers who deserve to be considered part of the first generation of pioneering Patriarchs. They include Alan Turing, already discussed in chapter 2; mathematician and quantum theorist John von Neumann; cyberneticist Norbert Wiener; information theorist Claude Shannon; and computer architects like the German Konrad Zuse, and Americans J. Presper Eckert and John Mauchly, who developed ENIAC, the room-sized machine at the University of Pennsylvania that we recognize as the first general-purpose electronic computer. These were the Patriarchs who set the parameters for computer science, laying out the issues for software development, building the original architectures for hardware, and creating the cultures of computer science and engineering.

**
You Are Not a Gadget
** by
Jaron Lanier

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

1960s counterculture, accounting loophole / creative accounting, additive manufacturing, Albert Einstein, call centre, cloud computing, crowdsourcing, death of newspapers, digital Maoism, Douglas Hofstadter, Extropian, follow your passion, hive mind, Internet Archive, Jaron Lanier, jimmy wales, John Conway, John von Neumann, Kevin Kelly, Long Term Capital Management, Network effects, new economy, packet switching, PageRank, pattern recognition, Ponzi scheme, Ray Kurzweil, Richard Stallman, Silicon Valley, Silicon Valley startup, slashdot, social graph, stem cell, Steve Jobs, Stewart Brand, Ted Nelson, telemarketer, telepresence, The Wisdom of Crowds, trickle-down economics, Turing test, Vernor Vinge, Whole Earth Catalog

CHAPTER 2 An Apocalypse of Self-Abdication THE IDEAS THAT I hope will not be locked in rest on a philosophical foundation that I sometimes call cybernetic totalism. It applies metaphors from certain strains of computer science to people and the rest of reality. Pragmatic objections to this philosophy are presented. What Do You Do When the Techies Are Crazier Than the Luddites? The Singularity is an apocalyptic idea originally proposed by John von Neumann, one of the inventors of digital computation, and elucidated by figures such as Vernor Vinge and Ray Kurzweil. There are many versions of the fantasy of the Singularity. Here’s the one Marvin Minsky used to tell over the dinner table in the early 1980s: One day soon, maybe twenty or thirty years into the twenty-first century, computers and robots will be able to construct copies of themselves, and these copies will be a little better than the originals because of intelligent software.

**
Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life
** by
J. Craig Venter

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Alfred Russel Wallace, Barry Marshall: ulcers, bioinformatics, borderless world, Brownian motion, clean water, discovery of DNA, double helix, epigenetics, experimental subject, Isaac Newton, Islamic Golden Age, John von Neumann, Louis Pasteur, Mars Rover, Mikhail Gorbachev, phenotype, Richard Feynman, stem cell, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Turing machine

In 1936 Alan Turing, the cryptographer and pioneer of artificial intelligence, described what has come to be known as a Turing machine, which is described by a set of instructions written on a tape. Turing also defined a universal Turing machine, which can carry out any computation for which an instruction set can be written. This is the theoretical foundation of the digital computer. Turing’s ideas were developed further in the 1940s, by the remarkable American mathematician and polymath John von Neumann, who conceived of a self-replicating machine. Just as Turing had envisaged a universal machine, so von Neumann envisaged a universal constructor. The Hungarian-born genius outlined his ideas in a lecture, “The General and Logical Theory of Automata,” at the 1948 Hixon Symposium, in Pasadena, California. He pointed out that natural organisms “are, as a rule, much more complicated and subtle, and therefore much less well understood in detail than are artificial automata”; nevertheless, he maintained that some of the regularities we observe in the former might be instructive in our thinking about and planning of the latter.

**
Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else
** by
Steve Lohr

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

23andMe, Affordable Care Act / Obamacare, Albert Einstein, big data - Walmart - Pop Tarts, bioinformatics, business intelligence, call centre, cloud computing, computer age, conceptual framework, Credit Default Swap, crowdsourcing, Daniel Kahneman / Amos Tversky, Danny Hillis, data is the new oil, David Brooks, East Village, Edward Snowden, Emanuel Derman, Erik Brynjolfsson, everywhere but in the productivity statistics, Frederick Winslow Taylor, Google Glasses, impulse control, income inequality, indoor plumbing, industrial robot, informal economy, Internet of things, invention of writing, John von Neumann, Mark Zuckerberg, market bubble, meta analysis, meta-analysis, natural language processing, obamacare, pattern recognition, payday loans, personalized medicine, precision agriculture, pre–internet, Productivity paradox, RAND corporation, rising living standards, Robert Gordon, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley startup, six sigma, skunkworks, speech recognition, statistical model, Steve Jobs, Steven Levy, The Design of Experiments, the scientific method, Thomas Kuhn: the structure of scientific revolutions, unbanked and underbanked, underbanked, Von Neumann architecture, Watson beat the top human players on Jeopardy!

It was not IBM’s kind of business, even though it generated yearly sales of $4 billion. The big-data era is the next evolutionary upheaval in the landscape of computing. The things people want to do with data, like real-time analysis of data streams or continuously running machine-learning software, pose a threat to the traditional computer industry. Conventional computing—the Von Neumann architecture, named for mathematician and computer scientist John von Neumann—operates according to discrete steps of program, store, and process. Major companies and markets were built around those tiers of computing—software, disk drives, and microprocessors, respectively. Modern data computing, according to John Kelly, IBM’s senior vice president in charge of research, will “completely disrupt the industry as we know it, creating new platforms and players.” IBM, of course, sees opportunity in that disruption.

**
Geek Sublime: The Beauty of Code, the Code of Beauty
** by
Vikram Chandra

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, British Empire, business process, conceptual framework, create, read, update, delete, crowdsourcing, East Village, European colonialism, finite state, Firefox, Flash crash, glass ceiling, Grace Hopper, haute couture, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce

“The telephone switchboard-like appearance of the ENIAC programming cable-and-plug panels,” Ensmenger writes, “reinforced the notion that programmers were mere machine operators, that programming was more handicraft than science, more feminine than masculine, more mechanical than intellectual.”18 The planners considered the coding process so transparently simple that they couldn’t imagine that once in the machines, their algorithms might fault and hang, might need to be stopped. One of the ENIAC programmers, Betty Holberton, had to work very hard to convince John von Neumann that programs were complex and therefore fragile: But to my astonishment, [Dr von Neumann] never mentioned a stop instruction. So I did coyly say, “Don’t we need a stop instruction in this machine?” He said, “No we don’t need a stop instruction. We have all these empty sockets here that just let it go to bed.” And I went back home and I was really alarmed. After all, we had debugged the machine day and night for months just trying to get jobs on it.

**
The Inner Lives of Markets: How People Shape Them—And They Shape Us
** by
Tim Sullivan

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Airbnb, airport security, Al Roth, Andrei Shleifer, attribution theory, autonomous vehicles, barriers to entry, Brownian motion, centralized clearinghouse, clean water, conceptual framework, constrained optimization, continuous double auction, deferred acceptance, Donald Trump, Edward Glaeser, experimental subject, first-price auction, framing effect, frictionless, fundamental attribution error, George Akerlof, Goldman Sachs: Vampire Squid, helicopter parent, Internet of things, invisible hand, Isaac Newton, iterative process, Jean Tirole, Jeff Bezos, Johann Wolfgang von Goethe, John Nash: game theory, John von Neumann, Joseph Schumpeter, late fees, linear programming, Lyft, market clearing, market design, market friction, medical residency, multi-sided market, mutually assured destruction, Nash equilibrium, Occupy movement, Peter Thiel, pets.com, pez dispenser, pre–internet, price mechanism, price stability, prisoner's dilemma, profit motive, proxy bid, RAND corporation, ride hailing / ride sharing, Robert Shiller, Robert Shiller, Ronald Coase, school choice, school vouchers, sealed-bid auction, second-price auction, second-price sealed-bid, sharing economy, Silicon Valley, spectrum auction, Steve Jobs, Tacoma Narrows Bridge, technoutopianism, telemarketer, The Market for Lemons, The Wisdom of Crowds, Thomas Malthus, Thorstein Veblen, trade route, transaction costs, two-sided market, uranium enrichment, Vickrey auction, winner-take-all economy

The foundation’s founding motto was “Science is Measurement.”11 The second, the RAND Corporation, first established as a joint project by the Douglas Aircraft Company and the US Department of War in 1945, used game theory to analyze the United States’ geopolitical position relative to the Soviet Union. Game theory—a mathematical approach to analyzing strategic choices—emerged in the 1930s from the work of Princeton mathematician John von Neumann, who collaborated with his economist colleague Oskar Morgenstern to write Theory of Games and Economic Behavior (published in 1944), which launched the field. Their book provided an analytical framework for figuring out, say, what Pepsi should do if Coke lowers its prices. That depends on how Pepsi’s CEO thinks Coke will respond, which in turn depends on how Coke’s CEO expects Pepsi to respond to the price cut.
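The circular reasoning in the Pepsi-and-Coke example is what an equilibrium concept cuts through: a pair of strategies is a pure-strategy Nash equilibrium when neither player gains by deviating alone. A minimal brute-force sketch in Python; the payoff numbers for the hypothetical pricing game are invented for illustration.

```python
# Brute-force search for pure-strategy Nash equilibria in a 2-player game.
from itertools import product

def pure_nash_equilibria(payoffs, actions):
    """Return action pairs where neither player can gain by deviating alone.

    payoffs maps (row_action, col_action) -> (row_payoff, col_payoff)."""
    equilibria = []
    for a, b in product(actions, actions):
        row_ok = all(payoffs[(a, b)][0] >= payoffs[(a2, b)][0] for a2 in actions)
        col_ok = all(payoffs[(a, b)][1] >= payoffs[(a, b2)][1] for b2 in actions)
        if row_ok and col_ok:
            equilibria.append((a, b))
    return equilibria

# Hypothetical profits: each firm does better cutting price whatever the
# rival does, so mutual price-cutting is the only equilibrium.
actions = ["high", "low"]
payoffs = {
    ("high", "high"): (10, 10), ("high", "low"): (2, 12),
    ("low", "high"): (12, 2),   ("low", "low"): (5, 5),
}
print(pure_nash_equilibria(payoffs, actions))  # -> [('low', 'low')]
```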

**
The Supermen: The Story of Seymour Cray and the Technical Wizards Behind the Supercomputer
** by
Charles J. Murray

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Berlin Wall, fear of failure, John von Neumann, pattern recognition, Ralph Waldo Emerson, Silicon Valley

Scientists estimated that the temperature at the point of detonation was tens of millions of degrees and that pressures were tens of billions of times atmospheric. The release of energy was simply beyond the bounds of human imagination. Being anywhere near a nuclear blast was probably the closest thing on earth to hell itself. That knowledge had, in fact, been one of the driving forces behind the formation of the new lab. Legend held that on his deathbed, world-renowned mathematician John von Neumann had called for a greater push in the area of computational study of nuclear weapons. "Never let the lab be like the aircraft industry," he had said, "building, crashing, and then fixing." The concept of computing was not new to nuclear scientists. Those at Los Alamos National Laboratory (or more accurately, their wives) had used primitive calculating machinery to work through the mysteries of Fat Man and Little Boy.

**
Time Travel: A History
** by
James Gleick

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, Albert Einstein, Albert Michelson, Arthur Eddington, augmented reality, butterfly effect, crowdsourcing, Doomsday Book, index card, Isaac Newton, John von Neumann, luminiferous ether, Marshall McLuhan, Norbert Wiener, pattern recognition, Richard Feynman, Richard Feynman, Schrödinger's Cat, self-driving car, Stephen Hawking, telepresence, wikimedia commons

On ne s’évadait pas du Temps (“there was no escaping Time”). The future has followed him here. Only at the last instant does he realize whose death he had witnessed as a child. * * * *1 A rebellious elector in Virginia refused to cast his ballot for the vote winners, Richard Nixon and Spiro Agnew, in 1972 and voted instead for John Hospers, on the Libertarian line. *2 Gödel’s proof “is more than a monument,” said John von Neumann, “it is a landmark which will remain visible far in space and time….The subject of logic has completely changed its nature and possibilities with Gödel’s achievement.” *3 Also, the Gödelian universe does not expand, whereas most cosmologists are pretty sure that ours does. *4 Gödel’s biographer Rebecca Goldstein remarked, “As a physicist and a man of common sense, Einstein would have preferred that his field equations excluded such an Alice-in-Wonderland possibility as looping time

**
The Economic Singularity: Artificial intelligence and the death of capitalism
** by
Calum Chace

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, additive manufacturing, agricultural Revolution, AI winter, Airbnb, artificial general intelligence, augmented reality, autonomous vehicles, banking crisis, Baxter: Rethink Robotics, Berlin Wall, Bernie Sanders, bitcoin, blockchain, call centre, Chris Urmson, congestion charging, credit crunch, David Ricardo: comparative advantage, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Flynn Effect, full employment, future of work, gender pay gap, gig economy, Google Glasses, Google X / Alphabet X, income inequality, industrial robot, Internet of things, invention of the telephone, invisible hand, James Watt: steam engine, Jaron Lanier, Jeff Bezos, job automation, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, knowledge worker, lump of labour, Lyft, Mark Zuckerberg, Martin Wolf, McJob, means of production, Milgram experiment, Narrative Science, natural language processing, new economy, Occupy movement, Oculus Rift, PageRank, pattern recognition, post scarcity, post-industrial society, precariat, prediction markets, QWERTY keyboard, railway mania, RAND corporation, Ray Kurzweil, RFID, Rodney Brooks, Satoshi Nakamoto, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Skype, software is eating the world, speech recognition, Stephen Hawking, Steve Jobs, TaskRabbit, technological singularity, Thomas Malthus, transaction costs, Tyler Cowen: Great Stagnation, Uber for X, universal basic income, Vernor Vinge, working-age population, Y Combinator, young professional

[ii] Many textbooks place the start of the industrial revolution in the second half of the 18th century, but I like the argument that Thomas Newcomen's creation of the first practical steam engine in 1712 provides the best origin story. [iii] There is no general agreement about when the information revolution started. In his 1962 book “The Production and Distribution of Knowledge in the United States”, the Austrian economist Fritz Machlup suggested that with 29% of GDP accounted for by the knowledge industry, it had begun. [iv] The term was first applied to human affairs back in the 1950s by John von Neumann, a key figure in the development of the computer. The physicist and science fiction author Vernor Vinge argued in 1993 that artificial intelligence and other technologies would cause a singularity in human affairs within 30 years. This idea was picked up and popularised by the inventor and futurist Ray Kurzweil, who believes that computers will overtake humans in general intelligence in 2029, and a singularity will arrive in 2045. https://en.wikipedia.org/wiki/Technological_singularity [v] The event horizon of a black hole is the point beyond which events cannot affect an outside observer, or in other words, the point of no return.

**
The Art of Computer Programming: Fundamental Algorithms
** by
Donald E. Knuth

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

discrete time, distributed generation, fear of failure, Fermat's Last Theorem, Isaac Newton, Jacquard loom, Jacquard loom, John von Neumann, linear programming, linked data, Menlo Park, probability theory / Blaise Pascal / Pierre de Fermat, Richard Feynman, sorting algorithm, stochastic process, Turing machine

Dijkstra, CACM 18 (1975), 453-457; A Discipline of Programming (Prentice-Hall, 1976).] The concept of inductive assertions actually appeared in embryonic form in 1946, at the same time as flow charts were introduced by H. H. Goldstine and J. von Neumann. Their original flow charts included "assertion boxes" that are in close analogy with the assertions in Fig. 4. [See John von Neumann, Collected Works 5 (New York: Macmillan, 1963), 91-99. See also A. M. Turing's early comments about verification in Report of a Conference on High Speed Automatic Calculating Machines (Cambridge Univ., 1949), 67-68 and figures; reprinted with commentary by F. L. Morris and C. B. Jones in Annals of the History of Computing 6 (1984), 139-143.] The understanding of the theory of a routine may be greatly aided by providing, at the time of construction, one or two statements concerning the state of the machine at well chosen points. ...

…

However, these were essentially "open subroutines," meant to be inserted into a program where needed instead of being linked up dynamically. Babbage's planned machine was controlled by sequences of punched cards, as on the Jacquard loom; the Mark I was controlled by a number of paper tapes. Thus they were quite different from today's stored-program computers. Subroutine linkage appropriate to stored-program machines, with the return address supplied as a parameter, was discussed by Herman H. Goldstine and John von Neumann in their widely circulated monograph on programming, written during 1946 and 1947; see von Neumann's Collected Works 5 (New York: Macmillan, 1963), 215-235. The main routine of their programs was responsible for storing parameters into the body of the subroutine, instead of passing the necessary information in registers. In England, A. M. Turing had designed hardware and software for subroutine linkage as early as 1945; see Proceedings of a Second Symposium on Large-Scale Digital Calculating Machinery (Cambridge, Mass.: Harvard University, 1949), 87-90; B.

…

Genuys (Academic Press, 1968), 43-112; BIT 8 (1968), 174-186; Acta Informatica 1 (1971), 115-138]. The paper "Input-Output Buffering and FORTRAN" by David E. Ferguson, JACM 7 (1960), 1-9, describes buffer circles and gives a detailed description of simple buffering with many units at once. About 1000 instructions is a reasonable upper limit for the complexity of the problems now envisioned. — HERMAN GOLDSTINE and JOHN VON NEUMANN (1946) CHAPTER TWO INFORMATION STRUCTURES I think that I shall never see A poem lovely as a tree. — JOYCE KILMER (1913) Yea, from the table of my memory I'll wipe away all trivial fond records. — Hamlet (Act I, Scene 5, Line 98) 2.1. INTRODUCTION Computer programs usually operate on tables of information. In most cases these tables are not simply amorphous masses of numerical values; they involve important structural relationships between the data elements.

**
The Strangest Man: The Hidden Life of Paul Dirac, Mystic of the Atom
** by
Graham Farmelo

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, anti-communist, Arthur Eddington, Berlin Wall, cuban missile crisis, double helix, Ernest Rutherford, Fall of the Berlin Wall, Fellow of the Royal Society, financial independence, gravity well, Henri Poincaré, invention of radio, invisible hand, Isaac Newton, John von Neumann, Kevin Kelly, Murray Gell-Mann, Richard Feynman, Richard Feynman, Simon Singh, Solar eclipse in 1919, Stephen Hawking, strikebreaker, University of East Anglia

Since the break-up of the Austro-Hungarian Empire in 1918, Hungary had been through a bloody Bolshevik revolution led by Béla Kun and the White Terror organised by nationalist and anti-Semitic forces. Wigner was fearful of the future of the country, then under Admiral Horthy’s authoritarian regime. Despite all the political upheavals, Wigner had an exceptionally fine school education in mathematics and science, even more thorough than Dirac’s. Historians still debate why Budapest in the early twentieth century produced so many intellectual innovators, including John von Neumann, whom Dirac would later rate as the world’s finest mathematician, and Wigner’s friends Leó Szilárd and Edward Teller, both to do important research into the first nuclear weapons.17 The success of this cohort of Hungarians is partly due to their education, shortly after the war, in Budapest’s excellent high schools and partly to the vibrancy and ambition of the city’s Western-focused culture.18 Wigner was one of the shyest and most uncommunicative of the quantum physicists but, compared with Dirac, he was gregariousness itself, so conversation during their evening meals together was probably strained.

…

In the summer of 1939, Wigner, Szilárd and Teller persuaded Einstein to write to President Roosevelt, drawing his attention to the possibility of nuclear weapons and the danger that the Germans might produce one first.17 After a long delay, Roosevelt invited Einstein to join a committee of government advisers but he brusquely declined and sat out the war at the Institute for Advanced Study in Princeton, where word spread that the Nazis were indeed working on a bomb. In the spring of 1940, Dirac’s friends Oswald Veblen and John von Neumann wrote to the director Frank Aydelotte, urgently seeking his assistance to fund investigations into the chain reaction. In their letter, they mentioned a recent conversation with the Dutch physical chemist Peter Debye, who had led one of Berlin’s largest research institutes until the German authorities sent him abroad in order to free his laboratories for secret war work. [H]e made no secret of the fact that this work is essentially a study of the fission of uranium.

**
The Signal and the Noise: Why So Many Predictions Fail-But Some Don't
** by
Nate Silver

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

airport security, availability heuristic, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, big-box store, Black Swan, Broken windows theory, Carmen Reinhart, Claude Shannon: information theory, Climategate, Climatic Research Unit, cognitive dissonance, collapse of Lehman Brothers, collateralized debt obligation, complexity theory, computer age, correlation does not imply causation, Credit Default Swap, credit default swaps / collateralized debt obligations, cuban missile crisis, Daniel Kahneman / Amos Tversky, diversification, Donald Trump, Edmond Halley, Edward Lorenz: Chaos theory, en.wikipedia.org, equity premium, Eugene Fama: efficient market hypothesis, everywhere but in the productivity statistics, fear of failure, Fellow of the Royal Society, Freestyle chess, fudge factor, George Akerlof, haute cuisine, Henri Poincaré, high batting average, housing crisis, income per capita, index fund, Internet Archive, invention of the printing press, invisible hand, Isaac Newton, James Watt: steam engine, John Nash: game theory, John von Neumann, Kenneth Rogoff, knowledge economy, locking in a profit, Loma Prieta earthquake, market bubble, Mikhail Gorbachev, Moneyball by Michael Lewis explains big data, Monroe Doctrine, mortgage debt, Nate Silver, new economy, Norbert Wiener, PageRank, pattern recognition, pets.com, prediction markets, Productivity paradox, random walk, Richard Thaler, Robert Shiller, Robert Shiller, Rodney Brooks, Ronald Reagan, Saturday Night Live, savings glut, security theater, short selling, Skype, statistical model, Steven Pinker, The Great Moderation, The Market for Lemons, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, transfer pricing, University of East Anglia, Watson beat the top human players on Jeopardy!, wikimedia commons

As you’ll see in chapter 9, computers aren’t good at every task we hope they might accomplish and have been far from a panacea for prediction. But computers are very good at computing: at repeating the same arithmetic tasks over and over again and doing so quickly and accurately. Tasks like chess that abide by relatively simple rules, but which are difficult computationally, are right in their wheelhouse. So, potentially, was the weather. The first computer weather forecast was made in 1950 by the mathematician John von Neumann, who used a machine that could make about 5,000 calculations per second.17 That was a lot faster than Richardson could manage with a pencil and paper in a French hay field. Still, the forecast wasn’t any good, failing to do any better than a more-or-less random guess. Eventually, by the mid-1960s, computers would start to demonstrate some skill at weather forecasting. And the Bluefire—some 15 billion times faster than the first computer forecast and perhaps a quadrillion times faster than Richardson—displays quite a bit of acumen because of the speed of computation.

…

It actually does a much worse job of explaining the real world.58 As obvious as this might seem when explained in this way, many forecasters completely ignore this problem. The wide array of statistical methods available to researchers enables them to be no less fanciful—and no more scientific—than a child finding animal patterns in clouds.* “With four parameters I can fit an elephant,” the mathematician John von Neumann once said of this problem.59 “And with five I can make him wiggle his trunk.” Overfitting represents a double whammy: it makes our model look better on paper but perform worse in the real world. Because of the latter trait, an overfit model eventually will get its comeuppance if and when it is used to make real predictions. Because of the former, it may look superficially more impressive until then, claiming to make very accurate and newsworthy predictions and to represent an advance over previously applied techniques.
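Von Neumann's elephant quip is easy to reproduce: give a model as many parameters as data points and it will fit the sample exactly while saying little about anything beyond it. A small pure-Python sketch of this overfitting effect, using Lagrange interpolation; the noisy data points, which roughly follow the trend y ≈ x, are invented for illustration.

```python
def lagrange(points):
    """Return the unique polynomial of degree len(points)-1 that
    passes exactly through every (x, y) sample."""
    def p(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)   # Lagrange basis factor
            total += term
        return total
    return p

# Five noisy samples of a trend close to y = x; five parameters fit them exactly.
points = [(0, 0.1), (1, 1.0), (2, 1.9), (3, 3.2), (4, 3.9)]
p = lagrange(points)
print(p(2))             # perfect on every training point -> 1.9
print(round(p(5), 2))   # just outside the sample the fit ignores the trend (~5) -> 1.6
```

The degree-four polynomial scores perfectly "on paper" (zero error at every sample) yet predicts 1.6 where the underlying trend suggests about 5, which is exactly the double whammy Silver describes.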

**
Red Plenty
** by
Francis Spufford

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

affirmative action, anti-communist, Anton Chekhov, asset allocation, Buckminster Fuller, clean water, cognitive dissonance, computer age, double helix, Fellow of the Royal Society, John von Neumann, linear programming, market clearing, New Journalism, oil shock, Plutocrats, plutocrats, profit motive, RAND corporation, Simon Kuznets, the scientific method

He could help to make it happen, three extra percent at a time, though he already understood that it would take a huge quantity of work to compose the necessary dynamic models. It might be a lifetime’s work. But he could do it. He could tune up the whole Soviet orchestra, if they’d let him. His left foot dripped. He really must find a way to get new shoes. Notes – I.1 The Prodigy, 1938 1 Without thinking about it, Leonid Vitalevich: Leonid Vitalevich Kantorovich (1912–86), mathematician and economist, nearest Soviet equivalent to John von Neumann, later (1975) to be the only Soviet winner of the Nobel Prize for Economics (shared with Tjalling Koopmans). Calling someone by first name and patronymic expresses formal esteem, in Russian; he is mostly referred to that way here, to suggest that he is being viewed with respectful acquaintance but not intimacy. With fictional elaboration, this scene on the tram is true to his history, for which see his Nobel Prize autobiography, in Assar Lindbeck, ed., Nobel Lectures, Economics 1969–1980 (Singapore: World Scientific Publishing Co., 1992); and the collection of his letters and articles, with colleagues’ memoirs, in V.L.Kantorovich, S.S.Kutateladze and Ya.

…

For an exploitation in contemporary fantasy of Russian folklore and the Soviet/post-Soviet setting, see Liz Williams, Nine Layers of Sky (New York: Bantam Spectra, 2003). 3 The stories’ name for a magic carpet: see Kravchenko, The World of the Russian Fairy Tale. 4 ‘In our day,’ Nikita Khrushchev told a crowd: see Khrushchev in America: Full Texts of the Speeches Made by N.S.Khrushchev on His Tour of the United States, September 15–27, 1959 (New York: Crosscurrents Press, 1960), which includes this speech, made in Moscow on his return. 5 All Russia was (in Lenin’s words) ‘one office, one factory’: technically, in fact, a prediction by him about the working of post-revolutionary society, made just before the Bolshevik putsch, and published just after it, in The State and Revolution (1918), ch. 5. ‘The whole of society will have become one office and one factory with equal work and equal pay.’ There are many, many editions, but see, for example, V.I.Lenin, Selected Works vol. 2 (Moscow: Progress Publishers, 1970).

**
Climbing Mount Improbable
** by
Richard Dawkins,
Lalla Ward

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Buckminster Fuller, computer age, Drosophila, Fellow of the Royal Society, industrial robot, invention of radio, John von Neumann, Menlo Park, phenotype, Robert X Cringely, stem cell, trade route

Make a new robot, then feed the same TRIP program into its on-board computer and turn it loose on the world to do the same thing.’ The hypothetical robot that we have now worked towards can be called a TRIP robot. A TRIP robot such as we are now imagining is a machine of great technical ingenuity and complexity. The principle was discussed by the celebrated Hungarian-American mathematician John von Neumann (one of two candidates for the honoured title of the father of the modern computer—the other was Alan Turing, the young British mathematician who, through his codebreaking genius, may have done more than any other individual on the Allied side to win the Second World War, but who was driven to suicide after the war by judicial persecution, including enforced hormone injections, for his homosexuality).

**
The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman
** by
Richard P. Feynman,
Jeffrey Robbins

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Brownian motion, impulse control, index card, John von Neumann, Murray Gell-Mann, pattern recognition, Richard Feynman, Richard Feynman, Richard Feynman: Challenger O-ring, the scientific method

Ultimately, for fun again and intellectual pleasure, we could imagine machines as tiny as a few microns across, with wheels and cables all interconnected by wires, silicon connections, so that the thing as a whole, a very large device, moves not like the awkward motions of our present stiff machines but in the smooth way of the neck of a swan, which after all is a lot of little machines, the cells all interconnected and all controlled in a smooth way. Why can’t we do that ourselves? ______ *John von Neumann (1903–1957), a Hungarian-American mathematician who is credited as being one of the fathers of the computer. Ed. *The jerky movements of particles caused by the constant random collisions of molecules, first noted in print in 1828 by botanist Robert Brown, and explained by Albert Einstein in a 1905 paper in Annalen der Physik. Ed. *Sci. Am. July 1985; Japanese Transl.–SAIENSU, Sept. 1985.

**
My Life as a Quant: Reflections on Physics and Finance
** by
Emanuel Derman

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Berlin Wall, bioinformatics, Black-Scholes formula, Brownian motion, capital asset pricing model, Claude Shannon: information theory, Emanuel Derman, fixed income, Gödel, Escher, Bach, haute couture, hiring and firing, implied volatility, interest rate derivative, Jeff Bezos, John von Neumann, law of one price, linked data, Long Term Capital Management, moral hazard, Murray Gell-Mann, pre–internet, publish or perish, quantitative trading / quantitative ﬁnance, Richard Feynman, Sharpe ratio, statistical arbitrage, statistical model, Stephen Hawking, Steve Jobs, stochastic volatility, technology bubble, transaction costs, value at risk, volatility smile, Y2K, yield curve, zero-coupon bond

It looked very unprofessorally businesslike, an early precursor of soon-to-arrive European Filofaxes and, a decade later, American Palm Pilots. David clearly thought big. In those days he was planning what he called "NonVon," a parallel-processing computer composed of many small processors and memory units. It was to be the antithesis of the standard computer with one large central processor, a design that had prevailed since John von Neumann and the ENIAC computer of the 1940s. David's confidence inspired fear and envy. John Kender complained half-jokingly to me that while he and the other assistant professors in the tenure race at Columbia were trying to get modest government grants to do their work, David was always talking about ambitious proposals on a much larger scale, with plans for NonVon eventually to require a staff of tens to hundreds.

**
We-Think: Mass Innovation, Not Mass Production
** by
Charles Leadbeater

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

1960s counterculture, Andrew Keen, barriers to entry, bioinformatics, c2.com, call centre, citizen journalism, clean water, cloud computing, complexity theory, congestion charging, death of newspapers, Debian, digital Maoism, double helix, Edward Lloyd's coffeehouse, frictionless, frictionless market, future of work, game design, Google Earth, Google X / Alphabet X, Hacker Ethic, Hernando de Soto, hive mind, Howard Rheingold, interchangeable parts, Isaac Newton, James Watt: steam engine, Jane Jacobs, Jaron Lanier, Jean Tirole, jimmy wales, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, lone genius, M-Pesa, Mark Zuckerberg, Marshall McLuhan, Menlo Park, microcredit, new economy, Nicholas Carr, online collectivism, planetary scale, post scarcity, Richard Stallman, Silicon Valley, slashdot, social web, software patent, Steven Levy, Stewart Brand, supply-chain management, The Death and Life of Great American Cities, the market place, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Whole Earth Catalog, Zipcar

But for simpler, basic goods – tin roofs, plates, combs, stoves, spades, buckets – manufacturing might become much more localised. Just as the fax machine, printer and photocopier have spread the world over, so could low-cost manufacturing of machines that make reliable products, customised to local needs. One such machine could be based on Bath University’s RepRap, which looks like a large photocopier and can make three-dimensional objects from designs stored inside its computer. In the 1950s the mathematician John von Neumann imagined a universal constructor: a computer linked to a manufacturing robot that could make virtually any physical object, including replicating itself. The closest the world got to such a machine was the replicator in Star Trek, which could make any object out of thin air; the RepRap might make that a reality. The RepRap developed as a rapid prototyping machine, making models that are built up by thin layers of plastic being sprayed on top of one another.

**
Some Remarks
** by
Neal Stephenson

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

airport security, augmented reality, barriers to entry, British Empire, cable laying ship, call centre, cellular automata, edge city, Eratosthenes, Fellow of the Royal Society, Hacker Ethic, impulse control, Iridium satellite, Isaac Newton, Jaron Lanier, John von Neumann, Just-in-time delivery, Kevin Kelly, music of the spheres, Norbert Wiener, offshore financial centre, oil shock, packet switching, pirate software, Richard Feynman, Saturday Night Live, shareholder value, Silicon Valley, Skype, slashdot, social web, Socratic dialogue, South China Sea, special economic zone, Stephen Hawking, the scientific method, trade route, Turing machine, uranium enrichment, Vernor Vinge, X Prize

Following Leibniz’s suggestion, if F exists from t1 to tn and has a different thought at each moment of its existence, then at every moment, there will be an instruction about what to think next. The present thought occurring at t1, together with the Production Rule, will determine what F will think at t2.” Combined with the monadic property of being able to perceive the states of all other monads, this comes close to being a mathematically formal definition of cellular automata, a branch of mathematics generally agreed to have been invented by Stanislaw Ulam and John von Neumann during the 1940s as an outgrowth of work at Los Alamos. The impressive capabilities of such systems have, in subsequent decades, drawn the attention of many luminaries from the worlds of mathematics and physics, some of whom have proposed that the physical universe might, in fact, consist of cellular automata carrying out a calculation—a hypothesis known as Digital Physics, or It from Bit. 4.
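
The cellular-automaton idea the passage attributes to Ulam and von Neumann can be sketched very compactly: a row of cells, each updated from the states of its neighbours by a fixed rule. The sketch below is my own minimal illustration (an elementary one-dimensional automaton, with "Rule 110" as an arbitrary illustrative choice), not an example from the book.

```python
# Minimal one-dimensional cellular automaton, of the general kind pioneered
# by Ulam and von Neumann. The rule number (110) is an illustrative choice.

def step(cells, rule=110):
    """Apply an elementary CA rule to a row of 0/1 cells (wrapping edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        # Each cell's next state depends only on its left, own, and right states.
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> neighborhood) & 1)
    return out

row = [0] * 15
row[7] = 1  # start from a single live cell in the middle
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Despite the triviality of the update rule, such systems can exhibit the "impressive capabilities" the passage mentions; Rule 110 in particular is known to be computationally universal.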

**
What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry
** by
John Markoff

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Any sufficiently advanced technology is indistinguishable from magic, Apple II, back-to-the-land, Bill Duvall, Bill Gates: Altair 8800, Buckminster Fuller, California gold rush, card file, computer age, computer vision, conceptual framework, cuban missile crisis, Douglas Engelbart, Dynabook, El Camino Real, general-purpose programming language, Golden Gate Park, Hacker Ethic, hypertext link, informal economy, information retrieval, invention of the printing press, Jeff Rulifson, John Nash: game theory, John von Neumann, Kevin Kelly, knowledge worker, Mahatma Gandhi, Menlo Park, Mother of all demos, Norbert Wiener, packet switching, Paul Terrell, popular electronics, QWERTY keyboard, RAND corporation, RFC: Request For Comment, Richard Stallman, Robert X Cringely, Sand Hill Road, Silicon Valley, Silicon Valley startup, South of Market, San Francisco, speech recognition, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Thorstein Veblen, Turing test, union organizing, Vannevar Bush, Whole Earth Catalog, William Shockley: the traitorous eight

Composed of thirteen thousand mechanical relays, the SSEC, which could perform a lumbering twenty-five instructions per second (today an Intel Pentium microprocessor will easily surpass three billion instructions in the same second), was a computing machine that straddled the divide between calculators and modern computers. It didn’t have a memory in the modern sense, and programs were entered via punched paper tape. The skills Crane developed on the SSEC later proved useful when he was hired to work on a new computer being built by the legendary mathematician John von Neumann at the Institute for Advanced Study in Princeton. Frustrated with the slow speed of getting data into and out of his machine, von Neumann had persuaded IBM’s founder, Tom Watson Sr., to donate a punch-card reader to help speed up the process. Since he was one of the few people who knew how card readers worked, Crane was enlisted in the project. In Princeton, he was witness to one of the world’s first artificial light shows when, late at night, he sat and watched the Johniac’s one hundred thousand neon tubes dance on and off in rhythmic patterns.

**
Utopias: A Brief History From Ancient Writings to Virtual Communities
** by
Howard P. Segal

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

1960s counterculture, British Empire, Buckminster Fuller, complexity theory, David Brooks, death of newspapers, dematerialisation, deskilling, energy security, European colonialism, Francis Fukuyama: the end of history, full employment, future of journalism, garden city movement, germ theory of disease, Golden Gate Park, invention of the printing press, Isaac Newton, Jeff Bezos, John von Neumann, knowledge economy, Louis Pasteur, Mark Zuckerberg, means of production, Nicholas Carr, Nikolai Kondratiev, out of africa, Ralph Waldo Emerson, Ray Kurzweil, Ronald Reagan, Silicon Valley, Skype, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Stewart Brand, technoutopianism, Thomas Malthus, Thorstein Veblen, transcontinental railway, traveling salesman, union organizing, urban planning, War on Poverty, Whole Earth Catalog

So, too, do the failures of other experts in such realms as environmental protection and nuclear power to achieve promised goals safely and efficiently.57 For that matter, predictions of an ever growing population in the developing world are now recognized as outdated, in favor of a far more complex picture.58 These dismal records have in turn led to a declining faith in forecasting as a serious intellectual and moral enterprise—just as, paradoxically, forecasting has become a highly profitable industry. A revealing footnote here is the failure of the otherwise brilliant scientists and engineers who invented computers during and after World War II to anticipate the evolution of the computers of their day. Interviews, memoirs, and other accounts from pioneers such as John Mauchly and John von Neumann reveal no expectations of significant changes from the handful of room-sized behemoths—operated by skilled programmers and dependent on vacuum tubes that constantly needed to be replaced—that were to be used only by the largest national and international institutions to solve the most complex quantitative problems. They were generally unable to “step outside the box” in speculating about computers’ future.

**
This Will Make You Smarter: 150 New Scientific Concepts to Improve Your Thinking
** by
John Brockman

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

23andMe, Albert Einstein, Alfred Russel Wallace, banking crisis, Barry Marshall: ulcers, Benoit Mandelbrot, Berlin Wall, biofilm, Black Swan, butterfly effect, Cass Sunstein, cloud computing, congestion charging, correlation does not imply causation, Daniel Kahneman / Amos Tversky, dark matter, data acquisition, David Brooks, delayed gratification, Emanuel Derman, epigenetics, Exxon Valdez, Flash crash, Flynn Effect, hive mind, impulse control, information retrieval, Isaac Newton, Jaron Lanier, John von Neumann, Kevin Kelly, mandelbrot fractal, market design, Mars Rover, Marshall McLuhan, microbiome, Murray Gell-Mann, Nicholas Carr, open economy, place-making, placebo effect, pre–internet, QWERTY keyboard, random walk, randomized controlled trial, rent control, Richard Feynman, Richard Feynman: Challenger O-ring, Richard Thaler, Schrödinger's Cat, security theater, Silicon Valley, stem cell, Steve Jobs, Steven Pinker, Stewart Brand, the scientific method, Thorstein Veblen, Turing complete, Turing machine, Walter Mischel, Whole Earth Catalog

In a zero-sum game, a rational actor seeking the greatest gain for himself or herself will necessarily be seeking the maximum loss for the other actor. In a positive-sum game, a rational, self-interested actor may benefit the other actor with the same choice that benefits himself or herself. More colloquially, positive-sum games are called win-win situations and are captured in the cliché “Everybody wins.” This family of concepts—zero-sum, nonzero-sum, positive-sum, negative-sum, constant-sum, and variable-sum games—was introduced by John von Neumann and Oskar Morgenstern when they invented the mathematical theory of games in 1944. The Google Books Ngram tool shows that the terms saw a steady increase in popularity beginning in the 1950s, and their colloquial relative “win-win” began a similar ascent in the 1970s. Once people are thrown together in an interaction, their choices don’t determine whether they are in a zero- or nonzero-sum game; the game is a part of the world they live in.
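
The zero-sum/positive-sum distinction the passage describes can be made concrete with toy payoff matrices. The two games below (matching pennies and a simple trade) are my own illustrative examples, not von Neumann and Morgenstern's:

```python
# In a zero-sum game the two players' payoffs sum to zero in every outcome;
# in a positive-sum game some outcomes make both players better off.

def is_zero_sum(payoffs):
    """payoffs maps (row_choice, col_choice) -> (row_payoff, col_payoff)."""
    return all(a + b == 0 for a, b in payoffs.values())

# Matching pennies: whatever one player wins, the other loses.
matching_pennies = {
    ("heads", "heads"): (1, -1), ("heads", "tails"): (-1, 1),
    ("tails", "heads"): (-1, 1), ("tails", "tails"): (1, -1),
}

# A voluntary trade: if it happens, both sides gain ("win-win").
trade = {
    ("sell", "buy"): (2, 3), ("sell", "pass"): (0, 0),
    ("hold", "buy"): (0, 0), ("hold", "pass"): (0, 0),
}

print(is_zero_sum(matching_pennies))  # True
print(is_zero_sum(trade))             # False
```

As the passage notes, the players' choices don't change which kind of game they are in; the payoff structure is fixed by the situation itself.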

**
Wonderland: How Play Made the Modern World
** by
Steven Johnson

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, Alfred Russel Wallace, Antoine Gombaud: Chevalier de Méré, Berlin Wall, bitcoin, Book of Ingenious Devices, Buckminster Fuller, Claude Shannon: information theory, Clayton Christensen, colonial exploitation, computer age, conceptual framework, crowdsourcing, cuban missile crisis, Drosophila, Fellow of the Royal Society, game design, global village, Hedy Lamarr / George Antheil, HyperCard, invention of air conditioning, invention of the printing press, invention of the telegraph, Islamic Golden Age, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, Jane Jacobs, John von Neumann, joint-stock company, Joseph-Marie Jacquard, Landlord's Game, lone genius, megacity, Minecraft, Murano, Venice glass, music of the spheres, Necker cube, New Urbanism, Oculus Rift, On the Economy of Machinery and Manufactures, pattern recognition, pets.com, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, QWERTY keyboard, Ray Oldenburg, spice trade, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, supply-chain management, talking drums, the built environment, The Great Good Place, the scientific method, The Structural Transformation of the Public Sphere, trade route, Turing machine, Turing test, Upton Sinclair, urban planning, Victor Gruen, Watson beat the top human players on Jeopardy!, white flight, Whole Earth Catalog, working poor, Wunderkammern

Turing’s speculations form a kind of origin point for two parallel paths that would run through the rest of the century: building intelligence into computers by teaching them to play chess, and studying humans playing chess as a way of understanding our own intelligence. Those interpretative paths would lead to some extraordinary breakthroughs: from the early work on cybernetics and game theory from people like Claude Shannon and John von Neumann, to machines like IBM’s Deep Blue that could defeat grandmasters with ease. In cognitive science, the litany of insights that derived from the study of chess could almost fill an entire textbook, insights that have helped us understand the human capacity for problem solving, pattern recognition, visual memory, and the crucial skill that scientists call, somewhat awkwardly, chunking, which involves grouping a collection of ideas or facts into a single “chunk” so that they can be processed and remembered as a unit.

**
The Undoing Project: A Friendship That Changed Our Minds
** by
Michael Lewis

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, availability heuristic, Cass Sunstein, choice architecture, complexity theory, Daniel Kahneman / Amos Tversky, Donald Trump, Douglas Hofstadter, endowment effect, feminist movement, framing effect, hindsight bias, John von Neumann, loss aversion, medical residency, Menlo Park, Murray Gell-Mann, Nate Silver, New Journalism, Richard Thaler, Saturday Night Live, statistical model, Walter Mischel, Yom Kippur War

It distinctly did not explain the human desire to buy a lottery ticket, however. It effectively turned a blind eye to gambling. Odd this, as the search for a theory about how people made risky decisions had started as an attempt to make Frenchmen shrewder gamblers. Amos’s text skipped over the long, tortured history of utility theory after Bernoulli all the way to 1944. A Hungarian Jew named John von Neumann and an Austrian anti-Semite named Oskar Morgenstern, both of whom fled Europe for America, somehow came together that year to publish what might be called the rules of rationality. A rational person making a decision between risky propositions, for instance, shouldn’t violate the von Neumann and Morgenstern transitivity axiom: If he preferred A to B and B to C, then he should prefer A to C.
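
The transitivity axiom the passage states (prefer A to B and B to C, then prefer A to C) is mechanical enough to check by brute force. The helper below is my own illustrative sketch of that check, not anything from the book:

```python
# Check a preference relation for violations of the von Neumann-Morgenstern
# transitivity axiom: (a > b) and (b > c) must imply (a > c).

from itertools import permutations

def transitivity_violations(prefs):
    """prefs is a set of (x, y) pairs meaning 'x is preferred to y'."""
    items = {x for pair in prefs for x in pair}
    return [(a, b, c) for a, b, c in permutations(items, 3)
            if (a, b) in prefs and (b, c) in prefs and (a, c) not in prefs]

rational = {("A", "B"), ("B", "C"), ("A", "C")}  # obeys the axiom
cyclic   = {("A", "B"), ("B", "C"), ("C", "A")}  # circular preferences

print(transitivity_violations(rational))  # []
print(transitivity_violations(cyclic))
```

The cyclic relation is exactly the kind of behavior the axiom rules out as irrational: an agent who prefers A to B, B to C, and C to A can be turned into a money pump.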

**
Erwin Schrodinger and the Quantum Revolution
** by
John Gribbin

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Albert Michelson, All science is either physics or stamp collecting, Arthur Eddington, British Empire, Brownian motion, double helix, Drosophila, Edmond Halley, Ernest Rutherford, Fellow of the Royal Society, Henri Poincaré, Isaac Newton, John von Neumann, Richard Feynman, Schrödinger's Cat, Solar eclipse in 1919, The Present Situation in Quantum Mechanics, the scientific method, trade route, upwardly mobile

As the Oxford physicist David Deutsch (b. 1953) has put it, “a non-local hidden variable theory means, in ordinary language, a theory in which influences propagate across space and time without passing through the space in between: [in other words] they propagate instantaneously.”1 Apart from the momentum of the Copenhagen juggernaut, there was another reason why most physicists did not take hidden variables theory seriously in the 1950s. In 1932, John von Neumann (1903–57), a Hungarian-born mathematical genius, had published a book in which, among other things, he “proved” that hidden variables theories could not work. His contemporaries were so in awe of von Neumann’s ability that for a generation this proof was barely questioned, and it was widely cited as gospel, without being spelled out in full, in standard texts such as Max Born’s Natural Philosophy of Cause and Chance, published in 1949.

**
Powers and Prospects
** by
Noam Chomsky

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

anti-communist, Berlin Wall, Bretton Woods, colonial rule, declining real wages, deindustrialization, deskilling, Fall of the Berlin Wall, invisible hand, Jacques de Vaucanson, John von Neumann, Monroe Doctrine, RAND corporation, Ronald Reagan, South China Sea, theory of mind, Tobin tax, Turing test

Consider, say, chemistry and biology. The distinguished biologist François Jacob observes that ‘for the biologist, the living begins only with what was able to constitute a genetic program’, while ‘for the chemist, in contrast, it is somewhat arbitrary to make a demarcation where there can only be continuity’. Others might want to add crystals to the mix, or self-replicating automata of the kind pioneered by John von Neumann. There is no ‘right answer’, no reason to seek sharper boundaries to distinguish among physical, biological, chemical, and other aspects of the world. No discipline has any prior claim to particular objects in the world, whether they are complex molecules, stars, or human language. I should make it clear that these remarks are not uncontentious. There is much vigorous debate about the matter in the case of language, though rarely about other objects of the world.

**
Language and Mind
** by
Noam Chomsky

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Alfred Russel Wallace, finite state, John von Neumann, pattern recognition, phenotype, theory of mind

For those who sought a more mathematical formulation of the basic processes, there was the newly developed mathematical theory of communication, which, it was widely believed in the early 1950s, had provided a fundamental concept – the concept of “information” – that would unify the social and behavioral sciences and permit the development of a solid and satisfactory mathematical theory of human behavior on a probabilistic base. At about the same time, the theory of automata developed as an independent study, making use of closely related mathematical notions. And it was linked at once, and quite properly, to earlier explorations of the theory of neural nets. There were those – John von Neumann, for example – who felt that the entire development was dubious and shaky at best, and probably quite misconceived, but such qualms did not go far to dispel the feeling that mathematics, technology, and behavioristic linguistics and psychology were converging on a point of view that was very simple, very clear, and fully adequate to provide a basic understanding of what tradition had left shrouded in mystery.

**
Reinventing the Bazaar: A Natural History of Markets
** by
John McMillan

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

accounting loophole / creative accounting, Albert Einstein, Andrei Shleifer, Anton Chekhov, Asian financial crisis, congestion charging, corporate governance, crony capitalism, Dava Sobel, Deng Xiaoping, experimental economics, experimental subject, fear of failure, first-price auction, frictionless, frictionless market, George Akerlof, George Gilder, global village, Hernando de Soto, I think there is a world market for maybe five computers, income inequality, income per capita, informal economy, invisible hand, Isaac Newton, job-hopping, John Harrison: Longitude, John von Neumann, land reform, lone genius, manufacturing employment, market clearing, market design, market friction, market microstructure, means of production, Network effects, new economy, offshore financial centre, pez dispenser, pre–internet, price mechanism, profit maximization, profit motive, proxy bid, purchasing power parity, Ronald Coase, Ronald Reagan, sealed-bid auction, second-price auction, Silicon Valley, spectrum auction, Stewart Brand, The Market for Lemons, The Nature of the Firm, The Wealth of Nations by Adam Smith, trade liberalization, transaction costs, War on Poverty, Xiaogang Anhui farmers, yield management

The rules governing trading induce the experimental economy to “compute” the outcome that equates demand with supply. What is sometimes called the wisdom of the market results from the dispersion of decision-making. Markets make fewer big mistakes than planners. This is not because businesspeople are necessarily smarter than bureaucrats. The folklore of the computer industry, for example, relates a host of wrong predictions from those best placed to know. In 1954, John von Neumann, the mathematical genius who helped invent the computer, said, “I think there is a world market for maybe five computers.” In 1977, Ken Olsen, president of Digital Equipment Corp., said, “There is no reason anyone would want a computer in their home.” In 1981, Bill Gates, founder of Microsoft, is reported to have said, “640K ought to be enough for anybody.” Businesspeople are as prone to forecasting error as anyone else.
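
The market "computation" the first sentence describes can be sketched as a search for the price at which demand equals supply. The linear demand and supply curves below are my own illustrative assumptions, not McMillan's:

```python
# Find the market-clearing price by bisection: raise the price when there is
# excess demand, lower it when there is excess supply.

def market_clearing_price(demand, supply, low=0.0, high=100.0, tol=1e-6):
    """Bisection search for the price where demand(p) == supply(p)."""
    while high - low > tol:
        mid = (low + high) / 2
        if demand(mid) > supply(mid):   # excess demand: price must rise
            low = mid
        else:                           # excess supply: price must fall
            high = mid
    return (low + high) / 2

demand = lambda p: 100 - 2 * p   # buyers want less as the price rises
supply = lambda p: 10 + 3 * p    # sellers offer more as the price rises

p = market_clearing_price(demand, supply)
print(round(p, 2))  # 18.0
```

No trader runs this algorithm, of course; the point of the passage is that dispersed bidding and asking perform an equivalent search without any central calculation.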

**
Spin
** by
Robert Charles Wilson

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

airport security, Colonization of Mars, invention of writing, invisible hand, John von Neumann, Mahatma Gandhi, megacity, oil shale / tar sands, rolodex, Stephen Hawking

Given the inherent difficulty of sublight-speed travel as a way of exploring the galaxy, most technological cultures eventually settle for an expanding grid of von Neumann machines—which is what the replicators are—that costs nothing to maintain and generates a trickle of scientific information that expands exponentially over historical time." "Okay," I said, "I understand that. The Martian replicators aren't unique. They ran into what you call an ecology—" "A von Neumann ecology." (After the twentieth-century mathematician John von Neumann, who first suggested the possibility of self-reproducing machines.) "A von Neumann ecology, and they were absorbed by it. But that doesn't tell us anything about the Hypotheticals or the Spin." Jason pursed his lips impatiently. "Tyler, no. You don't understand. The Hypotheticals are the von Neumann ecology. They're one and the same." * * * * * At this point I had to step back and reconsider exactly who was in the room with me.

**
The Net Delusion: The Dark Side of Internet Freedom
** by
Evgeny Morozov

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

A Declaration of the Independence of Cyberspace, Ayatollah Khomeini, Berlin Wall, borderless world, Buckminster Fuller, Cass Sunstein, citizen journalism, cloud computing, cognitive dissonance, Columbine, computer age, conceptual framework, crowdsourcing, Dissolution of the Soviet Union, don't be evil, failed state, Fall of the Berlin Wall, Francis Fukuyama: the end of history, global village, Google Earth, illegal immigration, invention of radio, invention of the printing press, invisible hand, John von Neumann, Marshall McLuhan, Naomi Klein, Network effects, new economy, New Urbanism, pirate software, pre–internet, Productivity paradox, RAND corporation, Ronald Reagan, Ronald Reagan: Tear down this wall, Silicon Valley, Silicon Valley startup, Sinatra Doctrine, Skype, Slavoj Žižek, social graph, Steve Jobs, technoutopianism, The Wisdom of Crowds, urban planning, Washington Consensus, WikiLeaks, women in the workforce

How about feeding the world’s books into a scanner and dealing with the consequences later? Name a problem that has to deal with information, and Google is already on top of it.

Why the Ultimate Technological Fix Is Online

It’s not all Google’s fault. There is something about the Internet and its do-it-yourself ethos that invites an endless production of quick fixes, bringing to mind the mathematician John von Neumann’s insightful observation that “technological possibilities are irresistible to man. If man can go to the moon, he will. If he can control the climate, he will” (even though on that last point, von Neumann may have been a bit off). With the Internet, it seems, everything is irresistible, if only because everything is within easy grasp. It’s the Internet, not nuclear power, that is widely seen as the ultimate technological fix to all of humanity’s problems.

**
The Age of Spiritual Machines: When Computers Exceed Human Intelligence
** by
Ray Kurzweil

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Buckminster Fuller, call centre, cellular automata, combinatorial explosion, complexity theory, computer age, computer vision, cosmological constant, cosmological principle, Danny Hillis, double helix, Douglas Hofstadter, first square of the chessboard / second half of the chessboard, fudge factor, George Gilder, Gödel, Escher, Bach, I think there is a world market for maybe five computers, information retrieval, invention of movable type, Isaac Newton, iterative process, Jacquard loom, John von Neumann, Lao Tzu, Law of Accelerating Returns, mandelbrot fractal, Marshall McLuhan, Menlo Park, natural language processing, Norbert Wiener, optical character recognition, pattern recognition, phenotype, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, speech recognition, Steven Pinker, Stewart Brand, stochastic process, technological singularity, Ted Kaczynski, telepresence, the medium is the message, traveling salesman, Turing machine, Turing test, Whole Earth Review, Y2K

—Professor Marshal Foch, 1912
“I think there is a world market for maybe five computers.” —IBM Chairman Thomas Watson, 1943
“Computers in the future may weigh no more than 1.5 tons.” —Popular Mechanics, 1949
“It would appear that we have reached the limits of what is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in five years.” —John von Neumann, 1949
“There’s no reason for individuals to have a computer in their home.” —Ken Olsen, 1977
“640,000 bytes of memory ought to be enough for anybody.” —Bill Gates, 1981
“Long before the year 2000, the entire antiquated structure of college degrees, majors and credits will be a shambles.” —Alvin Toffler
“The Internet will catastrophically collapse in 1996.” —Robert Metcalfe (inventor of Ethernet), who, in 1997, ate his words (literally) in front of an audience

Now I get to toot my own horn, and can share with you those predictions of mine that worked out particularly well.

**
Secrets and Lies: Digital Security in a Networked World
** by
Bruce Schneier

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Ayatollah Khomeini, barriers to entry, business process, butterfly effect, cashless society, Columbine, defense in depth, double entry bookkeeping, fault tolerance, game design, IFF: identification friend or foe, John von Neumann, knapsack problem, mutually assured destruction, pez dispenser, pirate software, profit motive, Richard Feynman, risk tolerance, Silicon Valley, Simon Singh, slashdot, statistical model, Steve Ballmer, Steven Levy, the payments system, Y2K, Yogi Berra

Almost every computer security system that uses cryptography needs random numbers—for keys, unique values in protocols, and so on—and the security of those systems is often dependent on the randomness of those random numbers. If the random number generator is insecure, the entire system breaks. Depending on who you talk to, generating random numbers from a computer is either trivial or impossible. Theoretically, it’s impossible. John von Neumann, the father of computers, said: “Anyone who considers arithmetic methods of producing random digits is, of course, in a state of sin.” What he means is that it is impossible to get something truly random out of a deterministic beast like a computer. This is true, but luckily we can get by anyway. What we really need out of a random number generator is not that the numbers be truly random, but that they be unpredictable and irreproducible.
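
Von Neumann's "state of sin" remark, and Schneier's distinction between true randomness and unpredictability, can be seen directly in code. A minimal sketch of mine (not Schneier's example), using Python's standard library:

```python
# A deterministic generator seeded the same way reproduces its output
# exactly, so its "random" digits are predictable to anyone who knows the
# seed. For keys we instead draw from the operating system's entropy pool.

import random   # deterministic pseudorandom generator; fine for simulations
import secrets  # OS entropy source; designed for cryptographic use

# Deterministic: two generators with the same seed emit identical sequences.
a = random.Random(42)
b = random.Random(42)
print([a.randint(0, 9) for _ in range(5)] == [b.randint(0, 9) for _ in range(5)])  # True

# Unpredictable and irreproducible: suitable for keys and protocol nonces.
key = secrets.token_bytes(16)  # 128 bits of key material
print(len(key))  # 16
```

This is exactly the passage's point: what security needs from a generator is not philosophical randomness but unpredictability, which `secrets` provides and a seeded `random.Random` does not.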

**
Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software
** by
Scott Rosenberg

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

A Pattern Language, Berlin Wall, c2.com, call centre, collaborative editing, conceptual framework, continuous integration, Douglas Engelbart, Douglas Hofstadter, Dynabook, en.wikipedia.org, Firefox, Ford paid five dollars a day, Francis Fukuyama: the end of history, Grace Hopper, Gödel, Escher, Bach, Howard Rheingold, index card, Internet Archive, inventory management, Jaron Lanier, John von Neumann, knowledge worker, life extension, Loma Prieta earthquake, Menlo Park, Merlin Mann, new economy, Nicholas Carr, Norbert Wiener, pattern recognition, Paul Graham, Potemkin village, RAND corporation, Ray Kurzweil, Richard Stallman, Ronald Reagan, semantic web, side project, Silicon Valley, Singularitarianism, slashdot, software studies, South of Market, San Francisco, speech recognition, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Stewart Brand, Ted Nelson, Therac-25, thinkpad, Turing test, VA Linux, Vannevar Bush, Vernor Vinge, web application, Whole Earth Catalog, Y2K

Kay is the kind of maverick who has been honored by many in his field but only selectively followed. Yet other central figures in the development of modern software share his complaint that the software profession has taken a fundamental wrong turn. As early as 1978, John Backus, the father of Fortran, was expressing parallel views. Programming, Backus argued, had grown out of the ideas of John von Neumann, the mathematician who, at the dawn of computing in the 1940s, devised the basic structure of the “stored program” of sequentially executed instructions. But those ideas had become a straitjacket. “Von Neumann languages constantly keep our noses pressed in the dirt of address computation and the separate computation of single words,” he wrote. “While it was perhaps natural and inevitable that languages like Fortran and its successors should have developed out of the concept of the von Neumann computer as they did, the fact that such languages have dominated our thinking for twenty years is unfortunate.
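
Backus's complaint about "word-at-a-time" von Neumann languages is easiest to see side by side. The inner product below is my own illustration of the contrast (inner product was in fact the running example of his 1978 Turing Award lecture), written in Python rather than his FP notation:

```python
# "Word-at-a-time" von Neumann style: explicit addressing of single words
# and a running accumulator, versus a whole-value functional expression.

def inner_product_von_neumann(xs, ys):
    """Noses pressed in the dirt of address computation: index by index."""
    total = 0
    for i in range(len(xs)):
        total += xs[i] * ys[i]
    return total

def inner_product_functional(xs, ys):
    """State the whole computation as one combining expression."""
    return sum(x * y for x, y in zip(xs, ys))

print(inner_product_von_neumann([1, 2, 3], [4, 5, 6]))  # 32
print(inner_product_functional([1, 2, 3], [4, 5, 6]))   # 32
```

Both compute the same result; Backus's argument was about which style of thinking the language encourages.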

**
What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence
** by
John Brockman

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, agricultural Revolution, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic trading, artificial general intelligence, augmented reality, autonomous vehicles, bitcoin, blockchain, clean water, cognitive dissonance, Colonization of Mars, complexity theory, computer age, computer vision, constrained optimization, corporate personhood, cosmological principle, cryptocurrency, cuban missile crisis, Danny Hillis, dark matter, discrete time, Elon Musk, Emanuel Derman, endowment effect, epigenetics, Ernest Rutherford, experimental economics, Flash crash, friendly AI, Google Glasses, hive mind, income inequality, information trail, Internet of things, invention of writing, iterative process, Jaron Lanier, job automation, John von Neumann, Kevin Kelly, knowledge worker, loose coupling, microbiome, Moneyball by Michael Lewis explains big data, natural language processing, Network effects, Norbert Wiener, pattern recognition, Peter Singer: altruism, phenotype, planetary scale, Ray Kurzweil, recommendation engine, Republic of Letters, RFID, Richard Thaler, Rory Sutherland, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, strong AI, Stuxnet, superintelligent machines, supervolcano, the scientific method, The Wisdom of Crowds, theory of mind, Thorstein Veblen, too big to fail, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

AI techniques, like machine learning, are now routinely used for speech recognition, translation, behavior modeling, robotic control, risk management, and other applications. McKinsey predicts that these technologies will create more than $50 trillion of economic value by 2025. If this is accurate, we should expect dramatically increased investment soon. The recent successes are being driven by cheap computer power and plentiful training data. Modern AI is based on the theory of “rational agents,” arising from work on microeconomics in the 1940s by John von Neumann and others. AI systems can be thought of as trying to approximate rational behavior using limited resources. There’s an algorithm for computing the optimal action for achieving a desired outcome, but it’s computationally expensive. Experiments have found that simple learning algorithms with lots of training data often outperform complex hand-crafted models. Today’s systems primarily provide value by learning better statistical models and performing statistical inference for classification and decision making.
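
The "optimal action" computation the passage alludes to is the rational-agent argmax: pick the action with the highest expected utility. A toy sketch with numbers of my own invention (real decision problems make this maximization computationally expensive, which is the passage's point):

```python
# A rational agent approximates: choose the action maximizing expected
# utility over its possible outcomes.

def best_action(actions):
    """actions maps an action name to a list of (probability, utility) outcomes."""
    def expected_utility(outcomes):
        return sum(p * u for p, u in outcomes)
    return max(actions, key=lambda name: expected_utility(actions[name]))

actions = {
    "safe":   [(1.0, 50)],                # certain modest payoff: EU = 50
    "risky":  [(0.5, 120), (0.5, -40)],   # EU = 40
    "middle": [(0.8, 70), (0.2, 0)],      # EU = 56
}

print(best_action(actions))  # middle
```

Real AI systems rarely enumerate actions like this; as the passage notes, they approximate the same rational-agent ideal under limited computational resources.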

**
The God Delusion
** by
Richard Dawkins

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, Ayatollah Khomeini, Brownian motion, cosmological principle, David Attenborough, Desert Island Discs, double helix, en.wikipedia.org, experimental subject, Fellow of the Royal Society, gravity well, invisible hand, John von Neumann, luminiferous ether, Menlo Park, meta-analysis, Murray Gell-Mann, Necker cube, Peter Singer: altruism, phenotype, placebo effect, planetary scale, Ralph Waldo Emerson, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, stem cell, Stephen Hawking, Steven Pinker, the scientific method, theory of mind, Thorstein Veblen, trickle-down economics, unbiased observer

FAITH AND HOMOSEXUALITY In Afghanistan under the Taliban, the official punishment for homosexuality was execution, by the tasteful method of burial alive under a wall pushed over on top of the victim. The ‘crime’ itself being a private act, performed by consenting adults who were doing nobody else any harm, we again have here the classic hallmark of religious absolutism. My own country has no right to be smug. Private homosexuality was a criminal offence in Britain up until – astonishingly – 1967. In 1954 the British mathematician Alan Turing, a candidate along with John von Neumann for the title of father of the computer, committed suicide after being convicted of the criminal offence of homosexual behaviour in private. Admittedly Turing was not buried alive under a wall pushed over by a tank. He was offered a choice between two years in prison (you can imagine how the other prisoners would have treated him) and a course of hormone injections which could be said to amount to chemical castration, and would have caused him to grow breasts.

**
Connectography: Mapping the Future of Global Civilization
** by
Parag Khanna

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

1919 Motor Transport Corps convoy, 2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 3D printing, 9 dash line, additive manufacturing, Admiral Zheng, affirmative action, agricultural Revolution, Airbnb, Albert Einstein, amateurs talk tactics, professionals talk logistics, Amazon Mechanical Turk, Asian financial crisis, asset allocation, autonomous vehicles, banking crisis, Basel III, Berlin Wall, bitcoin, Black Swan, blockchain, borderless world, Boycotts of Israel, Branko Milanovic, BRICs, British Empire, business intelligence, call centre, capital controls, charter city, clean water, cloud computing, collateralized debt obligation, complexity theory, corporate governance, corporate social responsibility, credit crunch, crony capitalism, crowdsourcing, cryptocurrency, cuban missile crisis, data is the new oil, David Ricardo: comparative advantage, deglobalization, deindustrialization, dematerialisation, Deng Xiaoping, Detroit bankruptcy, diversification, Doha Development Round, edge city, Edward Snowden, Elon Musk, energy security, ethereum blockchain, European colonialism, eurozone crisis, failed state, Fall of the Berlin Wall, family office, Ferguson, Missouri, financial innovation, financial repression, forward guidance, global supply chain, global value chain, global village, Google Earth, Hernando de Soto, high net worth, Hyperloop, ice-free Arctic, if you build it, they will come, illegal immigration, income inequality, income per capita, industrial robot, informal economy, Infrastructure as a Service, interest rate swap, Internet of things, Isaac Newton, Jane Jacobs, Jaron Lanier, John von Neumann, Julian Assange, Just-in-time delivery, Kevin Kelly, Khyber Pass, Kibera, Kickstarter, labour market flexibility, labour mobility, LNG terminal, low cost carrier, manufacturing employment, mass affluent, megacity, Mercator projection, microcredit, mittelstand, Monroe Doctrine, mutually assured destruction, New Economic 
Geography, new economy, New Urbanism, offshore financial centre, oil rush, oil shale / tar sands, oil shock, openstreetmap, out of africa, Panamax, Peace of Westphalia, peak oil, Peter Thiel, Plutocrats, plutocrats, post-oil, post-Panamax, private military company, purchasing power parity, QWERTY keyboard, race to the bottom, Rana Plaza, rent-seeking, reserve currency, Robert Gordon, Robert Shiller, Robert Shiller, Ronald Coase, Scramble for Africa, Second Machine Age, sharing economy, Shenzhen was a fishing village, Silicon Valley, Silicon Valley startup, six sigma, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, South China Sea, South Sea Bubble, sovereign wealth fund, special economic zone, spice trade, Stuxnet, supply-chain management, sustainable-tourism, TaskRabbit, telepresence, the built environment, Tim Cook: Apple, trade route, transaction costs, UNCLOS, uranium enrichment, urban planning, urban sprawl, WikiLeaks, young professional, zero day

As Guangzhou has graduated from factory town to financial center, its glittering central business district features the aerodynamic 103-story-tall IFC tower, modern art museums one would expect to find in Zurich, and an opera house designed by Zaha Hadid. Just outside the city, the Singapore-run Knowledge City and Guangzhou Science City were built to resemble a low-rise version of Silicon Valley, with leafy boulevards that feature bronze statues of Albert Einstein and the mathematician John von Neumann. Singapore has opened a branch of its elite Chinese-language Hwa Chong Institution while also partnering with the local government to develop new curricula for the South China University of Technology, which already graduates some of the country’s top entrepreneurs establishing companies in digital industries such as cloud computing and GPS navigation, materials engineering, renewable energy, biotechnology, and pharmaceuticals.

**
How I Became a Quant: Insights From 25 of Wall Street's Elite
** by
Richard R. Lindsey,
Barry Schachter

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, algorithmic trading, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, asset allocation, asset-backed security, backtesting, bank run, banking crisis, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, business process, buy low sell high, capital asset pricing model, centre right, collateralized debt obligation, corporate governance, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, currency manipulation / currency intervention, discounted cash flows, disintermediation, diversification, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, full employment, George Akerlof, Gordon Gekko, hiring and firing, implied volatility, index fund, interest rate derivative, interest rate swap, John von Neumann, linear programming, Loma Prieta earthquake, Long Term Capital Management, margin call, market friction, market microstructure, martingale, merger arbitrage, Nick Leeson, P = NP, pattern recognition, pensions crisis, performance metric, prediction markets, profit maximization, purchasing power parity, quantitative trading / quantitative ﬁnance, QWERTY keyboard, RAND corporation, random walk, Ray Kurzweil, Richard Feynman, Richard Feynman, Richard Stallman, risk-adjusted returns, risk/return, shareholder value, Sharpe ratio, short selling, Silicon Valley, six sigma, sorting algorithm, statistical arbitrage, statistical model, stem cell, Steven Levy, stochastic process, systematic trading, technology bubble, The Great Moderation, the scientific method, too big to fail, trade route, transaction costs, transfer pricing, value at risk, volatility smile, Wiener process, yield curve, young professional

I had an immediate “I’ll show them” attitude. Thank you, whoever did that. As I said, the math core offerings were nothing new, so I looked around. Then came another fortuitous break: my discoveries of game theory and of my eventual thesis advisor, William F. Lucas. This guy is a math legend, though unlike most, as humble a man as you will ever meet. What made him a legend was an elegant counterexample to John von Neumann’s and Oskar Morgenstern’s conjecture that all cooperative N-person games have solutions according to their self-proclaimed definition of such. Until that time, cooperative game theory was thought uninteresting with no open issues. However, once Lucas proved the case was not closed, the whole subject blossomed with theories of solution concepts, some of which have proved extremely valuable in applications such as voting analysis and fair division.
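One of the solution concepts alluded to here, widely used in voting analysis and fair division, is the Shapley value: each player's average marginal contribution over all orderings of the players. A minimal sketch (the three-player majority game is an illustrative assumption, not an example from the book):

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Shapley value of each player for characteristic function v,
    which maps a coalition (set of players) to its payoff."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                S = frozenset(S)
                # Probability that i joins exactly after coalition S forms.
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (v(S | {i}) - v(S))
        phi[i] = total
    return phi

# Hypothetical 3-player majority game: a coalition wins (value 1)
# iff it contains at least 2 of the 3 players.
def v(coalition):
    return 1.0 if len(coalition) >= 2 else 0.0

print(shapley_values(["A", "B", "C"], v))  # symmetric game: 1/3 each
```

The same computation with weighted votes gives the Shapley-Shubik power index used in voting analysis.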

**
Smarter Faster Better: The Secrets of Being Productive in Life and Business
** by
Charles Duhigg

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Air France Flight 447, Asperger Syndrome, Atul Gawande, Black Swan, cognitive dissonance, Daniel Kahneman / Amos Tversky, David Brooks, epigenetics, Erik Brynjolfsson, framing effect, hiring and firing, index card, John von Neumann, knowledge worker, Lean Startup, Malcom McLean invented shipping containers, meta analysis, meta-analysis, new economy, Saturday Night Live, Silicon Valley, Silicon Valley startup, statistical model, Steve Jobs, the scientific method, theory of mind, Toyota Production System, Yom Kippur War

., Numerical Methods in Finance: Bordeaux, June 2010, Springer Proceedings in Mathematics, vol. 12 (Berlin: Springer Berlin Heidelberg, 2012); René Carmona et al., “An Introduction to Particle Methods with Financial Application,” in Numerical Methods in Finance, 3–49; Pierre Del Moral, Mean Field Simulation for Monte Carlo Integration (Boca Raton, Fla.: CRC Press, 2013); Roger Eckhardt, “Stan Ulam, John von Neumann, and the Monte Carlo Method,” Los Alamos Science, special issue (1987): 131–37. in the shape of a hat Andrew Hargadon and Robert I. Sutton, “Technology Brokering and Innovation in a Product Development Firm,” Administrative Science Quarterly 42, no. 4 (1997): 716–49; Roger P. Brown, “Polymers in Sport and Leisure,” Rapra Review Reports 12, no. 3 (November 2, 2001); Melissa Larson, “From Bombers to Bikes,” Quality 37, no. 9 (1998): 30.
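The Eckhardt article cited above recounts how Ulam and von Neumann devised the Monte Carlo method; as a minimal illustration of the idea (not an example from these sources), here is the textbook estimate of pi by random sampling:

```python
import random

def monte_carlo_pi(n, seed=0):
    """Estimate pi by sampling n points uniformly in the unit square
    and counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = sum(1 for _ in range(n)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n

print(monte_carlo_pi(100_000))  # roughly 3.14
```

The error of such an estimate shrinks like 1/sqrt(n), which is why Monte Carlo shines on problems, such as high-dimensional integrals, where deterministic methods blow up.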

**
Into the Black: The Extraordinary Untold Story of the First Flight of the Space Shuttle Columbia and the Astronauts Who Flew Her
** by
Rowland White,
Richard Truly

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Ayatollah Khomeini, Berlin Wall, cuban missile crisis, Fall of the Berlin Wall, Isaac Newton, John von Neumann, Maui Hawaii, Mercator projection, Ronald Reagan, V2 rocket

The consulting firm Mathematica Inc. was the creation of economist Oskar Morgenstern. As with Hans Mark, Morgenstern was resident in the United States as a consequence of Hitler’s annexation of Austria in 1938. A professor of economics at the University of Vienna, Morgenstern was visiting Princeton when the Nazis seized Vienna. He remained at the American university, where he met the Hungarian-born mathematical genius John von Neumann. A prodigy who as a child could memorize and recite the phone book, von Neumann had earned his PhD in mathematics at just twenty-two. Still in his twenties, he took up, alongside Albert Einstein, one of five professorships at Princeton’s Institute for Advanced Study. Von Neumann’s polymathic brilliance ranged from quantum mechanics to the hydrogen bomb. In 1944, along with fellow émigré Morgenstern, with the publication of the landmark Theory of Games and Economic Behavior, he created game theory, a mathematical model for the prediction and analysis of rational decision making.

**
Big Bang
** by
Simon Singh

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Albert Michelson, All science is either physics or stamp collecting, Andrew Wiles, anthropic principle, Arthur Eddington, Astronomia nova, Brownian motion, carbon-based life, Cepheid variable, Chance favours the prepared mind, Commentariolus, Copley Medal, cosmic abundance, cosmic microwave background, cosmological constant, cosmological principle, dark matter, Dava Sobel, Defenestration of Prague, discovery of penicillin, Dmitri Mendeleev, Edmond Halley, Edward Charles Pickering, Eratosthenes, Ernest Rutherford, Erwin Freundlich, Fellow of the Royal Society, fudge factor, Hans Lippershey, Harlow Shapley and Heber Curtis, Harvard Computers: women astronomers, Henri Poincaré, horn antenna, if you see hoof prints, think horses—not zebras, Index librorum prohibitorum, invention of the telescope, Isaac Newton, John von Neumann, Karl Jansky, Louis Daguerre, Louis Pasteur, luminiferous ether, Magellanic Cloud, Murray Gell-Mann, music of the spheres, Olbers’ paradox, On the Revolutions of the Heavenly Spheres, Paul Erdős, retrograde motion, Richard Feynman, Richard Feynman, scientific mainstream, Simon Singh, Solar eclipse in 1919, Stephen Hawking, the scientific method, Thomas Kuhn: the structure of scientific revolutions, unbiased observer, V2 rocket, Wilhelm Olbers, William of Occam

RICHARD DAWKINS (1941- ), English biologist Science is nothing but trained and organised common sense differing from the latter only as a veteran may differ from a raw recruit; and its methods differ from those of common sense only as far as the guardsman’s cut and thrust differ from the manner in which a savage wields his club. THOMAS HENRY HUXLEY (1825-95), English biologist The sciences do not try to explain, they hardly even try to interpret, they mainly make models. By a model is meant a mathematical construct which, with the addition of certain verbal interpretations, describes observed phenomena. The justification of such a mathematical construct is solely and precisely that it is expected to work. JOHN VON NEUMANN (1903-57), Hungarian-born mathematician The science of today is the technology of tomorrow. EDWARD TELLER (1908-2003), American physicist Every great advance in science has issued from a new audacity of imagination. JOHN DEWEY (1859-1952), American philosopher Four stages of acceptance: i) this is worthless nonsense, ii) this is an interesting, but perverse, point of view, iii) this is true, but quite unimportant, iv) I always said so.

**
Culture and Prosperity: The Truth About Markets - Why Some Nations Are Rich but Most Remain Poor
** by
John Kay

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Asian financial crisis, Barry Marshall: ulcers, Berlin Wall, Big bang: deregulation of the City of London, California gold rush, complexity theory, computer age, constrained optimization, corporate governance, corporate social responsibility, correlation does not imply causation, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, Donald Trump, double entry bookkeeping, double helix, Edward Lloyd's coffeehouse, equity premium, Ernest Rutherford, European colonialism, experimental economics, Exxon Valdez, failed state, financial innovation, Francis Fukuyama: the end of history, George Akerlof, George Gilder, greed is good, haute couture, illegal immigration, income inequality, invention of the telephone, invention of the wheel, invisible hand, John Nash: game theory, John von Neumann, Kevin Kelly, knowledge economy, labour market flexibility, late capitalism, Long Term Capital Management, loss aversion, Mahatma Gandhi, market bubble, market clearing, market fundamentalism, means of production, Menlo Park, Mikhail Gorbachev, money: store of value / unit of account / medium of exchange, moral hazard, Naomi Klein, Nash equilibrium, new economy, oil shale / tar sands, oil shock, pets.com, popular electronics, price discrimination, price mechanism, prisoner's dilemma, profit maximization, purchasing power parity, QWERTY keyboard, Ralph Nader, RAND corporation, random walk, rent-seeking, risk tolerance, road to serfdom, Ronald Coase, Ronald Reagan, second-price auction, shareholder value, Silicon Valley, Simon Kuznets, South Sea Bubble, Steve Jobs, telemarketer, The Chicago School, The Death and Life of Great American Cities, The Market for Lemons, The Nature of the Firm, The Predators' Ball, The Wealth of Nations by Adam Smith, Thorstein Veblen, total factor productivity, transaction costs, tulip mania, urban decay, Washington Consensus, women in the workforce, yield curve, yield management

Institutional, or transactions costs, economics recognizes that economic lives are lived in and through economic institutions. Behavioral economics contemplates alternative assumptions about motives and the nature of economic behavior. I will introduce game theory and institutional economics in the present chapter and take up behavioral economics in the chapter that follows. Economic Theory After Arrow and Debreu In 1944, John von Neumann and Oskar Morgenstern published The Theory of Games and Economic Behavior. This approach was, after an interval, to revolutionize economic theory. The analysis of competitive markets supposes anonymous interactions among many buyers and many sellers. The fragmentation and impersonality of these markets leads to incentive compatibility: there is no need to consider the behavior and responses of other market participants.
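The contrast drawn here between anonymous markets and strategic interaction is exactly what game theory formalizes: each player's best action depends on what the others do. A minimal sketch that finds the pure-strategy Nash equilibria of a two-player game, using the classic prisoner's dilemma payoffs (the numbers are illustrative, not from the book):

```python
from itertools import product

# payoffs[(row, col)] = (payoff to row player, payoff to column player)
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def pure_nash(payoffs):
    """Profiles where neither player gains by deviating unilaterally."""
    rows = {r for r, _ in payoffs}
    cols = {c for _, c in payoffs}
    eq = []
    for r, c in product(rows, cols):
        u_r, u_c = payoffs[(r, c)]
        if (all(payoffs[(r2, c)][0] <= u_r for r2 in rows)
                and all(payoffs[(r, c2)][1] <= u_c for c2 in cols)):
            eq.append((r, c))
    return eq

print(pure_nash(payoffs))  # [('defect', 'defect')]
```

Mutual defection is the unique equilibrium even though both players would prefer mutual cooperation, a conclusion no model of anonymous price-taking behavior can express.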

**
Near and Distant Neighbors: A New History of Soviet Intelligence
** by
Jonathan Haslam

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Benoit Mandelbrot, Berlin Wall, Bolshevik threat, Bretton Woods, British Empire, cuban missile crisis, falling living standards, John von Neumann, Robert Hanssen: Double agent, Ronald Reagan, Vladimir Vetrov: Farewell Dossier, éminence grise

It took the best part of six months, given the scale of the internal and external Soviet communications systems, and was implemented in the strictest secrecy. Finally, all Moscow’s communications with the outside world were suddenly severed. For Washington, this was Black Friday: October 29, 1948. When signals resumed on Monday, nothing could be deciphered. Computer Catch-Up At the Institute for Advanced Study the publication in 1946 by the Hungarian-born mathematician John von Neumann of a seminal article on the construction of the computer and the appearance of the first American civilian computer, ENIAC, that same year served to warn the Russians that the United States was moving into more innovative computation. This was despite the fact that it was a symbolic rather than a genuine threat, since the machine was digital with an outside program driven slowly, step by step, with only a single memory.

**
Stock Market Wizards: Interviews With America's Top Stock Traders
** by
Jack D. Schwager

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Asian financial crisis, banking crisis, barriers to entry, Black-Scholes formula, commodity trading advisor, computer vision, East Village, financial independence, fixed income, implied volatility, index fund, Jeff Bezos, John von Neumann, locking in a profit, Long Term Capital Management, margin call, paper trading, passive investing, pattern recognition, random walk, risk tolerance, risk-adjusted returns, short selling, Silicon Valley, statistical arbitrage, the scientific method, transaction costs, Y2K

Shaw must be fond of cacti, which lined the windowsills and included a tree-size plant in the corner of the room. A large, irregular-polygon-shaped, brushed aluminum table, which served as a desk on one end and a conference area on the other, dominated the center of the room. We sat directly across from each other at the conference end. THE QUANTITATIVE EDGE The traditional von Neumann machine, named after John von Neumann, has a single central processing unit (CPU) connected to a single memory unit. Originally, the two were well matched in speed and size. Over time, however, as processors became faster and memories got larger, the connection between the two—the time it takes for the CPU to get things out of memory, perform the computations, and place the results back into memory—became more and more of a bottleneck.

**
Capital Ideas: The Improbable Origins of Modern Wall Street
** by
Peter L. Bernstein

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, asset allocation, backtesting, Benoit Mandelbrot, Black-Scholes formula, Bonfire of the Vanities, Brownian motion, buy low sell high, capital asset pricing model, debt deflation, diversified portfolio, Eugene Fama: efficient market hypothesis, financial innovation, financial intermediation, fixed income, full employment, implied volatility, index arbitrage, index fund, interest rate swap, invisible hand, John von Neumann, Joseph Schumpeter, law of one price, linear programming, Louis Bachelier, mandelbrot fractal, martingale, means of production, new economy, New Journalism, profit maximization, Ralph Nader, RAND corporation, random walk, Richard Thaler, risk/return, Robert Shiller, Robert Shiller, Ronald Reagan, stochastic process, the market place, The Predators' Ball, the scientific method, The Wealth of Nations by Adam Smith, Thorstein Veblen, transaction costs, transfer pricing, zero-coupon bond

The investor’s sensitivity to changing wealth and risk is known as the utility function, and the elements that determine the shape of the utility function are obscure. As Roy put it, “A man who seeks advice about his actions will not be grateful for the suggestion that he maximize his expected utility.” The complexity of the subject has attracted the attention of some of the best thinkers of our time, including Kenneth Arrow, a Nobel Prize-winner, and Oskar Morgenstern and John von Neumann, famous for having invented game theory. But this is not the only feature of the Markowitz paradigm with controversial implications. The calculation of the Efficient Frontier is a task that would defy the abilities and capabilities of many investors, and even the capacities of many computers, so it is fair to ask whether the relationship between risk and return is as neat as Markowitz postulates.
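The efficient-frontier calculation Bernstein describes reduces, in the two-asset case, to a one-dimensional search over portfolio weights; a minimal sketch, with hypothetical return, volatility, and correlation figures:

```python
import math

def portfolio(w, mu1, mu2, s1, s2, rho):
    """(Expected return, volatility) of a portfolio with weight w in
    asset 1 and 1 - w in asset 2, given means, vols, and correlation."""
    mu = w * mu1 + (1 - w) * mu2
    var = (w * s1) ** 2 + ((1 - w) * s2) ** 2 + 2 * w * (1 - w) * rho * s1 * s2
    return mu, math.sqrt(var)

# Hypothetical assets: a stock (8% return, 20% vol) and a bond
# (3% return, 7% vol), correlation 0.2.
mu1, s1, mu2, s2, rho = 0.08, 0.20, 0.03, 0.07, 0.2
points = [portfolio(w / 100, mu1, mu2, s1, s2, rho) for w in range(101)]
min_risk = min(points, key=lambda p: p[1])
print(f"minimum-variance portfolio: return {min_risk[0]:.3f}, vol {min_risk[1]:.3f}")
```

Even in this toy case diversification pushes portfolio volatility below that of the safer asset alone; with dozens of correlated assets the search becomes the large optimization problem the passage alludes to.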

**
Masterminds of Programming: Conversations With the Creators of Major Programming Languages
** by
Federico Biancuzzi,
Shane Warden

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

business intelligence, business process, cellular automata, cloud computing, complexity theory, conceptual framework, continuous integration, data acquisition, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, general-purpose programming language, HyperCard, information retrieval, iterative process, John von Neumann, linear programming, loose coupling, Mars Rover, millennium bug, NP-complete, Paul Graham, performance metric, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

Aho is the Lawrence Gussman professor in the computer science department at Columbia University. He served as chair of the department from 1995 to 1997, and in the spring of 2003. Professor Aho has a B.A.Sc. in engineering physics from the University of Toronto and a Ph.D. in electrical engineering/computer science from Princeton University. Professor Aho won the Great Teacher Award for 2003 from the Society of Columbia Graduates. Professor Aho has won the IEEE John von Neumann Medal and is a Member of the U.S. National Academy of Engineering and the American Academy of Arts and Sciences. He received honorary doctorates from the Universities of Helsinki and Waterloo, and is a Fellow of the American Association for the Advancement of Science, the ACM, Bell Labs, and the IEEE. Professor Aho is well known for his many papers and books on algorithms and data structures, programming languages, compilers, and the foundations of computer science.

**
Thinking, Fast and Slow
** by
Daniel Kahneman

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

Albert Einstein, Atul Gawande, availability heuristic, Black Swan, Cass Sunstein, Checklist Manifesto, choice architecture, cognitive bias, complexity theory, correlation coefficient, correlation does not imply causation, Daniel Kahneman / Amos Tversky, delayed gratification, demand response, endowment effect, experimental economics, experimental subject, Exxon Valdez, feminist movement, framing effect, hindsight bias, index card, job satisfaction, John von Neumann, libertarian paternalism, loss aversion, medical residency, mental accounting, meta analysis, meta-analysis, nudge unit, pattern recognition, pre–internet, price anchoring, quantitative trading / quantitative ﬁnance, random walk, Richard Thaler, risk tolerance, Ronald Reagan, The Chicago School, The Wisdom of Crowds, transaction costs, union organizing, Walter Mischel, Yom Kippur War

Expected utility theory was not intended as a psychological model; it was a logic of choice, based on elementary rules (axioms) of rationality. Consider this example: If you prefer an apple to a banana, then you also prefer a 10% chance to win an apple to a 10% chance to win a banana. The apple and the banana stand for any objects of choice (including gambles), and the 10% chance stands for any probability. The mathematician John von Neumann, one of the giant intellectual figures of the twentieth century, and the economist Oskar Morgenstern had derived their theory of rational choice between gambles from a few axioms. Economists adopted expected utility theory in a dual role: as a logic that prescribes how decisions should be made, and as a description of how Econs make choices. Amos and I were psychologists, however, and we set out to understand how Humans actually make risky choices, without assuming anything about their rationality.
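Kahneman's apple/banana example can be checked mechanically: with any utility numbers that rank an apple above a banana, expected utility ranks a 10% chance of an apple above a 10% chance of a banana. A sketch with invented utilities:

```python
def expected_utility(gamble, u):
    """gamble: list of (probability, prize) pairs; u: utility per prize."""
    return sum(p * u[prize] for p, prize in gamble)

# Illustrative utilities: this agent prefers an apple to a banana.
u = {"apple": 2.0, "banana": 1.0, "nothing": 0.0}

sure_apple  = [(1.0, "apple")]
sure_banana = [(1.0, "banana")]
apple_10    = [(0.10, "apple"),  (0.90, "nothing")]
banana_10   = [(0.10, "banana"), (0.90, "nothing")]

# The axioms force the same ranking at any probability level:
assert expected_utility(sure_apple, u) > expected_utility(sure_banana, u)
assert expected_utility(apple_10, u) > expected_utility(banana_10, u)
print("10% apple beats 10% banana, as the axioms require")
```

The point of Kahneman and Tversky's work was precisely that Humans, unlike this calculation, do not always respect such probability-scaling consistency.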

**
Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations
** by
Thomas L. Friedman

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

3D printing, additive manufacturing, affirmative action, Airbnb, AltaVista, Amazon Web Services, autonomous vehicles, Ayatollah Khomeini, barriers to entry, Berlin Wall, Bernie Sanders, bitcoin, blockchain, business process, call centre, centre right, Clayton Christensen, clean water, cloud computing, corporate social responsibility, crowdsourcing, David Brooks, demand response, demographic dividend, demographic transition, Deng Xiaoping, Donald Trump, Erik Brynjolfsson, failed state, Fall of the Berlin Wall, Ferguson, Missouri, first square of the chessboard / second half of the chessboard, Flash crash, game design, gig economy, global supply chain, illegal immigration, immigration reform, income inequality, indoor plumbing, Internet of things, invention of the steam engine, inventory management, Jeff Bezos, job automation, John von Neumann, Khan Academy, Kickstarter, knowledge economy, knowledge worker, land tenure, linear programming, low skilled workers, Lyft, Mark Zuckerberg, Maui Hawaii, Menlo Park, Mikhail Gorbachev, mutually assured destruction, pattern recognition, planetary scale, pull request, Ralph Waldo Emerson, ransomware, Ray Kurzweil, Richard Florida, ride hailing / ride sharing, Robert Gordon, Ronald Reagan, Second Machine Age, self-driving car, shareholder value, sharing economy, Silicon Valley, Skype, smart cities, South China Sea, Steve Jobs, TaskRabbit, Thomas L Friedman, transaction costs, Transnistria, urban decay, urban planning, Watson beat the top human players on Jeopardy!, WikiLeaks, women in the workforce, Y2K, Yogi Berra

Because of the Humphrey/Mondale/Fraser/Freeman connection, we felt that Minnesota was special, and so we must be special too. That, and the creosote. Norm Ornstein Dear Tom, It is an oddity often remarked upon that at the turn of the century, one small, obscure provincial area of Hungary, then under the benign tutelage of Emperor Franz Josef, spawned several towering figures in the fields of physics and mathematics—among them Edward Teller, George de Hevesy, Eugene Wigner, Leo Szilard and John von Neumann. This group, many of them Nobel Prize winners, all of them products of the Jewish middle class, were referred to in their diaspora as the “Men from Mars” because of their obscure provenance and their thick Finno-Ugric accents. What explosive tinder in this remote corner of the Carpathians had nourished such a forest fire of genius? No one knows. Many years later, the Jewish middle class of remote and obscure St.

**
Write Great Code, Volume 1
** by
Randall Hyde

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

AltaVista, business process, John von Neumann, locality of reference, Von Neumann architecture, Y2K

These two hardware components may have as large a performance impact on your software as the CPU’s speed. Knowing about memory performance characteristics, data locality, and cache operation can help you design software that runs as fast as possible. Writing great code requires a strong knowledge of the computer’s architecture. 6.1 The Basic System Components The basic operational design of a computer system is called its architecture. John von Neumann, a pioneer in computer design, is given credit for the principal architecture in use today. For example, the 80x86 family uses the von Neumann architecture (VNA). A typical von Neumann system has three major components: the central processing unit (CPU), memory, and input/output (I/O), as shown in Figure 6-1. Figure 6-1. Typical von Neumann machine In VNA machines, like the 80x86, the CPU is where all the action takes place.
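The single-memory design described here can be illustrated with a toy simulator: program and data live in the same memory, and the CPU repeatedly fetches and executes one instruction at a time. The tiny instruction set below is invented for illustration; it is not the 80x86's.

```python
# Toy von Neumann machine: one memory holds both the program
# (cells 0-3) and the data (cells 4-6).

def run(memory):
    """Fetch-execute loop: each instruction is a (opcode, operand) pair."""
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch the next instruction
        pc += 1
        if op == "LOAD":    acc = memory[arg]       # read data from memory
        elif op == "ADD":   acc += memory[arg]
        elif op == "STORE": memory[arg] = acc       # write data to memory
        elif op == "HALT":  return memory

# Program: memory[6] = memory[4] + memory[5], i.e. 2 + 3.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])  # 5
```

Every LOAD and STORE crosses the single CPU-memory connection, which is why that path, rather than raw CPU speed, so often limits performance on real VNA machines.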

**
The Beginning of Infinity: Explanations That Transform the World
** by
David Deutsch

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

agricultural Revolution, Albert Michelson, anthropic principle, artificial general intelligence, Bonfire of the Vanities, conceptual framework, cosmological principle, dark matter, David Attenborough, discovery of DNA, Douglas Hofstadter, Eratosthenes, Ernest Rutherford, first-past-the-post, Georg Cantor, Gödel, Escher, Bach, illegal immigration, invention of movable type, Isaac Newton, Islamic Golden Age, Jacquard loom, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Loebner Prize, Louis Pasteur, pattern recognition, Richard Feynman, Richard Feynman, Search for Extraterrestrial Intelligence, Stephen Hawking, supervolcano, technological singularity, The Coming Technological Singularity, the scientific method, Thomas Malthus, Thorstein Veblen, Turing test, Vernor Vinge, Whole Earth Review, William of Occam

As a matter of fact, there is no such thing as mathematical ‘inspiration’ (mathematical knowledge coming from an infallible source, traditionally God): as I explained in Chapter 8, our knowledge of mathematics is not infallible. But if Representative Mills meant that mathematicians are, or somehow ought to be, society’s best judges of fairness, then he was simply mistaken.* The National Academy of Sciences panel that reported to Congress in 1948 included the mathematician and physicist John von Neumann. It decided that a rule invented by the statistician Joseph Adna Hill (which is the one in use today) is the most impartial between states. But the mathematicians Michel Balinski and Peyton Young have since concluded that it favours smaller states. This illustrates again that different criteria of ‘impartiality’ favour different apportionment rules, and which of them is the right criterion cannot be determined by mathematics.
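The rule invented by Joseph Adna Hill, known today as the Huntington-Hill or equal-proportions method, can be sketched directly: every state starts with one seat, and each remaining seat goes to the state with the highest priority value pop / sqrt(n(n+1)), where n is its current seat count. The state populations below are hypothetical.

```python
import heapq
from math import sqrt

def huntington_hill(populations, seats):
    """Apportion `seats` among states by the Hill (equal proportions) method."""
    alloc = {s: 1 for s in populations}          # every state gets one seat
    heap = [(-pop / sqrt(1 * 2), s) for s, pop in populations.items()]
    heapq.heapify(heap)                          # max-priority via negation
    for _ in range(seats - len(populations)):
        _, s = heapq.heappop(heap)               # state with highest priority
        alloc[s] += 1
        n = alloc[s]
        heapq.heappush(heap, (-populations[s] / sqrt(n * (n + 1)), s))
    return alloc

# Hypothetical populations; divide 10 seats among three states.
pops = {"A": 53000, "B": 24000, "C": 23000}
print(huntington_hill(pops, 10))  # {'A': 5, 'B': 3, 'C': 2}
```

Balinski and Young's point survives the arithmetic: the sqrt(n(n+1)) divisor is only one of several defensible choices, and swapping it for another (Webster's n + 1/2, say) can shift a seat between large and small states.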

**
On Thermonuclear War
** by
Herman Kahn

Amazon: amazon.com — amazon.co.uk — amazon.de — amazon.fr

British Empire, defense in depth, John von Neumann, mutually assured destruction, New Journalism, oil shale / tar sands, Project Plowshare, RAND corporation

Henry Kissinger remarks in his book of a few years ago, Nuclear Weapons and Foreign Policy, that the fires of Prometheus had then been unleashed. This is an understatement of the things that are now technologically feasible but that "cost a little too much." I have not seen any figures, but I surmise that relatively thin margins of cost prevent us from doing such extraordinary projects as melting ice caps and diverting ocean currents. The coming crisis in technology was described by the late John von Neumann in an article entitled "Can We Survive Technology?" To quote von Neumann: "'The great globe itself is in a rapidly maturing crisis—a crisis attributable to the fact that the environment in which technological progress must occur has become both undersized and underorganized. . . . "In the first half of this century the accelerating Industrial Revolution encountered an absolute limitation—not on technological progress as such, but on an essential safety factor.