Claude Shannon: information theory

94 results


pages: 550 words: 154,725

The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner

Albert Einstein, back-to-the-land, Black Swan, business climate, Charles Babbage, Claude Shannon: information theory, Clayton Christensen, complexity theory, corporate governance, cuban missile crisis, Dennis Ritchie, Edward Thorp, Fairchild Semiconductor, Henry Singleton, horn antenna, Hush-A-Phone, information retrieval, invention of the telephone, James Watt: steam engine, Karl Jansky, Ken Thompson, knowledge economy, Leonard Kleinrock, machine readable, Metcalfe’s law, Nicholas Carr, Norbert Wiener, Picturephone, Richard Feynman, Robert Metcalfe, Russell Ohl, Sand Hill Road, Silicon Valley, Skype, space junk, Steve Jobs, Telecommunications Act of 1996, Teledyne, traveling salesman, undersea cable, uranium enrichment, vertical integration, William Shockley: the traitorous eight

Shannon Collection, Library of Congress. 3 Vannevar Bush, recommendation for Claude Shannon for a National Research Fellowship, undated, circa late 1939. Shannon Collection, Library of Congress. 4 Erico Marui Guizzo, “The Essential Message: Claude Shannon and the Making of Information Theory” (master’s thesis, MIT, 2003). 5 Len Kleinrock, a former student of Shannon’s, author interview. 6 Liversidge, “Profile of Claude Shannon.” 7 Claude Shannon, letter to Dr. V. Bush, December 13, 1939. Shannon Collection, Library of Congress. 8 Liversidge, “Profile of Claude Shannon.” Biographical facts relating to Shannon’s father are in a personal letter Shannon wrote, October 20, 1981, to Ms.

John Pierce likened the surprise he felt upon encountering his friend’s ideas to the dropping of a powerful explosive. 29 Shannon, Kyoto Prize speech. 30 Robert Lucky, Silicon Dreams: Information, Man, and Machine (New York: St. Martin’s Press, 1991). 31 “Information Theory,” from Shannon’s article on the topic in Encyclopedia Britannica, 14th ed., 1968. 32 John Robinson Pierce, quoted in M. Mitchell Waldrop, “Claude Shannon: Reluctant Father of the Digital Age,” Technology Review, January 7, 2002. For my explication of information theory, I owe a debt to a number of people who have written on Shannon’s theory: John Pierce, Warren Weaver, David Kahn, Robert Gallager, Bob Lucky, Neil Sloane, Aaron Wyner, John Horgan, William Poundstone, and Erico Marui Guizzo.

Many of these exploits are detailed in William Poundstone’s book Fortune’s Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street (New York: Hill & Wang, 2005). 27 Arthur Lewbel, author interview. 28 Claude Shannon, “Scientific Aspects of Juggling,” in Collected Papers, edited by N. J. A. Sloane and Aaron D. Wyner (New York: IEEE Press, 1993). 29 Arthur Lewbel, “A Personal Tribute to Claude Shannon,” http://www2.bc.edu/~lewbel/Shannon.html. 30 Shannon still rarely answered letters or spoke with reporters. The exception was when he received a letter about chess or juggling, which he would often answer promptly, even if the correspondent was a high school student. 31 John Horgan, “Claude E. Shannon: Unicyclist, Juggler and Father of Information Theory,” Scientific American, January 1990. 32 Rudi Kompfner, note to John Pierce, June 1, 1977.


pages: 855 words: 178,507

The Information: A History, a Theory, a Flood by James Gleick

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, bank run, bioinformatics, Bletchley Park, Brownian motion, butterfly effect, Charles Babbage, citation needed, classic study, Claude Shannon: information theory, clockwork universe, computer age, Computing Machinery and Intelligence, conceptual framework, crowdsourcing, death of newspapers, discovery of DNA, Donald Knuth, double helix, Douglas Hofstadter, en.wikipedia.org, Eratosthenes, Fellow of the Royal Society, Gregor Mendel, Gödel, Escher, Bach, Henri Poincaré, Honoré de Balzac, index card, informal economy, information retrieval, invention of the printing press, invention of writing, Isaac Newton, Jacquard loom, Jaron Lanier, jimmy wales, Johannes Kepler, John von Neumann, Joseph-Marie Jacquard, Lewis Mumford, lifelogging, Louis Daguerre, machine translation, Marshall McLuhan, Menlo Park, microbiome, Milgram experiment, Network effects, New Journalism, Norbert Wiener, Norman Macrae, On the Economy of Machinery and Manufactures, PageRank, pattern recognition, phenotype, Pierre-Simon Laplace, pre–internet, quantum cryptography, Ralph Waldo Emerson, RAND corporation, reversible computing, Richard Feynman, Rubik’s Cube, Simon Singh, Socratic dialogue, Stephen Hawking, Steven Pinker, stochastic process, talking drums, the High Line, The Wisdom of Crowds, transcontinental railway, Turing machine, Turing test, women in the workforce, yottabyte

♦ “D MEASURES, IN A SENSE, HOW MUCH A TEXT”: “Communication Theory of Secrecy Systems,” in Claude Shannon, Collected Papers, 85. ♦ “THE ENEMY IS NO BETTER OFF”: Ibid., 97. ♦ “THE ‘MEANING’ OF A MESSAGE IS GENERALLY IRRELEVANT”: “Communication Theory—Exposition of Fundamentals,” IRE Transactions on Information Theory, no. 1 (February 1950), in Claude Shannon, Collected Papers, 173. ♦ “WHAT GIBBS DID FOR PHYSICAL CHEMISTRY”: Warren Weaver letter to Claude Shannon, 27 January 1949, Manuscript Division, Library of Congress. ♦ “SOMETHING OF A DELAYED ACTION BOMB”: John R. Pierce, “The Early Days of Information Theory,” IEEE Transactions on Information Theory 19, no. 1 (1973): 4

♦ “THE CAPACITY OF A SYSTEM TO TRANSMIT”: R. V. L. Hartley, “Transmission of Information,” 537. 7. INFORMATION THEORY ♦ “PERHAPS COMING UP WITH A THEORY”: Jon Barwise, “Information and Circumstance,” Notre Dame Journal of Formal Logic 27, no. 3 (1986): 324. ♦ SAID NOTHING TO EACH OTHER ABOUT THEIR WORK: Shannon interview with Robert Price: “A Conversation with Claude Shannon: One Man’s Approach to Problem Solving,” IEEE Communications Magazine 22 (1984): 125; cf. Alan Turing to Claude Shannon, 3 June 1953, Manuscript Division, Library of Congress. ♦ “NO, I’M NOT INTERESTED IN DEVELOPING A POWERFUL BRAIN”: Andrew Hodges, Alan Turing: The Enigma (London: Vintage, 1992), 251

The actual lines, from Cummings’s poem “voices to voices, lip to lip,” are: “who cares if some oneeyed son of a bitch / invents an instrument to measure Spring with?” ♦ A MACHINE THAT WOULD REPAIR ITSELF: Claude Shannon to Irene Angus, 8 August 1952, Manuscript Division, Library of Congress. ♦ “WHAT HAPPENS IF YOU SWITCH ON ONE OF THESE MECHANICAL COMPUTERS”: Robert McCracken, “The Sinister Machines,” Wyoming Tribune, March 1954. ♦ “INFORMATION THEORY, PHOTOSYNTHESIS, AND RELIGION”: Peter Elias, “Two Famous Papers,” IRE Transactions on Information Theory 4, no. 3 (1958): 99. ♦ “WE HAVE HEARD OF ‘ENTROPIES’ ”: E. Colin Cherry, On Human Communication (Cambridge, Mass.: MIT Press, 1957), 214. 9.


pages: 389 words: 109,207

Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street by William Poundstone

"RICO laws" OR "Racketeer Influenced and Corrupt Organizations", Albert Einstein, anti-communist, asset allocation, Bear Stearns, beat the dealer, Benoit Mandelbrot, Black Monday: stock market crash in 1987, Black-Scholes formula, Bletchley Park, Brownian motion, buy and hold, buy low sell high, capital asset pricing model, Claude Shannon: information theory, computer age, correlation coefficient, diversified portfolio, Edward Thorp, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial engineering, Henry Singleton, high net worth, index fund, interest rate swap, Isaac Newton, Johann Wolfgang von Goethe, John Meriwether, John von Neumann, junk bonds, Kenneth Arrow, Long Term Capital Management, Louis Bachelier, margin call, market bubble, market fundamentalism, Marshall McLuhan, Michael Milken, Myron Scholes, New Journalism, Norbert Wiener, offshore financial centre, Paul Samuelson, publish or perish, quantitative trading / quantitative finance, random walk, risk free rate, risk tolerance, risk-adjusted returns, Robert Shiller, Ronald Reagan, Rubik’s Cube, short selling, speech recognition, statistical arbitrage, Teledyne, The Predators' Ball, The Wealth of Nations by Adam Smith, transaction costs, traveling salesman, value at risk, zero-coupon bond, zero-sum game

Several of its scientists, notably Billy Kluver, collaborated with the New York avant-garde: John Cage, Robert Rauschenberg, Nam June Paik, Andy Warhol, David Tudor, and others, some of whom lived and worked steps away from Bell Labs’ Manhattan building on West Street. Many of these artists were acquainted with at least the name of Claude Shannon and the conceptual gist of his theory. To people like Cage and Rauschenberg, who were exploring how minimal a work of music or art may be, information theory appeared to have something to say—even if no one was ever entirely sure what. Shannon came to feel that information theory had been over-sold. In a 1956 editorial he gently derided the information theory “bandwagon.” People who did not understand the theory deeply were seizing on it as a trendy metaphor and overstating its relevance to fields remote from its origin.

PART ONE Entropy Claude Shannon LIFE IS A GAMBLE. There are few sure things, least of all in the competitive world of academic recruitment. Claude Shannon was as close to a sure thing as existed. That is why the Massachusetts Institute of Technology was prepared to do what was necessary to lure Shannon away from AT&T’s Bell Labs, and why the institute was delighted when Shannon became a visiting professor in 1956. Shannon had done what practically no one else had done since the Renaissance. He had single-handedly invented an important new science. Shannon’s information theory is an abstract science of communication that lies behind computers, the Internet, and all digital media.

Weaver’s essay presented information theory as a humanistic discipline—perhaps misleadingly so. Strongly influenced by Shannon, media theorist Marshall McLuhan coined the term “information age” in Understanding Media (1964). Oracular as some of his pronouncements were, McLuhan spoke loud and clear with that concise coinage. It captured the way the electronic media (still analog in the 1960s) were changing the world. It implied, more presciently than McLuhan could have known, that Claude Shannon was a prime mover in that revolution. There were earnest attempts to apply information theory to semantics, linguistics, psychology, economics, management, quantum physics, literary criticism, garden design, music, the visual arts, and even religion.


pages: 229 words: 67,599

The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age by Paul J. Nahin

air gap, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Charles Babbage, Claude Shannon: information theory, Computing Machinery and Intelligence, conceptual framework, Edward Thorp, Fellow of the Royal Society, finite state, four colour theorem, Georg Cantor, Grace Hopper, Isaac Newton, John von Neumann, knapsack problem, New Journalism, Pierre-Simon Laplace, reversible computing, Richard Feynman, Schrödinger's Cat, Steve Jobs, Steve Wozniak, thinkpad, Thomas Bayes, Turing machine, Turing test, V2 rocket

See also Busy Beaver Game redundancy Reilly, Sidney relay: crummy; theory of Riordan, John RS flip-flop sample (point); (space) shannon (information unit). See also bit Shannon, Catherine (sister of Claude) Shannon, Claude (father of Claude) Shannon, Claude Elwood; codes by; life of; on probability; his salesmen and engineers puzzle; and switches; and time machines; and Turing machines Shannon, Mabel (mother of Claude) Shannon-Hagelbarger theorem Sheffer, Henry Shestakov, Victor Shor, Peter. See also algorithm (Shor’s) Sklansky, Jack source rate sphere-packing spooky-action-at-a-distance. See also quantum mechanics square-root of NOT state-transition state-vector; collapse of.

The Logician and the Engineer How George Boole and Claude Shannon Created the Information Age PAUL J. NAHIN PRINCETON UNIVERSITY PRESS PRINCETON AND OXFORD Copyright © 2013 by Princeton University Press Published by Princeton University Press, 41 William Street, Princeton, New Jersey 08540 In the United Kingdom: Princeton University Press, 6 Oxford Street, Woodstock, Oxfordshire OX20 1TW press.princeton.edu All Rights Reserved Library of Congress Cataloging-in-Publication Data Nahin, Paul J. The logician and the engineer : how George Boole and Claude Shannon created the information age / Paul J.

Gallager, “Claude Elwood Shannon,” Proceedings of the American Philosophical Society, June 2003, pp. 187–191. (b) Anthony Liversidge, “Profile of Claude Shannon,” Omni, August 1987 (reprinted in Shannon’s Collected Papers, N.J.A. Sloane and Aaron D. Wyner, editors, IEEE Press, 1993). (c) Solomon W. Golomb et al., “Claude Elwood Shannon (1916–2001),” Notices of the American Mathematical Society, January 2002, pp. 8–16. (d) James F. Crow, “Shannon’s Brief Foray into Genetics,” Genetics, November 2001, pp. 915–917. 10. Liversidge, “Profile of Claude Shannon.” 4 Boolean Algebra They who are acquainted with the present state of the theory of Symbolical Algebra, are aware that the validity of the processes of analysis does not depend upon the interpretation of the symbols which are employed, but solely upon the laws of their combination


pages: 352 words: 120,202

Tools for Thought: The History and Future of Mind-Expanding Technology by Howard Rheingold

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bletchley Park, card file, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, Compatible Time-Sharing System, computer age, Computer Lib, Computing Machinery and Intelligence, conceptual framework, Conway's Game of Life, Douglas Engelbart, Dynabook, experimental subject, Hacker Ethic, heat death of the universe, Howard Rheingold, human-factors engineering, interchangeable parts, invention of movable type, invention of the printing press, Ivan Sutherland, Jacquard loom, John von Neumann, knowledge worker, machine readable, Marshall McLuhan, Menlo Park, Neil Armstrong, Norbert Wiener, packet switching, pattern recognition, popular electronics, post-industrial society, Project Xanadu, RAND corporation, Robert Metcalfe, Silicon Valley, speech recognition, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, telemarketer, The Home Computer Revolution, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

It turns out that coding and storing happen to be central problems in the logical design of computing machines and the creation of software. The basic scientific work that resulted in information theory did not originate from any investigation of computation, however, but from an analysis of communication. Claude Shannon, several years younger than Turing, working about a year after the British logician's discoveries in metamathematics, did another nifty little bit of graduate work that tied together theory and engineering, philosophy, and machinery. Chapter Six: Inside Information His unicycle skills notwithstanding, Claude Shannon has been more flamboyant but no less brilliant than his elder colleagues.

Cybernetics was about the nature of control and communication systems in animals, humans, and machines. Claude Shannon, another lone-wolf genius, is still known to his neighbors in Cambridge, Massachusetts, for his skill at riding a unicycle. In 1937, as a twenty-one-year-old graduate student, he showed that Boole's logical algebra was the perfect tool for analyzing the complex networks of switching circuits used in telephone systems and, later, in computers. During the war and afterward, Shannon established the mathematical foundation of information theory. Together with cybernetics, this collection of theorems about information and communication created a new way to understand people and machines--and established information as a cosmic fundamental, along with energy and matter.

Ideally, the same set of mathematical tools would work for both electrical and logical operations. The problem of the late 1930s was that nobody knew of any mathematical operations that had the power to describe both logical and electrical networks. Then the right kind of mind looked in the right place. An exceptionally astute graduate student at MIT named Claude Shannon, who later invented information theory, found Boole's algebra to be exactly what the engineers were looking for. Without Boole, a poverty-stricken, self-taught mathematics teacher who was born the same year as Ada, the critical link between logic and mathematics might never have been forged. While the Analytical Engine was an inspiring attempt, it had remarkably little effect on the later thinkers who created modern computers.
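
To make Shannon’s link concrete: a closed switch can be treated as a Boolean variable that is true, switches wired in series conduct like logical AND, and switches wired in parallel conduct like logical OR. The short Python sketch below illustrates that correspondence; the function names and the example circuit are illustrative assumptions, not anything taken from Shannon’s thesis or Rheingold’s text.

```python
# Sketch of the correspondence Shannon formalized in his 1937 thesis:
# a switch is a Boolean value (True = closed); series wiring conducts
# like AND, parallel wiring like OR. Names here are illustrative only.

def series(*switches: bool) -> bool:
    """Current flows through a series circuit only if every switch is closed."""
    return all(switches)

def parallel(*switches: bool) -> bool:
    """Current flows through a parallel circuit if any switch is closed."""
    return any(switches)

# Example: a lamp wired as (a AND b) OR c.
a, b, c = True, False, True
lamp_on = parallel(series(a, b), c)
print(lamp_on)  # True: the c branch conducts even though the a-b branch does not
```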


pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive by Brian Christian

"Friedman doctrine" OR "shareholder theory", 4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bertrand Russell: In Praise of Idleness, Blue Ocean Strategy, carbon footprint, cellular automata, Charles Babbage, Claude Shannon: information theory, cognitive dissonance, commoditize, complexity theory, Computing Machinery and Intelligence, crowdsourcing, David Heinemeier Hansson, Donald Trump, Douglas Hofstadter, George Akerlof, Gödel, Escher, Bach, high net worth, Isaac Newton, Jacques de Vaucanson, Jaron Lanier, job automation, Kaizen: continuous improvement, Ken Thompson, l'esprit de l'escalier, language acquisition, Loebner Prize, machine translation, Menlo Park, operational security, Ray Kurzweil, RFID, Richard Feynman, Ronald Reagan, SimCity, Skype, Social Responsibility of Business Is to Increase Its Profits, starchitect, statistical model, Stephen Hawking, Steve Jobs, Steven Pinker, Thales of Miletus, theory of mind, Thomas Bayes, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero-sum game

A Mathematical Theory of Communication It seems, at first glance, that information theory—the science of data transmission, data encryption, and data compression—would be mostly a question of engineering, having little to do with the psychological and philosophical questions that surround the Turing test and AI. But these two ships turn out to be sailing quite the same seas. The landmark paper that launched information theory is Claude Shannon’s 1948 “A Mathematical Theory of Communication,” and as it happens, this notion of scientifically evaluating “communication” binds information theory and the Turing test to each other from the get-go.

For more, see Hofstadter’s I Am a Strange Loop. 56 Benjamin Seider, Gilad Hirschberger, Kristin Nelson, and Robert Levenson, “We Can Work It Out: Age Differences in Relational Pronouns, Physiology, and Behavior in Marital Conflict,” Psychology and Aging 24, no. 3 (September 2009), pp. 604–13. 10. High Surprisal 1 Claude Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27 (1948), pp. 379–423, 623–56. 2 average American teenager: Katie Hafner, “Texting May Be Taking a Toll,” New York Times, May 25, 2009. 3 The two are in fact related: For more information on the connections between Shannon (information) entropy and thermodynamic entropy, see, e.g., Edwin Jaynes, “Information Theory and Statistical Mechanics,” Physical Review 106, no. 4 (May 1957), pp. 620–30; and Edwin Jaynes, “Information Theory and Statistical Mechanics II,” Physical Review 108, no. 2 (October 1957), pp. 171–90. 4 Donald Barthelme, “Not-Knowing,” in Not-Knowing: The Essays and Interviews of Donald Barthelme, edited by Kim Herzinger (New York: Random House, 1997). 5 Jonathan Safran Foer, Extremely Loud and Incredibly Close (Boston: Houghton Mifflin, 2005). 6 The cloze test comes originally from W.

Epilogue: The Unsung Beauty of the Glassware Cabinet Acknowledgments Notes The beautiful changes as a forest is changed By a chameleon’s tuning his skin to it; As a mantis, arranged On a green leaf, grows Into it, makes the leaf leafier … –RICHARD WILBUR I think metaphysics is good if it improves everyday life; otherwise forget it. –ROBERT PIRSIG As President, I believe that robotics can inspire young people to pursue science and engineering. And I also want to keep an eye on those robots, in case they try anything. –BARACK OBAMA 0. Prologue Claude Shannon, artificial intelligence pioneer and founder of information theory, met his wife, Mary Elizabeth, at work. This was Bell Labs in Murray Hill, New Jersey, the early 1940s. He was an engineer, working on wartime cryptography and signal transmission. She was a computer. 1. Introduction: The Most Human Human I wake up five thousand miles from home in a hotel room with no shower: for the first time in fifteen years, I take a bath.


pages: 444 words: 111,837

Einstein's Fridge: How the Difference Between Hot and Cold Explains the Universe by Paul Sen

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anthropic principle, anti-communist, Bletchley Park, British Empire, Brownian motion, Claude Shannon: information theory, Computing Machinery and Intelligence, cosmic microwave background, cosmological constant, Ernest Rutherford, heat death of the universe, invention of radio, Isaac Newton, James Watt: steam engine, John von Neumann, Khan Academy, Kickstarter, Richard Feynman, seminal paper, Stephen Hawking, traveling salesman, Turing complete, Turing test

Sparking intense research over the last few decades, another field of physics entered the discussion along with thermodynamics, quantum theory, and general relativity. That field is information theory. To see why, imagine again that a box of hot gas is hurtling into the black hole. To calculate the gas’s entropy, one could theoretically make an enormously long list of the position of each gas molecule and the direction in which it was traveling. One could then convert each item in this list into a binary number using the methods discovered by Claude Shannon in the 1940s. Doing so would generate a complete description of the gas’s contents as a long string of 1s and 0s.
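
As a toy illustration of the bookkeeping described above, the sketch below quantizes each molecule’s position and direction of travel into fixed-width binary words and concatenates them into a single bit string. The bit widths are arbitrary assumptions chosen for illustration; this is a caricature of the procedure, not a physical entropy calculation.

```python
# Toy version of the passage's bookkeeping: quantize each molecule's
# position and direction to fixed-width binary words, then concatenate
# everything into one long string of 1s and 0s. The resolutions below
# are arbitrary assumptions, not physically meaningful choices.
import random

POS_BITS = 10   # assumed resolution per position coordinate
DIR_BITS = 8    # assumed resolution for the direction angle

def encode(value: float, bits: int) -> str:
    """Map a value in [0, 1) to a fixed-width binary word."""
    return format(int(value * (1 << bits)), f"0{bits}b")

def molecule_record(x: float, y: float, z: float, angle: float) -> str:
    return "".join(encode(v, POS_BITS) for v in (x, y, z)) + encode(angle, DIR_BITS)

random.seed(0)
gas = [tuple(random.random() for _ in range(4)) for _ in range(3)]
description = "".join(molecule_record(*m) for m in gas)
print(len(description), "bits:", description)  # 3 molecules -> 114 bits
```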

But in a 1982 taped interview, Shannon was rather hazy about why he chose the term entropy. MST PPL HV: From “Information Theory” by Claude E. Shannon, Encyclopaedia Britannica, 14th ed. “information is physical”: Landauer wrote an article with this title in 1991 in Physics Today. another Bell Labs discovery: See Idea Factory by Gertner. 10 million-millionths of a joule of heat: See the calculation in chapter 10 of The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age by Paul J. Nahin. hot plates on stoves: See “A Research Agenda Towards Zero-Power ICT” by Gabriel Abadal Berini, Giorgos Fagas, Luca Gammaitoni, and Douglas Paul, 2014.
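
For context on the “information is physical” and heat-per-bit notes: the standard result behind such figures is Landauer’s bound, which says that erasing one bit of information at temperature $T$ dissipates at least $k_B T \ln 2$ of heat. At room temperature this gives (a generic worked value, not the book’s own calculation):

$$ E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693) \approx 2.9 \times 10^{-21}\,\mathrm{J\ per\ bit.} $$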

Ma Bell, as the company: For a detailed history of Bell Labs, see The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner. “Mr. Watson, come here”: See above. Claude Shannon was born: For details on Shannon’s life and work, see A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni and Rob Goodman. “if you walked a couple of blocks”: As quoted in “Profile of Claude Shannon” by Anthony Liversidge, in the introduction to Claude Elwood Shannon: Collected Papers. “when a husband was capable”: As quoted in Mind at Play by Soni and Goodman. “A decidedly unconventional type of youngster”: Letter from Vannevar Bush to E.


pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI by John Brockman

AI winter, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alignment Problem, AlphaGo, artificial general intelligence, Asilomar, autonomous vehicles, basic income, Benoit Mandelbrot, Bill Joy: nanobots, Bletchley Park, Buckminster Fuller, cellular automata, Claude Shannon: information theory, Computing Machinery and Intelligence, CRISPR, Daniel Kahneman / Amos Tversky, Danny Hillis, data science, David Graeber, deep learning, DeepMind, Demis Hassabis, easy for humans, difficult for computers, Elon Musk, Eratosthenes, Ernest Rutherford, fake news, finite state, friendly AI, future of work, Geoffrey Hinton, Geoffrey West, Santa Fe Institute, gig economy, Hans Moravec, heat death of the universe, hype cycle, income inequality, industrial robot, information retrieval, invention of writing, it is difficult to get a man to understand something, when his salary depends on his not understanding it, James Watt: steam engine, Jeff Hawkins, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kickstarter, Laplace demon, Large Hadron Collider, Loebner Prize, machine translation, market fundamentalism, Marshall McLuhan, Menlo Park, military-industrial complex, mirror neurons, Nick Bostrom, Norbert Wiener, OpenAI, optical character recognition, paperclip maximiser, pattern recognition, personalized medicine, Picturephone, profit maximization, profit motive, public intellectual, quantum cryptography, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, Richard Feynman, Rodney Brooks, self-driving car, sexual politics, Silicon Valley, Skype, social graph, speech recognition, statistical model, Stephen Hawking, Steven Pinker, Stewart Brand, strong AI, superintelligent machines, supervolcano, synthetic biology, systems thinking, technological determinism, technological singularity, technoutopianism, TED Talk, telemarketer, telerobotics, The future is already here, the long tail, the scientific method, theory of mind, trolley problem, Turing machine, Turing test, universal basic income, Upton Sinclair, Von Neumann architecture, Whole Earth Catalog, Y2K, you are the product, zero-sum game

Inspired by communications theorist Marshall McLuhan, architect-designer Buckminster Fuller, futurist John McHale, and cultural anthropologists Edward T. “Ned” Hall and Edmund Carpenter, I started reading avidly in the fields of information theory, cybernetics, and systems theory. McLuhan suggested I read biologist J. Z. Young’s Doubt and Certainty in Science, in which he said that we create tools and we mold ourselves through our use of them. The other text he recommended was Warren Weaver and Claude Shannon’s 1949 paper “Recent Contributions to the Mathematical Theory of Communication,” which begins: “The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another.

“The machine plays no favorites between manual labor and white-collar labor,” he observed. For all that, many of the central arguments in The Human Use of Human Beings seem closer to the 19th century than the 21st. In particular, although Wiener made reference throughout to Claude Shannon’s then-new work on information theory, he seems not to have fully embraced Shannon’s notion of information as consisting of irreducible, meaning-free bits. Since Wiener’s day, Shannon’s theory has come to undergird recent advances in “Big Data” and “deep learning,” which makes it all the more interesting to revisit Wiener’s cybernetic imagination.

Wiener’s three c’s (command, control, communication) drew on the mathematics of probability to formalize systems (whether biological or mechanical) theorized as a set of inputs of information achieving outputs of actions in an environment—a muscular, fleshy agenda often minimized in genealogies of AI. But the etymology does little to capture the excitement felt by participants, as mathematics joined theoretical biology (Arturo Rosenblueth) and information theory (Claude Shannon, Walter Pitts, Warren McCulloch) to produce a barrage of interdisciplinary research and publications viewed as changing not just the way science was done but the way future humans would engage with the technosphere. As Wiener put it, “We have modified our environment so radically that we must now modify ourselves in order to exist.”* The pressing question is: How are we modifying ourselves?


pages: 759 words: 166,687

Between Human and Machine: Feedback, Control, and Computing Before Cybernetics by David A. Mindell

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Charles Babbage, Charles Lindbergh, Claude Shannon: information theory, Computer Numeric Control, discrete time, Dr. Strangelove, Frederick Winslow Taylor, From Mathematics to the Technologies of Life and Death, James Watt: steam engine, John von Neumann, Lewis Mumford, Menlo Park, military-industrial complex, Neil Armstrong, Norbert Wiener, Paul Samuelson, public intellectual, Ronald Reagan, scientific management, Silicon Valley, Spread Networks laid a new fibre optics cable between New York and Chicago, tacit knowledge, telerobotics, Turing machine

Indeed, Nyquist’s and Hartley’s notions began to resemble digital representations, as their techniques analyzed the subtleties of converting discrete pulses to and from the continuous world. These men laid the groundwork for the theory of information that Claude Shannon would articulate in 1948. 79 In the first paragraph of his famous work, Shannon cited Nyquist’s and Hartley’s work on transmission and information theory: each related the abstraction of signals to the extension of human activity by the telephone’s spreading network. Conclusion Harold Black, Hendrik Bode, and Harry Nyquist brought negative feedback and the vacuum tube within the realm of signals, frequencies, and networks.
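
For reference, the two prewar results this passage points to are usually stated today as follows (modern notation, not Nyquist’s or Hartley’s own): a channel of bandwidth $B$ can carry at most $2B$ independent pulses per second (Nyquist), and the information conveyed by $n$ pulses, each taking one of $s$ distinguishable levels, grows as

$$ H = n \log s \quad \text{(Hartley).} $$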

As Wiener had done in Extrapolation, Interpolation, and Smoothing of Stationary Time Series, Blackman, Bode, and Shannon broadened the relevance of their study beyond fire control, treating it as “a special case of the transmission, manipulation, and utilization of intelligence.” Like Wiener, they pointed out that predicting time-series data had broad application to “weather records, stock market prices, production statistics, and the like.” 21 Just a few years later Claude Shannon expanded these ideas in his 1948 paper, “A Mathematical Theory of Communication,” which famously created information theory. 22 Following Nyquist’s and Hartley’s analyses years earlier, Shannon defined the act of communication as transferring a given message, or a series of symbols, from one place to another (from one person to another or from one machine to another) through a noisy channel of finite bandwidth.
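
The headline result of that 1948 paper, in the Shannon–Hartley form usually quoted for a bandlimited channel with Gaussian noise, bounds the rate of reliable communication at

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits per second,} $$

where $B$ is the bandwidth and $S/N$ the signal-to-noise ratio; rates below $C$ can be made arbitrarily reliable, while rates above it cannot.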

From this work emerged new theories and practices of feedback, control, and computing. As always, the continuous nature of these processes makes the choice of beginning and ending somewhat arbitrary. This narrative ends in 1945, when the Office of Scientific Research and Development (OSRD) closed down, and with the subsequent publication of Cybernetics, Claude Shannon’s “Mathematical Theory of Communication,” and the Massachusetts Institute of Technology (MIT) Radiation Laboratory series of textbooks on radar, electronics, and servomechanisms. These and other publications helped spread the results of the war’s massive research and development projects and laid foundations for a new era of communications, control, and computing.


pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy by George Gilder

23andMe, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AlphaGo, AltaVista, Amazon Web Services, AOL-Time Warner, Asilomar, augmented reality, Ben Horowitz, bitcoin, Bitcoin Ponzi scheme, Bletchley Park, blockchain, Bob Noyce, British Empire, Brownian motion, Burning Man, business process, butterfly effect, carbon footprint, cellular automata, Claude Shannon: information theory, Clayton Christensen, cloud computing, computer age, computer vision, crony capitalism, cross-subsidies, cryptocurrency, Danny Hillis, decentralized internet, deep learning, DeepMind, Demis Hassabis, disintermediation, distributed ledger, don't be evil, Donald Knuth, Donald Trump, double entry bookkeeping, driverless car, Elon Musk, Erik Brynjolfsson, Ethereum, ethereum blockchain, fake news, fault tolerance, fiat currency, Firefox, first square of the chessboard, first square of the chessboard / second half of the chessboard, floating exchange rates, Fractional reserve banking, game design, Geoffrey Hinton, George Gilder, Google Earth, Google Glasses, Google Hangouts, index fund, inflation targeting, informal economy, initial coin offering, Internet of things, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, Jim Simons, Joan Didion, John Markoff, John von Neumann, Julian Assange, Kevin Kelly, Law of Accelerating Returns, machine translation, Marc Andreessen, Mark Zuckerberg, Mary Meeker, means of production, Menlo Park, Metcalfe’s law, Money creation, money: store of value / unit of account / medium of exchange, move fast and break things, Neal Stephenson, Network effects, new economy, Nick Bostrom, Norbert Wiener, Oculus Rift, OSI model, PageRank, pattern recognition, Paul Graham, peer-to-peer, Peter Thiel, Ponzi scheme, prediction markets, quantitative easing, random walk, ransomware, Ray Kurzweil, reality distortion field, Recombinant DNA, Renaissance Technologies, Robert Mercer, Robert Metcalfe, Ronald Coase, Ross Ulbricht, Ruby on Rails, Sand Hill Road, Satoshi Nakamoto, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Singularitarianism, Skype, smart contracts, Snapchat, Snow Crash, software is eating the world, sorting algorithm, South Sea Bubble, speech recognition, Stephen Hawking, Steve Jobs, Steven Levy, Stewart Brand, stochastic process, Susan Wojcicki, TED Talk, telepresence, Tesla Model S, The Soul of a New Machine, theory of mind, Tim Cook: Apple, transaction costs, tulip mania, Turing complete, Turing machine, Vernor Vinge, Vitalik Buterin, Von Neumann architecture, Watson beat the top human players on Jeopardy!, WikiLeaks, Y Combinator, zero-sum game

Another, less celebrated, was his key role in establishing a trustworthy gold standard, which made economic valuations as calculable and reliable as the physical dimensions of items in trade. Since Claude Shannon in 1948 and Peter Drucker in the 1950s, we have all spoken of the information economy as if it were a new idea. But both Newton’s physics and his gold standard were information systems. More specifically, the Newtonian system is what we call today an information theory. Newton’s biographers typically underestimate his achievement in establishing the information theory of money on a firm foundation. As one writes: “Watching over the minting of a nation’s coin, catching a few counterfeiters, increasing an already respectably sized personal fortune, being a political figure, even dictating to one’s fellow scientists [as president of the Royal Society]; it should all seem a crass and empty ambition once you have written a Principia.”2 But build a better money ratchet and the world will beat a path to your door.

They offer the still-remote promise of new computer architectures such as quantum computers that can actually model physical reality and thus may finally yield some real intelligence. The current generation in Silicon Valley has yet to come to terms with the findings of von Neumann and Gödel early in the last century or with the breakthroughs in information theory of Claude Shannon, Gregory Chaitin, Andrey Kolmogorov, and John R. Pierce. In a series of powerful arguments, Chaitin, the inventor of algorithmic information theory, has translated Gödel into modern terms. When Silicon Valley’s AI theorists push the logic of their case to explosive extremes, they defy the most crucial findings of twentieth-century mathematics and computer science.

Ludwig Boltzmann (1844–1906) identified this difference with missing information, or uncertainty about the arrangement of the molecules, thus opening the way for Claude Shannon and information theory. Both forms of entropy register disorder. Boltzmann’s entropy is analog and governed by the natural logarithm (base e), while Shannon’s entropy is digital and governed by the base-2 logarithm. Chaitin’s Law: Gregory Chaitin, inventor of algorithmic information theory, ordains that you cannot use static, eternal, perfect mathematics to model dynamic creative life. Determinist math traps the mathematician in a mechanical process that cannot yield innovation or surprise, learning or life.
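
Side by side, the two quantities being contrasted are, in their standard forms:

$$ S = k_B \ln W \quad \text{(Boltzmann)}, \qquad H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon)}, $$

linked by the change of base $\ln x = (\ln 2)(\log_2 x)$, so one natural unit works out to $1/\ln 2 \approx 1.44$ bits.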


pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, animal electricity, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, Bletchley Park, British Empire, business process, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, complexity theory, Computing Machinery and Intelligence, continuous integration, Conway's Game of Life, cosmological principle, dark matter, data science, deep learning, DeepMind, dematerialisation, double helix, Douglas Hofstadter, driverless car, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, Hans Moravec, income inequality, index card, industrial robot, intentional community, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, Kickstarter, liberal capitalism, lifelogging, machine translation, millennium bug, mirror neurons, Moravec's paradox, natural language processing, Nick Bostrom, Norbert Wiener, off grid, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, Plato's cave, post-industrial society, power law, precautionary principle, prediction markets, Ray Kurzweil, Recombinant DNA, Rodney Brooks, Second Machine Age, self-driving car, seminal paper, Silicon Valley, social intelligence, speech recognition, stem cell, Stephen Hawking, Steven Pinker, Strategic Defense Initiative, strong AI, Stuart Kauffman, synthetic biology, systems thinking, technological singularity, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

For now, I want to focus on four individuals who took part in the Macy Conferences, and whose work laid the foundations for Artificial Intelligence: Norbert Wiener, Claude Shannon, Warren McCulloch and John von Neumann. We have already met the first two. Norbert Wiener was the grand visionary of cybernetics. Inspired by mechanical control systems, such as artillery targeting and servomechanisms, as well as Claude Shannon’s mathematical theory of communication and information, he articulated the theory of cybernetics in his landmark book, Cybernetics, of 1948.4 Godfather number two, Claude Shannon, was the genius who gave us information theory. We saw how Wiener and Shannon pondered on the ontology of information, and how they decided to regard it as something beyond matter and energy.

Instead of adhering to the principle that universals are always predicated by physical objects they adopted the opposite, Platonic, idea. In Part III of this book, we will explore in more detail the technological reasons behind this conceptual shift. But before we do so, let us first examine the concept of information. The father of information theory is the American mathematician, electronic engineer and cryptographer Claude Shannon (1916–2001). He worked as a cryptanalyst in the Second World War, and in early 1943 he met Alan Turing, who had been posted to Washington to work with the Americans on breaking the German naval codes. Like his English counterpart, Shannon is one of the great heroes of computer science, a man whose work has shaped the world we live in.

Two of them were, arguably, the most significant. In 1876 Alexander Graham Bell invented the telephone. And three years later, in 1879, Thomas Edison invented the incandescent lamp.14 These two inventions would become fundamental in the birth and evolution of electronics and telecommunications. By the time Claude Shannon began to formulate his information theory, there was a pressing need for better telecommunications systems because of the ubiquity of the telephone. Meanwhile, Edison’s incandescent lamp had evolved into sophisticated vacuum tubes that acted as electronic amplifiers, rectifiers, switches and oscillators. Shannon was the genius who combined logic with electronics.


pages: 293 words: 91,110

The Chip: How Two Americans Invented the Microchip and Launched a Revolution by T. R. Reid

Albert Einstein, Bob Noyce, Claude Shannon: information theory, computer age, cotton gin, discovery of penicillin, double helix, Ernest Rutherford, Fairchild Semiconductor, full employment, George Gilder, Guggenheim Bilbao, hiring and firing, industrial robot, Internet Archive, Isaac Newton, John von Neumann, Menlo Park, New Journalism, Norbert Wiener, oil shock, PalmPilot, Parkinson's law, popular electronics, Richard Feynman, Ronald Reagan, seminal paper, Silicon Valley, Turing machine, William Shockley: the traitorous eight

He published another seminal paper, “A Mathematical Theory of Communication,” that launched an even more important new academic discipline known as information theory; today information theory is fundamental not only in electronics and computer science but also in linguistics, sociology, and numerous other fields. You could argue that Claude Shannon was the Alexander Graham Bell of the cellular phone, because mobile communications would be impossible without the basic formulas of information theory that Shannon devised. In 1949, Shannon published a monograph—once again, the first one ever written on the topic—called “Programming a Computer for Playing Chess.”

To this day, the capacity of computers and other digital devices is still measured in bits; if a personal computer is rated at 64 megabits, that means it comes with enough random-access memory to store 64 million bits, or distinct pieces of information. The term is now used by digital designers everywhere, many of whom have probably never heard of Claude Shannon. Shannon wouldn’t mind that, though. He was not one to blow his own horn. During the years he taught information theory at MIT, he never mentioned that he was the creator of the academic discipline his students were studying, and seemed somewhat embarrassed when diligent students figured out that their prof was the progenitor. Early in 2001, Bell Labs set up an exhibit in Shannon’s honor, noting how many of his twentieth-century ideas have become part and parcel of daily life in the new century.

There is also interesting Booleana in Mary Everest Boole, A Boolean Anthology (Association of Teachers of Mathematics, 1972). Dover Press deserves our gratitude for keeping in print a paperback version of George Boole’s masterpiece, The Laws of Thought (New York: Dover Publications, 1951). There is as yet no biography of Claude Shannon, but a reader might be interested in the book that launched the burgeoning field of information theory—that is, Claude E. Shannon, The Mathematical Theory of Communication (Champaign: University of Illinois Press, 1949). Computer history is just now emerging as an academic discipline of its own, and there will no doubt be some fine books written on the work of von Neumann, Turing, and other computer pioneers.


pages: 209 words: 53,236

The Scandal of Money by George Gilder

Affordable Care Act / Obamacare, Alan Greenspan, bank run, behavioural economics, Bernie Sanders, bitcoin, blockchain, borderless world, Bretton Woods, capital controls, Capital in the Twenty-First Century by Thomas Piketty, Carmen Reinhart, central bank independence, Claude Shannon: information theory, Clayton Christensen, cloud computing, corporate governance, cryptocurrency, currency manipulation / currency intervention, currency risk, Daniel Kahneman / Amos Tversky, decentralized internet, Deng Xiaoping, disintermediation, Donald Trump, fiat currency, financial innovation, Fractional reserve banking, full employment, George Gilder, glass ceiling, guns versus butter model, Home mortgage interest deduction, impact investing, index fund, indoor plumbing, industrial robot, inflation targeting, informal economy, Innovator's Dilemma, Internet of things, invisible hand, Isaac Newton, James Carville said: "I would like to be reincarnated as the bond market. You can intimidate everybody.", Jeff Bezos, John Bogle, John von Neumann, Joseph Schumpeter, Kenneth Rogoff, knowledge economy, Law of Accelerating Returns, low interest rates, Marc Andreessen, Mark Spitznagel, Mark Zuckerberg, Menlo Park, Metcalfe’s law, Money creation, money: store of value / unit of account / medium of exchange, mortgage tax deduction, Nixon triggered the end of the Bretton Woods system, obamacare, OSI model, Paul Samuelson, Peter Thiel, Ponzi scheme, price stability, Productivity paradox, proprietary trading, purchasing power parity, quantitative easing, quantitative trading / quantitative finance, Ray Kurzweil, reality distortion field, reserve currency, road to serfdom, Robert Gordon, Robert Metcalfe, Ronald Reagan, Sand Hill Road, Satoshi Nakamoto, Search for Extraterrestrial Intelligence, secular stagnation, seigniorage, Silicon Valley, Skinner box, smart grid, Solyndra, South China Sea, special drawing rights, The Great Moderation, The Rise and Fall of American Growth, The Wealth of Nations by Adam Smith, Tim Cook: Apple, time value of money, too big to fail, transaction costs, trickle-down economics, Turing machine, winner-take-all economy, yield curve, zero-sum game

Like the electromagnetic spectrum, which bears all the messages of the Internet to and from your smartphone or computer, it must be rooted in the absolute speed of light, the ultimate guarantor of the integrity of time. Dominating our own era and revealing in fundamental ways the nature of money is the information theory of Kurt Gödel, John von Neumann, Alan Turing, and Claude Shannon. Information theory tells us that information is not order but disorder, not the predictable regularity that contains no news, but the unexpected modulation, the surprising bits. But human creativity and surprise depend upon a matrix of regularities, from the laws of physics to the stability of money.4 Information theory has impelled the global ascendancy of information technology. From worldwide webs of glass and light to a boom in biotech based on treating life itself as chiefly an information system, a new system of the world is transforming our lives.

The Stanford physicist and Nobel laureate Robert Laughlin has derided the elaborate efforts of scientists to find significance in the intrinsically transitory forms that arise on their computers during phase changes, such as bubbles in water on the brink of a boil.7 These computational figments have an analogue here in the outside traders’ search for momentary correlations. As Claude Shannon knew, in principle a creative pattern of data points—reflecting long and purposeful preparation and invention—is indistinguishable from a random pattern. Both are high entropy. Parsing of random patterns for transitory correlations fails to yield new knowledge. You cannot meaningfully study the ups and downs of the market with an oscilloscope. You need a microscope, exploring inside the cells of individual companies. Currency values should be stable. In information theory terms, they should function as low-entropy carriers for high-entropy creations.
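
Shannon’s point is easy to check empirically: estimate the per-byte entropy of a purely random stream and of a deliberately patterned text after compression, and both land near the 8-bits-per-byte maximum. A minimal sketch using only the Python standard library:

```python
# Empirical check: a random byte stream and a highly patterned text that has
# been compressed (its pattern squeezed out) both measure as near-maximal
# Shannon entropy, so the statistic alone cannot tell creation from noise.
import math
import os
import zlib
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Estimate Shannon entropy of a byte stream, in bits per byte (max 8)."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

random_stream = os.urandom(500_000)
patterned_text = "".join(str(i) for i in range(200_000)).encode()
compressed = zlib.compress(patterned_text, level=9)

print(f"random bytes:       {entropy_bits_per_byte(random_stream):.2f} bits/byte")
print(f"patterned text:     {entropy_bits_per_byte(patterned_text):.2f} bits/byte")
print(f"compressed pattern: {entropy_bits_per_byte(compressed):.2f} bits/byte")
# Typically ~8.0, ~3.3, and ~8.0: the compressed (purposeful, information-
# dense) stream is statistically indistinguishable from random noise.
```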

Because we use it to prioritize most of our activities, register and endow our accomplishments of learning and invention, and organize the life-sustaining work of our society, money is more than a mere payments system. It expresses a system of the world. That is why I link it to the information theory of Kurt Gödel, Alan Turing, and Claude Shannon. Each of these thinkers attempted to define his philosophy in utilitarian and determinist mathematics. Addressing pure logic as math, Gödel concluded that even arithmetic cannot constitute a complete and coherent system. All logical schemes have to move beyond self-referential circularity and invoke axioms outside themselves.


pages: 350 words: 90,898

A World Without Email: Reimagining Work in an Age of Communication Overload by Cal Newport

Cal Newport, call centre, Claude Shannon: information theory, cognitive dissonance, collaborative editing, Compatible Time-Sharing System, computer age, COVID-19, creative destruction, data science, David Heinemeier Hansson, fault tolerance, Ford Model T, Frederick Winslow Taylor, future of work, Garrett Hardin, hive mind, Inbox Zero, interchangeable parts, it's over 9,000, James Watt: steam engine, Jaron Lanier, John Markoff, John Nash: game theory, Joseph Schumpeter, Kanban, Kickstarter, knowledge worker, Marshall McLuhan, Nash equilibrium, passive income, Paul Graham, place-making, pneumatic tube, remote work: asynchronous communication, remote working, Richard Feynman, rolodex, Salesforce, Saturday Night Live, scientific management, Silicon Valley, Silicon Valley startup, Skype, social graph, stealth mode startup, Steve Jobs, supply-chain management, technological determinism, the medium is the message, the scientific method, Tragedy of the Commons, web application, work culture , Y Combinator

When I was writing my master’s thesis at MIT in the electrical engineering and computer science department (the field Shannon created from scratch with his 1937 work), we heard about Shannon’s spectacular student efforts. In retrospect, I’m not sure if this was supposed to motivate us or demoralize us. 2. For a more complete treatment of Claude Shannon, I recommend Jimmy Soni and Rob Goodman’s fascinating 2017 biography, which was the source for much of the summary that follows: A Mind at Play: How Claude Shannon Invented the Information Age (New York: Simon & Schuster, 2017). 3. Information theorists would traditionally use the word code instead of protocol in this instance, but for the sake of clarity in the discussion we’re having here, I’m going to use protocol—as in a set of communication rules agreed on in advance—as it sidesteps the colloquial associations people hold with respect to the word code. 4.

Whether you’re deploying complex automation or just following handcrafted procedures, these processes will reduce your dependence on the hyperactive hive mind workflow and reward you with extra cognitive energy and mental peace. Make automatic what you can reasonably make automatic, and only then worry about what to do with what remains. Chapter 6 The Protocol Principle The Invention of Information Claude Shannon is one of the most important figures in twentieth-century science, yet few outside the specialized fields he helped innovate know his name. Perhaps his largest intellectual leap was his 1937 MIT master’s thesis, which he submitted at the age of twenty-one and which, among other contributions, laid the foundation for all of digital electronics.1 But it’s toward another of his most famous works that I’ll turn our attention now, as it will prove useful in our quest to move beyond the hyperactive hive mind workflow.

You might think that the gains here are small—how hard is it to send some emails?—but if you’re like me, you’ll likely be surprised by the feeling of a burden being lifted when you eliminate all these ongoing scheduling conversations, which have a way of nibbling at the borders of your concentration, driving you again and again back into the hive mind chatter. Claude Shannon’s framework underscores this reality. Meeting-scheduling protocols induce a small extra inconvenience cost, as you have to set up the system, and your correspondents now have to select times from a website instead of simply shooting back a short email reply in the moment. But the cognitive cycles saved are so substantial that there’s no comparison: the average cost of these meeting-scheduling protocols is significantly lower than what’s required by the status quo of energy-minimizing email ping-pong.
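
As a toy illustration of that average-cost argument, the sketch below compares the two approaches under invented numbers; all of the figures are assumptions for illustration, none of them Newport’s.

```python
# Toy model of the average-cost claim, with invented figures: ad hoc email
# ping-pong pays an attention cost on every meeting, while a scheduling-link
# protocol pays a one-time setup cost plus a small per-meeting cost.
EMAILS_PER_MEETING = 5      # assumed back-and-forth messages per meeting
MINUTES_PER_EMAIL = 4       # assumed attention cost per message
PROTOCOL_SETUP = 30         # assumed one-time minutes to configure the service
PROTOCOL_PER_MEETING = 1    # assumed minutes to pick a slot from a link

def ping_pong_cost(meetings: int) -> int:
    return meetings * EMAILS_PER_MEETING * MINUTES_PER_EMAIL

def protocol_cost(meetings: int) -> int:
    return PROTOCOL_SETUP + meetings * PROTOCOL_PER_MEETING

for m in (1, 5, 50):
    print(f"{m:3d} meetings: ping-pong {ping_pong_cost(m):4d} min, "
          f"protocol {protocol_cost(m):4d} min")
# 1 meeting: the protocol's setup cost dominates (31 min vs. 20 min);
# 50 meetings: the protocol wins decisively (80 min vs. 1000 min).
```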


pages: 505 words: 142,118

A Man for All Markets by Edward O. Thorp

"RICO laws" OR "Racketeer Influenced and Corrupt Organizations", 3Com Palm IPO, Alan Greenspan, Albert Einstein, asset allocation, Bear Stearns, beat the dealer, Bernie Madoff, Black Monday: stock market crash in 1987, Black Swan, Black-Scholes formula, book value, Brownian motion, buy and hold, buy low sell high, caloric restriction, caloric restriction, carried interest, Chuck Templeton: OpenTable:, Claude Shannon: information theory, cognitive dissonance, collateralized debt obligation, Credit Default Swap, credit default swaps / collateralized debt obligations, diversification, Edward Thorp, Erdős number, Eugene Fama: efficient market hypothesis, financial engineering, financial innovation, Garrett Hardin, George Santayana, German hyperinflation, Glass-Steagall Act, Henri Poincaré, high net worth, High speed trading, index arbitrage, index fund, interest rate swap, invisible hand, Jarndyce and Jarndyce, Jeff Bezos, John Bogle, John Meriwether, John Nash: game theory, junk bonds, Kenneth Arrow, Livingstone, I presume, Long Term Capital Management, Louis Bachelier, low interest rates, margin call, Mason jar, merger arbitrage, Michael Milken, Murray Gell-Mann, Myron Scholes, NetJets, Norbert Wiener, PalmPilot, passive investing, Paul Erdős, Paul Samuelson, Pluto: dwarf planet, Ponzi scheme, power law, price anchoring, publish or perish, quantitative trading / quantitative finance, race to the bottom, random walk, Renaissance Technologies, RFID, Richard Feynman, risk-adjusted returns, Robert Shiller, rolodex, Sharpe ratio, short selling, Silicon Valley, Stanford marshmallow experiment, statistical arbitrage, stem cell, stock buybacks, stocks for the long run, survivorship bias, tail risk, The Myth of the Rational Market, The Predators' Ball, the rule of 72, The Wisdom of Crowds, too big to fail, Tragedy of the Commons, uptick rule, Upton Sinclair, value at risk, Vanguard fund, Vilfredo Pareto, Works Progress Administration

Thorp’s method is as follows: He cuts to the chase in identifying a clear edge (that is, something that in the long run puts the odds in his favor). The edge has to be obvious and uncomplicated. For instance, calculating the momentum of a roulette wheel, which he did with the first wearable computer (and with no less a coconspirator than the great Claude Shannon, father of information theory), he estimated a typical edge of roughly 40 percent per bet. But that part is easy, very easy. It is capturing the edge, converting it into dollars in the bank, restaurant meals, interesting cruises, and Christmas gifts to friends and family—that’s the hard part. It is the dosage of your betting—not too little, not too much—that matters in the end.
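
The “dosage” described here is formalized by the Kelly criterion, the bet-sizing rule at the center of Poundstone’s Fortune’s Formula (excerpted above). In its standard form, for a bet paying $b$-to-1 that wins with probability $p$ (and loses with probability $q = 1 - p$), the optimal fraction of the bankroll to stake is

$$ f^{*} = \frac{bp - q}{b}, $$

so an even-money bet ($b = 1$) with a 55 percent chance of winning warrants staking $f^{*} = 0.55 - 0.45 = 10$ percent of the bankroll.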

To protect myself from this happening with my work on blackjack, I settled on Proceedings of the National Academy of Sciences, as it was the quickest to publish of any journal I knew, taking as little as two or three months, and was also very prestigious. This required a member of the academy to approve and forward my work, so I sought out the only mathematics member of the academy at MIT, Claude Shannon. Claude was famous for the creation of information theory, which is crucial for modern computing, communications, and much more. The department secretary arranged a short appointment with a reluctant Shannon at noon. However, she warned me that Shannon was going to be in for only a few minutes, that I shouldn’t expect more, and that he didn’t spend time on topics or people that didn’t interest him.

During the long ride back I wondered how my research into the mathematical theory of a game might change my life. In the abstract, life is a mixture of chance and choice. Chance can be thought of as the cards you are dealt in life. Choice is how you play them. I chose to investigate blackjack. As a result, chance offered me a new set of unexpected opportunities. Ever since my first meeting with Claude Shannon in September, we had been working on the roulette project approximately twenty hours a week. Meanwhile, I was teaching courses, doing research in pure mathematics, attending department functions, writing up my blackjack research, and adjusting to being a new father. Following a roulette work session at the Shannons’, Claude asked me at dinner if I thought anything would ever top this in my life.


pages: 193 words: 19,478

Memory Machines: The Evolution of Hypertext by Belinda Barnet

augmented reality, Benoit Mandelbrot, Bill Duvall, British Empire, Buckminster Fuller, Charles Babbage, Claude Shannon: information theory, collateralized debt obligation, computer age, Computer Lib, conceptual framework, Douglas Engelbart, game design, hiring and firing, Howard Rheingold, HyperCard, hypertext link, Ian Bogost, information retrieval, Internet Archive, John Markoff, linked data, mandelbrot fractal, Marshall McLuhan, Menlo Park, nonsequential writing, Norbert Wiener, Project Xanadu, publish or perish, Robert Metcalfe, semantic web, seminal paper, Steve Jobs, Stewart Brand, technoutopianism, Ted Nelson, the scientific method, Vannevar Bush, wikimedia commons

(Engelbart 1986) The Analyzer evaluated and solved these equations by mechanical integration. It automated a task that was previously done by human beings and created a small revolution at MIT. Many of the people who worked on the machine (for example Harold Hazen, Gordon Brown and Claude Shannon) later made contributions to feedback control, information theory and computing (Mindell 2000). The machine was a huge success that brought prestige and a flood of federal money to MIT and Bush. However, by the spring of 1950 the Analyzer was gathering dust in a storeroom; the project had died. Why did it fail? Why did the world’s most important analogue computer end up obsolescing in a backroom?

(Burke 1991, 147) Bush transferred these three technologies to the new design. This decision was not pure genius on his part; they were perfect analogues for a popular conception of how the brain worked at the time. The scientific community at MIT were developing a pronounced interest in man-machine analogues, and although Claude Shannon had not yet published his information theory (Dutta 1995), it was already being formulated. Much discussion also took place around MIT about how the brain might process information in the manner of an analogue machine. Bush thought and designed in terms of analogies between brain and machine, electricity and information.

They wanted a long-term project that would give the United States the most technically advanced cryptanalytic capabilities in the world, a superfast machine to count the coincidences of letters in two messages or copies of a single message. Bush assembled a research team for this project that included Claude Shannon, one of the early information theorists and a significant part of the emerging cybernetics community (Nyce and Kahn 1991). Three new technologies were emerging at the time that handled information: photoelectricity, microfilm and digital electronics. All three were just emerging, but, unlike the fragile magnetic recording [Bush’s] students were exploring, they appeared to be ready to use in calculation machines.


pages: 250 words: 73,574

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers by John MacCormick, Chris Bishop

Ada Lovelace, AltaVista, Charles Babbage, Claude Shannon: information theory, Computing Machinery and Intelligence, fault tolerance, information retrieval, Menlo Park, PageRank, pattern recognition, Richard Feynman, Silicon Valley, Simon Singh, sorting algorithm, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, traveling salesman, Turing machine, Turing test, Vannevar Bush

So it is not altogether surprising that the two major events triggering the creation of error-correcting codes both occurred in the research laboratories of the Bell Telephone Company. The two heroes of our story, Claude Shannon and Richard Hamming, were both researchers at Bell Labs. Hamming we have met already: it was his annoyance at the weekend crashes of a company computer that led directly to his invention of the first error-correcting codes, now known as Hamming codes. However, error-correcting codes are just one part of a larger discipline called information theory, and most computer scientists trace the birth of the field of information theory to a 1948 paper by Claude Shannon. This extraordinary paper, entitled “A Mathematical Theory of Communication,” is described in one biography of Shannon as “the Magna Carta of the information age.”
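To make the idea concrete, here is a minimal sketch of the Hamming (7,4) code under its usual textbook construction (an illustration, not MacCormick's own code): four data bits are protected by three parity bits, so any single flipped bit can be located and repaired.

def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword; parity bits sit at positions 1, 2, 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parities; the syndrome spells out the error position."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3       # 0 means no error detected
    if pos:
        c[pos - 1] ^= 1              # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]  # recover the 4 data bits

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                         # simulate one bit flipped in transit
print(hamming74_correct(word))       # [1, 0, 1, 1] -- the error is repaired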

THE ORIGINS OF COMPRESSION ALGORITHMS The same-as-earlier trick described in this chapter—one of the main compression methods used in ZIP files—is known to computer scientists as the LZ77 algorithm. It was invented by two Israeli computer scientists, Abraham Lempel and Jacob Ziv, and published in 1977. To trace the origins of compression algorithms, however, we need to delve three decades further back into scientific history. We have already met Claude Shannon, the Bell Labs scientist who founded the field of information theory with his 1948 paper. Shannon was one of the two main heroes in our story of error-correcting codes (chapter 5), but he and his 1948 paper also figure importantly in the emergence of compression algorithms. This is no coincidence. In fact, error-correcting codes and compression algorithms are two sides of the same coin.
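A toy version of the same-as-earlier trick might look like the following: a greatly simplified sketch of the LZ77 idea, not the actual ZIP implementation, emitting literal characters and (distance, length) back-references with no bit packing.

def lz77_compress(text, window=255, min_match=3):
    out, i = [], 0
    while i < len(text):
        best_len, best_dist = 0, 0
        # search the recent window for the longest earlier copy of what comes next
        for j in range(max(0, i - window), i):
            k = 0
            while i + k < len(text) and text[j + k] == text[i + k]:
                k += 1
            if k > best_len:
                best_len, best_dist = k, i - j
        if best_len >= min_match:
            out.append((best_dist, best_len))  # "same as earlier" token
            i += best_len
        else:
            out.append(text[i])                # literal character
            i += 1
    return out

print(lz77_compress("AB-AB-AB-AB"))  # ['A', 'B', '-', (3, 8)]

Note that the final token copies eight characters from only three characters back, so the copy overlaps itself as it is expanded; that self-referential trick is part of what makes LZ77 so effective on repetitive data.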

Mathematicians will greatly enjoy Thompson's delightful book, but it definitely assumes the reader has a healthy dose of college math. Dewdney's book (see above) has two interesting chapters on coding theory. The two quotations about Shannon on pages 77-78 are taken from a brief biography by N. J. A. Sloane and A. D. Wyner, appearing in Claude Shannon: Collected Papers edited by Sloane and Wyner (1993). Pattern recognition (chapter 6). Bishop's lectures (see above) have some interesting material that nicely complements this chapter. The geographical data about political donations is taken from the Fundrace project of the Huffington Post. All the handwritten digit data is taken from a dataset provided by Yann LeCun, of New York University's Courant Institute, and his collaborators.


pages: 720 words: 197,129

The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution by Walter Isaacson

1960s counterculture, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, Alvin Toffler, Apollo Guidance Computer, Apple II, augmented reality, back-to-the-land, beat the dealer, Bill Atkinson, Bill Gates: Altair 8800, bitcoin, Bletchley Park, Bob Noyce, Buckminster Fuller, Byte Shop, c2.com, call centre, Charles Babbage, citizen journalism, Claude Shannon: information theory, Clayton Christensen, commoditize, commons-based peer production, computer age, Computing Machinery and Intelligence, content marketing, crowdsourcing, cryptocurrency, Debian, desegregation, Donald Davies, Douglas Engelbart, Douglas Hofstadter, driverless car, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, en.wikipedia.org, eternal september, Evgeny Morozov, Fairchild Semiconductor, financial engineering, Firefox, Free Software Foundation, Gary Kildall, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, Haight Ashbury, Hans Moravec, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, index card, Internet Archive, Ivan Sutherland, Jacquard loom, Jaron Lanier, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Joseph-Marie Jacquard, Leonard Kleinrock, Lewis Mumford, linear model of innovation, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, Mother of all demos, Neil Armstrong, new economy, New Journalism, Norbert Wiener, Norman Macrae, packet switching, PageRank, Paul Terrell, pirate software, popular electronics, pre–internet, Project Xanadu, punch-card reader, RAND corporation, Ray Kurzweil, reality distortion field, RFC: Request For Comment, Richard Feynman, Richard Stallman, Robert Metcalfe, Rubik’s Cube, Sand Hill Road, Saturday Night Live, self-driving car, Silicon Valley, Silicon Valley startup, Skype, slashdot, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Steven Pinker, Stewart Brand, Susan Wojcicki, technological singularity, technoutopianism, Ted Nelson, Teledyne, the Cathedral and the Bazaar, The Coming Technological Singularity, The Nature of the Firm, The Wisdom of Crowds, Turing complete, Turing machine, Turing test, value engineering, Vannevar Bush, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Whole Earth Review, wikimedia commons, William Shockley: the traitorous eight, Yochai Benkler

This Shannon section draws on Jon Gertner, The Idea Factory: Bell Labs and the Great Age of American Innovation (Penguin, 2012; locations refer to the Kindle edition), chapter 7; M. Mitchell Waldrop, “Claude Shannon: Reluctant Father of the Digital Age,” MIT Technology Review, July 2001; Graham Collins, “Claude E. Shannon: Founder of Information Theory,” Scientific American, Oct. 2002; James Gleick, The Information (Pantheon, 2011), chapter 7. 14. Peter Galison, Image and Logic (University of Chicago, 1997), 781. 15. Claude Shannon, “A Symbolic Analysis of Relay and Switching Circuits,” Transactions of the American Institute of Electrical Engineers, Dec. 1938. For a clear explanation, see Daniel Hillis, The Pattern on the Stone (Perseus, 1998), 2–10. 16.

Originally located on the western edge of Manhattan’s Greenwich Village overlooking the Hudson River, it brought together theoreticians, materials scientists, metallurgists, engineers, and even AT&T pole climbers. It was where George Stibitz developed a computer using electromagnetic relays and Claude Shannon worked on information theory. Like Xerox PARC and other corporate research satellites that followed, Bell Labs showed how sustained innovation could occur when people with a variety of talents were brought together, preferably in close physical proximity where they could have frequent meetings and serendipitous encounters.

Church was not only generous; he introduced the term Turing machine for what Turing had called a Logical Computing Machine. Thus at twenty-four, Turing’s name became indelibly stamped on one of the most important concepts of the digital age.12 CLAUDE SHANNON AND GEORGE STIBITZ AT BELL LABS There was another seminal theoretical breakthrough in 1937, similar to Turing’s in that it was purely a thought experiment. This one was the work of an MIT graduate student named Claude Shannon, who that year turned in the most influential master’s thesis of all time, a paper that Scientific American later dubbed “the Magna Carta of the Information Age.”13 Shannon grew up in a small Michigan town where he built model planes and amateur radios, then went on to major in electrical engineering and math at the University of Michigan.


pages: 252 words: 74,167

Thinking Machines: The Inside Story of Artificial Intelligence and Our Race to Build the Future by Luke Dormehl

"World Economic Forum" Davos, Ada Lovelace, agricultural Revolution, AI winter, Albert Einstein, Alexey Pajitnov wrote Tetris, algorithmic management, algorithmic trading, AlphaGo, Amazon Mechanical Turk, Apple II, artificial general intelligence, Automated Insights, autonomous vehicles, backpropagation, Bletchley Park, book scanning, borderless world, call centre, cellular automata, Charles Babbage, Claude Shannon: information theory, cloud computing, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, crowdsourcing, deep learning, DeepMind, driverless car, drone strike, Elon Musk, Flash crash, Ford Model T, friendly AI, game design, Geoffrey Hinton, global village, Google X / Alphabet X, Hans Moravec, hive mind, industrial robot, information retrieval, Internet of things, iterative process, Jaron Lanier, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kickstarter, Kodak vs Instagram, Law of Accelerating Returns, life extension, Loebner Prize, machine translation, Marc Andreessen, Mark Zuckerberg, Menlo Park, Mustafa Suleyman, natural language processing, Nick Bostrom, Norbert Wiener, out of africa, PageRank, paperclip maximiser, pattern recognition, radical life extension, Ray Kurzweil, recommendation engine, remote working, RFID, scientific management, self-driving car, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, social intelligence, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, tech billionaire, technological singularity, The Coming Technological Singularity, The Future of Employment, Tim Cook: Apple, Tony Fadell, too big to fail, traumatic brain injury, Turing machine, Turing test, Vernor Vinge, warehouse robotics, Watson beat the top human players on Jeopardy!

‘But most of this is probably used in remembering visual impressions, and other comparatively wasteful ways. One might reasonably hope to be able to make some real progress [towards Artificial Intelligence] with a few million digits [of computer memory].’ The third of AI’s forefathers was a man named Claude Shannon, known today as the father of ‘information theory’. Born in 1916 – making him the youngest of the three – Shannon made his big contribution to computing by showing how networks of simple on-off switches can carry out logic. In his day those switches were electromechanical relays; in a modern machine they are the billions of tiny transistors that make up a computer. An algorithm is the sequence of instructions that tells a computer what to do by switching these transistors on and off.
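The observation at the heart of Shannon's 1937 master's thesis was that switches wired in series behave like a logical AND and switches wired in parallel like an OR, so circuits can compute Boolean logic. A minimal sketch of that idea:

def series(a, b):
    """Two switches in series: current flows only if both are closed (AND)."""
    return a and b

def parallel(a, b):
    """Two switches in parallel: current flows if either is closed (OR)."""
    return a or b

for a in (False, True):
    for b in (False, True):
        print(a, b, series(a, b), parallel(a, b))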

Compared with the unreliable memory of humans, a machine capable of accessing thousands of items in the span of microseconds had a clear advantage. There are entire books written about the birth of modern computing, but three men stand out as laying the philosophical and technical groundwork for the field that became known as Artificial Intelligence: John von Neumann, Alan Turing and Claude Shannon. A native of Hungary, von Neumann was born in 1903 into a Jewish banking family in Budapest. In 1930, he arrived at Princeton University as a maths teacher and, by 1933, had established himself as one of six professors in the new Institute for Advanced Study in Princeton: a position he stayed in until the day he died.

In the summer of 1956 – when Elvis Presley was scandalising audiences with his hip gyrations, Marilyn Monroe married playwright Arthur Miller, and President Dwight Eisenhower authorised ‘In God we trust’ as the US national motto – AI’s first official conference took place. A rolling six-week workshop, bringing together the smartest academics from a broad range of disciplines, the event unfolded on the sprawling 269-acre estate of Dartmouth College in Hanover, New Hampshire. Along with Claude Shannon, two of the organisers were young men named John McCarthy and Marvin Minsky, both of whom became significant players in the growing field of Artificial Intelligence. ‘The study [of AI] is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can be so precisely described that a machine can be made to simulate it,’ they wrote.


The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Atkinson, Bill Duvall, Bill Gates: Altair 8800, Bletchley Park, Boeing 747, Byte Shop, Charles Babbage, Claude Shannon: information theory, Compatible Time-Sharing System, computer age, Computing Machinery and Intelligence, conceptual framework, cuban missile crisis, Dennis Ritchie, do well by doing good, Donald Davies, double helix, Douglas Engelbart, Dynabook, experimental subject, Fairchild Semiconductor, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, functional programming, Gary Kildall, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, Ivan Sutherland, James Watt: steam engine, Jeff Rulifson, John von Neumann, Ken Thompson, Leonard Kleinrock, machine translation, Marc Andreessen, Menlo Park, Multics, New Journalism, Norbert Wiener, packet switching, pink-collar, pneumatic tube, popular electronics, RAND corporation, RFC: Request For Comment, Robert Metcalfe, Silicon Valley, Skinner box, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, The Soul of a New Machine, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process, zero-sum game

And yet even in the midst of all that, Claude Shannon's long-delayed opus on information theory exploded like a bomb. His analysis of communication was breathtaking in scope, masterful in execution, and, for most people, totally unexpected. "It was like a bolt out of the blue, a really unique thing," recalls his Bell Labs colleague John Pierce. "I don't know of any other theory that came in a complete form like that, with very few antecedents or history." "It was a revelation," agrees Oliver Selfridge. "Around MIT the reaction was 'Brilliant! Why didn't I think of that?' Information theory gave us a whole conceptual vocabulary, as well as a technical vocabulary."

Miller kept on reading. And by the time he was finished, he now says, he knew that his life had changed.

THE CONJURER

Legend has it that Claude Shannon published "A Mathematical Theory of Communication" in 1948 only because his boss at Bell Labs finally badgered him into it. And whatever the truth of that story, the point is that no one who knew Shannon has any trouble believing it. "He wrote beautiful papers, when he wrote," says Robert Fano, who became a leader of MIT's information-theory group in the 1950s and still has a reverential photograph of Shannon hanging in his office. "And he gave beautiful talks, when he gave a talk.

Indeed, his meeting with Chomsky was as pivotal as the moment eight years earlier when he'd picked up that July 1948 issue of the Bell System Technical Journal containing Claude Shannon's article, and all the more so because Chomsky's message was to be strongly reinforced just a short time later. Miller remembers the day very clearly: Tuesday, September 11, 1956, the second day of the second international conference on information theory. Actually, the whole conference was good. Held in MIT's Sloan Building, right on the riverfront, it included talks by Jerry Wiesner, Bob Fano, Peter Elias, Oliver Selfridge, Walter Rosenblith, and even Shannon himself.


pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science by Chris Bernhardt

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, Bletchley Park, British Empire, cellular automata, Charles Babbage, Claude Shannon: information theory, complexity theory, Computing Machinery and Intelligence, Conway's Game of Life, discrete time, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Internet Archive, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Ken Thompson, Norbert Wiener, Paul Erdős, Reflections on Trusting Trust, Turing complete, Turing machine, Turing test, Von Neumann architecture

After Turing completed his Ph.D., von Neumann offered him a position as his assistant, but Turing decided not to accept and instead returned to England. During the time that Turing was working on his Ph.D. another breakthrough paper was written. This was on logic and switching circuits and was written by Claude Shannon.

Claude Shannon

In 1936, Claude Shannon graduated from the University of Michigan with two undergraduate degrees: one in electrical engineering and one in mathematics. He then went to M.I.T. for graduate school. At M.I.T. he worked on an early analog computer. This work led him to consider switches and digital computing.

His first bombe was operational in the spring of 1940.7 In 1942, after the United States joined the war, Turing went to America to work with cryptologists there, helping with the design of American bombes. While in America, he visited Bell Labs, which was then in lower Manhattan, where he met Claude Shannon. In addition to working on cryptology, they discussed their work on computing.8 At the end of the war there were a large number of bombes working in both the US and England. Being able to read Germany’s messages enabled the allies to locate German U-boats and the routes of ships. It helped with the choice of when and where to attack.

Rejewski, when asked about this, is said to have replied that he couldn’t think of a better name. 8. Both Shannon and Turing were interested in using ideas from probability to extract information from data. Shannon would later extend some of his wartime work and write the groundbreaking paper “A Mathematical Theory of Communication” that is one of the foundations of Information Theory. Turing wrote several articles on the application of probability to cryptography. These were classified and only now are being made available to the public (two papers were declassified in 2012. They are available at http://www.nationalarchives.gov.uk). 9. The English use the term valve where the Americans use vacuum tube. 10.


pages: 518 words: 107,836

How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy) by Benjamin Peters

Albert Einstein, American ideology, Andrei Shleifer, Anthropocene, Benoit Mandelbrot, bitcoin, Brownian motion, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive dissonance, commons-based peer production, computer age, conceptual framework, continuation of politics by other means, crony capitalism, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, disinformation, Dissolution of the Soviet Union, Donald Davies, double helix, Drosophila, Francis Fukuyama: the end of history, From Mathematics to the Technologies of Life and Death, Gabriella Coleman, hive mind, index card, informal economy, information asymmetry, invisible hand, Jacquard loom, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, Lewis Mumford, linear programming, mandelbrot fractal, Marshall McLuhan, means of production, megaproject, Menlo Park, Mikhail Gorbachev, military-industrial complex, mutually assured destruction, Network effects, Norbert Wiener, packet switching, Pareto efficiency, pattern recognition, Paul Erdős, Peter Thiel, Philip Mirowski, power law, RAND corporation, rent-seeking, road to serfdom, Ronald Coase, scientific mainstream, scientific management, Steve Jobs, Stewart Brand, stochastic process, surveillance capitalism, systems thinking, technoutopianism, the Cathedral and the Bazaar, the strength of weak ties, The Structural Transformation of the Public Sphere, transaction costs, Turing machine, work culture , Yochai Benkler

These include Claude Lévi-Strauss’s treatment of language as a technologically ordered series (after meeting with Macy Conference attendee Roman Jakobson in Paris in 1950); Jacques Lacan’s turning to mathematical concepts; Roland Barthes’s turn to schematic accounts of communication; Gilles Deleuze’s abandonment of meaning, with Claude Shannon’s information theory in hand; Felix Guattari’s, Michel Foucault’s, and other French theorists’ experimentation with terms such as encoding, decoding, information, and communication.34 Postmodern French theory owes a deep debt to postwar information theory and the cybernetic sciences. In England, cybernetics took on a different character in the form of the Ratio Club, a small but potent gathering of British cybernetic figures who gathered regularly in the basement of the National Hospital for Nervous Diseases in London from 1949 through 1955.

Gerovitch details the translation of their terms: “What Wiener called ‘the feedback mechanism’ they called ‘the theory of feedback’ … ‘basic principles of digital computing’ became ‘the theory of automatic high-speed electronic calculating machines’; ‘cybernetic models of human thinking’ became the ‘theory of self-organizing logical processes.’”70 In fact, the coauthors used the word theory six times in their definition of cybernetics to emphasize the theoretical nature of the new science, possibly as a way to avoid having to discuss the political implications of introducing a practical field of human-machine applications into a society well suited to adopt them. The coauthors also integrated and expanded the stochastic analysis of Claude Shannon’s information theory while simultaneously stripping Wiener’s organism-machine analogy of its political potency.71 Wiener’s core analogies between animal and machine, machine and mind were stressed as analogies—or how “self-organizing logical processes [appeared] similar to the processes of human thought” but were not synonyms.

The Macy Conferences, as they were informally known, staked out a spacious interdisciplinary purview for cybernetic research.11 In addition to McCulloch, who directed the conferences, a few noted participants included Wiener himself, the mathematician and game theorist John von Neumann, leading anthropologist Margaret Mead and her then husband Gregory Bateson, founding information theorist and engineer Claude Shannon, sociologist-statistician and communication theorist Paul Lazarsfeld, psychologist and computer scientist J.C.R. Licklider, as well as influential psychiatrists, psychoanalysts, and philosophers such as Kurt Lewin, F.S.C. Northrop, Molly Harrower, and Lawrence Kubie, among others. Relying on mathematical and formal definitions of communication, participants rendered permeable the boundaries that distinguished humans, machines, and animals as information systems.


pages: 267 words: 71,941

How to Predict the Unpredictable by William Poundstone

accounting loophole / creative accounting, Albert Einstein, Bernie Madoff, Brownian motion, business cycle, butter production in bangladesh, buy and hold, buy low sell high, call centre, centre right, Claude Shannon: information theory, computer age, crowdsourcing, Daniel Kahneman / Amos Tversky, Edward Thorp, Firefox, fixed income, forensic accounting, high net worth, index card, index fund, Jim Simons, John von Neumann, market bubble, money market fund, pattern recognition, Paul Samuelson, Ponzi scheme, power law, prediction markets, proprietary trading, random walk, Richard Thaler, risk-adjusted returns, Robert Shiller, Rubik’s Cube, statistical model, Steven Pinker, subprime mortgage crisis, transaction costs

At a place like Bell Labs in the 1950s, brilliant people were always firing off sparks of genius. John Pierce had a special job — catching the best ideas in a basket and pressing their originators to follow them up. A Caltech-educated engineer, Pierce juggled the roles of instigator, motivational speaker, and life coach. His hardest case may have been Claude Shannon. It was a running joke: “You should do something on that,” Pierce would say to Shannon. “Should?” Shannon would reply. “What does ‘should’ mean?” Shannon, in his late thirties, had wavy hair and handsome, rather angular features. At Bell Labs he came in when he pleased and left when he pleased.

This cuts the exponentially vast range of potential options down to manageable size. Password-cracking software does what Goodfellow did, only billions of times faster. * * * For a time, AT&T’s vision of a wireless future did not rule out telepathy. Thornton Fry, head of Bell Labs’ mathematics division and the man who hired Claude Shannon, was in that diehard minority of scientists who believed that J. B. Rhine might be onto something. About 1948, Bell Labs built an ESP machine. It was a device for generating random sequences that a would-be psychic would attempt to guess. By taking the place of Zener cards, the machine could exclude the possibility of cheating or unconscious signaling that had dogged Rhine’s research.

It silently churns data to guess locations of meth labs and carjackings; to put a price tag on the value of customers, employees, and managers; and, above all, to predict who will buy what and pay how much. Big Data makes the most of its ninja invisibility. The consumer rarely suspects how many of his actions have been tracked and outguessed. The predictions are not perfect (yet). And just as Claude Shannon could outguess his own prediction machine, you can outguess Big Data. This chapter will describe a few of the more widely applicable tricks. You’ve probably got weird calls from your mobile phone carrier, power company, or health club. The caller will ask if there’s anything she can do to serve you better.


The Deep Learning Revolution (The MIT Press) by Terrence J. Sejnowski

AI winter, Albert Einstein, algorithmic bias, algorithmic trading, AlphaGo, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, autonomous vehicles, backpropagation, Baxter: Rethink Robotics, behavioural economics, bioinformatics, cellular automata, Claude Shannon: information theory, cloud computing, complexity theory, computer vision, conceptual framework, constrained optimization, Conway's Game of Life, correlation does not imply causation, crowdsourcing, Danny Hillis, data science, deep learning, DeepMind, delayed gratification, Demis Hassabis, Dennis Ritchie, discovery of DNA, Donald Trump, Douglas Engelbart, driverless car, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, Flynn Effect, Frank Gehry, future of work, Geoffrey Hinton, Google Glasses, Google X / Alphabet X, Guggenheim Bilbao, Gödel, Escher, Bach, haute couture, Henri Poincaré, I think there is a world market for maybe five computers, industrial robot, informal economy, Internet of things, Isaac Newton, Jim Simons, John Conway, John Markoff, John von Neumann, language acquisition, Large Hadron Collider, machine readable, Mark Zuckerberg, Minecraft, natural language processing, Neil Armstrong, Netflix Prize, Norbert Wiener, OpenAI, orbital mechanics / astrodynamics, PageRank, pattern recognition, pneumatic tube, prediction markets, randomized controlled trial, Recombinant DNA, recommendation engine, Renaissance Technologies, Rodney Brooks, self-driving car, Silicon Valley, Silicon Valley startup, Socratic dialogue, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Von Neumann architecture, Watson beat the top human players on Jeopardy!, world market for maybe five computers, X Prize, Yogi Berra

When you make a cell phone call, your voice is encoded into bits and transmitted over radio waves and digital transmission lines to a receiver, where the digital signals are decoded and converted to sounds. Information theory puts bounds on the capacity of the communications channel (figure 15.2), and codes have been devised that approach the Shannon limit. Despite the many forms of information in the world, there is a way to measure precisely how much of it is in a data set. The unit of information is a “binary bit,” which can take on a value of 1 or 0. A “byte” is eight bits.

[Figure 15.1: Claude Shannon around 1963 in front of a telephone switching network. He worked at AT&T Bell Laboratories when he invented information theory. From Alfred Eisenstaedt/The LIFE Picture Collection/Getty Images.]
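The Shannon limit mentioned above can be stated in one line: a channel of bandwidth B and signal-to-noise power ratio S/N can carry at most C = B log2(1 + S/N) bits per second. A small illustration, with plausible but made-up numbers for a voice telephone line:

import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley bound: maximum error-free bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A classic voice line: about 3 kHz of bandwidth and roughly 30 dB
# of signal-to-noise ratio (a power ratio of 1,000).
snr = 10 ** (30 / 10)
print(round(channel_capacity(3000, snr)))  # 29902 -- about 30 kilobits per second

This back-of-the-envelope bound is why dial-up modems over ordinary phone lines topped out in the tens of kilobits per second, no matter how clever their codes.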

The National Security Agency uses machine learning to sift through all of the data it has been collecting everywhere. The economy is going digital, and programming skills are in great demand at many companies. As the world shifts from an industrial to an information economy, education and job training will have to adapt. This already is having a profound impact on the world. Information Theory In 1948, Claude Shannon (figure 15.1) at the AT&T Bell Laboratories in Murray Hill, New Jersey, proposed a remarkably simple but subtle theory for information to understand signal transmission through noisy phone lines.1 Shannon’s theory drove the digital communications revolution that gave rise to cell phones, digital television, and the Internet.

The latest growth spurt has been fueled by the widespread availability of big data, and the story of NIPS has been one of preparing for this day to come.

III Technological and Scientific Impact

Timeline

1949—Claude Shannon publishes the seminal book The Mathematical Theory of Communication, which laid the foundation for modern digital communication.
1971—Noam Chomsky publishes “The Case against B. F. Skinner” in the New York Review of Books, an essay that steered a generation of cognitive scientists away from learning.
1989—Carver Mead publishes Analog VLSI and Neural Systems, founding the field of neuromorphic engineering, which builds computer chips inspired by biology.
2002—Stephen Wolfram publishes A New Kind of Science, which explored the computational capabilities of cellular automata, algorithms that are even simpler than neural networks but still capable of powerful computing.
2005—Sebastian Thrun’s team wins the DARPA Grand Challenge for an autonomous vehicle.
2008—Tobias Delbrück develops a highly successful spiking retina chip called the “Dynamic Vision Sensor” (DVS) that uses asynchronous spikes rather than synchronous frames used in current digital cameras.
2013—U.S.


pages: 608 words: 150,324

Life's Greatest Secret: The Race to Crack the Genetic Code by Matthew Cobb

a long time ago in a galaxy far, far away, Anthropocene, anti-communist, Asilomar, Asilomar Conference on Recombinant DNA, Benoit Mandelbrot, Berlin Wall, bioinformatics, Claude Shannon: information theory, conceptual framework, Copley Medal, CRISPR, dark matter, discovery of DNA, double helix, Drosophila, epigenetics, factory automation, From Mathematics to the Technologies of Life and Death, Gregor Mendel, heat death of the universe, James Watt: steam engine, John von Neumann, Kickstarter, Large Hadron Collider, military-industrial complex, New Journalism, Norbert Wiener, phenotype, post-materialism, Recombinant DNA, Stephen Hawking, synthetic biology

When it became a scientific best-seller we were all astonished, not least myself.14 * Wiener was not the sole creator of this revolution in thinking, nor did he claim to be. In the Introduction to Cybernetics, he described how this new vision of control had been developed through years of discussions with his intellectual partners, including Claude Shannon. Wiener generously explained that something like his ‘statistical theory of the amount of information’ had been simultaneously arrived at by Shannon in the US, by R. A. Fisher in the UK and by Andrei Kolmogoroff in the USSR. Fisher’s approach to information was in fact quite different, and the works of Kolmogoroff were unobtainable, but Shannon’s views were virtually identical to Wiener’s, as readers with the necessary mathematical ability were soon able to appreciate.

One of the ways in which these ideas came to influence biologists was through the promise of creating automata and thereby testing models of how organisms function and reproduce.20 These links were explored at a symposium on Cerebral Mechanisms in Behavior that was held at the California Institute of Technology (Caltech) a month before the publication of Cybernetics, at the end of September 1948. The symposium was a small affair – only fourteen speakers, with a further five participants, one of whom was the Caltech chemist Linus Pauling.

[Figure 1: Claude Shannon’s model of communication. From Shannon and Weaver (1949).]

Von Neumann gave the opening talk, entitled ‘General and logical theory of automata’, and explored one of the defining features of life: its ability to reproduce. Von Neumann’s starting point was Alan Turing’s prewar theory of a universal machine that carried out its operations by reading and writing on a paper tape.

., ‘Biopolitics and intuitive algebra in the mathematization of cryptology? A review of Shannon’s ‘A mathematical theory of cryptography’ from 1945’, Cryptologia, vol. 23, 1999, pp. 261–6. Roche, J., ‘Notice nécrologique: André Boivin (1895–1949)’, Bulletin de la Société de Chimie Biologique, vol. 31, 1949, pp. 1564–7. Rogers, E. M., ‘Claude Shannon’s cryptography research during World War II and the mathematical theory of communication’, Proceedings, IEEE 28th International Carnaham Conference on Security Technology, 1994, pp. 1–5. Rogozin, I. B., Carmel, L., Csuros, M. and Koonin, E. V., ‘Origin and evolution of spliceosomal introns’, Biology Direct, vol. 7, 2012, p. 11.


pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, backpropagation, Bletchley Park, British Empire, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, computer age, Computing Machinery and Intelligence, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, independent contractor, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, launch on warning, low earth orbit, machine readable, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, synthetic biology, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

By repeated sorting and other iterated functions, primitive punched-card machines could perform complex operations, but, like the original Turing machine, they had only a small number of possible states. The fundamental unit of information was the bit; its explicit definition as the contraction of “binary digit” was first noted in an internal Bell Laboratories memo written by John W. Tukey on 9 January 1947,20 and first published in Claude Shannon’s Mathematical Theory of Communication in 1948.21 Shannon’s definition was foreshadowed by Vannevar Bush’s analysis, in 1936, of the number of “bits of information” that could be stored on a punched card. In those days bits were assigned only fleetingly to electrical or electronic form. Most bits, most of the time, were bits of paper (or bits of missing paper, represented by the chad that was carted off to landfills by the ton).

In November 1945, a committee led by von Neumann and including Herman Goldstine (still attached to the ENIAC project at the Moore School) held its first meeting in Vladimir Zworykin’s office at RCA. Also in attendance was John Tukey, a statistician with Bell Telephone Laboratories (and the originator of the contraction of “binary digit” to “bit,” later made famous by information theorist Claude Shannon). Von Neumann issued a memorandum of objectives, concluding that “it is to be expected that the future evolution of high-speed computing will be decisively influenced by the experiences gained.”22 By the spring of 1946 the project was under way and staff, led by Goldstine, were signing on.

Most early digital computers—from the Colossus to the IAS machine—used paper-tape teletype equipment for input and output between the computer and the outside world, augmented by the ubiquitous punched-card equipment from Hollerith and IBM. It was only natural that the first computers incorporated high-speed telegraphic equipment, and it is no accident that the genesis of the Colossus within the British Telecommunications Research Establishment was mirrored in the United States by early steps taken toward computers by Claude Shannon and others within Bell Laboratories and RCA. Only later did the communications industry and the computer industry become temporarily estranged. The solitary computers of the early 1950s exchanged code sequences by means of mutually intelligible storage media and, before the end of the decade, by connecting directly or, in language that has now been extended to human beings, on-line.


pages: 561 words: 120,899

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy by Sharon Bertsch McGrayne

Abraham Wald, Alan Greenspan, Bayesian statistics, bioinformatics, Bletchley Park, British Empire, classic study, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, data science, double helix, Dr. Strangelove, driverless car, Edmond Halley, Fellow of the Royal Society, full text search, government statistician, Henri Poincaré, Higgs boson, industrial research laboratory, Isaac Newton, Johannes Kepler, John Markoff, John Nash: game theory, John von Neumann, linear programming, longitudinal study, machine readable, machine translation, meta-analysis, Nate Silver, p-value, Pierre-Simon Laplace, placebo effect, prediction markets, RAND corporation, recommendation engine, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, Robert Mercer, Ronald Reagan, seminal paper, speech recognition, statistical model, stochastic process, Suez canal 1869, Teledyne, the long tail, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, uranium enrichment, We are all Keynesians now, Yom Kippur War

During the Second World War, Warren Weaver of the Rockefeller Foundation was impressed with how “a multiplicity of languages impedes cultural interchange between the peoples of the earth and is a serious deterrent to international understanding.”6 Struck by the power of mechanized cryptography and by Claude Shannon’s new information theory, Weaver suggested that computerized statistical methods could treat translation as a cryptography problem. In the absence of computer power and a wealth of machine-readable text, Weaver’s idea lay fallow for decades. Ever since, the holy grail of translators has been a universal machine that can transform written and spoken words from one language into any other.

One spectacular success occurred during the Second World War, when Alan Turing developed Bayes to break Enigma, the German navy’s secret code, and in the process helped to both save Britain and invent modern electronic computers and software. Other leading mathematical thinkers—Andrei Kolmogorov in Russia and Claude Shannon in New York—also rethought Bayes for wartime decision making. During the years when ivory tower theorists thought they had rendered Bayes taboo, it helped start workers’ compensation insurance in the United States; save the Bell Telephone system from the financial panic of 1907; deliver Alfred Dreyfus from a French prison; direct Allied artillery fire and locate German U-boats; and locate earthquake epicenters and deduce (erroneously) that Earth’s core consists of molten iron.

He named his unit a ban for Banburismus and defined it as “about the smallest change in weight of evidence that is directly perceptible to human intuition.”13 One ban represented odds of 10 to 1 in favor of a guess, but Turing normally dealt with much smaller quantities, decibans and even centibans. The ban was basically the same as the bit, the measure of information Claude Shannon discovered by using Bayes’ rule at roughly the same time at Bell Telephone Laboratories. Turing’s measure of belief, the ban, and its supporting mathematical framework have been called his greatest intellectual contribution to Britain’s defense. To estimate the probability of a guess when information was arriving piecemeal, Turing used bans to discriminate between sequential hypotheses.
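Turing's ban is simply the base-10 logarithm of the odds in favor of a hypothesis, which makes independent pieces of evidence additive rather than multiplicative. A small sketch (the example odds are illustrative, not drawn from Banburismus itself):

import math

def bans(odds_for, odds_against=1.0):
    """Weight of evidence, in bans, for odds of odds_for : odds_against."""
    return math.log10(odds_for / odds_against)

print(bans(10))                      # 1.0 ban: odds of 10 to 1, per Turing's definition
print(10 * bans(10))                 # 10.0 decibans, the unit he worked in day to day
print(round(bans(2) + bans(5), 10))  # 1.0: independent clues of 2:1 and 5:1 combine to 10:1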


pages: 468 words: 137,055

Crypto: How the Code Rebels Beat the Government Saving Privacy in the Digital Age by Steven Levy

Albert Einstein, Bletchley Park, Claude Shannon: information theory, cognitive dissonance, Compatible Time-Sharing System, computer age, disinformation, Donald Knuth, Eratosthenes, Extropian, Fairchild Semiconductor, information security, invention of the telegraph, Jim Simons, John Gilmore, John Markoff, John Perry Barlow, Kevin Kelly, knapsack problem, Marc Andreessen, Mitch Kapor, MITM: man-in-the-middle, Mondo 2000, Network effects, new economy, NP-complete, quantum cryptography, Ronald Reagan, Saturday Night Live, Silicon Valley, Simon Singh, Stephen Hawking, Steven Levy, Watson beat the top human players on Jeopardy!, web of trust, Whole Earth Catalog, zero-sum game, Zimmermann PGP, éminence grise

But he never did. (Worse, he came to confuse this book with another book published at that time, David Kahn’s The Codebreakers, which delayed his reading of the more important work.) Similarly, one day at Mitre, a colleague moving out of his office gave Diffie a 1949 paper by Claude Shannon. The legendary father of information theory had been teaching at MIT since 1956, but Diffie had never met him, a slight, introverted professor who lived a quiet family life, pursuing a variety of interests from reading science fiction to listening to jazz. (Presumably, by the time Shannon had reached his sixties, he had put aside the unicycle he had once mastered.)

The answer was cryptography. Though Tuchman had a background in information theory, he had never specifically done any crypto work. But he soon found out about the system that the guys in IBM research at Yorktown Heights had cooked up. He ventured down to Watson Labs one day and heard Feistel speak about Lucifer. He immediately set up a lunch with Feistel and Alan Konheim. The first thing Tuchman asked Feistel was where he had gotten the ideas for Lucifer. Feistel, in his distinctive German accent, mentioned the early papers of Claude Shannon. “The Shannon paper reveals all,” he said. Meanwhile, Tuchman’s colleague Karl Meyer was exploring whether Lucifer might be a good fit for an expanded version of the Lloyd’s Cashpoint system.

Though Hellman didn’t work directly with Horst Feistel, the German-born cryptographer worked nearby in the building, and sometimes the two of them would sit together at lunch, where the older man would describe some of the classical cryptosystems and some of the means of breaking them. Hellman left IBM in 1970, accepting a post as assistant professor at MIT. At that time Peter Elias, who had worked closely with Claude Shannon, was just stepping down as the head of the electronic engineering department. Elias’s talks with Hellman drew the young academic deeper into crypto, and for the first time he began thinking about making it the focus of his research. “Partially, it was the magician aspect, being able to impress people with magic tricks,” he now explains.


pages: 242 words: 68,019

Why Information Grows: The Evolution of Order, From Atoms to Economies by Cesar Hidalgo

Ada Lovelace, Albert Einstein, Arthur Eddington, assortative mating, business cycle, Claude Shannon: information theory, David Ricardo: comparative advantage, Douglas Hofstadter, Everything should be made as simple as possible, Ford Model T, frictionless, frictionless market, George Akerlof, Gödel, Escher, Bach, income inequality, income per capita, industrial cluster, information asymmetry, invention of the telegraph, invisible hand, Isaac Newton, James Watt: steam engine, Jane Jacobs, job satisfaction, John von Neumann, Joi Ito, New Economic Geography, Norbert Wiener, p-value, Paul Samuelson, phenotype, price mechanism, Richard Florida, Robert Solow, Ronald Coase, Rubik’s Cube, seminal paper, Silicon Valley, Simon Kuznets, Skype, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, Stuart Kauffman, tacit knowledge, The Market for Lemons, The Nature of the Firm, The Wealth of Nations by Adam Smith, total factor productivity, transaction costs, working-age population

What is surprising to most people, however, is that information is meaningless, even though the meaningless nature of information, much like its physicality, is often misunderstood. In 1949 Claude Shannon and Warren Weaver published a short book entitled The Mathematical Theory of Communication. In its first section, Weaver described the conceptual aspects of information. In the second section, Shannon described the mathematics of what we now know as information theory. For information theory to be properly understood, Shannon and Weaver needed to detach the word information from its colloquial meaning. Weaver made this distinction early in his essay: “The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage.

This is another way of saying that the $2.5 million worth of value was stored not in the car’s atoms but in the way those atoms were arranged.3 That arrangement is information.4 So the value of the Bugatti is connected to physical order, which is information, even though people still debate what information is.5 According to Claude Shannon, the father of information theory, information is a measure of the minimum volume of communication required to uniquely specify a message. That is, it’s the number of bits we need to communicate an arrangement, like the arrangement of atoms that made the Bugatti. To grasp Shannon’s definition of information firmly, however, it is better to start with something simpler than a Bugatti.
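Shannon's measure can be computed directly. A minimal sketch of Shannon entropy in bits (an illustration using coin flips; Hidalgo builds the same intuition with arrangements of atoms):

import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))            # 1.0: a fair coin flip carries one bit
print(entropy_bits([0.25] * 4))            # 2.0: four equally likely options need two bits
print(round(entropy_bits([0.9, 0.1]), 3))  # 0.469: a biased coin tells you less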

Mathematicians continued to formalize the idea of information, but they framed their efforts in the context of communication technologies, transcending the efforts to decipher intercepted messages. The mathematicians who triumphed became known as the world’s first information theorists or cyberneticists. These pioneers included Claude Shannon, Warren Weaver, Alan Turing, and Norbert Wiener. In the 1950s and 1960s the idea of information took science by storm. Information was welcomed in all academic fields as a powerful concept that cut across scientific boundaries. Information was neither microscopic nor macroscopic.3 It could be inscribed sparsely on clay tablets or packed densely in a strand of DNA.


pages: 322 words: 88,197

Wonderland: How Play Made the Modern World by Steven Johnson

"hyperreality Baudrillard"~20 OR "Baudrillard hyperreality", Ada Lovelace, adjacent possible, Alfred Russel Wallace, Antoine Gombaud: Chevalier de Méré, Berlin Wall, bitcoin, Book of Ingenious Devices, Buckminster Fuller, Charles Babbage, Claude Shannon: information theory, Clayton Christensen, colonial exploitation, computer age, Computing Machinery and Intelligence, conceptual framework, cotton gin, crowdsourcing, cuban missile crisis, Drosophila, Edward Thorp, Fellow of the Royal Society, flying shuttle, game design, global village, Great Leap Forward, Hedy Lamarr / George Antheil, HyperCard, invention of air conditioning, invention of the printing press, invention of the telegraph, Islamic Golden Age, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, Jane Jacobs, John von Neumann, joint-stock company, Joseph-Marie Jacquard, land value tax, Landlord’s Game, Lewis Mumford, lone genius, mass immigration, megacity, Minecraft, moral panic, Murano, Venice glass, music of the spheres, Necker cube, New Urbanism, Oculus Rift, On the Economy of Machinery and Manufactures, pattern recognition, peer-to-peer, pets.com, placebo effect, pneumatic tube, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, QWERTY keyboard, Ray Oldenburg, SimCity, spice trade, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, supply-chain management, talking drums, the built environment, The Great Good Place, the scientific method, The Structural Transformation of the Public Sphere, trade route, Turing machine, Turing test, Upton Sinclair, urban planning, vertical integration, Victor Gruen, Watson beat the top human players on Jeopardy!, white flight, white picket fence, Whole Earth Catalog, working poor, Wunderkammern

He was, instead, a mathematician from MIT named Edward Thorp, who had come to Vegas not to break the bank but rather to test a brand-new device: the very first wearable computer ever designed. Thorp had an accomplice at the roulette table, standing unobserved at the other end, pretending not to know his partner. He would have been unrecognizable to the average casino patron, but he was in fact one of the most important minds of the postwar era: Claude Shannon, the father of information theory and one of the key participants in the invention of digital computers. Thorp had begun thinking about beating the odds at roulette as a graduate student in physics at UCLA in 1955. Unlike card games like blackjack or poker where strategy could make a profound difference in outcomes, roulette was supposed to be a game of pure chance; the ball was equally likely to end up on any number on the wheel.

Turing’s speculations form a kind of origin point for two parallel paths that would run through the rest of the century: building intelligence into computers by teaching them to play chess, and studying humans playing chess as a way of understanding our own intelligence. Those interpretative paths would lead to some extraordinary breakthroughs: from the early work on cybernetics and game theory from people like Claude Shannon and John von Neumann, to machines like IBM’s Deep Blue that could defeat grandmasters with ease. In cognitive science, the litany of insights that derived from the study of chess could almost fill an entire textbook, insights that have helped us understand the human capacity for problem solving, pattern recognition, visual memory, and the crucial skill that scientists call, somewhat awkwardly, chunking, which involves grouping a collection of ideas or facts into a single “chunk” so that they can be processed and remembered as a unit.

Impressed by Thorp’s blackjack system, Shannon inquired whether Thorp was working on anything else “in the gambling area.” Dormant for five years, Thorp’s roulette investigation was suddenly reawakened as the two men began a furious year of activity, seeking a predictable pattern in the apparent randomness of the roulette wheel.

[Photo: Claude Shannon with an electronic mouse]

In his old, rambling wooden house outside of Cambridge, Shannon had created a basement exploratorium that would have astounded Merlin and Babbage. Thorp later described it as a “gadgeteer’s paradise”: It had perhaps a hundred thousand dollars (about six hundred thousand 1998 dollars) worth of electronic, electrical and mechanical items.


pages: 622 words: 169,014

Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction by Alec Nevala-Lee

Albert Einstein, Apollo 11, basic income, Claude Shannon: information theory, computer age, Doomsday Clock, Elon Musk, experimental subject, Ford paid five dollars a day, heat death of the universe, lone genius, Neil Armstrong, Norbert Wiener, Norman Mailer, planetary scale, Ralph Waldo Emerson, RAND corporation, Ronald Reagan, Strategic Defense Initiative, the map is not the territory, the scientific method, universal basic income, Upton Sinclair

Campbell even reached out to Wiener himself, writing that his former professor would “be greatly interested” in dianetics “as suggesting a new direction of development of the work from the cybernetics side,” and concluding, “Further study of dianetics will be of immense aid in your projects.” He also contacted his neighbor, the mathematician Claude Shannon, who had founded the field of information theory at Bell Labs. Shannon encouraged Warren McCulloch to meet Hubbard: “If you read science fiction as avidly as I do you’ll recognize him as one of the best writers in that field. . . . [He] has been doing some very interesting work lately in using a modified hypnotic technique for therapeutic purposes. . . .

At that point, Hubbard took over, claiming to have spent eleven years on observations of “the medicine man of the Goldi people of Manchuria, the shamans of North Borneo, Sioux medicine men, the cults of Los Angeles, and modern psychology. . . . Odds and ends like these, countless odds and ends.” The article reflected this hodgepodge of influences, comparing the analytic mind to “a well-greased Univac,” alluding to demon circuits, and quoting Claude Shannon and Warren McCulloch. Hubbard provided no real description of the therapy itself, but he concluded with what might have been the motto of the postwar Astounding: “Up there are the stars. Down in the arsenal is an atom bomb. Which one is it going to be?” The issue also carried an advertisement for Dianetics: The Modern Science of Mental Health, which was scheduled to be released by Hermitage House on April 19.

It was exactly the attitude that had infuriated him in Parker, and his refusal to be pinned down undermined any attempts to investigate the subject seriously. Many were skeptical, and when a fan asked Campbell whether the Hieronymus Machine was a hoax, like Asimov’s articles about thiotimoline, the editor seemed horrified by the implication. There were inquiries from Bell Aircraft and the RAND Corporation, and Claude Shannon offered to test it, although the timing never worked out. Campbell soon moved on to other causes, and Hieronymus himself felt that the editor had set back acceptance of his work by a century. The symbolic machine, he said, functioned because the ink conducted lines of force, but when it came to serious research, it wasn’t worth “a tinker’s damn.”


pages: 238 words: 46

When Things Start to Think by Neil A. Gershenfeld

3D printing, Ada Lovelace, Bretton Woods, cellular automata, Charles Babbage, Claude Shannon: information theory, Computing Machinery and Intelligence, disinformation, Dynabook, Hedy Lamarr / George Antheil, I think there is a world market for maybe five computers, information security, invention of movable type, Iridium satellite, Isaac Newton, Jacquard loom, Johannes Kepler, John von Neumann, low earth orbit, means of production, new economy, Nick Leeson, packet switching, RFID, speech recognition, Stephen Hawking, Steve Jobs, telemarketer, the medium is the message, Turing machine, Turing test, Vannevar Bush, world market for maybe five computers

Since then computer chess has been studied by a who's who of computing pioneers who took it to be a defining challenge for what came to be known as Artificial Intelligence. It was thought that if a machine could win at chess it would have to draw on fundamental insights into how humans think. Claude Shannon, the inventor of Information Theory, which provides the foundation for modern digital communications, designed a simple chess program in 1949 and was able to get it running to play endgames. The first program that could play a full game of chess was developed at IBM in 1957, and an MIT computer won the first tournament match against a human player in 1967.

There is a disconnect between the breathless pronouncements of cyber gurus and the experience of ordinary people left perpetually upgrading hardware to meet the demands of new software, or wondering where their files have gone, or trying to understand why they can't connect to the network. The revolution so far has been for the computers, not the people. Digital data of all kinds, whether an e-mail message or a movie, is encoded as a string of 0's and 1's because of a remarkable discovery by Claude Shannon and John von Neumann in the 1940s. Prior to their work, it was obvious that engineered systems degraded with time and use. A tape recording sounds worse after it is duplicated, a photocopy is less satisfactory than an original, a telephone call becomes more garbled the farther it has to travel.
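Why regeneration beats analog copying can be shown in a few lines. In this toy sketch (mine, not Gershenfeld's; the noise level and threshold are arbitrary illustrative choices), the analog chain accumulates noise with every generation, while the digital chain snaps each value back to an exact 0 or 1:

```python
import random

def analog_copy(signal, noise=0.05):
    # Each analog generation adds a little random noise that is never removed.
    return [s + random.uniform(-noise, noise) for s in signal]

def digital_copy(signal, noise=0.05):
    # A digital copier suffers the same noise, but regenerates each value by
    # thresholding it back to an exact 0 or 1, so errors never accumulate
    # (as long as the noise stays below the decision threshold).
    return [round(s + random.uniform(-noise, noise)) for s in signal]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
analog, digital = bits, bits
for generation in range(100):
    analog = analog_copy(analog)
    digital = digital_copy(digital)

print(analog[:3])   # values have drifted away from exactly 0 and 1
print(digital[:3])  # still exactly [1, 0, 1]
```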

An important step came in 1929 when Leo Szilard reduced the problem to its essence with a single molecule that could be on either side of a partition. While he wasn't able to solve the demon paradox, this introduced the notion of a "bit" of information. Szilard's one-bit analysis of Maxwell's demon provided the inspiration for Claude Shannon's theory of information in 1948. Just as the steam engine powered the Industrial Revolution, electronic communications was powering an information revolution. And just as finding the capacity of a steam engine was a matter of some industrial import, the growing demand for communications links required an understanding of how many messages could be sent through a wire.


Lifespan: Why We Age—and Why We Don't Have To by David A. Sinclair, Matthew D. Laplante

Albert Einstein, Albert Michelson, Anthropocene, anti-communist, Any sufficiently advanced technology is indistinguishable from magic, Atul Gawande, basic income, Berlin Wall, Bernie Sanders, biofilm, Biosphere 2, blockchain, British Empire, caloric restriction, carbon footprint, Charles Babbage, Claude Shannon: information theory, clean water, creative destruction, CRISPR, dark matter, dematerialisation, discovery of DNA, double helix, Drosophila, Easter island, Edward Jenner, en.wikipedia.org, epigenetics, experimental subject, Fall of the Berlin Wall, Fellow of the Royal Society, global pandemic, Grace Hopper, helicopter parent, income inequality, invention of the telephone, Isaac Newton, John Snow's cholera map, Kevin Kelly, Khan Academy, labor-force participation, life extension, Louis Pasteur, McMansion, Menlo Park, meta-analysis, microbiome, mouse model, mutually assured destruction, Paul Samuelson, personalized medicine, phenotype, Philippa Foot, placebo effect, plutocrats, power law, quantum entanglement, randomized controlled trial, Richard Feynman, ride hailing / ride sharing, self-driving car, seminal paper, Skype, stem cell, Stephen Hawking, Steven Pinker, TED Talk, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Thomas Malthus, Tim Cook: Apple, Tragedy of the Commons, trolley problem, union organizing, universal basic income, WeWork, women in the workforce, zero-sum game

Worse still, information is lost as it’s copied. No one was more acutely disturbed by the problem of information loss than Claude Shannon, an electrical engineer from the Massachusetts Institute of Technology (MIT) in Boston. Having lived through World War II, Shannon knew firsthand how the introduction of “noise” into analog radio transmissions could cost lives. After the war, he wrote a short but profound scientific paper called “A Mathematical Theory of Communication” on how to preserve information, which many consider the foundation of Information Theory. If there is one paper that propelled us into the digital, wireless world in which we now live, that would be it.26 Shannon’s primary intention, of course, was to improve the robustness of electronic and radio communications between two points.
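Shannon's robustness idea can be seen in miniature with the crudest error-correcting code: send every bit three times and take a majority vote at the receiver. A toy sketch (my illustration, not anything from Shannon's paper or Sinclair's book; the 5% flip probability is an arbitrary choice):

```python
import random

def encode(bits):
    # Repetition code: transmit every bit three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits, flip_prob=0.05):
    # Flip each transmitted bit independently with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    # Majority vote over each group of three received bits.
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual errors: {errors} of {len(message)}")  # far fewer than 5%
```

With independent flips at probability p, a triple is decoded wrongly only when two or three of its copies flip, so the residual error rate falls from p to roughly 3p²; trading rate for reliability in this way is exactly the kind of exchange Shannon's paper made precise.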

EPIGENETIC REPROGRAMMING REGROWS OPTIC NERVES AND RESTORES EYESIGHT IN OLD MICE. The Information Theory of Aging predicts that aging is driven by a loss of epigenetic information rather than by a loss of genetic information in the form of mutations. When mice are infected with reprogramming genes called Oct4, Sox2, and Klf4, the age of their cells is reversed by the TET enzymes, which remove just the right methyl tags on DNA, reversing the clock of aging and allowing the cells to survive and grow like a newborn’s. How the enzymes know which tags are the youthful ones is a mystery. Solving that mystery would be the equivalent of finding Claude Shannon’s “observer,” the person who holds the original data.

What they show is that aging can be reset. The scratches on the DVD can be removed, and the original information can be recovered. Epigenomic noise is not a one-way street. But how might we reset the body without becoming a clone? In his 1948 publications about the preservation of information during data transmissions, Claude Shannon provided a valuable clue.10 In an abstract sense, he proposed that information loss is simply an increase in entropy, or the uncertainty of resolving a message, and provided brilliant equations to back his ideas up. His work stemmed from the mathematics of Harry Nyquist and Ralph Hartley, two other engineers at Bell Labs who, in the 1920s, revolutionized our understanding of information transmission.
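In modern notation (the standard textbook form, not a quotation from Sinclair's book), the "brilliant equations" reduce to the entropy of a source whose symbols occur with probabilities p_i:

```latex
H \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{bits per symbol},
\qquad
H_{\max} \;=\; \log_2 n \quad \text{when every } p_i = 1/n .
```

The equal-probability case recovers Hartley's 1928 measure, which is the sense in which Shannon's 1948 work stemmed from the earlier mathematics of Nyquist and Hartley.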


pages: 339 words: 57,031

From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism by Fred Turner

"World Economic Forum" Davos, 1960s counterculture, A Declaration of the Independence of Cyberspace, Alan Greenspan, Alvin Toffler, Apple's 1984 Super Bowl advert, back-to-the-land, Bill Atkinson, bioinformatics, Biosphere 2, book value, Buckminster Fuller, business cycle, Californian Ideology, classic study, Claude Shannon: information theory, complexity theory, computer age, Computer Lib, conceptual framework, Danny Hillis, dematerialisation, distributed generation, Douglas Engelbart, Douglas Engelbart, Dr. Strangelove, Dynabook, Electric Kool-Aid Acid Test, Fairchild Semiconductor, Ford Model T, From Mathematics to the Technologies of Life and Death, future of work, Future Shock, game design, George Gilder, global village, Golden Gate Park, Hacker Conference 1984, Hacker Ethic, Haight Ashbury, Herbert Marcuse, Herman Kahn, hive mind, Howard Rheingold, informal economy, intentional community, invisible hand, Ivan Sutherland, Jaron Lanier, John Gilmore, John Markoff, John Perry Barlow, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, Lewis Mumford, market bubble, Marshall McLuhan, mass immigration, means of production, Menlo Park, military-industrial complex, Mitch Kapor, Mondo 2000, Mother of all demos, new economy, Norbert Wiener, peer-to-peer, post-industrial society, postindustrial economy, Productivity paradox, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, reality distortion field, Richard Stallman, Robert Shiller, Ronald Reagan, Shoshana Zuboff, Silicon Valley, Silicon Valley ideology, South of Market, San Francisco, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, systems thinking, technoutopianism, Ted Nelson, Telecommunications Act of 1996, The Hackers Conference, the strength of weak ties, theory of mind, urban renewal, Vannevar Bush, We are as Gods, Whole Earth Catalog, Whole Earth Review, Yom Kippur War

In his book Cybernetics; or, Control and Communication in the Animal and the Machine, he defined cybernetics as a field focused on “the study of messages as a means of controlling machinery and society,” with machinery seeming to include, by analogy at least, biological organisms. For Wiener, the world, like the anti-aircraft predictor, was composed of systems linked by, and to some extent made out of, messages. Drawing on Claude Shannon’s information theory (published in 1948, but likely familiar to Wiener much earlier), Wiener defined messages as “forms of pattern and organization.”37 Like Shannon’s information, Wiener’s messages were surrounded by “noise,” yet they somehow maintained their integrity. So too did organisms and machines: incorporating and responding to feedback through structural mechanisms, Wiener explained, both kept themselves in a state of homeostasis.

Cybernetics, as the theory of control mechanisms in technology and nature and founded on the concepts of information and feedback, is but a part of a general theory of systems; cybernetic systems are a special case, however important, of systems showing self-regulation.” Bertalanffy, General System Theory, 3. For Bertalanffy, cybernetics was only one root of systems theory, albeit an important one. Others included the servomechanisms of the nineteenth century, Claude Shannon’s information theory, Von Neumann and Morgenstern’s game theory, and the increasing need in the post–World War II world to monitor and control large systems for social functions such as traffic and finance. For a critical analysis of the relationship between cybernetics and other systems theories, see Lilienfeld, Rise of Systems Theory. 44.

Sirius), 164 Goldstein, Emmanuel, 168, 169 Gore, Al, 219 Graham, Bill, 66 graphical user interface, 111 Grateful Dead, 13, 65, 66, 166 Great Society, 26 Greenblatt, Richard, 136 Greenspan, Alan, 215 Griesemer, James, 72 [ 319 ] Grooms, Red, 48 Gullichsen, Eric, 163 Gurdjieff, Georges Ivanovitch, Meetings with Remarkable Men, 187 hacker ethic, 134 –35, 136 hackers, 117, 132 –35, 133 Hackers’ Conference, 132, 137–38, 168, 169, 171, 219, 249, 254 hacking, as a free-speech issue, 169 Hafner, Katie, 143, 145, 221, 252 Hagel, John, Net Gain, 234 Haight-Ashbury, 32, 48, 66 – 67 Hapgood, Fred, 221–22 “happenings,” 48, 49, 67, 269n14 “hardware hackers,” 133 Harman, Willis, 61, 185, 274n12 Harper’s Magazine, 167 Hart, Pam, 117 Harvard Business Review, analysis of Out of Control, 204 –5 Harvey, David, 242 Hawken, Paul, 128, 185, 188 Hayles, Katherine, 26, 122 Hedgepeth, William, 77 Hefner, Christy, 211 Heims, Steve, 26, 122 Helmreich, Stefan, 198 Hermosillo, Carmen, 155 Herring, Susan, 152 Hertzfeld, Andy, 135 heterarchy, 156 Hewlett-Packard, 138 Hickman, Berry, 96 High Frontiers (’zine), 164 Hillis, Danny, 182, 183, 189 hippies, 32 Hiroshima, 16 Hitt, Jack, 167 Hofmann, Albert, 164 Hog Farm commune, 110 Holm, Richard, 44 Homebrew Computer Club, 70, 102, 106, 114 homeostat, 26, 146, 178 Horowitz, Ed, 208 Hoyt, Brad, 193 HTML code, 222 Hudson Institute, 186 Hudson Review, 47 human-machine collaboration, 108 –9, 111 hyperlinks, 213 [ 320 ] Index I Ching, 65, 82, 93 I. M. Pei, 178 Industry Standard, 207 information: economic paradox of, 136 –37; free dissemination of, 137 Information Processing Techniques Office, 108 Information Superhighway, 219 information system, material world imagined as, 15 information theory: and American art, 268n13; of Claude Shannon, 265n43; and microbiology, 43 – 44 Information Week, 131 Innis, Harold, 52, 269n21 Institute for Advanced Study, 185 Intel, 212 Intercontinental Ballistic Missile, 24 interdisciplinary migration, 58 International Federation for Advanced Study (IFAS), 61 Internet, 247; growth of, 160, 214; as infrastructure and symbol of new economic era, 7; as the New Millennium, 232 –36; privatization of backbone, 213; as symbol of a post-Fordist economic order, 202; utopian claims surrounding the emergence of the, 1–3, 33 Internet stocks, 214, 232 Inuit, 53 IT-290, 60 Jackson, Charles, 211 Jennings, Lois, 70 Jerome, Judson, 32 Jobs, Steve, 133, 138 Johnson, Lyndon, 26 Joselit, David, 46 journalism, shaping of public perceptions, 253 Joy, Bill, 220 juxtaposition, 84 Kahn, Herman, 130, 181, 186, 197 Kahn, Lloyd, 94, 95, 97 Kanter, Rosabeth Moss, 76, 271n9 Kapor, Mitch, 171–72, 218 Kaprow, Allan, 46, 48, 58, 67 Katz, Bruce, 211, 277n1 Kay, Alan, 111–13, 117, 177, 246 Kay, Lily, 44 Kelly, Kevin, 3, 16, 131–32; account of “vivisystems,” 200; commercial sphere as a site of social change, 202 –3; “computational metaphor,” 216; concept of “hive mind,” 201, 202, 204; doctrine of cyberevolutionism, 204; editorial model, 195; editor of Signal, 196; editor of Whole Earth Review, 177, 195 –96; as executive director of Wired, 7, 206, 209, 212, 217; and first Hackers’ Conference, 195; forum on hacking on the WELL, 168 –70; and Gilder, 223; and hacking community, 135; Internet as symbol of post-Fordist economy, 202; longing to return to an egalitarian world, 248; as network entrepreneur, 194 –99; “New Rules for the New Economy,” 15, 234 –35; Out of Control: The Rise of Neo-Biological Civilization, 176, 195, 199 –206; response to 1987 conference on artificial life, 199; review of 
Electric Word, 211; underplayed the work of embodied labor, 204; and the WELL, 148; on WELL design goals, 143 Keniston, Kenneth, 31; The Young Radicals, 261– 62 Kennedy, Alison (aka Queen Mu), 163 Kennedy, John F., 229, 271n10 Kent State University killings, 98, 118 Kepler’s bookstore, 70 Kerouac, Jack, 62 Kerr, Clark, 11, 12 Kesey, Ken: and geodesic dome, 94; leadership of Merry Pranksters, 63, 65, 67; and LSD, 61, 63; notion of Acid Test, 65; One Flew Over the Cuckoo’s Nest, 59 – 60, 64; rejection of agonistic politics, 64; subject of CIA experimental drug protocols, 60 – 61; and the Supplement, 81; and Trips Festival, 66; at Vietnam Day in 1965, 98 Keyworth, George, 222; “Magna Carta for the Knowledge Age,” 228 –30 Kleiner, Art, 131, 132, 135, 185 Kline, David, 287n37 Korzybski, Alfred, 62 Kravitz, Henry, 211 Kubrick, Stanley, 186 Kuhr, Barbara, 211, 285n2 Lama Foundation, 75, 76, 94, 97, 109, 119 Lampson, Butler, 111 Langton, Christopher, 198 Language Technology (magazine), 211 Lanier, Jaron, 163, 165, 172, 195 laser printer, 111 Index Last Whole Earth Catalog, 70, 81, 98, 112, 118 Learning Conferences, 181– 84 Leary, Timothy, 51, 163, 164, 165 legitimacy exchange, 25 –26, 84, 85, 88, 95, 250 Lehr, Stan, 210 Levy, Steven, 137, 139, 195; Hackers: Heroes of the Computer Revolution, 132 –35 Leyden, Peter, 233 –34 Libertarianism, 210, 249, 259, 287n49 Libre commune, 81, 94, 96, 109 Licklider, Joseph C.


pages: 137 words: 36,231

Information: A Very Short Introduction by Luciano Floridi

agricultural Revolution, Albert Einstein, bioinformatics, Bletchley Park, carbon footprint, Claude Shannon: information theory, Computing Machinery and Intelligence, conceptual framework, digital divide, disinformation, double helix, Douglas Engelbart, George Akerlof, Gordon Gekko, Gregor Mendel, industrial robot, information asymmetry, intangible asset, Internet of things, invention of writing, John Nash: game theory, John von Neumann, Laplace demon, machine translation, moral hazard, Nash equilibrium, Nelson Mandela, Norbert Wiener, Pareto efficiency, phenotype, Pierre-Simon Laplace, prisoner's dilemma, RAND corporation, RFID, Thomas Bayes, Turing machine, Vilfredo Pareto

The name for this branch of probability theory comes from Claude Shannon's seminal work. Shannon pioneered the field of mathematical studies of information and obtained many of its principal results, even though he acknowledged the importance of previous work done by other researchers and colleagues at Bell laboratories. After Shannon, MTC became known as information theory. Today, Shannon is considered 'the father of information theory', and the kind of information MTC deals with is often qualified as Shannon information. The term 'information theory' is an appealing but unfortunate label, which continues to cause endless misunderstandings.

Information is notorious for coming in many forms and having many meanings. It can be associated with several explanations, depending on the perspective adopted and the requirements and desiderata one has in mind. The father of information theory, Claude Shannon (1916-2001), for one, was very cautious: The word 'information' has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field.

Seligman, Information Flow: The Logic of Distributed Systems (Cambridge: Cambridge University Press, 1997). G. Bateson, Steps to an Ecology of Mind (Frogmore, St Albans: Paladin, 1973). T. M. Cover and J. A. Thomas, Elements of Information Theory (New York; Chichester: Wiley, 1991). F. I. Dretske, Knowledge and the Flow of Information (Oxford: Blackwell, 1981). D. S. Jones, Elementary Information Theory (Oxford: Clarendon Press, 1979). D. M. MacKay, Information, Mechanism and Meaning (Cambridge, MA: MIT Press, 1969). J. R. Pierce, An Introduction to Information Theory: Symbols, Signals and Noise, 2nd edn (New York: Dover Publications, 1980). A. M. Turing, 'Computing Machinery and Intelligence', Mind, 1950, 59, 433-60.


pages: 524 words: 120,182

Complexity: A Guided Tour by Melanie Mitchell

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Albert Michelson, Alfred Russel Wallace, algorithmic management, anti-communist, Arthur Eddington, Benoit Mandelbrot, bioinformatics, cellular automata, Claude Shannon: information theory, clockwork universe, complexity theory, computer age, conceptual framework, Conway's Game of Life, dark matter, discrete time, double helix, Douglas Hofstadter, Eddington experiment, en.wikipedia.org, epigenetics, From Mathematics to the Technologies of Life and Death, Garrett Hardin, Geoffrey West, Santa Fe Institute, Gregor Mendel, Gödel, Escher, Bach, Hacker News, Hans Moravec, Henri Poincaré, invisible hand, Isaac Newton, John Conway, John von Neumann, Long Term Capital Management, mandelbrot fractal, market bubble, Menlo Park, Murray Gell-Mann, Network effects, Norbert Wiener, Norman Macrae, Paul Erdős, peer-to-peer, phenotype, Pierre-Simon Laplace, power law, Ray Kurzweil, reversible computing, scientific worldview, stem cell, Stuart Kauffman, synthetic biology, The Wealth of Nations by Adam Smith, Thomas Malthus, Tragedy of the Commons, Turing machine

On the other hand, people want language to be unambiguous, which they can accomplish by using different words for similar but nonidentical meanings. Zipf showed mathematically that these two pressures working together could produce the observed power-law distribution. In the 1950s, Benoit Mandelbrot, of fractal fame, had a somewhat different explanation, in terms of information content. Following Claude Shannon’s formulation of information theory (cf. chapter 3), Mandelbrot considered a word as a “message” being sent from a “source” who wants to maximize the amount of information while minimizing the cost of sending that information. For example, the words feline and cat mean the same thing, but the latter, being shorter, costs less (or takes less energy) to transmit.
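Mandelbrot's cost argument has an exact algorithmic counterpart: an optimal prefix code assigns short codewords to frequent symbols and long ones to rare symbols. The sketch below (my illustration using Huffman's 1952 algorithm; nothing here is from Mitchell's book, and the frequency table is invented) computes the code length each "word" receives, with the frequent word getting the short code just as cat beats feline:

```python
import heapq

def huffman_code_lengths(freqs):
    """Return the code length each symbol gets under an optimal prefix code."""
    # Each heap entry: (total frequency, unique tiebreaker, {symbol: depth}).
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol in them one level deeper.
        merged = {sym: depth + 1 for sym, depth in {**a, **b}.items()}
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Frequent words get short codes, rare words long ones.
freqs = {"the": 60, "of": 30, "cat": 8, "feline": 2}
print(huffman_code_lengths(freqs))
# {'the': 1, 'of': 2, 'cat': 3, 'feline': 3}
```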

The nineteenth-century studies of thermodynamics were inspired and driven by the challenge of improving steam engines. The studies of information by mathematician Claude Shannon were likewise driven by the twentieth-century revolution in communications—particularly the development of the telegraph and telephone. In the 1940s, Shannon adapted Boltzmann’s ideas to the more abstract realm of communications. Shannon worked at Bell Labs, a part of the American Telephone and Telegraph Company (AT&T). One of the most important problems for AT&T was to figure out how to transmit signals more quickly and reliably over telegraph and telephone wires.

[Figure: Claude Shannon, 1916–2001. (Reprinted with permission of Lucent Technologies Inc.)]

A prime mover of this group was the mathematician Norbert Wiener, whose work on the control of anti-aircraft guns during World War II had convinced him that the science underlying complex systems in both biology and engineering should focus not on the mass, energy, and force concepts of physics, but rather on the concepts of feedback, control, information, communication, and purpose (or “teleology”).

[Figure: Norbert Wiener, 1894–1964 (AIP Emilio Segre Visual Archives)]

In addition to Norbert Wiener, the series of Macy Foundation conferences included several scientific luminaries of the time, such as John von Neumann, Warren McCulloch, Margaret Mead, Gregory Bateson, Claude Shannon, and W. Ross Ashby, among others. The meetings led Wiener to christen a new discipline of cybernetics, from the Greek word for “steersman”—that is, one who controls a ship. Wiener summed up cybernetics as “the entire field of control and communication theory, whether in the machine or in the animal.”


pages: 124 words: 36,360

Kitten Clone: Inside Alcatel-Lucent by Douglas Coupland

"World Economic Forum" Davos, British Empire, cable laying ship, Claude Shannon: information theory, cosmic microwave background, Downton Abbey, Golden arches theory, Great Leap Forward, Hibernia Atlantic: Project Express, hiring and firing, industrial research laboratory, Isaac Newton, Jeff Bezos, Marshall McLuhan, messenger bag, military-industrial complex, Neal Stephenson, oil shale / tar sands, pre–internet, quantum entanglement, Richard Feynman, Silicon Valley, Skype, Steve Jobs, tech worker, technological determinism, TED Talk, Turing machine, undersea cable, upwardly mobile, urban planning, UUNET, Wall-E

But we couldn’t have done anything too practical with the transistor unless someone figured out that information needs to be converted into ones and zeroes to be processed properly. That happened here, and was shared with the world in the July and October 1948 issues of the Bell System Technical Journal, in Claude Shannon’s seminal ‘A Mathematical Theory of Communication.’ In the next decades, lasers and optical fibre were developed to overcome the limitations of copper wiring; cellular-based wireless communication was developed to maximize the amount of information that could be passed within a system. An iPhone 4, if made with vacuum tubes, would fill one-quarter of the Grand Canyon.”

Cinder-block walls, ivory enamel paint. Solid creative energy. And yes, as Markus Hofmann told me earlier, the hallways are very narrow and very, very long. Two people passing each other have no choice but to say hello, and it’s easy to imagine the transistor’s 1947 inventor, William Shockley, bumping into Claude Shannon while absent-mindedly whistling an Andrews Sisters tune, then saying, “Claude, wouldn’t it be something if all information could be reduced to ones and zeroes?” Aside from the dimmed lighting (cost-cutting) and the blue recycling boxes outside some of the doors, there’s no tangible difference in these hallways from what would have been here six decades earlier.

Maybe we can control what we invent and when we invent it. Since 1925, Bell Labs has generated seven Nobel prizes and changed the course of humanity with stunning regularity. No California redwood forest full of brainstorming genius billionaires could compete with Bell Labs’ creative heyday of the mid-twentieth century: the transistor, information theory, lasers, solar energy, radio astronomy, microchips, UNIX, mobile phones, mobile networks—all invented here. Most of these inventions led to the Internet in one way or other, and many of them could plausibly only have been invented here, in the way they were. So much for technological determinism.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, algorithmic bias, AlphaGo, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, behavioural economics, Bletchley Park, blockchain, Boston Dynamics, brain emulation, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, complexity theory, computer vision, Computing Machinery and Intelligence, connected car, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, fake news, Flash crash, full employment, future of work, Garrett Hardin, Geoffrey Hinton, Gerolamo Cardano, Goodhart's law, Hans Moravec, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, luminiferous ether, machine readable, machine translation, Mark Zuckerberg, multi-armed bandit, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, OpenAI, openstreetmap, P = NP, paperclip maximiser, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, surveillance capitalism, Thales of Miletus, The Future of Employment, The Theory of the Leisure Class by Thorstein Veblen, Thomas Bayes, Thorstein Veblen, Tragedy of the Commons, transport as a service, trolley problem, Turing machine, Turing test, universal basic income, uranium enrichment, vertical integration, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, web application, zero-sum game

From then on, I would be publicly committed to the view that my own field of research posed a potential risk to my own species. How Did We Get Here? The roots of AI stretch far back into antiquity, but its “official” beginning was in 1956. Two young mathematicians, John McCarthy and Marvin Minsky, had persuaded Claude Shannon, already famous as the inventor of information theory, and Nathaniel Rochester, the designer of IBM’s first commercial computer, to join them in organizing a summer program at Dartmouth College. The goal was stated as follows: The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.

AI had, by all accounts, achieved a massive breakthrough. From the point of view of AI research, the match represented no breakthrough at all. Deep Blue’s victory, impressive as it was, merely continued a trend that had been visible for decades. The basic design for chess-playing algorithms was laid out in 1950 by Claude Shannon,1 with major improvements in the early 1960s. After that, the chess ratings of the best programs improved steadily, mainly as a result of faster computers that allowed programs to look further ahead. In 1994,2 Peter Norvig and I charted the numerical ratings of the best chess programs from 1965 onwards, on a scale where Kasparov’s rating was 2805.
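That 1950 design can be stated in a dozen lines: search the move tree to a fixed depth, score the leaf positions, and back the values up by minimax (Shannon's "type A" strategy). The sketch below is my illustration, not code from Russell's book; it runs on a trivial subtraction game rather than chess, and the parameter names (moves, apply_move, evaluate) are stand-ins. Chess differs only in the move generator and the evaluation function, and faster hardware simply permits a larger depth, which is the ratings trend just described:

```python
def minimax(state, depth, maximizing, moves, apply_move, evaluate):
    """Depth-limited minimax: the core of Shannon's 'type A' strategy."""
    legal = moves(state)
    if depth == 0 or not legal:
        # At the search horizon (or a dead end), fall back on static evaluation.
        return evaluate(state, maximizing)
    scores = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      moves, apply_move, evaluate) for m in legal)
    return max(scores) if maximizing else min(scores)

# Demo on a subtraction game: take 1 or 2 stones; taking the last stone wins.
moves = lambda pile: [1, 2] if pile > 0 else []
apply_move = lambda pile, take: pile - take
# An empty pile means the player now to move has already lost.
evaluate = lambda pile, maximizing: 0 if pile else (-1 if maximizing else 1)

print(minimax(4, 10, True, moves, apply_move, evaluate))  # 1: first player wins
```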

A very unfortunate incident with image labeling: Daniel Howley, “Google Photos mislabels 2 black Americans as gorillas,” Yahoo Tech, June 29, 2015. 72. Follow-up article on Google and gorillas: Tom Simonite, “When it comes to gorillas, Google Photos remains blind,” Wired, January 11, 2018. CHAPTER 3 1. The basic plan for game-playing algorithms was laid out by Claude Shannon, “Programming a computer for playing chess,” Philosophical Magazine, 7th ser., 41 (1950): 256–75. 2. See figure 5.12 of Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, 1st ed. (Prentice Hall, 1995). Note that the rating of chess players and chess programs is not an exact science.


pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

Bill Gates: Altair 8800, Charles Babbage, Claude Shannon: information theory, computer age, Dennis Ritchie, digital divide, Donald Knuth, Douglas Engelbart, Douglas Engelbart, Dynabook, Eratosthenes, Fairchild Semiconductor, Free Software Foundation, Gary Kildall, Grace Hopper, invention of the telegraph, Isaac Newton, Ivan Sutherland, Jacquard loom, James Watt: steam engine, John von Neumann, Joseph-Marie Jacquard, Ken Thompson, Louis Daguerre, millennium bug, Multics, Norbert Wiener, optical character recognition, popular electronics, Richard Feynman, Richard Stallman, Silicon Valley, Steve Jobs, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

John von Neumann wasn't the only person doing some major conceptual thinking about the nature of computers in the 1940s. Claude Shannon (born 1916) was another influential thinker. In Chapter 11, I discussed his 1938 master's thesis, which established the relationship between switches, relays, and Boolean algebra. In 1948, while working for Bell Telephone Laboratories, he published a paper in the Bell System Technical Journal entitled "A Mathematical Theory of Communication" that not only introduced the word bit in print but established a field of study today known as information theory. Information theory is concerned with transmitting digital information in the presence of noise (which usually prevents all the information from getting through) and how to compensate for that.

This bulb lights up if the switches describe a satisfactory cat. The switches shown in the control panel on page 104 are set for a female unneutered black cat. This satisfies your criteria, so the lightbulb is lit. Now all we have to do is design a circuit that makes this control panel work. You'll recall that Claude Shannon's thesis was entitled "A Symbolic Analysis of Relay and Switching Circuits." The relays he was referring to were quite similar to the telegraph relays that we encountered in Chapter 6. By the time of Shannon's paper, however, relays were being used for other purposes and, in particular, in the vast network of the telephone system.

In the second expression, the two operands are inverted and then combined with the Boolean OR operator. This is the same as combining the operands with the Boolean AND operator and then inverting (which is the NAND). De Morgan's Laws are an important tool for simplifying Boolean expressions and hence, for simplifying circuits. Historically, this was what Claude Shannon's paper really meant for electrical engineers. But obsessively simplifying circuits won't be a major concern in this book. It's preferable to get things working rather than to get things working as simply as possible. And what we're going to get working next is nothing less than an adding machine.
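De Morgan's Laws are small enough to verify exhaustively; this snippet (my addition, not Petzold's) checks both identities over the full truth table:

```python
from itertools import product

for a, b in product([False, True], repeat=2):
    # NOT (A AND B)  ==  (NOT A) OR (NOT B)   -- the NAND form
    assert (not (a and b)) == ((not a) or (not b))
    # NOT (A OR B)   ==  (NOT A) AND (NOT B)  -- the NOR form
    assert (not (a or b)) == ((not a) and (not b))

print("De Morgan's Laws hold for all inputs")
```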


The Fractalist by Benoit Mandelbrot

Albert Einstein, Benoit Mandelbrot, Brownian motion, business cycle, Claude Shannon: information theory, discrete time, double helix, financial engineering, Georg Cantor, Henri Poincaré, Honoré de Balzac, illegal immigration, Isaac Newton, iterative process, Johannes Kepler, John von Neumann, linear programming, Louis Bachelier, Louis Blériot, Louis Pasteur, machine translation, mandelbrot fractal, New Journalism, Norbert Wiener, Olbers’ paradox, Paul Lévy, power law, Richard Feynman, statistical model, urban renewal, Vilfredo Pareto

Turning to me, he continued, “It seems that you don’t believe me!” I responded, “Of course I do. I tailored it to my needs, and am delighted that they also fit yours.” In a serious vein, liaising was a good opportunity to scout for Ph.D. topics. At Caltech, I had read the seed papers in which Claude Shannon founded information theory, and I badly wanted to know more. A get-together in London on this topic attracted me greatly, so I asked if I could attend. The air force obliged and sent me there. It was my first scientific conference. An Extended Sentence? The end of my twelve months of duty was approaching, and I was counting days.

Therefore, RLE was often filled with either the aroma of a traditional chocolate factory or the stench of a rendering plant that boiled carrion into pure white soap. I took all this as constant confirmation that the process of creation is intrinsically messy and suffers more from soulless order than from surrounding physical decay. Controversial Balance Between Conjecture and Proof Claude Shannon (1916–2001) was the intellectual leader whose wartime work, published in 1948, created information theory and provided RLE with an intellectual backbone. His work on noiseless channels was a point of departure for the theory of word frequencies presented in my Ph.D. thesis. But far more impressive was his noisy channel theorem. Actually, it was not a theorem at all, only a brilliant conjecture—in a style that is controversial, and of which I eventually became a very active supplier.

The timing was ideal because several new developments that had been “bottled up” by war conditions were being revealed in a kind of fireworks I saw on no other occasion. My restless curiosity led me to read works that were widely discussed when they appeared: Mathematical Theory of Communication by Claude Shannon, Cybernetics, or Control and Communication in the Animal and the Machine by Norbert Wiener, and Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern. Except for a fleeting thought that I might return to mathematics in 1949 via the University of Chicago, I was beginning to think that the examples of Wiener and von Neumann might guide me to an idea big enough to make me, in some way, the Delbrück of a new field.


pages: 319 words: 90,965

The End of College: Creating the Future of Learning and the University of Everywhere by Kevin Carey

Albert Einstein, barriers to entry, Bayesian statistics, behavioural economics, Berlin Wall, Blue Ocean Strategy, business cycle, business intelligence, carbon-based life, classic study, Claude Shannon: information theory, complexity theory, data science, David Heinemeier Hansson, declining real wages, deliberate practice, discrete time, disruptive innovation, double helix, Douglas Engelbart, Downton Abbey, Drosophila, Fairchild Semiconductor, Firefox, Frank Gehry, Google X / Alphabet X, Gregor Mendel, informal economy, invention of the printing press, inventory management, John Markoff, Khan Academy, Kickstarter, low skilled workers, Lyft, Marc Andreessen, Mark Zuckerberg, meta-analysis, natural language processing, Network effects, open borders, pattern recognition, Peter Thiel, pez dispenser, Recombinant DNA, ride hailing / ride sharing, Ronald Reagan, Ruby on Rails, Sand Hill Road, self-driving car, Silicon Valley, Silicon Valley startup, social web, South of Market, San Francisco, speech recognition, Steve Jobs, technoutopianism, transcontinental railway, uber lyft, Vannevar Bush

Bill was enacted, the director of the national Office of Scientific Research and Development, Vannevar Bush, sent a report to President Truman titled Science: The Endless Frontier. Bush had a doctorate in electrical engineering from MIT, where he had served as a scientist and administrator. He and his colleagues had made important contributions to the emerging development of computer science; his student Claude Shannon helped develop the information theory that sits at the heart of modern computing. Science, Bush said, was a source of great good for humanity. Penicillin and other medical advances had saved countless lives. “In 1939 millions of people were employed in industries which did not even exist at the close of the last war—radio, air conditioning, rayon and other synthetic fibers, and plastics. . . .

Six months later, Simon attended a conference at Dartmouth College, where he and a small group of scientists gave a name to this new field of research: artificial intelligence. The study of the human mind and the exploding power of information technology were coming together, and the smartest people in the world were in the middle of the action. Among the Dartmouth participants was Claude Shannon, a former student of Vannevar Bush at MIT and one of the fathers of modern information theory. In addition to creating the blueprint for the Cold War university, Bush had also seen the future technological revolution. In a 1945 Atlantic article titled “As We May Think,” Bush observed that various fields of manufacturing and computing were on trajectories of improvement that would soon lead to changes that, although they might be unknowable in the specific, were highly predictable in general.

Until, that is, the summer of 2011, when a computer science professor named Sebastian Thrun had a kind of inspiration. Thrun was born in Germany and trained in computer science and statistics at the University of Bonn. His specialty was artificial intelligence, continuing the project first outlined at the Dartmouth conference by Herbert Simon, Claude Shannon, and others back in 1956. Carnegie Mellon hired Thrun as a professor in 1995, and he spent most of the next decade in Pittsburgh working at the intersection of computer science, statistics, and machines. Artificial intelligence had come in and out of fashion over the years. Initial hopes for replicating the human mind in silicon had proved highly optimistic as the parallel march of neuroscience and cognitive psychology revealed how fantastically complicated human cognition truly was.


pages: 294 words: 96,661

The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity by Byron Reese

"World Economic Forum" Davos, agricultural Revolution, AI winter, Apollo 11, artificial general intelligence, basic income, bread and circuses, Buckminster Fuller, business cycle, business process, Charles Babbage, Claude Shannon: information theory, clean water, cognitive bias, computer age, CRISPR, crowdsourcing, dark matter, DeepMind, Edward Jenner, Elon Musk, Eratosthenes, estate planning, financial independence, first square of the chessboard, first square of the chessboard / second half of the chessboard, flying shuttle, full employment, Hans Moravec, Hans Rosling, income inequality, invention of agriculture, invention of movable type, invention of the printing press, invention of writing, Isaac Newton, Islamic Golden Age, James Hargreaves, job automation, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, lateral thinking, life extension, Louis Pasteur, low interest rates, low skilled workers, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Mary Lou Jepsen, Moravec's paradox, Nick Bostrom, On the Revolutions of the Heavenly Spheres, OpenAI, pattern recognition, profit motive, quantum entanglement, radical life extension, Ray Kurzweil, recommendation engine, Rodney Brooks, Sam Altman, self-driving car, seminal paper, Silicon Valley, Skype, spinning jenny, Stephen Hawking, Steve Wozniak, Steven Pinker, strong AI, technological singularity, TED Talk, telepresence, telepresence robot, The Future of Employment, the scientific method, Timothy McVeigh, Turing machine, Turing test, universal basic income, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, women in the workforce, working poor, Works Progress Administration, Y Combinator

In addition to the computer’s memory, there might also be external storage to hold data and information not currently needed. Throw in input and output devices, and one has a von Neumann setup. If, when you were reading that, your brain mapped it to your computer’s CPU, memory, hard drive, keyboard, and monitor, then move to the head of the class. Finally, in 1949, Claude Shannon wrote a paper entitled “Programming a Computer for Playing Chess” in which he described a way to reduce chess to a series of calculations that could be performed on a computer. While this may not sound like it should earn Shannon one of the four spots on the Mount Rushmore of computer history, for the first time, in a practical and realistic way, computers were thought of as not just machines to perform mathematical calculations.

Has the Fourth Age begun yet? Well, when just a few humans learned to farm, or a few isolated places developed writing, did that mark the beginning of a new age or the beginning of the end of an old one? It matters very little where we draw the line. Whether it already happened decades ago when Claude Shannon explained how a computer could be programmed to play chess, or whether it will happen in a few years when a computer can carry on a complex conversation with a human using natural language, this kind of hairsplitting is not particularly meaningful. Let’s say that the transition began no earlier than 1950 and will complete no later than 2050.

The goal was to “find how to make machines use language, form abstractions and concepts, solve [the] kinds of problems now reserved for humans, and improve themselves.” To do this, he put together a group of four computer scientists who were already thinking about machines that could think: himself, Marvin Minsky, Nathaniel Rochester, and Claude Shannon, whom we met earlier when we discussed the first chess-playing computer program. Their proposal then added a very, shall we say, “optimistic” prediction, especially given that it was 1955: “We think that a significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for a summer.”


The Myth of Artificial Intelligence: Why Computers Can't Think the Way We Do by Erik J. Larson

AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Alignment Problem, AlphaGo, Amazon Mechanical Turk, artificial general intelligence, autonomous vehicles, Big Tech, Black Swan, Bletchley Park, Boeing 737 MAX, business intelligence, Charles Babbage, Claude Shannon: information theory, Computing Machinery and Intelligence, conceptual framework, correlation does not imply causation, data science, deep learning, DeepMind, driverless car, Elon Musk, Ernest Rutherford, Filter Bubble, Geoffrey Hinton, Georg Cantor, Higgs boson, hive mind, ImageNet competition, information retrieval, invention of the printing press, invention of the wheel, Isaac Newton, Jaron Lanier, Jeff Hawkins, John von Neumann, Kevin Kelly, Large Hadron Collider, Law of Accelerating Returns, Lewis Mumford, Loebner Prize, machine readable, machine translation, Nate Silver, natural language processing, Nick Bostrom, Norbert Wiener, PageRank, PalmPilot, paperclip maximiser, pattern recognition, Peter Thiel, public intellectual, Ray Kurzweil, retrograde motion, self-driving car, semantic web, Silicon Valley, social intelligence, speech recognition, statistical model, Stephen Hawking, superintelligent machines, tacit knowledge, technological singularity, TED Talk, The Coming Technological Singularity, the long tail, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, theory of mind, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, Yochai Benkler

To win at chess, it is not enough to apply the rules; you have to know which rules to select in the first place. Turing saw chess as a handy (and no doubt entertaining) way to think about machines and the possibility of giving them intuition. Across the Atlantic, the founder of modern information theory, Turing's colleague and friend Claude Shannon at Bell Labs, was also thinking about chess. He later built one of the first chess-playing computers, an extension of work he had done earlier on a proto-computer called the "differential analyzer," which could convert certain problems in calculus into mechanical procedures.1

THE SIMPLIFICATION OF INTELLIGENCE BEGINS

Chess fascinated Turing and his colleagues in part because it seemed that a computer could be programmed to play it, without the human programmer needing to know everything in advance.

The shift away from linguistics and rule-based approaches to data-driven or "empirical" methods seemed to liberate AI from those early, cloudy days of work on machine translation, when seemingly endless problems with capturing meaning and context plagued engineering efforts. In fact, machine translation itself was later cracked by a group of IBM researchers using a statistical (that is, not grammar-based) approach that was essentially an ingenious application of Claude Shannon's early work on information theory. Called the "noisy channel" approach, it viewed sentences from a source language (say, French) and a target language (say, English) as an information exchange in which bad translations constituted a form of noise, making it the system's task to reduce the noise in the translation channel between source and target sentences.
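In standard statistical machine translation notation (the usual statement of the IBM formulation; Larson gives no equations), decoding chooses the English sentence e most likely to have produced the observed French sentence f:

```latex
\hat{e} \;=\; \arg\max_{e} \, P(e \mid f)
        \;=\; \arg\max_{e} \, P(e)\,P(f \mid e)
```

Here P(e) is a language model rewarding fluent English, P(f|e) is the translation model playing the role of the channel noise, and P(f) drops out because it is constant across candidate translations.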

Bletchley, meanwhile, also proved a haven for thinking about computation: Bombes were machines, and they ran programs to solve problems that humans, by themselves, could not.

INTUITIVE MACHINES? NO.

For Turing, Bletchley played a major role in crystallizing his ideas about the possibility of intelligent machines. Like his colleagues Jack Good and Claude Shannon, Turing saw the power and utility of their "brain games" as cryptanalysts during the war: they could decipher messages that were otherwise completely opaque to the military. The new methods of computation were not just interesting for considering automated chess-playing. Computation could, quite literally, sink warships.


pages: 317 words: 101,074

The Road Ahead by Bill Gates, Nathan Myhrvold, Peter Rinearson

Albert Einstein, Apple's 1984 Super Bowl advert, Berlin Wall, Bill Gates: Altair 8800, Bob Noyce, Bonfire of the Vanities, business process, California gold rush, Charles Babbage, Claude Shannon: information theory, computer age, Donald Knuth, first square of the chessboard, first square of the chessboard / second half of the chessboard, glass ceiling, global village, informal economy, invention of movable type, invention of the printing press, invention of writing, John von Neumann, knowledge worker, medical malpractice, Mitch Kapor, new economy, packet switching, popular electronics, Richard Feynman, Ronald Reagan, SimCity, speech recognition, Steve Ballmer, Steve Jobs, Steven Pinker, Ted Nelson, telemarketer, the scientific method, The Wealth of Nations by Adam Smith, transaction costs, Turing machine, Turing test, Von Neumann architecture

This is why a computer's capacity to compress digital data, store or transmit it, then expand it back into its original form is so useful and will become more so. Quickly, here's how the computer accomplishes these feats. It goes back to Claude Shannon, the mathematician who in the 1930s recognized how to express information in binary form. During World War II, he began developing a mathematical description of information and founded a field that later became known as information theory. Shannon defined information as the reduction of uncertainty. By this definition, if you already know it is Saturday and someone tells you it is Saturday, you haven't been given any information.

It is hard to sort out the paternity of the modern computer, because much of the thinking and work was done in the United States and Britain during World War II under the cloak of wartime secrecy. Three major contributors were Alan Turing, Claude Shannon, and John von Neumann. In the mid-1930s, Alan Turing, like Babbage a superlative Cambridge-trained British mathematician, proposed what is known today as a Turing machine. It was his version of a completely general-purpose calculating machine that could be instructed to work with almost any kind of information. In the late 1930s, when Claude Shannon was still a student, he demonstrated that a machine executing logical instructions could manipulate information. His insight, the subject of his master's thesis, was about how computer circuits—closed for true and open for false—could perform logical operations, using the number 1 to represent "true" and 0 to represent "false."

By this definition, if you already know it is Saturday and someone tells you it is Saturday, you haven't been given any information. On the other hand, if you're not sure of the day and someone tells you it is Saturday, you've been given information, because your uncertainty has been reduced. Shannon's information theory eventually led to other breakthroughs. One was effective data compression, vital to both computing and communications. On the face of it what he said is obvious: Those parts of data that don't provide unique information are redundant and can be eliminated. Headline writers leave out nonessential words, as do people paying by the word to send a telegraph message or place a classified advertisement.
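Run-length encoding is perhaps the simplest working instance of that principle: a run of identical symbols is redundant beyond its first symbol and a count. A toy sketch (illustrative only, not from the book):

```python
from itertools import groupby

def rle_encode(text):
    # Collapse each run of identical characters to (character, run length).
    return [(ch, len(list(run))) for ch, run in groupby(text)]

def rle_decode(pairs):
    return "".join(ch * count for ch, count in pairs)

data = "AAAABBBCCDAAA"
packed = rle_encode(data)
print(packed)  # [('A', 4), ('B', 3), ('C', 2), ('D', 1), ('A', 3)]
assert rle_decode(packed) == data
```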


pages: 223 words: 52,808

Intertwingled: The Work and Influence of Ted Nelson (History of Computing) by Douglas R. Dechow

3D printing, Apple II, Bill Duvall, Brewster Kahle, Buckminster Fuller, Claude Shannon: information theory, cognitive dissonance, computer age, Computer Lib, conceptual framework, Douglas Engelbart, Dynabook, Edward Snowden, game design, HyperCard, hypertext link, Ian Bogost, information retrieval, Internet Archive, Ivan Sutherland, Jaron Lanier, knowledge worker, linked data, Marc Andreessen, Marshall McLuhan, Menlo Park, Mother of all demos, pre–internet, Project Xanadu, RAND corporation, semantic web, Silicon Valley, software studies, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, TED Talk, The Home Computer Revolution, the medium is the message, Vannevar Bush, Wall-E, Whole Earth Catalog

Xanadu is a daring design that presented awesome challenges: how to link to evolving documents, how to track changes to a document, how to manipulate linkages, how to organize archival storage, how to name the target of a link, how to track micro-copyright royalties, how to organize the physical storage of a universe of discourse and how to scale storage and processing around the world. Many people are still skeptical of the need for bi-directional links. I am one who suspects links might only occasionally need to be bi-directional, or that a pair of one-way links could simulate a bi-directional link. Claude Shannon’s popular demonstration of his computer-controlled maze-navigating mouse was essential to the success of his project. Shannon’s demonstration appeared as a segment in the television show, Time Machine: Robots [2]. Shannon went to a lot of trouble to prepare a tabletop maze and to eliminate any arm or cord connecting the mouse to the computer.
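The pair-of-one-way-links idea is straightforward to make concrete: keep a forward and a reverse index and update them together, so a backward query never requires a scan. A minimal sketch (a hypothetical structure of my own invention, not Xanadu's actual design):

```python
class LinkStore:
    """Simulate bi-directional links with two synchronized one-way indexes."""

    def __init__(self):
        self.forward = {}   # source -> set of targets
        self.backward = {}  # target -> set of sources

    def link(self, source, target):
        # One conceptual link, recorded in both directions.
        self.forward.setdefault(source, set()).add(target)
        self.backward.setdefault(target, set()).add(source)

    def links_from(self, doc):
        return self.forward.get(doc, set())

    def links_to(self, doc):
        # The "bi-directional" query: who points at this document?
        return self.backward.get(doc, set())

store = LinkStore()
store.link("noteA", "noteB")
print(store.links_to("noteB"))  # {'noteA'}
```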

Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited. References 1. Nelson TH (1974) Computer lib: you can and must understand computers now/dream Machines. Hugo’s Book Service, Chicago 2. Time Machine: Robots. History Channel. Aug.–Sept. 2000. Television. See segment on Claude Shannon’s 1952 Theseus maze-solving mouse (begins at 9:16 in the video). http://youtu.be/KmURvu4x0Do

Part III Hypertext and Ted Nelson-Influenced Research

[Figure: From Nelson, Computer Lib/Dream Machines (Courtesy of Theodor Holm Nelson)]

The problem of the relationship between coding and thinking has always been central to the work of Theodor Holm Nelson, and a key aspect of his influence both inside and outside computer fields has been his unwavering insistence on the epistemological consequences of this relationship, often discussed under the rubric he calls “systems humanism.” While there is every reason to read Nelson as a figure in the modern history of information theory and design, there are as many reasons to read him in the stream of the contemporary humanities. More concretely, there are excellent reasons to consider Nelson’s work—from his earliest efforts such as the literary journal, Nothing, through to his visionary samizdat manifesto, Computer Lib/Dream Machines, and his recent work reconceptualizing the spreadsheet—as much a guide to the universe of paper as to that of the screen.


pages: 174 words: 56,405

Machine Translation by Thierry Poibeau

Alignment Problem, AlphaGo, AltaVista, augmented reality, call centre, Claude Shannon: information theory, cloud computing, combinatorial explosion, crowdsourcing, deep learning, DeepMind, easy for humans, difficult for computers, en.wikipedia.org, geopolitical risk, Google Glasses, information retrieval, Internet of things, language acquisition, machine readable, machine translation, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, natural language processing, Necker cube, Norbert Wiener, RAND corporation, Robert Mercer, seminal paper, Skype, speech recognition, statistical model, technological singularity, Turing test, wikimedia commons

These propositions were the first step toward a global approach to automatic translation but were quickly recognized as too simplistic, particularly by Weaver. Weaver’s Memorandum The father of machine translation—and more generally of natural language processing—is unquestionably Warren Weaver. Along with Claude Shannon, he was the author of a mathematical model of communication in 1949. His proposal was very general and therefore applicable to many contexts. In Weaver and Shannon’s model, a message is first encoded by a source (which can be a human or a machine), sent, and then decoded by a receiver. For example, a message can be coded in Morse code, transmitted by radio, and then decoded in order to be comprehensible by a human.
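The Morse example is easy to make runnable; this toy encoder and decoder (my illustration of the Weaver and Shannon encode/transmit/decode pipeline, covering only four letters) shows a source coding a message and a receiver recovering it:

```python
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}
INVERSE = {code: letter for letter, code in MORSE.items()}

def encode(message):
    # The source encodes the message; "/" separates letters on the channel.
    return "/".join(MORSE[ch] for ch in message)

def decode(signal):
    # The receiver decodes the channel symbols back into letters.
    return "".join(INVERSE[code] for code in signal.split("/"))

sent = encode("SOS")
print(sent)  # .../---/...
assert decode(sent) == "SOS"
```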

The key focus of this short report was in fact translation needs: the usefulness of translation for relevant agencies—mostly the public sector and businesses related to security and defense; the report observes that the majority of requested translations are of negligible interest, and ultimately are either partially read or not read at all—and the costs associated with these translations. The discussion on machine translations takes up only a short five-page chapter. The Automatic Language Processing Advisory Committee (ALPAC) was directed by John R. Pierce, an information and communication theory specialist (he had worked with Claude Shannon in particular; see chapter 5). In addition to Pierce, the committee was made up of linguists, artificial intelligence specialists, and a psychologist. None of the committee members were working on machine translation at the time of the report, though two of the members (David G. Hays and Anthony G.

We must also keep in mind the relative lack of computers and their limited capabilities—at the time of punch cards—which drastically restricted possibilities for experimentation. However, Bar-Hillel’s report raised doubts not only for those funding the research, but also for researchers themselves. Several leading figures left the field at the beginning of the 1960s and moved to research in linguistics, computer science, or information theory. Certain researchers were even more negative than Bar-Hillel himself about machine translation. At the same time, the demonstration gave a glimpse of the numerous problems the first projects had underestimated. Georgetown’s and IBM’s attempts to industrialize practical solutions yielded very poor results.


pages: 180 words: 55,805

The Price of Tomorrow: Why Deflation Is the Key to an Abundant Future by Jeff Booth

3D printing, Abraham Maslow, activist fund / activist shareholder / activist investor, additive manufacturing, AI winter, Airbnb, Albert Einstein, AlphaGo, Amazon Web Services, artificial general intelligence, augmented reality, autonomous vehicles, basic income, bitcoin, blockchain, Bretton Woods, business intelligence, butterfly effect, Charles Babbage, Claude Shannon: information theory, clean water, cloud computing, cognitive bias, collapse of Lehman Brothers, Computing Machinery and Intelligence, corporate raider, creative destruction, crony capitalism, crowdsourcing, cryptocurrency, currency manipulation / currency intervention, dark matter, deep learning, DeepMind, deliberate practice, digital twin, distributed ledger, Donald Trump, Elon Musk, fiat currency, Filter Bubble, financial engineering, full employment, future of work, game design, gamification, general purpose technology, Geoffrey Hinton, Gordon Gekko, Great Leap Forward, Hyman Minsky, hype cycle, income inequality, inflation targeting, information asymmetry, invention of movable type, Isaac Newton, Jeff Bezos, John Maynard Keynes: Economic Possibilities for our Grandchildren, John von Neumann, Joseph Schumpeter, late fees, low interest rates, Lyft, Maslow's hierarchy, Milgram experiment, Minsky moment, Modern Monetary Theory, moral hazard, Nelson Mandela, Network effects, Nick Bostrom, oil shock, OpenAI, pattern recognition, Ponzi scheme, quantitative easing, race to the bottom, ride hailing / ride sharing, self-driving car, software as a service, technoutopianism, TED Talk, the long tail, the scientific method, Thomas Bayes, Turing test, Uber and Lyft, uber lyft, universal basic income, winner-take-all economy, X Prize, zero-sum game

In the test, a human evaluator would have a conversation with two others, one being a machine and one a human, and the test would be passed when the human evaluator could not distinguish between the human and machine—in short, when humans can’t distinguish artificial from real intelligence. Around the same time that Turing was publishing “Computing Machinery and Intelligence,” another eminent thinker named Claude Shannon (1916–2001) was breaking barriers that enabled many of the advances in computers and artificial intelligence that we now take for granted. Shannon was an American mathematician and one of the main architects of the Information Age. Although not as well known, his breakthroughs rival Albert Einstein’s in that he changed the way we think about information.

Karl Popper, as quoted by Mark Damazer, “In Our Time’s Greatest Philosopher Vote,” In Our Time (BBC 4). 47. “The Babbage Engine,” Computer History Museum. computerhistory.org/babbage. 48. Claude E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, 1948. 49. John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon, “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence,” August 31, 1955. Available at www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html. 50. Jack Copeland, “Biography of Turing,” AlanTuring.net, July 2000. alanturing.net/turing_archive/pages/Reference%20Articles/Bio%20of%20Alan%20Turing.html. 51.

By doing so, he invented a unit of measure for information, the bit. In partial messages, one bit of information cuts the number of possibilities in half for the receiver. A message that doesn’t reduce the possibilities for the receiver transmits zero bits of information. Because of Shannon’s information theory, for the first time, information became quantifiable. Measuring information and its growth became as easy as measuring anything else, and information processing, storage, and retrieval were born. As computers and storage of information made it possible to analyze more information, artificial intelligence research was born at a workshop at Dartmouth College in 1956.
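A small sketch of that halving property, assuming equally likely possibilities (the simplest case): a message that narrows n possibilities down to k conveys log2(n/k) bits, so one bit halves the set and a message that rules nothing out conveys zero. The helper below is illustrative, not from the book.

```python
import math

# Bits carried by a message that narrows `before` equally likely
# possibilities down to `after`.
def bits_conveyed(before: int, after: int) -> float:
    return math.log2(before / after)

print(bits_conveyed(8, 4))   # 1.0 bit: the possibilities were halved
print(bits_conveyed(8, 1))   # 3.0 bits: three halvings pin down the outcome
print(bits_conveyed(8, 8))   # 0.0 bits: nothing was ruled out
```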


pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity by Amy Webb

"Friedman doctrine" OR "shareholder theory", Ada Lovelace, AI winter, air gap, Airbnb, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, AlphaGo, Andy Rubin, artificial general intelligence, Asilomar, autonomous vehicles, backpropagation, Bayesian statistics, behavioural economics, Bernie Sanders, Big Tech, bioinformatics, Black Lives Matter, blockchain, Bretton Woods, business intelligence, Cambridge Analytica, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, Computing Machinery and Intelligence, CRISPR, cross-border payments, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, Demis Hassabis, Deng Xiaoping, disinformation, distributed ledger, don't be evil, Donald Trump, Elon Musk, fail fast, fake news, Filter Bubble, Flynn Effect, Geoffrey Hinton, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Herman Kahn, high-speed rail, Inbox Zero, Internet of things, Jacques de Vaucanson, Jeff Bezos, Joan Didion, job automation, John von Neumann, knowledge worker, Lyft, machine translation, Mark Zuckerberg, Menlo Park, move fast and break things, Mustafa Suleyman, natural language processing, New Urbanism, Nick Bostrom, one-China policy, optical character recognition, packet switching, paperclip maximiser, pattern recognition, personalized medicine, RAND corporation, Ray Kurzweil, Recombinant DNA, ride hailing / ride sharing, Rodney Brooks, Rubik’s Cube, Salesforce, Sand Hill Road, Second Machine Age, self-driving car, seminal paper, SETI@home, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart cities, South China Sea, sovereign wealth fund, speech recognition, Stephen Hawking, strong AI, superintelligent machines, surveillance capitalism, technological singularity, The Coming Technological Singularity, the long tail, theory of mind, Tim Cook: Apple, trade route, Turing machine, Turing test, uber lyft, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero day

There wasn’t a way to build a thinking machine—the processes, materials, and power weren’t yet available—and so the theory couldn’t be tested. The leap from theoretical thinking machines to computers that began to mimic human thought happened in the 1930s with the publication of two seminal papers: Claude Shannon’s “A Symbolic Analysis of Relay and Switching Circuits” and Alan Turing’s “On Computable Numbers, with an Application to the Entscheidungsproblem.” As an electrical engineering student at MIT, Shannon took an elective course in philosophy—an unusual diversion. Boole’s An Investigation of the Laws of Thought became the primary reference for Shannon’s thesis.
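The core of that thesis can be shown in a few lines: switches wired in series behave like AND, switches in parallel behave like OR, so relay circuits can evaluate Boolean expressions. A minimal sketch; the function names and the sample circuit are mine, not Shannon's notation.

```python
# Switches in series pass current only if both are closed (AND);
# switches in parallel pass current if either is closed (OR).
def series(a: bool, b: bool) -> bool:
    return a and b

def parallel(a: bool, b: bool) -> bool:
    return a or b

# An illustrative relay circuit computing (x AND y) OR (NOT x).
def circuit(x: bool, y: bool) -> bool:
    return parallel(series(x, y), not x)

for x in (False, True):
    for y in (False, True):
        print(x, y, "->", circuit(x, y))
```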

This would require his fellow researchers to observe cognition without spiritualism and to believe in the plausibility of intelligent machines that, unlike people, would make decisions in a nonconscious way. The Summer and Winter of AI In 1955, professors Marvin Minsky (mathematics and neurology) and John McCarthy (mathematics), along with Claude Shannon (a mathematician and cryptographer at Bell Labs) and Nathaniel Rochester (a computer scientist at IBM), proposed a two-month workshop to explore Turing’s work and the promise of machine learning. Their theory: if it was possible to describe every feature of human intelligence, then a machine could be taught to simulate it.17 But it was going to take a broad, diverse group of experts in many different fields.

Professors Allen Newell, Herbert Simon, and Cliff Shaw came up with a way to discover proofs of logical theorems and simulated the process by hand—a program they called Logic Theorist—at one of the general sessions. It was the first program to mimic the problem-solving skills of a human. (Eventually, it would go on to prove 38 of the first 52 theorems in Alfred North Whitehead and Bertrand Russell’s Principia Mathematica, a standard text on the foundations of mathematics.) Claude Shannon, who had several years earlier proposed teaching computers to play chess against humans, got the opportunity to show a prototype of his program, which was still under construction.20 McCarthy and Minsky’s expectations for groundbreaking advancements in AI didn’t materialize that summer at Dartmouth.


pages: 350 words: 98,077

Artificial Intelligence: A Guide for Thinking Humans by Melanie Mitchell

Ada Lovelace, AI winter, Alignment Problem, AlphaGo, Amazon Mechanical Turk, Apple's 1984 Super Bowl advert, artificial general intelligence, autonomous vehicles, backpropagation, Bernie Sanders, Big Tech, Boston Dynamics, Cambridge Analytica, Charles Babbage, Claude Shannon: information theory, cognitive dissonance, computer age, computer vision, Computing Machinery and Intelligence, dark matter, deep learning, DeepMind, Demis Hassabis, Douglas Hofstadter, driverless car, Elon Musk, en.wikipedia.org, folksonomy, Geoffrey Hinton, Gödel, Escher, Bach, I think there is a world market for maybe five computers, ImageNet competition, Jaron Lanier, job automation, John Markoff, John von Neumann, Kevin Kelly, Kickstarter, license plate recognition, machine translation, Mark Zuckerberg, natural language processing, Nick Bostrom, Norbert Wiener, ought to be enough for anybody, paperclip maximiser, pattern recognition, performance metric, RAND corporation, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Rodney Brooks, self-driving car, sentiment analysis, Silicon Valley, Singularitarianism, Skype, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, tacit knowledge, tail risk, TED Talk, the long tail, theory of mind, There's no reason for any individual to have a computer in his home - Ken Olsen, trolley problem, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, world market for maybe five computers

In graduate school in the mathematics department at Princeton, McCarthy had met a fellow student, Marvin Minsky, who shared his fascination with the potential of intelligent computers. After graduating, McCarthy had short-lived stints at Bell Labs and IBM, where he collaborated, respectively, with Claude Shannon, the inventor of information theory, and Nathaniel Rochester, a pioneering electrical engineer. Once at Dartmouth, McCarthy persuaded Minsky, Shannon, and Rochester to help him organize “a 2 month, 10 man study of artificial intelligence to be carried out during the summer of 1956.”1 The term artificial intelligence was McCarthy’s invention; he wanted to distinguish this field from a related effort called cybernetics.2 McCarthy later admitted that no one really liked the name—after all, the goal was genuine, not “artificial,” intelligence—but “I had to call it something, so I called it ‘Artificial Intelligence.’”3 The four organizers submitted a proposal to the Rockefeller Foundation asking for funding for the summer workshop.

In particular, the best-known reinforcement-learning successes have been in the domain of game playing. Applying reinforcement learning to games is the topic of the next chapter. 9 Game On Since the earliest days of AI, enthusiasts have been obsessed with creating programs that can beat humans at games. In the late 1940s, both Alan Turing and Claude Shannon, two founders of the computer age, wrote programs to play chess before there were even computers that could run their code. In the decades that followed, many a young game fanatic has been driven to learn to program in order to get computers to play their favorite game, whether it be checkers, chess, backgammon, Go, poker, or, more recently, video games.

Like Samuel’s checkers player before it, Deep Blue’s defeat of Kasparov spurred a significant increase in IBM’s stock price.16 This defeat also generated considerable consternation in the media about the implications for superhuman intelligence as well as doubts about whether humans would still be motivated to play chess. But in the decades since Deep Blue, humanity has adapted. As Claude Shannon wrote presciently in 1950, a machine that can surpass humans at chess “will force us either to admit the possibility of mechanized thinking or to further restrict our concept of thinking.”17 The latter happened. Superhuman chess playing is now seen as something that doesn’t require general intelligence.


pages: 505 words: 161,581

The Founders: The Story of Paypal and the Entrepreneurs Who Shaped Silicon Valley by Jimmy Soni

activist fund / activist shareholder / activist investor, Ada Lovelace, AltaVista, Apple Newton, barriers to entry, Big Tech, bitcoin, Blitzscaling, book value, business logic, butterfly effect, call centre, Carl Icahn, Claude Shannon: information theory, cloud computing, Colonization of Mars, Computing Machinery and Intelligence, corporate governance, COVID-19, crack epidemic, cryptocurrency, currency manipulation / currency intervention, digital map, disinformation, disintermediation, drop ship, dumpster diving, Elon Musk, Fairchild Semiconductor, fear of failure, fixed income, General Magic , general-purpose programming language, Glass-Steagall Act, global macro, global pandemic, income inequality, index card, index fund, information security, intangible asset, Internet Archive, iterative process, Jeff Bezos, Jeff Hawkins, John Markoff, Kwajalein Atoll, Lyft, Marc Andreessen, Mark Zuckerberg, Mary Meeker, Max Levchin, Menlo Park, Metcalfe’s law, mobile money, money market fund, multilevel marketing, mutually assured destruction, natural language processing, Network effects, off-the-grid, optical character recognition, PalmPilot, pattern recognition, paypal mafia, Peter Thiel, pets.com, Potemkin village, public intellectual, publish or perish, Richard Feynman, road to serfdom, Robert Metcalfe, Robert X Cringely, rolodex, Sand Hill Road, Satoshi Nakamoto, seigniorage, shareholder value, side hustle, Silicon Valley, Silicon Valley startup, slashdot, SoftBank, software as a service, Startup school, Steve Ballmer, Steve Jobs, Steve Jurvetson, Steve Wozniak, technoutopianism, the payments system, transaction costs, Turing test, uber lyft, Vanguard fund, winner-take-all economy, Y Combinator, Y2K

“To this day,” remarked a fraud analyst, Jeremy Roybal, “I still bleed PayPal blue.” * * * Many who ended up working at PayPal came to the company circuitously. This project emerged in a similar way. In the course of writing my last book—a biography of the late Dr. Claude Shannon, the founder of the field of information theory and one of the great, forgotten geniuses of the twentieth century—I examined his employer, Bell Laboratories. Bell Labs was the research arm of the Bell Telephone company, and as a group, Bell’s scientists and engineers won six Nobel Prizes and invented, among other things, touch-tone dialing, the laser, cellular networks, communications satellites, solar cells, and the transistor.

One day, Musk hoped X.com would serve as the “global center for all money” and store the world’s dollars, deutschmarks (soon to be euros), and yen in one place. To Musk, this trajectory wasn’t revolutionary—it was obvious. Musk thought about currencies “from an information theory standpoint,” a reference to the field founded by Dr. Claude Shannon in 1948. “Money is an information system,” he explained. “Most people think money has power in and of itself. But actually, it’s really just an information system, so that we don’t have to engage in barter and that we can time-shift value in the form of loans and equity and stuff like that.”

I hope you map out some new ones in your search, and if I’m still kicking when you do, drop me a line. I will join you in nerding out on the PayPal years with gusto. J.S. JIMMY SONI is an award-winning author. His book, A Mind at Play: How Claude Shannon Invented the Information Age, won the 2017 Neumann Prize, awarded by the British Society for the History of Mathematics for the best book on the history of mathematics for a general audience, and the Middleton Prize from the Institute of Electrical and Electronics Engineers (IEEE). His most recent work, Jane’s Carousel, completed with the late Jane Walentas, captured one woman’s remarkable twenty-five-year journey to restore a beloved carousel in Brooklyn Bridge Park.


pages: 272 words: 19,172

Hedge Fund Market Wizards by Jack D. Schwager

asset-backed security, backtesting, banking crisis, barriers to entry, Bear Stearns, beat the dealer, Bernie Madoff, Black-Scholes formula, book value, British Empire, business cycle, buy and hold, buy the rumour, sell the news, Claude Shannon: information theory, clean tech, cloud computing, collateralized debt obligation, commodity trading advisor, computerized trading, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, diversification, diversified portfolio, do what you love, Edward Thorp, family office, financial independence, fixed income, Flash crash, global macro, hindsight bias, implied volatility, index fund, intangible asset, James Dyson, Jones Act, legacy carrier, Long Term Capital Management, managed futures, margin call, market bubble, market fundamentalism, Market Wizards by Jack D. Schwager, merger arbitrage, Michael Milken, money market fund, oil shock, pattern recognition, pets.com, Ponzi scheme, private sector deleveraging, proprietary trading, quantitative easing, quantitative trading / quantitative finance, Reminiscences of a Stock Operator, Right to Buy, risk free rate, risk tolerance, risk-adjusted returns, risk/return, riskless arbitrage, Rubik’s Cube, Savings and loan crisis, Sharpe ratio, short selling, statistical arbitrage, Steve Jobs, systematic trading, technology bubble, transaction costs, value at risk, yield curve

For example, the basic strategy indicates standing with 16 if the dealer’s card is between two and six, and hitting otherwise. The basic strategy does not involve any card counting. 7The opening paragraph in the Wikipedia entry for Claude Shannon provides the following synopsis: Claude Elwood Shannon (April 30, 1916–February 24, 2001) was an American mathematician, electronics engineer, and cryptographer known as “the father of information theory.” Shannon is famous for having founded information theory with one landmark paper published in 1948. But he is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master’s student at MIT, he wrote a thesis demonstrating that electrical application of Boolean algebra could construct and resolve any logical, numerical relationship.
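The single basic-strategy rule quoted in that footnote translates directly into code. A sketch of just that fragment, not a full basic-strategy table:

```python
# With a hard 16, stand when the dealer shows 2-6, otherwise hit.
# This encodes only the one rule quoted above.
def play_hard_16(dealer_upcard: int) -> str:
    return "stand" if 2 <= dealer_upcard <= 6 else "hit"

for card in (2, 6, 7, 10):
    print("dealer shows", card, "->", play_hard_16(card))
```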

Track records such as Thorp’s prove conclusively that it is possible to beat the market and that the large group of economists who insist otherwise are choosing to believe theory over evidence.2 The contention that it is possible to beat the markets, however, does not say anything about the difficulty of the task. In fact, it is the difficulty in beating the market (the vast majority of market participants fail to do so) that helps create the illusion that markets are efficient. Thorp’s career encompasses an extraordinary number of first achievements: He co-developed (along with Claude Shannon) the first wearable computer that could be used to win at roulette. He developed the first blackjack betting strategy that provided a positive edge to the player, which he divulged in his global best seller, Beat the Dealer. The book changed the way casinos operate. Thorp along with Sheen Kassouf developed the first known systematic approach to trading warrants and other convertible securities (e.g., options, convertible bonds, convertible preferred stocks) by hedging them with offsetting stock positions, an approach they detailed in their book, Beat the Market.3 He was the first to formulate an option-pricing model that was equivalent to the Black-Scholes model.

The best way to do that was to get it published in the National Academy of Sciences, but you had to find a member who would submit the paper for you or else they wouldn’t take it. I researched the Cambridge area where I was located and found that there were two members. One member was an algebraist at Harvard who wouldn’t have any idea what I was talking about and probably wouldn’t have cared if he did. The other member was Claude Shannon at MIT. Shannon was a joint professor of mathematics and engineering, and one of only two Distinguished Professors at MIT. I went to his secretary and asked if I could get an appointment. She said, “He might see you for five minutes, but he doesn’t talk to people if he is not interested. So don’t expect more than a very brief interview.”


pages: 306 words: 82,765

Skin in the Game: Hidden Asymmetries in Daily Life by Nassim Nicholas Taleb

anti-fragile, availability heuristic, behavioural economics, Benoit Mandelbrot, Bernie Madoff, Black Swan, Brownian motion, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, cellular automata, Claude Shannon: information theory, cognitive dissonance, complexity theory, data science, David Graeber, disintermediation, Donald Trump, Edward Thorp, equity premium, fake news, financial independence, information asymmetry, invisible hand, knowledge economy, loss aversion, mandelbrot fractal, Mark Spitznagel, mental accounting, microbiome, mirror neurons, moral hazard, Murray Gell-Mann, offshore financial centre, p-value, Paradox of Choice, Paul Samuelson, Ponzi scheme, power law, precautionary principle, price mechanism, principal–agent problem, public intellectual, Ralph Nader, random walk, rent-seeking, Richard Feynman, Richard Thaler, Ronald Coase, Ronald Reagan, Rory Sutherland, Rupert Read, Silicon Valley, Social Justice Warrior, Steven Pinker, stochastic process, survivorship bias, systematic bias, tail risk, TED Talk, The Nature of the Firm, Tragedy of the Commons, transaction costs, urban planning, Yogi Berra

For, in the quarter millennium since an initial formulation of decision making under uncertainty by the mathematician Jacob Bernoulli, one that has since become standard, almost all people involved in the field have made the severe mistake of missing the effect of the difference between ensemble and time.fn1 Everyone? Not quite: every economist maybe, but not everyone: the applied mathematicians Claude Shannon and Ed Thorp, and the physicist J. L. Kelly of the Kelly Criterion got it right. They also got it in a very simple way. The father of insurance mathematics, the Swedish applied mathematician Harald Cramér, also got the point. And, more than two decades ago, practitioners such as Mark Spitznagel and myself built our entire business careers around it.
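The ensemble/time distinction is easy to see in simulation. Below is a minimal sketch with illustrative parameters (not a model from the book): a 50/50 gamble that multiplies wealth by 1.5 or 0.6. The expected value per round is 1.05, so the ensemble average grows, yet the typical per-round growth factor is sqrt(1.5 * 0.6), roughly 0.95, so almost every individual path decays.

```python
import random

random.seed(1)
ROUNDS, PLAYERS = 100, 100_000

finals = []
for _ in range(PLAYERS):
    wealth = 1.0
    for _ in range(ROUNDS):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    finals.append(wealth)

ensemble_mean = sum(finals) / PLAYERS      # pulled up by a few lucky paths
median = sorted(finals)[PLAYERS // 2]      # what a typical player experiences
print(f"ensemble mean: {ensemble_mean:.3g}")   # well above 1
print(f"median player: {median:.3g}")          # roughly 0.005: near ruin
```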

Unless one is a genius, that is, has the clarity of mind to see through the mud, or has a sufficiently profound command of probability theory to cut through the nonsense. Now, certifiably, Murray Gell-Mann is a genius (and, likely, Peters). Gell-Mann discovered the subatomic particles he himself called quarks (which got him the Nobel). Peters said that when he presented the idea to Gell-Mann, “he got it instantly.” Claude Shannon, Ed Thorp, J. L. Kelly, and Harald Cramér are, no doubt, geniuses—I can personally vouch for Thorp, who has an unmistakable clarity of mind combined with a depth of thinking that juts out in conversation. These people could get it without skin in the game. But economists, psychologists, and decision theorists have no geniuses among them (unless one counts the polymath Herb Simon, who did some psychology on the side), and odds are they never will.

All these risks add up, and the attitude of the subject reflects them all. Ruin is indivisible and invariant to the source of randomness that may cause it. Another common error in the psychology literature concerns what is called “mental accounting.” The Thorp, Kelly, and Shannon school of information theory requires that, for an investment strategy to be ergodic and eventually capture the return of the market, agents increase their risks as they are winning, but contract after losses, a technique called “playing with the house money.” In practice, it is done by threshold, for ease of execution, not complicated rules: you start betting aggressively whenever you have a profit, never when you have a deficit, as if a switch was turned on or off.


pages: 360 words: 85,321

The Perfect Bet: How Science and Math Are Taking the Luck Out of Gambling by Adam Kucharski

Ada Lovelace, Albert Einstein, Antoine Gombaud: Chevalier de Méré, beat the dealer, behavioural economics, Benoit Mandelbrot, Bletchley Park, butterfly effect, call centre, Chance favours the prepared mind, Claude Shannon: information theory, collateralized debt obligation, Computing Machinery and Intelligence, correlation does not imply causation, diversification, Edward Lorenz: Chaos theory, Edward Thorp, Everything should be made as simple as possible, Flash crash, Gerolamo Cardano, Henri Poincaré, Hibernia Atlantic: Project Express, if you build it, they will come, invention of the telegraph, Isaac Newton, Johannes Kepler, John Nash: game theory, John von Neumann, locking in a profit, Louis Pasteur, Nash equilibrium, Norbert Wiener, p-value, performance metric, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Richard Feynman, Ronald Reagan, Rubik’s Cube, statistical model, The Design of Experiments, Watson beat the top human players on Jeopardy!, zero-sum game

They eventually found him inside rolling marbles along the kitchen floor in the midst of an experiment to find out how far each would travel. After completing his PhD, Thorp headed east to work at the Massachusetts Institute of Technology. There he met Claude Shannon, one of the university’s academic giants. Over the previous decade, Shannon had pioneered the field of “information theory,” which revolutionized how data are stored and communicated; the work would later help pave the way for space missions, mobile phones, and the Internet. Thorp told Shannon about the roulette predictions, and the professor suggested they continue the work at his house a few miles outside the city.

After Thorp put together his winning blackjack system, he turned his attention to the problem of such bankroll management. Given a particular edge over the casino, what was the optimal amount to bet? He found the answer in a formula known as the Kelly criterion. The formula is named after John Kelly, a gunslinging Texan physicist who worked with Claude Shannon in the 1950s. Kelly argued that, in the long run, you should wager a percentage of your bankroll equal to your expected profit divided by the amount you’ll receive if you win. For the coin toss above, the Kelly criterion would be the expected payoff ($0.50) divided by the potential winnings ($2.00).
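With the numbers quoted above, the arithmetic is one line; a sketch using the book's formulation (expected profit divided by the payout on a win):

```python
# Kelly sizing as stated in the text: stake the fraction of bankroll
# equal to expected profit divided by the amount received on a win.
def kelly_fraction(expected_profit: float, win_payout: float) -> float:
    return expected_profit / win_payout

# The coin-toss example from the passage: $0.50 expected, $2.00 on a win.
print(f"bet {kelly_fraction(0.50, 2.00):.0%} of bankroll")  # 25%
```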



Concentrated Investing by Allen C. Benello

activist fund / activist shareholder / activist investor, asset allocation, barriers to entry, beat the dealer, Benoit Mandelbrot, Bob Noyce, Boeing 747, book value, business cycle, buy and hold, carried interest, Claude Shannon: information theory, corporate governance, corporate raider, delta neutral, discounted cash flows, diversification, diversified portfolio, Dutch auction, Edward Thorp, family office, fixed income, Henry Singleton, high net worth, index fund, John Bogle, John von Neumann, junk bonds, Louis Bachelier, margin call, merger arbitrage, Paul Samuelson, performance metric, prudent man rule, random walk, risk tolerance, risk-adjusted returns, risk/return, Robert Shiller, shareholder value, Sharpe ratio, short selling, survivorship bias, technology bubble, Teledyne, transaction costs, zero-sum game

Sure, there would be swings up and down, but over a large number of hands, that edge, properly exploited, would probably allow the player to beat the game. But how heavily should the player bet when he held an edge? And how should he bet when the deck was stacked against him? Thorp discussed his blackjack findings with MIT colleague Claude Shannon, the brilliant mathematician whose 1937 master’s thesis ushered in the age of the digital circuit and computer and whose landmark 1948 paper single-handedly invented information theory. Shannon’s 1948 work dealt with the transmission of a signal over a noisy line. The problem: Boosting a signal also boosts the noise. How then to transmit a message without losing its meaning due to the signal being misheard or unheard?
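The standard textbook answer to that question, and the simplest possible illustration of it, is a repetition code: send each bit several times and let the receiver take a majority vote. This sketch is generic coding-theory material, not Shannon's own construction; the 10% flip rate is an illustrative assumption.

```python
import random

random.seed(0)
FLIP = 0.10  # probability the noisy channel flips a bit

def encode(bits):
    return [b for b in bits for _ in range(3)]       # send each bit 3 times

def channel(bits):
    return [b ^ (random.random() < FLIP) for b in bits]

def decode(bits):
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(10_000)]
received = decode(channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
# A decoded bit is wrong only if 2 or 3 of its copies flipped:
# 3p^2(1-p) + p^3 = 2.8% here, versus 10% with no coding at all.
print(f"decoded bit error rate: {errors / len(message):.2%}")
```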

How could he justify risking such a titanic sum for such a minuscule payoff? Thorp had a secret. He employed a little-known formula for optimal position sizing—the mysteriously named Kelly Criterion. No matter how he plugged the details of the trade into Kelly’s formula, the answer was unavoidable: Bet the farm. Claude Shannon and Ed Thorp While working as a postdoctoral researcher at the Massachusetts Institute of Technology (MIT) in November 1960, Thorp submitted an abstract for a talk to the annual meeting of the American Mathematical Society. The abstract, Fortune’s Formula: The Game of Blackjack, described a method for beating the casino at blackjack.

What was the optimal bet size to maximize the bettor’s return? Kelly wondered if he could apply Shannon’s information theory to the problem. He discussed it with Shannon, who urged him to publish the work. Kelly’s paper appeared in the 1956 issue of the Bell System Technical Journal under the anodyne title, A New Interpretation of Information Rate. Kelly had wanted to call the paper Information Theory and Gambling, but some AT&T executives feared that the title and his discussion of a “private wire” would remind readers that AT&T used to lease wires to organized crime figures who ran wire services reporting racetrack results to bookies.


pages: 329 words: 88,954

Emergence by Steven Johnson

A Pattern Language, agricultural Revolution, AOL-Time Warner, Brewster Kahle, British Empire, Claude Shannon: information theory, complexity theory, Danny Hillis, Douglas Hofstadter, edge city, epigenetics, game design, garden city movement, Gödel, Escher, Bach, hive mind, Howard Rheingold, hypertext link, invisible hand, Jane Jacobs, Kevin Kelly, late capitalism, Lewis Mumford, Marshall McLuhan, mass immigration, Menlo Park, mirror neurons, Mitch Kapor, Murano, Venice glass, Naomi Klein, new economy, New Urbanism, Norbert Wiener, PalmPilot, pattern recognition, pez dispenser, phenotype, Potemkin village, power law, price mechanism, profit motive, Ray Kurzweil, SimCity, slashdot, social intelligence, Socratic dialogue, stakhanovite, Steven Pinker, The Death and Life of Great American Cities, The Wealth of Nations by Adam Smith, theory of mind, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, trickle-down economics, Turing machine, Turing test, urban planning, urban renewal, Vannevar Bush

Early in his visit to Bell Labs, Turing hit upon the idea of using another Bell invention, the Vocoder—later used by rock musicians such as Peter Frampton to combine the sounds of a guitar and the human voice—as a way of encrypting speech. (By early 1943, Turing’s ideas had enabled the first secure voice transmission to cross the Atlantic, unintelligible to German eavesdroppers.) Bell Labs was the home base for another genius, Claude Shannon, who would go on to found the influential discipline of information theory, and whose work had explored the boundaries between noise and information. Shannon had been particularly intrigued by the potential for machines to detect and amplify patterns of information in noisy communication channels—a line of inquiry that promised obvious value to a telephone company, but could also save thousands of lives in a war effort that relied so heavily on the sending and breaking of codes.

Second, problems of “disorganized complexity”: problems characterized by millions or billions of variables that can only be approached by the methods of statistical mechanics and probability theory. These tools helped explain not only the behavior of molecules in a gas, or the patterns of heredity in a gene pool, but also helped life insurance companies turn a profit despite their limited knowledge about any individual human’s future health. Thanks to Claude Shannon’s work, the statistical approach also helped phone companies deliver more reliable and intelligible long-distance service. But there was a third phase to this progression, and we were only beginning to understand it. “This statistical method of dealing with disorganized complexity, so powerful an advance over the earlier two-variable methods, leaves a great field untouched,” Weaver wrote.

Five years after his interactions with Turing, Shannon published a long essay in the Bell System Technical Journal that was quickly repackaged as a book called The Mathematical Theory of Communication. Dense with equations and arcane chapter titles such as “Discrete Noiseless Systems,” the book managed to become something of a cult classic, and the discipline it spawned—information theory—had a profound impact on scientific and technological research that followed, on both a theoretical and practical level. The Mathematical Theory of Communication contained an elegant, layman’s introduction to Shannon’s theory, penned by the esteemed scientist Warren Weaver, who had early on grasped the significance of Shannon’s work.


The End of Accounting and the Path Forward for Investors and Managers (Wiley Finance) by Feng Gu

active measures, Affordable Care Act / Obamacare, Alan Greenspan, barriers to entry, book value, business cycle, business process, buy and hold, carbon tax, Claude Shannon: information theory, Clayton Christensen, commoditize, conceptual framework, corporate governance, creative destruction, Daniel Kahneman / Amos Tversky, discounted cash flows, disruptive innovation, diversified portfolio, double entry bookkeeping, Exxon Valdez, financial engineering, financial innovation, fixed income, geopolitical risk, hydraulic fracturing, index fund, information asymmetry, intangible asset, inventory management, Joseph Schumpeter, junk bonds, Kenneth Arrow, knowledge economy, moral hazard, new economy, obamacare, quantitative easing, quantitative trading / quantitative finance, QWERTY keyboard, race to the bottom, risk/return, Robert Shiller, Salesforce, shareholder value, Steve Jobs, tacit knowledge, The Great Moderation, value at risk

Simply, because as far as information usefulness is concerned, newness and timeliness are of the essence. This is a subtle issue that requires elaboration. Pardon the following brief tutorial, aimed at clarifying an important information principle, central to information (communication) theory, which was developed in the 1940s by Claude Shannon and Warren Weaver and played an important role in the development of computers and communication systems.1 The theory provides a measure of the amount of information conveyed by a message. For example: “It will start raining at 3:00 p.m. tomorrow.” This measure is based on the extent of surprise, or unexpectedness of the message to the receiver.
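Stated symbolically (anticipating note 2 below), and writing the posterior probability over the prior, the usual sign convention, so that informative messages come out positive; the symbols p and q are mine:

```latex
% Information conveyed by a message, in bits (base-2 logarithm), where
% p is the probability of the event before the message (the prior) and
% q is the probability after it (the posterior):
I = \log_2 \frac{q}{p}
% Example: if rain at 3:00 p.m. tomorrow was judged 1/8 likely and the
% forecast makes it certain (q = 1), then I = log2(8) = 3 bits.
% A message that leaves the probability unchanged (q = p) conveys 0 bits.
```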

., we exclude managers’ forecasts that occur on the same day as quarterly earnings are announced). Sample firms included in Figure 4.1 are all US-listed companies with the required data, obtained from Compustat, CRSP, I/B/E/S First Call, and the S&P SEC Filings Database. Worse Than at First Sight 49 NOTES 1. Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Champaign–Urbana: University of Illinois Press, 1949). 2. Mathematically, the amount of information conveyed by a message is measured in communication theory by the logarithm of the ratio of the prior (before the message was received) to the posterior (after the message reception) probabilities of the event (e.g., rain at 3:00 pm tomorrow) occurring.



pages: 268 words: 109,447

The Cultural Logic of Computation by David Golumbia

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, American ideology, Benoit Mandelbrot, Bletchley Park, borderless world, business process, cellular automata, citizen journalism, Claude Shannon: information theory, computer age, Computing Machinery and Intelligence, corporate governance, creative destruction, digital capitalism, digital divide, en.wikipedia.org, finite state, folksonomy, future of work, Google Earth, Howard Zinn, IBM and the Holocaust, iterative process, Jaron Lanier, jimmy wales, John von Neumann, Joseph Schumpeter, late capitalism, Lewis Mumford, machine readable, machine translation, means of production, natural language processing, Norbert Wiener, One Laptop per Child (OLPC), packet switching, RAND corporation, Ray Kurzweil, RFID, Richard Stallman, semantic web, Shoshana Zuboff, Slavoj Žižek, social web, stem cell, Stephen Hawking, Steve Ballmer, Stewart Brand, strong AI, supply-chain management, supply-chain management software, technological determinism, Ted Nelson, telemarketer, The Wisdom of Crowds, theory of mind, Turing machine, Turing test, Vannevar Bush, web application, Yochai Benkler

Weaver’s combinatoric argument fails to address Wiener’s chief points, namely that human language is able to manage ambiguity and approximation in a way quite different from the way that computers handle symbols. The persistent belief that philosophical skeptics must be wrong about the potential for machine translation is characteristic of computational thinking from the 1950s to the present. Only Claude Shannon himself—again, a dedicated scientist and engineer with limited experience in the study of language—is accorded authority by Weaver, so that “only Shannon himself, at this stage, can be a good judge of the possibilities in this direction”; remarkably, Weaver suggests that “a book written in Chinese is simply a book written in English which was coded into the ‘Chinese Code’ ” (22).

While it is no doubt inevitable that forms of long-distance communication would develop in any plausible human history, I am not persuaded that the exact forms of the telephone, telegraph, etc., are metaphysically necessary. Computation hovers provocatively between invention and discovery. Perhaps some of the most extreme computer scientists (Ray Kurzweil, Stephen Wolfram, Claude Shannon, Konrad Zuse) believe that digital computation in particular is a fundamental part of the physical universe; certainly there is at least some interesting evidence to support this view. At the same time, it seems equally if not more plausible that a wide range of calculating, quasilogical, and simulative mechanisms exist in the physical world, and that digital computation is simply one means of replicating some of these phenomena, perhaps the means that is most available to us for cultural reasons—in other words, we found digital computation because our society is already so oriented toward binarisms, hierarchy, and instrumental rationality.

Paul Edwards reports on a telling moment in the history of these laboratories: [George] Miller himself marks the year 1956, when he returned to Harvard, as the great transition. In that year his studies of language, information theory, and behavior crystallized into a new research paradigm. In an unpublished essay, Miller recounts his experience of the second Symposium on Information Theory, held at MIT on September 10–12, 1956. There he had his first realization, “more intuitive than rational, that human experimental psychology, theoretical linguistics, and the computer simulation of cognitive processes were all pieces from a larger whole.”


pages: 665 words: 159,350

Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy, and Everything Else by Jordan Ellenberg

Albert Einstein, AlphaGo, Andrew Wiles, autonomous vehicles, British Empire, Brownian motion, Charles Babbage, Claude Shannon: information theory, computer age, coronavirus, COVID-19, deep learning, DeepMind, Donald Knuth, Donald Trump, double entry bookkeeping, East Village, Edmond Halley, Edward Jenner, Elliott wave, Erdős number, facts on the ground, Fellow of the Royal Society, Geoffrey Hinton, germ theory of disease, global pandemic, government statistician, GPT-3, greed is good, Henri Poincaré, index card, index fund, Isaac Newton, Johannes Kepler, John Conway, John Nash: game theory, John Snow's cholera map, Louis Bachelier, machine translation, Mercator projection, Mercator projection distort size, especially Greenland and Africa, Milgram experiment, multi-armed bandit, Nate Silver, OpenAI, Paul Erdős, pets.com, pez dispenser, probability theory / Blaise Pascal / Pierre de Fermat, Ralph Nelson Elliott, random walk, Rubik’s Cube, self-driving car, side hustle, Snapchat, social distancing, social graph, transcontinental railway, urban renewal

ON can’t go to just any bigram; what follows has to be a bigram starting with N. (The most common follow-up, Norvig’s tables tell us, is NS, which happens 14.7% of the time, followed by NT at 11.3%.) This gives a yet more refined picture of the structure of English text. It was the engineer and mathematician Claude Shannon who first realized that the Markov chain could be used not only to analyze text, but to generate it. Suppose you want to produce a passage of text with the same statistical properties as written English, and it starts with ON. Then you can use a random number generator to select the next letter; there should be a 14.7% chance it is S, an 11.3% chance it is T, and so on.
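As a sketch, here is that sampling procedure in miniature. The NS and NT probabilities following N are the ones quoted above from Norvig's tables; the rest of this toy transition table is invented filler, just enough to keep the chain running.

```python
import random

# Bigram Markov chain: sample the next letter from the distribution of
# letters observed to follow the current one.
FOLLOW = {
    "N": [("S", 0.147), ("T", 0.113), ("D", 0.104), ("E", 0.636)],
    "S": [("T", 0.5), ("E", 0.5)],
    "T": [("H", 0.6), ("E", 0.4)],
    "E": [("N", 0.5), ("S", 0.5)],
    "D": [("E", 1.0)],
    "H": [("E", 1.0)],
}

def generate(start: str, length: int) -> str:
    text = start
    while len(text) < length:
        letters, weights = zip(*FOLLOW[text[-1]])
        text += random.choices(letters, weights)[0]
    return text

random.seed(3)
print(generate("ON", 40))   # statistically English-flavored gibberish
```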

But all the outputs somehow do sound like they come from the book you’re reading, which, let me tell you, is somewhat unsettling for the human being writing the book, even when the sentences make no literal sense at all, as in this GPT-3 output: If you’re familiar with the concept of Bayes’ theorem, then this should be easy for you. If there’s a 50% chance that the next word will be “the” and a 50% chance that it’ll be “geometry,” then the probability that the next word is either “the geometry” or “graupel” is (50/50)2 = 0. There’s a really big difference between this problem and Shannon’s text machine. Imagine a Claude Shannon with a much bigger library, trying to produce English sentences using this method, starting with five hundred words of what you’ve just read. He looks through his books until he finds one where those exact words appear in that exact order, so that he can record what word comes next. But of course he doesn’t find one!

Nim with two piles of two stones is a loss. Connect Four is a win. (Pretty dispiriting, sis!) But we don’t know whether chess is a win, a loss, or a draw. We may never know. The tree of chess has many, many leaves. We don’t know exactly how many, but it’s more than an eight-foot robot can contemplate, that’s for sure. Claude Shannon, who we last saw generating faux English text with a Markov chain, also wrote one of the first papers to take machine chess seriously; he thought the number of leaves was on the order of 1 with 120 zeroes after it, a hundred million trillion googols. That’s more than the number of . . . okay, actually, it’s more than the number of anything in the universe, and it is certainly not a number of things you’re going to comb through one by one and write little W’s, L’s, and D’s next to.
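For a game as small as Nim, the labeling procedure just described can be run to completion. A minimal sketch, assuming the normal-play convention (taking the last stone wins); it confirms that two piles of two stones is a loss for the player to move.

```python
from functools import lru_cache

# Label each position W or L for the player to move: a position is W
# exactly when some legal move leads to a position labeled L.
@lru_cache(maxsize=None)
def label(piles):
    moves = [
        tuple(sorted(piles[:i] + (piles[i] - take,) + piles[i + 1:]))
        for i, pile in enumerate(piles)
        for take in range(1, pile + 1)
    ]
    if not moves:                 # no stones left: the previous mover won
        return "L"
    return "W" if any(label(m) == "L" for m in moves) else "L"

print(label((2, 2)))   # L: two piles of two stones is a loss
print(label((1, 2)))   # W
```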


pages: 337 words: 103,522

The Creativity Code: How AI Is Learning to Write, Paint and Think by Marcus Du Sautoy

3D printing, Ada Lovelace, Albert Einstein, algorithmic bias, AlphaGo, Alvin Roth, Andrew Wiles, Automated Insights, Benoit Mandelbrot, Bletchley Park, Cambridge Analytica, Charles Babbage, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, crowdsourcing, data is the new oil, data science, deep learning, DeepMind, Demis Hassabis, Donald Trump, double helix, Douglas Hofstadter, driverless car, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, Flash crash, Gödel, Escher, Bach, Henri Poincaré, Jacquard loom, John Conway, Kickstarter, Loebner Prize, machine translation, mandelbrot fractal, Minecraft, move 37, music of the spheres, Mustafa Suleyman, Narrative Science, natural language processing, Netflix Prize, PageRank, pattern recognition, Paul Erdős, Peter Thiel, random walk, Ray Kurzweil, recommendation engine, Rubik’s Cube, Second Machine Age, Silicon Valley, speech recognition, stable marriage problem, Turing test, Watson beat the top human players on Jeopardy!, wikimedia commons

When we listen to music or explore creative mathematics, we are being exposed to the purest forms of structure and our bodies respond emotionally, to mark the recognition of this structure against the white noise of everyday life. What accounts for the difference we perceive between a random sequence of notes and a sequence we regard as music? According to the work of Claude Shannon, the father of information theory, part of our response comes down to the fact that a non-random sequence has some algorithm at its base that can compress the data, while the random sequence does not. Music is distinct from noise by virtue of its underlying algorithms. The question is which algorithms will make music that humans feel is worth listening to?
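A crude way to watch that distinction in action is to hand both kinds of sequence to a general-purpose compressor: it can exploit the algorithm underlying a repeating pattern but finds almost nothing to squeeze in noise. A sketch, with bytes standing in for notes; zlib is only a stand-in for the idea of algorithmic compressibility, not a measure of musicality.

```python
import random
import zlib

random.seed(0)
n = 10_000
pattern = bytes((i % 8) * 16 for i in range(n))           # a repeating "riff"
noise = bytes(random.randrange(256) for _ in range(n))    # white noise

print("pattern:", len(zlib.compress(pattern)), "bytes")   # tiny: structure found
print("noise:  ", len(zlib.compress(noise)), "bytes")     # ~10,000: incompressible
```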

And as more stones were laid down on the board, the game seemed to get more complicated, unlike chess, where as pieces are gradually removed the game starts to simplify. The American Go Association estimates that it would take a number with 300 digits to count the number of games of Go that are legally possible. In chess the computer scientist Claude Shannon estimated that a number with 120 digits (now called the Shannon number) would suffice. These are not small numbers in either case, but they give you a sense of the wide range of possible permutations. I had played a lot of chess as a kid. I enjoyed working through the logical consequences of a proposed move.



pages: 407 words: 104,622

The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution by Gregory Zuckerman

affirmative action, Affordable Care Act / Obamacare, Alan Greenspan, Albert Einstein, Andrew Wiles, automated trading system, backtesting, Bayesian statistics, Bear Stearns, beat the dealer, behavioural economics, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, Black Monday: stock market crash in 1987, blockchain, book value, Brownian motion, butter production in bangladesh, buy and hold, buy low sell high, Cambridge Analytica, Carl Icahn, Claude Shannon: information theory, computer age, computerized trading, Credit Default Swap, Daniel Kahneman / Amos Tversky, data science, diversified portfolio, Donald Trump, Edward Thorp, Elon Musk, Emanuel Derman, endowment effect, financial engineering, Flash crash, George Gilder, Gordon Gekko, illegal immigration, index card, index fund, Isaac Newton, Jim Simons, John Meriwether, John Nash: game theory, John von Neumann, junk bonds, Loma Prieta earthquake, Long Term Capital Management, loss aversion, Louis Bachelier, mandelbrot fractal, margin call, Mark Zuckerberg, Michael Milken, Monty Hall problem, More Guns, Less Crime, Myron Scholes, Naomi Klein, natural language processing, Neil Armstrong, obamacare, off-the-grid, p-value, pattern recognition, Peter Thiel, Ponzi scheme, prediction markets, proprietary trading, quantitative hedge fund, quantitative trading / quantitative finance, random walk, Renaissance Technologies, Richard Thaler, Robert Mercer, Ronald Reagan, self-driving car, Sharpe ratio, Silicon Valley, sovereign wealth fund, speech recognition, statistical arbitrage, statistical model, Steve Bannon, Steve Jobs, stochastic process, the scientific method, Thomas Bayes, transaction costs, Turing machine, Two Sigma

One year, he received eight As in a single semester and a 4.9 grade point average (on a 5.0 scale), weighed down by a single C in humanities. After winning a prestigious mathematics competition in his senior year to become a Putnam Fellow, Berlekamp began a PhD program at MIT. He focused on electrical engineering, studying with Peter Elias and Claude Shannon. Elias and Shannon were pioneers of information theory, the groundbreaking approach to quantifying, encoding, and transmitting telephone signals, text, pictures, and other kinds of information that would provide the underpinnings for computers, the internet, and all digital media. One afternoon, Shannon passed Berlekamp in the school’s hallway.

Rather than make a fortune trading himself, Rosenberg sold computerized programs to help other investors forecast stock behavior. Edward Thorp became the first modern mathematician to use quantitative strategies to invest sizable sums of money. Thorp was an academic who had worked with Claude Shannon, the father of information theory, and embraced the proportional betting system of John Kelly, the Texas scientist who had influenced Elwyn Berlekamp. First, Thorp applied his talents to gambling, gaining prominence for his large winnings as well as his bestselling book, Beat the Dealer. The book outlined Thorp’s belief in systematic, rules-based gambling tactics, as well as his insight that players can take advantage of shifting odds within games of chance.

To illustrate his ideas, Kelly developed a method he had devised to profit at the racetrack. Kelly’s system proposed ideal bets if one somehow obtained enough information to disregard the posted odds and could instead rely on a more accurate set of probabilities—the “true odds” for each race. Kelly’s formula had grown out of Shannon’s earlier work on information theory. Spending evenings at Kelly’s home playing bridge and discussing science, math, and more, Berlekamp came to see the similarities between betting on horses and investing in stocks, given that chance plays a huge role in both. They also discussed how accurate information and properly sized wagers can provide one with an advantage.


pages: 634 words: 185,116

From eternity to here: the quest for the ultimate theory of time by Sean M. Carroll

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, cellular automata, Claude Shannon: information theory, Columbine, cosmic microwave background, cosmological constant, cosmological principle, dark matter, dematerialisation, double helix, en.wikipedia.org, gravity well, Great Leap Forward, Harlow Shapley and Heber Curtis, heat death of the universe, Henri Poincaré, Isaac Newton, Johannes Kepler, John von Neumann, Lao Tzu, Laplace demon, Large Hadron Collider, lone genius, low earth orbit, New Journalism, Norbert Wiener, pets.com, Pierre-Simon Laplace, Richard Feynman, Richard Stallman, Schrödinger's Cat, Slavoj Žižek, Stephen Hawking, stochastic process, synthetic biology, the scientific method, time dilation, wikimedia commons

There’s nothing wrong with that; after all, Boltzmann and Gibbs were proposing definitions to supersede Clausius’s perfectly good definition of entropy, which is still used today under the rubric of “thermodynamic” entropy. After quantum mechanics came on the scene, John von Neumann proposed a formula for entropy that is specifically adapted to the quantum context. As we’ll discuss in the next chapter, Claude Shannon suggested a definition of entropy that was very similar in spirit to Gibbs’s, but in the framework of information theory rather than physics. The point is not to find the one true definition of entropy; it’s to come up with concepts that serve useful functions in the appropriate contexts. Just don’t let anyone bamboozle you by pretending that one definition or the other is the uniquely correct meaning of entropy.

Rather, they provide ways that we could appear to violate the Second Law, if we didn’t properly account for the crucial role played by information. The information collected and processed by the Demon must somehow be accounted for in any consistent story of entropy. The concrete relationship between entropy and information was developed in the 1940s by Claude Shannon, an engineer/mathematician working for Bell Labs.153 Shannon was interested in finding efficient and reliable ways of sending signals across noisy channels. He had the idea that some messages carry more effective information than others, simply because the message is more “surprising” or unexpected.
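Shannon made that notion of surprise quantitative: a message that arrives with probability p carries -log2(p) bits of information. A small illustrative sketch, not from the book:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon's measure of how 'surprising' a message of probability p is."""
    return -math.log2(p)

print(surprisal_bits(0.5))    # a fair coin flip: 1.0 bit
print(surprisal_bits(1/32))   # a rarer message carries more: 5.0 bits
print(surprisal_bits(1.0))    # a certain message tells you nothing: 0.0 bits
```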

See also Prigogine (1955), Kauffman (1993), and Avery (2003). 160 A good recent book is Nelson (2007). 161 He would have been even more wary in modern times; a Google search on “free energy” returns a lot of links to perpetual-motion schemes, along with some resources on clean energy. 162 Informally speaking, the concepts of “useful” and “useless” energy certainly predate Gibbs; his contribution was to attach specific formulas to the ideas, which were later elaborated on by German physicist Hermann von Helmholtz. In particular, what we are calling the “useless” energy is (in Helmholtz’s formulation) simply the temperature of the body times its entropy. The free energy is then the total internal energy of the body minus that quantity. 163 In the 1950s, Claude Shannon built “The Ultimate Machine,” based on an idea by Marvin Minsky. In its resting state, the machine looked like a box with a single switch on one face. If you were to flip the switch, the box would buzz loudly. Then the lid would open and a hand would reach out, flipping the switch back to its original position, and retreat back into the box, which became quiet once more.
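A one-line rendering of footnote 162’s formulas, as a sketch with invented numbers: the Helmholtz free energy is the internal energy minus the “useless” part, temperature times entropy.

```python
# Sketch of the footnote's formulas: free energy F = U - T*S, the part
# of the internal energy U left over once the "useless" part
# (temperature T times entropy S) is subtracted. Values are illustrative.
def helmholtz_free_energy(U: float, T: float, S: float) -> float:
    return U - T * S

print(helmholtz_free_energy(U=500.0, T=300.0, S=1.2))   # 500 - 360 = 140
```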


pages: 460 words: 107,712

A Devil's Chaplain: Selected Writings by Richard Dawkins

Albert Einstein, Alfred Russel Wallace, Boeing 747, Buckminster Fuller, butterfly effect, Claude Shannon: information theory, complexity theory, Desert Island Discs, double helix, Douglas Hofstadter, epigenetics, experimental subject, Fellow of the Royal Society, gravity well, Gregor Mendel, Necker cube, out of africa, Peoples Temple, phenotype, placebo effect, random walk, Richard Feynman, Silicon Valley, stem cell, Stephen Hawking, the scientific method

Rather than engage in further recriminations and disputes about exactly what happened at the time of the interview, I shall try to redress the matter now in constructive fashion by answering the original question, the ‘Information Challenge’, at adequate length – the sort of length you can achieve in a proper article. The technical definition of ‘information’ was introduced by the American engineer Claude Shannon in 1948. An employee of the Bell Telephone Company, Shannon was concerned to measure information as an economic commodity. It is costly to send messages along a telephone line. Much of what passes in a message is not information: it is redundant. You could save money by recoding the message to remove the redundancy.
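Run-length encoding is about the crudest possible version of such recoding; this sketch (an illustration, not Dawkins’s) shows how a repetitive message shrinks while an unpredictable one does not:

```python
from itertools import groupby

def run_length_encode(msg: str) -> str:
    """Recode stretches of repeated characters as (character, count) pairs."""
    return "".join(f"{ch}{len(list(run))}" for ch, run in groupby(msg))

# Redundant stretches compress; unpredictable ones do not:
print(run_length_encode("aaaaaabbbbcc"))   # a6b4c2 -- 12 symbols become 6
print(run_length_encode("abcdef"))         # a1b1c1d1e1f1 -- nothing to remove
```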

You could spend a lifetime reading in this ancient library and die unsated by the wonder of it. 1 See ‘Unfinished Correspondence with a Darwinian Heavyweight’ (pp. 256–62). 2 The producers never deigned to send me a copy: I completely forgot about it until an American colleague called it to my attention. 3 See Barry Williams, ‘Creationist deception exposed’, the Skeptic 18 (1998), 3, pp. 7–10, for an account of how my long pause (trying to decide whether to throw them out) was made to look like hesitant inability to answer the question, followed by an apparently evasive answer to a completely different question. 4 It is important not to blame Shannon for my verbal and intuitive way of expressing what I think of as the essence of his idea. Mathematical readers should go straight to the original, C. Shannon and W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, 1949). Claude Shannon, by the way, had an imaginative sense of humour. He once built a box with a single switch on the outside. If you threw the switch, the lid of the box slowly opened, a mechanical hand appeared, reached down and switched off the box. It then put itself away and the lid closed. As Arthur C. Clarke said, ‘There is something unspeakably sinister about a machine that does nothing – absolutely nothing – except switch itself off.’ 5 These round figures are all decimal approximations.

When the prior uncertainty is some mixture of alternatives that are not equiprobable, Shannon’s formula becomes a slightly more elaborate weighted average, but it is essentially similar. By the way, Shannon’s weighted average is the same formula as physicists have used, since the nineteenth century, for entropy. The point has interesting implications but I shall not pursue them here.8 That’s enough background on information theory. It is a theory which has long held a fascination for me, and I have used it in several of my research papers over the years. Let’s now think how we might use it to ask whether the information content of genomes increases in evolution. First, recall the three-way distinction between total information capacity, the capacity that is actually used, and the true information content when stored in the most economical way possible.
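In symbols, that weighted average is H = -sum(p_i * log2(p_i)) over the alternatives. A short sketch with invented probability mixtures:

```python
import math

def shannon_entropy(probs):
    """Shannon's weighted average: H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25] * 4))            # four equiprobable alternatives: 2.0 bits
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # a lopsided mixture: about 1.36 bits
```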


pages: 396 words: 112,748

Chaos: Making a New Science by James Gleick

Benoit Mandelbrot, business cycle, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Herbert Marcuse, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, military-industrial complex, Murray Gell-Mann, Norbert Wiener, pattern recognition, power law, Richard Feynman, scientific management, Stephen Hawking, stochastic process, trade route

The patterns revealed a stretching and folding that led back to the horseshoe map of Smale. THE MOST CHARACTERISTICALLY Santa Cruzian imprint on chaos research involved a piece of mathematics cum philosophy known as information theory, invented in the late 1940s by a researcher at the Bell Telephone Laboratories, Claude Shannon. Shannon called his work “The Mathematical Theory of Communication,” but it concerned a rather special quantity called information, and the name information theory stuck. The theory was a product of the electronic age. Communication lines and radio transmissions were carrying a certain thing, and computers would soon be storing this same thing on punch cards or magnetic cylinders, and the thing was neither knowledge nor meaning.

Isaac Newton has more than a cameo: he seems to be the antihero of chaos, or the god to be overthrown. I discovered only later, reading his notebooks and letters, how wrong I’d been about him. And for twenty years I’ve been pursuing a thread that began with something Rob Shaw told me, about chaos and information theory, as invented by Claude Shannon. Chaos is a creator of information—another apparent paradox. This thread connects with something Bernardo Huberman said: that he was seeing complex behaviors emerge unexpectedly in information networks. Something was dawning, and we’re finally starting to see what it is. James Gleick Key West February 2008 Notes on Sources and Further Reading THIS BOOK DRAWS on the words of about two hundred scientists, in public lectures, in technical writing, and most of all in interviews conducted from April 1984 to December 1986.

Because information was stored in binary on-off switches newly designated as bits, bits became the basic measure of information. From a technical point of view, information theory became a handle for grasping how noise in the form of random errors interfered with the flow of bits. It gave a way of predicting the necessary carrying capacity of communication lines or compact disks or any technology that encoded language, sounds, or images. It offered a theoretical means of reckoning the effectiveness of different schemes for correcting errors—for example, using some bits as checks on others. It put teeth into the crucial notion of “redundancy.” In terms of Shannon’s information theory, ordinary language contains greater than fifty percent redundancy in the form of sounds or letters that are not strictly necessary to conveying a message.
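The simplest scheme of that kind is a single parity bit, one bit acting as a check on the others; the sketch below is illustrative rather than drawn from the book:

```python
def add_parity(bits: list) -> list:
    """Append one check bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits: list) -> bool:
    """A single flipped bit anywhere makes the count of 1s odd."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])    # -> [1, 0, 1, 1, 1]
print(check_parity(word))          # True: arrived intact
word[2] ^= 1                       # noise flips one bit in transit
print(check_parity(word))          # False: error detected (though not located)
```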


pages: 406 words: 109,794

Range: Why Generalists Triumph in a Specialized World by David Epstein

Airbnb, Albert Einstein, Apollo 11, Apple's 1984 Super Bowl advert, Atul Gawande, Checklist Manifesto, Claude Shannon: information theory, Clayton Christensen, clockwork universe, cognitive bias, correlation does not imply causation, Daniel Kahneman / Amos Tversky, deep learning, deliberate practice, Exxon Valdez, fail fast, Flynn Effect, Freestyle chess, functional fixedness, game design, Gene Kranz, Isaac Newton, Johannes Kepler, knowledge economy, language acquisition, lateral thinking, longitudinal study, Louis Pasteur, Mark Zuckerberg, medical residency, messenger bag, meta-analysis, Mikhail Gorbachev, multi-armed bandit, Nelson Mandela, Netflix Prize, pattern recognition, Paul Graham, precision agriculture, prediction markets, premature optimization, pre–internet, random walk, randomized controlled trial, retrograde motion, Richard Feynman, Richard Feynman: Challenger O-ring, Silicon Valley, Silicon Valley billionaire, Stanford marshmallow experiment, Steve Jobs, Steve Wozniak, Steven Pinker, sunk-cost fallacy, systems thinking, Walter Mischel, Watson beat the top human players on Jeopardy!, Y Combinator, young professional

,” in The Science of Expertise, ed. D. Hambrick et al. (New York: Routledge, 2017 [Kindle ebook]). “When we were designing”: Steve Jobs’s 2005 commencement address at Stanford: https://news.stanford.edu/2005/06/14/jobs-061505. “no one else was familiar”: J. Horgan, “Claude Shannon: Tinkerer, Prankster, and Father of Information Theory,” IEEE Spectrum 29, no. 4 (1992): 72–75. For more depth on Shannon, see J. Soni and R. Goodman, A Mind at Play (New York: Simon & Schuster, 2017). “career streams”; “traveled on an eight-lane highway”: C. J. Connolly, “Transition Expertise: Cognitive Factors and Developmental Processes That Contribute to Repeated Successful Career Transitions Amongst Elite Athletes, Musicians and Business People” (PhD thesis, Brunel University, 2011).

Those findings are reminiscent of a speech Steve Jobs gave, in which he famously recounted the importance of a calligraphy class to his design aesthetics. “When we were designing the first Macintosh computer, it all came back to me,” he said. “If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts.” Or electrical engineer Claude Shannon, who launched the Information Age thanks to a philosophy course he took to fulfill a requirement at the University of Michigan. In it, he was exposed to the work of self-taught nineteenth-century English logician George Boole, who assigned a value of 1 to true statements and 0 to false statements and showed that logic problems could be solved like math equations.
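Boole’s move translates directly into code: with true as 1 and false as 0, the connectives become arithmetic, and a logical law can be checked by calculation. A sketch, not from the book:

```python
# Boole's trick: with true = 1 and false = 0, logic is arithmetic.
def AND(x, y): return x * y            # 1 only when both are 1
def OR(x, y):  return x + y - x * y    # 1 when either is 1
def NOT(x):    return 1 - x

# Verify a logical law -- De Morgan's -- by pure calculation:
for x in (0, 1):
    for y in (0, 1):
        assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))
print("De Morgan's law holds for every assignment")
```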


pages: 374 words: 114,600

The Quants by Scott Patterson

Alan Greenspan, Albert Einstein, AOL-Time Warner, asset allocation, automated trading system, Bear Stearns, beat the dealer, Benoit Mandelbrot, Bernie Madoff, Bernie Sanders, Black Monday: stock market crash in 1987, Black Swan, Black-Scholes formula, Blythe Masters, Bonfire of the Vanities, book value, Brownian motion, buttonwood tree, buy and hold, buy low sell high, capital asset pricing model, Carl Icahn, centralized clearinghouse, Claude Shannon: information theory, cloud computing, collapse of Lehman Brothers, collateralized debt obligation, commoditize, computerized trading, Credit Default Swap, credit default swaps / collateralized debt obligations, diversification, Donald Trump, Doomsday Clock, Dr. Strangelove, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial engineering, Financial Modelers Manifesto, fixed income, Glass-Steagall Act, global macro, Gordon Gekko, greed is good, Haight Ashbury, I will remember that I didn’t make the world, and it doesn’t satisfy my equations, index fund, invention of the telegraph, invisible hand, Isaac Newton, Jim Simons, job automation, John Meriwether, John Nash: game theory, junk bonds, Kickstarter, law of one price, Long Term Capital Management, Louis Bachelier, low interest rates, mandelbrot fractal, margin call, Mark Spitznagel, merger arbitrage, Michael Milken, military-industrial complex, money market fund, Myron Scholes, NetJets, new economy, offshore financial centre, old-boy network, Paul Lévy, Paul Samuelson, Ponzi scheme, proprietary trading, quantitative hedge fund, quantitative trading / quantitative finance, race to the bottom, random walk, Renaissance Technologies, risk-adjusted returns, Robert Mercer, Rod Stewart played at Stephen Schwarzman birthday party, Ronald Reagan, Savings and loan crisis, Sergey Aleynikov, short selling, short squeeze, South Sea Bubble, speech recognition, statistical arbitrage, The Chicago School, The Great Moderation, The Predators' Ball, too big to fail, transaction costs, value at risk, volatility smile, yield curve, éminence grise

Claude Elwood Shannon, one of the most brilliant, and eccentric, minds on the planet. On a November afternoon in 1960, Ed Thorp walked briskly across MIT’s leaf-strewn campus. A cold wind whistled off the Charles River. The freshly minted mathematics professor shuddered, and his nerves jangled at the very thought of sitting down face-to-face with Claude Shannon. Few figures at MIT were more intimidating. Shannon was the brains behind two of the twentieth century’s greatest intellectual advances. The first was the application of the binary number system to electronic circuits, which laid the groundwork for the birth of the computer. Shannon’s great breakthrough had been to take a two-symbol logic in which problems are resolved by the manipulation of two numbers, 1 and 0, and apply it to a circuit in which a 1 is represented by a switch that is turned on and a 0 by a switch that is turned off.

Axcom was to act as the trading advisor for the fund, which was nominally run as an investing firm owned by a company Simons had founded in July 1982 called Renaissance Technologies. Soon Simons’s growing crew of quants added another math wizard, Elwyn Berlekamp, a game theory expert at Berkeley. Like Ed Thorp, Berlekamp had worked with Claude Shannon and John Kelly at MIT. He’d briefly met Simons during a stint at IDA in the 1960s. The fund put up solid returns for several years, even managing to trade through Black Monday with relatively little damage. In 1988, Ax and Simons renamed the fund Medallion in honor of a math award they’d both won.

Thorp started paying regular visits to Shannon’s home later that November as the two scientists set to work on the roulette problem. Shannon called his home “Entropy House,” a nod to a core concept in information theory, borrowed from the second law of thermodynamics. The law of entropy essentially means everything in the universe will eventually turn into a homogeneous, undifferentiated goop. In information theory, Shannon used entropy as a way to discover order within the apparent chaos of strings of seemingly random numbers. Shannon’s three-story wooden house overlooked the Mystic Lakes, several miles northwest of Cambridge.


Human Frontiers: The Future of Big Ideas in an Age of Small Thinking by Michael Bhaskar

"Margaret Hamilton" Apollo, 3D printing, additive manufacturing, AI winter, Albert Einstein, algorithmic trading, AlphaGo, Anthropocene, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, barriers to entry, basic income, behavioural economics, Benoit Mandelbrot, Berlin Wall, Big bang: deregulation of the City of London, Big Tech, Bletchley Park, blockchain, Boeing 747, brain emulation, Brexit referendum, call centre, carbon tax, charter city, citizen journalism, Claude Shannon: information theory, Clayton Christensen, clean tech, clean water, cognitive load, Columbian Exchange, coronavirus, cosmic microwave background, COVID-19, creative destruction, CRISPR, crony capitalism, cyber-physical system, dark matter, David Graeber, deep learning, DeepMind, deindustrialization, dematerialisation, Demis Hassabis, demographic dividend, Deng Xiaoping, deplatforming, discovery of penicillin, disruptive innovation, Donald Trump, double entry bookkeeping, Easter island, Edward Jenner, Edward Lorenz: Chaos theory, Elon Musk, en.wikipedia.org, endogenous growth, energy security, energy transition, epigenetics, Eratosthenes, Ernest Rutherford, Eroom's law, fail fast, false flag, Fellow of the Royal Society, flying shuttle, Ford Model T, Francis Fukuyama: the end of history, general purpose technology, germ theory of disease, glass ceiling, global pandemic, Goodhart's law, Google Glasses, Google X / Alphabet X, GPT-3, Haber-Bosch Process, hedonic treadmill, Herman Kahn, Higgs boson, hive mind, hype cycle, Hyperloop, Ignaz Semmelweis: hand washing, Innovator's Dilemma, intangible asset, interchangeable parts, Internet of things, invention of agriculture, invention of the printing press, invention of the steam engine, invention of the telegraph, invisible hand, Isaac Newton, ITER tokamak, James Watt: steam engine, James Webb Space Telescope, Jeff Bezos, jimmy wales, job automation, Johannes Kepler, John von Neumann, Joseph Schumpeter, Kenneth Arrow, Kevin Kelly, Kickstarter, knowledge economy, knowledge worker, Large Hadron Collider, liberation theology, lockdown, lone genius, loss aversion, Louis Pasteur, Mark Zuckerberg, Martin Wolf, megacity, megastructure, Menlo Park, Minecraft, minimum viable product, mittelstand, Modern Monetary Theory, Mont Pelerin Society, Murray Gell-Mann, Mustafa Suleyman, natural language processing, Neal Stephenson, nuclear winter, nudge unit, oil shale / tar sands, open economy, OpenAI, opioid epidemic / opioid crisis, PageRank, patent troll, Peter Thiel, plutocrats, post scarcity, post-truth, precautionary principle, public intellectual, publish or perish, purchasing power parity, quantum entanglement, Ray Kurzweil, remote working, rent-seeking, Republic of Letters, Richard Feynman, Robert Gordon, Robert Solow, secular stagnation, shareholder value, Silicon Valley, Silicon Valley ideology, Simon Kuznets, skunkworks, Slavoj Žižek, sovereign wealth fund, spinning jenny, statistical model, stem cell, Steve Jobs, Stuart Kauffman, synthetic biology, techlash, TED Talk, The Rise and Fall of American Growth, the scientific method, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, Thomas Malthus, TikTok, total factor productivity, transcontinental railway, Two Sigma, Tyler Cowen, Tyler Cowen: Great Stagnation, universal basic income, uranium enrichment, We wanted flying cars, instead we got 140 characters, When a measure becomes a target, X Prize, Y Combinator

His colleagues, John Bardeen and Walter Brattain, had made the breakthrough itself: a relatively crude semiconductor device of gold foil, a germanium crystal and a paperclip, designed to boost or switch electronic signals. The three of them won the 1956 Nobel Prize in Physics. There was the ethereal but genial Claude Shannon, progenitor of one of the twentieth century's most significant insights: information theory, not only an underwriter of the digital age but now posited as a fundamental property of existence. There was a cadre of visionary administrators, like the steady hand of Missourian Mervin Kelly: capable of balancing the competing needs of a demanding corporate parent and wayward scientists in a mushrooming organisation working at the limits of technology.

One senior Bell Labs researcher, Andrew Odlyzko, making sense of what was happening to his institution over the 1980s and 1990s, called it ‘the decline of unfettered research’.6 The image is significant – society was placing institutional chains around the curiosity-driven research agenda that enabled people like Claude Shannon to explore information theory. Research had been commodified into narrow, directed chunks. Bell had changed; ‘rational’ economic decision making was in. Writing in 1995, Odlyzko had already seen that the writing was on the wall for the industrial labs that had nurtured his career: ‘the prospects for a return to unfettered research in the near future are slim.

You find the above pattern with linguistic philosophy; the Internet; human rights; the concept of zero; the steam engine; the iPhone; utilitarianism; calculus; the periodic table; helicopters; entropy; double-entry bookkeeping; written constitutions; writing itself; deep machine learning techniques; Jacobean tragedy; Spacewar! and Grand Theft Auto; information theory, quantum theory and game theory; Cartesian grids, rationality and ego. This is an ecumenical approach to ideas, but only by taking such an approach can we see the overarching picture of change, or its absence. * There is something undeniably romantic about the notion of ideas as heroically catalytic moments.


pages: 397 words: 110,130

Smarter Than You Think: How Technology Is Changing Our Minds for the Better by Clive Thompson

4chan, A Declaration of the Independence of Cyberspace, Andy Carvin, augmented reality, barriers to entry, behavioural economics, Benjamin Mako Hill, butterfly effect, citizen journalism, Claude Shannon: information theory, compensation consultant, conceptual framework, context collapse, corporate governance, crowdsourcing, Deng Xiaoping, digital rights, discovery of penicillin, disruptive innovation, Douglas Engelbart, Douglas Engelbart, drone strike, Edward Glaeser, Edward Thorp, en.wikipedia.org, Evgeny Morozov, experimental subject, Filter Bubble, folksonomy, Freestyle chess, Galaxy Zoo, Google Earth, Google Glasses, Gunnar Myrdal, guns versus butter model, Henri Poincaré, hindsight bias, hive mind, Howard Rheingold, Ian Bogost, information retrieval, iterative process, James Bridle, jimmy wales, John Perry Barlow, Kevin Kelly, Khan Academy, knowledge worker, language acquisition, lifelogging, lolcat, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Netflix Prize, Nicholas Carr, Panopticon Jeremy Bentham, patent troll, pattern recognition, pre–internet, public intellectual, Richard Feynman, Ronald Coase, Ronald Reagan, Rubik’s Cube, sentiment analysis, Silicon Valley, Skype, Snapchat, Socratic dialogue, spaced repetition, superconnector, telepresence, telepresence robot, The future is already here, The Nature of the Firm, the scientific method, the strength of weak ties, The Wisdom of Crowds, theory of mind, transaction costs, Twitter Arab Spring, Two Sigma, Vannevar Bush, Watson beat the top human players on Jeopardy!, WikiLeaks, X Prize, éminence grise

He’d have nothing to study from. He wanted to split the difference—to have a computer that was so integrated into his body and his field of vision that he could keep looking at the lecturer while he typed. Starner knew about similar prototypes. The first wearable was cocreated in 1960 by Claude Shannon, the founder of information theory, for a purpose both whimsical and mathematically ambitious: He wanted to beat the Las Vegas roulette wheels. Using sensors built into his shoes, Shannon would “type” information about how the wheel had been spun, and a circuit in his pocket would send beeps to an earpiece telling him how to bet.

Perlow describes similarly positive effects in Sleeping with Your Smartphone (Boston: Harvard Business Press, 2012), in documenting how a group of consultants with the Boston Consulting Group agreed to stay off their devices for a set number of hours per day, which they called predictable time off, and which in saner times was referred to as evenings and weekends. “an overarching ability to watch and understand your own mind”: Maggie Jackson and Bill McKibben, Distracted: The Erosion of Attention and the Coming Dark Age (Amherst, NY: Prometheus Books, 2008), Kindle edition. The first wearable was cocreated in 1960 by Claude Shannon: Edward O. Thorp, “The Invention of the First Wearable Computer,” Proceedings of the 2nd IEEE International Symposium on Wearable Computers (1998): 4–8, accessed March 23, 2013, graphics.cs.columbia.edu/courses/mobwear/resources/thorp-iswc98.pdf. Critics have already noted how unsettling it might feel: Mark Hurst, “The Google Glass Feature No One Is Talking About,” Creative Good (blog), February 28, 2013, accessed March 24, 2013, creativegood.com/blog/the-google-glass-feature-no-one-is-talking-about/; Adrian Chen, “If You Wear Google’s New Glasses You Are an Asshole,” Gawker, March 3, 2013, accessed March 24, 2013, http://gawker.com/5990395.


pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise by Nathan L. Ensmenger

barriers to entry, business process, Charles Babbage, Claude Shannon: information theory, computer age, deskilling, Donald Knuth, Firefox, Frederick Winslow Taylor, functional programming, future of work, Grace Hopper, informal economy, information retrieval, interchangeable parts, Isaac Newton, Jacquard loom, job satisfaction, John von Neumann, knowledge worker, Larry Ellison, loose coupling, machine readable, new economy, no silver bullet, Norbert Wiener, pattern recognition, performance metric, Philip Mirowski, post-industrial society, Productivity paradox, RAND corporation, Robert Gordon, scientific management, Shoshana Zuboff, sorting algorithm, Steve Jobs, Steven Levy, systems thinking, tacit knowledge, technological determinism, the market place, The Theory of the Leisure Class by Thorstein Veblen, Thomas Kuhn: the structure of scientific revolutions, Thorstein Veblen, Turing machine, Von Neumann architecture, world market for maybe five computers, Y2K

Not only did it lay claim to the valuable intellectual territory suggested by the commonsense understanding of information as knowledge or data but it also linked the discipline to the specific formulation of information developed in the late 1940s by the mathematician Claude Shannon. In his seminal 1949 book with Warren Weaver, The Mathematical Theory of Communication, Shannon had defined information in terms of the physical concept of negative entropy.61 His information theory appealed to scientists in a wide variety of disciplines, and for a time it appeared as if information might serve as a broadly unifying concept in the sciences.62 But despite its intellectual appeal, Shannon’s mathematical definition of information was never widely applicable outside of communications engineering.

Michael Mahoney, “Software as Science–Science as Software,” in Mapping the History of Computing: Software Issues, ed. Ulf Hashagen, Reinhard Keil-Slawik, and Arthur Norberg (Berlin: Springer-Verlag, 2002), 25–48. 60. ACM Curriculum Committee, “An Undergraduate Program in Computer Science.” 61. Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949). 62. Lily Kay, “Who Wrote the Book of Life? Information and the Transformation of Molecular Biology,” Science in Context 8 (1995): 609–634; Ronald Kline, “Cybernetics, Management Science, and Technology Policy: The Emergence of ‘Information Technology’ as a Keyword, 1948–1985,” Technology and Culture 47, no. 3 (2006): 513–535. 63.


pages: 413 words: 119,587

Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots by John Markoff

A Declaration of the Independence of Cyberspace, AI winter, airport security, Andy Rubin, Apollo 11, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, basic income, Baxter: Rethink Robotics, Bill Atkinson, Bill Duvall, bioinformatics, Boston Dynamics, Brewster Kahle, Burning Man, call centre, cellular automata, Charles Babbage, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, cognitive load, collective bargaining, computer age, Computer Lib, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deep learning, DeepMind, deskilling, Do you want to sell sugared water for the rest of your life?, don't be evil, Douglas Engelbart, Douglas Engelbart, Douglas Hofstadter, Dr. Strangelove, driverless car, dual-use technology, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, Evgeny Morozov, factory automation, Fairchild Semiconductor, Fillmore Auditorium, San Francisco, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, General Magic , Geoffrey Hinton, Google Glasses, Google X / Alphabet X, Grace Hopper, Gunnar Myrdal, Gödel, Escher, Bach, Hacker Ethic, Hans Moravec, haute couture, Herbert Marcuse, hive mind, hype cycle, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Ivan Sutherland, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, Jeff Hawkins, job automation, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, John von Neumann, Kaizen: continuous improvement, Kevin Kelly, Kiva Systems, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, military-industrial complex, Mitch Kapor, Mother of all demos, natural language processing, Neil Armstrong, new economy, Norbert Wiener, PageRank, PalmPilot, pattern recognition, Philippa Foot, pre–internet, RAND corporation, Ray Kurzweil, reality distortion field, Recombinant DNA, Richard Stallman, Robert Gordon, Robert Solow, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, Seymour Hersh, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Strategic Defense Initiative, strong AI, superintelligent machines, tech worker, technological singularity, Ted Nelson, TED Talk, telemarketer, telepresence, telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Tony Fadell, trolley problem, Turing test, Vannevar Bush, Vernor Vinge, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, We are as Gods, Whole Earth Catalog, William Shockley: the traitorous eight, zero-sum game

He found that another graduate student was a double agent in their games, plotting with McCarthy against Nash while at the same time plotting with Nash against McCarthy. Game theory was in fashion at the time and Nash later received his Nobel Prize in economics for contributions to that field. During the summer of 1952 both McCarthy and Minsky were hired as research assistants by mathematician and electrical engineer Claude Shannon at Bell Labs. Shannon, known as the father of “information theory,” had created a simple chess-playing machine in 1950, and there was early interest in biological-growth simulating programs known as “automata,” of which John Conway’s 1970 Game of Life would become the most famous. Minsky was largely distracted by his impending wedding, but McCarthy made the most of his time at Bell Labs, working with Shannon on a collection of mathematical papers that was named at Shannon’s insistence Automata Studies.11 Using the word “automata” was a source of frustration for McCarthy because it shifted the focus of the submitted papers away from the more concrete artificial intelligence ideas and toward more esoteric mathematics.

For Weizenbaum, computing systems risked fundamentally diminishing the human experience. In very much the same vein that Marxist philosopher Herbert Marcuse attacked advanced industrial society, he was concerned that the approaching Information Age might bring about a “One-Dimensional Man.” In the wake of the creation of Eliza, a group of MIT scientists, including information theory pioneer Claude Shannon, met in Concord, Massachusetts, to discuss the social implications of the phenomenon.8 The seductive quality of the interactions with Eliza concerned Weizenbaum, who believed that an obsessive reliance on technology was indicative of a moral failing in society, an observation rooted in his experiences as a child growing up in Nazi Germany.


pages: 573 words: 142,376

Whole Earth: The Many Lives of Stewart Brand by John Markoff

A Pattern Language, air freight, Anthropocene, Apple II, back-to-the-land, Benoit Mandelbrot, Bernie Madoff, Beryl Markham, Big Tech, Bill Atkinson, Biosphere 2, Brewster Kahle, Buckminster Fuller, Burning Man, butterfly effect, Claude Shannon: information theory, cloud computing, complexity theory, computer age, Computer Lib, computer vision, Danny Hillis, decarbonisation, demographic transition, disinformation, Douglas Engelbart, Douglas Engelbart, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, en.wikipedia.org, experimental subject, feminist movement, Fillmore Auditorium, San Francisco, Filter Bubble, game design, gentrification, global village, Golden Gate Park, Hacker Conference 1984, Hacker Ethic, Haight Ashbury, Herman Kahn, housing crisis, Howard Rheingold, HyperCard, intentional community, Internet Archive, Internet of things, Jane Jacobs, Jaron Lanier, Jeff Bezos, John Gilmore, John Markoff, John Perry Barlow, Kevin Kelly, Kickstarter, knowledge worker, Lao Tzu, Lewis Mumford, Loma Prieta earthquake, Marshall McLuhan, megacity, Menlo Park, Michael Shellenberger, microdosing, Mitch Kapor, Morris worm, Mother of all demos, move fast and break things, New Urbanism, Norbert Wiener, Norman Mailer, North Sea oil, off grid, off-the-grid, paypal mafia, Peter Calthorpe, Ponzi scheme, profit motive, public intellectual, Ralph Nader, RAND corporation, Ray Kurzweil, Richard Stallman, Sand Hill Road, self-driving car, shareholder value, Silicon Valley, South of Market, San Francisco, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, systems thinking, technoutopianism, Ted Nelson, Ted Nordhaus, TED Talk, The Death and Life of Great American Cities, The Hackers Conference, Thorstein Veblen, traveling salesman, Turing test, upwardly mobile, Vernor Vinge, We are as Gods, Whole Earth Catalog, Whole Earth Review, young professional

Negroponte’s Architecture Machine Group, an early effort to build an advanced computer system for prototyping human-computer interaction, was also a descendant of the RLE. The Media Lab’s intellectual tradition was reminiscent of what Brand had first come in contact with as a college student when he encountered cybernetics through Wiener and Warren McCulloch and information theory from the writings of Claude Shannon and Robert Fano. At MIT, visitor and host discussed the idea of coauthoring a book, but Brand had already learned enough about Negroponte, who had a large ego and a forceful personality, to realize that would be problematic. Instead, he negotiated a contract that gave him access to the Media Lab and in return agreed to pay special attention to Negroponte’s research.

A little more than a decade later Brand would also become close to Bateson, and a conversation between them, titled “Both Sides of the Necessary Paradox,” appeared first in Harper’s Magazine before being reprinted in Brand’s first book, II Cybernetic Frontiers, along with his Rolling Stone article about computer hacker culture. Years later, Brand would return to the thinking of several others he encountered during this time, among them cyberneticist Warren McCulloch, and the MIT engineers Claude Shannon and Robert Fano, who pioneered theories regarding the measurement and transmission of information. For the moment, however, he was just another freelancer without takers. * * * He had fallen in love with the idea of becoming a biologist when he read Cannery Row and its portrayal of Ed Ricketts.

., 139–40 Huizinga, Johan, 220 human potential movement, 71, 73, 84 humans: freedom of choice of, 42–43 as morally responsible for care of natural world, 42, 347, 349, 360, 361 SB’s speculations about fate of, 38–39 Human Use of Human Beings, The (Wiener), 160 Hunger Show (Life-Raft Earth), 187–88, 189, 203, 263 Huxley, Aldous, 28, 33, 41, 72, 144, 226 hypertext, concept of, 172, 230, 292, 293 I IBM, 91, 92, 96, 108, 211 I Ching, 89–90, 117, 153, 197, 253 Idaho, University of, 21 identity, fake, cyberspace and, 266 II Cybernetic Frontiers (Brand), 46, 213, 217, 221 Iktomi (Ivan Drift), 96–97 Illich, Ivan, 196 Independent, 353 information, personalization of, 279 information sharing, 180 information technology, 299–300, 315 information theory, 273 “Information wants to be free,” 270, 299, 301 information warfare, 315 In Our Time (Hemingway), 11 Institute for International Relations (IIR), 27, 34, 35, 37 Institute for the Future, 315 intelligence augmentation (IA), 83, 185, 187 International Federation of Internal Freedom, 89 International Foundation for Advanced Study, LSD experiments at, 42, 72, 73, 76–82, 273 internet, 146, 151, 279, 293, 314, 316, 326 ARPANET as forerunner of, 212 impact of, 295–96, 323 libertarianism and, 5, 348 see also cyberspace Internet Archive, 330, 332 Internet of Things, 279 Interval Research Corporation, 321–23 “Is Environmentalism Dead?”


pages: 236 words: 50,763

The Golden Ticket: P, NP, and the Search for the Impossible by Lance Fortnow

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, Claude Shannon: information theory, cloud computing, complexity theory, Donald Knuth, Erdős number, four colour theorem, Gerolamo Cardano, Isaac Newton, James Webb Space Telescope, Johannes Kepler, John von Neumann, Large Hadron Collider, linear programming, new economy, NP-complete, Occam's razor, P = NP, Paul Erdős, quantum cryptography, quantum entanglement, Richard Feynman, Rubik’s Cube, seminal paper, smart grid, Stephen Hawking, traveling salesman, Turing machine, Turing test, Watson beat the top human players on Jeopardy!, William of Occam

We can actually express computation as a series of AND, OR, and NOT operations. Some problems need only a small “circuit” of ANDs, ORs, and NOTs and others require huge circuits to compute. Sergey Yablonsky in the early 1950s investigated this notion of what we now call circuit complexity. Claude Shannon, an American who founded the field of information theory, showed that some logical functions require very high circuit complexity. Yablonsky looked at the difficulty of generating such functions. This sounds like a difficult task, but actually a strong version of P ≠ NP would mean that some simple-to-state search problems do not have small circuits that compute them.
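Both claims are easy to make concrete. In the sketch below (an illustration, not Fortnow’s), XOR is expressed as a four-gate circuit of AND, OR, and NOT, and the tally of 2^(2^n) Boolean functions on n inputs shows the doubly exponential growth behind Shannon’s counting argument:

```python
def NOT(x): return 1 - x
def AND(x, y): return x & y
def OR(x, y): return x | y

def XOR(x, y):                       # a four-gate "circuit"
    return AND(OR(x, y), NOT(AND(x, y)))

assert [XOR(x, y) for x in (0, 1) for y in (0, 1)] == [0, 1, 1, 0]

# There are 2**(2**n) Boolean functions on n inputs, but only modestly
# many small circuits, so most functions must require large circuits.
for n in range(1, 6):
    print(f"{n} inputs: {2 ** (2 ** n):,} distinct Boolean functions")
```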

Computer-to-computer networks would become commonplace, and efficient and low-cost methods to secure communication over these networks would be needed. Talking about the development of the P versus NP problem earlier that decade, Diffie and Hellman stated, “At the same time, theoretical developments in information theory and computer science show promise of providing provably secure cryptosystems, changing this ancient art into a science.” Before Diffie and Hellman, to decrypt a secret message you needed the same key as the one used to encrypt the message. Both parties would have to meet in advance to agree on a secret key.
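A toy Diffie-Hellman exchange shows what changed: the two parties never need to meet, yet they end up with the same secret. The parameters below are illustrative and absurdly small; real ones run to thousands of bits.

```python
# Alice and Bob agree publicly on a prime p and a generator g, then each
# combines a private exponent with the other's public value. Both arrive
# at the same shared secret without ever exchanging a key in advance.
p, g = 23, 5                      # public, toy-sized parameters

a = 6                             # Alice's private exponent
b = 15                            # Bob's private exponent

A = pow(g, a, p)                  # Alice sends g^a mod p = 8
B = pow(g, b, p)                  # Bob sends g^b mod p = 19

assert pow(B, a, p) == pow(A, b, p)   # both compute g^(a*b) mod p = 2
print("shared secret:", pow(B, a, p))
```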

Zero-Knowledge Sudoku examples are taken from an August 3, 2006, post on my blog, Computational Complexity (http://blog.computationalcomplexity.org/2006/08/zero-knowledge-sudoku.html). Works Cited Whitfield Diffie and Martin Hellman, “New Directions in Cryptography,” IEEE Transactions on Information Theory 22, no. 6 (November 1976): 644–54. Craig Gentry, “Fully Homomorphic Encryption Using Ideal Lattices,” in Proceedings of the 41st Annual ACM Symposium on Theory of Computing (New York: ACM, 2009), 169–78. Ronald Rivest, Adi Shamir, and Leonard Adleman, “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems,” Communications of the ACM 21, no. 2 (February 1978): 120–26.


pages: 330 words: 59,335

The Outsiders: Eight Unconventional CEOs and Their Radically Rational Blueprint for Success by William Thorndike

Albert Einstein, AOL-Time Warner, Atul Gawande, Berlin Wall, book value, Checklist Manifesto, choice architecture, Claude Shannon: information theory, collapse of Lehman Brothers, compound rate of return, corporate governance, discounted cash flows, diversified portfolio, Donald Trump, Fall of the Berlin Wall, Gordon Gekko, Henry Singleton, impact investing, intangible asset, Isaac Newton, junk bonds, Louis Pasteur, low interest rates, Mark Zuckerberg, NetJets, Norman Mailer, oil shock, pattern recognition, Ralph Waldo Emerson, Richard Feynman, shared worldview, shareholder value, six sigma, Steve Jobs, stock buybacks, Teledyne, Thomas Kuhn: the structure of scientific revolutions, value engineering, vertical integration

It was an exceptionally talented group, however, and each member had a significant economic interest in the company. In addition to Singleton, Roberts, and Kozmetzky (who retired from Teledyne in 1966 to run the business school at the University of Texas), board members included Claude Shannon, Singleton’s MIT classmate and the father of information theory; Arthur Rock, the legendary venture capitalist; and Fayez Sarofim, the billionaire Houston-based fund manager. This group collectively owned almost 40 percent of the company’s stock by the end of the period. . . . Even in a book filled with CEOs who were aggressive in buying back stock, Singleton is in a league of his own.


pages: 230 words: 61,702

The Internet of Us: Knowing More and Understanding Less in the Age of Big Data by Michael P. Lynch

Affordable Care Act / Obamacare, Amazon Mechanical Turk, big data - Walmart - Pop Tarts, bitcoin, Cass Sunstein, Claude Shannon: information theory, cognitive load, crowdsourcing, data science, Edward Snowden, Firefox, Google Glasses, hive mind, income inequality, Internet of things, John von Neumann, meta-analysis, Nate Silver, new economy, Nick Bostrom, Panopticon Jeremy Bentham, patient HM, prediction markets, RFID, sharing economy, Steve Jobs, Steven Levy, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Twitter Arab Spring, WikiLeaks

But it is also true that in other respects we know less, that the walls of our digital life make real objective knowledge harder to come by, and that the Internet has promoted a more passive, more deferential way of knowing.13 Like our imaginary neuromedians, we are in danger of depending too much on one way of accessing the world and letting our other senses dull. Socrates on the Way to Larissa Data is not the same thing as information. As Claude Shannon, the founding father of information theory, put it back in the 1940s, data signals are noisy, and if you want to filter out what’s meaningful from those signals, you have to filter out at least some of that noise. For Shannon the noise was literal. His groundbreaking work concerned how to extract discernible information from the signals sent across telephone lines.14 But the moral is entirely general: bits of code aren’t themselves information; information is what we extract from those bits.

., 102 French Revolution, 58 Freud, Sigmund, 184 Fricker, Miranda, 146–48, 201 Galileo, 34, 68 Galton, Francis, 120 games, gaming, 20, 191 gatekeeping, 128, 134, 146 gender, 162 in marriage, 53–54, 72 in problem solving, 137 Georgetown University, 77–78 Gilbert, Margaret, 117–19, 200 Glass, Ira, 78 Glaucon, 54 Glauconian reasoning, 54–55, 56–58 global economy, 139, 142, 152 global warming, 56, 100, 124, 144, 185, 198 Goldberg, Sandy, 115 Goldman, Alvin, 194 Google, 5, 23, 30, 113, 128, 130, 135, 163, 174, 182, 203 business model of, 9 data collection and tracking by, 90, 155–56, 158, 161 as hypothetical “guy,” 24 monopolization by, 145–46 propaganda disseminated on, 66 in reinforcement of one’s own beliefs, 56 Google Complete, 155 Google Flu Trends, 158, 183 Google Glass, 149, 186 Google-knowing, xvi, 21–40, 25 defined, 23 limitations of, 174, 180 reliance on, 6–7, 23, 25–26, 30–31, 36, 113, 116, 153, 163, 179–80 Google Maps, 116 Google Street View, 23 Gordon, Lewis, 148 gorilla suit experiment, 30 government: autonomy limited by, 109 closed politics of, 144–45 data mining and analysis used by, 9, 90–91, 93, 104, 107 online manipulation used by, 81 purpose of, 38 transparency of, 137–38 Greece, classical philosophy of, 13, 47, 166–67, 171–72 Grimm, Stephen, 164 Guardian, 81 Gulf of Mexico, oil spill in, 118 H1N1 flu outbreak, tracking of, 158 Haidt, Jonathan, 51–54, 56, 57, 60, 196–97 Halpern, Sue, 106 Harvard Law Review, 89 Hazlett, Allan, 49 HBO GO, 145 Heidegger, Martin, 177 Hemingway, Mark, 46 Higher Order Thinking Skills (HOTS), 61 Hippocrates, 13 hive-mind, 4, 136 HM (patient), 168–69 Hobbes, Thomas, 38, 109 holiness, logical debate over, 166–67 homosexuality, changing attitudes toward, 53–54 Houla massacre, 83 Howe, Jeff, 136 Huffington Post, 43 human dignity: autonomy and, 58, 59–60 information technology as threat to, 187 interconnectedness and, 184–88 privacy and, 101–9 human rights, 54, 60 digital equality as, 142–48 protection of, 145 Hume, David, 48 hyperconnectivity, 184–88 identity: digital reshaping of, 73–74 manufactured online, 80–81 “scrubbing” of, 74 illegal searches, 93 illusion, distinguishing truth from, 67–74 incidental data collection, 95–96, 99 inclusivity, 135–37 income inequality, 142 inference, 29, 60, 172 information: accuracy and reliability of, 14, 27–30, 39–40, 44–45 collected pools of, 95–100, 107–9 distribution vs. creation of, 24 immediate, unlimited access to, 3–4, 23, 30, 42, 56, 113–16, 135–36, 141, 149, 153, 180 as interconnective, 184–88 vs. 
knowledge, 14 sorting and filtering of, 12, 26–29, 44–45, 127–28 information age, 111 information analysis, techniques of, 8–9 information cascades, 36, 66, 121 defined, 32 information coordination problem, 38–39, 56 information “glut,” 9–10, 44 information privacy, 94–100 and autonomy, 102–7 information sharing, coordination in, 4–5 information technology: costs of, 145 data trail in, 9 democratization through, 133–38, 148 devices and platforms of, xvii–xviii, 3, 7–8, 10, 41–43, 69, 70, 77–78, 90–91, 106–7, 144, 148–49, 156, 180, 185–87 disquieting questions about, 6 in education, 148–54 experience vs., 173–74 hypothetical loss of, 5 paradox of, 6, 12, 179 pool of data in, 95–100 surveillance and, 89–109 typified and dephysicalized objects in, 69 unequal distribution of, 144–45 see also Internet of Things information theory, 12 infosphere: defined, 10 feedback loop of social constructs in, 72–73 network of, 180 pollution of, 148 vastness of, 128 InnoCentive, 136–37, 141 institutions, cooperative, 60–61 intellectual labor, 139–40 International Telecommunications Union, 135 Internet: author’s experiment in circumventing, 21–24, 25, 35 in challenges to reasonableness, 41–63 changes wrought by, xv–xviii, 6–7, 10–11, 23, 180, 184–88 as a construction, 69 cost and profit debate over, 145 as epistemic resource, 143–45 expectations of, 80–83 as force for cohesion and democracy, 55–63 freedom both limited and enhanced by, 92–93 international rates of access to, 135, 144–45 monopolization and hegemony in, 145–46 as network, 111–13 “third wave” of, 7 see also World Wide Web; specific applications Internet of Everything, 184 Internet of Things: blurring of online and offline in, 71 defined, 7–8 integration of, 10 shared economy in, 140–41 threat from, 107, 153, 184–88 Internet of Us, digital form of life as, 10, 39, 73, 83–86, 106, 179–88 interracial marriage, 54 interrogation techniques, 105 In the Plex (Levy), 5–6 Intrade, 122–23, 136 intuition, 15, 51–53 iPhone, production of, 77–78, 80, 139, 144 IQ, 52 Iraq, 83 Iraq War, 137 ISIS, 128 isolation, polarization and, 42–43 I think, I exist, 127 James, William, 11 Jefferson, Thomas, 143 Jeppesen, Lars Bo, 137 joint commitments, defined, 117–18 journalism, truth and, 84 judgment, 51–55, 57 collective vs. individual, 117, 120–25 justice, 54 “just so” stories, 27–28 Kahneman, Daniel, 29, 51 Kant, Immanuel, 34, 58–60, 62, 85 Kitcher, Philip, 182 knowing-which, as term, 171 knowledge: in big data revolution, 87–190 changing structure of, 125–32 common, 117–19 defined and explained, xvii, 12–17 democratization of, 133–38 digital, see digital knowledge; Google-knowing distribution of, 134–35, 138, 141 diverse forms of, 130 economy of, 138–45 hyperconnectivity of, 184–88 individual vs. aggregate, 120–24 information vs., 14 Internet revolution in, xv–xviii minimal definition of, 14–15 as networked, 111–32 new aspects in old problems of, 1–86, 90 personal observation in, 33–35 political economy of, 133–54 as power, 9, 98–99, 133, 185–86 practical vs. theoretical, 169, 172 procedural, 167–74 recording and storage of, 127–28 reliability of sources of, 14, 27–31, 39–40, 44–45, 114–16 as a resource, 38–39 shared cognitive process in attainment of, 114–25 three forms of, 15–17 three simple points about, 14–17 truth and, 19, 126 understanding vs. 
other forms of, 6, 16–17, 90, 154, 155–73, 181 value and importance of, 12–13 knowledge-based education, 61 Kodak camera, 89 Koran, 48, 61 Kornblith, Hilary, 194 Krakauer, John, 169 Kuhn, Thomas, 159–60 Lakhani, Karim, 137 Larissa, Greece, 13, 15, 182 Leonhardt, David, 122–23 Levy, Steven, 5–6 liberals, 43 libraries, 22, 134, 153–54 of Alexandria, 8 digital form of life compared to, xvi, 17, 20, 44–45, 56, 63, 128 as epistemic resource, 145 Google treated as, 24 “Library of Babel” (Borges), 17 “Lies, Damned Lies, and ‘Fact-Checking’: The Liberal Media’s Latest Attempt to Control the Discourse” (Hemingway), 46 Lifespan of a Fact, The (D’Agata), 79 literacy, 35, 134 literal artifacts: defined, 69 social artifacts and, 71, 72 lobectomy, 168 Locke, John, 33–36, 39, 60, 67–70, 85, 127, 143 “Locke’s command,” 33–34 London Underground, mapping of, 112–13 machines, control by, 116 “mainstream” media, 32 censorship of, 66 majority rule, 120 manipulation: data mining and, 97, 104–6 of expectations, 80–82 persuasion and, 55, 57–58, 81–83, 86 manuals, 22 manufacturing, 138–39 maps, 21–22 marine chronometer, 137 marketing: bots in, 82 Glauconian, 58 targeted, 9, 90, 91, 105 marriage: changing attitudes toward, 53–54 civil vs. religious, 58–59 as social construct, 72 martial arts, 170 mass, as primary quality, 68 Massive Open Online Courses (MOOCs), 150–53 mathematics, in data analysis, 160, 161 Matrix, The, 18–19, 75 Mayer-Schönberger, Viktor, 8, 158–59 measles vaccine, 7, 124 Mechanical Turk, 136, 141 media, 134 diversity in, 42 opinion affected by, 53 sensationalist, 77 memory: accessing of, 114, 115 in educational models, 152 loss of, 168–69 superceded by information technology, xv–xvi, 3, 4, 6, 94, 149 trust in, 28, 33 Meno, 13 merchandising, online vs. brick and mortar, 70 Mercier, Hugo, 54 metrics, 112 Milner, Brenda, 168–69 mirror drawing experiment, 169 misinformation, 6–7, 31–32 in support of moral truth, 78–80, 82 mob mentality, 32–33 MOOCs (Massive Open Online Courses), 150–53 moral dumbfounding, 52 morality, moral values, xvii, 6, 44, 53–54, 195 “Moses Illusion,” 29–30 motor acuity, mastery of, 170–71, 173 motor skills, 167–74 Murray, Charles J., 147 music, as dephysicalized object, 69–70 Nagel, Thomas, 84 naming, identification by, 94 narrative license, truth and falsehood in, 78–79 National Endowment for the Humanities, 61 National Science Foundation, 61 Nature, 158, 161 Netflix, 69, 145 Net neutrality, defined, 145 netography, 112–13 of knowledge, 125–32 networked age, 111 networks, 111–32 collective knowledge of, 116–25, 180 knowledge reshaped and altered by, 125–32, 133, 140 in problem solving, 136 use of term, 111–12 neural system, 26 neural transplants, 3, 5 Neurath, Otto, 128–29 neuromedia, 3–5, 12, 17–19, 113–14, 132, 149, 168, 180–82, 184 limitations of, 174 as threat to education, 153–54 Newton, Isaac, 175 New Yorker, 25, 26 New York Times, 122, 174 Nietzsche, Friedrich, 111 Nobel laureates, 149 noble lie, 83, 86 nonfiction, 79–80 NPR, 78, 80 NSA: alleged privacy abuses by, 98–100, 138 data mining by, 9, 91, 95–96, 108, 167 proposed limitations on, 109 Ntrepid, 81 nuclear weapons technology, xvii nullius in verba (take nobody’s word for it), 34 Obama, Barack, 7, 100 administration, 109 objectivity, objective truth, 45, 74 as anchor for belief, 131 in constructed world, 83–86 as foundation for knowledge, 127 observation, 49, 60 affected by expectations, 159–60 behavior affected by, 91, 97 “oceanic feeling,” 184 “offlife,” 70 OkCupid, 157 “onlife,” 70 online identity creation, 
73–74 online ranking, 119–21, 136 open access research sharing sites, 135–36 open society: closed politics vs., 144–45 values of, 41–43, 62 open source software, 135 Operation Earnest Voice, 81 Operation Ivy, ix opinion: knowledge vs., 13, 14, 126 in online ranking, 119–20 persuasion and, 50–51 truth as constructed by, 85–86 optical illusions, 67 Oracle of Delphi, 16–17, 171 Outcome-Based Education (OBE), 61–62 ownership, changing concept of, 73 ox, experiment on weight of, 120 Oxford, 168 Page, Larry, 5–6 Panopticon, 91, 92, 97 perception: acuity of, 173 distinguishing truth in, 67–74 expectations and, 159–60 misleading, 29–30, 67 as relative, 67–68 perceptual incongruity, 159–60 personal freedom, 101 persuasion, 50–51, 54–55, 56–58 by bots, 82 phone books, 22 phone data collection, 95, 108 photography: privacy and, 89, 93 sexually-explicit, 99 photo-sharing, manipulation in, 82–83 Plato, 13–14, 16–17, 54, 59, 83, 126, 165–67 polarization, 7 herd mentality in, 66 isolated tribes in, 43–46 politics, 162, 196 accessibility in, 23 activism in, 66, 67 bias in, 43–46 closed, 144–45 elections in, 120–23 of knowledge, 133–54 opposition to critical thinking in, 61–62 persuasion in, 57–58, 82–83 power in, 86, 133 prediction market in, 122–23 Politifact, 46 Popper, Karl, 41–43 Postman, L.


pages: 218 words: 63,471

How We Got Here: A Slightly Irreverent History of Technology and Markets by Andy Kessler

Albert Einstein, Andy Kessler, animal electricity, automated trading system, bank run, Big bang: deregulation of the City of London, Black Monday: stock market crash in 1987, Bletchley Park, Bob Noyce, Bretton Woods, British Empire, buttonwood tree, Charles Babbage, Claude Shannon: information theory, Corn Laws, cotton gin, Dennis Ritchie, Douglas Engelbart, Edward Lloyd's coffeehouse, Fairchild Semiconductor, fiat currency, fixed income, floating exchange rates, flying shuttle, Fractional reserve banking, full employment, GPS: selective availability, Grace Hopper, invention of the steam engine, invention of the telephone, invisible hand, Isaac Newton, Jacquard loom, James Hargreaves, James Watt: steam engine, John von Neumann, joint-stock company, joint-stock limited liability company, Joseph-Marie Jacquard, Ken Thompson, Kickstarter, Leonard Kleinrock, Marc Andreessen, Mary Meeker, Maui Hawaii, Menlo Park, Metcalfe's law, Metcalfe’s law, military-industrial complex, Mitch Kapor, Multics, packet switching, pneumatic tube, price mechanism, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, proprietary trading, railway mania, RAND corporation, Robert Metcalfe, Silicon Valley, Small Order Execution System, South Sea Bubble, spice trade, spinning jenny, Steve Jobs, Suez canal 1869, supply-chain management, supply-chain management software, systems thinking, three-martini lunch, trade route, transatlantic slave trade, tulip mania, Turing machine, Turing test, undersea cable, UUNET, Wayback Machine, William Shockley: the traitorous eight

And that’s how it goes at Bell Labs, you play around with one thing and invent another. But the task of Bell Labs was to improve communications. Even into the 1940s, communications was just two wires with electrical signals representing voices running down them. In 1941, a University of Michigan undergrad and MIT grad, Claude Shannon, joined Bell Labs. As per custom, no one told him what to do. So he started trying to apply mathematics to communications. He wasn’t interested so much in the signals as in the probability that what one shoved into one end of a channel would come out the other end intact, through all the noise that impairs the signal.

How much information could you transmit? Or how clear could the voice signal be? Through a series of papers starting in 1948 came Shannon’s Law, which calculated the maximum throughput of error-free information through a channel with a certain amount of noise in it. Put another way, Shannon laid the foundation of modern information theory, turned encryption into a science (the fallibility of Enigma-like machines is behind us) and set limits on what wireless networks could be used for. Pretty cool for 1948. Even today, an entrepreneur will occasionally introduce some newfangled communications system or protocol that promises massive throughput and it usually takes 24 hours for someone to punch holes in it for violating Shannon’s Law.
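The limit Kessler is describing is usually written as the Shannon–Hartley capacity formula (standard textbook notation, not the book's own):

```latex
% Maximum error-free data rate C (bits per second) of a channel with
% bandwidth B (hertz) and signal-to-noise power ratio S/N:
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```

For a 3 kHz telephone line with a 30 dB signal-to-noise ratio (S/N = 1,000), this gives C ≈ 3000 × log2(1001) ≈ 30 kbit/s — roughly where analog modems topped out, and the bound that any too-good-to-be-true throughput claim gets checked against.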


pages: 254 words: 76,064

Whiplash: How to Survive Our Faster Future by Joi Ito, Jeff Howe

3D printing, air gap, Albert Michelson, AlphaGo, Amazon Web Services, artificial general intelligence, basic income, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, Black Swan, Bletchley Park, blockchain, Burning Man, business logic, buy low sell high, Claude Shannon: information theory, cloud computing, commons-based peer production, Computer Numeric Control, conceptual framework, CRISPR, crowdsourcing, cryptocurrency, data acquisition, deep learning, DeepMind, Demis Hassabis, digital rights, disruptive innovation, Donald Trump, double helix, Edward Snowden, Elon Musk, Ferguson, Missouri, fiat currency, financial innovation, Flash crash, Ford Model T, frictionless, game design, Gerolamo Cardano, informal economy, information security, interchangeable parts, Internet Archive, Internet of things, Isaac Newton, Jeff Bezos, John Harrison: Longitude, Joi Ito, Khan Academy, Kickstarter, Mark Zuckerberg, microbiome, move 37, Nate Silver, Network effects, neurotypical, Oculus Rift, off-the-grid, One Laptop per Child (OLPC), PalmPilot, pattern recognition, peer-to-peer, pirate software, power law, pre–internet, prisoner's dilemma, Productivity paradox, quantum cryptography, race to the bottom, RAND corporation, random walk, Ray Kurzweil, Ronald Coase, Ross Ulbricht, Satoshi Nakamoto, self-driving car, SETI@home, side project, Silicon Valley, Silicon Valley startup, Simon Singh, Singularitarianism, Skype, slashdot, smart contracts, Steve Ballmer, Steve Jobs, Steven Levy, Stewart Brand, Stuxnet, supply-chain management, synthetic biology, technological singularity, technoutopianism, TED Talk, The Nature of the Firm, the scientific method, The Signal and the Noise by Nate Silver, the strength of weak ties, There's no reason for any individual to have a computer in his home - Ken Olsen, Thomas Kuhn: the structure of scientific revolutions, Two Sigma, universal basic income, unpaid internship, uranium enrichment, urban planning, warehouse automation, warehouse robotics, Wayback Machine, WikiLeaks, Yochai Benkler

Although the project remained secret until the 1970s, and all of the records associated with it were destroyed, several of the people who had worked on the project went on to build the next generation of digital computers.31 Much of their work was informed by two papers published by Claude Shannon in the late 1940s, “A Mathematical Theory of Communication”32 and “Communication Theory of Secrecy Systems,” 33 which established the field of information theory and proved that any theoretically unbreakable cipher must share the characteristics of the one-time pad. Originally developed in the late nineteenth century, and rediscovered near the end of the First World War, the one-time pad requires that both the sender and the receiver have a key made up of a string of random digits at least the length of the message.
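As a concrete illustration of the one-time pad described above, here is a minimal sketch (my own, with illustrative names; the XOR-over-bytes formulation is the standard modern rendering of the scheme):

```python
import secrets

def otp_xor(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with the matching key byte.

    The key must be truly random, at least as long as the message,
    and never reused -- the conditions Shannon proved are required
    for a theoretically unbreakable cipher.
    """
    if len(key) < len(message):
        raise ValueError("key must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))    # fresh random key, used once
ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message  # XORing twice recovers the text
```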

The most obvious example of an emergent system created by humans is the economy, which clearly exhibits attributes that no individual could control. We tend to think of markets as little more than the site at which buyers meet sellers to conduct their business. But as Austrian economist Friedrich Hayek observed in a 1945 paper regarded as one of the foundational texts of information theory, markets do something far more valuable: They gather and utilize knowledge which is “widely dispersed among individuals,” Hayek writes. “Each member of society can have only a small fraction of the knowledge possessed by all, and… each is therefore ignorant of most of the facts on which the working of society rests.”

The irony is that, as with the astrological explanation of the black plague, Sokal’s paper isn’t exactly wrong; it’s just right within a coherent and uselessly recondite system of understanding, like a winning argument in a language spoken only on some remote island. None of this is to dispute the central role theory has played in the effusion of knowledge that occurred over the last century and a half. But theory on its own can be as seductive as it is dangerous. Practice must inform theory just as theory must inform practice; in a world of rapid change, this is more important than ever. In the coming years some scientific discoveries are sure to test some of our most cherished beliefs. We need to make sure we don’t assume the role of the Vatican when confronted by evidence that we’re just another planet revolving around a star.


pages: 791 words: 85,159

Social Life of Information by John Seely Brown, Paul Duguid

Alvin Toffler, business process, Charles Babbage, Claude Shannon: information theory, computer age, Computing Machinery and Intelligence, cross-subsidies, disintermediation, double entry bookkeeping, Frank Gehry, frictionless, frictionless market, future of work, George Gilder, George Santayana, global village, Goodhart's law, Howard Rheingold, informal economy, information retrieval, invisible hand, Isaac Newton, John Markoff, John Perry Barlow, junk bonds, Just-in-time delivery, Kenneth Arrow, Kevin Kelly, knowledge economy, knowledge worker, lateral thinking, loose coupling, Marshall McLuhan, medical malpractice, Michael Milken, moral hazard, Network effects, new economy, Productivity paradox, Robert Metcalfe, rolodex, Ronald Coase, scientific management, shareholder value, Shoshana Zuboff, Silicon Valley, Steve Jobs, Superbowl ad, tacit knowledge, Ted Nelson, telepresence, the medium is the message, The Nature of the Firm, the strength of weak ties, The Wealth of Nations by Adam Smith, Thomas Malthus, transaction costs, Turing test, Vannevar Bush, Y2K

Sometime around 2012, it has been predicted, Moore's Law will come up against the physical limitations of current microchip components, though by then solid-state components may well have been replaced.
6. Kelly, 1997.
7. Negroponte, 1995. John Tukey coined the term bit. It stands for "binary digit."
8. It's worth remembering that formal information theory, while it holds the bit as a central concern, is indifferent to meaning. Claude Shannon, who with Warren Weaver laid the foundations of modern information theory, is quite clear about this: "the semantic aspects of communication are irrelevant to the engineering aspects" (Shannon and Weaver, 1964, p. 8).
9. The pervasive image of the new open frontier only works if we forget the presence on the old frontier of the U.S.

And while it seems quite reasonable to say, "I've got the information, but I don't understand it," it seems less reasonable to say, "I know, but I don't understand," or "I have the knowledge, but I can't see what it means." (Indeed, while conventional uses of information don't necessarily coincide with the specialist uses, as we noted earlier, "information theory" holds information to be independent of meaning.)5

Where Is the Knower Lost in the Information?

Knowledge's personal attributes suggest that the shift toward knowledge may (or should) represent a shift toward people. Focusing on process, as we argued, draws attention away from people, concentrating instead on disembodied processes and the information that drives them.

So, even when people are learning about, in Bruner's terms, the identity they are developing determines what they pay attention to and what they learn. What people learn about, then, is always refracted through who they are and what they are learning to be. 27 So information, while a critical part of learning, is only one among many forces at work. Information theory portrays information as a change registered in an otherwise steady state. It's a light flashing out on a dark hillside (to borrow an example from the philosopher Fred Dretske28) or the splash of a pebble breaking the calm of a still lake. In either case, the result, as the anthropologist Gregory Bateson puts it neatly, is "a difference that makes a difference."29 The importance of disturbance or change makes it almost inevitable that we focus on these.


pages: 291 words: 80,068

Framers: Human Advantage in an Age of Technology and Turmoil by Kenneth Cukier, Viktor Mayer-Schönberger, Francis de Véricourt

Albert Einstein, Andrew Wiles, Apollo 11, autonomous vehicles, Ben Bernanke: helicopter money, Berlin Wall, bitcoin, Black Lives Matter, blockchain, Blue Ocean Strategy, circular economy, Claude Shannon: information theory, cognitive dissonance, cognitive load, contact tracing, coronavirus, correlation does not imply causation, COVID-19, credit crunch, CRISPR, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, deep learning, DeepMind, defund the police, Demis Hassabis, discovery of DNA, Donald Trump, double helix, Douglas Hofstadter, Elon Musk, en.wikipedia.org, fake news, fiat currency, framing effect, Francis Fukuyama: the end of history, Frank Gehry, game design, George Floyd, George Gilder, global pandemic, global village, Gödel, Escher, Bach, Higgs boson, Ignaz Semmelweis: hand washing, informal economy, Isaac Newton, Jaron Lanier, Jeff Bezos, job-hopping, knowledge economy, Large Hadron Collider, lockdown, Louis Pasteur, Mark Zuckerberg, Mercator projection, meta-analysis, microaggression, Mustafa Suleyman, Neil Armstrong, nudge unit, OpenAI, packet switching, pattern recognition, Peter Thiel, public intellectual, quantitative easing, Ray Kurzweil, Richard Florida, Schrödinger's Cat, scientific management, self-driving car, Silicon Valley, Steve Jobs, Steven Pinker, TED Talk, The Structural Transformation of the Public Sphere, Thomas Kuhn: the structure of scientific revolutions, TikTok, Tim Cook: Apple, too big to fail, transaction costs, Tyler Cowen

Ones are on display at the British Science Museum and the University of Cambridge. On Andrew Lo and economics: Andrew W. Lo, Adaptive Markets: Financial Evolution at the Speed of Thought (Princeton, NJ: Princeton University Press, 2017). Reframing the economy: For a fascinating reframing of economics through the lens of Claude Shannon’s information theory, see: George Gilder, Knowledge and Power: The Information Theory of Capitalism and How It Is Revolutionizing our World (Washington, DC: Regnery, 2013). The idea of a “circular economy” is another example, viewing products in terms of a life cycle. On being open-minded and curious: A good resource is David Epstein, Range: Why Generalists Triumph in a Specialized World (New York: Riverhead, 2019).


pages: 288 words: 86,995

Rule of the Robots: How Artificial Intelligence Will Transform Everything by Martin Ford

AI winter, Airbnb, algorithmic bias, algorithmic trading, Alignment Problem, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, artificial general intelligence, Automated Insights, autonomous vehicles, backpropagation, basic income, Big Tech, big-box store, call centre, carbon footprint, Chris Urmson, Claude Shannon: information theory, clean water, cloud computing, commoditize, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, data is the new oil, data science, deep learning, deepfake, DeepMind, Demis Hassabis, deskilling, disruptive innovation, Donald Trump, Elon Musk, factory automation, fake news, fulfillment center, full employment, future of work, general purpose technology, Geoffrey Hinton, George Floyd, gig economy, Gini coefficient, global pandemic, Googley, GPT-3, high-speed rail, hype cycle, ImageNet competition, income inequality, independent contractor, industrial robot, informal economy, information retrieval, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Jeff Bezos, job automation, John Markoff, Kiva Systems, knowledge worker, labor-force participation, Law of Accelerating Returns, license plate recognition, low interest rates, low-wage service sector, Lyft, machine readable, machine translation, Mark Zuckerberg, Mitch Kapor, natural language processing, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, Ocado, OpenAI, opioid epidemic / opioid crisis, passive income, pattern recognition, Peter Thiel, Phillips curve, post scarcity, public intellectual, Ray Kurzweil, recommendation engine, remote working, RFID, ride hailing / ride sharing, Robert Gordon, Rodney Brooks, Rubik’s Cube, Sam Altman, self-driving car, Silicon Valley, Silicon Valley startup, social distancing, SoftBank, South of Market, San Francisco, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, superintelligent machines, TED Talk, The Future of Employment, The Rise and Fall of American Growth, the scientific method, Turing machine, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, Uber and Lyft, uber lyft, universal basic income, very high income, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, WikiLeaks, women in the workforce, Y Combinator

The goals were both ambitious and optimistic; the conference proposal declared that “an attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves” and promised that the organizers believed a “significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for a summer.”3 Attendees included Marvin Minsky, who along with McCarthy became one of the world’s most celebrated AI researchers and founded the Computer Science and Artificial Intelligence Lab at MIT, and Claude Shannon, a legendary electrical engineer who formulated the principles of information theory that underlie electronic communication and make the internet possible. The brightest mind, however, was notably absent from the Dartmouth conference. Alan Turing had committed suicide two years earlier. Prosecuted for a same-sex relationship under the “indecency” laws then in force in Britain, Turing was given a choice between imprisonment and chemical castration through the forced introduction of estrogen.


pages: 285 words: 86,853

What Algorithms Want: Imagination in the Age of Computing by Ed Finn

Airbnb, Albert Einstein, algorithmic bias, algorithmic management, algorithmic trading, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, bitcoin, blockchain, business logic, Charles Babbage, Chuck Templeton: OpenTable, Claude Shannon: information theory, commoditize, Computing Machinery and Intelligence, Credit Default Swap, crowdsourcing, cryptocurrency, data science, DeepMind, disruptive innovation, Donald Knuth, Donald Shoup, Douglas Engelbart, Elon Musk, Evgeny Morozov, factory automation, fiat currency, Filter Bubble, Flash crash, game design, gamification, Google Glasses, Google X / Alphabet X, Hacker Conference 1984, High speed trading, hiring and firing, Ian Bogost, industrial research laboratory, invisible hand, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Markoff, Just-in-time delivery, Kickstarter, Kiva Systems, late fees, lifelogging, Loebner Prize, lolcat, Lyft, machine readable, Mother of all demos, Nate Silver, natural language processing, Neal Stephenson, Netflix Prize, new economy, Nicholas Carr, Nick Bostrom, Norbert Wiener, PageRank, peer-to-peer, Peter Thiel, power law, Ray Kurzweil, recommendation engine, Republic of Letters, ride hailing / ride sharing, Satoshi Nakamoto, self-driving car, sharing economy, Silicon Valley, Silicon Valley billionaire, Silicon Valley ideology, Silicon Valley startup, SimCity, Skinner box, Snow Crash, social graph, software studies, speech recognition, statistical model, Steve Jobs, Steven Levy, Stewart Brand, supply-chain management, tacit knowledge, TaskRabbit, technological singularity, technological solutionism, technoutopianism, the Cathedral and the Bazaar, The Coming Technological Singularity, the scientific method, The Signal and the Noise by Nate Silver, The Structural Transformation of the Public Sphere, The Wealth of Nations by Adam Smith, transaction costs, traveling salesman, Turing machine, Turing test, Uber and Lyft, Uber for X, uber lyft, urban planning, Vannevar Bush, Vernor Vinge, wage slave

And yet only with probability—more important, a language of probability—can we begin to describe our relativistic universe. But far more unsettling, and the central thesis of the closely allied field of information theory, is the notion that probability applies to information as much as to material reality. By framing information as uncertainty, as surprise, as unpredicted new data, mathematician Claude Shannon created a quantifiable measurement of communication.30 Shannon’s framework has informed decades of work in signal processing, cryptography, and several other fields, but its starkly limited view of what counts has become a major influence in contemporary understandings of computational knowledge.

Wiener names Leibniz the patron saint of cybernetics: “The philosophy of Leibniz centers about two closely related concepts—that of a universal symbolism and that of a calculus of reasoning.”27 As the book’s title suggests, the aim of cybernetics in the 1940s and 1950s was to define and implement those two ideas: an intellectual system that could encompass all scientific fields, and a means of quantifying change within that system. Using them, the early cyberneticians sought to forge a synthesis between the nascent fields of computer science, information theory, physics, and many others (indeed, Wiener nominated his patron saint in part as the last man to have “full command of all the intellectual activity of his day”).28 The vehicle for this synthesis was, intellectually, the field of information theory and the ordering features of communication between different individual and collective entities, and pragmatically, the growing power of mechanical and computational systems to measure, modulate, and direct such communications.

Chapter 1 lays out a full reading of the algorithm as a critical concept across four intellectual strands, beginning with its foundations in computer science and the notion of “effective computability.” The second strand considers cybernetics and ongoing debates about embodiment, abstraction, and information theory. Third, I return to magic and its overlap with symbolism, engaging with notions of software, “sourcery,” and the power of metaphors to represent reality. Fourth, I draw in the long history of technicity and humanity’s coevolution with our cultural tools. Synthesizing these threads, I offer a definition of the algorithm as culture machine in the context of process and implementation.


pages: 374 words: 94,508

Infonomics: How to Monetize, Manage, and Measure Information as an Asset for Competitive Advantage by Douglas B. Laney

3D printing, Affordable Care Act / Obamacare, banking crisis, behavioural economics, blockchain, book value, business climate, business intelligence, business logic, business process, call centre, carbon credits, chief data officer, Claude Shannon: information theory, commoditize, conceptual framework, crowdsourcing, dark matter, data acquisition, data science, deep learning, digital rights, digital twin, discounted cash flows, disintermediation, diversification, en.wikipedia.org, endowment effect, Erik Brynjolfsson, full employment, hype cycle, informal economy, information security, intangible asset, Internet of things, it's over 9,000, linked data, Lyft, Nash equilibrium, Neil Armstrong, Network effects, new economy, obamacare, performance metric, profit motive, recommendation engine, RFID, Salesforce, semantic web, single source of truth, smart meter, Snapchat, software as a service, source of truth, supply-chain management, tacit knowledge, technological determinism, text mining, uber lyft, Y2K, yield curve

Others worthy of consideration as well include:
- Douglas Hubbard’s “applied information economics” methodology strictly for measuring the decision value of information.15
- Bill Schmarzo’s “data economic valuation” approach, also strictly for measuring information’s contribution to decision making.16
- Paul Strassman’s macroeconomic method of comparing the competitive gains of organizations with similar tangible assets, after accounting for all other valuation premium factors.17
- Dilip Krishna’s methods for attributing business outcomes to specific information initiatives (Business Impact Model); an adapted discounted cash flow model (a Monte Carlo simulation method); and a comparative analysis approach similar to Strassman’s but using a pure-play information company for comparison.18
- Tonie Leatherberry’s and Rena Mears’s “net business value” method that considers information’s present and discounted future value to each department, various risks, and total cost of ownership.19
- Robert Schmidt’s and Jennifer Fisher’s promising but loosely defined and abandoned patent application for an amalgam of information cost, accuracy, and other quality factors, along with information usage/access.20
- Mark Albala’s conceptual valuation model, similar to that of Schmidt and Fischer, in which information value is based on information requests, usage (royalties), and outcomes.21
- Dave McCrory’s concept and formula for what he calls “data gravity,” which defines the proximal relationship among data sources and applications.22

As well, software and services companies such as Schedule1, Pimsoft, Everedge, ThreatModeler, Alex Solutions, Datum, Alation, and Real Social Dynamics (RSD) [in collaboration with the Geneva School of Business Administration] have developed exclusive approaches for measuring information value and/or risk in economic terms specific to their core offerings. Of course, countless papers and books have been written on the value of information from an engineering, communications, and theoretical standpoint, dating back to Claude Shannon’s seminal work on information theory. Even the term “business intelligence” predates any Gartner or IBM paper, having been first used in the 1865 book “Cyclopaedia of Commercial and Business Anecdotes” to describe the value of information on military battles.23

Understanding and Closing Information Value Gaps

Enterprises repeatedly fall into the trap of getting diminishing returns from buying and applying more complex and leading-edge IT to the same narrow sets of data, rather than doing simpler things with the data that has the highest value gap.

Infonomics is a broad concept I conceived and first mentioned around the turn of the millennium to express information’s increasing behavior and importance as an economic asset. Over the years I have continued to research and develop the concept—exploring, relating, and integrating the disciplines of information theory, accounting, asset management, property ownership and rights, measurement, innovation, and economics. This has taken me from working with and interviewing clients, colleagues, information consultants, valuation experts, accountants, economists, and academics on these topics—to the depths of reading hundreds of pages of accounting standards body discussion papers on the recognition, and valuation and reporting of intangibles.

What could those of us in the decades-old information world possibly learn from them? A lot. One last note on ecosystems: if you think the whole concept of drawing parallels between biological and information ecosystems is a bit whacked, consider that the leading theoretical ecologist, Robert Ulanowicz, has employed modern information theory concepts to describe the complex structure of biological ecosystems themselves.11 Lessons from Sustainability When an ecosystem is healthy, scientists say it is in balance or sustainable. Since we want our information ecosystems to be sustainable and thrive, let’s take a look at how we might incorporate the principles of sustainability or conservation.


pages: 297 words: 89,820

The Perfect Thing: How the iPod Shuffles Commerce, Culture, and Coolness by Steven Levy

Apple II, Bill Atkinson, British Empire, Claude Shannon: information theory, en.wikipedia.org, General Magic, Herbert Marcuse, indoor plumbing, Internet Archive, Jeff Bezos, John Markoff, Joi Ito, Jony Ive, Kevin Kelly, reality distortion field, Sand Hill Road, Saturday Night Live, Silicon Valley, social web, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, technology bubble, the long tail, Thomas L Friedman, Tony Fadell

This is well known to cryptographers. A well-funded, sophisticated cryptanalyst will seize on any variation from a random distribution as a means of attacking a code. This subject was most famously examined by Claude Shannon, arguably the Father of Randomness. Shannon himself expressed some random behavior: the MIT math professor was known for his eccentric habits, which included riding a unicycle. But his papers on information theory are rock solid. Basically, he defined randomness as a question of unpredictability. If a series of numbers is truly random, you have no possible way of guessing what comes next. If something isn't random (as in the case of what letter might follow another in a message written in English), you have a better chance of figuring out what comes next.
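Shannon's unpredictability criterion can be made concrete in a few lines of code (a toy sketch of my own, not from the book): the letter distribution of English text has measurably lower entropy than uniformly random letters, which is exactly the "better chance of figuring out what comes next" that a cryptanalyst exploits.

```python
from collections import Counter
from math import log2

def letter_entropy(text: str) -> float:
    """Shannon entropy of the letter distribution, in bits per letter."""
    letters = [c for c in text.lower() if c.isalpha()]
    total = len(letters)
    return -sum((n / total) * log2(n / total)
                for n in Counter(letters).values())

sample = ("if a series of numbers is truly random you have no possible "
          "way of guessing what comes next")
print(letter_entropy(sample))  # around 4 bits: English letters are skewed
print(log2(26))                # ~4.70 bits: the uniform, fully random limit
```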


pages: 418 words: 102,597

Being You: A New Science of Consciousness by Anil Seth

AlphaGo, artificial general intelligence, augmented reality, backpropagation, carbon-based life, Claude Shannon: information theory, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, correlation does not imply causation, CRISPR, cryptocurrency, deep learning, deepfake, DeepMind, Drosophila, en.wikipedia.org, Filter Bubble, GPT-3, GPT-4, John Markoff, longitudinal study, Louis Pasteur, mirror neurons, Neil Armstrong, Nick Bostrom, Norbert Wiener, OpenAI, paperclip maximiser, pattern recognition, Paul Graham, Pierre-Simon Laplace, planetary scale, Plato's cave, precautionary principle, Ray Kurzweil, self-driving car, speech recognition, stem cell, systems thinking, technological singularity, TED Talk, telepresence, the scientific method, theory of mind, Thomas Bayes, TikTok, Turing test

Testing this requires measuring Φ for real systems, and this is where the trouble starts. It turns out that measuring Φ is extremely challenging and in most cases nearly or actually impossible. The main reason for this is that IIT treats ‘information’ in an unusual way. The standard use of information in mathematics, developed by Claude Shannon in the 1950s, is observer-relative. Observer-relative (or extrinsic) information is the degree to which uncertainty is reduced, from the perspective of an observer, by observing a system in a particular state. For example, imagine rolling a single die many times. Each time, you observe one outcome out of six possibilities: each time, five alternatives are ruled out.
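Seth's die example works out numerically as follows (a standard calculation in Shannon's framework, not the book's own notation):

```latex
% Observer-relative information gained from one roll of a fair die:
% six equally likely outcomes, so observing one of them reduces
% the observer's uncertainty by
I = \log_2 6 \approx 2.58 \ \text{bits}
```

Each observation rules out five of the six possibilities, and the logarithm of the number of equally likely alternatives is exactly the uncertainty removed.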

‘Out-of-body experiences: from Penfield to present’. Trends in Cognitive Sciences, 7(3), 104–6. Tononi, G. (2008). ‘Consciousness as integrated information: a provisional manifesto’. Biological Bulletin, 215(3), 216–42. Tononi, G. (2012). ‘Integrated information theory of consciousness: an updated account’. Archives italiennes de biologie, 150(4), 293–329. Tononi, G., Boly, M., Massimini, M., et al. (2016). ‘Integrated information theory: from consciousness to its physical substrate’. Nature Reviews Neuroscience, 17(7), 450–61. Tononi, G., & Edelman, G. M. (1998). ‘Consciousness and complexity’. Science, 282(5395), 1846–51. Tononi, G., & Koch, C. (2015).

But this doesn’t mean that consciousness is integrated information, in the same way that temperature is mean molecular kinetic energy. To see what drives this intuition, we need to push the analogy between consciousness and temperature as far as it can go, to see whether and when it breaks down. It’s time to meet the ‘integrated information theory’ of consciousness.

Notes
Joachim Dalencé: www.encyclopedia.com/science/dictionaries-thesauruses-pictures-and-press-releases/dalence-joachim.
Inventing Temperature: H. Chang (2004).
Could the same approach work: Possibly the first person to ask, and answer, this question was Thomas Henry Huxley (‘Darwin’s Bulldog’).


pages: 571 words: 105,054

Advances in Financial Machine Learning by Marcos Lopez de Prado

algorithmic trading, Amazon Web Services, asset allocation, backtesting, behavioural economics, bioinformatics, Brownian motion, business process, Claude Shannon: information theory, cloud computing, complexity theory, correlation coefficient, correlation does not imply causation, data science, diversification, diversified portfolio, en.wikipedia.org, financial engineering, fixed income, Flash crash, G4S, Higgs boson, implied volatility, information asymmetry, latency arbitrage, margin call, market fragmentation, market microstructure, martingale, NP-complete, P = NP, p-value, paper trading, pattern recognition, performance metric, profit maximization, quantitative trading / quantitative finance, RAND corporation, random walk, risk free rate, risk-adjusted returns, risk/return, selection bias, Sharpe ratio, short selling, Silicon Valley, smart cities, smart meter, statistical arbitrage, statistical model, stochastic process, survivorship bias, transaction costs, traveling salesman

For example, the ML algorithm may find that momentum bets are more profitable when prices carry little information, and that mean-reversion bets are more profitable when prices carry a lot of information. In this chapter, we will explore ways to determine the amount of information contained in a price series.

18.2 Shannon's Entropy

In this section we will review a few concepts from information theory that will be useful in the remainder of the chapter. The reader can find a complete exposition in MacKay [2003]. The father of information theory, Claude Shannon, defined entropy as the average amount of information (over long messages) produced by a stationary source of data. It is the smallest number of bits per character required to describe the message in a uniquely decodable way.
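The definition being paraphrased here is Shannon's entropy of a stationary source; in the standard notation (which the chapter develops from MacKay [2003]), a source emitting symbol i with probability p_i has

```latex
H = -\sum_{i} p_i \log_2 p_i \quad \text{bits per symbol.}
```

A fair coin yields H = 1 bit per symbol, while a coin landing heads with probability 0.9 yields H ≈ 0.47 bits, so long messages from the biased source can be described in roughly half a bit per character.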

In order to determine the probability of adverse selection, we must determine how unpredictable the order flow imbalance is. We can determine this by applying information theory. Consider a long sequence of symbols. When that sequence contains few redundant patterns, it encompasses a level of complexity that makes it hard to describe and predict. Kolmogorov [1965] formulated this connection between redundancy and complexity. In information theory, lossless compression is the task of perfectly describing a sequence with as few bits as possible. The more redundancies a sequence contains, the greater compression rates can be achieved.
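The redundancy–compressibility connection in this passage can be illustrated with an off-the-shelf lossless compressor (a toy proxy of my own; the chapter's actual estimators build on Lempel–Ziv parsing):

```python
import random
import zlib

def compression_ratio(s: str) -> float:
    """Compressed size over raw size: lower means more redundancy."""
    raw = s.encode()
    return len(zlib.compress(raw, 9)) / len(raw)

redundant = "buy " * 250                                   # pure repetition
patterned = "buy sell sell buy hold sell buy hold " * 25   # partial structure
rng = random.Random(0)
noisy = "".join(rng.choice("bsh") for _ in range(1000))    # near-random flow

for seq in (redundant, patterned, noisy):
    print(round(compression_ratio(seq), 3))
# The more redundant the sequence, the smaller the ratio -- and, in the
# passage's terms, the more predictable the order flow imbalance.
```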

Kontoyiannis (1998): “Asymptotically optimal lossy Lempel-Ziv coding,” ISIT, Cambridge, MA, August 16–August 21. MacKay, D. (2003): Information Theory, Inference, and Learning Algorithms, 1st ed. Cambridge University Press. Meucci, A. (2009): “Managing diversification.” Risk Magazine, Vol. 22, pp. 74–79. Norwich, K. (2003): Information, Sensation and Perception, 1st ed. Academic Press. Ornstein, D.S. and B. Weiss (1993): “Entropy and data compression schemes.” IEEE Transactions on Information Theory, Vol. 39, pp. 78–83. Shannon, C. (1948): “A mathematical theory of communication.” Bell System Technical Journal, Vol. 27, No. 3, pp. 379–423.


pages: 385 words: 98,015

Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum by Lee Smolin

adjacent possible, Albert Einstein, Brownian motion, Claude Shannon: information theory, cosmic microwave background, cosmological constant, Ernest Rutherford, Isaac Newton, Jane Jacobs, Jaron Lanier, John von Neumann, Murray Gell-Mann, mutually assured destruction, quantum entanglement, Richard Feynman, Richard Florida, Schrödinger's Cat, Stephen Hawking, Stuart Kauffman, the scientific method, Turing machine

However, we should be careful to distinguish several different ideas about the relationship between physics and information, some of which are useful but also trivially true; others of which are radical and would need, in my view, more justification than they’ve been given. Let’s start by defining information. One useful definition was given by Claude Shannon, who may be considered the founder of information theory. His definition was set in the framework of communication, and contemplates a channel which carries a message from a sender to a receiver. These, it is assumed, share a language, by means of which they give meaning to a sequence of symbols. The amount of information in the message is defined to be the number of answers to a set of yes/no questions that the receiver learns from the sender by understanding what the message says.
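Smolin's yes/no-question definition has a simple quantitative form (a standard illustration, not the book's own example): distinguishing one of N equally likely messages requires log2 N binary answers.

```latex
I = \log_2 N \ \text{bits}
% e.g. twenty well-chosen yes/no questions suffice to single out
% one of 2^{20} = 1{,}048{,}576 equally likely possibilities.
```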

Research groups began to spring up around the world, and they quickly filled with brilliant young researchers, many of whom had a dual research strategy in which they would attack the problems in quantum foundations while contributing to the development of quantum computing. As a result, a new language for quantum physics was invented that was based on information theory, which is a basic tool of computer science. This new language, called quantum information theory, is a hybrid of computer science and quantum physics and is well adapted to the challenges of building quantum computers. This has led to a powerful set of tools and concepts that have proved invaluable at sharpening our understanding of quantum physics. Nonetheless, quantum information theory is a purely operational approach that is most comfortable describing nature in the context of experiments, in which systems are prepared and then measured.

This is a long story, with some fascinating ins and outs, but the conclusion is simple: the wave function appears to capture an essential aspect of reality.13 The closest to success I know of is an approach by the mathematician Edward Nelson called stochastic quantum mechanics. For many years I thought this was the right way, but then I understood it requires a large amount of fine-tuning to avoid instabilities. This conclusion is upheld by a recent analysis by three specialists in quantum information theory, Matthew Pusey, Jonathan Barrett, and Terry Rudolph, who gave a new argument to the effect that the quantum state cannot be merely a representation of information an observer has about a system. It must be physically real, or represent something real.14 So we seem to have only two choices: keep the wave function itself as a beable, as it is in pilot wave theory and collapse models, or find another beable that captures, in some different form, the physical reality which the wave function represents.


Data and the City by Rob Kitchin,Tracey P. Lauriault,Gavin McArdle

A Declaration of the Independence of Cyberspace, algorithmic management, bike sharing, bitcoin, blockchain, Bretton Woods, Chelsea Manning, citizen journalism, Claude Shannon: information theory, clean water, cloud computing, complexity theory, conceptual framework, corporate governance, correlation does not imply causation, create, read, update, delete, crowdsourcing, cryptocurrency, data science, dematerialisation, digital divide, digital map, digital rights, distributed ledger, Evgeny Morozov, fault tolerance, fiat currency, Filter Bubble, floating exchange rates, folksonomy, functional programming, global value chain, Google Earth, Hacker News, hive mind, information security, Internet of things, Kickstarter, knowledge economy, Lewis Mumford, lifelogging, linked data, loose coupling, machine readable, new economy, New Urbanism, Nicholas Carr, nowcasting, open economy, openstreetmap, OSI model, packet switching, pattern recognition, performance metric, place-making, power law, quantum entanglement, RAND corporation, RFID, Richard Florida, ride hailing / ride sharing, semantic web, sentiment analysis, sharing economy, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, smart contracts, smart grid, smart meter, social graph, software studies, statistical model, tacit knowledge, TaskRabbit, technological determinism, technological solutionism, text mining, The Chicago School, The Death and Life of Great American Cities, the long tail, the market place, the medium is the message, the scientific method, Toyota Production System, urban planning, urban sprawl, web application

Before positioning data threads within these debates, it is worth contextualizing the assertion that data are material. For what else could they be? Important here is genealogical work on the development of cybernetics and information theory. Through a close reading of post-war cyberneticians such as Norbert Wiener and Claude Shannon, Hayles (1999) shows how information came to be understood as a pattern separate from a material form. This strategy of disembodiment was important in allowing information theory to migrate between academic disciplines such that it could equally be applied to biology or cognitive psychology as to communications engineering. Implicit in this separation between matter and meaning, however, is a latent mind/body dualism.


pages: 313 words: 101,403

My Life as a Quant: Reflections on Physics and Finance by Emanuel Derman

Bear Stearns, Berlin Wall, bioinformatics, Black-Scholes formula, book value, Brownian motion, buy and hold, capital asset pricing model, Claude Shannon: information theory, Dennis Ritchie, Donald Knuth, Emanuel Derman, financial engineering, fixed income, Gödel, Escher, Bach, haute couture, hiring and firing, implied volatility, interest rate derivative, Jeff Bezos, John Meriwether, John von Neumann, Ken Thompson, law of one price, linked data, Long Term Capital Management, moral hazard, Murray Gell-Mann, Myron Scholes, PalmPilot, Paul Samuelson, pre–internet, proprietary trading, publish or perish, quantitative trading / quantitative finance, Sharpe ratio, statistical arbitrage, statistical model, Stephen Hawking, Steve Jobs, stochastic volatility, technology bubble, the new new thing, transaction costs, volatility smile, Y2K, yield curve, zero-coupon bond, zero-sum game

Area 10 had played a large role in evangelizing the dual view of programs as both tools and text, written not only to control electronic machines but also to be understood and manipulated by people. At the labs people were proud of programming and viewed it as an art. In physics and engineering the Labs was an experimental and theoretical powerhouse, producing research in electronics and information theory that made possible many of the subsequent advances in communications. Bardeen, Brattain, and Shockley had invented the transistor there in 1947, and Claude Shannon published his landmark paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in 1948. There were fundamental discoveries made, too: Penzias and Wilson won the Nobel Prize for discovering the cosmic radiation left behind by the Big Bang, as predicted by Robert Herman.


pages: 372 words: 101,174

How to Create a Mind: The Secret of Human Thought Revealed by Ray Kurzweil

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Albert Michelson, anesthesia awareness, anthropic principle, brain emulation, cellular automata, Charles Babbage, Claude Shannon: information theory, cloud computing, computer age, Computing Machinery and Intelligence, Dean Kamen, discovery of DNA, double helix, driverless car, en.wikipedia.org, epigenetics, George Gilder, Google Earth, Hans Moravec, Isaac Newton, iterative process, Jacquard loom, Jeff Hawkins, John von Neumann, Law of Accelerating Returns, linear programming, Loebner Prize, mandelbrot fractal, Nick Bostrom, Norbert Wiener, optical character recognition, PalmPilot, pattern recognition, Peter Thiel, Ralph Waldo Emerson, random walk, Ray Kurzweil, reversible computing, selective serotonin reuptake inhibitor (SSRI), self-driving car, speech recognition, Steven Pinker, strong AI, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Wall-E, Watson beat the top human players on Jeopardy!, X Prize

Communication is pervasive at every level. If we consider that error rates escalate rapidly with increased communication and that a single-bit error can destroy the integrity of a process, digital computation was doomed—or so it seemed at the time. Remarkably, that was the common view until American mathematician Claude Shannon (1916–2001) came along and demonstrated how we can create arbitrarily accurate communication using even the most unreliable communication channels. What Shannon stated in his landmark paper “A Mathematical Theory of Communication,” published in the Bell System Technical Journal in July and October 1948, and in particular in his noisy channel-coding theorem, was that if you have available a channel with any error rate (except for exactly 50 percent per bit, which would mean that the channel was just transmitting pure noise), you are able to transmit a message in which the error rate is as low as you desire.

If that is not good enough, simply increase the redundancy until you get the reliability you need. Simply repeating information is the easiest way to achieve arbitrarily high accuracy rates from low-accuracy channels, but it is not the most efficient approach. Shannon’s paper, which established the field of information theory, presented optimal methods of error detection and correction codes that can achieve any target accuracy through any nonrandom channel. Older readers will recall telephone modems, which transmitted information through noisy analog phone lines. These lines featured audibly obvious hisses and pops and many other forms of distortion, but nonetheless were able to transmit digital data with very high accuracy rates, thanks to Shannon’s noisy channel theorem.
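A toy simulation of the repetition strategy Kurzweil describes (my own sketch; names and parameters are illustrative): each bit is sent n times over a channel that flips bits with probability 0.1, and the receiver decodes by majority vote.

```python
import random

def through_channel(bits, p_flip, rng):
    """Binary symmetric channel: flips each bit with probability p_flip."""
    return [b ^ (rng.random() < p_flip) for b in bits]

def send_with_repetition(bits, n, p_flip, rng):
    """Transmit each bit n times; decode by majority vote."""
    decoded = []
    for b in bits:
        received = through_channel([b] * n, p_flip, rng)
        decoded.append(int(sum(received) > n / 2))
    return decoded

rng = random.Random(42)
message = [rng.randint(0, 1) for _ in range(10_000)]
for n in (1, 3, 7, 15):
    out = send_with_repetition(message, n, 0.1, rng)
    errors = sum(a != b for a, b in zip(message, out))
    print(f"n={n:2d}  error rate={errors / len(message):.4f}")
# The residual error rate falls toward zero as n grows, but the useful
# data rate falls as 1/n -- which is why Shannon's more efficient codes,
# reaching any target accuracy without the rate collapsing, were the prize.
```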


pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable by James Owen Weatherall

Alan Greenspan, Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Apollo 11, Asian financial crisis, bank run, Bear Stearns, beat the dealer, behavioural economics, Benoit Mandelbrot, Black Monday: stock market crash in 1987, Black Swan, Black-Scholes formula, Bonfire of the Vanities, book value, Bretton Woods, Brownian motion, business cycle, butterfly effect, buy and hold, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, coastline paradox / Richardson effect, collateralized debt obligation, collective bargaining, currency risk, dark matter, Edward Lorenz: Chaos theory, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial engineering, financial innovation, Financial Modelers Manifesto, fixed income, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, Jim Simons, John Nash: game theory, junk bonds, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, Market Wizards by Jack D. Schwager, martingale, Michael Milken, military-industrial complex, Myron Scholes, Neil Armstrong, new economy, Nixon triggered the end of the Bretton Woods system, Paul Lévy, Paul Samuelson, power law, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk free rate, risk-adjusted returns, Robert Gordon, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, Stuart Kauffman, The Chicago School, The Myth of the Rational Market, tulip mania, Vilfredo Pareto, volatility smile

Thorp and his wife, Vivian, left Southern California and moved to Cambridge, Massachusetts. They spent only two years on the East Coast before moving back west, to New Mexico. But it was enough to set their lives on a different track: it was at MIT that Thorp met Claude Shannon. Shannon may be the only person in the twentieth century who can claim to have founded an entirely new science. The field he invented, information theory, is essentially the mathematics behind the digital revolution. It undergirds computer science, modern telecommunications, cryptography, and code-breaking. The basic object of study is data: bits (a term Shannon coined) of information.

He accomplished what Bachelier and Osborne never could: he showed that physics and mathematics could be used to profit from financial markets. Building on the work of Bachelier and Osborne, and on his own experience with gambling systems, Thorp invented the modern hedge fund — by applying ideas from a new field that combined mathematical physics and electrical engineering. Information theory, as it’s known, was as much a part of the 1960s as the Vegas Strip. And in Thorp’s hands, it proved to be the missing link between the statistics of market prices and a winning strategy on Wall Street. Thorp was born at the peak of the Depression, on August 14, 1932. His father was a retired army officer, a veteran of the First World War.

The study of things such as how light waves move through air or how human languages work is very old; Shannon’s groundbreaking idea was that you could study the information itself — the stuff that’s carried by the light waves from objects in the world to your retinas, or the stuff that passes from one person to another when they speak — independently of the waves and the words. It is hard to overstate how important this idea would become. Information theory grew out of a project Shannon worked on during World War II, as a staff scientist at Bell Labs, AT&T’s research division in Murray Hill, New Jersey. The goal of the project was to build an encrypted telephone system so that generals at the front could safely communicate with central command.


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

"World Economic Forum" Davos, Ada Lovelace, adjacent possible, affirmative action, AI winter, Alfred Russel Wallace, algorithmic bias, algorithmic management, AlphaGo, Amazon Mechanical Turk, animal electricity, autonomous vehicles, behavioural economics, Black Swan, Brexit referendum, British Empire, Cambridge Analytica, cellular automata, Charles Babbage, citizen journalism, Claude Shannon: information theory, combinatorial explosion, Computing Machinery and Intelligence, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, desegregation, discovery of DNA, disinformation, Douglas Hofstadter, Elon Musk, fake news, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Geoffrey Hinton, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, Linda problem, low skilled workers, Mark Zuckerberg, mass immigration, meta-analysis, mutually assured destruction, natural language processing, new economy, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, post-truth, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, Stuart Kauffman, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, warehouse robotics, women in the workforce, Yochai Benkler

The problem is, Turing’s wall changes what we call ‘communication’ particularly in the human sense of the word, which involves a great deal more than a disembodied, mute exchange of abstract symbols. While Turing was writing about his famous test, Claude Shannon, an American code-breaker and electrical engineer, was theorizing on purely symbolic communications. At Bell Laboratories Shannon developed information theory, a set of ideas that are vital to every modern electronic communication and computation device, because they make reliable data transmission possible, not just between one smartphone and another, or a media provider and a smart TV, but between memory, CPU, hard drive and every other part of a computer, in every computer on Earth.5 Both early computers and telecommunication devices had to send signals (we’ll call them messages for now) down wires, or through the air on radio waves.

Even attendees of the cybernetics conference had doubts about this profound shift in the meaning of these powerful words, and the simultaneous abandonment of the concept of meaning itself. One attendee, the physicist, philosopher and AI pioneer Heinz von Foerster, commented: I wanted to call the whole of what they called information theory signal theory, because information was not yet there. There were ‘beep beeps’ but that was all, no information. The moment one transforms that set of signals into other signals our brains can make an understanding of, then information is born—it’s not in the beeps. Yet information theory is now ubiquitous not only in engineering communication theory, and every single aspect of computing and telecommunications, but in ‘deep’ AI, where algorithms derived from this theory are often used as a means of deriving representations from data, based on probabilistic assumptions.

The connections are numerous, from the representation of thinking as the ‘if–then’ rules of Aristotelian logic, to the modelling of uncertainty based on calculations drawn from dice games, through the reduction of evolution to a philosophy of ‘survival of the fittest’, to the attempt to capture complex human characteristics like ‘intelligence’ in numbers like IQ, to the transformation and reduction of human artisanship into discrete tasks neatly divided to fit mass-manufacturing processes, through an unassailable faith that free markets generate spontaneous order, to viewing the living brain as a simple, synaptic computer, to a staggering reduction of the subtle meanings of language to the bit transmissions of information theory. Each of these simplifications arose in a particular historical and cultural context and has a direct connection to the conception and design of algorithms that now operate all around us today. Just as algorithms have emerged from various historical and cultural contexts, they are also a product engineered by people.


pages: 378 words: 110,518

Postcapitalism: A Guide to Our Future by Paul Mason

air traffic controllers' union, Alan Greenspan, Alfred Russel Wallace, bank run, banking crisis, banks create money, Basel III, basic income, Bernie Madoff, Bill Gates: Altair 8800, bitcoin, Bletchley Park, Branko Milanovic, Bretton Woods, BRICs, British Empire, business cycle, business process, butterfly effect, call centre, capital controls, carbon tax, Cesare Marchetti: Marchetti’s constant, Claude Shannon: information theory, collaborative economy, collective bargaining, commons-based peer production, Corn Laws, corporate social responsibility, creative destruction, credit crunch, currency manipulation / currency intervention, currency peg, David Graeber, deglobalization, deindustrialization, deskilling, discovery of the americas, disinformation, Downton Abbey, drone strike, en.wikipedia.org, energy security, eurozone crisis, factory automation, false flag, financial engineering, financial repression, Firefox, Fractional reserve banking, Frederick Winslow Taylor, fulfillment center, full employment, future of work, game design, Glass-Steagall Act, green new deal, guns versus butter model, Herbert Marcuse, income inequality, inflation targeting, informal economy, information asymmetry, intangible asset, Intergovernmental Panel on Climate Change (IPCC), Internet of things, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Perry Barlow, Joseph Schumpeter, Kenneth Arrow, Kevin Kelly, Kickstarter, knowledge economy, knowledge worker, late capitalism, low interest rates, low skilled workers, market clearing, means of production, Metcalfe's law, microservices, middle-income trap, Money creation, money: store of value / unit of account / medium of exchange, mortgage debt, Network effects, new economy, Nixon triggered the end of the Bretton Woods system, Norbert Wiener, Occupy movement, oil shale / tar sands, oil shock, Paul Samuelson, payday loans, Pearl River Delta, post-industrial society, power law, precariat, precautionary principle, price mechanism, profit motive, quantitative easing, race to the bottom, RAND corporation, rent-seeking, reserve currency, RFID, Richard Stallman, Robert Gordon, Robert Metcalfe, scientific management, secular stagnation, sharing economy, Stewart Brand, structural adjustment programs, supply-chain management, technological determinism, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, Transnistria, Twitter Arab Spring, union organizing, universal basic income, urban decay, urban planning, vertical integration, Vilfredo Pareto, wages for housework, WikiLeaks, women in the workforce, Yochai Benkler

The result was to stimulate an unprecedented culture of cross-fertilization in strategic disciplines. The new approach inserted maths and science into the heart of the industrial process; economics and data management into political decision-making. It was the OSRD that took Claude Shannon, the founder of information theory, out of Princeton and put him into Bell Labs to design algorithms for anti-aircraft guns.21 There, he would meet Alan Turing and discuss the possibility of ‘thinking machines’. Turing, too, had been scooped out of academia by the British government to run the Enigma codebreaking operation at Bletchley Park.

Though the ‘information economy’ lay decades in the future, the post-war economies saw information used on an industrial scale. It flowed as science, as management theory, as data, as mass communications and even – in a few hallowed places – out of a computer and into a tray of folding paper. A transistor is simply a switch with no moving parts. Information theory plus transistors gives you the ability to automate physical processes. So factories throughout the West were re-tooled with semi-automated machinery: pneumatic presses, drills, cutters, lathes, sewing machines and production lines. What they lacked was sophisticated feedback mechanisms: electronic sensors and automated logic systems were so crude that the latter used compressed air to do what we now do with iPhone apps.

Deloitte, who did these calculations, describes the falling price of basic info-tech as exponential: ‘The current pace of technological advance is unprecedented in history and shows no signs of stabilizing as other historical technological innovations, such as electricity, eventually did.’26 It has become commonplace to think of information as ‘immaterial’. Norbert Wiener, one of the founders of information theory, once claimed: ‘Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.’27 But this is a fallacy. In 1961, IBM physicist Rolf Landauer proved, logically, that information is physical.28 He wrote: ‘Information is not a disembodied abstract entity; it is always tied to a physical representation.


pages: 332 words: 109,213

The Scientist as Rebel by Freeman Dyson

"World Economic Forum" Davos, Albert Einstein, Asilomar, Boeing 747, British Empire, Claude Shannon: information theory, dark matter, double helix, Edmond Halley, Ernest Rutherford, experimental subject, Fellow of the Royal Society, From Mathematics to the Technologies of Life and Death, Gregor Mendel, Henri Poincaré, Isaac Newton, Johannes Kepler, John von Neumann, kremlinology, Mikhail Gorbachev, military-industrial complex, Norbert Wiener, Paul Erdős, Plato's cave, precautionary principle, quantum entanglement, Recombinant DNA, Richard Feynman, Ronald Reagan, seminal paper, Silicon Valley, Stephen Hawking, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, undersea cable

In spite of his original intentions, cybernetics became a theory of analog processes. Meanwhile, also in 1948, Claude Shannon published his classic pair of papers with the title “A Mathematical Theory of Communication” in The Bell System Technical Journal. Shannon’s theory was a theory of digital communication, using many of Wiener’s ideas but applying them in a new direction. Shannon’s theory was mathematically elegant, clear, and easy to apply to practical problems of communication. It was far more user-friendly than cybernetics. It became the basis of a new discipline called “information theory.” During the next ten years, digital computers began to operate all over the world, and analog computers rapidly became obsolete.

Electronic engineers learned information theory, the gospel according to Shannon, as part of their basic training, and cybernetics was forgotten. Neither Wiener nor von Neumann nor Shannon, nor anyone else in the 1940s, foresaw the microprocessors that would make digital computers small and cheap and reliable and available to private citizens. Nobody foresaw the Internet or the ubiquitous cell phone. As a result of the proliferation of digital computers in private hands, Wiener’s nightmare vision of a few giant computers determining the fate of human societies never came to pass.


pages: 396 words: 107,814

Is That a Fish in Your Ear?: Translation and the Meaning of Everything by David Bellos

Bletchley Park, Clapham omnibus, Claude Shannon: information theory, Douglas Hofstadter, Dr. Strangelove, Etonian, European colonialism, Great Leap Forward, haute cuisine, high-speed rail, invention of the telephone, invention of writing, language acquisition, machine readable, machine translation, natural language processing, Republic of Letters, Sapir-Whorf hypothesis, speech recognition

In a famous memorandum written in July 1949, Warren Weaver, then a senior official with the Rockefeller Foundation, found it “very tempting to say that a book written in Chinese is simply a book in English which was coded into the ‘Chinese code.’ If we have useful methods for solving almost any cryptographic problem, may it not be that with proper interpretation we already have useful methods for translation?”2 Weaver was aware of the pioneering work of Claude Shannon and others in the nascent disciplines of information theory and cybernetics and could see that if language could be treated as a code, then there would be huge development contracts available for mathematicians, logicians, and engineers working on the new and exciting number-crunching devices that had only just acquired their modern name of “computers.”


pages: 396 words: 117,149

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World by Pedro Domingos

Albert Einstein, Amazon Mechanical Turk, Arthur Eddington, backpropagation, basic income, Bayesian statistics, Benoit Mandelbrot, bioinformatics, Black Swan, Brownian motion, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, computer vision, constrained optimization, correlation does not imply causation, creative destruction, crowdsourcing, Danny Hillis, data is not the new oil, data is the new oil, data science, deep learning, DeepMind, double helix, Douglas Hofstadter, driverless car, Erik Brynjolfsson, experimental subject, Filter Bubble, future of work, Geoffrey Hinton, global village, Google Glasses, Gödel, Escher, Bach, Hans Moravec, incognito mode, information retrieval, Jeff Hawkins, job automation, John Markoff, John Snow's cholera map, John von Neumann, Joseph Schumpeter, Kevin Kelly, large language model, lone genius, machine translation, mandelbrot fractal, Mark Zuckerberg, Moneyball by Michael Lewis explains big data, Narrative Science, Nate Silver, natural language processing, Netflix Prize, Network effects, Nick Bostrom, NP-complete, off grid, P = NP, PageRank, pattern recognition, phenotype, planetary scale, power law, pre–internet, random walk, Ray Kurzweil, recommendation engine, Richard Feynman, scientific worldview, Second Machine Age, self-driving car, Silicon Valley, social intelligence, speech recognition, Stanford marshmallow experiment, statistical model, Stephen Hawking, Steven Levy, Steven Pinker, superintelligent machines, the long tail, the scientific method, The Signal and the Noise by Nate Silver, theory of mind, Thomas Bayes, transaction costs, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, white flight, yottabyte, zero-sum game

The state of one transistor is one bit of information: one if the transistor is on, and zero if it’s off. One bit somewhere in your bank’s computers says whether your account is overdrawn or not. Another bit somewhere in the Social Security Administration’s computers says whether you’re alive or dead. The second simplest algorithm is: combine two bits. Claude Shannon, better known as the father of information theory, was the first to realize that what transistors are doing, as they switch on and off in response to other transistors, is reasoning. (That was his master’s thesis at MIT—the most important master’s thesis of all time.) If transistor A turns on only when transistors B and C are both on, it’s doing a tiny piece of logical reasoning.
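To make the passage’s “combine two bits” idea concrete, here is a minimal sketch of the logic Shannon saw in switching circuits. The gate functions are illustrative stand-ins, not code from the book:

```python
# A transistor that turns on only when B and C are both on is an AND gate:
# two bits go in, one bit comes out.
def AND(b, c):
    return 1 if b == 1 and c == 1 else 0

def OR(b, c):
    return 1 if b == 1 or c == 1 else 0

def NOT(b):
    return 1 - b

# Truth table for the example in the text: A is on only when B and C are on.
for b in (0, 1):
    for c in (0, 1):
        print(f"B={b} C={c} -> A={AND(b, c)}")
```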

Library of Congress Cataloging-in-Publication Data Domingos, Pedro. The master algorithm: how the quest for the ultimate learning machine will remake our world / Pedro Domingos. pages cm Includes index. ISBN 978-0-465-06192-1 (ebook) 1. Knowledge representation (Information theory) 2. Artificial intelligence—Social aspects. 3. Artificial intelligence—Philosophy. 4. Cognitive science—Mathematics. 5. Algorithms. I. Title. Q387.D66 2015 003′54—dc23 2015007615 TO THE MEMORY OF MY SISTER RITA, WHO LOST HER BATTLE WITH CANCER WHILE I WAS WRITING THIS BOOK The grand aim of science is to cover the greatest number of experimental facts by logical deduction from the smallest number of hypotheses or axioms.

One salient question is how to pick the best attribute to test at a node. Accuracy—the number of correctly predicted examples—doesn’t work very well, because we’re not trying to predict a particular class; rather, we’re trying to gradually separate the classes until each branch is “pure.” This brings to mind the concept of entropy from information theory. The entropy of a set of objects is a measure of the amount of disorder in it. If a group of 150 people includes 50 Republicans, 50 Democrats, and 50 independents, its political entropy is maximum. On the other hand, if they’re all Republican then the entropy is zero (as far as party affiliation goes).
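The party example can be checked directly with Shannon’s entropy formula; a quick sketch using the standard definition (the numbers are the ones from the passage):

```python
import math

def entropy(counts):
    """Shannon entropy in bits of a class distribution given as counts."""
    total = sum(counts)
    return sum((c / total) * math.log2(total / c) for c in counts if c > 0)

print(entropy([50, 50, 50]))  # three equal parties: log2(3) ~ 1.585 bits (maximal)
print(entropy([150, 0, 0]))   # all Republican: 0.0 bits (the segment is pure)
```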


pages: 298 words: 43,745

Understanding Sponsored Search: Core Elements of Keyword Advertising by Jim Jansen

AltaVista, AOL-Time Warner, barriers to entry, behavioural economics, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, content marketing, correlation does not imply causation, data science, en.wikipedia.org, first-price auction, folksonomy, Future Shock, information asymmetry, information retrieval, intangible asset, inventory management, life extension, linear programming, longitudinal study, machine translation, megacity, Nash equilibrium, Network effects, PageRank, place-making, power law, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search costs, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social bookmarking, social web, software as a service, stochastic process, tacit knowledge, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, Vilfredo Pareto, yield management

., Moore’s Law), but the ability of humans to process it has remained stable. Potpourri: Why are most query terms short? Some of it may have to do with simple information needs. It may have to do with the cognitive limits of our memory (i.e., the human processing and storage power). Based on Claude Shannon and Norbert Wiener’s information theory research [21], Harvard psychologist George A. Miller [22] proposed the theoretical construct known as the Magical Number Seven, Plus or Minus Two. The gist of the construct is that the typical human can process about seven chunks of similar information at any instant in time (sometimes a couple more, sometimes a couple fewer).

As wealth rises, consumers will move away from less costly, perhaps inferior, goods and services, choosing higher-priced alternatives for a variety of perceived benefits, such as quality or status. Concerning their perceived lack of searching, consumers have a trade-off between the cost of search, usually measured by time, and the benefit of that search [9]. A strict consumer search model assumes that consumers are likely to search for information as long as they believe that the benefits of acquiring information outweigh the cost of information search as indicated in the economics of information theory [11]. However, a more workable consumer search model should assume that consumers are likely to search for information until they believe they have a reasonable solution, regardless of the cost-benefit ratio. This behavior, again, illustrates the concept of satisficing [12, 13] and the principle of least effort [14].

., 116 Hunter, Lee, 19 IdeaLab, 11 imperfect information, 179, 194 impression, 12, 14, 24, 75–77, 133, 209 impulse buying, 93, 97–98 income effect, 91 information access, 54–55, 70, 72 information acquisition, 35, 63 information asymmetry, 63–64 information foraging, 42, 44, 63, 66–67, 70, 72, 80, 96, 213 information foraging theory, 42, 67, 70 information imbalance, 63–64 information load, 63 information obtainability, 42–43, 49, 70–71, 211, 213 information overload, 68 information processing theory, 65, 94, 96 information retrieval, 2, 20, 37, 51, 70–71, 154, 211 information scent, 66–67 information science, xiii, 14, 35, 71, 87 information searching, 20, 37, 211 information theory, 42, 92 informational, xi, 44–46, 64, 93, 96 informivore, 66 Infoseek, 38 Intangible content, 131 internal information search, 88 Internet television, 224 iProspect, xvi, 167 iWon, 21 Jansen, Bernard J., 20, 44, 113–114, 116, 118, 158 Johnson, Samuel, 229 Kamangar, Salar, 181 key drivers of marketing, 128 key performance indicators, 24.


pages: 492 words: 118,882

The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory by Kariappa Bheemaiah

"World Economic Forum" Davos, accounting loophole / creative accounting, Ada Lovelace, Adam Curtis, Airbnb, Alan Greenspan, algorithmic trading, asset allocation, autonomous vehicles, balance sheet recession, bank run, banks create money, Basel III, basic income, behavioural economics, Ben Bernanke: helicopter money, bitcoin, Bletchley Park, blockchain, Bretton Woods, Brexit referendum, business cycle, business process, call centre, capital controls, Capital in the Twenty-First Century by Thomas Piketty, cashless society, cellular automata, central bank independence, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive dissonance, collateralized debt obligation, commoditize, complexity theory, constrained optimization, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-border payments, crowdsourcing, cryptocurrency, data science, David Graeber, deep learning, deskilling, Diane Coyle, discrete time, disruptive innovation, distributed ledger, diversification, double entry bookkeeping, Ethereum, ethereum blockchain, fiat currency, financial engineering, financial innovation, financial intermediation, Flash crash, floating exchange rates, Fractional reserve banking, full employment, George Akerlof, Glass-Steagall Act, Higgs boson, illegal immigration, income inequality, income per capita, inflation targeting, information asymmetry, interest rate derivative, inventory management, invisible hand, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Joseph Schumpeter, junk bonds, Kenneth Arrow, Kenneth Rogoff, Kevin Kelly, knowledge economy, large denomination, Large Hadron Collider, Lewis Mumford, liquidity trap, London Whale, low interest rates, low skilled workers, M-Pesa, machine readable, Marc Andreessen, market bubble, market fundamentalism, Mexican peso crisis / tequila crisis, Michael Milken, MITM: man-in-the-middle, Money creation, money market fund, money: store of value / unit of account / medium of exchange, mortgage debt, natural language processing, Network effects, new economy, Nikolai Kondratiev, offshore financial centre, packet switching, Pareto efficiency, pattern recognition, peer-to-peer lending, Ponzi scheme, power law, precariat, pre–internet, price mechanism, price stability, private sector deleveraging, profit maximization, QR code, quantitative easing, quantitative trading / quantitative finance, Ray Kurzweil, Real Time Gross Settlement, rent control, rent-seeking, robo advisor, Satoshi Nakamoto, Satyajit Das, Savings and loan crisis, savings glut, seigniorage, seminal paper, Silicon Valley, Skype, smart contracts, software as a service, software is eating the world, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, supply-chain management, technology bubble, The Chicago School, The Future of Employment, The Great Moderation, the market place, The Nature of the Firm, the payments system, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, trade liberalization, transaction costs, Turing machine, Turing test, universal basic income, Vitalik Buterin, Von Neumann architecture, Washington Consensus

This computer was used to solve ordinary differential equations which would help calculate the trajectories of shells. It consisted of multiple rotating disks and cylinders driven by electric motors linked together with metal rods that were manually set up (sometimes taking up to two days) to solve any differential equation problem. Vannevar had recruited Claude Shannon (known today as the father of information theory), a young graduate who specialised in symbolic logic. Although the Differential Analyzer was a mechanical machine with moving parts, Shannon identified it as a complicated control circuit with relays. Shannon thus began creating the first generation of circuit designs and, in the process, was able to transform information into a quantity that could be subjected to manipulation by a machine.

Using Boolean algebra, logic gates and binary arithmetic (bits and bytes), Shannon was able to represent all types of information by numbers and in the process created the foundations for today’s modern information theory. It is for this reason that he is referred to as the father of information theory. As World War Two began in 1939, these advances in information technology had been adopted by various militaries to communicate sensitive information. Cryptography became a suitable way of camouflaging information and led to the creation of the Enigma machine.
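As an illustration of the link the passage describes between Boolean logic and binary arithmetic, here is a sketch of a half adder, the classic demonstration that logic gates can compute with numbers (an illustrative example, not taken from the book):

```python
def half_adder(a, b):
    """Add two one-bit numbers using only logic gates:
    XOR produces the sum bit, AND produces the carry bit."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} = {carry}{s} in binary")
```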


pages: 400 words: 121,988

Trading at the Speed of Light: How Ultrafast Algorithms Are Transforming Financial Markets by Donald MacKenzie

algorithmic trading, automated trading system, banking crisis, barriers to entry, bitcoin, blockchain, Bonfire of the Vanities, Bretton Woods, Cambridge Analytica, centralized clearinghouse, Claude Shannon: information theory, coronavirus, COVID-19, cryptocurrency, disintermediation, diversification, en.wikipedia.org, Ethereum, ethereum blockchain, family office, financial intermediation, fixed income, Flash crash, Google Earth, Hacker Ethic, Hibernia Atlantic: Project Express, interest rate derivative, interest rate swap, inventory management, Jim Simons, level 1 cache, light touch regulation, linked data, lockdown, low earth orbit, machine readable, market design, market microstructure, Martin Wolf, proprietary trading, Renaissance Technologies, Satoshi Nakamoto, Small Order Execution System, Spread Networks laid a new fibre optics cable between New York and Chicago, statistical arbitrage, statistical model, Steven Levy, The Great Moderation, transaction costs, UUNET, zero-sum game

As Persico explained to me, this switching is at the byte level (a byte is a unit of information made up of eight binary digits). The hybrid system can thus send part of a packet—a larger, structured unit of data—by laser, and part by millimeter wave. 17. The “Shannon limit” is named after the MIT electrical engineer and mathematician Claude Shannon, who did famous, fundamental work on information theory and the capacity of communications channels. 18. See Tyč (2018). Longer hops are possible because LMDS is lower in the frequency spectrum than E-band, and so less affected by rain. There are no other LMDS links in the relevant areas with which McKay’s signals might interfere, which makes it easier to install antennas in optimal, geodesic-hugging locations.

Having struck a deal with the owner, McKay then set about working with radio specialists to develop radios for transmitting and receiving signals in this frequency band; because the band had never been used, LMDS radios were not available commercially. In its digital aspects, says Tyč, an LMDS radio is not very different from a conventional E-band millimeter-wave radio, but the analog components of an LMDS radio are easier to improve, in particular to make more “linear.” So the transmission capacity of LMDS can be closer to what information theory posits as the maximum possible, given the physical characteristics of the channel (closer, in other words, to the “Shannon Limit”).17 This gives LMDS greater capacity than E-band. McKay Brothers reports a capacity of five to seven gigabits per second being achievable with LMDS, closer to the maximum data rate of full raw datafeeds.
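The “Shannon limit” the passage keeps close to is the Shannon–Hartley capacity, C = B log2(1 + S/N). A sketch with made-up numbers (the bandwidth and signal-to-noise ratio below are assumptions for illustration, not McKay Brothers’ actual link parameters):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical 1 GHz channel at 30 dB signal-to-noise ratio:
print(f"{shannon_capacity(1e9, 30) / 1e9:.2f} Gbit/s")  # ~9.97 Gbit/s
```

Making the analog chain more linear, as described above, effectively raises the usable S/N, which is why it pushes the achievable rate closer to this ceiling.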


pages: 476 words: 121,460

The Man From the Future: The Visionary Life of John Von Neumann by Ananyo Bhattacharya

Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Alvin Roth, Andrew Wiles, Benoit Mandelbrot, business cycle, cellular automata, Charles Babbage, Claude Shannon: information theory, clockwork universe, cloud computing, Conway's Game of Life, cuban missile crisis, Daniel Kahneman / Amos Tversky, DeepMind, deferred acceptance, double helix, Douglas Hofstadter, Dr. Strangelove, From Mathematics to the Technologies of Life and Death, Georg Cantor, Greta Thunberg, Gödel, Escher, Bach, haute cuisine, Herman Kahn, indoor plumbing, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, Jacquard loom, Jean Tirole, John Conway, John Nash: game theory, John von Neumann, Kenneth Arrow, Kickstarter, linear programming, mandelbrot fractal, meta-analysis, mutually assured destruction, Nash equilibrium, Norbert Wiener, Norman Macrae, P = NP, Paul Samuelson, quantum entanglement, RAND corporation, Ray Kurzweil, Richard Feynman, Ronald Reagan, Schrödinger's Cat, second-price auction, side project, Silicon Valley, spectrum auction, Steven Levy, Strategic Defense Initiative, technological singularity, Turing machine, Von Neumann architecture, zero-sum game

The longest can only be nine nodes long: the game must end at this juncture because the grid is full. The tree for chess, however, gets very cluttered very quickly. After three moves, the pieces on the board can be in one of 121 million different possible configurations. Between helping to invent the design principles of digital circuits and modern information theory, American mathematician Claude Shannon calculated that there are at least 10^120 possible games of chess – comfortably more than the number of elementary particles in the universe.

[Figure: Von Neumann and Morgenstern’s illustration of the extensive form of a game.]

Von Neumann called chess and tic-tac-toe games of ‘perfect information’ – all the moves in the game are visible to both players.
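Shannon’s 10^120 figure comes from a back-of-envelope argument that is easy to reproduce: roughly 10^3 possibilities for each pair of moves (one White, one Black), over a typical game of about 40 move pairs. A sketch of the arithmetic:

```python
possibilities_per_move_pair = 10**3   # Shannon's rough estimate
move_pairs = 40                       # a typical game length

games = possibilities_per_move_pair ** move_pairs
print(f"at least 10^{len(str(games)) - 1} possible games")  # 10^120
```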


pages: 509 words: 132,327

Rise of the Machines: A Cybernetic History by Thomas Rid

1960s counterculture, A Declaration of the Independence of Cyberspace, agricultural Revolution, Albert Einstein, Alistair Cooke, Alvin Toffler, Apple II, Apple's 1984 Super Bowl advert, back-to-the-land, Berlin Wall, Bletchley Park, British Empire, Brownian motion, Buckminster Fuller, business intelligence, Charles Babbage, Charles Lindbergh, Claude Shannon: information theory, conceptual framework, connected car, domain-specific language, Douglas Engelbart, Douglas Engelbart, Dr. Strangelove, dumpster diving, Extropian, full employment, game design, global village, Hacker News, Haight Ashbury, Herman Kahn, Howard Rheingold, Ivan Sutherland, Jaron Lanier, job automation, John Gilmore, John Markoff, John Perry Barlow, John von Neumann, Kevin Kelly, Kubernetes, Marshall McLuhan, Menlo Park, military-industrial complex, Mitch Kapor, Mondo 2000, Morris worm, Mother of all demos, Neal Stephenson, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shale / tar sands, Oklahoma City bombing, operational security, pattern recognition, public intellectual, RAND corporation, Silicon Valley, Simon Singh, Snow Crash, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, systems thinking, technoutopianism, Telecommunications Act of 1996, telepresence, The Hackers Conference, Timothy McVeigh, Vernor Vinge, We are as Gods, Whole Earth Catalog, Whole Earth Review, Y2K, Yom Kippur War, Zimmermann PGP

Ron Hubbard, July 8, 1950, Norbert Wiener Papers, MC 22, box 8 (“Correspondence 1950”), folder 121, Institute Archives and Special Collections, MIT Libraries, Cambridge, MA. 9. L. Ron Hubbard to Norbert Wiener, July 26, 1950, Norbert Wiener Papers, MC 22, box 8 (“Correspondence 1950”), folder 121, Institute Archives and Special Collections, MIT Libraries, Cambridge, MA. 10. L. Ron Hubbard to Claude Shannon, December 6, 1949, Claude Elwood Shannon Papers, box 1, MSS84831, Library of Congress, Washington, DC. 11. Norbert Wiener to William Schlecht, July 8, 1950. 12. Norbert Wiener, “Some Maxims for Biologists and Psychologists,” Dialectica 4, no. 3 (September 15, 1950): 190. 13. Ibid., 191. 14. Ibid. 15. William Grey Walter, The Living Brain (London: Duckworth, 1953), 223. 16. Ibid. 17. Maxwell Maltz, Psycho-Cybernetics (New York: Pocket Books/Simon & Schuster, 1969), cover. 18. The figure of thirty million is provided by the book’s publisher.

These pioneers were Whitfield Diffie and Martin Hellman of Stanford University, and Ralph Merkle of UC Berkeley. Their discovery resembled what the British spy agency had already found in secret. In November 1976, a history-changing article appeared in an obscure journal, IEEE Transactions on Information Theory. It was titled “New Directions in Cryptography.” Diffie and Hellman knew that computers would be coming to the people, as Stewart Brand had just reported from the mesmerized Spacewar players on their own campus. And they knew that these computers would be networked. “The development of computer controlled communication networks promises effortless and inexpensive contact between people or computers on opposite sides of the world,” they wrote in the introduction to their landmark paper.
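The core idea of “New Directions in Cryptography”, two parties agreeing on a secret over a public channel, can be sketched with toy numbers (real deployments use primes of 2048 bits or more; the values below are purely illustrative):

```python
p, g = 23, 5        # public modulus and generator (toy-sized)

a, b = 6, 15        # Alice's and Bob's private secrets

A = pow(g, a, p)    # Alice publishes g^a mod p
B = pow(g, b, p)    # Bob publishes g^b mod p

# Each side raises the other's public value to its own secret;
# both arrive at g^(a*b) mod p without ever transmitting it.
assert pow(B, a, p) == pow(A, b, p)
print(pow(B, a, p))  # the shared secret (here, 2)
```

An eavesdropper sees p, g, A, and B, but recovering the secret requires solving the discrete logarithm, which is believed to be infeasible at realistic sizes.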

., The Legacy of Norbert Wiener: A Centennial Symposium in Honor of the 100th Anniversary of Norbert Wiener’s Birth, October 8–14, 1994, Massachusetts Institute of Technology, Cambridge, Massachusetts (Providence, RI: American Mathematical Society, 1997), 19. 8. Entropy is a fundamental and related concept in physics as well as in information theory. See James Gleick, The Information (New York: Pantheon, 2011), chap. 9. 9. Norbert Wiener, The Human Use of Human Beings (New York: Houghton Mifflin, 1954), 263. 10. Ibid., 33. 11. Ibid., 24. 12. Norbert Wiener, Cybernetics (Cambridge, MA: MIT Press, 1948), 43. 13. Norbert Wiener, God and Golem, Inc.


pages: 416 words: 129,308

The One Device: The Secret History of the iPhone by Brian Merchant

Airbnb, animal electricity, Apollo Guidance Computer, Apple II, Apple's 1984 Super Bowl advert, Black Lives Matter, Charles Babbage, citizen journalism, Citizen Lab, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, conceptual framework, cotton gin, deep learning, DeepMind, Douglas Engelbart, Dynabook, Edward Snowden, Elon Musk, Ford paid five dollars a day, Frank Gehry, gigafactory, global supply chain, Google Earth, Google Hangouts, Higgs boson, Huaqiangbei: the electronics market of Shenzhen, China, information security, Internet of things, Jacquard loom, John Gruber, John Markoff, Jony Ive, Large Hadron Collider, Lyft, M-Pesa, MITM: man-in-the-middle, more computing power than Apollo, Mother of all demos, natural language processing, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shock, pattern recognition, peak oil, pirate software, profit motive, QWERTY keyboard, reality distortion field, ride hailing / ride sharing, rolodex, Shenzhen special economic zone , Silicon Valley, Silicon Valley startup, skeuomorphism, skunkworks, Skype, Snapchat, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, TED Talk, Tim Cook: Apple, Tony Fadell, TSMC, Turing test, uber lyft, Upton Sinclair, Vannevar Bush, zero day

in his 1950 paper “Computing Machinery and Intelligence,” Alan Turing framed much of the debate to come. That work discusses his famous Imitation Game, now colloquially known as the Turing Test, which describes criteria for judging whether a machine may be considered sufficiently “intelligent.” Claude Shannon, the communication theorist, published his seminal work on information theory, introducing the concept of the bit as well as a language through which humans might speak to computers. In 1956, Stanford’s John McCarthy and his colleagues coined the term artificial intelligence for a new discipline, and we were off to the races. Over the next decade, as the scientific investigation of AI began to draw interest from the public and as, simultaneously, computer terminals became a more ubiquitous machine-human interface, the two future threads—screen-based interfaces and AI—wound into one, and the servile human-shaped robots of yore became disembodied.


pages: 437 words: 132,041

Alex's Adventures in Numberland by Alex Bellos

Andrew Wiles, Antoine Gombaud: Chevalier de Méré, beat the dealer, Black Swan, Black-Scholes formula, Claude Shannon: information theory, computer age, Daniel Kahneman / Amos Tversky, digital rights, Edward Thorp, family office, forensic accounting, game design, Georg Cantor, Henri Poincaré, Isaac Newton, Johannes Kepler, lateral thinking, Myron Scholes, pattern recognition, Paul Erdős, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, random walk, Richard Feynman, Rubik’s Cube, SETI@home, Steve Jobs, The Bell Curve by Richard Herrnstein and Charles Murray, traveling salesman, two and twenty

In the mid 1950s a young mathematician named Ed Thorp began to ponder what set of information would be required to predict where a ball would land in roulette. Thorp was helped in his endeavour by Claude Shannon, his colleague at the Massachusetts Institute of Technology. He couldn’t have wished for a better co-conspirator. Shannon was a prolific inventor with a garage full of electronic and mechanical gadgets. He was also one of the most important mathematicians in the world, as the father of information theory, a crucial academic breakthrough that led to the development of the computer. The men bought a roulette wheel and conducted experiments in Shannon’s basement.


pages: 573 words: 157,767

From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett

Ada Lovelace, adjacent possible, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, AlphaGo, Andrew Wiles, Bayesian statistics, bioinformatics, bitcoin, Bletchley Park, Build a better mousetrap, Claude Shannon: information theory, computer age, computer vision, Computing Machinery and Intelligence, CRISPR, deep learning, disinformation, double entry bookkeeping, double helix, Douglas Hofstadter, Elon Musk, epigenetics, experimental subject, Fermat's Last Theorem, Gödel, Escher, Bach, Higgs boson, information asymmetry, information retrieval, invention of writing, Isaac Newton, iterative process, John von Neumann, language acquisition, megaproject, Menlo Park, Murray Gell-Mann, Necker cube, Norbert Wiener, pattern recognition, phenotype, Richard Feynman, Rodney Brooks, self-driving car, social intelligence, sorting algorithm, speech recognition, Stephen Hawking, Steven Pinker, strong AI, Stuart Kauffman, TED Talk, The Wealth of Nations by Adam Smith, theory of mind, Thomas Bayes, trickle-down economics, Turing machine, Turing test, Watson beat the top human players on Jeopardy!, Y2K

Gibson, ecological psychologist, 1966)
4. The task of a nervous system is to extract information from the environment to use in modulating or guiding successful behavior.
5. We are drowning in information.
6. We no longer can control our personal information.
7. The task of the Central Intelligence Agency is to gather information about our enemies.
8. Humint, intelligence or information gathered by human agents in clandestine interaction with other human beings, is much more important than the information obtainable by satellite surveillance and other high-tech methods.
Claude Shannon’s mathematical theory of information (Shannon 1948; Shannon and Weaver 1949) is duly celebrated as the scientific backbone that grounds and legitimizes all the talk about information that engulfs us, but some of that talk involves a different conception of information that is only indirectly addressed by Shannon’s theory.

A byte is eight bits, and a megabyte is eight million bits, so you can send a 2.5 megabyte monochrome bitmap picture file by playing Twenty Million Questions. (Is the first pixel white? …) Shannon’s information theory is a great advance for civilization because semantic information is so important to us that we want to be able to use it effectively, store it without loss, move it, transform it, share it, hide it. Informational artifacts abound—telephones, books, maps, recipes—and information theory itself began as an artifact for studying important features of those artifacts. What began as an engineering discipline has subsequently proven useful to physicists, biologists, and others not concerned with the properties of informational artifacts.
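The bitmap arithmetic in the passage is easy to verify; a one-line sketch:

```python
megabytes = 2.5
bits = megabytes * 1_000_000 * 8   # 8 bits per byte
print(f"{bits:,.0f} yes/no questions")  # 20,000,000
```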

The idea, generalized, of the whole universe being exhaustively (?) describable in one cosmic bitmap lies at the heart of various largely speculative but fascinating proposals in physics. And of course it doesn’t stop at big old atoms, a relatively “low-res” recipe for reality these days. Such an application of Shannon information theory does permit, “in principle” but not remotely in practice, saying exactly how much (Shannon) information there is in the cubic meter of ocean and ocean floor surrounding a particular clam, for instance, but it says nothing about how much of this information—a Vanishingly small portion—is semantic information for the clam.36 Back to the theft of the widget design.


pages: 660 words: 141,595

Data Science for Business: What You Need to Know About Data Mining and Data-Analytic Thinking by Foster Provost, Tom Fawcett

Albert Einstein, Amazon Mechanical Turk, Apollo 13, big data - Walmart - Pop Tarts, bioinformatics, business process, call centre, chief data officer, Claude Shannon: information theory, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, data acquisition, data science, David Brooks, en.wikipedia.org, Erik Brynjolfsson, Gini coefficient, Helicobacter pylori, independent contractor, information retrieval, intangible asset, iterative process, Johann Wolfgang von Goethe, Louis Pasteur, Menlo Park, Nate Silver, Netflix Prize, new economy, p-value, pattern recognition, placebo effect, price discrimination, recommendation engine, Ronald Coase, selection bias, Silicon Valley, Skype, SoftBank, speech recognition, Steve Jobs, supply-chain management, systems thinking, Teledyne, text mining, the long tail, The Signal and the Noise by Nate Silver, Thomas Bayes, transaction costs, WikiLeaks

Fortunately, for classification problems we can address all the issues by creating a formula that evaluates how well each attribute splits a set of examples into segments, with respect to a chosen target variable. Such a formula is based on a purity measure. The most common splitting criterion is called information gain, and it is based on a purity measure called entropy. Both concepts were invented by one of the pioneers of information theory, Claude Shannon, in his seminal work in the field (Shannon, 1948). Entropy is a measure of disorder that can be applied to a set, such as one of our individual segments. Consider that we have a set of properties of members of the set, and each member has one and only one of the properties. In supervised segmentation, the member properties will correspond to the values of the target variable.
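A sketch of the information-gain criterion the passage describes: entropy of the parent segment minus the size-weighted entropy of the child segments produced by a split. The toy data are invented for illustration:

```python
import math

def entropy(labels):
    """Entropy in bits of a list of class labels."""
    n = len(labels)
    return sum((labels.count(v) / n) * math.log2(n / labels.count(v))
               for v in set(labels))

def information_gain(parent, children):
    """Parent entropy minus the size-weighted entropy of the split segments."""
    n = len(parent)
    return entropy(parent) - sum(len(c) / n * entropy(c) for c in children)

parent = ["yes"] * 5 + ["no"] * 5
split = [["yes", "yes", "yes", "yes", "no"], ["no", "no", "no", "no", "yes"]]
print(information_gain(parent, split))  # ~0.278 bits: the split adds purity
```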

Interactive tree of life (iTOL): an online tool for phylogenetic tree display and annotation. Bioinformatics, 23 (1). Lin, J.-H., & Vitter, J. S. (1994). A theory for memory-based learning. Machine Learning, 17, 143–167. Lloyd, S. P. (1982). Least squares quantization in PCM. IEEE Transactions on Information Theory, 28 (2), 129–137. MacKay, D. (2003). Information Theory, Inference and Learning Algorithms, Chapter 20: An Example Inference Task: Clustering. Cambridge University Press. MacQueen, J. B. (1967). Some methods for classification and analysis of multivariate observations. In Proceedings of 5th Berkeley Symposium on Mathematical Statistics and Probability, pp. 281–297.

Capital One: Exploiting an Information-based Strategy. In Proceedings of the 31st Hawaii International Conference on System Sciences. Cohen, L., Diether, K., & Malloy, C. (2012). Legislating Stock Prices. Harvard Business School Working Paper, No. 13–010. Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21–27. Crandall, D., Backstrom, L., Cosley, D., Suri, S., Huttenlocher, D., & Kleinberg, J. (2010). Inferring social ties from geographic coincidences. Proceedings of the National Academy of Sciences, 107(52), 22436–22441. Deza, E., & Deza, M. (2006). Dictionary of distances.


The Greatest Show on Earth: The Evidence for Evolution by Richard Dawkins

Alfred Russel Wallace, Andrew Wiles, Arthur Eddington, back-to-the-land, Claude Shannon: information theory, correlation does not imply causation, Craig Reynolds: boids flock, Danny Hillis, David Attenborough, discovery of DNA, Dmitri Mendeleev, domesticated silver fox, double helix, en.wikipedia.org, epigenetics, experimental subject, Gregor Mendel, heat death of the universe, if you see hoof prints, think horses—not zebras, invisible hand, Large Hadron Collider, Louis Pasteur, out of africa, phenotype, precautionary principle, Thomas Malthus

Even of those who understand what a year is, a larger percentage has no understanding of what causes seasons, presuming, with rampant Northern Hemisphere chauvinism, that we are closest to the sun in June and furthest away in December. * ‘Every Schoolboy Knows’ (and every schoolgirl can prove it by Euclidean geometry). * It is no accident that Claude Shannon, when developing his metric of ‘information’, which is itself a measure of statistical improbability, lit upon exactly the same mathematical formula that Ludwig Boltzmann had developed for entropy in the previous century. * Insects, crustaceans, spiders, centipedes, etc. † For example, a mutation in the leg of a millipede will be mirrored on both sides, and probably repeated the length of the body as well.

Other scientists have proposed, as a unit of evolutionary rate, the haldane. * I have even been called an ‘ultra-Darwinist’, a gibe that I find less insulting than its coiners perhaps intended. * ‘Degenerate’ is not the same (though the two terms are often confused) as ‘redundant’, another technical term of Information Theory. A redundant code is one in which the same message is conveyed more than once (e.g. ‘She is a female woman’ conveys the message of her sex three times). Redundancy is used by engineers to guard against transmission errors. A degenerate code is one in which more than one ‘word’ is used to mean the same thing.


pages: 625 words: 167,349

The Alignment Problem: Machine Learning and Human Values by Brian Christian

Albert Einstein, algorithmic bias, Alignment Problem, AlphaGo, Amazon Mechanical Turk, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, butterfly effect, Cambridge Analytica, Cass Sunstein, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, data science, deep learning, DeepMind, Donald Knuth, Douglas Hofstadter, effective altruism, Elaine Herzberg, Elon Musk, Frances Oldham Kelsey, game design, gamification, Geoffrey Hinton, Goodhart's law, Google Chrome, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, hedonic treadmill, ImageNet competition, industrial robot, Internet Archive, John von Neumann, Joi Ito, Kenneth Arrow, language acquisition, longitudinal study, machine translation, mandatory minimum, mass incarceration, multi-armed bandit, natural language processing, Nick Bostrom, Norbert Wiener, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, OpenAI, Panopticon Jeremy Bentham, pattern recognition, Peter Singer: altruism, Peter Thiel, precautionary principle, premature optimization, RAND corporation, recommendation engine, Richard Feynman, Rodney Brooks, Saturday Night Live, selection bias, self-driving car, seminal paper, side project, Silicon Valley, Skinner box, sparse data, speech recognition, Stanislav Petrov, statistical model, Steve Jobs, strong AI, the map is not the territory, theory of mind, Tim Cook: Apple, W. E. B. Du Bois, Wayback Machine, zero-sum game

Now imagine you’re a computer with a complete lack of such common sense—let alone the ability to put yourself in the shoes of a prospective treasure burier—but what you do have is an extremely large sample (a “corpus”) of real-world texts to scan for patterns. How good a job could you do at predicting the missing word purely based on the statistics of the language itself? Constructing these kinds of predictive models has long been a grail for computational linguists.54 (Indeed, Claude Shannon founded information theory in the 1940s on a mathematical analysis of this very sort, noticing that some missing words are more predictable than others, and attempting to quantify by how much.55) Early methods involved what are known as “n-grams,” which meant simply counting up every single chain of, say, two words in a row that appeared in a particular corpus—“appeared in,” “in a,” “a particular,” “particular corpus”—and tallying them in a huge database.56 Then it was simple enough, given a missing word, to look at the preceding word and find which n-gram in the database beginning with that preceding word had appeared most often.
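A minimal sketch of the n-gram approach described, with n = 2 (a “bigram” model); the toy corpus is invented:

```python
from collections import Counter, defaultdict

corpus = "the dog chased the cat and the dog barked at the mailman".split()

# Tally every chain of two adjacent words, as the passage describes.
bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def predict_next(word):
    """Return the word that most often followed `word` in the corpus."""
    followers = bigrams[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # 'dog': it followed 'the' twice, more than any other
```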

And yet this intrinsic motivation, as Berlyne saw, was every bit as central to human nature as the drives for, say, food and sex—despite being “unduly neglected by psychology for many years.”20 (Indeed, the severest punishment our society allows, short of death—solitary confinement—is, in effect, the infliction of boredom on people.) In his landmark 1960 book Conflict, Arousal, and Curiosity, Berlyne notes that a proper study of curiosity first began to emerge in the late 1940s; it is no coincidence, he argues, that information theory and neuroscience also came into their own at the same time.21 A proper understanding of curiosity appears only to be possible at the interdisciplinary junction of all three.22 Berlyne appeared to be just as strongly motivated by curiosity in his own life as he was motivated by it as a subject of study.

At the time of his early death at fifty-two, he was on a quest to ride every subway in the world. Despite a prodigious and prolific output of articles and papers, he rarely worked on nights or weekends. There was too much else to do.23 His ideas, in particular the agenda of reaching out to both neuroscience and information theory for clues, would inspire succeeding generations of psychologists for the latter half of the twentieth century, and in the twenty-first they would come full circle. Starting at the end of the 2000s and continuing through the deep-learning boom of the 2010s, it was the mathematicians and information theorists and computer scientists—stuck on the problem of intrinsic motivation in cases like Montezuma’s Revenge—who would be turning to his ideas for help.


The Art of Computer Programming: Sorting and Searching by Donald Ervin Knuth

card file, Charles Babbage, Claude Shannon: information theory, complexity theory, correlation coefficient, Donald Knuth, double entry bookkeeping, Eratosthenes, Fermat's Last Theorem, G4S, information retrieval, iterative process, John von Neumann, linked data, locality of reference, Menlo Park, Norbert Wiener, NP-complete, p-value, Paul Erdős, RAND corporation, refrigerator car, sorting algorithm, Vilfredo Pareto, Yogi Berra, Zipf's Law

Fredman has shown that O(n) units of time suffice, if suitable data structures are used [STOC 7 (1975), 240–244]; see K. Mehlhorn, Data Structures and Algorithms 1 (Springer, 1984), Section 4.2. Optimum trees and entropy. The minimum cost is closely related to a mathematical concept called entropy, which was introduced by Claude Shannon in his seminal work on information theory [Bell System Tech. J. 27 (1948), 379–423, 623–656]. If p_1, p_2, ..., p_n are probabilities with p_1 + p_2 + ... + p_n = 1, we define the entropy H(p_1, p_2, ..., p_n) by the formula

    H(p_1, p_2, \ldots, p_n) = \sum_{k=1}^{n} p_k \lg(1/p_k).    (18)

Intuitively, if n events are possible and the kth event occurs with probability p_k, we can imagine that we have received lg(1/p_k) bits of information when the kth event has occurred.

This situation would be interesting to explore as an alternative to the stated problem.) 34. [HM21] Show that the asymptotic value of the multinomial coefficient \binom{N}{p_1 N, p_2 N, \ldots, p_n N} as N → ∞ is related to the entropy H(p_1, p_2, ..., p_n). 35. [HM22] Complete the proof of Theorem B by establishing the inequality (24). 36. [HM25] (Claude Shannon.) Let X and Y be random variables with finite ranges {x_1, ..., x_m} and {y_1, ..., y_n}, and let p_i = Pr(X = x_i), q_j = Pr(Y = y_j), r_ij = Pr(X = x_i and Y = y_j). Let H(X) = H(p_1, ..., p_m) and H(Y) = H(q_1, ..., q_n) be the respective entropies of the variables singly, and let H(XY) = H(r_11, ...

Hence, letting k = S(n), we have 2^{S(n)} ≥ n!. Since S(n) is an integer, we can rewrite this formula to obtain the lower bound

    S(n) ≥ ⌈lg n!⌉.    (1)

Stirling’s approximation tells us that

    ⌈lg n!⌉ = n lg n − n/ln 2 + (1/2) lg n + O(1),    (2)

hence roughly n lg n comparisons are needed. Relation (1) is often called the information-theoretic lower bound, since cognoscenti of information theory would say that lg n! “bits of information” are being acquired during a sorting process; each comparison yields at most one bit of information. Trees such as Fig. 34 have also been called “questionnaires”; their mathematical properties were first explored systematically in Claude Picard’s book Théorie des Questionnaires (Paris: Gauthier-Villars, 1965).
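A quick check of bound (1), using the fact that a comparison sort must distinguish all n! orderings:

```python
import math

def comparison_lower_bound(n):
    """ceil(lg n!): minimum worst-case comparisons for any comparison sort."""
    return math.ceil(math.log2(math.factorial(n)))

for n in (5, 12, 1000):
    print(n, comparison_lower_bound(n))
# 5 -> 7, 12 -> 29, 1000 -> 8530: roughly n lg n, as relation (2) says
```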


pages: 685 words: 203,949

The Organized Mind: Thinking Straight in the Age of Information Overload by Daniel J. Levitin

Abraham Maslow, airport security, Albert Einstein, Amazon Mechanical Turk, Anton Chekhov, autism spectrum disorder, Bayesian statistics, behavioural economics, big-box store, business process, call centre, Claude Shannon: information theory, cloud computing, cognitive bias, cognitive load, complexity theory, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, data science, deep learning, delayed gratification, Donald Trump, en.wikipedia.org, epigenetics, Eratosthenes, Exxon Valdez, framing effect, friendly fire, fundamental attribution error, Golden Gate Park, Google Glasses, GPS: selective availability, haute cuisine, How many piano tuners are there in Chicago?, human-factors engineering, if you see hoof prints, think horses—not zebras, impulse control, index card, indoor plumbing, information retrieval, information security, invention of writing, iterative process, jimmy wales, job satisfaction, Kickstarter, language acquisition, Lewis Mumford, life extension, longitudinal study, meta-analysis, more computing power than Apollo, Network effects, new economy, Nicholas Carr, optical character recognition, Pareto efficiency, pattern recognition, phenotype, placebo effect, pre–internet, profit motive, randomized controlled trial, Rubik’s Cube, Salesforce, shared worldview, Sheryl Sandberg, Skype, Snapchat, social intelligence, statistical model, Steve Jobs, supply-chain management, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, traumatic brain injury, Turing test, Twitter Arab Spring, ultimatum game, Wayback Machine, zero-sum game

Then the question becomes not one of how many things you can do at once, but how orderly you can make the information environment. There is considerable research into the difference in utility between simple and complex information. Claude Shannon, an electrical engineer who worked at Bell Laboratories, developed information theory in the 1940s. Shannon information theory is among the most important mathematical ideas of the twentieth century; it has profoundly affected computing and telecommunications, and is the basis for the compression of sound, image, and movie files (e.g., MP3, JPEG, and MP4 respectively).

Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press. developed information theory in the 1940s Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423, 623–656. See also, Cover, T. M., & Thomas, J. A. (2006). Elements of information theory (2nd ed.). New York, NY: Wiley-Interscience. and, Hartley, R. V. L. (1928). Transmission of information. The Bell System Technical Journal, 7(3), 535–563. Pierce, J. R. (1980) An introduction to information theory: Symbols, signals, and noise. New York, NY: Dover Publications. transmitted only 300–3300 hertz Anderson, H., & Yull, S. (2002).

This bandwidth limitation is most apparent if you try to listen to music over the telephone—the low frequencies of the bass and the high frequencies of cymbals are almost completely absent. Information theory came up in Chapter 1 in discussing the number of simultaneous conversations that a person can follow, and the information processing limits of human attention being estimated at around 120 bits per second. It is a way to quantify the amount of information contained in any transmission, instruction, or sensory stimulus. It can apply to music, speech, paintings, and military orders. The application of information theory generates a number that allows us to compare the amount of information contained in one transmission with that contained in another.


The Big Score by Michael S. Malone

Apple II, Bob Noyce, bread and circuses, Buckminster Fuller, Byte Shop, Charles Babbage, Claude Shannon: information theory, computer age, creative destruction, Donner party, Douglas Engelbart, Douglas Engelbart, El Camino Real, Fairchild Semiconductor, fear of failure, financial independence, game design, Isaac Newton, job-hopping, lone genius, market bubble, Menlo Park, military-industrial complex, packet switching, plutocrats, RAND corporation, ROLM, Ronald Reagan, Salesforce, Sand Hill Road, Silicon Valley, Silicon Valley startup, speech recognition, Steve Jobs, Steve Wozniak, tech worker, Teledyne, The Home Computer Revolution, transcontinental railway, Turing machine, union organizing, Upton Sinclair, upwardly mobile, William Shockley: the traitorous eight, Yom Kippur War

It was a string-and-chewing-gum deal, but the differential analyzer could solve equations in calculus, and it was obvious to observers that there might be something to this newfangled computer thing—if something could just be done about those damn wheels and gears. The breakthrough came on the eve of World War II, when Claude Shannon, an electrical engineer at MIT, was studying electromechanical relays and suddenly had an idea. Relays were already being used by the telephone company as switches for interconnecting phone calls (you can still hear them, as clicking sounds, on old exchanges) and in operation they were pretty straightforward, working like a telegraph key.

But Shannon’s remarkable contribution, one of the wonderful inductive leaps among the engineering discoveries in his generation, was to ask if there was some sort of logic that would describe multiple combinations of two-stage relays… binary… Boolean algebra. Bingo. Shannon had pulled off something remarkable; he had linked the controllable behavior of machines with a system of logic that encompassed all science, perhaps even all of human thought. The Age of Computers had begun—and hard on its heels the rise of information theory, the great organizer of the postwar world. Shannon wasn’t alone in defining the shape of the computer to come. In 1936, Englishman Alan Turing wrote a paper describing a universal computing machine, the Turing Machine. It, too, would be instructed using a language of ones and zeros, entered into the machine via a pattern of holes punched into ribbons of paper tape.


pages: 829 words: 186,976

The Signal and the Noise: Why So Many Predictions Fail-But Some Don't by Nate Silver

airport security, Alan Greenspan, Alvin Toffler, An Inconvenient Truth, availability heuristic, Bayesian statistics, Bear Stearns, behavioural economics, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, big-box store, Black Monday: stock market crash in 1987, Black Swan, Boeing 747, book value, Broken windows theory, business cycle, buy and hold, Carmen Reinhart, Charles Babbage, classic study, Claude Shannon: information theory, Climategate, Climatic Research Unit, cognitive dissonance, collapse of Lehman Brothers, collateralized debt obligation, complexity theory, computer age, correlation does not imply causation, Credit Default Swap, credit default swaps / collateralized debt obligations, cuban missile crisis, Daniel Kahneman / Amos Tversky, disinformation, diversification, Donald Trump, Edmond Halley, Edward Lorenz: Chaos theory, en.wikipedia.org, equity premium, Eugene Fama: efficient market hypothesis, everywhere but in the productivity statistics, fear of failure, Fellow of the Royal Society, Ford Model T, Freestyle chess, fudge factor, Future Shock, George Akerlof, global pandemic, Goodhart's law, haute cuisine, Henri Poincaré, high batting average, housing crisis, income per capita, index fund, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), Internet Archive, invention of the printing press, invisible hand, Isaac Newton, James Watt: steam engine, Japanese asset price bubble, John Bogle, John Nash: game theory, John von Neumann, Kenneth Rogoff, knowledge economy, Laplace demon, locking in a profit, Loma Prieta earthquake, market bubble, Mikhail Gorbachev, Moneyball by Michael Lewis explains big data, Monroe Doctrine, mortgage debt, Nate Silver, negative equity, new economy, Norbert Wiener, Oklahoma City bombing, PageRank, pattern recognition, pets.com, Phillips curve, Pierre-Simon Laplace, Plato's cave, power law, prediction markets, Productivity paradox, proprietary trading, public intellectual, random walk, Richard Thaler, Robert Shiller, Robert Solow, Rodney Brooks, Ronald Reagan, Saturday Night Live, savings glut, security theater, short selling, SimCity, Skype, statistical model, Steven Pinker, The Great Moderation, The Market for Lemons, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, Timothy McVeigh, too big to fail, transaction costs, transfer pricing, University of East Anglia, Watson beat the top human players on Jeopardy!, Wayback Machine, wikimedia commons

Although El Ajedrecista is sometimes regarded as the first computer game,7 it was extremely limited in its functionality, restricted to determining positions in an endgame in which there are just three pieces left on the board. (El Ajedrecista also did not have any stereotypical Turkish headgear.) The father of the modern chess computer was MIT’s Claude Shannon, a mathematician regarded as the founder of information theory, who in 1950 published a paper called “Programming a Computer for Playing Chess.”8 Shannon identified some of the algorithms and techniques that form the backbone of chess programs today. He also recognized why chess is such an interesting problem for testing the powers of information-processing machines.
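One of the techniques Shannon’s paper laid out is minimax search with an evaluation function at the leaves; here is a bare-bones sketch (the toy game below is invented for illustration, not Shannon’s chess evaluation):

```python
def minimax(state, depth, maximizing, moves, apply_move, evaluate):
    """Search `depth` plies ahead; score leaf positions; back values up."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    values = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      moves, apply_move, evaluate) for m in legal)
    return max(values) if maximizing else min(values)

# Toy game: each move appends 1 or 2; the maximizer wants a large total.
print(minimax([], 2, True,
              moves=lambda s: [1, 2],
              apply_move=lambda s, m: s + [m],
              evaluate=lambda s: sum(s)))  # 3: max plays 2, min then plays 1
```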

., 187 incentives, 184, 250, 313, 333, 356, 357, 501 Inconvenient Truth, An, 385 index funds, 344, 370 India, 210n Indiana Pacers, 236n, 489 indicators: lagging, 187–88 leading, 186–88, 196–97 Indonesia, 209 Industrial Revolution, 2, 5–6, 112, 212 infectious disease, 16, 204–31 SIR model of, 220–21, 221, 223, 225, 389 see also specific diseases inflation, 11, 186n, 191, 198, 202 information, 1, 451 analyzing, 232 asymmetrical, 35 collecting, 232 exclusive, 98, 99–101 explosion of, 3–4, 6, 7, 12, 13, 17, 45 forecasting and, 5 quality of, 2, 13, 17 information age, 7, 45, 358 information overload, 12 information processing, 449 information technology, 411 revolution in, 1–4 information theory, 265 InfoSpace, 353, 360 initial condition uncertainty, 390–92 insider trading, 341–42 intelligent extraterrestrial species, 488 interest rates, 186n, 190, 202 International Panel on Climate Change (IPCC), 382, 383–84, 393 1990 predictions of, 373–76, 389, 393, 397–99, 397, 399, 401, 507 “uncertainty” in, 389 Internet, 13, 223, 250, 514 poker on, 296–97, 310 quality of information on, 3 Intrade, 333, 334, 335, 336–37, 358, 367, 497 invisible hand, 332 Ioannidis, John P.


pages: 781 words: 226,928

Commodore: A Company on the Edge by Brian Bagnall

Apple II, belly landing, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, Computer Lib, Dennis Ritchie, Douglas Engelbart, Douglas Engelbart, Firefox, Ford Model T, game design, Gary Kildall, Great Leap Forward, index card, inventory management, Isaac Newton, Ken Thompson, low skilled workers, Menlo Park, packet switching, pink-collar, popular electronics, prediction markets, pre–internet, QWERTY keyboard, Robert Metcalfe, Robert X Cringely, Silicon Valley, special economic zone, Steve Jobs, Steve Wozniak, systems thinking, Ted Nelson, vertical integration

In 1956, Shannon returned to MIT at Lincoln Labs as a lecturer and Artificial Intelligence researcher. While there, he spread his concepts on Information Theory. “He changed the world,” says Peddle. “Shannon was not only a pioneer, but a prophet. He effectively developed a following, almost like a cult.” One of Shannon’s cultists would soon spread the word to Peddle at the University of Maine. During Peddle’s senior year, the University of Maine accepted a lecturer from MIT who studied under Claude Shannon. According to Peddle, “He had a nervous breakdown, so he left MIT. The University of Maine was so happy to get him because he was so superior to the type of instructor they could normally get.

He decided to put together a class to teach people about Information Theory.” At the time, Peddle was enrolling for his junior year. The new Information Theory class happened to fit into his schedule. As Peddle recalls, “It changed my life.” The class began with the instructor discussing the eyes and ears as the primary sensors for receiving information. “He started teaching us about Boolean algebra and binary logic, and the concept of Information Theory,” recalls Peddle. “I just fell in love. This was where I was going to spend my life.” However, the topic that interested Peddle the most was computers. “Information Theory was interesting, and I’ve used it from time to time, but the computer stuff this guy taught me was life changing.”

Over the years, he filled his beachside house with juggling robots, maze-solving robot mice, chess-playing programs, mind-reading machines, and an electric chair to transport his children down to the lake. In 1948, while working at Bell Labs, Shannon produced a groundbreaking paper, A Mathematical Theory of Communication. In it, he rigorously laid out the foundations of Information Theory, showing how pictures, words, sounds, and other media can be transmitted as a stream of ones and zeros. The paper also put the word “bit” into print for the first time (Shannon credited the coinage to his colleague John W. Tukey). Peddle was enchanted with his theories. “Today, you take this for granted, but you have to remember that someone had to dream all this up,” he says. “Everyone else’s work stands on his shoulders and most people don’t even know it.”


pages: 1,799 words: 532,462

The Codebreakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet by David Kahn

anti-communist, Bletchley Park, British Empire, Charles Babbage, classic study, Claude Shannon: information theory, computer age, cotton gin, cuban missile crisis, Easter island, end-to-end encryption, Fellow of the Royal Society, heat death of the universe, Honoré de Balzac, index card, interchangeable parts, invention of the telegraph, Isaac Newton, Johannes Kepler, John von Neumann, Louis Daguerre, machine translation, Maui Hawaii, Norbert Wiener, out of africa, pattern recognition, place-making, planned obsolescence, Plato's cave, pneumatic tube, popular electronics, positional goods, Republic of Letters, Searching for Interstellar Communications, stochastic process, Suez canal 1869, the scientific method, trade route, Turing machine, union organizing, yellow journalism, zero-sum game

Code Expert,” The New York Times (March 3, 1966), 35. 730 Britain: Great Britain, The British Imperial Calendar and Civil Service List, 1964 (London: H.M.’s Stationery Office, 1964), columns 322, 326; Who’s Who, 1964 for Sir Eric Jones, Sir Clive Loehnis, Leonard J. Hooper, Brig. John H. Tiltman. 731 “One of the primary,” “provide full protection”: CED 1301.10d, CED 3204.8a. Other goals at CED 1301.6, –.7, –.8. Chapter 20 THE ANATOMY OF CRYPTOLOGY I am grateful to Claude Shannon and to David Slepian for reading an early draft of the parts of this chapter on information theory and making helpful suggestions. 737 “It would not be”: A. Adrian Albert, “Some Mathematical Aspects of Cryptography,” unpublished paper delivered before the American Mathematical Society, November 22, 1941. 737 “The transformations are”: Maurits de Vries, “Concealment of Information,” Synthèse, IX (1953), 326–336 at 330. 739 stability of letter frequency: for an explanation of this—based on de Saussure’s axiom of the independence of sound and meaning—see G.

Anti-Baconian Ib Melchior fell victim to his opponents’ obsession and enigmalyzed from Shakespeare’s tombstone a message in alleged Elizabethan English that supposedly meant, “Elsinore laid wedge first Hamlet edition”; his enigmaduction enjoys the dubious prestige of being the first to have been laid low by Claude Shannon’s unicity-point formula. Baconian Pierre Henrion has extended his enigmaplan to Jonathan Swift; by anagramming the nonsensical names in Gulliver’s Travels, substituting letters, and then anagramming again, he “proves” that LEMUEL GULLIVER really means Jonathan Swift and that LILLIPUT is Nowhere.

Shannon’s information theory shows how to make cryptanalysis more difficult and tells how much ciphertext is needed to reach a valid solution. In all these ways it has contributed to a deeper understanding of cryptology. Shannon has also viewed cryptology from a couple of other perspectives, which, while not as useful as information theory, are enlightening. The first, in fact, is a kind of corollary to the information-theory view. “From the point of view of the cryptanalyst,” Shannon wrote, “a secrecy system is almost identical with a noisy communication system.” In information theory, the term “noise” has a special meaning.


pages: 761 words: 231,902

The Singularity Is Near: When Humans Transcend Biology by Ray Kurzweil

additive manufacturing, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, Benoit Mandelbrot, Bill Joy: nanobots, bioinformatics, brain emulation, Brewster Kahle, Brownian motion, business cycle, business intelligence, c2.com, call centre, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, complexity theory, conceptual framework, Conway's Game of Life, coronavirus, cosmological constant, cosmological principle, cuban missile crisis, data acquisition, Dava Sobel, David Brooks, Dean Kamen, digital divide, disintermediation, double helix, Douglas Hofstadter, en.wikipedia.org, epigenetics, factory automation, friendly AI, functional programming, George Gilder, Gödel, Escher, Bach, Hans Moravec, hype cycle, informal economy, information retrieval, information security, invention of the telephone, invention of the telescope, invention of writing, iterative process, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, linked data, Loebner Prize, Louis Pasteur, mandelbrot fractal, Marshall McLuhan, Mikhail Gorbachev, Mitch Kapor, mouse model, Murray Gell-Mann, mutually assured destruction, natural language processing, Network effects, new economy, Nick Bostrom, Norbert Wiener, oil shale / tar sands, optical character recognition, PalmPilot, pattern recognition, phenotype, power law, precautionary principle, premature optimization, punch-card reader, quantum cryptography, quantum entanglement, radical life extension, randomized controlled trial, Ray Kurzweil, remote working, reversible computing, Richard Feynman, Robert Metcalfe, Rodney Brooks, scientific worldview, Search for Extraterrestrial Intelligence, selection bias, semantic web, seminal paper, Silicon Valley, Singularitarianism, speech recognition, statistical model, stem cell, Stephen Hawking, Stewart Brand, strong AI, Stuart Kauffman, superintelligent machines, technological singularity, Ted Kaczynski, telepresence, The Coming Technological Singularity, Thomas Bayes, transaction costs, Turing machine, Turing test, two and twenty, Vernor Vinge, Y2K, Yogi Berra

See Theodore Modis, "Forecasting the Growth of Complexity and Change," Technological Forecasting and Social Change 69.4 (2002), http://ourworld.compuserve.com/homepages/tmodis/TedWEB.htm. 3. Compressing files is a key aspect of both data transmission (such as a music or text file over the Internet) and data storage. The smaller the file is, the less time it will take to transmit and the less space it will require. The mathematician Claude Shannon, often called the father of information theory, defined the basic theory of data compression in his paper "A Mathematical Theory of Communication," The Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. Data compression is possible because of factors such as redundancy (repetition) and probability of appearance of character combinations in data.
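As a rough illustration of why redundancy makes compression possible: the entropy of a file's symbol distribution is a lower bound, in bits per symbol, on what any lossless symbol-by-symbol code can achieve. A minimal Python sketch using only the standard library (the sample string is invented for illustration, not taken from the book):

    import math
    from collections import Counter

    def entropy_bits_per_char(text):
        """Shannon entropy of the character distribution, in bits per character."""
        n = len(text)
        return -sum((c / n) * math.log2(c / n) for c in Counter(text).values())

    # A highly redundant message needs ~1 bit/char, versus 8 bits/char in ASCII,
    # so a good compressor could shrink it roughly eightfold.
    print(entropy_bits_per_char("abababababababab"))  # -> 1.0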


Data Mining: Concepts and Techniques: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei

backpropagation, bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, disinformation, distributed generation, finite state, industrial research laboratory, information retrieval, information security, iterative process, knowledge worker, linked data, machine readable, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, power law, random walk, recommendation engine, RFID, search costs, semantic web, seminal paper, sentiment analysis, sparse data, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, Thomas Bayes, web application

Suppose the class label attribute has m distinct values defining m distinct classes, Ci (for i = 1, …, m). Let Ci,D be the set of tuples of class Ci in D. Let |D| and |Ci,D| denote the number of tuples in D and Ci,D, respectively. Information Gain ID3 uses information gain as its attribute selection measure. This measure is based on pioneering work by Claude Shannon on information theory, which studied the value or “information content” of messages. Let node N represent or hold the tuples of partition D. The attribute with the highest information gain is chosen as the splitting attribute for node N. This attribute minimizes the information needed to classify the tuples in the resulting partitions and reflects the least randomness or “impurity” in these partitions.
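As a concrete sketch of the measure (with hypothetical toy data, not an example from the book): the expected information is info(D) = −Σ pi log2(pi), and the gain of attribute A is info(D) minus the weighted information of the partitions A induces.

    import math
    from collections import Counter

    def info(labels):
        """Expected information (entropy) to classify a tuple: -sum(p * log2 p)."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def info_gain(rows, labels, attr):
        """Gain(A) = info(D) - info_A(D); rows are dicts of attribute values."""
        parts = {}
        for row, label in zip(rows, labels):
            parts.setdefault(row[attr], []).append(label)
        n = len(labels)
        info_a = sum(len(p) / n * info(p) for p in parts.values())
        return info(labels) - info_a

    # Hypothetical tuples: "wind" separates the classes perfectly, so gain = 1.0 bit.
    rows = [{"wind": "weak"}, {"wind": "weak"}, {"wind": "strong"}, {"wind": "strong"}]
    print(info_gain(rows, ["yes", "yes", "no", "no"], "wind"))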

For a detailed discussion on attribute selection measures, see Kononenko and Hong [KH97]. Information gain was proposed by Quinlan [Qui86] and is based on pioneering work on information theory by Shannon and Weaver [SW49]. The gain ratio, proposed as an extension to information gain, is described as part of C4.5 (Quinlan [Qui93]). The Gini index was proposed for CART in Breiman, Friedman, Olshen, and Stone [BFOS84]. The G-statistic, based on information theory, is given in Sokal and Rohlf [SR81]. Comparisons of attribute selection measures include Buntine and Niblett [BN92], Fayyad and Irani [FI92], Kononenko [Kon95], Loh and Shih [LS97] and Shih [Shi99].

A good weighting function should obey the following properties: (1) the best semantic indicator of a pattern, p, is the pattern itself; (2) two equally strong patterns receive the same score; and (3) if two patterns are independent, neither indicates the meaning of the other. The meaning of a pattern, p, can be inferred from either the appearance or absence of its indicators. Mutual information is one of several possible weighting functions. It is widely used in information theory to measure the mutual dependency of two random variables. Intuitively, it measures how much information one random variable tells about the other. Given two frequent patterns, pα and pβ, let X = {0, 1} and Y = {0, 1} be two random variables representing the appearance of pα and pβ, respectively.
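For two binary variables the measure works out to I(X; Y) = Σx Σy P(x, y) log2[P(x, y) / (P(x)P(y))], which is zero exactly when the patterns are independent. A small Python sketch under assumed joint probabilities (the numbers are invented for illustration):

    import math

    def mutual_information(joint):
        """I(X;Y) in bits for a 2x2 joint distribution joint[x][y], x, y in {0, 1}."""
        px = [sum(joint[x]) for x in (0, 1)]
        py = [joint[0][y] + joint[1][y] for y in (0, 1)]
        return sum(joint[x][y] * math.log2(joint[x][y] / (px[x] * py[y]))
                   for x in (0, 1) for y in (0, 1) if joint[x][y] > 0)

    independent = [[0.25, 0.25], [0.25, 0.25]]  # patterns appear independently
    correlated = [[0.40, 0.10], [0.10, 0.40]]   # patterns tend to co-occur
    print(mutual_information(independent))  # -> 0.0 bits
    print(mutual_information(correlated))   # -> ~0.28 bits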


Applied Cryptography: Protocols, Algorithms, and Source Code in C by Bruce Schneier

active measures, cellular automata, Claude Shannon: information theory, complexity theory, dark matter, Donald Davies, Donald Knuth, dumpster diving, Dutch auction, end-to-end encryption, Exxon Valdez, fault tolerance, finite state, heat death of the universe, information security, invisible hand, John von Neumann, knapsack problem, MITM: man-in-the-middle, Multics, NP-complete, OSI model, P = NP, packet switching, quantum cryptography, RAND corporation, RFC: Request For Comment, seminal paper, software patent, telemarketer, traveling salesman, Turing machine, web of trust, Zimmermann PGP

During the thirties and forties a few basic papers did appear in the open literature and several treatises on the subject were published, but the latter were farther and farther behind the state of the art. By the end of the war the transition was complete. With one notable exception, the public literature had died. That exception was Claude Shannon’s paper “The Communication Theory of Secrecy Systems,” which appeared in the Bell System Technical Journal in 1949 [1432]. It was similar to Friedman’s 1918 paper, in that it grew out of wartime work of Shannon’s. After the Second World War ended it was declassified, possibly by mistake. From 1949 until 1967 the cryptographic literature was barren.

Part III Cryptographic Algorithms Chapter 11 Mathematical Background 11.1 Information Theory Modern information theory was first published in 1948 by Claude Elwood Shannon [1431, 1432]. (His papers have been reprinted by the IEEE Press [1433].) For a good mathematical treatment of the topic, consult [593]. In this section, I will just sketch some important ideas. Entropy and Uncertainty Information theory defines the amount of information in a message as the minimum number of bits needed to encode all possible meanings of that message, assuming all messages are equally likely.
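Under that equally-likely assumption the entropy is simply H(M) = log2(n) for n possible meanings. A one-line check in Python (the day-of-week figure is a standard illustration, not necessarily the book's own example):

    import math

    # With n equally likely meanings, H(M) = log2(n) bits.
    print(math.log2(7))  # a day of the week needs ~2.8 bits
    print(math.log2(2))  # a yes/no answer needs exactly 1 bit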

If a cryptosystem has ideal secrecy, even successful cryptanalysis will leave some uncertainty about whether the recovered plaintext is the real plaintext. Information Theory in Practice While these concepts have great theoretical value, actual cryptanalysis seldom proceeds along these lines. Unicity distance guarantees insecurity if it’s too small but does not guarantee security if it’s high. Few practical algorithms are absolutely impervious to analysis; all manner of characteristics might serve as entering wedges to crack some encrypted messages. However, similar information theory considerations are occasionally useful, for example, to determine a recommended key change interval for a particular algorithm.
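To make the heuristic concrete: unicity distance is commonly estimated as U = H(K)/D, key entropy over the per-character redundancy of the plaintext language. A sketch with assumed figures (a 56-bit key and ~6.8 bits/character redundancy for ASCII English; both numbers are illustrative, not prescriptive):

    def unicity_distance(key_bits, redundancy_bits_per_char):
        """U = H(K) / D: roughly how much ciphertext pins down a unique plaintext."""
        return key_bits / redundancy_bits_per_char

    # Assumed: 56-bit key, ASCII English redundancy ~6.8 bits/char.
    print(unicity_distance(56, 6.8))  # -> ~8.2 characters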