Von Neumann architecture

34 results


pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science by Chris Bernhardt

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, British Empire, cellular automata, Claude Shannon: information theory, complexity theory, Conway's Game of Life, discrete time, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Internet Archive, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Norbert Wiener, Paul Erdős, Turing complete, Turing machine, Turing test, Von Neumann architecture

Turing Machines; Examples of Turing Machines; Computable Functions and Calculations; Church-Turing Thesis; Computational Power; Machines That Don’t Halt
5. Other Systems for Computation: The Lambda Calculus; Tag Systems; One-Dimensional Cellular Automata
6. Encodings and the Universal Machine: A Method of Encoding Finite Automata; Universal Machines; Construction of Universal Machines; Modern Computers Are Universal Machines; Von Neumann Architecture; Random Access Machines; RAMs Can Be Emulated by Turing Machines; Other Universal Machines; What Happens When We Input 〈M〉 into M?
7. Undecidable Problems: Proof by Contradiction; Russell’s Barber; Finite Automata That Do Not Accept Their Encodings; Turing Machines That Do Not Accept Their Encodings; “Does a Turing Machine Diverge on Its Encoding?” Is Undecidable; The Acceptance, Halting, and Blank Tape Problems; An Uncomputable Function; Turing’s Approach
8.

(When we simulate the machine on a modern computer we do need to assume the computer has enough storage to be able to do the calculation.) That we can simulate Turing machines on modern computers is not surprising. What is surprising is that we can design a Turing machine to simulate a modern computer, showing that Turing machines are equivalent in computing power to modern computers. We will sketch how this is done. The first step is to get a concrete description of the modern computer.

Von Neumann Architecture

Later we will talk more about John von Neumann, but it is important to know a few facts before we proceed. The First Draft of a Report on the EDVAC is probably the most important paper on the design of modern computers. It was written in 1945, as the first electronic computers were being built. It described the basic outline of how a computer should be designed, incorporating what had been learned from the design of earlier machines.
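
To make the first half of this excerpt concrete, here is a minimal sketch, in Python, of the kind of Turing-machine simulation it mentions. The transition-table format and the bit-flipping example machine are illustrative assumptions, not taken from the book.

    def run_tm(transitions, tape, state="q0", max_steps=10_000):
        """Run a one-tape Turing machine. `transitions` maps
        (state, symbol) -> (new_state, symbol_to_write, move),
        with move -1 for left and +1 for right; '_' is the blank."""
        cells = dict(enumerate(tape))
        head = 0
        for _ in range(max_steps):       # finite budget: a real TM may never halt
            if state == "halt":
                break
            symbol = cells.get(head, "_")
            state, write, move = transitions[(state, symbol)]
            cells[head] = write
            head += move
        return state, "".join(cells[i] for i in sorted(cells))

    # An example machine that flips every bit of its input, then halts.
    flip = {
        ("q0", "0"): ("q0", "1", +1),
        ("q0", "1"): ("q0", "0", +1),
        ("q0", "_"): ("halt", "_", 0),
    }
    print(run_tm(flip, "1011"))          # -> ('halt', '0100_')

The other direction, building a Turing machine that simulates a modern computer, is the construction the book goes on to sketch.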

Turing was interested in what it was possible to compute. His machines were theoretical constructs meant to incorporate the basic computational steps of human computers. Von Neumann was interested in building a physical machine. His focus was not on the theory of computation, but on the design of an efficient machine for doing actual computations. The resulting design outlined in the report is often referred to as von Neumann architecture, and most modern computers are based on this architecture. Von Neumann’s design built on the ideas of many people. The First Draft, as its name suggests, was a draft of a paper, and it was only meant to be circulated to a small number of people. The fact that von Neumann was listed as the sole author, and that other people’s work was not properly credited, would not have been a problem had the readership been restricted, as originally intended, to a few colleagues; but the First Draft was widely circulated and became enormously influential in the design of all subsequent computers.


pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, British Empire, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, low earth orbit, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

Shuffled among a succession of departments, the original proposal was reconsidered to death. Turing’s automatic computing engine, like Babbage’s analytical engine, was never built. Turing’s proposal “synthesized the concepts of a stored-program universal computer, a floating-point subroutine library, artificial intelligence, details such as a hardware bootstrap loader, and much else.”36 At a time when no such machines were in existence and the von Neumann architecture had only just been proposed, Turing produced a complete description of a million-cycle-per-second computer that foreshadowed the RISC (Reduced Instruction Set Computer) architecture that has now gained prominence after fifty years. The report was accompanied by circuit diagrams, a detailed physical and logical analysis of the internal storage system, sample programs, detailed (if bug-ridden) subroutines, and even an estimated (if unrealistic) cost of £11,200.

“If he really wanted a computer,” explained Arthur Burks, “the thing to do was to build it.”17 Von Neumann structured the project so as to introduce multiple copies of the new machine in several places at once. Progress reports were disseminated not only among the participating funding agencies and to a half-dozen groups that were duplicating the IAS design, but to any location where the potential of high-speed digital computers might fall on fertile ground. It is no accident that the vast majority of computers in circulation today follow the von Neumann architecture—characterized by a central processing unit operating in parallel on the multiple bits of one word of data at a time, a hierarchical memory ranging from fast but limited random-access memory to slow but unlimited media, such as floppy disks or tape, and a distinction between hardware and software that enabled robust computers (and a robust computer industry) to advance by a leapfrog process with each element evolving freely on its own.

“Quite often the likelihood of getting actual numerical results was very much larger if he was not in the computer room, because everybody got so nervous when he was there,” reported Martin Schwarzschild. “But when you were in real thinking trouble, you would go to von Neumann and nobody else.”43 Von Neumann’s reputation, after fifty years, has been injured less by his critics than by his own success. The astounding proliferation of the von Neumann architecture has obscured von Neumann’s contributions to massively parallel computing, distributed information processing, evolutionary computation, and neural nets. Because his deathbed notes for his canceled Silliman lectures at Yale were published posthumously (and for a popular audience) as The Computer and the Brain (1958), von Neumann’s work has been associated with the claims of those who were exaggerating the analogies between the digital computer and the brain.


When Computers Can Think: The Artificial Intelligence Singularity by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann, Michelle Estes

3D printing, AI winter, anthropic principle, artificial general intelligence, Asilomar, augmented reality, Automated Insights, autonomous vehicles, availability heuristic, blue-collar work, brain emulation, call centre, cognitive bias, combinatorial explosion, computer vision, create, read, update, delete, cuban missile crisis, David Attenborough, Elon Musk, en.wikipedia.org, epigenetics, Ernest Rutherford, factory automation, feminist movement, finite state, Flynn Effect, friendly AI, general-purpose programming language, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, industrial robot, Isaac Newton, job automation, John von Neumann, Law of Accelerating Returns, license plate recognition, Mahatma Gandhi, mandelbrot fractal, natural language processing, Parkinson's law, patent troll, patient HM, pattern recognition, phenotype, ransomware, Ray Kurzweil, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, sorting algorithm, speech recognition, statistical model, stem cell, Stephen Hawking, Stuxnet, superintelligent machines, technological singularity, Thomas Malthus, Turing machine, Turing test, uranium enrichment, Von Neumann architecture, Watson beat the top human players on Jeopardy!, wikimedia commons, zero day

When constructed, it was hard-wired to perform specific calculations, often related to the trajectories of artillery shells. Changing its program required rewiring the computer, which took days or weeks. Then in 1948 ENIAC was modified to have what is essentially a von Neumann architecture. This made it much easier to program. However, it also made the computer six times slower than it had been previously, because it could now only execute one instruction at a time. Even on that ancient computer, which ran thousands of times slower than modern computers, the trade-off was considered worthwhile. Being easy to program was, and is, generally far more important than being very efficient. Today there are variations of the basic von Neumann architecture. Graphics Processing Units (GPUs) contain hundreds of von Neumann subsystems that can compute at the same time and so render complex scenes in real time. More radical designs are used for specialized Digital Signal Processors (DSPs), which can process radio-wave signals in real time.

Robots leaving the factory
Programs writing Programs: 1. The task of man; 2. Recursive compilation; 3. Quines; 4. Reasoning about program logic; 5. Automating program generation; 6. High-level models; 7. Learning first order concepts; 8. Evolutionary algorithms; 9. Artificial life; 10. Evolutionary programming
Computer Hardware: 1. Introduction; 2. Transistors; 3. Logic Elements; 4. Programmable Logic Arrays; 5. Von Neumann Architecture; 6. PLAs vs von Neumann; 7. Analog Computers; 8. Neurons
Brains: 1. Gross anatomy; 2. Neocortex; 3. Brain activity; 4. Brain function and size; 5. Brain simulation; 6. Worms
13. Computational Neuroscience: 1. Neurons; 2. Neuron synapse; 3. Integrate and fire (IF) neurons; 4. Hebbian learning; 5. Plasticity; 6. Neuron chains; 7. Self organizing maps (SOMs); 8. Recurrent networks and learning; 9. Memory; 10.

Neurons also have quirks such as sometimes firing for no good reason, and so multiple neurons need to be used to provide one reliable signal. Neurons are also relatively slow, with only roughly 200 firings per second, so they have to work concurrently to produce results in a timely manner. On the other hand, ordinary personal computers might contain 4 billion bytes of fast memory, and several thousand billion bytes of slower disk storage. Unlike a neuron, a byte of computer memory is passive, and a conventional “von Neumann” architecture can only process a few dozen bytes at any one time. That said, the computer can perform several billion operations per second, which is millions of times faster than neurons. Specialized hardware and advanced architectures can perform many operations simultaneously, but we also know from experience that it is difficult to write highly concurrent programs that utilize that hardware efficiently.
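
The speed comparison in this passage is simple arithmetic; spelled out with the passage’s own round figures (rough numbers, not measurements):

    neuron_rate = 200          # firings per second, per the passage
    cpu_rate = 3_000_000_000   # "several billion operations per second"
    print(cpu_rate / neuron_rate)  # 15,000,000: millions of times faster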


pages: 118 words: 35,663

Smart Machines: IBM's Watson and the Era of Cognitive Computing (Columbia Business School Publishing) by John E. Kelly III

AI winter, call centre, carbon footprint, crowdsourcing, demand response, discovery of DNA, disruptive innovation, Erik Brynjolfsson, future of work, Geoffrey West, Santa Fe Institute, global supply chain, Internet of things, John von Neumann, Mars Rover, natural language processing, optical character recognition, pattern recognition, planetary scale, RAND corporation, RFID, Richard Feynman, smart grid, smart meter, speech recognition, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!

A cognitive computer employing these systems will respond to inquiries more quickly than today’s computers; less data movement will be required and less energy will be used. Today’s von Neumann–style computing won’t go away when cognitive systems come online. New chip and computing technologies will extend its life far into the future. In many cases, the cognitive architecture and the von Neumann architecture will be employed side by side in hybrid systems. Traditional computing will become ever more capable while cognitive technologies will do things that were not possible before. Already, cloud, social networking, mobile, and new ways to interact with computing from tablets to glasses are fueling the desire for cognitive systems that will, for example, both harvest insights from social networks and enhance our experiences within them.

But being stuck at home gave him a lot of time to think over the SyNAPSE dilemma. He concluded that while it was futile in the short term to try to invent a new technology for a cognitive machine, that didn’t mean the project should be abandoned. Instead, the team needed to refocus on CMOS chip technology and on digital circuitry rather than analog circuitry. They would create an entirely new non–von Neumann architecture in both silicon and software that would simulate the functions of neurons and synapses. Using that architecture, they would produce chips for sense-making tasks that would be vastly more efficient than today’s standard digital processors.7 Dharmendra called a meeting of all of the participants in the project. On January 21, 2010, about twenty-five people crowded into the library at the Almaden lab.

Unless we can make computers many orders of magnitude more energy efficient, we’re not going to be able to use them extensively as our intelligent assistants. Computing intelligence will be too costly to be practical. Scientists at IBM Research believe that to make computing sustainable in the era of big data, we will need a different kind of machine—the data-centric computer. Today’s computers are processor-centric. The microprocessor, which is the central processing unit in the von Neumann architecture, is where much of the action happens in computing. Working hand in hand with the operating system, the microprocessor sends out instructions to various components within the computer, requesting data from where it’s stored, including memory chips and disk drives. If the computer is part of a larger network, the processor fetches data from storage systems located out on the network. A new system design is needed to greatly reduce the amount of movement required.


pages: 253 words: 80,074

The Man Who Invented the Computer by Jane Smiley

1919 Motor Transport Corps convoy, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Arthur Eddington, British Empire, c2.com, computer age, Fellow of the Royal Society, Henri Poincaré, IBM and the Holocaust, Isaac Newton, John von Neumann, Karl Jansky, Norbert Wiener, Norman Macrae, Pierre-Simon Laplace, RAND corporation, Turing machine, Vannevar Bush, Von Neumann architecture

Partisans of von Neumann make the case that, as with everything else von Neumann did, he took the raw material of another man’s ideas and immediately transcended it, or, as Macrae says, “Johnny grabbed other people’s ideas, then by his clarity leapt five blocks ahead of them, and helped put them into practical effect.” The most important contribution of the “First Draft” to computer design was that it laid out what came to be known as “von Neumann architecture”—that is, that the computer could contain a set of instructions in its memory like the set of instructions that Turing’s human “computer” would have been given and would have to follow day after day forever. The instructions would be stored in the memory, which the electronic computer could readily access (not like a paper tape or a deck of punch cards). This set of instructions in the memory would be called a stored program.

Flowers promised the machine by August, but postwar repairs and improvements to the telephone system superseded the project, and by February 1947 the ACE was going nowhere because Turing could not persuade Womersley to commit himself to Turing’s ideas—for example, an engineering department was set up, but made no progress. Possibly, Womersley was the sort of administrator who thinks contradictory ideas constitute a backup plan, but in the end they constituted no plan at all because what had come to be called “von Neumann architecture”—the principles of computer design set out in the “First Draft”—was simply taking over by coming to seem tried and tested.3 Turing quit. In the autumn of 1947, he returned to Cambridge.
1. One reason that Zuse’s autobiography is interesting is that it gives Americans a perspective on life in Nazi Germany that we rarely get. Zuse seems perennially surprised by the power of the Nazis and the events he lives through.

Mauchly had received an honorary doctorate from the University of Pennsylvania, the Scott Medal from the Franklin Institute, and other Philadelphia-based awards. Eckert was still with Sperry Rand (he stayed with Sperry, and then Unisys, until 1989). Neither Mauchly nor Eckert had profited directly from the ENIAC patent, but they did get credit (and they did seek that credit) for inventing the computer. Eckert, in particular, was vocal about the inaccuracy of the phrase “von Neumann architecture”—he thought it should be called “Eckert architecture.” But the vagaries of patent law and the delay in awarding the Eckert and Mauchly patents seemed to be working for Sperry. If the patent had been awarded in 1947, it would have run out by 1964, before computers became big business. However, in 1960, the patent was still being challenged. It would not be finally awarded until 1964. At that point, it looked as though it would run into the eighties.


pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

Bill Gates: Altair 8800, Claude Shannon: information theory, computer age, Donald Knuth, Douglas Engelbart, Douglas Engelbart, Dynabook, Eratosthenes, Grace Hopper, invention of the telegraph, Isaac Newton, Jacquard loom, James Watt: steam engine, John von Neumann, Joseph-Marie Jacquard, Louis Daguerre, millennium bug, Norbert Wiener, optical character recognition, popular electronics, Richard Feynman, Richard Stallman, Silicon Valley, Steve Jobs, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

(Programming the ENIAC was a matter of throwing switches and plugging in cables.) These instructions should be sequential in memory and addressed with a program counter but should also allow conditional jumps. This design came to be known as the stored-program concept. These design decisions were such an important evolutionary step that today we speak of von Neumann architecture. The computer that we built in the last chapter was a classic von Neumann machine. But with von Neumann architecture comes the von Neumann bottleneck. A von Neumann machine generally spends a significant amount of time just fetching instructions from memory in preparation for executing them. You'll recall that the final design of the Chapter 17 computer required that three-quarters of the time it spent on each instruction be involved in the instruction fetch.
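
The three-quarters figure is easy to reconstruct. One plausible reading, an assumption on my part rather than a detail from this excerpt: a three-byte instruction (an opcode plus a two-byte address) takes three memory accesses to fetch and one more to touch its data.

    fetch_accesses = 3   # opcode byte plus two address bytes
    data_accesses = 1    # the operand itself
    print(fetch_accesses / (fetch_accesses + data_accesses))  # 0.75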

The storage-and-memory question is usually answered with an analogy: "Memory is like the surface of your desk and storage is like the filing cabinet." That's not a bad answer as far as it goes. But I find it quite unsatisfactory. It makes it sound as if computer architecture were patterned after an office. The truth is that the distinction between memory and storage is an artificial one and exists solely because we don't have a single storage medium that is both fast and vast as well as nonvolatile. What we know today as "von Neumann architecture"—the dominant computer architecture for over 50 years—is a direct result of this technical deficiency. Here's another question that someone once asked me: "Why can't you run Macintosh programs under Windows?" My mouth opened to begin an answer when I realized that it involved many more technical issues than I'm sure my questioner was prepared to deal with in one sitting. I want Code to be a book that makes you understand these things, not in some abstract way, but with a depth that just might even rival that of electrical engineers and programmers.

In addition, an important feature of C is its support of pointers, which are essentially numeric memory addresses. Because C has operations that parallel many common processor instructions, C is sometimes categorized as a high-level assembly language. More than any ALGOL-like language, C closely mimics common processor instruction sets. Yet all ALGOL-like languages—which really means most commonly used programming languages—were designed based on von Neumann architecture computers. Breaking out of the von Neumann mind-set when designing a computer language isn't easy, and getting other people to use such a language is even harder. One such non–von Neumann language is LISP (which stands for List Processing), which was designed by John McCarthy in the late 1950s and is useful for work in the field of artificial intelligence. Another language that's just as unusual but nothing like LISP is APL (A Programming Language), developed in the late 1950s by Kenneth Iverson.
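
The “von Neumann mind-set” is easiest to see by contrast. A sketch in Python (chosen only to keep one language across these examples; the point is the style, not the language): the first version updates one memory cell a step at a time, mirroring the hardware, while the second states the whole computation as a single expression, in the spirit of LISP or APL.

    values = [3, 1, 4, 1, 5, 9]

    # Word-at-a-time, von Neumann style: one cell, updated step by step.
    total = 0
    for v in values:
        total = total + v

    # Expression style: the whole computation stated at once.
    assert total == sum(values)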


Turing's Cathedral by George Dyson

1919 Motor Transport Corps convoy, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Benoit Mandelbrot, British Empire, Brownian motion, cellular automata, cloud computing, computer age, Danny Hillis, dark matter, double helix, fault tolerance, Fellow of the Royal Society, finite state, Georg Cantor, Henri Poincaré, housing crisis, IFF: identification friend or foe, indoor plumbing, Isaac Newton, Jacquard loom, John von Neumann, mandelbrot fractal, Menlo Park, Murray Gell-Mann, Norbert Wiener, Norman Macrae, packet switching, pattern recognition, Paul Erdős, Paul Samuelson, phenotype, planetary scale, RAND corporation, random walk, Richard Feynman, SETI@home, social graph, speech recognition, Thorstein Veblen, Turing complete, Turing machine, Von Neumann architecture


The “First Draft of a Report on the EDVAC,” reproduced by mimeograph and released into limited distribution by the Moore School on June 30, 1945, outlined the design of a high-speed stored-program electronic digital computer, including the requisite formulation and interpretation of coded instructions—“which must be given to the device in absolutely exhaustive detail.”42 The functional elements of the computer were separated into a hierarchical memory, a control organ, a central arithmetic unit, and input/output channels, making distinctions still known as the “von Neumann architecture” today. A fast internal memory, coupled to a larger secondary memory, and linked in turn to an unlimited supply of punched cards or paper tape, gave the unbounded storage that Turing had prescribed. The impediment of a single channel between memory and processor is memorialized as the “von Neumann bottleneck,” although its namesake attempted, unsuccessfully, to nip this in the bud. “The whole system will be well balanced, so, that if it is properly and intelligently used, there will be no bottlenecks,” he explained to Max Newman, “even not at the outputs and inputs of the human intellects with which it has to be matched.”43 When a subject captured von Neumann’s attention, he reconstituted it on his own terms from the bottom up.

“This may seem like a highly whimsical way of characterizing a logically deep question of how to express computations to machines. However, it is believed to be not far from an important central truth, that highly recursive, conditional and repetitive routines are used because they are notationally efficient (but not necessarily unique) as descriptions of underlying processes.”40 Bigelow questioned the persistence of the von Neumann architecture and challenged the central dogma of digital computing: that without programmers, computers cannot compute. He (and von Neumann) had speculated from the very beginning about “the possibility of causing various elementary pieces of information situated in the cells of a large array (say, of memory) to enter into a computation process without explicitly generating a coordinate address in ‘machine-space’ for selecting them out of the array.”41 Biology has been doing this all along.


pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI by John Brockman

AI winter, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, artificial general intelligence, Asilomar, autonomous vehicles, basic income, Benoit Mandelbrot, Bill Joy: nanobots, Buckminster Fuller, cellular automata, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, Danny Hillis, David Graeber, easy for humans, difficult for computers, Elon Musk, Eratosthenes, Ernest Rutherford, finite state, friendly AI, future of work, Geoffrey West, Santa Fe Institute, gig economy, income inequality, industrial robot, information retrieval, invention of writing, James Watt: steam engine, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kickstarter, Laplace demon, Loebner Prize, market fundamentalism, Marshall McLuhan, Menlo Park, Norbert Wiener, optical character recognition, pattern recognition, personalized medicine, Picturephone, profit maximization, profit motive, RAND corporation, random walk, Ray Kurzweil, Richard Feynman, Rodney Brooks, self-driving car, sexual politics, Silicon Valley, Skype, social graph, speech recognition, statistical model, Stephen Hawking, Steven Pinker, Stewart Brand, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, telemarketer, telerobotics, the scientific method, theory of mind, Turing machine, Turing test, universal basic income, Upton Sinclair, Von Neumann architecture, Whole Earth Catalog, Y2K, zero-sum game

As in von Neumann’s machine, in biological reproduction the linear sequence of symbols in DNA is interpreted—through transcription into RNA molecules, which are then translated into proteins, the structures that make up a new cell—and the DNA is replicated and encased in the new cell. A second foundational piece of work was in a 1945 “First Draft” report on the design for a digital computer, wherein von Neumann advocated for a memory that could contain both instructions and data. This is now known as a von Neumann architecture computer—as distinct from a Harvard architecture computer, where there are two separate memories, one for instructions and one for data. The vast majority of computer chips built in the era of Moore’s Law are based on the von Neumann architecture, including those powering our data centers, our laptops, and our smartphones. Von Neumann’s digital-computer architecture is conceptually the same generalization—from early digital computers constructed with electromagnetic relays at both Harvard University and Bletchley Park—that occurs in going from a special-purpose Turing Machine to a Universal Turing Machine.
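
A minimal sketch of the distinction drawn here, with invented placeholder contents: the von Neumann machine keeps instructions and data in one address space; the Harvard machine keeps them in two.

    # One shared memory (von Neumann): instructions and data side by side.
    shared_memory = ["LOAD 3", "ADD 4", "HALT", 7, 35]

    # Two separate memories (Harvard): instructions cannot be read as data.
    instruction_memory = ["LOAD 0", "ADD 1", "HALT"]
    data_memory = [7, 35]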

The first is rooted in the abstractions they adopted. In the fifty-year Moore’s Law–fueled race to produce software that could exploit the doubling of computer capability every two years, the typical care and certification of engineering disciplines were thrown by the wayside. Software engineering was fast and prone to failures. This rapid development of software without standards of correctness has opened up many routes to exploit von Neumann architecture’s storage of data and instructions in the same memory. One of the most common routes, known as “buffer overrun,” involves an input number (or long string of characters) that is bigger than the programmer expected and overflows into where the instructions are stored. By carefully designing an input number that is too big by far, someone using a piece of software can infect it with instructions not intended by the programmer, and thus change what it does.
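
A toy model of the buffer overrun described here, with an invented memory layout: real exploits overwrite machine code or return addresses on the stack, but the principle, that overlong input spills from data cells into instruction cells of the same memory, is the same.

    # Four data cells followed by two "instruction" cells, all in one memory.
    memory = [0, 0, 0, 0, "PRINT_BALANCE", "HALT"]

    def read_input(values, start=0):
        # The bug: copies every value it is given, never checking that
        # the buffer holds only four cells.
        for i, v in enumerate(values):
            memory[start + i] = v

    read_input([1, 2, 3, 4])
    print(memory)   # instructions intact
    read_input([1, 2, 3, 4, "TRANSFER_FUNDS", "HALT"])
    print(memory)   # the overlong input has replaced the instructions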


pages: 352 words: 120,202

Tools for Thought: The History and Future of Mind-Expanding Technology by Howard Rheingold

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, card file, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, conceptual framework, Conway's Game of Life, Douglas Engelbart, Dynabook, experimental subject, Hacker Ethic, Howard Rheingold, interchangeable parts, invention of movable type, invention of the printing press, Jacquard loom, John von Neumann, knowledge worker, Marshall McLuhan, Menlo Park, Norbert Wiener, packet switching, pattern recognition, popular electronics, post-industrial society, RAND corporation, Robert Metcalfe, Silicon Valley, speech recognition, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, telemarketer, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

His colleagues, famous thinkers in their own right, all agreed that the operations of Johnny's mind were too deep and far too fast to be entirely human. He was one of history's most brilliant physicists, logicians, and mathematicians, as well as the software genius who invented the first electronic digital computer. John von Neumann was the center of the group who created the "stored program" concept that made truly powerful computers possible, and he specified a template that is still used to design almost all computers--the "von Neumann architecture." When he died, the Secretaries of Defense, the Army, the Air Force, and the Navy, and the Joint Chiefs of Staff were all gathered around his bed, attentive to his last gasps of technical and policy advice. Norbert Wiener, raised to be a prodigy, graduated from Tufts at fourteen, earned his Ph.D. from Harvard at eighteen, and studied with Bertrand Russell at nineteen. Wiener had a different kind of personality from his contemporary and colleague, von Neumann.

All such machines, the authors of the "Preliminary Report" declared, must have a unit where arithmetic and logical operations can be performed (the processing unit where actual calculation takes place, equivalent to Babbage's "mill"), a unit where instructions and data for the current problem can be stored (like Babbage's "store," a kind of temporary memory device), a unit that executes the instructions according to the specified sequential order (like the "read/write head" of Turing's theoretical machine), and a unit where the human operator can enter raw information or see the computed output (what we now call "input-output devices"). Any machine that adheres to these principles -- no matter what physical technology is used to implement these logical functions -- is an example of what has become known as "the von Neumann architecture." It doesn't matter whether you build such a machine out of gears and springs, vacuum tubes, or transistors, as long as its operations follow this logical sequence. This theoretical template was first implemented in the United States at the Institute for Advanced Study. Modified copies of the IAS machine were made for the Rand Corporation, an Air Force spinoff "think tank" that was responsible for keeping track of targets for the nation's new but fast-growing nuclear armory, and for the Los Alamos Laboratory.
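
A minimal sketch mapping those units onto runnable Python; the instruction set here is invented for illustration, not drawn from the report.

    # The "store": one memory holding the program and, at the end, its data.
    memory = ["LOAD", 7, "ADD", 8, "PRINT", "HALT", 0, 20, 22]

    def run():
        acc = 0                  # the arithmetic unit's working register
        pc = 0                   # the control unit's place in the program
        while True:              # execute instructions in sequential order
            op = memory[pc]
            if op == "LOAD":     # pull a value from the store into the "mill"
                acc = memory[memory[pc + 1]]
                pc += 2
            elif op == "ADD":    # an arithmetic operation
                acc = acc + memory[memory[pc + 1]]
                pc += 2
            elif op == "PRINT":  # the output side of input-output
                print(acc)
                pc += 1
            elif op == "HALT":
                break

    run()                        # prints 42 (20 + 22)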

There is something diffident yet warm about the man, something gentle yet stubborn in his nature that wins respect. "He reminds me of Moses parting the Red Sea," is the way Alan Kay describes Engelbart's gentle charisma. Of course, the original Moses never set foot in the Promised Land. And he never had the reputation of being an easy man to work with. In 1951, Engelbart quit his job at Ames and went to graduate school at the University of California at Berkeley, where one of the first von Neumann architecture computers was being built. That was when he began to notice that not only didn't people know what he was talking about, but some presumably "objective" scientists were overly hostile. He started saying the wrong things to people who could affect his career, things that simply sounded strange to the other electrical engineers. "When we get the computer built," this young engineer kept asking, "would it be okay if I use it to teach people?"


pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, animal electricity, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, British Empire, business process, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, complexity theory, continuous integration, Conway's Game of Life, cosmological principle, dark matter, dematerialisation, double helix, Douglas Hofstadter, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, income inequality, index card, industrial robot, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, Kickstarter, liberal capitalism, lifelogging, millennium bug, Moravec's paradox, natural language processing, Norbert Wiener, off grid, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, post-industrial society, prediction markets, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, social intelligence, speech recognition, stem cell, Stephen Hawking, Steven Pinker, strong AI, technological singularity, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

Von Neumann was fascinated by the design of ENIAC, and wondered how the computer might be easily reprogrammed to perform a different set of operations – not involving artillery ballistics this time, but to predict the results of a hydrogen bomb explosion. Invited by the team that developed ENIAC to advise them, von Neumann produced a landmark report,7 which described a machine that could store both data and programs.8 The ‘von Neumann architecture’ – as it has been known ever since – demonstrated how computers could be reprogrammed easily. Until then computers had fixed programs, and had to be physically rewired in order to be reprogrammed. Von Neumann’s architecture allowed code in a computer to be self-modified. One could thus write programs that write programs, an idea that makes possible the host of automated tools that computer engineers have nowadays at their disposal, such as assemblers and compilers.
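
A toy version of “programs that write programs”, with invented names: the program text exists first as ordinary data, and is then handed to the interpreter, which is what an assembler or compiler does at much larger scale.

    def make_adder_source(n):
        # Build Python source code as a plain string of data.
        return f"def add_{n}(x):\n    return x + {n}\n"

    namespace = {}
    exec(make_adder_source(5), namespace)  # the data becomes executable code
    print(namespace["add_5"](10))          # -> 15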

This is a very hard question to answer, since we do not yet have a way to collect credible evidence.26 Nevertheless, I personally would be inclined to bet that the spontaneous emergence of self-awareness in current technological cyberspace is highly improbable. Since the 1940s, we have been locked in a specific approach to computer technology that separates hardware from software, and which is mostly based on a specific hardware architecture called the ‘von Neumann architecture’, as we saw in the previous chapter. There were many other paths we could have taken in computer evolution (for instance, advanced analogue computers), but we did not. The word ‘evolution’ is of great importance here. The pseudo-cybernetic assumption of the AI Singularity hypothesis essentially claims that an evolutionary kind of emergence of self-awareness is involved. Let us accept, for argument’s sake, that evolutionary forces are at play in the way computers have evolved since the 1940s.

The quest for a general formula of computation persisted until the Swiss mathematician Jakob Bernoulli (1654–1705) ‘discovered’ a sequence of constants (the Bernoulli numbers) that provides a uniform formula for the sums of all powers.
12 Brynjolfsson E., McAfee A. (2014), The Second Machine Age. New York: W.W. Norton & Co.
13 In Turing’s description the tape with the symbols (the ‘data’) is separate from the table of instructions (the ‘program’). In modern computers data and programs are stored in the same storage, a key insight that is part of the ‘von Neumann architecture’.
14 According to historians Robert Friedel and Paul Israel, at least twenty-two other inventors ‘discovered’ the incandescent lamp prior to Thomas Edison. However, it was Edison who developed the lamp into an effective source of electric lighting by selecting an effective incandescent material, achieving a higher vacuum and using a higher resistance filament.
15 Konrad Zuse invented the world’s first programmable computer, Z3, which became operational in May 1941.
14 From Bletchley Park to Google Campus
1 ‘Global Information Report 2013’, World Economic Forum (www.weforum.com).
2 This is a phrase from Greek philosopher Heraclitus (535–475 BC).


pages: 239 words: 70,206

Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else by Steve Lohr

"Robert Solow", 23andMe, Affordable Care Act / Obamacare, Albert Einstein, big data - Walmart - Pop Tarts, bioinformatics, business cycle, business intelligence, call centre, cloud computing, computer age, conceptual framework, Credit Default Swap, crowdsourcing, Daniel Kahneman / Amos Tversky, Danny Hillis, data is the new oil, David Brooks, East Village, Edward Snowden, Emanuel Derman, Erik Brynjolfsson, everywhere but in the productivity statistics, Frederick Winslow Taylor, Google Glasses, impulse control, income inequality, indoor plumbing, industrial robot, informal economy, Internet of things, invention of writing, Johannes Kepler, John Markoff, John von Neumann, lifelogging, Mark Zuckerberg, market bubble, meta analysis, meta-analysis, money market fund, natural language processing, obamacare, pattern recognition, payday loans, personalized medicine, precision agriculture, pre–internet, Productivity paradox, RAND corporation, rising living standards, Robert Gordon, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley startup, six sigma, skunkworks, speech recognition, statistical model, Steve Jobs, Steven Levy, The Design of Experiments, the scientific method, Thomas Kuhn: the structure of scientific revolutions, unbanked and underbanked, underbanked, Von Neumann architecture, Watson beat the top human players on Jeopardy!

The low end of the server computer business had become fiercely price competitive, and IBM brought no technical advantage to that market. It was not IBM’s kind of business, even though it generated yearly sales of $4 billion. The big-data era is the next evolutionary upheaval in the landscape of computing. The things people want to do with data, like real-time analysis of data streams or continuously running machine-learning software, pose a threat to the traditional computer industry. Conventional computing—the Von Neumann architecture, named for mathematician and computer scientist John von Neumann—operates according to discrete steps of program, store, and process. Major companies and markets were built around those tiers of computing—software, disk drives, and microprocessors, respectively. Modern data computing, according to John Kelly, IBM’s senior vice president in charge of research, will “completely disrupt the industry as we know it, creating new platforms and players.”



pages: 294 words: 96,661

The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity by Byron Reese

agricultural Revolution, AI winter, artificial general intelligence, basic income, Buckminster Fuller, business cycle, business process, Claude Shannon: information theory, clean water, cognitive bias, computer age, crowdsourcing, dark matter, Elon Musk, Eratosthenes, estate planning, financial independence, first square of the chessboard, first square of the chessboard / second half of the chessboard, full employment, Hans Rosling, income inequality, invention of agriculture, invention of movable type, invention of the printing press, invention of writing, Isaac Newton, Islamic Golden Age, James Hargreaves, job automation, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, lateral thinking, life extension, Louis Pasteur, low skilled workers, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Mary Lou Jepsen, Moravec's paradox, On the Revolutions of the Heavenly Spheres, pattern recognition, profit motive, Ray Kurzweil, recommendation engine, Rodney Brooks, Sam Altman, self-driving car, Silicon Valley, Skype, spinning jenny, Stephen Hawking, Steve Wozniak, Steven Pinker, strong AI, technological singularity, telepresence, telepresence robot, The Future of Employment, the scientific method, Turing machine, Turing test, universal basic income, Von Neumann architecture, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working poor, Works Progress Administration, Y Combinator

Everything your smartphone can do can be programmed on a Turing machine, and everything IBM Watson can do can be programmed on a Turing machine. Who could have guessed that such a humble little device could do all that? Well, Turing could, of course. But no one else seems to have had that singular idea. Exit Turing. Enter John von Neumann, whom we call the father of modern computing. In 1945, he developed the von Neumann architecture for computers. While Turing machines are purely theoretical, designed to frame the question of what computers can do, the von Neumann architecture is about how to build actual computers. He suggested an internal processor and computer memory that holds both programs and data. In addition to the computer’s memory, there might also be external storage to hold data and information not currently needed. Throw in input and output devices, and one has a von Neumann setup.


The Deep Learning Revolution (The MIT Press) by Terrence J. Sejnowski

AI winter, Albert Einstein, algorithmic trading, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, bioinformatics, cellular automata, Claude Shannon: information theory, cloud computing, complexity theory, computer vision, conceptual framework, constrained optimization, Conway's Game of Life, correlation does not imply causation, crowdsourcing, Danny Hillis, delayed gratification, discovery of DNA, Donald Trump, Douglas Engelbart, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, Flynn Effect, Frank Gehry, future of work, Google Glasses, Google X / Alphabet X, Guggenheim Bilbao, Gödel, Escher, Bach, haute couture, Henri Poincaré, I think there is a world market for maybe five computers, industrial robot, informal economy, Internet of things, Isaac Newton, John Conway, John Markoff, John von Neumann, Mark Zuckerberg, Minecraft, natural language processing, Netflix Prize, Norbert Wiener, orbital mechanics / astrodynamics, PageRank, pattern recognition, prediction markets, randomized controlled trial, recommendation engine, Renaissance Technologies, Rodney Brooks, self-driving car, Silicon Valley, Silicon Valley startup, Socratic dialogue, speech recognition, statistical model, Stephen Hawking, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Von Neumann architecture, Watson beat the top human players on Jeopardy!, X Prize, Yogi Berra

There was no limit either to the number of layers in a network or to the connectivity within any given layer. But there was one problem: coming to equilibrium and collecting statistics became increasingly slow to simulate, and larger networks took much longer to reach equilibrium. In principle, it is possible to build a computer with a massively parallel architecture that is much faster than one with a traditional von Neumann architecture that makes one update at a time. Digital computers in the 1980s could perform only a million operations per second. Today’s computers perform billions of operations per second, and, by linking together many thousands of cores, high-performance computers are a million times faster than before—an unprecedented increase in technological power. The Manhattan Project was a $26 billion bet, in 2016 dollars, made by the United States without any assurance that the atomic bomb
[Figure 8.1: David Rumelhart at the University of California, San Diego, around the time the two volumes of Parallel Distributed Processing were published in 1986.]

The race is on to design and build a new generation of chips to run learning algorithms, whether deep, reinforcement, or other, thousands of times faster and more efficiently than the way they are now simulated on general-purpose computers. The new very large-scale integration (VLSI) chips have parallel processing architectures, with memory onboard to alleviate the bottleneck between memory and the central processing unit (CPU) in the sequential von Neumann architectures that have dominated computing for the last fifty years. We are still in an exploratory phase with regard to hardware, and each type of special-purpose VLSI chip has different strengths and limitations. Massive amounts of computer power will be needed to run the large-scale networks that are being developed for AI applications, and there is tremendous potential for profit in building efficient hardware.


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

Ada Lovelace, affirmative action, AI winter, Alfred Russel Wallace, Amazon Mechanical Turk, animal electricity, autonomous vehicles, Black Swan, British Empire, cellular automata, citizen journalism, Claude Shannon: information theory, combinatorial explosion, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, desegregation, discovery of DNA, Douglas Hofstadter, Elon Musk, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, low skilled workers, Mark Zuckerberg, mass immigration, meta analysis, meta-analysis, mutually assured destruction, natural language processing, new economy, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, women in the workforce

The others include devising a way to make ENIAC (arguably the world’s first real computer) programmable; making substantial contributions to quantum physics6 and equilibrium theories in economics; and inventing game theory, an area of mathematical research which shaped Cold War politics for a generation through his descriptions of a game-theoretic construct he called “mutually assured destruction.” Inspired by Turing’s papers on computation, von Neumann also came up with the modern conception of Babbage’s ‘store’ and ‘mill’ computer structure, in what is now called the ‘von Neumann architecture’, the architecture at the heart of almost all modern computers. Amongst this world-changing productivity, von Neumann also speculated about how computer programs, like genetic organisms, might be able to self-replicate. His ‘cellular automata’ theory closely parallels the actual replication methods of biological DNA, despite the fact that von Neumann’s work was done in advance of the actual structure of DNA being discovered by Watson and Crick in 1953.7 At the same time, statistician George Box suggested ‘evolutionary operations’8 as a methodology for optimizing industrial processes in the late 1950s, though he never implemented the procedure as a computer algorithm, and there are a number of other scientists who also struck close to the ideas that would eventually emerge as evolutionary computation.



pages: 420 words: 119,928

The Three-Body Problem (Remembrance of Earth's Past) by Cixin Liu

back-to-the-land, cosmic microwave background, Deng Xiaoping, game design, Henri Poincaré, horn antenna, invisible hand, Isaac Newton, Norbert Wiener, Panamax, RAND corporation, Search for Extraterrestrial Intelligence, Von Neumann architecture

Qin Shi Huang grasped his sword and said, “Replace the malfunctioning component and behead all the soldiers who made up that gate. In the future, any malfunctions will be dealt with the same way!” Von Neumann glanced at Newton, disgusted. They watched as a few riders dashed into the motherboard with their swords unsheathed. After they “repaired” the faulty component, the order to restart was given. This time, the operation went very smoothly. Twenty minutes later, Three Body’s Von Neumann architecture human-formation computer had begun full operations under the Qin 1.0 operating system. “Run solar orbit computation software ‘Three Body 1.0’!” Newton screamed at the top of his lungs. “Start the master computing module! Load the differential calculus module! Load the finite element analysis module! Load the spectral method module! Enter initial condition parameters … and begin calculation!”

Against the background of the three suns in syzygy, text appeared: Civilization Number 184 was destroyed by the stacked gravitational attractions of a tri-solar syzygy. This civilization had advanced to the Scientific Revolution and the Industrial Revolution. In this civilization, Newton established nonrelativistic classical mechanics. At the same time, due to the invention of calculus and the Von Neumann architecture computer, the foundation was set for the quantitative mathematical analysis of the motion of three bodies. After a long time, life and civilization will begin once more, and progress through the unpredictable world of Three Body. We invite you to log on again. * * * Just as Wang logged out of the game, a stranger called. The voice on the phone was that of a very charismatic man. “Hello!


pages: 720 words: 197,129

The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution by Walter Isaacson

1960s counterculture, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, Apple II, augmented reality, back-to-the-land, beat the dealer, Bill Gates: Altair 8800, bitcoin, Bob Noyce, Buckminster Fuller, Byte Shop, c2.com, call centre, citizen journalism, Claude Shannon: information theory, Clayton Christensen, commoditize, computer age, crowdsourcing, cryptocurrency, Debian, desegregation, Donald Davies, Douglas Engelbart, Douglas Hofstadter, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, en.wikipedia.org, Firefox, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, Haight Ashbury, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, index card, Internet Archive, Jacquard loom, Jaron Lanier, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Joseph-Marie Jacquard, Leonard Kleinrock, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, Mother of all demos, new economy, New Journalism, Norbert Wiener, Norman Macrae, packet switching, PageRank, Paul Terrell, pirate software, popular electronics, pre–internet, RAND corporation, Ray Kurzweil, RFC: Request For Comment, Richard Feynman, Richard Stallman, Robert Metcalfe, Rubik’s Cube, Sand Hill Road, Saturday Night Live, self-driving car, Silicon Valley, Silicon Valley startup, Skype, slashdot, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Steven Pinker, Stewart Brand, technological singularity, technoutopianism, Ted Nelson, The Coming Technological Singularity, The Nature of the Firm, The Wisdom of Crowds, Turing complete, Turing machine, Turing test, Vannevar Bush, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Whole Earth Review, wikimedia commons, William Shockley: the traitorous eight

To facilitate this, von Neumann came up with a variable-address program language that enabled an easy switch to substitute instructions while the program was running.57 The team at Penn proposed to the Army that a new and improved ENIAC be built along these lines. It would be binary rather than decimal, use mercury delay lines for memory, and include much, though not all, of what became known as “von Neumann architecture.” In the original proposal to the Army, this new machine was called the Electronic Discrete Variable Automatic Calculator. Increasingly, however, the team started referring to it as a computer, because it would do so much more than merely calculate. Not that it mattered. Everyone simply called it EDVAC. Over the ensuing years, at patent trials and conferences, in books and dueling historical papers, there would be debates over who deserved the most credit for the ideas developed in 1944 and early 1945 that became part of the stored-program computer.

Watson was a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history.

The report is available at http://www.virtualtravelog.net/wp/wp-content/media/2003-08-TheFirstDraft.pdf. 61. Dyson, Turing’s Cathedral, 1957. See also Aspray, John von Neumann and the Origins of Modern Computing. 62. Eckert oral history, Charles Babbage Institute. See also McCartney, ENIAC, 125, quoting Eckert: “We were clearly suckered by John von Neumann, who succeeded in some circles at getting my ideas called the ‘von Neumann architecture.’ ” 63. Jennings Bartik, Pioneer Programmer, 518. 64. Charles Duhigg and Steve Lohr, “The Patent, Used as a Sword,” New York Times, Oct. 7, 2012. 65. McCartney, ENIAC, 103. 66. C. Dianne Martin, “ENIAC: The Press Conference That Shook the World,” IEEE Technology and Society, Dec. 1995. 67. Jennings Bartik, Pioneer Programmer, 1878. 68. Fritz, “The Women of ENIAC.” 69. Jennings Bartik, Pioneer Programmer, 1939. 70.


pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence by John Brockman

agricultural Revolution, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic trading, artificial general intelligence, augmented reality, autonomous vehicles, basic income, bitcoin, blockchain, clean water, cognitive dissonance, Colonization of Mars, complexity theory, computer age, computer vision, constrained optimization, corporate personhood, cosmological principle, cryptocurrency, cuban missile crisis, Danny Hillis, dark matter, discrete time, Douglas Engelbart, Elon Musk, Emanuel Derman, endowment effect, epigenetics, Ernest Rutherford, experimental economics, Flash crash, friendly AI, functional fixedness, global pandemic, Google Glasses, hive mind, income inequality, information trail, Internet of things, invention of writing, iterative process, Jaron Lanier, job automation, Johannes Kepler, John Markoff, John von Neumann, Kevin Kelly, knowledge worker, loose coupling, microbiome, Moneyball by Michael Lewis explains big data, natural language processing, Network effects, Norbert Wiener, pattern recognition, Peter Singer: altruism, phenotype, planetary scale, Ray Kurzweil, recommendation engine, Republic of Letters, RFID, Richard Thaler, Rory Sutherland, Satyajit Das, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, social intelligence, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, strong AI, Stuxnet, superintelligent machines, supervolcano, the scientific method, The Wisdom of Crowds, theory of mind, Thorstein Veblen, too big to fail, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

As molecular neuroscience progresses, encountering no boundaries, and computers reproduce more and more of the behaviors we call intelligence in humans, that hypothesis looks inescapable. If it’s true, then all intelligence is machine intelligence. What distinguishes natural from artificial intelligence is not what it is but only how it’s made. Of course, that little word only is doing some heavy lifting here. Brains use a highly parallel architecture and mobilize many noisy analog units (i.e., neurons) firing simultaneously, while most computers use von Neumann architecture, with serial operation of much faster digital units. These distinctions are blurring, however, from both ends. Neural-net architectures are built in silicon, and brains interact ever more seamlessly with external digital organs. Already I feel that my laptop is an extension of my self—in particular, it is a repository for both visual and narrative memory, a sensory portal into the outside world, and a big part of my mathematical digestive system. 2.

That is, will sufficient complexity in the hardware bring about that sudden jump to self-awareness, all on its own? Or is there some missing ingredient? This is far from obvious; we lack any data, either way. I personally think that consciousness is incredibly more complex than is currently assumed by the “experts.” A human being is not merely x numbers of axons and synapses, and we have no reason to assume that we can count our flops-per-second in a plain Von Neumann architecture, reach a certain number, and suddenly out pops a thinking machine. If true consciousness can emerge, let’s be clear what that could entail. If the machine is truly aware, it will, by definition, develop a “personality.” It may be irascible, flirtatious, maybe the ultimate know-it-all, possibly incredibly full of itself. Would it have doubts or jealousy? Would it instantly spit out the Seventh Brandenburg and then 1,000 more?


pages: 894 words: 190,485

Write Great Code, Volume 1 by Randall Hyde

AltaVista, business process, Donald Knuth, John von Neumann, locality of reference, Von Neumann architecture, Y2K

Knowing about memory performance characteristics, data locality, and cache operation can help you design software that runs as fast as possible. Writing great code requires a strong knowledge of the computer’s architecture.

6.1 The Basic System Components

The basic operational design of a computer system is called its architecture. John von Neumann, a pioneer in computer design, is given credit for the principal architecture in use today. For example, the 80x86 family uses the von Neumann architecture (VNA). A typical von Neumann system has three major components: the central processing unit (CPU), memory, and input/output (I/O), as shown in Figure 6-1.

[Figure 6-1: Typical von Neumann machine]

In VNA machines, like the 80x86, the CPU is where all the action takes place. All computations occur within the CPU. Data and machine instructions reside in memory until the CPU requires them, at which point the system transfers the data into the CPU.
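
The fetch-and-execute cycle implied by this picture can be sketched in a few lines of Python. The toy machine below, its three opcodes, and its memory layout are invented for illustration; they are not from the book and are not actual 80x86 encodings. Its one defining VNA trait is that a single flat memory holds both the program and its data.

# Toy von Neumann machine: one memory holds both code and data,
# and the CPU repeatedly fetches, decodes, and executes from it.

def run(memory, pc=0):
    """Execute until a HALT opcode; memory is a flat list of ints."""
    acc = 0  # single accumulator register
    while True:
        opcode, operand = memory[pc], memory[pc + 1]  # fetch
        pc += 2
        if opcode == 0:      # HALT
            return acc
        elif opcode == 1:    # LOAD addr  -> acc = memory[addr]
            acc = memory[operand]
        elif opcode == 2:    # ADD addr   -> acc += memory[addr]
            acc += memory[operand]
        elif opcode == 3:    # STORE addr -> memory[addr] = acc
            memory[operand] = acc

# Program and data share the same address space:
# LOAD 8, ADD 9, STORE 10, HALT, then three data cells.
ram = [1, 8, 2, 9, 3, 10, 0, 0, 20, 22, 0]
run(ram)
print(ram[10])  # -> 42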

The extra pins needed on the processor to support two physically separate buses increase the cost of the processor and introduce many other engineering problems. However, microprocessor designers have discovered that they can obtain many benefits of the Harvard architecture with few of the disadvantages by using separate on-chip caches for data and instructions. Advanced CPUs use an internal Harvard architecture and an external von Neumann architecture. Figure 9-9 shows the structure of the 80x86 with separate data and instruction caches. Each path between the sections inside the CPU represents an independent bus, and data can flow on all paths concurrently. This means that the prefetch queue can be pulling instruction opcodes from the instruction cache while the execution unit is writing data to the data cache. However, it is not always possible, even with a cache, to avoid bus contention.
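
A rough way to see the benefit of the split on-chip caches: if instruction fetches and data accesses must share one path, they serialize, while separate paths let them overlap. A minimal Python sketch, with an invented instruction mix and one-cycle costs standing in for real timings:

# One entry per executed instruction: does it also touch data memory?
needs_data = [True, False, True, True]

# Unified path: instruction fetch and data access serialize (1 cycle each).
unified_cycles = sum(2 if d else 1 for d in needs_data)

# Split I/D caches: the prefetch queue can pull the next opcode from the
# instruction cache while the execution unit accesses the data cache.
split_cycles = len(needs_data)  # one cycle per instruction when overlapped

print(unified_cycles, split_cycles)  # 7 vs. 4 for this mix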


pages: 968 words: 224,513

The Art of Assembly Language by Randall Hyde

Donald Knuth, P = NP, p-value, sorting algorithm, Von Neumann architecture, Y2K

However, all the statements appearing in programs to this point have been either data declarations or calls to HLA Standard Library routines. There hasn't been any real assembly language. Before we can progress any further and learn some real assembly language, a detour is necessary; unless you understand the basic structure of the Intel 80x86 CPU family, the machine instructions will make little sense. The Intel CPU family is generally classified as a Von Neumann Architecture Machine. Von Neumann computer systems contain three main building blocks: the central processing unit (CPU), memory, and input/output (I/O) devices. These three components are interconnected using the system bus (consisting of the address, data, and control buses). The block diagram in Figure 1-4 shows this relationship. The CPU communicates with memory and I/O devices by placing a numeric value on the address bus to select one of the memory locations or I/O device port locations, each of which has a unique binary numeric address.
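
The address-selection mechanism can be mimicked with a toy bus decoder in Python. The RAM size and port numbers below are illustrative assumptions (0x3F8 and 0x60 echo conventional PC port assignments for the serial port and keyboard controller, but none of this is taken from the book):

# A toy system bus: the CPU puts a numeric address on the address bus,
# and decoding logic routes the access to memory or to an I/O device.

RAM_SIZE = 0x1000
ram = bytearray(RAM_SIZE)
io_ports = {0x3F8: "serial", 0x60: "keyboard"}  # hypothetical port map

def bus_read(address, port_space=False):
    """Route a read. On the 80x86, I/O ports and memory are separate
    address spaces, selected by which kind of bus cycle the CPU runs."""
    if port_space:
        device = io_ports.get(address)
        return f"read from {device} port" if device else "open bus"
    return ram[address]

ram[0x10] = 42
print(bus_read(0x10))                    # memory read -> 42
print(bus_read(0x3F8, port_space=True))  # I/O read -> serial port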

The call instruction supports the following (low-level) syntax:

call Procname;    // Direct call to procedure Procname (or Stmt label).
call( Reg32 );    // Indirect call to procedure whose address appears
                  // in the Reg32 general-purpose 32-bit register.
call( dwordVar ); // Indirect call to the procedure whose address
                  // appears in the dwordVar double word variable.

The first form we've been using throughout this chapter, so there is little need to discuss it here. The second form, the register indirect call, calls the procedure whose address is held in the specified 32-bit register. The address of a procedure is the byte address of the first instruction to execute within that procedure. Remember, on a Von Neumann architecture machine (like the 80x86), the system stores machine instructions in memory along with other data. The CPU fetches the instruction opcode values from memory prior to executing them. When you execute the register indirect call instruction, the 80x86 first pushes the return address onto the stack and then begins fetching the next opcode byte (instruction) from the address specified by the register's value.
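
The stack mechanics behind both call forms (push the return address, then resume fetching at the target address) can be emulated in a short Python sketch. The addresses and the code table are hypothetical stand-ins, not HLA or 80x86 code:

# Emulate direct and register-indirect calls on a tiny machine where
# "procedures" are just entry addresses in a code table.

code = {}    # address -> callable standing in for machine code
stack = []   # the hardware stack, holding return addresses

def call(target, return_address):
    stack.append(return_address)   # call pushes the return address...
    code[target]()                 # ...then control transfers to the target

def ret():
    return stack.pop()             # ret pops it back into the instruction pointer

code[0x400] = lambda: print("inside Procname")

call(0x400, return_address=0x123)   # direct call: target encoded in the instruction
eax = 0x400                         # register holds a procedure's address
call(eax, return_address=0x456)     # indirect call: target read from the register
print(hex(ret()), hex(ret()))       # -> 0x456 0x123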


pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden

Benevolent Dictator For Life (BDFL), business intelligence, business process, cellular automata, cloud computing, commoditize, complexity theory, conceptual framework, continuous integration, data acquisition, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, general-purpose programming language, Guido van Rossum, HyperCard, information retrieval, iterative process, John von Neumann, Larry Wall, linear programming, loose coupling, Mars Rover, millennium bug, NP-complete, Paul Graham, performance metric, Perl 6, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Ruby on Rails, Sapir-Whorf hypothesis, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

I mean some people foresee a time where .NET rules the world; other people foresee a time where JVMs rule the world. To me, that all seems like wishful thinking. At the same time, I don’t know what will happen. There could be a quantum jump where, even though the computers that we know don’t actually change, a different kind of platform suddenly becomes much more prevalent and the rules are different. Perhaps a shift away from the von Neumann architecture? Guido: I wasn’t even thinking of that, but that’s certainly also a possibility. I was more thinking of what if mobile phones become the ubiquitous computing device. Mobile phones are only a few years behind the curve of the power of regular laptops, which suggests that in a few years, mobile phones, apart from the puny keyboard and screen, will have enough computing power so that you don’t need a laptop anymore.

These were inherently extremely concurrent languages. They were very innovative and spawned a lot of follow-on work over the years. Unfortunately, there were a few problems that I didn’t solve, and neither did anybody else. So here was a promising idea, but it just didn’t quite work in the long run. I pulled some of those ideas into UML, but data flow architecture doesn’t seem to replace von Neumann architecture in most cases. So I had my shot and didn’t quite make it. There are also cellular automata. I think over half of my fellow grad students tried to build on them a highly parallel computer. That has to be the right approach, because that’s how the universe is constructed. (Or maybe not. Modern physics is stranger than fiction. The latest speculations suggest that space and time arise out of something more primitive.)


pages: 377 words: 97,144

Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World by James D. Miller

23andMe, affirmative action, Albert Einstein, artificial general intelligence, Asperger Syndrome, barriers to entry, brain emulation, cloud computing, cognitive bias, correlation does not imply causation, crowdsourcing, Daniel Kahneman / Amos Tversky, David Brooks, David Ricardo: comparative advantage, Deng Xiaoping, en.wikipedia.org, feminist movement, Flynn Effect, friendly AI, hive mind, impulse control, indoor plumbing, invention of agriculture, Isaac Newton, John von Neumann, knowledge worker, Long Term Capital Management, low skilled workers, Netflix Prize, neurotypical, Norman Macrae, pattern recognition, Peter Thiel, phenotype, placebo effect, prisoner's dilemma, profit maximization, Ray Kurzweil, recommendation engine, reversible computing, Richard Feynman, Rodney Brooks, Silicon Valley, Singularitarianism, Skype, statistical model, Stephen Hawking, Steve Jobs, supervolcano, technological singularity, The Coming Technological Singularity, the scientific method, Thomas Malthus, transaction costs, Turing test, twin studies, Vernor Vinge, Von Neumann architecture

But even if people such as Albert Einstein and his almost-as-theoretically-brilliant contemporary John von Neumann had close to the highest possible level of intelligence allowed by the laws of physics, creating a few million people or machines possessing these men’s brainpower would still change the world far more than the Industrial Revolution did. To understand why, let me tell you a bit about von Neumann. Although a fantastic scientist, a pathbreaking economist, and one of the best mathematicians of the twentieth century, von Neumann also possessed fierce practical skills. He was, arguably, the creator of the modern digital computer.11 The computer architecture he developed, now called “von Neumann architecture,” lies at the heart of most computers.12 Von Neumann’s brains took him to the centers of corporate power, and he did high-level consulting work for many private businesses, including Standard Oil, for which he helped to extract more resources from dried-out wells.13 Johnny (as his biographer often calls him in tribute to von Neumann’s unpretentious nature) was described as having “the invaluable faculty of being able to take the most difficult problem, separate it into its components, whereupon everything looked brilliantly simple. . . .”14 During World War II, von Neumann became the world’s leading expert on explosives and used this knowledge to help build better conventional bombs, thwart German sea mines, and determine the optimal altitude for airborne detonations. 15 Johnny functioned as a human computer as a part of the Manhattan Project’s efforts to create fission bombs. 16 Whereas atomic weapons developers today use computers to decipher the many mathematical equations that challenge their trade, the Manhattan Project’s scientists had to rely on human intellect alone.


pages: 317 words: 101,074

The Road Ahead by Bill Gates, Nathan Myhrvold, Peter Rinearson

Albert Einstein, Apple's 1984 Super Bowl advert, Berlin Wall, Bill Gates: Altair 8800, Bob Noyce, Bonfire of the Vanities, business process, California gold rush, Claude Shannon: information theory, computer age, Donald Knuth, first square of the chessboard, first square of the chessboard / second half of the chessboard, glass ceiling, global village, informal economy, invention of movable type, invention of the printing press, invention of writing, John von Neumann, knowledge worker, medical malpractice, Mitch Kapor, new economy, packet switching, popular electronics, Richard Feynman, Ronald Reagan, speech recognition, Steve Ballmer, Steve Jobs, Steven Pinker, Ted Nelson, telemarketer, the scientific method, The Wealth of Nations by Adam Smith, transaction costs, Turing machine, Turing test, Von Neumann architecture

To make it perform another function, the staff had to reconfigure the cabling—every time. John von Neumann, a brilliant Hungarian-born American, who is known for many things, including the development of game theory and his contributions to nuclear weaponry, is credited with the leading role in figuring out a way around this problem. He created the paradigm that all digital computers still follow. The "von Neumann architecture," as it is known today, is based on principles he articulated in 1945—including the principle that a computer could avoid cabling changes by storing instructions in its memory. As soon as this idea was put into practice, the modern computer was born. Today the brains of most computers are descendants of the microprocessor Paul Allen and I were so knocked out by in the seventies, and personal computers often are rated according to how many bits of information (one switch in the lighting example) their microprocessor can process at a time, or how many bytes (a cluster of eight bits) of memory or disk-based storage they have.


pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity by Amy Webb

Ada Lovelace, AI winter, Airbnb, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, artificial general intelligence, Asilomar, autonomous vehicles, Bayesian statistics, Bernie Sanders, bioinformatics, blockchain, Bretton Woods, business intelligence, Cass Sunstein, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, Deng Xiaoping, distributed ledger, don't be evil, Donald Trump, Elon Musk, Filter Bubble, Flynn Effect, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Inbox Zero, Internet of things, Jacques de Vaucanson, Jeff Bezos, Joan Didion, job automation, John von Neumann, knowledge worker, Lyft, Mark Zuckerberg, Menlo Park, move fast and break things, natural language processing, New Urbanism, one-China policy, optical character recognition, packet switching, pattern recognition, personalized medicine, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Rodney Brooks, Rubik’s Cube, Sand Hill Road, Second Machine Age, self-driving car, SETI@home, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart cities, South China Sea, sovereign wealth fund, speech recognition, Stephen Hawking, strong AI, superintelligent machines, technological singularity, The Coming Technological Singularity, theory of mind, Tim Cook: Apple, trade route, Turing machine, Turing test, uber lyft, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero day

If you don’t have enough of either, the machine will start running hot, or you’ll get an error message, or it will simply shut down. It’s a problem known as the “von Neumann bottleneck.” No matter how fast the processor is capable of working, the program memory and data memory cause the von Neumann bottleneck, limiting the data transfer rate. Just about all of our current computers are based on the von Neumann architecture, and the problem is that existing processors can’t execute programs any faster than they’re able to retrieve instructions and data from memory. The bottleneck is a big problem for AI. Right now, when you talk to your Alexa or Google Home, your voice is being recorded, parsed, and then transmitted to the cloud for a response—given the physical distance between you and the various data centers involved, it’s mind-blowing that Alexa can talk back within a second or two.
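
A back-of-the-envelope calculation makes the bottleneck concrete: no matter how fast the processor runs, throughput is capped by how quickly operands arrive from memory. The bandwidth and compute figures in this Python sketch are invented round numbers, not the specs of any real chip or of Alexa's hardware:

# If the CPU can retire operations faster than memory can deliver
# operands, memory traffic, not the processor, sets the speed limit.

peak_ops_per_s  = 1e12   # hypothetical: 1 trillion ops/s of raw compute
mem_bytes_per_s = 50e9   # hypothetical: 50 GB/s memory bandwidth
bytes_per_op    = 8      # one 8-byte operand fetched per operation

achievable = min(peak_ops_per_s, mem_bytes_per_s / bytes_per_op)
print(f"{achievable:.2e} ops/s")  # 6.25e+09: memory-bound, under 1% of peak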


pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive by Brian Christian

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bertrand Russell: In Praise of Idleness, carbon footprint, cellular automata, Claude Shannon: information theory, cognitive dissonance, commoditize, complexity theory, crowdsourcing, David Heinemeier Hansson, Donald Trump, Douglas Hofstadter, George Akerlof, Gödel, Escher, Bach, high net worth, Isaac Newton, Jacques de Vaucanson, Jaron Lanier, job automation, l'esprit de l'escalier, Loebner Prize, Menlo Park, Ray Kurzweil, RFID, Richard Feynman, Ronald Reagan, Skype, Social Responsibility of Business Is to Increase Its Profits, starchitect, statistical model, Stephen Hawking, Steve Jobs, Steven Pinker, Thales of Miletus, theory of mind, Thomas Bayes, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero-sum game

Ray Kurzweil (in 2005’s The Singularity Is Near), among several other computer scientists, speaks of a utopian future where we shed our bodies and upload our minds into computers and live forever, virtual, immortal, disembodied. Heaven for hackers. To Ackley’s point, most work on computation has not traditionally been on dynamic systems, or interactive ones, or ones integrating data from the real world in real time. Indeed, theoretical models of the computer—the Turing machine, the von Neumann architecture—seem like reproductions of an idealized version of conscious, deliberate reasoning. As Ackley puts it, “The von Neumann machine is an image of one’s conscious mind where you tend to think: you’re doing long division, and you run this algorithm step-by-step. And that’s not how brains operate. And only in various circumstances is that how minds operate.” I spoke next with University of Massachusetts theoretical computer scientist Hava Siegelmann, who agreed.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, blockchain, brain emulation, Cass Sunstein, Claude Shannon: information theory, complexity theory, computer vision, connected car, crowdsourcing, Daniel Kahneman / Amos Tversky, delayed gratification, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, Flash crash, full employment, future of work, Gerolamo Cardano, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, Mark Zuckerberg, Nash equilibrium, Norbert Wiener, NP-complete, openstreetmap, P = NP, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, Thales of Miletus, The Future of Employment, Thomas Bayes, Thorstein Veblen, transport as a service, Turing machine, Turing test, universal basic income, uranium enrichment, Von Neumann architecture, Wall-E, Watson beat the top human players on Jeopardy!, web application, zero-sum game

It was all the more remarkable for the fact that, unlike monetary amounts, the utility values of various bets and prizes are not directly observable; instead, utilities are to be inferred from the preferences exhibited by an individual. It would be two centuries before the implications of the idea were fully worked out and it became broadly accepted by statisticians and economists. In the middle of the twentieth century, John von Neumann (a great mathematician after whom the standard “von Neumann architecture” for computers was named16) and Oskar Morgenstern published an axiomatic basis for utility theory.17 What this means is the following: as long as the preferences exhibited by an individual satisfy certain basic axioms that any rational agent should satisfy, then necessarily the choices made by that individual can be described as maximizing the expected value of a utility function. In short, a rational agent acts so as to maximize expected utility.
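
The practical content of the theorem (an agent whose preferences satisfy the axioms behaves as if it were maximizing expected utility) is easy to illustrate numerically. In this Python sketch the utility values are made up, standing in for the numbers that would be inferred from an individual's observed preferences:

# A lottery is a list of (probability, outcome) pairs; a von Neumann-
# Morgenstern agent ranks lotteries by expected utility alone.

utility = {"nothing": 0.0, "umbrella": 0.3, "car": 1.0}  # inferred, not observed

def expected_utility(lottery):
    return sum(p * utility[outcome] for p, outcome in lottery)

safe_bet  = [(1.0, "umbrella")]
long_shot = [(0.25, "car"), (0.75, "nothing")]

# The rational choice is whichever lottery maximizes expected utility:
print(max([safe_bet, long_shot], key=expected_utility))  # safe bet: 0.3 > 0.25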


The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, Donald Davies, double helix, Douglas Engelbart, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Leonard Kleinrock, Marc Andreessen, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Robert Metcalfe, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process, zero-sum game

Moreover, he had a point: it was only in the late 1970s, with the availability of reliable and inexpensive microchips, that computer scientists would begin serious experimentation with "parallel" computers that could carry out many operations simultaneously. To this day, the vast majority of computers in the world (including essentially all personal computers) are still based on the serial, step-by-step "von Neumann" architecture. Von Neumann mailed off his handwritten manuscript to Goldstine at the Moore School in late June 1945. He may well have felt rushed at that point, since the Trinity test of the plutonium bomb was less than three weeks away (it would take place on July 16). But in any case, he left numerous blank spaces for names, references, and other information that he planned to insert after his colleagues had had a chance to comment.

Once they were in, moreover, assign each of them a securely walled-off piece of the computer's memory where they could store data and programming code without anybody else's horning in. And finally, when the users needed some actual processing power, dole it out to them via an artful trick. You couldn't literally divide a computer's central processing unit, McCarthy knew; the standard von Neumann architecture allowed for only one such unit, which could carry out only one operation at a time. However, even the slowest electronic computer was very, very fast on any human time scale. So, McCarthy wondered, why not let the CPU skip from one user's memory area to the next user's in sequence, executing a few steps of each task as it went? If that cycle was repeated rapidly enough, the users would never notice the gaps (think of a kindergarten teacher holding simultaneous conversations with a dozen insistent five-year-olds).
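
McCarthy's trick is what we would now call round-robin scheduling with a fixed quantum. A minimal Python sketch, with invented users and step counts:

from collections import deque

# Each "user" owns a walled-off task; the single CPU executes a few
# steps of each in turn, fast enough that nobody notices the gaps.

def time_share(tasks, quantum=2):
    ready = deque(tasks.items())
    while ready:
        user, steps = ready.popleft()
        done = min(quantum, steps)        # run one slice of this user's job
        print(f"{user}: ran {done} step(s)")
        if steps - done:                  # unfinished -> back of the queue
            ready.append((user, steps - done))

time_share({"alice": 3, "bob": 5, "carol": 1})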


pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise by Nathan L. Ensmenger

barriers to entry, business process, Claude Shannon: information theory, computer age, deskilling, Donald Knuth, Firefox, Frederick Winslow Taylor, future of work, Grace Hopper, informal economy, information retrieval, interchangeable parts, Isaac Newton, Jacquard loom, job satisfaction, John von Neumann, knowledge worker, loose coupling, new economy, Norbert Wiener, pattern recognition, performance metric, Philip Mirowski, post-industrial society, Productivity paradox, RAND corporation, Robert Gordon, Shoshana Zuboff, sorting algorithm, Steve Jobs, Steven Levy, the market place, Thomas Kuhn: the structure of scientific revolutions, Thorstein Veblen, Turing machine, Von Neumann architecture, Y2K

In 1945–1946, von Neumann circulated an informal “First Draft of a Report on the EDVAC,” which described the EDVAC in terms of its logical structure, using notation borrowed from neurophysiology. Ignoring most of the physical details of the EDVAC design, such as its vacuum tube circuitry, von Neumann focused instead on the main functional units of the computer: its arithmetic unit, memory, and input and output. The “von Neumann architecture,” as it came to be known, served as the logical basis for almost all computers designed in subsequent decades. By abstracting the logical design of the digital computer from any particular physical implementation, von Neumann took a crucial first step in the development of a modern theory of computation.55 His was not the only contribution; in 1937, for example, Turing had described, for the purposes of demonstrating the limits of computation, what would become known as the Universal Turing Machine.


pages: 492 words: 118,882

The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory by Kariappa Bheemaiah

accounting loophole / creative accounting, Ada Lovelace, Airbnb, algorithmic trading, asset allocation, autonomous vehicles, balance sheet recession, bank run, banks create money, Basel III, basic income, Ben Bernanke: helicopter money, bitcoin, blockchain, Bretton Woods, business cycle, business process, call centre, capital controls, Capital in the Twenty-First Century by Thomas Piketty, cashless society, cellular automata, central bank independence, Claude Shannon: information theory, cloud computing, cognitive dissonance, collateralized debt obligation, commoditize, complexity theory, constrained optimization, corporate governance, creative destruction, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, cryptocurrency, David Graeber, deskilling, Diane Coyle, discrete time, disruptive innovation, distributed ledger, diversification, double entry bookkeeping, Ethereum, ethereum blockchain, fiat currency, financial innovation, financial intermediation, Flash crash, floating exchange rates, Fractional reserve banking, full employment, George Akerlof, illegal immigration, income inequality, income per capita, inflation targeting, information asymmetry, interest rate derivative, inventory management, invisible hand, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Joseph Schumpeter, Kenneth Arrow, Kenneth Rogoff, Kevin Kelly, knowledge economy, large denomination, liquidity trap, London Whale, low skilled workers, M-Pesa, Marc Andreessen, market bubble, market fundamentalism, Mexican peso crisis / tequila crisis, MITM: man-in-the-middle, money market fund, money: store of value / unit of account / medium of exchange, mortgage debt, natural language processing, Network effects, new economy, Nikolai Kondratiev, offshore financial centre, packet switching, Pareto efficiency, pattern recognition, peer-to-peer lending, Ponzi scheme, precariat, pre–internet, price mechanism, price stability, private sector deleveraging, profit maximization, QR code, quantitative easing, quantitative trading / quantitative finance, Ray Kurzweil, Real Time Gross Settlement, rent control, rent-seeking, Satoshi Nakamoto, Satyajit Das, savings glut, seigniorage, Silicon Valley, Skype, smart contracts, software as a service, software is eating the world, speech recognition, statistical model, Stephen Hawking, supply-chain management, technology bubble, The Chicago School, The Future of Employment, The Great Moderation, the market place, The Nature of the Firm, the payments system, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, trade liberalization, transaction costs, Turing machine, Turing test, universal basic income, Von Neumann architecture, Washington Consensus

It was during the time of developing ENIAC that he met the renowned polymath, John von Neumann, and with his help went on to design a stored-program computer, the EDVAC (Electronic Discrete Variable Automatic Computer), the first binary computer (ENIAC was decimal). See Figure 4-11.

[Figure 4-11: General design of the Electronic Discrete Variable Automatic Computer. Reference source: ‘The von Neumann Architecture’, The Computing Universe, 2014]

From an abstract architecture perspective, von Neumann’s design is logically equivalent to Turing’s Universal Turing Machine. In fact, von Neumann had read Turing’s theoretical papers prior to designing his machine. Ultimately it was this simple design that was built upon by successive generations of computer scientists and led to the design of computers with multiple processors and the creation of parallel computing.

The period following the war saw great strides being made in the hardware of computers.


pages: 303 words: 67,891

Advances in Artificial General Intelligence: Concepts, Architectures and Algorithms: Proceedings of the Agi Workshop 2006 by Ben Goertzel, Pei Wang

AI winter, artificial general intelligence, bioinformatics, brain emulation, combinatorial explosion, complexity theory, computer vision, conceptual framework, correlation coefficient, epigenetics, friendly AI, G4S, information retrieval, Isaac Newton, John Conway, Loebner Prize, Menlo Park, natural language processing, Occam's razor, p-value, pattern recognition, performance metric, Ray Kurzweil, Rodney Brooks, semantic web, statistical model, strong AI, theory of mind, traveling salesman, Turing machine, Turing test, Von Neumann architecture, Y2K

Since the number of processing units is constant, as is the capacity of each unit, they will need to be shared by the concepts, because the system as a whole will produce new concepts from time to time, and their number will soon exceed the number of processing units. Consequently, the system still needs time-sharing and space-sharing; what is shared is simply not a single CPU and RAM, but many processing units. Some people blame the von Neumann architecture for the past failures of AI, but the argument is not convincing. It is true that the current computer architecture is not designed especially for AI, but it has not been proved that it cannot be used to implement a truly intelligent system. Special hardware is optional for NARS, since the system can be fully implemented on the current hardware/software platform, though special hardware will surely make it work better.

3.6 Evolution

Under the assumption of insufficient knowledge, all object-level knowledge in NARS can be modified by the system’s various learning mechanisms.


pages: 500 words: 146,240

Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay, Peter Molyneux

Any sufficiently advanced technology is indistinguishable from magic, augmented reality, Bob Noyce, collective bargaining, game design, index card, Mark Zuckerberg, oil shock, pirate software, RAND corporation, risk tolerance, Silicon Valley, Skype, Steve Jobs, Von Neumann architecture

If you mentioned it to somebody, they would sort of look askance at you. I knew that Ampex wasn’t going to do it. I had been starting companies all of my life, so it just seemed like the natural thing to do. I really didn’t question any alternatives to starting the company and then licensing the hardware. Ramsay: Was there a lot of focus on hardware then? Bushnell: It was all hardware. As it turned out, the first video games didn’t have Von Neumann architectures at all. They had what we called “digital-state machines.” These machines were, essentially, clocked output signal generators that created waveforms that drove the television monitor. If you wanted to change anything, you had to change the hardware. There was no software at all. In fact, the very first game that executed a program was Asteroids in 1979. Ramsay: Did you put together a business plan?
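
Bushnell's "digital-state machine" can be approximated as a clocked counter whose bits feed fixed combinational logic that generates the video signal directly: no stored program, no instruction fetch. The line length and the hardwired "paddle" position in this Python sketch are invented for illustration, not actual Atari timings:

# No software: each clock tick advances a counter, and fixed logic on
# the counter bits decides the output signal level for that tick.

LINE_LENGTH = 8   # toy scanline of 8 clock ticks (real hardware: hundreds)

def video_signal(tick):
    x = tick % LINE_LENGTH        # horizontal position from the counter bits
    if x < 2:
        return "SYNC"             # sync pulse at the start of each line
    return "WHITE" if x == 5 else "BLACK"   # hardwired "paddle" at x == 5

print([video_signal(t) for t in range(8)])
# ['SYNC', 'SYNC', 'BLACK', 'BLACK', 'BLACK', 'WHITE', 'BLACK', 'BLACK']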


pages: 489 words: 148,885

Accelerando by Stross, Charles

business cycle, call centre, carbon-based life, cellular automata, cognitive dissonance, commoditize, Conway's Game of Life, dark matter, dumpster diving, Extropian, finite state, Flynn Effect, glass ceiling, gravity well, John von Neumann, Kickstarter, knapsack problem, Kuiper Belt, Magellanic Cloud, mandelbrot fractal, market bubble, means of production, MITM: man-in-the-middle, orbital mechanics / astrodynamics, packet switching, performance metric, phenotype, planetary scale, Pluto: dwarf planet, reversible computing, Richard Stallman, SETI@home, Silicon Valley, Singularitarianism, slashdot, South China Sea, stem cell, technological singularity, telepresence, The Chicago School, theory of mind, Turing complete, Turing machine, Turing test, upwardly mobile, Vernor Vinge, Von Neumann architecture, web of trust, Y2K, zero-sum game

"The cat –" Donna's head swivels round, but Aineko has banged out again, retroactively editing her presence out of the event history of this public space. "What about the cat?" "The family cat," explains Ang. She reaches over for Boris's pitcher of jellyfish juice, but frowns as she does so: "Aineko wasn't conscious back then, but later … when SETI@home finally received that message back, oh, however many years ago, Aineko remembered the lobsters. And cracked it wide open while all the CETI teams were still thinking in terms of von Neumann architectures and concept-oriented programming. The message was a semantic net designed to mesh perfectly with the lobster broadcast all those years ago, and provide a high-level interface to a communications network we're going to visit." She squeezes Boris's fingertips. "SETI@home logged these coordinates as the origin of the transmission, even though the public word was that the message came from a whole lot farther away – they didn't want to risk a panic if people knew there were aliens on our cosmic doorstep.


pages: 528 words: 146,459

Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray, Nathan L. Ensmenger, Jeffrey R. Yost

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, borderless world, Buckminster Fuller, Build a better mousetrap, Byte Shop, card file, cashless society, cloud computing, combinatorial explosion, computer age, deskilling, don't be evil, Donald Davies, Douglas Engelbart, Dynabook, fault tolerance, Fellow of the Royal Society, financial independence, Frederick Winslow Taylor, game design, garden city movement, Grace Hopper, informal economy, interchangeable parts, invention of the wheel, Jacquard loom, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Kickstarter, light touch regulation, linked data, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, natural language processing, Network effects, New Journalism, Norbert Wiener, Occupy movement, optical character recognition, packet switching, PageRank, pattern recognition, Pierre-Simon Laplace, pirate software, popular electronics, prediction markets, pre–internet, QWERTY keyboard, RAND corporation, Robert X Cringely, Silicon Valley, Silicon Valley startup, Steve Jobs, Steven Levy, Stewart Brand, Ted Nelson, the market place, Turing machine, Vannevar Bush, Von Neumann architecture, Whole Earth Catalog, William Shockley: the traitorous eight, women in the workforce, young professional

Although the 101-page report was in draft form, with many references left incomplete, twenty-four copies were immediately distributed to people closely associated with Project PY. Von Neumann’s sole authorship of the report seemed unimportant at the time, but it later led to his being given sole credit for the invention of the modern computer. Today, computer scientists routinely speak of “the von Neumann architecture” in preference to the more prosaic “stored-program concept”; this has done an injustice to von Neumann’s co-inventors. Although von Neumann’s EDVAC Report was a masterly synthesis, it had the effect of driving the engineers and logicians further apart. For example, in the report von Neumann had pursued the biological metaphor by eliminating all the electronic circuits in favor of logical elements using the “neurons” of brain science.


pages: 578 words: 168,350

Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies by Geoffrey West

Alfred Russel Wallace, Anton Chekhov, Benoit Mandelbrot, Black Swan, British Empire, butterfly effect, carbon footprint, Cesare Marchetti: Marchetti’s constant, clean water, complexity theory, computer age, conceptual framework, continuous integration, corporate social responsibility, correlation does not imply causation, creative destruction, dark matter, Deng Xiaoping, double helix, Edward Glaeser, endogenous growth, Ernest Rutherford, first square of the chessboard, first square of the chessboard / second half of the chessboard, Frank Gehry, Geoffrey West, Santa Fe Institute, Guggenheim Bilbao, housing crisis, Index librorum prohibitorum, invention of agriculture, invention of the telephone, Isaac Newton, Jane Jacobs, Jeff Bezos, Johann Wolfgang von Goethe, John von Neumann, Kenneth Arrow, laissez-faire capitalism, life extension, Mahatma Gandhi, mandelbrot fractal, Marchetti’s constant, Masdar, megacity, Murano, Venice glass, Murray Gell-Mann, New Urbanism, Peter Thiel, profit motive, publish or perish, Ray Kurzweil, Richard Feynman, Richard Florida, Silicon Valley, smart cities, Stephen Hawking, Steve Jobs, Stewart Brand, technological singularity, The Coming Technological Singularity, The Death and Life of Great American Cities, the scientific method, too big to fail, transaction costs, urban planning, urban renewal, Vernor Vinge, Vilfredo Pareto, Von Neumann architecture, Whole Earth Catalog, Whole Earth Review, wikimedia commons, working poor

The great John von Neumann, mathematician, physicist, computer scientist, and polymath, a man whose ideas and accomplishments have had a huge influence on your life, made the following remarkably prescient observation more than seventy years ago: “The ever accelerating progress of technology and changes in the mode of human life . . . gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”7 Among von Neumann’s many accomplishments before he died at the relatively young age of fifty-three in 1957 are his seminal role in the early development of quantum mechanics, his invention of game theory, which is a major tool in economic modeling, and the conceptual design of modern computers universally referred to as the von Neumann architecture. So can we imagine making an innovation as powerful and influential as the invention of the Internet every fifteen, ten, or even five years? This is a classic reductio ad absurdum argument showing that regardless of how ingenious we are, how many marvelous gadgets and devices we invent, we simply won’t be able to overcome the threat of the ultimate singularity if we continue business as usual.


pages: 798 words: 240,182

The Transhumanist Reader by Max More, Natasha Vita-More

23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Douglas Engelbart, Drosophila, en.wikipedia.org, endogenous growth, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, Louis Pasteur, Menlo Park, meta analysis, meta-analysis, moral hazard, Network effects, Norbert Wiener, pattern recognition, Pepto Bismol, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Ronald Reagan, scientific worldview, silicon-based life, Singularitarianism, social intelligence, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce, zero-sum game

The transistor density and storage available in computing hardware have increased between 50- and 100-fold, at an exponential rate. Now, the rapidly increasing number of processing cores in general-purpose CPUs and GPU arrays are indicative of a drive toward parallel computation. Parallel computation is a more natural fit to neural computation. It is essential for the acquisition and analysis of data from the brain. Of course, compared with a sequential Von Neumann architecture, parallel computing platforms, and in particular neuromorphic platforms, are a much better target for the implementation of a whole brain emulation. An example of neuromorphic processor hardware is the chip developed at IBM as an outcome of research in the DARPA SyNAPSE program led by Dharmendra Modha.

[Figure 14.2: Large-scale high-resolution representations of neuronal circuitry in neuroinformatics]