Von Neumann architecture

21 results


pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science by Chris Bernhardt


Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, British Empire, cellular automata, Claude Shannon: information theory, complexity theory, Conway's Game of Life, discrete time, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Internet Archive, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Norbert Wiener, Paul Erdős, Turing complete, Turing machine, Turing test, Von Neumann architecture

Turing Machines
Examples of Turing Machines
Computable Functions and Calculations
Church-Turing Thesis
Computational Power
Machines That Don’t Halt
5. Other Systems for Computation
The Lambda Calculus
Tag Systems
One-Dimensional Cellular Automata
6. Encodings and the Universal Machine
A Method of Encoding Finite Automata
Universal Machines
Construction of Universal Machines
Modern Computers Are Universal Machines
Von Neumann Architecture
Random Access Machines
RAMs Can Be Emulated by Turing Machines
Other Universal Machines
What Happens When We Input 〈M〉 into M?
7. Undecidable Problems
Proof by Contradiction
Russell’s Barber
Finite Automata That Do Not Accept Their Encodings
Turing Machines That Do Not Accept Their Encodings
“Does a Turing Machine Diverge on Its Encoding?” Is Undecidable
The Acceptance, Halting, and Blank Tape Problems
An Uncomputable Function
Turing’s Approach
8.

When we simulate the machine on a modern computer we do need to assume the computer has enough storage to be able to do the calculation.) That we can simulate Turing machines on modern computers is not surprising. What is surprising is that we can design a Turing machine to simulate a modern computer, showing that Turing machines are equivalent in computing power to modern computers. We will sketch how this is done. The first step is to get a concrete description of the modern computer.

Von Neumann Architecture

Later we will talk more about John von Neumann, but it is important to know a few facts before we proceed. The First Draft of a Report on the EDVAC is probably the most important paper on the design of modern computers. It was written in 1945, as the first electronic computers were being built. It described the basic outline of how a computer should be designed, incorporating what had been learned from the design of earlier machines.
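
A quick illustration of the simulation claim earlier in this excerpt: a Turing machine is easy to simulate on a modern computer. The minimal C sketch below is an editor's example, not material from Bernhardt's book; the one-state bit-flipping machine, its rule table, and the tape size are all invented for illustration.

/* A minimal table-driven Turing machine simulator (illustrative only). */
#include <stdio.h>
#include <string.h>

#define TAPE 64
#define BLANK '_'

/* One transition: given (state, symbol) -> write, head move, next state. */
typedef struct { char write; int move; int next; } Rule;

int main(void) {
    /* Example machine: state 0 flips 0 <-> 1 moving right; halts on blank.
       rules[state][symbol], symbols indexed 0 -> '0', 1 -> '1', 2 -> blank. */
    Rule rules[1][3] = {{
        {'1',   +1,  0},   /* read '0': write '1', move right, stay in state 0 */
        {'0',   +1,  0},   /* read '1': write '0', move right, stay in state 0 */
        {BLANK,  0, -1},   /* read blank: halt (next state -1)                 */
    }};

    char tape[TAPE];
    memset(tape, BLANK, sizeof tape);
    memcpy(tape, "010011", 6);           /* initial tape contents */

    int state = 0, head = 0;
    while (state >= 0) {                 /* state -1 means the machine halted */
        int sym = tape[head] == '0' ? 0 : tape[head] == '1' ? 1 : 2;
        Rule r = rules[state][sym];
        tape[head] = r.write;
        head += r.move;
        state = r.next;
    }
    printf("%.10s\n", tape);             /* prints 101100____ */
    return 0;
}

The other direction, a Turing machine simulating a modern computer, is the harder construction the excerpt goes on to sketch.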

Turing was interested in what it was possible to compute. His machines were theoretical constructs meant to incorporate the basic computational steps of human computers. Von Neumann was interested in building a physical machine. His focus was not on the theory of computation, but on the design of an efficient machine for doing actual computations. The resulting design outlined in the report is often referred to as von Neumann architecture, and most modern computers are based on this architecture. Von Neumann’s design built on the ideas of many people. The First Draft, as its name suggests, was a draft of a paper, and it was only meant to be circulated to a small number of people. The fact that von Neumann was listed as the sole author and that other people’s work was not properly credited would not have been a problem had the readership remained restricted, as originally intended, to a few colleagues. But the First Draft was widely circulated and became enormously influential in the design of all subsequent computers.

 

pages: 463 words: 118,936

Darwin Among the Machines by George Dyson


Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, British Empire, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, Danny Hillis, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture

Shuffled among a succession of departments, the original proposal was reconsidered to death. Turing’s automatic computing engine, like Babbage’s analytical engine, was never built. Turing’s proposal “synthesized the concepts of a stored-program universal computer, a floating-point subroutine library, artificial intelligence, details such as a hardware bootstrap loader, and much else.”36 At a time when no such machines were in existence and the von Neumann architecture had only just been proposed, Turing produced a complete description of a million-cycle-per-second computer that foreshadowed the RISC (Reduced Instruction Set Computer) architecture that has now gained prominence after fifty years. The report was accompanied by circuit diagrams, a detailed physical and logical analysis of the internal storage system, sample programs, detailed (if bug-ridden) subroutines, and even an estimated (if unrealistic) cost of £11,200.

“If he really wanted a computer,” explained Arthur Burks, “the thing to do was to build it.”17 Von Neumann structured the project so as to introduce multiple copies of the new machine in several places at once. Progress reports were disseminated not only among the participating funding agencies and to a half-dozen groups that were duplicating the IAS design, but to any location where the potential of high-speed digital computers might fall on fertile ground. It is no accident that the vast majority of computers in circulation today follow the von Neumann architecture—characterized by a central processing unit operating in parallel on the multiple bits of one word of data at a time, a hierarchical memory ranging from fast but limited random-access memory to slow but unlimited media, such as floppy disks or tape, and a distinction between hardware and software that enabled robust computers (and a robust computer industry) to advance by a leapfrog process with each element evolving freely on its own.

“Quite often the likelihood of getting actual numerical results was very much larger if he was not in the computer room, because everybody got so nervous when he was there,” reported Martin Schwarzschild. “But when you were in real thinking trouble, you would go to von Neumann and nobody else.”43 Von Neumann’s reputation, after fifty years, has been injured less by his critics than by his own success. The astounding proliferation of the von Neumann architecture has obscured von Neumann’s contributions to massively parallel computing, distributed information processing, evolutionary computation, and neural nets. Because his deathbed notes for his canceled Silliman lectures at Yale were published posthumously (and for a popular audience) as The Computer and the Brain (1958), von Neumann’s work has been associated with the claims of those who were exaggerating the analogies between the digital computer and the brain.

 

pages: 118 words: 35,663

Smart Machines: IBM's Watson and the Era of Cognitive Computing (Columbia Business School Publishing) by John E. Kelly III


AI winter, call centre, carbon footprint, crowdsourcing, demand response, discovery of DNA, Erik Brynjolfsson, future of work, Geoffrey West, Santa Fe Institute, global supply chain, Internet of things, John von Neumann, Mars Rover, natural language processing, optical character recognition, pattern recognition, planetary scale, RAND corporation, RFID, Richard Feynman, smart grid, smart meter, speech recognition, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!

A cognitive computer employing these systems will respond to inquiries more quickly than today’s computers; less data movement will be required and less energy will be used. Today’s von Neumann–style computing won’t go away when cognitive systems come online. New chip and computing technologies will extend its life far into the future. In many cases, the cognitive architecture and the von Neumann architecture will be employed side by side in hybrid systems. Traditional computing will become ever more capable while cognitive technologies will do things that were not possible before. Already, cloud, social networking, mobile, and new ways to interact with computing from tablets to glasses are fueling the desire for cognitive systems that will, for example, both harvest insights from social networks and enhance our experiences within them.

But being stuck at home gave him a lot of time to think over the SyNAPSE dilemma. He concluded that while it was futile in the short term to try to invent a new technology for a cognitive machine, that didn’t mean the project should be abandoned. Instead, the team needed to refocus on CMOS chip technology and on digital circuitry rather than analog circuitry. They would create an entirely new non–von Neumann architecture in both silicon and software that would simulate the functions of neurons and synapses. Using that architecture, they would produce chips for sense-making tasks that would be vastly more efficient than today’s standard digital processors.7 Dharmendra called a meeting of all of the participants in the project. On January 21, 2010, about twenty-five people crowded into the library at the Almaden lab.

Unless we can make computers many orders of magnitude more energy efficient, we’re not going to be able to use them extensively as our intelligent assistants. Computing intelligence will be too costly to be practical. Scientists at IBM Research believe that to make computing sustainable in the era of big data, we will need a different kind of machine—the data-centric computer. Today’s computers are processor-centric. The microprocessor, which is the central processing unit in the von Neumann architecture, is where much of the action happens in computing. Working hand in hand with the operating system, the microprocessor sends out instructions to various components within the computer, requesting data from where it’s stored, including memory chips and disk drives. If the computer is part of a larger network, the processor fetches data from storage systems located out on the network. A new system design is needed to greatly reduce the amount of movement required.

 

pages: 253 words: 80,074

The Man Who Invented the Computer by Jane Smiley


1919 Motor Transport Corps convoy, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Arthur Eddington, British Empire, c2.com, computer age, Fellow of the Royal Society, Henri Poincaré, IBM and the Holocaust, Isaac Newton, John von Neumann, Karl Jansky, Norbert Wiener, RAND corporation, Turing machine, V2 rocket, Vannevar Bush, Von Neumann architecture

Partisans of von Neumann make the case that, as with everything else von Neumann did, he took the raw material of another man’s ideas and immediately transcended it, or, as Macrae says, “Johnny grabbed other people’s ideas, then by his clarity leapt five blocks ahead of them, and helped put them into practical effect.” The most important contribution of the “First Draft” to computer design was that it laid out what came to be known as “von Neumann architecture”—that is, that the computer could contain a set of instructions in its memory like the set of instructions that Turing’s human “computer” would have been given and would have to follow day after day forever. The instructions would be stored in the memory, which the electronic computer could readily access (not like a paper tape or a deck of punch cards). This set of instructions in the memory would be called a stored program.

Flowers promised the machine by August, but postwar repairs and improvements to the telephone system superseded the project, and by February 1947 the ACE was going nowhere because Turing could not persuade Womersley to commit himself to Turing’s ideas—for example, an engineering department was set up, but made no progress. Possibly, Womersley was the sort of administrator who thinks contradictory ideas constitute a backup plan, but in the end they constituted no plan at all because what had come to be called “von Neumann architecture”—the principles of computer design set out in the “First Draft”—were simply taking over by coming to seem tried and tested.3 Turing quit. In the autumn of 1947, he returned to Cambridge. 1. One reason that Zuse’s autobiography is interesting is that it gives Americans a perspective on life in Nazi Germany that we rarely get. Zuse seems perennially surprised by the power of the Nazis and the events he lives through.

Mauchly had received an honorary doctorate from the University of Pennsylvania, the Scott Medal from the Franklin Institute, and other Philadelphia-based awards. Eckert was still with Sperry Rand (he stayed with Sperry, and then Unisys, until 1989). Neither Mauchly nor Eckert had profited directly from the ENIAC patent, but they did get credit (and they did seek that credit) for inventing the computer. Eckert, in particular, was vocal about the inaccuracy of the phrase “von Neumann architecture”—he thought it should be called “Eckert architecture.” But the vagaries of patent law and the delay in awarding the Eckert and Mauchly patents seemed to be working for Sperry. If the patent had been awarded in 1947, it would have run out by 1964, before computers became big business. However, in 1960, the patent was still being challenged. It would not be finally awarded until 1964. At that point, it looked as though it would run into the eighties.

 

pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold


Bill Gates: Altair 8800, Claude Shannon: information theory, computer age, Douglas Engelbart, Dynabook, Eratosthenes, Grace Hopper, invention of the telegraph, Isaac Newton, Jacquard loom, James Watt: steam engine, John von Neumann, Joseph-Marie Jacquard, Louis Daguerre, millennium bug, Norbert Wiener, optical character recognition, popular electronics, Richard Feynman, Richard Stallman, Silicon Valley, Steve Jobs, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

Programming the ENIAC was a matter of throwing switches and plugging in cables.) These instructions should be sequential in memory and addressed with a program counter but should also allow conditional jumps. This design came to be known as the stored-program concept. These design decisions were such an important evolutionary step that today we speak of von Neumann architecture. The computer that we built in the last chapter was a classic von Neumann machine. But with von Neumann architecture comes the von Neumann bottleneck. A von Neumann machine generally spends a significant amount of time just fetching instructions from memory in preparation for executing them. You'll recall that the final design of the Chapter 17 computer required that three-quarters of the time it spent on each instruction be involved in the instruction fetch.
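
The stored-program design and the bottleneck described here can be made concrete with a toy machine. In the C sketch below (an editor's illustration, not Petzold's; the five-instruction set is invented), program and data share one flat memory, a program counter walks through it, and every instruction costs two memory fetches before any work is done, which is the von Neumann bottleneck in miniature.

#include <stdio.h>

enum { LOAD, ADDI, STORE, JNZ, HALT };

int main(void) {
    /* One flat memory: cells 0..9 hold the program, cell 13 holds data. */
    int mem[16] = {
        LOAD,  13,   /* 0: acc = mem[13]                    */
        ADDI,  -1,   /* 2: acc = acc - 1                    */
        STORE, 13,   /* 4: mem[13] = acc                    */
        JNZ,    0,   /* 6: if acc != 0, jump back to cell 0 */
        HALT,   0,   /* 8: stop                             */
        0, 0, 0,     /* 10..12: unused                      */
        5,           /* 13: a counter, in the same memory   */
        0, 0
    };
    int pc = 0, acc = 0;
    for (;;) {
        int op  = mem[pc++];   /* instruction fetch: one trip to memory */
        int arg = mem[pc++];   /* operand fetch: another trip to memory */
        switch (op) {
        case LOAD:  acc = mem[arg];    break;
        case ADDI:  acc += arg;        break;
        case STORE: mem[arg] = acc;    break;
        case JNZ:   if (acc) pc = arg; break;
        case HALT:  printf("counter ended at %d\n", mem[13]);
                    return 0;
        }
    }
}

Note that the two fetches at the top of the loop happen before the switch does any useful work, mirroring the fetch-dominated timing of the Chapter 17 machine.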

The storage-and-memory question is usually answered with an analogy: "Memory is like the surface of your desk and storage is like the filing cabinet." That's not a bad answer as far as it goes. But I find it quite unsatisfactory. It makes it sound as if computer architecture were patterned after an office. The truth is that the distinction between memory and storage is an artificial one and exists solely because we don't have a single storage medium that is both fast and vast as well as nonvolatile. What we know today as "von Neumann architecture"—the dominant computer architecture for over 50 years—is a direct result of this technical deficiency. Here's another question that someone once asked me: "Why can't you run Macintosh programs under Windows?" My mouth opened to begin an answer when I realized that it involved many more technical issues than I'm sure my questioner was prepared to deal with in one sitting. I want Code to be a book that makes you understand these things, not in some abstract way, but with a depth that just might even rival that of electrical engineers and programmers.

In addition, an important feature of C is its support of pointers, which are essentially numeric memory addresses. Because C has operations that parallel many common processor instructions, C is sometimes categorized as a high-level assembly language. More than any ALGOL-like language, C closely mimics common processor instruction sets. Yet all ALGOL-like languages—which really means most commonly used programming languages—were designed based on von Neumann architecture computers. Breaking out of the von Neumann mind-set when designing a computer language isn't easy, and getting other people to use such a language is even harder. One such non–von Neumann language is LISP (which stands for List Processing), which was designed by John McCarthy in the late 1950s and is useful for work in the field of artificial intelligence. Another language that's just as unusual but nothing like LISP is APL (A Programming Language), developed in the late 1950s by Kenneth Iverson.
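
A minimal C example of the pointer point above: a pointer is essentially a numeric memory address. This snippet is an editor's illustration; the variable names are arbitrary.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    int x = 42;
    int *p = &x;                        /* p holds the address of x */
    printf("value %d at address %p\n", *p, (void *)p);

    uintptr_t n = (uintptr_t)p;         /* the same address, viewed as a number */
    printf("as an integer: %lu\n", (unsigned long)n);

    *p = 43;                            /* writing through the address */
    printf("x is now %d\n", x);
    return 0;
}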

 

pages: 239 words: 70,206

Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else by Steve Lohr


23andMe, Affordable Care Act / Obamacare, Albert Einstein, big data - Walmart - Pop Tarts, bioinformatics, business intelligence, call centre, cloud computing, computer age, conceptual framework, Credit Default Swap, crowdsourcing, Daniel Kahneman / Amos Tversky, Danny Hillis, data is the new oil, David Brooks, East Village, Edward Snowden, Emanuel Derman, Erik Brynjolfsson, everywhere but in the productivity statistics, Frederick Winslow Taylor, Google Glasses, impulse control, income inequality, indoor plumbing, industrial robot, informal economy, Internet of things, invention of writing, John von Neumann, Mark Zuckerberg, market bubble, meta-analysis, natural language processing, obamacare, pattern recognition, payday loans, personalized medicine, precision agriculture, pre–internet, Productivity paradox, RAND corporation, rising living standards, Robert Gordon, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley startup, six sigma, skunkworks, speech recognition, statistical model, Steve Jobs, Steven Levy, The Design of Experiments, the scientific method, Thomas Kuhn: the structure of scientific revolutions, unbanked and underbanked, underbanked, Von Neumann architecture, Watson beat the top human players on Jeopardy!

The low end of the server computer business had become fiercely price competitive, and IBM brought no technical advantage to that market. It was not IBM’s kind of business, even though it generated yearly sales of $4 billion. The big-data era is the next evolutionary upheaval in the landscape of computing. The things people want to do with data, like real-time analysis of data streams or continuously running machine-learning software, pose a threat to the traditional computer industry. Conventional computing—the Von Neumann architecture, named for mathematician and computer scientist John von Neumann—operates according to discrete steps of program, store, and process. Major companies and markets were built around those tiers of computing—software, disk drives, and microprocessors, respectively. Modern data computing, according to John Kelly, IBM’s senior vice president in charge of research, will “completely disrupt the industry as we know it, creating new platforms and players.”

., 5–6
Snyder, Steven, 165–67, 170
social networks: research using human behavior and, 86–94; retail use, 153–62; spread of information and, 73–74; Twitter posts and, 197–202; see also privacy concerns
Social Security numbers, data used to predict person’s, 187–88
software, origin of term, 96
Solow, Robert, 72
Speakeasy programming language, 160
Spee (Harvard club), 28–30
Spohrer, Jim, 25
Stanford University, 211–12
Starbucks, 157
Stockholm, rush-hour pricing in, 47
storytelling, computer algorithms and, 120–21, 149, 165–66, 205, 214
structural racism, in big data racial profiling, 194–95
Structure of Scientific Revolutions, The (Kuhn), 175
Sweeney, Latanya, 193–95
System S, at IBM, 40
Tarbell, Ida, 208
Taylor, Frederick Winslow, 207–8
Tecco, Halle, 16, 25, 28, 168–69
Tetlock, Philip, 67–68
thermostats, learning by, 143–45, 147–53
Thinking, Fast and Slow (Kahneman), 66–67
toggling, 84
Truth in Lending Act (1968), 185
T-shaped people, 25
Tukey, John, 96–97
Turing, Alan, 178–79
Tversky, Amos, 66
Twitter, 85; posts studied for personal information, 197–202
“Two Cultures, The” (Snow), 5–6
“universal machine” (Turing’s theoretical computer), 179
universities, data science and, 15–16, 97–98, 211–12
Unlocking the Value of Personal Data: From Collection to Usage (World Economic Forum), 203
“Unreasonable Effectiveness of Data, The” (Norvig), 116
use-only restrictions, on data, 203
Uttamchandani, Menka, 77–78, 80, 212
VALS (Values, Attitudes, and Lifestyles), 155
Van Alstyne, Marshall, 74
Vance, Ashlee, 85
Vargas, Veronica, 159–60
Varma, Anil, 136–37
Veritas, 91
vineyards, data used for precision agriculture in, 123–33, 212
Vivero, David, 29
Vladeck, David, 203, 204
von Neumann, John, 54
Von Neumann architecture, 54
Walker, Donald, 2, 63, 212
Walmart, 104, 154
Watson, Thomas Jr., 49
Watson technology, of IBM, 45, 66–67, 120, 205; as cloud service, 9, 54; Jeopardy and, 7, 40, 111, 114; medical diagnoses and, 69–70, 109
Watts, Duncan J., 86
weather analysis, with big data, 129–32
Weitzner, Daniel, 184
“Why ask Why?” (Gelman and Imbens), 115–16
winemaking, precision agriculture and, 123–33, 212
Wing, Michael, 49–50
workforce rebalancing, at IBM, 57
World Economic Forum, 203
Yarkoni, Tal, 199
Yoshimi, Bill, 198
ZestFinance, data correlation and, 104–7
Zeyliger, Philip, 100–101
Zhou, Michelle, 197–202
Zuckerberg, Mark, 28, 86, 89

ABOUT THE AUTHOR: Steve Lohr reports on technology, business, and economics for the New York Times.

 

pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis


3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, British Empire, business process, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, complexity theory, continuous integration, Conway's Game of Life, cosmological principle, dark matter, dematerialisation, double helix, Douglas Hofstadter, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, income inequality, index card, industrial robot, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, millennium bug, natural language processing, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, post-industrial society, prediction markets, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, speech recognition, stem cell, Stephen Hawking, Steven Pinker, strong AI, technological singularity, The Coming Technological Singularity, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

Von Neumann was fascinated by the design of ENIAC, and wondered how the computer might be easily reprogrammed to perform a different set of operations – not involving artillery ballistics this time, but to predict the results of a hydrogen bomb explosion. Invited by the team that developed ENIAC to advise them, von Neumann produced a landmark report,7 which described a machine that could store both data and programs.8 The ‘von Neumann architecture’ – as it has been known ever since – demonstrated how computers could be reprogrammed easily. Until then computers had fixed programs, and had to be physically rewired in order to be reprogrammed. Von Neumann’s architecture allowed code in a computer to be self-modified. One could thus write programs that write programs, an idea that makes possible the host of automated tools that computer engineers have nowadays at their disposal, such as assemblers and compilers.
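
What "self-modifying code" means in a stored-program machine can be sketched with a toy interpreter. The instruction set below is invented for illustration (it is not von Neumann's actual design): because the program sits in the same memory it can write to, one instruction patches the operand of another, so the same stored instruction does different work on each pass.

#include <stdio.h>

enum { ADDI, INC, DJNZ, HALT };

int main(void) {
    int mem[16] = {
        ADDI, 1,    /* 0: acc += the operand stored at cell 1 (inside the code) */
        INC,  1,    /* 2: mem[1] += 1 -- the program edits its own operand      */
        DJNZ, 0,    /* 4: mem[14] -= 1; if still nonzero, jump to cell 0        */
        HALT, 0,    /* 6: stop                                                  */
        0, 0, 0, 0, 0, 0,  /* 8..13: unused                                     */
        4,          /* 14: loop counter                                         */
        0
    };
    int pc = 0, acc = 0;
    for (;;) {
        int op = mem[pc++], arg = mem[pc++];  /* arg is refetched every pass */
        switch (op) {
        case ADDI: acc += arg;              break;
        case INC:  mem[arg] += 1;           break;  /* may point into the code */
        case DJNZ: if (--mem[14]) pc = arg; break;
        case HALT: printf("acc = %d\n", acc);       /* 1+2+3+4 = 10 */
                   return 0;
        }
    }
}

Each time around the loop, the ADDI at cell 0 adds a different amount, because the INC at cell 2 has rewritten its operand. A fixed-program machine would have to be rewired to change behavior like this.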

This is a very hard question to answer, since we do not yet have a way to collect credible evidence.26 Nevertheless, I personally would be inclined to bet that the spontaneous emergence of self-awareness in current technological cyberspace is highly improbable. Since the 1940s, we have been locked in a specific approach to computer technology that separates hardware from software and is mostly based on a specific hardware architecture called the ‘von Neumann architecture’, as we saw in the previous chapter. There were many other paths we could have taken in computer evolution (for instance, advanced analogue computers), but we did not take them. The word ‘evolution’ is of great importance here. The pseudo-cybernetic assumption of the AI Singularity hypothesis essentially claims that an evolutionary kind of emergence of self-awareness is involved. Let us accept, for argument’s sake, that evolutionary forces are at play in the way computers have evolved since the 1940s.

The quest for a general formula of computation persisted until the Swiss mathematician Jakob Bernoulli (1654–1705) ‘discovered’ a sequence of constants (the Bernoulli numbers) that provides a uniform formula for the sums of all powers.
12. Brynjolfsson E., McAfee A. (2014), The Second Machine Age. New York: W.W. Norton & Co.
13. In Turing’s description the tape with the symbols (the ‘data’) is separate from the table of instructions (the ‘program’). In modern computers data and programs are stored in the same storage, a key insight that is part of the ‘von Neumann architecture’.
14. According to historians Robert Friedel and Paul Israel, at least twenty-two other inventors ‘discovered’ the incandescent lamp prior to Thomas Edison. However, it was Edison who developed the lamp into an effective source of electric lighting by selecting an effective incandescent material, achieving a higher vacuum, and using a higher-resistance filament.
15. Konrad Zuse invented the world’s first programmable computer, the Z3, which became operational in May 1941.

14. From Bletchley Park to Google Campus
1. ‘Global Information Report 2013’, World Economic Forum (www.weforum.com).
2. This is a phrase from the Greek philosopher Heraclitus (535–475 BC).

 

pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence by John Brockman


3D printing, agricultural Revolution, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic trading, artificial general intelligence, augmented reality, autonomous vehicles, bitcoin, blockchain, clean water, cognitive dissonance, Colonization of Mars, complexity theory, computer age, computer vision, constrained optimization, corporate personhood, cosmological principle, cryptocurrency, cuban missile crisis, Danny Hillis, dark matter, discrete time, Elon Musk, Emanuel Derman, endowment effect, epigenetics, Ernest Rutherford, experimental economics, Flash crash, friendly AI, Google Glasses, hive mind, income inequality, information trail, Internet of things, invention of writing, iterative process, Jaron Lanier, job automation, John von Neumann, Kevin Kelly, knowledge worker, loose coupling, microbiome, Moneyball by Michael Lewis explains big data, natural language processing, Network effects, Norbert Wiener, pattern recognition, Peter Singer: altruism, phenotype, planetary scale, Ray Kurzweil, recommendation engine, Republic of Letters, RFID, Richard Thaler, Rory Sutherland, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, strong AI, Stuxnet, superintelligent machines, supervolcano, the scientific method, The Wisdom of Crowds, theory of mind, Thorstein Veblen, too big to fail, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

As molecular neuroscience progresses, encountering no boundaries, and computers reproduce more and more of the behaviors we call intelligence in humans, that hypothesis looks inescapable. If it’s true, then all intelligence is machine intelligence. What distinguishes natural from artificial intelligence is not what it is but only how it’s made. Of course, that little word only is doing some heavy lifting here. Brains use a highly parallel architecture and mobilize many noisy analog units (i.e., neurons) firing simultaneously, while most computers use von Neumann architecture, with serial operation of much faster digital units. These distinctions are blurring, however, from both ends. Neural-net architectures are built in silicon, and brains interact ever more seamlessly with external digital organs. Already I feel that my laptop is an extension of my self—in particular, it is a repository for both visual and narrative memory, a sensory portal into the outside world, and a big part of my mathematical digestive system. 2.

That is, will sufficient complexity in the hardware bring about that sudden jump to self-awareness, all on its own? Or is there some missing ingredient? This is far from obvious; we lack any data, either way. I personally think that consciousness is incredibly more complex than is currently assumed by the “experts.” A human being is not merely x numbers of axons and synapses, and we have no reason to assume that we can count our flops-per-second in a plain Von Neumann architecture, reach a certain number, and suddenly out pops a thinking machine. If true consciousness can emerge, let’s be clear what that could entail. If the machine is truly aware, it will, by definition, develop a “personality.” It may be irascible, flirtatious, maybe the ultimate know-it-all, possibly incredibly full of itself. Would it have doubts or jealousy? Would it instantly spit out the Seventh Brandenburg and then 1,000 more?

 

pages: 420 words: 119,928

The Three-Body Problem (Remembrance of Earth's Past) by Cixin Liu


back-to-the-land, cosmic microwave background, Deng Xiaoping, game design, Henri Poincaré, horn antenna, invisible hand, Isaac Newton, Norbert Wiener, Panamax, RAND corporation, Search for Extraterrestrial Intelligence, Von Neumann architecture

Qin Shi Huang grasped his sword and said, “Replace the malfunctioning component and behead all the soldiers who made up that gate. In the future, any malfunctions will be dealt with the same way!” Von Neumann glanced at Newton, disgusted. They watched as a few riders dashed into the motherboard with their swords unsheathed. After they “repaired” the faulty component, the order to restart was given. This time, the operation went very smoothly. Twenty minutes later, Three Body’s Von Neumann architecture human-formation computer had begun full operations under the Qin 1.0 operating system. “Run solar orbit computation software ‘Three Body 1.0’!” Newton screamed at the top of his lungs. “Start the master computing module! Load the differential calculus module! Load the finite element analysis module! Load the spectral method module! Enter initial condition parameters … and begin calculation!”

Against the background of the three suns in syzygy, text appeared: Civilization Number 184 was destroyed by the stacked gravitational attractions of a tri-solar syzygy. This civilization had advanced to the Scientific Revolution and the Industrial Revolution. In this civilization, Newton established nonrelativistic classical mechanics. At the same time, due to the invention of calculus and the Von Neumann architecture computer, the foundation was set for the quantitative mathematical analysis of the motion of three bodies. After a long time, life and civilization will begin once more, and progress through the unpredictable world of Three Body. We invite you to log on again. * * * Just as Wang logged out of the game, a stranger called. The voice on the phone was that of a very charismatic man. “Hello!

 

pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden


business intelligence, business process, cellular automata, cloud computing, complexity theory, conceptual framework, continuous integration, data acquisition, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, general-purpose programming language, HyperCard, information retrieval, iterative process, John von Neumann, linear programming, loose coupling, Mars Rover, millennium bug, NP-complete, Paul Graham, performance metric, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

I mean some people foresee a time where .NET rules the world; other people foresee a time where JVMs rule the world. To me, that all seems like wishful thinking. At the same time, I don’t know what will happen. There could be a quantum jump where, even though the computers that we know don’t actually change, a different kind of platform suddenly becomes much more prevalent and the rules are different. Perhaps a shift away from the von Neumann architecture? Guido: I wasn’t even thinking of that, but that’s certainly also a possibility. I was more thinking of what if mobile phones become the ubiquitous computing device. Mobile phones are only a few years behind the curve of the power of regular laptops, which suggests that in a few years, mobile phones, apart from the puny keyboard and screen, will have enough computing power so that you don’t need a laptop anymore.

These were inherently extremely concurrent languages. They were very innovative and spawned a lot of follow-on work over the years. Unfortunately, there were a few problems that I didn’t solve, and neither did anybody else. So here was a promising idea, but it just didn’t quite work in the long run. I pulled some of those ideas into UML, but data flow architecture doesn’t seem to replace von Neumann architecture in most cases. So I had my shot and didn’t quite make it. There are also cellular automata. I think over half of my fellow grad students tried to build a highly parallel computer on them. That has to be the right approach, because that’s how the universe is constructed. (Or maybe not. Modern physics is stranger than fiction. The latest speculations suggest that space and time arise out of something more primitive.)

 

pages: 968 words: 224,513

The Art of Assembly Language by Randall Hyde


p-value, sorting algorithm, Von Neumann architecture, Y2K

However, all the statements appearing in programs to this point have been either data declarations or calls to HLA Standard Library routines. There hasn't been any real assembly language. Before we can progress any further and learn some real assembly language, a detour is necessary; unless you understand the basic structure of the Intel 80x86 CPU family, the machine instructions will make little sense. The Intel CPU family is generally classified as a Von Neumann Architecture Machine. Von Neumann computer systems contain three main building blocks: the central processing unit (CPU), memory, and input/output (I/O) devices. These three components are interconnected using the system bus (consisting of the address, data, and control buses). The block diagram in Figure 1-4 shows this relationship. The CPU communicates with memory and I/O devices by placing a numeric value on the address bus to select one of the memory locations or I/O device port locations, each of which has a unique binary numeric address.
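
A small C sketch of the address-decoding idea just described: the CPU places a numeric address on the bus, and decode logic determines whether a memory cell or an I/O device port responds. The port boundary (0xF0) and the device behavior are assumptions invented for the example, not details of the 80x86.

#include <stdio.h>

#define MEM_SIZE 256
#define IO_BASE  0xF0   /* pretend I/O ports live at addresses 0xF0..0xFF */

static unsigned char memory[MEM_SIZE];

unsigned char bus_read(unsigned addr) {
    if (addr >= IO_BASE)        /* address decodes to an I/O port */
        return 0x5A;            /* stand-in for "read a device register" */
    return memory[addr];        /* otherwise it selects a memory cell */
}

void bus_write(unsigned addr, unsigned char data) {
    if (addr >= IO_BASE)
        printf("device port 0x%02X <- 0x%02X\n", addr, (unsigned)data);
    else
        memory[addr] = data;
}

int main(void) {
    bus_write(0x10, 42);                          /* selects a memory cell */
    bus_write(0xF2, 7);                           /* selects a device port */
    printf("mem[0x10] = %d\n", bus_read(0x10));
    return 0;
}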

The call instruction supports the following (low-level) syntax:

call Procname;    // Direct call to procedure Procname (or Stmt label).
call( Reg32 );    // Indirect call to procedure whose address appears
                  // in the Reg32 general-purpose 32-bit register.
call( dwordVar ); // Indirect call to the procedure whose address
                  // appears in the dwordVar double word variable.

We've been using the first form throughout this chapter, so there is little need to discuss it here. The second form, the register indirect call, calls the procedure whose address is held in the specified 32-bit register. The address of a procedure is the byte address of the first instruction to execute within that procedure. Remember, on a Von Neumann architecture machine (like the 80x86), the system stores machine instructions in memory along with other data. The CPU fetches the instruction opcode values from memory prior to executing them. When you execute the register indirect call instruction, the 80x86 first pushes the return address onto the stack and then begins fetching the next opcode byte (instruction) from the address specified by the register's value.
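
For comparison, the register-indirect call has a direct analogue in C: a function pointer, that is, a procedure's address held in a variable and called through. A minimal sketch (names arbitrary):

#include <stdio.h>

void greet(void) { puts("hello from greet"); }

int main(void) {
    void (*fp)(void) = greet;  /* like loading a procedure's address into a register */
    fp();                      /* indirect call: jump to whatever address fp holds   */
    return 0;
}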

 

pages: 894 words: 190,485

Write Great Code, Volume 1 by Randall Hyde


AltaVista, business process, John von Neumann, locality of reference, Von Neumann architecture, Y2K

Knowing about memory performance characteristics, data locality, and cache operation can help you design software that runs as fast as possible. Writing great code requires a strong knowledge of the computer’s architecture. 6.1 The Basic System Components The basic operational design of a computer system is called its architecture. John von Neumann, a pioneer in computer design, is given credit for the principal architecture in use today. For example, the 80x86 family uses the von Neumann architecture (VNA). A typical von Neumann system has three major components: the central processing unit (CPU), memory, and input/output (I/O), as shown in Figure 6-1. Figure 6-1. Typical von Neumann machine In VNA machines, like the 80x86, the CPU is where all the action takes place. All computations occur within the CPU. Data and machine instructions reside in memory until the CPU requires them, at which point the system transfers the data into the CPU.
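
The remark above about data locality and cache operation is easy to demonstrate. The C sketch below (an editor's example; sizes arbitrary, timings machine-dependent) sums the same matrix twice: row-major traversal touches memory sequentially and is cache-friendly, while column-major traversal strides across it and typically runs several times slower.

#include <stdio.h>
#include <time.h>

#define N 2048
static int a[N][N];           /* ~16 MB, larger than typical caches */

int main(void) {
    long sum = 0;
    clock_t t0 = clock();
    for (int i = 0; i < N; i++)       /* row-major: matches the array layout */
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    clock_t t1 = clock();
    for (int j = 0; j < N; j++)       /* column-major: large stride, poor locality */
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    clock_t t2 = clock();
    printf("row %.3fs  col %.3fs  (sum %ld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
    return 0;
}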

The extra pins needed on the processor to support two physically separate buses increase the cost of the processor and introduce many other engineering problems. However, microprocessor designers have discovered that they can obtain many benefits of the Harvard architecture with few of the disadvantages by using separate on-chip caches for data and instructions. Advanced CPUs use an internal Harvard architecture and an external von Neumann architecture. Figure 9-9 shows the structure of the 80x86 with separate data and instruction caches. Each path between the sections inside the CPU represents an independent bus, and data can flow on all paths concurrently. This means that the prefetch queue can be pulling instruction opcodes from the instruction cache while the execution unit is writing data to the data cache. However, it is not always possible, even with a cache, to avoid bus contention.

 

pages: 377 words: 97,144

Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World by James D. Miller


23andMe, affirmative action, Albert Einstein, artificial general intelligence, Asperger Syndrome, barriers to entry, brain emulation, cloud computing, cognitive bias, correlation does not imply causation, crowdsourcing, Daniel Kahneman / Amos Tversky, David Brooks, David Ricardo: comparative advantage, Deng Xiaoping, en.wikipedia.org, feminist movement, Flynn Effect, friendly AI, hive mind, impulse control, indoor plumbing, invention of agriculture, Isaac Newton, John von Neumann, knowledge worker, Long Term Capital Management, low skilled workers, Netflix Prize, neurotypical, pattern recognition, Peter Thiel, phenotype, placebo effect, prisoner's dilemma, profit maximization, Ray Kurzweil, recommendation engine, reversible computing, Richard Feynman, Rodney Brooks, Silicon Valley, Singularitarianism, Skype, statistical model, Stephen Hawking, Steve Jobs, supervolcano, technological singularity, The Coming Technological Singularity, the scientific method, Thomas Malthus, transaction costs, Turing test, Vernor Vinge, Von Neumann architecture

But even if people such as Albert Einstein and his almost-as-theoretically-brilliant contemporary John von Neumann had close to the highest possible level of intelligence allowed by the laws of physics, creating a few million people or machines possessing these men’s brainpower would still change the world far more than the Industrial Revolution did. To understand why, let me tell you a bit about von Neumann. Although a fantastic scientist, a pathbreaking economist, and one of the best mathematicians of the twentieth century, von Neumann also possessed fierce practical skills. He was, arguably, the creator of the modern digital computer.11 The computer architecture he developed, now called “von Neumann architecture,” lies at the heart of most computers.12 Von Neumann’s brains took him to the centers of corporate power, and he did high-level consulting work for many private businesses, including Standard Oil, for which he helped to extract more resources from dried-out wells.13 Johnny (as his biographer often calls him in tribute to von Neumann’s unpretentious nature) was described as having “the invaluable faculty of being able to take the most difficult problem, separate it into its components, whereupon everything looked brilliantly simple. . . .”14 During World War II, von Neumann became the world’s leading expert on explosives and used this knowledge to help build better conventional bombs, thwart German sea mines, and determine the optimal altitude for airborne detonations.15 Johnny functioned as a human computer as a part of the Manhattan Project’s efforts to create fission bombs.16 Whereas atomic weapons developers today use computers to decipher the many mathematical equations that challenge their trade, the Manhattan Project’s scientists had to rely on human intellect alone.

 

pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive by Brian Christian


4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bertrand Russell: In Praise of Idleness, carbon footprint, cellular automata, Claude Shannon: information theory, cognitive dissonance, complexity theory, crowdsourcing, Donald Trump, Douglas Hofstadter, George Akerlof, Gödel, Escher, Bach, high net worth, Isaac Newton, Jacques de Vaucanson, Jaron Lanier, job automation, l'esprit de l'escalier, Loebner Prize, Menlo Park, Ray Kurzweil, RFID, Richard Feynman, Ronald Reagan, Skype, statistical model, Stephen Hawking, Steve Jobs, Steven Pinker, theory of mind, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!

Ray Kurzweil (in 2005’s The Singularity Is Near), among several other computer scientists, speaks of a utopian future where we shed our bodies and upload our minds into computers and live forever, virtual, immortal, disembodied. Heaven for hackers. To Ackley’s point, most work on computation has not traditionally been on dynamic systems, or interactive ones, or ones integrating data from the real world in real time. Indeed, theoretical models of the computer—the Turing machine, the von Neumann architecture—seem like reproductions of an idealized version of conscious, deliberate reasoning. As Ackley puts it, “The von Neumann machine is an image of one’s conscious mind where you tend to think: you’re doing long division, and you run this algorithm step-by-step. And that’s not how brains operate. And only in various circumstances is that how minds operate.” I spoke next with University of Massachusetts theoretical computer scientist Hava Siegelmann, who agreed.

 

The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop


Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, double helix, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process

Moreover, he had a point: it was only in the late 1970s, with the availability of reliable and inexpensive microchips, that computer scientists would begin serious experimentation with "parallel" computers that could carry out many operations simultaneously. To this day, the vast majority of computers in the world-including essentially all personal computers-are still based on the serial, step-by-step "von Neumann" architecture. Von Neumann mailed off his handwritten manuscript to Goldstine at the Moore School in late June 1945. He may well have felt rushed at that point, since the Trinity test of the plutonium bomb was less than three weeks away (it would take place on July 16). But in any case, he left numerous blank spaces for names, references, and other information that he planned to insert after his colleagues had had a chance to comment.

Once they were in, moreover, assign each of them a securely walled-off piece of the computer's memory where they could store data and programming code without anybody else's horning in. And finally, when the users needed some actual processing power, dole it out to them via an artful trick. You couldn't literally divide a computer's central processing unit, McCarthy knew; the standard von Neumann architecture allowed for only one such unit, which could carry out only one operation at a time. However, even the slowest electronic computer was very, very fast on any human time scale. So, McCarthy wondered, why not let the CPU skip from one user's memory area to the next user's in sequence, executing a few steps of each task as it went? If that cycle was repeated rapidly enough, the users would never notice the gaps (think of a kindergarten teacher holding simultaneous conversations with a dozen insistent five-year-olds).
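
McCarthy's trick can be sketched in a few lines of C. In this toy model (an editor's illustration; the task structure and step counts are invented), one loop plays the role of the single CPU and advances each user's task a couple of steps per turn; cycled quickly enough, every user would appear to have the whole machine.

#include <stdio.h>

#define USERS 3
#define SLICE 2   /* steps of "CPU" granted per turn */

typedef struct { int remaining; } Task;

int main(void) {
    Task tasks[USERS] = { {5}, {3}, {7} };   /* pending work per user */
    int active = USERS;
    while (active > 0) {
        for (int u = 0; u < USERS; u++) {    /* visit each user in sequence */
            if (tasks[u].remaining == 0) continue;
            for (int s = 0; s < SLICE && tasks[u].remaining > 0; s++) {
                tasks[u].remaining--;        /* execute a few steps of this task */
                printf("user %d ran a step (%d left)\n", u, tasks[u].remaining);
            }
            if (tasks[u].remaining == 0) active--;
        }
    }
    puts("all users served");
    return 0;
}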

 

pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise by Nathan L. Ensmenger


barriers to entry, business process, Claude Shannon: information theory, computer age, deskilling, Firefox, Frederick Winslow Taylor, future of work, Grace Hopper, informal economy, information retrieval, interchangeable parts, Isaac Newton, Jacquard loom, job satisfaction, John von Neumann, knowledge worker, loose coupling, new economy, Norbert Wiener, pattern recognition, performance metric, post-industrial society, Productivity paradox, RAND corporation, Robert Gordon, sorting algorithm, Steve Jobs, Steven Levy, the market place, Thomas Kuhn: the structure of scientific revolutions, Thorstein Veblen, Turing machine, Von Neumann architecture, Y2K

In 1945–1946, von Neumann circulated an informal “First Draft of a Report on the EDVAC,” which described the EDVAC in terms of its logical structure, using notation borrowed from neurophysiology. Ignoring most of the physical details of the EDVAC design, such as its vacuum tube circuitry, von Neumann focused instead on the main functional units of the computer: its arithmetic unit, memory, and input and output. The “von Neumann architecture,” as it came to be known, served as the logical basis for almost all computers designed in subsequent decades. By abstracting the logical design of the digital computer from any particular physical implementation, von Neumann took a crucial first step in the development of a modern theory of computation.55 His was not the only contribution; in 1937, for example, Turing had described, for the purposes of demonstrating the limits of computation, what would become known as the Universal Turing Machine.

 

pages: 500 words: 146,240

Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay, Peter Molyneux


Any sufficiently advanced technology is indistinguishable from magic, augmented reality, collective bargaining, game design, index card, Mark Zuckerberg, oil shock, pirate software, RAND corporation, risk tolerance, Silicon Valley, Skype, Steve Jobs, Von Neumann architecture

If you mentioned it to somebody, they would sort of look askance at you. I knew that Ampex wasn’t going to do it. I had been starting companies all of my life, so it just seemed like the natural thing to do. I really didn’t question any alternatives to starting the company and then licensing the hardware. Ramsay: Was there a lot of focus on hardware then? Bushnell: It was all hardware. As it turned out, the first video games didn’t have Von Neumann architectures at all. They had what we called “digital-state machines.” These machines were, essentially, clocked output signal generators that created waveforms that drove the television monitor. If you wanted to change anything, you had to change the hardware. There was no software at all. In fact, the very first game that executed a program was Asteroids in 1979. Ramsay: Did you put together a business plan?

 

pages: 303 words: 67,891

Advances in Artificial General Intelligence: Concepts, Architectures and Algorithms: Proceedings of the AGI Workshop 2006 by Ben Goertzel, Pei Wang


AI winter, artificial general intelligence, bioinformatics, brain emulation, combinatorial explosion, complexity theory, computer vision, conceptual framework, correlation coefficient, epigenetics, friendly AI, information retrieval, Isaac Newton, John Conway, Loebner Prize, Menlo Park, natural language processing, Occam's razor, p-value, pattern recognition, performance metric, Ray Kurzweil, Rodney Brooks, semantic web, statistical model, strong AI, theory of mind, traveling salesman, Turing machine, Turing test, Von Neumann architecture, Y2K

Since the number of processing units is constant, as is the capacity of each unit, they will need to be shared by the concepts, because the system as a whole produces new concepts from time to time, and their number will soon exceed the number of processing units. Consequently, the system still needs time-sharing and space-sharing; it is only that what is shared is not a single CPU and RAM, but many processing units. Some people blame the von Neumann architecture of computers for the past failure of AI, but the argument is not convincing. It is true that the current computer architecture is not designed especially for AI, but it has not been proved that it cannot be used to implement a truly intelligent system. Special hardware is optional for NARS, since the system can be fully implemented on the current hardware/software platform, though special hardware will surely make it work better.

3.6 Evolution

Under the assumption of insufficient knowledge, all object-level knowledge in NARS can be modified by the system’s various learning mechanisms.

 

pages: 489 words: 148,885

Accelerando by Charles Stross


call centre, carbon-based life, cellular automata, cognitive dissonance, Conway's Game of Life, dark matter, dumpster diving, Extropian, finite state, Flynn Effect, glass ceiling, gravity well, John von Neumann, knapsack problem, Kuiper Belt, Magellanic Cloud, mandelbrot fractal, market bubble, means of production, packet switching, performance metric, phenotype, planetary scale, Pluto: dwarf planet, reversible computing, Richard Stallman, SETI@home, Silicon Valley, Singularitarianism, slashdot, South China Sea, stem cell, technological singularity, telepresence, The Chicago School, theory of mind, Turing complete, Turing machine, Turing test, upwardly mobile, Vernor Vinge, Von Neumann architecture, web of trust, Y2K

"The cat –" Donna's head swivels round, but Aineko has banged out again, retroactively editing her presence out of the event history of this public space. "What about the cat?" "The family cat," explains Ang. She reaches over for Boris's pitcher of jellyfish juice, but frowns as she does so: "Aineko wasn't conscious back then, but later … when SETI@home finally received that message back, oh, however many years ago, Aineko remembered the lobsters. And cracked it wide open while all the CETI teams were still thinking in terms of von Neumann architectures and concept-oriented programming. The message was a semantic net designed to mesh perfectly with the lobster broadcast all those years ago, and provide a high-level interface to a communications network we're going to visit." She squeezes Boris's fingertips. "SETI@home logged these coordinates as the origin of the transmission, even though the public word was that the message came from a whole lot farther away – they didn't want to risk a panic if people knew there were aliens on our cosmic doorstep.

 

pages: 528 words: 146,459

Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray, Nathan L. Ensmenger, Jeffrey R. Yost


Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, borderless world, Buckminster Fuller, Build a better mousetrap, Byte Shop, card file, cashless society, cloud computing, combinatorial explosion, computer age, deskilling, don't be evil, Douglas Engelbart, Dynabook, fault tolerance, Fellow of the Royal Society, financial independence, Frederick Winslow Taylor, game design, garden city movement, Grace Hopper, informal economy, interchangeable parts, invention of the wheel, Jacquard loom, Jeff Bezos, jimmy wales, John von Neumann, linked data, Mark Zuckerberg, Marshall McLuhan, Menlo Park, natural language processing, Network effects, New Journalism, Norbert Wiener, Occupy movement, optical character recognition, packet switching, PageRank, pattern recognition, pirate software, popular electronics, prediction markets, pre–internet, QWERTY keyboard, RAND corporation, Robert X Cringely, Silicon Valley, Silicon Valley startup, Steve Jobs, Steven Levy, Stewart Brand, Ted Nelson, the market place, Turing machine, Vannevar Bush, Von Neumann architecture, Whole Earth Catalog, William Shockley: the traitorous eight, women in the workforce, young professional

Although the 101-page report was in draft form, with many references left incomplete, twenty-four copies were immediately distributed to people closely associated with Project PY. Von Neumann’s sole authorship of the report seemed unimportant at the time, but it later led to his being given sole credit for the invention of the modern computer. Today, computer scientists routinely speak of “the von Neumann architecture” in preference to the more prosaic “stored-program concept”; this has done an injustice to von Neumann’s co-inventors. Although von Neumann’s EDVAC Report was a masterly synthesis, it had the effect of driving the engineers and logicians further apart. For example, in the report von Neumann had pursued the biological metaphor by eliminating all the electronic circuits in favor of logical elements using the “neurons” of brain science.

 

pages: 798 words: 240,182

The Transhumanist Reader by Max More, Natasha Vita-More


23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Drosophila, en.wikipedia.org, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, Louis Pasteur, Menlo Park, meta-analysis, moral hazard, Network effects, Norbert Wiener, P = NP, pattern recognition, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Richard Feynman, Ronald Reagan, silicon-based life, Singularitarianism, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce

The transistor density and storage available in computing hardware have increased between 50- and 100-fold, at an exponential rate. Now, the rapidly increasing number of processing cores in general-purpose CPUs and GPU arrays is indicative of a drive toward parallel computation. Parallel computation is a more natural fit to neural computation. It is essential for the acquisition and analysis of data from the brain. Of course, compared with a sequential Von Neumann architecture, parallel computing platforms, and in particular neuromorphic platforms, are a much better target for the implementation of a whole brain emulation. An example of neuromorphic processor hardware is the chip developed at IBM as an outcome of research in the DARPA SyNAPSE program led by Dharmendra Modha.

Figure 14.2: Large-scale high-resolution representations of neuronal circuitry in neuroinformatics.