combinatorial explosion



pages: 304 words: 84,396

Bounce: Mozart, Federer, Picasso, Beckham, and the Science of Success by Matthew Syed

barriers to entry, battle of ideas, Berlin Wall, combinatorial explosion, deliberate practice, desegregation, Fall of the Berlin Wall, fear of failure, Isaac Newton, Norman Mailer, pattern recognition, placebo effect, zero-sum game

Sure, you can offer pointers on what to look for and what to avoid, and these can be helpful. But relating the entirety of the information is impossible because the cues being processed by experts—in sport or elsewhere—are so subtle and relate to each other in such complex ways that it would take forever to codify them in their mind-boggling totality. This is known as combinatorial explosion, a concept that will help to nail down many of the insights of this chapter. The best way to get a sense of the strange power of combinatorial explosion is to imagine folding a piece of paper in two, making the paper twice as thick. Now repeat the process a hundred times. How thick is the paper now? Most people tend to guess in the range of a few inches to a few yards. In fact the thickness would stretch eight hundred thousand billion times the distance from Earth to the sun.
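
As a rough check of that figure, here is a minimal Python sketch; the 0.1 mm sheet thickness and the mean Earth-Sun distance are assumptions for illustration, not values from the book:

    # Doubling a ~0.1 mm sheet 100 times, compared with the Earth-Sun distance.
    sheet_thickness_m = 1e-4      # assumed thickness of one sheet: 0.1 mm
    earth_sun_m = 1.496e11        # mean Earth-Sun distance in metres

    folded = sheet_thickness_m * 2**100
    print(f"thickness after 100 folds: {folded:.2e} m")
    print(f"Earth-Sun distances: {folded / earth_sun_m:.2e}")
    # ~8.5e14, i.e. on the order of eight hundred thousand billion times the distance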

This is a perfect example of expert decision making in practice: circumventing combinatorial explosion via advanced pattern recognition. It is precisely the same skill wielded by Kasparov, but on an ice hockey rink rather than a chessboard. How was Gretzky able to do this? Let’s hear from the man himself: “I wasn’t naturally gifted in terms of size and speed; everything I did in hockey I worked for.” And later: “The highest compliment that you can pay me is to say that I worked hard every day…. That’s how I came to know where the puck was going before it even got there.” All of which helps to explain a qualification that was made earlier in the chapter: you will remember that the ten-thousand-hour rule was said to apply to any complex task. What is meant by complexity? In effect, it describes those tasks characterized by combinatorial explosion; tasks where success is determined, first and foremost, by superiority in software (pattern recognition and sophisticated motor programs) rather than hardware (simple speed or strength).

Deep Blue, 37–39, 46–47, 52–53 memory in, 24–26, 46–47 midgame theory, 103 Polgar family, 66–71 and practice, 48, 66–71 studying historic games, 103 ten years’ practice in, 15–16 chick spotters, 223–24 child prodigies, 55–71 hours of practice, 14, 63, 64 internal motivation of, 63–64, 67 mental math, 71–75 Mozart, 55–58, 63 Polgar theories, 64–71 Williams sisters, 59–61 Woods, 58–59 China, table tennis in, 82, 91, 136 choking: author’s experience, 181–84, 199–200 brain systems in, 188–93, 194, 196 capacity for, 196 as common phenomenon, 185 in complex tasks, 195–96 and focus, 191, 192–93, 196–99 in golf, 185–88, 193–94 names for, 184 overcoming, 198–99 psychological reversion in, 195 in tennis, 194–95 under pressure, 184-85, 200n chunking, 24, 30, 191 and de-chunking, 195n and decision patterns, 48, 49 and kinesiology, 31–32 Clifford, Simon, 87, 88 Cohen, Geoffrey, 117–18, 119 Collier, Sue, 7 Colvin, Geoff, 56, 85–86 Talent Is Overrated, 21–22, 44 combinatorial explosion, 45, 47, 49–50 complexity, 50, 97–100 concentration, 78, 83, 92 Conte, Victor, 240 Courier, Jim, 134 Coyle, Daniel, 118, 121 The Talent Code, 87–88, 119 creativity: lightning bolt theory of, 98–99 and purposeful practice, 98 in technical innovation, 100 cricket, 31, 205–6 Cromwell, Dean, 279 cultural theory of emotion, 212–13 cycling, 91, 136–37, 208–9 Czech, D. R., 163 Darwin, Charles, 10, 278 Davis, Randall, 44 Davis, Steve, 200 Dawkins, Richard, 151 decision making: in chess, 47–48, 49, 51 by chunking patterns, 48, 49 combinatorial explosion, 47, 50 by computer, 44 and experience, 51 by firefighters, 40–43, 44–45, 47, 51 by hospital nurses, 41, 43, 78 independent, 63 and intuition, 47 knowledge in, 43, 51 pattern-recognition theory, 48 on perceptual cues, 43 research in, 40–43, 51–52 in sports, 45, 47 talent in, 43–44 Deep Blue (chess), 37–39, 46–47, 52–53 Devi, Shakuntala, 71 doctors, diagnostic accuracy of, 109 doublethink, 176–77 doubt, 164–66, 168–69, 184 Douglas, Desmond, 28–29, 32–34 decisions made by, 41–42 practice hours of, 33–34 “sixth sense” of, 42 Dweck, Carol: on Enron, 142 talent studies of, 123–29 on words of praise, 129–32, 135, 137–38, 145 East Germany, steroid use in, 233–40, 243 economics, author’s study of, 115–16 education, lowering standards in, 132 Edwards, A.


pages: 398 words: 86,855

Bad Data Handbook by Q. Ethan McCallum

Amazon Mechanical Turk, asset allocation, barriers to entry, Benoit Mandelbrot, business intelligence, cellular automata, chief data officer, Chuck Templeton: OpenTable:, cloud computing, cognitive dissonance, combinatorial explosion, commoditize, conceptual framework, database schema, DevOps, en.wikipedia.org, Firefox, Flash crash, Gini coefficient, illegal immigration, iterative process, labor-force participation, loose coupling, natural language processing, Netflix Prize, quantitative trading / quantitative finance, recommendation engine, selection bias, sentiment analysis, statistical model, supply-chain management, survivorship bias, text mining, too big to fail, web application

–Guessing Text Encoding, Normalizing Text–Normalizing Text, Problem: Application-Specific Characters Leaking into Plain Text–Problem: Application-Specific Characters Leaking into Plain Text, Problem: Application-Specific Characters Leaking into Plain Text–Problem: Application-Specific Characters Leaking into Plain Text, Getting Reviews–Sentiment Classification, Sentiment Classification, Polarized Language–Polarized Language, Corpus Creation–Corpus Creation, Training a Classifier–Lessons Learned, Moving On to the Professional World, Government Data Is Very Real, Lessons Learned and Looking Ahead, File Formats–File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, File Formats, A Relational Cost Allocations Model–A Relational Cost Allocations Model, The Delicate Sound of a Combinatorial Explosion…–The Delicate Sound of a Combinatorial Explosion…, The Hidden Network Emerges–Finding Value in Network Properties Apache Thrift, File Formats columnar, Understand the Data Structure–Understand the Data Structure complexity of, increasing, The Delicate Sound of a Combinatorial Explosion…–The Delicate Sound of a Combinatorial Explosion… CSV, Is It Just Me, or Does This Data Smell Funny?, Understand the Data Structure–Understand the Data Structure, Keyword PPC Example–Keyword PPC Example, Problem: Application-Specific Characters Leaking into Plain Text–Problem: Application-Specific Characters Leaking into Plain Text, File Formats ER model, A Relational Cost Allocations Model–A Relational Cost Allocations Model Google Protocol Buffers, File Formats graph model, The Hidden Network Emerges–Finding Value in Network Properties human-readable format, Data Intended for Human Consumption, Not Machine Consumption–Data Spread Across Multiple Files, The Arrangement of Data–The Arrangement of Data, Reading Data from an Awkward Format–Reading Data Spread Across Several Files limiting analysis, The Arrangement of Data–The Arrangement of Data reading with software, Reading Data from an Awkward Format–Reading Data Spread Across Several Files JSON, Is It Just Me, or Does This Data Smell Funny?

We’ll also need queries for all the cases in between if we continue with this design to cover assets allocated to departments and products allocated to cost centers. Each new level of the query adds to a combinatorial explosion and begs many questions about the design. What happens if we change the allocation rules? What if a product can be allocated directly to a cost center instead of passing through a department? Are the queries efficient as the amount of data in the system increases? Is the system testable? To make matters worse, a real-world allocation model would also contain many more entities and associative entities. The Delicate Sound of a Combinatorial Explosion… We’ve introduced the problem and sketched out a rudimentary solution in just a few pages, but imagine how a real system like this might evolve over an extended period of months or even years with a team of people involved.

–Will This Be on the Test? JavaScript Object Notation, Is It Just Me, or Does This Data Smell Funny? (see JSON) jellyfish library, Python, Text Processing with Python JSON (JavaScript Object Notation), Is It Just Me, or Does This Data Smell Funny?, Understand the Data Structure–Understand the Data Structure, File Formats, File Formats, File Formats K Koch snowflake, The Delicate Sound of a Combinatorial Explosion… L Laiacano, Adam (author), (Re)Organizing the Web’s Data–Conclusion Levy, Josh (author), Bad Data Lurking in Plain Text–Exercises Logistic Regression classifier, Polarized Language longitudinal datasets, Imputation Bias: General Issues, Other Sources of Bias M machine-learning experts, outsourcing, How to Feed and Care for Your Machine-Learning Experts–Conclusion (see also data scientist) manufacturing data example, Example 1: Defect Reduction in Manufacturing–Example 1: Defect Reduction in Manufacturing Maximum Entropy classifier, Polarized Language McCallum, Q.


pages: 396 words: 117,149

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World by Pedro Domingos

Albert Einstein, Amazon Mechanical Turk, Arthur Eddington, basic income, Bayesian statistics, Benoit Mandelbrot, bioinformatics, Black Swan, Brownian motion, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer vision, constrained optimization, correlation does not imply causation, creative destruction, crowdsourcing, Danny Hillis, data is the new oil, double helix, Douglas Hofstadter, Erik Brynjolfsson, experimental subject, Filter Bubble, future of work, global village, Google Glasses, Gödel, Escher, Bach, information retrieval, job automation, John Markoff, John Snow's cholera map, John von Neumann, Joseph Schumpeter, Kevin Kelly, lone genius, mandelbrot fractal, Mark Zuckerberg, Moneyball by Michael Lewis explains big data, Narrative Science, Nate Silver, natural language processing, Netflix Prize, Network effects, NP-complete, off grid, P = NP, PageRank, pattern recognition, phenotype, planetary scale, pre–internet, random walk, Ray Kurzweil, recommendation engine, Richard Feynman, scientific worldview, Second Machine Age, self-driving car, Silicon Valley, social intelligence, speech recognition, Stanford marshmallow experiment, statistical model, Stephen Hawking, Steven Levy, Steven Pinker, superintelligent machines, the scientific method, The Signal and the Noise by Nate Silver, theory of mind, Thomas Bayes, transaction costs, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, white flight, zero-sum game

When the number of things an algorithm needs to do grows exponentially with the size of its input, computer scientists call it a combinatorial explosion and run for cover. In machine learning, the number of possible instances of a concept is an exponential function of the number of attributes: if the attributes are Boolean, each new attribute doubles the number of possible instances by taking each previous instance and extending it with a yes or no for that attribute. In turn, the number of possible concepts is an exponential function of the number of possible instances: since a concept labels each instance as positive or negative, adding an instance doubles the number of possible concepts. As a result, the number of concepts is an exponential function of an exponential function of the number of attributes! In other words, machine learning is a combinatorial explosion of combinatorial explosions. Perhaps we should just give up and not waste our time on such a hopeless problem?
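
To make the double exponential concrete, a minimal Python sketch (exact integer arithmetic, tiny attribute counts only):

    # d Boolean attributes -> 2**d possible instances; a concept labels every
    # instance positive or negative, so there are 2**(2**d) possible concepts.
    for d in range(1, 6):
        instances = 2 ** d
        concepts = 2 ** instances
        print(f"{d} attributes: {instances} instances, {concepts} concepts")
    # At d = 5 there are already 2**32 (over four billion) possible concepts.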

It helps that, if the goal is to cure cancer, we don’t necessarily need to understand all the details of how tumor cells work, only enough to disable them without harming normal cells. In Chapter 6, we’ll also see how to orient learning toward the goal while steering clear of the things we don’t know and don’t need to know. More immediately, we know we can use inverse deduction to infer the structure of the cell’s networks from data and previous knowledge, but there’s a combinatorial explosion of ways to apply it, and we need a strategy. Since metabolic networks were designed by evolution, perhaps simulating it in our learning algorithms is the way to go. In the next chapter, we’ll see how to do just that. Deeper into the brain When backprop first hit the streets, connectionists had visions of quickly learning larger and larger networks until, hardware permitting, they amounted to artificial brains.

In the days before computers, a police artist could quickly put together a portrait of a suspect from eyewitness interviews by selecting a mouth from a set of paper strips depicting typical mouth shapes and doing the same for the eyes, nose, chin, and so on. With only ten building blocks and ten options for each, this system would allow for ten billion different faces, more than there are people on Earth. In machine learning, as elsewhere in computer science, there’s nothing better than getting such a combinatorial explosion to work for you instead of against you. What’s clever about genetic algorithms is that each string implicitly contains an exponential number of building blocks, known as schemas, and so the search is a lot more efficient than it seems. This is because every subset of the string’s bits is a schema, representing some potentially fit combination of properties, and a string has an exponential number of subsets.
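
Two quick counts behind this passage, as a small Python sketch (the 20-bit string length is an arbitrary illustration):

    # Police-artist kit: 10 building blocks with 10 options each.
    print(10 ** 10)   # 10,000,000,000 different faces

    # Genetic algorithms: every subset of a string's bit positions is a schema,
    # so a single L-bit string is a sample of 2**L schemas at once.
    L = 20
    print(2 ** L)     # 1,048,576 schemas implicitly carried by one 20-bit string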


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

Ada Lovelace, affirmative action, AI winter, Alfred Russel Wallace, Amazon Mechanical Turk, animal electricity, autonomous vehicles, Black Swan, British Empire, cellular automata, citizen journalism, Claude Shannon: information theory, combinatorial explosion, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, desegregation, discovery of DNA, Douglas Hofstadter, Elon Musk, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, low skilled workers, Mark Zuckerberg, mass immigration, meta analysis, meta-analysis, mutually assured destruction, natural language processing, new economy, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, women in the workforce

It’s easy to show that the number of routes possible is combinatorially explosive as a function of the number of cities n. As n gets bigger, the number of alternative routes explodes to n*(n-1)*(n-2) … 1, which mathematicians call n!, or ‘n factorial’. A factorial grows faster than any polynomial (geometric figure) of n. It expands faster than any square, any cube, any four-dimensional cube, etc., all the way up to massive geometric figures that we can’t even imagine. The TSP can be described in a single sentence, but is provably amongst the hardest computational problems possible, regardless of computational power, because there is no algorithm for simplifying the TSP such that a combinatorial explosion can be averted. There is no way to find the best route that is, in practice, better than looking through all of the (combinatorially explosive number of) possible routes.
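
A short Python sketch contrasting the factorial growth of route counts with a polynomial, for a few city counts (the comparison polynomial n**4 is an arbitrary choice for illustration):

    import math

    # Number of orderings of n cities versus a fixed polynomial of n.
    for n in (5, 10, 15, 20, 25):
        print(f"n = {n:2d}   n! = {float(math.factorial(n)):.2e}   n**4 = {n**4:,}")
    # By n = 25 there are ~1.6e25 orderings, while n**4 is still only 390,625.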

This is the ‘end’ in means–ends analysis, and the connecting roads are the ‘means’. Your current position is calculated (using signals from satellites), and it is also found in the graph. The satnav reasons quantitatively over distances, by exploring the roads spanning out from your location, to find the ‘best’ (shortest) means to the end (your desired destination). This is why means–ends analysis is often called reasoning as search. While not as combinatorially explosive as the TSP, the search for the best route is, in general, terrifically hard, particularly in terms of the usually large number of roads and intermediate locations between you and your destination. So rather than take until the sun burns out to find you a route, means–ends analysis in your satnav exploits heuristics to find you a route in reasonable time. ‘Heuristic’ is a word that describes a practical method (not guaranteed to be optimal or rational) for reaching a satisfactory solution when dealing with a complex problem.
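
As a concrete, toy-sized illustration of heuristic route search, here is a minimal A*-style sketch in Python; the road graph, distances, and straight-line estimates are invented for the example and are not from the book, and real satnavs use far more elaborate machinery:

    import heapq

    # A toy road graph and straight-line-distance heuristic (all made up).
    roads = {
        "A": [("B", 4), ("C", 2)],
        "B": [("D", 5)],
        "C": [("D", 8), ("E", 10)],
        "D": [("E", 2)],
        "E": [],
    }
    straight_line_to_E = {"A": 9, "B": 7, "C": 8, "D": 2, "E": 0}

    def a_star(start, goal):
        # Expand the node with the lowest cost-so-far plus heuristic estimate.
        frontier = [(straight_line_to_E[start], 0, start, [start])]
        best = {}
        while frontier:
            _, cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return cost, path
            if best.get(node, float("inf")) <= cost:
                continue
            best[node] = cost
            for nxt, d in roads[node]:
                heapq.heappush(frontier, (cost + d + straight_line_to_E[nxt],
                                          cost + d, nxt, path + [nxt]))
        return None

    print(a_star("A", "E"))  # (11, ['A', 'B', 'D', 'E'])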

An even worse scenario was if MYCIN was asked a question about something that at first appeared to be a blood infection, but was, in fact, something else (perhaps a poisoning). In that case the program could deliver an answer that was misleadingly, or even dangerously, wrong. On the face of it, the obvious solution to this brittleness would be to expand what an expert system ‘knows’: to give it more ‘atoms’, and more rules that indicate their relationships. But this is, of course, combinatorially explosive. However, Simon and Newell’s work suggests that arming an expert system with sufficient heuristics might overcome that problem. After all, human beings (some of whom aren’t even ‘experts’) often manage to come up with reasonable solutions even when faced with questions for which they don’t have complete domain knowledge. But the human atoms and heuristics that made these solutions possible proved hard to find.


pages: 574 words: 164,509

Superintelligence: Paths, Dangers, Strategies by Nick Bostrom

agricultural Revolution, AI winter, Albert Einstein, algorithmic trading, anthropic principle, anti-communist, artificial general intelligence, autonomous vehicles, barriers to entry, Bayesian statistics, bioinformatics, brain emulation, cloud computing, combinatorial explosion, computer vision, cosmological constant, dark matter, DARPA: Urban Challenge, data acquisition, delayed gratification, demographic transition, different worldview, Donald Knuth, Douglas Hofstadter, Drosophila, Elon Musk, en.wikipedia.org, endogenous growth, epigenetics, fear of failure, Flash crash, Flynn Effect, friendly AI, Gödel, Escher, Bach, income inequality, industrial robot, informal economy, information retrieval, interchangeable parts, iterative process, job automation, John Markoff, John von Neumann, knowledge worker, longitudinal study, Menlo Park, meta analysis, meta-analysis, mutually assured destruction, Nash equilibrium, Netflix Prize, new economy, Norbert Wiener, NP-complete, nuclear winter, optical character recognition, pattern recognition, performance metric, phenotype, prediction markets, price stability, principal–agent problem, race to the bottom, random walk, Ray Kurzweil, recommendation engine, reversible computing, social graph, speech recognition, Stanislav Petrov, statistical model, stem cell, Stephen Hawking, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, The Coming Technological Singularity, The Nature of the Firm, Thomas Kuhn: the structure of scientific revolutions, transaction costs, Turing machine, Vernor Vinge, Watson beat the top human players on Jeopardy!, World Values Survey, zero-sum game

One such early system, the Logic Theorist, was able to prove most of the theorems in the second chapter of Whitehead and Russell’s Principia Mathematica, and even came up with one proof that was much more elegant than the original, thereby debunking the notion that machines could “only think numerically” and showing that machines were also able to do deduction and to invent logical proofs.13 A follow-up program, the General Problem Solver, could in principle solve a wide range of formally specified problems.14 Programs that could solve calculus problems typical of first-year college courses, visual analogy problems of the type that appear in some IQ tests, and simple verbal algebra problems were also written.15 The Shakey robot (so named because of its tendency to tremble during operation) demonstrated how logical reasoning could be integrated with perception and used to plan and control physical activity.16 The ELIZA program showed how a computer could impersonate a Rogerian psychotherapist.17 In the mid-seventies, the program SHRDLU showed how a simulated robotic arm in a simulated world of geometric blocks could follow instructions and answer questions in English that were typed in by a user.18 In later decades, systems would be created that demonstrated that machines could compose music in the style of various classical composers, outperform junior doctors in certain clinical diagnostic tasks, drive cars autonomously, and make patentable inventions.19 There has even been an AI that cracked original jokes.20 (Not that its level of humor was high—“What do you get when you cross an optic with a mental object? An eye-dea”—but children reportedly found its puns consistently entertaining.) The methods that produced successes in the early demonstration systems often proved difficult to extend to a wider variety of problems or to harder problem instances. One reason for this is the “combinatorial explosion” of possibilities that must be explored by methods that rely on something like exhaustive search. Such methods work well for simple instances of a problem, but fail when things get a bit more complicated. For instance, to prove a theorem that has a 5-line long proof in a deduction system with one inference rule and 5 axioms, one could simply enumerate the 3,125 possible combinations and check each one to see if it delivers the intended conclusion.

But as the task becomes more difficult, the method of exhaustive search soon runs into trouble. Proving a theorem with a 50-line proof does not take ten times longer than proving a theorem that has a 5-line proof: rather, if one uses exhaustive search, it requires combing through 5^50 ≈ 8.9 × 10^34 possible sequences—which is computationally infeasible even with the fastest supercomputers. To overcome the combinatorial explosion, one needs algorithms that exploit structure in the target domain and take advantage of prior knowledge by using heuristic search, planning, and flexible abstract representations—capabilities that were poorly developed in the early AI systems. The performance of these early systems also suffered because of poor methods for handling uncertainty, reliance on brittle and ungrounded symbolic representations, data scarcity, and severe hardware limitations on memory capacity and processor speed.
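
The two counts are easy to reproduce; a minimal Python check:

    # One inference rule and 5 axioms: a k-line derivation enumerated naively
    # gives 5**k candidate sequences to check.
    print(5 ** 5)            # 3,125 for a 5-line proof
    print(f"{5 ** 50:.1e}")  # ~8.9e34 for a 50-line proof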

In practice, however, getting evolutionary methods to work well requires skill and ingenuity, particularly in devising a good representational format. Without an efficient way to encode candidate solutions (a genetic language that matches latent structure in the target domain), evolutionary search tends to meander endlessly in a vast search space or get stuck at a local optimum. Even if a good representational format is found, evolution is computationally demanding and is often defeated by the combinatorial explosion. Neural networks and genetic algorithms are examples of methods that stimulated excitement in the 1990s by appearing to offer alternatives to the stagnating GOFAI paradigm. But the intention here is not to sing the praises of these two methods or to elevate them above the many other techniques in machine learning. In fact, one of the major theoretical developments of the past twenty years has been a clearer realization of how superficially disparate techniques can be understood as special cases within a common mathematical framework.


When Computers Can Think: The Artificial Intelligence Singularity by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann, Michelle Estes

3D printing, AI winter, anthropic principle, artificial general intelligence, Asilomar, augmented reality, Automated Insights, autonomous vehicles, availability heuristic, blue-collar work, brain emulation, call centre, cognitive bias, combinatorial explosion, computer vision, create, read, update, delete, cuban missile crisis, David Attenborough, Elon Musk, en.wikipedia.org, epigenetics, Ernest Rutherford, factory automation, feminist movement, finite state, Flynn Effect, friendly AI, general-purpose programming language, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, industrial robot, Isaac Newton, job automation, John von Neumann, Law of Accelerating Returns, license plate recognition, Mahatma Gandhi, mandelbrot fractal, natural language processing, Parkinson's law, patent troll, patient HM, pattern recognition, phenotype, ransomware, Ray Kurzweil, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, sorting algorithm, speech recognition, statistical model, stem cell, Stephen Hawking, Stuxnet, superintelligent machines, technological singularity, Thomas Malthus, Turing machine, Turing test, uranium enrichment, Von Neumann architecture, Watson beat the top human players on Jeopardy!, wikimedia commons, zero day

But looking ahead 20 half moves produces 100,000,000,000,000,000,000 combinations, which is ridiculously large. The result of this is that chess programs that can just look ahead a few moves can play a passable game. A chess program that looked ahead 20 half moves would be unbeatable. But combinatorial explosion makes that impossible. The problem is known as having exponential complexity, because the number of cases grows as a power of the problem size: in this case 10^n, where n is the number of half moves to look ahead. Many other problems are like that, and several very promising early results in artificial intelligence failed to scale to more realistic problems due to the resulting combinatorial explosion. This does not mean that problems cannot be solved. It just means that the naive brute-force application of simplistic algorithms cannot easily solve the world’s problems. Many techniques have been developed to improve the performance of algorithms and avoid or at least delay exponential complexity.
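
A one-loop Python sketch of the growth described here, assuming roughly ten plausible moves per half move:

    # ~10 plausible moves per half move -> about 10**n positions for n half moves.
    for n in (2, 3, 6, 20):
        print(f"{n:2d} half moves: {10 ** n:,} positions")
    # 20 half moves: 100,000,000,000,000,000,000 positions, as in the text.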

Moore's law, transistors 4. Core and disk storage 5. Limits to growth 6. Long term growth 7. Human intelligence now minimal for AGI 8. Definitions of singularity 4. Hollywood and HAL 2001 1. Anthropomorphic zap gun vs. virus 2. The two HAL's 3. HAL dialog 5. The Case Against Machine Intelligence 1. Turing halting problem 2. Gödel's incompleteness theorem 3. Incompleteness argument against general AGI 4. Combinatorial explosion 5. Chinese room 6. Simulated vs. real intelligence 7. Emperors new mind 8. Intentionality 9. Brain in a vat 10. Understanding the brain 11. Consciousness and the soul 12. Only what was programmed 13. What computers can't do 14. Over-hyped technologies 15. Nonlinear difficulty, chimpanzees 16. End of Moore's law 17. Bootstrap fallacy 18. Recursive self-improvement 19. Limited Self-improvement 20.

Alan Turing himself did not consider these issues to be relevant. Indeed, in 1950 Turing wrote a landmark paper, “Computing Machinery and Intelligence”, in which he discussed the proposition that computers will be able to really think. In the paper he addressed nine objections to the proposition, and specifically addressed the irrelevance of the halting problem and the incompleteness theorem to this question.
Combinatorial explosion
Many problems in artificial intelligence involve searching for a solution out of a large number of possibilities. For example, suppose a chess program considers ten plausible moves that it might make, then for each of those moves it considers ten moves its opponent might make. That would make a total of 100 moves it needs to consider. If it then considers what response it might make to those 100 moves, that would produce 1,000 combinations to explore.


pages: 509 words: 92,141

The Pragmatic Programmer by Andrew Hunt, Dave Thomas

A Pattern Language, Broken windows theory, business process, buy low sell high, c2.com, combinatorial explosion, continuous integration, database schema, domain-specific language, don't repeat yourself, Donald Knuth, general-purpose programming language, George Santayana, Grace Hopper, if you see hoof prints, think horses—not zebras, index card, lateral thinking, loose coupling, Menlo Park, MVC pattern, premature optimization, Ralph Waldo Emerson, revision control, Schrödinger's Cat, slashdot, sorting algorithm, speech recognition, traveling salesman, urban decay, Y2K

Rather than digging through a hierarchy yourself, just ask for what you need directly: We added a method to Selection to get the time zone on our behalf: the plotting routine doesn't care whether the time zone comes from the Recorder directly, from some contained object within Recorder, or whether Selection makes up a different time zone entirely. The selection routine, in turn, should probably just ask the recorder for its time zone, leaving it up to the recorder to get it from its contained Location object. Traversing relationships between objects directly can quickly lead to a combinatorial explosion[1] of dependency relationships. You can see symptoms of this phenomenon in a number of ways: large C or C++ projects where the command to link a unit test is longer than the test program itself; "simple" changes to one module that propagate through unrelated modules in the system; developers who are afraid to change code because they aren't sure what might be affected. Systems with many unnecessary dependencies are very hard (and expensive) to maintain, and tend to be highly unstable. [1] If n objects all know about each other, then a change to just one object can result in the other n – 1 objects needing changes.
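
A hedged Python sketch of the same idea; the Selection, Recorder, and Location classes below are stand-ins for the book's example, not its actual code:

    # Illustrative names only, modelled on the book's example objects.
    class Location:
        def __init__(self, time_zone):
            self.time_zone = time_zone

    class Recorder:
        def __init__(self, location):
            self.location = location

    class Selection:
        def __init__(self, recorder):
            self._recorder = recorder

        def time_zone(self):
            # The caller neither knows nor cares where the zone really lives.
            return self._recorder.location.time_zone

    selection = Selection(Recorder(Location("UTC-5")))

    # Coupled: the plotting code would know the whole chain of containment,
    # e.g. selection._recorder.location.time_zone
    # Decoupled: just ask for what you need.
    print(selection.time_zone())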

Anyone can ask a witness questions in the pursuit of the case, post the transcript, and move that witness to another area of the blackboard, where he might respond differently (if you allow the witness to read the blackboard too). A big advantage of systems such as these is that you have a single, consistent interface to the blackboard. When building a conventional distributed application, you can spend a great deal of time crafting unique API calls for every distributed transaction and interaction in the system. With the combinatorial explosion of interfaces and interactions, the project can quickly become a nightmare. Organizing Your Blackboard When the detectives work on large cases, the blackboard may become cluttered, and it may become difficult to locate data on the board. The solution is to partition the blackboard and start to organize the data on the blackboard somehow. Different software systems handle this partitioning in different ways; some use fairly flat zones or interest groups, while others adopt a more hierarchical treelike structure.
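
A minimal sketch of that single, consistent interface in Python; the class and topic names are purely illustrative:

    # A toy blackboard: every participant shares one post/read interface
    # instead of bespoke point-to-point APIs.
    class Blackboard:
        def __init__(self):
            self._facts = []

        def post(self, topic, fact):
            self._facts.append((topic, fact))

        def read(self, topic):
            return [fact for t, fact in self._facts if t == topic]

    board = Blackboard()
    board.post("witness:jones", "heard a car at midnight")
    board.post("forensics", "tire tracks match a sedan")
    print(board.read("witness:jones"))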

Index A Accessor function, 31 ACM, see Association for Computing Machinery Active code generator, 104 Activity diagram, 150 Advanced C++ Programming Styles and Idioms, 265 Advanced Programming in the Unix Environment, 264 Aegis transaction-based configuration management, 246, 271 Agent, 76, 117, 297 Algorithm binary chop, 180 choosing, 182 combinatoric, 180 divide-and-conquer, 180 estimating, 177, 178 linear, 177 O() notation, 178, 181 quicksort, 180 runtime, 181 sublinear, 177 Allocations, nesting, 131 Analysis Patterns, 264 Anonymity, 258 AOP, see Aspect-Oriented Programming Architecture deployment, 156 flexibility, 46 prototyping, 55 temporal decoupling, 152 Art of Computer Programming, 183 Artificial intelligence, marauding, 26 Aspect-Oriented Programming (AOP), 39, 273 Assertion, 113, 122, 175 side effects, 124 turning off, 123 Association for Computing Machinery (ACM), 262 Communications of the ACM, 263 SIGPLAN, 263 Assumptions, testing, 175 “at” command, 231 Audience, 21 needs, 19 auto_ptr, 134 Automation, 230 approval procedures, 235 build, 88, 233 compiling, 232 cron, 231 documentation, 251 scripts, 234 team, 229 testing, 29, 238 Web site generation, 235 awk, 99 B Backus-Naur Form (BNF), 59n Base class, 112 bash shell, 80, 82n Bean, see Enterprise Java Beans (EJB) Beck, Kent, 194, 258 Beowulf project, 268 “Big O” notation, 177 “Big picture”, 8 Binary chop, 97, 180 Binary format, 73 problems parsing, 75 bison, 59, 269 BIST, see Built-In Self Test Blackboard system, 165 partitioning, 168 workflow, 169 Blender example contract for, 119, 289 regression test jig, 305 workflow, 151 BNF, see Backus-Naur Form (BNF) Boiled frog, 8, 175, 225 Boundary condition, 173, 243 Brain, Marshall, 265 Branding, 226 Brant, John, 268 “Broken Window Theory”, 5 vs. 
stone soup, 9 Brooks, Fred, 264 Browser, class, 187 Browser, refactoring, 187, 268 Bug, 90 failed contract as, 111 see also Debugging; Error Build automation, 88, 233 dependencies, 233 final, 234 nightly, 231 refactoring, 187 Built-In Self Test (BIST), 189 Business logic, 146 Business policy, 203 C C language assertions, 122 DBC, 114 duplication, 29 error handling, 121 error messages, 115 macros, 121 Object Pascal interface, 101 C++ language, 46 assertions, 122 auto_ptr, 134 books, 265 DBC, 114 decoupling, 142 DOC++, 251, 269 duplication, 29 error messages, 115 exceptions, 132 unit tests, 193 Caching, 31 Call, routine, 115, 173 Cascading Style Sheets (CSS), 253 Cat blaming, 3 herding, 224 Schrödinger’s, 47 Catalyzing change, 8 Cathedrals, xx Cetus links, 265 Change, catalyzing, 8 Christiansen, Tom, 81 Class assertions, 113 base, 112 coupling, 139, 142 coupling ratios, 242 encapsulating resource, 132 invariant, 110, 113 number of states, 245 resource allocation, 132 subclass, 112 wrapper, 132, 133, 135, 141 Class browser, 187 ClearCase, 271 Cockburn, Alistair, xxiii, 205, 264, 272 Code generator, 28, 102 active, 104 makefiles, 232 parsers, 105 passive, 103 Code profiler, 182 Code reviews, 33, 236 Coding algorithm speed, 177 comments, 29, 249 coupled, 130 coverage analysis, 245 database schema, 104 defensive, 107 and documentation, 29, 248 estimating, 68 exceptions, 125 implementation, 173 iterative, 69 “lazy”, 111 metrics, 242 modules, 138 multiple representations, 28 orthogonality, 34, 36, 40 ownership, 258 prototypes, 55 server code, 196 “shy”, 40, 138 specifications, 219 tracer bullets, 49–51 unit testing, 190, 192 see also Coupled code; Decoupled code; Metadata; Source code control system (SCCS) Cohesion, 35 COM, see Component Object Model Combinatorial explosion, 140, 167 Combinatoric algorithm, 180 Command shell, 77 bash, 80 Cygwin, 80 vs. GUI, 78 UWIN, 81 Windows, 80 Comment, 29, 249 avoiding duplication, 29 DBC, 113 parameters, 250 types of, 249 unnecessary, 250 see also Documentation Common Object Request Broker (CORBA), 29, 39, 46 Event Service, 160 Communicating, 18 audience, 19, 21 duplication, 32 e-mail, 22 and formal methods, 221 presentation, 20 style, 20 teams, 225 users, 256 writing, 18 Communications of the ACM, 263 Comp.object FAQ, 272 Compiling, 232 compilers, 267 DBC, 113 warnings and debugging, 92 Component Object Model (COM), 55 Component-based systems, see Modular system Concurrency, 150 design, 154 interfaces, 155 and Programming by Coincidence, 154 requirements analysis of, 150 workflow, 150 Concurrent Version System (CVS), 271 Configuration cooperative, 148 dynamic, 144 metadata, 147 Configuration management, 86, 271 Constantine, Larry L., 35 Constraint management, 213 Constructor, 132 initialization, 155 Contact, authors’ e-mail, xxiii Context, use instead of globals, 40 Contract, 109, 174 see also Design by contract (DBC) Controller (MVC), 162 Coplien, Jim, 265 CORBA, see Common Object Request Broker Coupled code, 130 coupling ratios, 242 minimizing, 138, 158 performance, 142 temporal coupling, 150 see also Decoupled code Coverage analysis, 245 Cox, Brad J., 189n Crash, 120 Critical thinking, 16 cron, 231 CSS, see Cascading Style Sheets CVS, see Concurrent Version System Cygwin, 80, 270 D Data blackboard system, 169 caching, 31 dictionary, 144 dynamic data structures, 135 global, 40 language, 60 normalizing, 30 readable vs. 
understandable, 75 test, 100, 243 views, 160 visualizing, 93 see also Metadata Data Display Debugger (DDD), 93, 268 Database active code generator, 104 schema, 105f, 141, 144 schema maintenance, 100 DBC, see Design by contract DDD, see Data Display Debugger Deadline, 6, 246 Deadlock, 131 Debugging, 90 assertions, 123 binary search, 97 bug location, 96 bug reproduction, 93 checklist, 98 compiler warnings and, 92 corrupt variables, 95 “Heisenbug”, 124 rubber ducking, 95 and source code branching, 87 surprise bug, 97 and testing, 92, 195 time bomb, 192 tracing, 94 view, 164 visualizing data, 93 Decision making, 46 Decoupled code, 38, 40 architecture, 152 blackboard system, 166 Law of Demeter, 140 metadata, 145 minimizing coupling, 138 modular testing, 244 physical decoupling, 142 temporal coupling, 150 workflow, 150 see also Coupled code Defensive coding, 107 Delegation, 304 Delphi, 55 see also Object Pascal Demeter project, 274 Demeter, Law of, 140 Dependency, reducing, see Modular system; Orthogonality Deployment, 156 Deployment descriptor, 148 Design accessor functions, 31 concurrency, 154 context, 174 deployment, 156 design/methodology testing, 242 metadata, 145 orthogonality, 34, 37 physical, 142 refactoring, 186 using services, 154 Design by contract (DBC), 109, 155 and agents, 117 assertions, 113 class invariant, 110 as comments, 113 dynamic contracts, 117 iContract, 268 language support, 114 list insertion example, 110 pre- and postcondition, 110, 113, 114 predicates, 110 unit testing, 190 Design Patterns, 264 observer, 158 singleton, 41 strategy, 41 Destructor, 132 Detectives, 165 Development tree, 87 Development, iterative, 69 Divide-and-conquer algorithm, 180 DOC++ documentation generator, 251, 269 DocBook, 254 Documentation automatic updating, 251 and code, 29, 248 comments, 29, 113, 249, 251 executable, 251 formats, 253 HTML, 101 hypertext, 210 internal/external, 248 invariant, 117 mark-up languages, 254 orthogonality, 42 outline, 18 requirements, 204 technical writers, 252 word processors, 252, 254 writing specifications, 218 see also Comment; Web documentation Dodo, 148 Domain, problem, 58, 66 Don’t repeat yourself, see DRY principle Downloading source code, see Example code Dr.


pages: 313 words: 91,098

The Knowledge Illusion by Steven Sloman

Affordable Care Act / Obamacare, Air France Flight 447, attribution theory, bitcoin, Black Swan, Cass Sunstein, combinatorial explosion, computer age, crowdsourcing, Dmitri Mendeleev, Elon Musk, Ethereum, Flynn Effect, Hernando de Soto, hindsight bias, hive mind, indoor plumbing, Isaac Newton, John von Neumann, libertarian paternalism, Mahatma Gandhi, Mark Zuckerberg, meta analysis, meta-analysis, obamacare, prediction markets, randomized controlled trial, Ray Kurzweil, Richard Feynman, Richard Thaler, Rodney Brooks, Rosa Parks, single-payer health, speech recognition, stem cell, Stephen Hawking, Steve Jobs, technological singularity, The Coming Technological Singularity, The Wisdom of Crowds, Vernor Vinge, web application, Whole Earth Review, Y Combinator

See also artificial intelligence (AI); thought body-brain cooperation in cognitive processing, 101–05 collective mind, 5–6 CRT (Cognitive Reflection Test), 80–84 division of cognitive labor, 14, 109–11, 120–21, 128–29 illusion of understanding, 8, 15 individual limitations, 4–5, 15 Landauer, Thomas, 24–26 the mind compared to a computer, 24–27 moving text window example, 93–95 origins of, 4 Turing, Alan, 25 collaboration, 14, 109–11, 115–18, 121–22, 149–50, 226 collective intelligence hypothesis, 209–10 combinatorial explosion, 34 “The Coming Technological Singularity” (Vinge), 132 communal learning Brown, Ann, 228–30 Fostering Communities of Learners program, 228–30 group/regroup strategy, 228–30 jigsaw method, 229–30 communication language, 113–14 non-verbal, 114–15, 117 community development of, 112–13 intelligence, 259 responsibility, 259–61 community of knowledge, 80, 200, 206–14, 221, 223–27, 241–42 compatibility of different group members’ knowledge, 126 complexity airplane example, 28 beehive example, 107–08, 113–14 car example, 28 chaos theory, 34–35 class reunion example, 31 combinatorial explosion, 34 fractals, 33–34 hairpin example, 34 of the human brain, 29–30 military strategy, 32–33 in the natural world, 29–31 of politics, 16 recognizing, 35 reducing, 250 of technology, 134–35 weather prediction, 30–31 comprehension illusion of, 217–18 inverted text example, 217 Pledge of Allegiance example, 217–18 “Purple Haze” example, 218 computer checkers example of testing intelligence of a team, 210–11 “Computing Machinery and Intelligence” (Turing), 25 consequences of tiny changes.

To fully understand a hairpin would entail understanding all the uses and potential uses of a hairpin: the various materials it is made of, where each material comes from, how each material is used to manufacture hairpins, where hairpins are sold, and who buys them. And to fully appreciate the answer to each of these questions would require understanding the answer to a number of other questions. Fully understanding who buys hairpins would require an analysis of hairstyles, which in turn would require understanding fashion and its underlying social structure. Computer scientists refer to this problem of ever-growing information needs as combinatorial explosion. To achieve complete understanding necessitates understanding increasingly more and more, and the combination of everything you need to understand to achieve complete understanding quickly becomes more than you can bear without, well, exploding. Chaos theory is another mathematical tool that shows that the complexity of the world is too much to handle. In a chaotic system, tiny differences at the beginning of a process can lead to massive differences down the road.


pages: 174 words: 56,405

Machine Translation by Thierry Poibeau

AltaVista, augmented reality, call centre, Claude Shannon: information theory, cloud computing, combinatorial explosion, crowdsourcing, easy for humans, difficult for computers, en.wikipedia.org, Google Glasses, information retrieval, Internet of things, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, natural language processing, Necker cube, Norbert Wiener, RAND corporation, Robert Mercer, Skype, speech recognition, statistical model, technological singularity, Turing test, wikimedia commons

It appeals to a more general principle of learning multiple levels of composition, which can be applied in machine learning frameworks that are not necessarily neurally inspired.” This approach has received extensive press coverage. This was particularly the case in March 2016, when Google DeepMind’s system AlphaGo—based on deep learning—beat the world champion in the game of Go. This approach is especially efficient in complex environments such as Go, where it is impossible to systematically explore all the possible combinations due to combinatorial explosion (i.e., there are very quickly too many possibilities to be able to explore all of them systematically). The complexity of human languages is somewhat different: the overall meaning of a sentence or of a text is based on ambiguous words, with no clear-cut boundaries between word senses, and all in relation to one another. Moreover, word senses do not directly correspond across different languages, and the same notion can be expressed by a single word or by a group of words, depending on the context and language considered.

See Mobile phone Centre d’Études sur la Traduction Automatique (CETA), 67–68, 84 Centre National de la Recherche Scientifique (CNRS), 67 Chandioux, John, 87 Child language acquisition, 255 China, 67, 86 Chinese, 56, 88, 163–165, 192, 209, 215, 228, 232, 250 Chomsky, Noam, 63, 65 Church, Kenneth, 105 Co-construction of meaning, 20 Cognate, 11, 107–108, 261 Cognitive plausibility, 20, 23, 178, 181–184, 187, 251–256 Cognitive sciences, 2. See also Cognitive plausibility Cold War, 49, 60 Colmerauer, Alain, 84 Combinatorial explosion, 182 Communication network, 249. See also Social network Compendium of translation software, 229 Complexity (linguistic), 18, 23, 182, 195, 255 Compound words, 15, 23, 33, 46, 164–165, 214, 261 Comprehension evaluation. See Evaluation measure and test Computational linguistics, 15, 36, 37, 68, 82–84 Computation time, 54, 149, 155, 170, Computer documentation, 119 Confidential data 230–231.


pages: 696 words: 143,736

The Age of Spiritual Machines: When Computers Exceed Human Intelligence by Ray Kurzweil

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Buckminster Fuller, call centre, cellular automata, combinatorial explosion, complexity theory, computer age, computer vision, cosmological constant, cosmological principle, Danny Hillis, double helix, Douglas Hofstadter, Everything should be made as simple as possible, first square of the chessboard / second half of the chessboard, fudge factor, George Gilder, Gödel, Escher, Bach, I think there is a world market for maybe five computers, information retrieval, invention of movable type, Isaac Newton, iterative process, Jacquard loom, John Markoff, John von Neumann, Lao Tzu, Law of Accelerating Returns, mandelbrot fractal, Marshall McLuhan, Menlo Park, natural language processing, Norbert Wiener, optical character recognition, ought to be enough for anybody, pattern recognition, phenotype, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Robert Metcalfe, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, social intelligence, speech recognition, Steven Pinker, Stewart Brand, stochastic process, technological singularity, Ted Kaczynski, telepresence, the medium is the message, There's no reason for any individual to have a computer in his home - Ken Olsen, traveling salesman, Turing machine, Turing test, Whole Earth Review, Y2K

He was probably playing ball with one of his sons. He saw the ball rolling on a curved surface ... AND CONCLUDED—EUREKA—SPACE IS CURVED! CHAPTER FIVE CONTEXT AND KNOWLEDGE PUTTING IT ALL TOGETHER So how well have we done? Many apparently difficult problems do yield to the application of a few simple formulas. The recursive formula is a master at analyzing problems that display inherent combinatorial explosion, ranging from the playing of board games to proving mathematical theorems. Neural nets and related self-organizing paradigms emulate our pattern-recognition faculties, and do a fine job of discerning such diverse phenomena as human speech, letter shapes, visual objects, faces, fingerprints, and land terrain images. Evolutionary algorithms are effective at analyzing complex problems, ranging from making financial investment decisions to optimizing industrial processes, in which the number of variables is too great for precise analytic solutions.
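
Kurzweil’s “recursive formula” for board games is essentially a recursive minimax search; a bare-bones, game-agnostic Python sketch (the toy game at the end is an invented stand-in):

    # A minimal recursive minimax over an abstract game; the three callables
    # (moves, apply_move, evaluate) are placeholders a real game would supply.
    def minimax(state, depth, maximizing, moves, apply_move, evaluate):
        if depth == 0 or not moves(state):
            return evaluate(state)
        scores = (minimax(apply_move(state, m), depth - 1, not maximizing,
                          moves, apply_move, evaluate)
                  for m in moves(state))
        return max(scores) if maximizing else min(scores)

    # Toy usage: the "state" is a number, a move adds or subtracts 1,
    # and the evaluation is simply the number itself.
    print(minimax(0, 3, True,
                  moves=lambda s: [+1, -1],
                  apply_move=lambda s, m: s + m,
                  evaluate=lambda s: s))   # -> 1 with best play by both sides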

With little fingers and computation, nanomachines would have in their Lilliputian world what people have in the big world: intelligence and the ability to manipulate their environment. Then these little machines could build replicas of themselves, achieving the field’s key objective. The reason that self-replication is important is that it is too expensive to build these tiny machines one at a time. To be effective, nanometer-sized machines need to come in the trillions. The only way to achieve this economically is through combinatorial explosion: let the machines build themselves. Drexler, Merkle (a coinventor of public key encryption, the primary method of encrypting messages), and others have convincingly described how such a self-replicating nanorobot—nanobot—could be constructed. The trick is to provide the nanobot with sufficiently flexible manipulators—arms and hands—so that it is capable of building a copy of itself. It needs some means for mobility so that it can find the requisite raw materials.

I do this not to belabor the issue of chess playing, but rather because it illustrates a clear contrast. Raj Reddy, Carnegie Mellon University’s AI guru, cites studies of chess as playing the same role in artificial intelligence that studies of E. coli play in biology: an ideal laboratory for studying fundamental questions.5 Computers use their extreme speed to analyze the vast combinations created by the combinatorial explosion of moves and countermoves. While chess programs may use a few other tricks (such as storing the openings of all master chess games in this century and precomputing endgames), they essentially rely on their combination of speed and precision. In comparison, humans, even chess masters, are extremely slow and imprecise. So we precompute all of our chess moves. That’s why it takes so long to become a chess master, or the master of any pursuit.


Hands-On Machine Learning With Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems by Aurelien Geron

Amazon Mechanical Turk, Bayesian statistics, centre right, combinatorial explosion, constrained optimization, correlation coefficient, crowdsourcing, en.wikipedia.org, iterative process, Netflix Prize, NP-complete, optical character recognition, P = NP, p-value, pattern recognition, performance metric, recommendation engine, self-driving car, SpamAssassin, speech recognition, statistical model

This is made possible by the fact that PolynomialFeatures also adds all combinations of features up to the given degree. For example, if there were two features a and b, PolynomialFeatures with degree=3 would not only add the features a^2, a^3, b^2, and b^3, but also the combinations ab, a^2b, and ab^2. Warning PolynomialFeatures(degree=d) transforms an array containing n features into an array containing (n + d)! / (d! n!) features, where n! is the factorial of n, equal to 1 × 2 × 3 × ⋯ × n. Beware of the combinatorial explosion of the number of features! Learning Curves If you perform high-degree Polynomial Regression, you will likely fit the training data much better than with plain Linear Regression. For example, Figure 4-14 applies a 300-degree polynomial model to the preceding training data, and compares the result with a pure linear model and a quadratic model (2nd-degree polynomial). Notice how the 300-degree polynomial model wiggles around to get as close as possible to the training instances.
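
A quick way to watch this explosion, assuming scikit-learn is available (the counts below include the bias column that PolynomialFeatures adds by default):

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures

    X = np.random.rand(5, 10)        # 5 samples, n = 10 features
    for degree in (1, 2, 3, 5):
        poly = PolynomialFeatures(degree=degree)
        poly.fit(X)
        print(degree, poly.n_output_features_)
    # n = 10: degree 1 -> 11, degree 2 -> 66, degree 3 -> 286, degree 5 -> 3003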

Linear SVM classifier using polynomial features Polynomial Kernel Adding polynomial features is simple to implement and can work great with all sorts of Machine Learning algorithms (not just SVMs), but at a low polynomial degree it cannot deal with very complex datasets, and with a high polynomial degree it creates a huge number of features, making the model too slow. Fortunately, when using SVMs you can apply an almost miraculous mathematical technique called the kernel trick (it is explained in a moment). It makes it possible to get the same result as if you added many polynomial features, even with very high-degree polynomials, without actually having to add them. So there is no combinatorial explosion of the number of features since you don’t actually add any features. This trick is implemented by the SVC class. Let’s test it on the moons dataset:

    from sklearn.svm import SVC

    poly_kernel_svm_clf = Pipeline([
        ("scaler", StandardScaler()),
        ("svm_clf", SVC(kernel="poly", degree=3, coef0=1, C=5))
    ])
    poly_kernel_svm_clf.fit(X, y)

This code trains an SVM classifier using a 3rd-degree polynomial kernel.


pages: 721 words: 197,134

Data Mining: Concepts, Models, Methods, and Algorithms by Mehmed Kantardzić

Albert Einstein, bioinformatics, business cycle, business intelligence, business process, butter production in bangladesh, combinatorial explosion, computer vision, conceptual framework, correlation coefficient, correlation does not imply causation, data acquisition, discrete time, El Camino Real, fault tolerance, finite state, Gini coefficient, information retrieval, Internet Archive, inventory management, iterative process, knowledge worker, linked data, loose coupling, Menlo Park, natural language processing, Netflix Prize, NP-complete, PageRank, pattern recognition, peer-to-peer, phenotype, random walk, RFID, semantic web, speech recognition, statistical model, Telecommunications Act of 1996, telemarketer, text mining, traveling salesman, web application

In other words, we suppose that these values do not have any influence on the final data-mining results. In that case, a sample with the missing value may be extended to the set of artificial samples, where, for each new sample, the missing value is replaced with one of the possible feature values of a given domain. Although this interpretation may look more natural, the problem with this approach is the combinatorial explosion of artificial samples. For example, if one 3-D sample X is given as X = {1, ?, 3}, where the second feature’s value is missing, the process will generate five artificial samples for the feature domain [0, 1, 2, 3, 4]. Finally, the data miner can generate a predictive model to predict each of the missing values. For example, if three features A, B, and C are given for each sample, then based on samples that have all three values as a training set, the data miner can generate a model of correlation between features.
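
A minimal Python sketch of this expansion; the None marker for the missing value and the explicit domain list are assumptions for illustration:

    # X = {1, ?, 3}, with the missing second feature drawn from the domain [0..4].
    sample = [1, None, 3]
    domain = [0, 1, 2, 3, 4]

    artificial = [[v if x is None else x for x in sample] for v in domain]
    for s in artificial:
        print(s)
    # Five artificial samples: [1, 0, 3], [1, 1, 3], [1, 2, 3], [1, 3, 3], [1, 4, 3]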

Ideally, we would like to choose a test at each stage of sample-set splitting so that the final tree is small. Since we are looking for a compact decision tree that is consistent with the training set, why not explore all possible trees and select the simplest? Unfortunately, the problem of finding the smallest decision tree consistent with a training data set is NP-complete. Enumeration and analysis of all possible trees will cause a combinatorial explosion for any real-world problem. For example, for a small database with five attributes and only 20 training examples, the possible number of decision trees is greater than 10^6, depending on the number of different values for every attribute. Therefore, most decision tree-construction methods are non-backtracking, greedy algorithms. Once a test has been selected using some heuristics to maximize the measure of progress and the current set of training cases has been partitioned, the consequences of alternative choices are not explored.

The increasing prominence of data streams arising in a wide range of advanced applications such as fraud detection and trend learning has led to the study of online mining of frequent itemsets. Unlike mining static databases, mining data streams poses many new challenges. In addition to the one-scan nature, the unbounded memory requirement and the high data arrival rate of data streams, the combinatorial explosion of itemsets exacerbates the mining task. The high complexity of the frequent itemset mining problem hinders the application of the stream-mining techniques. We recognize that a critical review of existing techniques is needed in order to design and develop efficient mining algorithms and data structures that are able to match the processing rate of the mining with the high arrival rate of data streams.
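
The underlying count is simple to state: n distinct items admit 2^n - 1 non-empty candidate itemsets, which a short Python loop makes vivid:

    # n items -> 2**n - 1 non-empty candidate itemsets.
    for n in (10, 20, 50, 100):
        print(f"{n:3d} items: {float(2 ** n - 1):.2e} candidate itemsets")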


pages: 72 words: 21,361

Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy by Erik Brynjolfsson

"Robert Solow", Amazon Mechanical Turk, Any sufficiently advanced technology is indistinguishable from magic, autonomous vehicles, business cycle, business process, call centre, combinatorial explosion, corporate governance, creative destruction, crowdsourcing, David Ricardo: comparative advantage, easy for humans, difficult for computers, Erik Brynjolfsson, factory automation, first square of the chessboard, first square of the chessboard / second half of the chessboard, Frank Levy and Richard Murnane: The New Division of Labor, hiring and firing, income inequality, intangible asset, job automation, John Markoff, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Khan Academy, Kickstarter, knowledge worker, Loebner Prize, low skilled workers, minimum wage unemployment, patent troll, pattern recognition, Paul Samuelson, Ray Kurzweil, rising living standards, Robert Gordon, self-driving car, shareholder value, Skype, too big to fail, Turing test, Tyler Cowen: Great Stagnation, Watson beat the top human players on Jeopardy!, wealth creators, winner-take-all economy, zero-sum game

Here’s a simple proof: suppose the people in a small company write down their work tasks— one task per card. If there were only 52 tasks in the company, as many as in a standard deck of cards, then there would be 52! different ways to arrange these tasks.8 This is far more than the number of grains of rice on the second 32 squares of a chessboard or even a second or third full chessboard. Combinatorial explosion is one of the few mathematical functions that outgrows an exponential trend. And that means that combinatorial innovation is the best way for human ingenuity to stay in the race with Moore’s Law. Most of the combinations may be no better than what we already have, but some surely will be, and a few will be “home runs” that are vast improvements. The trick is finding the ones that make a positive difference.
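The arithmetic behind the comparison, as a rough check; the chessboard figures assume one grain on the first square, doubling on each square thereafter:

    from math import factorial

    task_orderings = factorial(52)          # ways to arrange 52 work tasks
    second_half = 2 ** 64 - 2 ** 32         # grains on squares 33-64 of one chessboard
    three_boards = 2 ** 192 - 1             # grains on three full chessboards
    print(f"{float(task_orderings):.2e}")   # ~8.07e67
    print(f"{float(second_half):.2e}")      # ~1.84e19
    print(f"{float(three_boards):.2e}")     # ~6.28e57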


pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, animal electricity, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, British Empire, business process, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, complexity theory, continuous integration, Conway's Game of Life, cosmological principle, dark matter, dematerialisation, double helix, Douglas Hofstadter, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, income inequality, index card, industrial robot, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, Kickstarter, liberal capitalism, lifelogging, millennium bug, Moravec's paradox, natural language processing, Norbert Wiener, off grid, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, post-industrial society, prediction markets, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, social intelligence, speech recognition, stem cell, Stephen Hawking, Steven Pinker, strong AI, technological singularity, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

This meant that the machine ought to be able to solve any problem using first principles and experience derived from learning. Early general problem-solving models were built, but could not scale up. Systems could solve one general problem but not any general problem.6 Algorithms that searched data in order to make general inferences failed quickly because of something called ‘combinatorial explosion’: there were simply too many interrelated parameters and variables to calculate after a number of steps. An approach called ‘heuristics’ tried to solve the combinatorial explosion problem by ‘pruning’ branches off the tree of the search executed by any given algorithm; but even this was shown to be of limited value. In the end, AI researchers came to realise that problems such as the recognition of faces or objects required ‘common sense’ reasoning, which was fiendishly difficult to code.


pages: 196 words: 58,122

AngularJS by Brad Green, Shyam Seshadri

combinatorial explosion, continuous integration, Firefox, Google Chrome, Kickstarter, MVC pattern, node package manager, single page application, web application, WebSocket

End-to-End/Integration Tests As applications grow (and they tend to, really fast, before you even realize it), testing whether they work as intended manually just doesn’t cut it anymore. After all, every time you add a new feature, you have to not only verify that the new feature works, but also that your old features still work, and that there are no bugs or regressions. If you start adding multiple browsers, you can easily see how this can become a combinatorial explosion! AngularJS tries to ease that by providing a Scenario Runner that simulates user interactions with your application. The Scenario Runner allows you to describe your application in a Jasmine-like syntax. Just as with the unit tests before, we will have a series of describes (for the feature), and individual its (to describe each individual functionality of the feature). As always, you can have some common actions, to be performed before and after each spec (as we call a test).


pages: 222 words: 53,317

Overcomplicated: Technology at the Limits of Comprehension by Samuel Arbesman

algorithmic trading, Anton Chekhov, Apple II, Benoit Mandelbrot, citation needed, combinatorial explosion, Danny Hillis, David Brooks, digital map, discovery of the americas, en.wikipedia.org, Erik Brynjolfsson, Flash crash, friendly AI, game design, Google X / Alphabet X, Googley, HyperCard, Inbox Zero, Isaac Newton, iterative process, Kevin Kelly, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, mandelbrot fractal, Minecraft, Netflix Prize, Nicholas Carr, Parkinson's law, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman: Challenger O-ring, Second Machine Age, self-driving car, software studies, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, superintelligent machines, Therac-25, Tyler Cowen: Great Stagnation, urban planning, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Y2K

Modularity embodies the principle of abstraction, allowing a certain amount of managed complexity through compartmentalization. Unfortunately, understanding individual modules—or building them to begin with—doesn’t always yield the kinds of expected behaviors we might hope for. If each module has multiple inputs and multiple outputs, when they are connected the resulting behavior can still be difficult to comprehend or to predict. We often end up getting a combinatorial explosion of interactions: so many different potential interactions that the number of combinations balloons beyond our ability to handle them all. For example, if each module in a system has a total of six distinct inputs and outputs, and we have only ten modules, there are more ways of connecting all these modules together than there are stars in the universe. In some realms that can be heavily regulated, such as finance or corporate structures, our dreams of increasing modularity or finding the ideal level of interoperability might work.
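One way to make the ballooning concrete, under the simplifying (and certainly not the only possible) assumption that the sixty ports are just paired off with one another:

    from math import factorial

    ports = 10 * 6                       # ten modules, six ports each
    half = ports // 2
    pairings = factorial(ports) // (2 ** half * factorial(half))
    print(f"{float(pairings):.1e}")      # ~2.9e40 distinct ways to pair the ports
    # Common estimates put the number of stars in the observable universe around 1e21-1e24.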


pages: 913 words: 265,787

How the Mind Works by Steven Pinker

affirmative action, agricultural Revolution, Alfred Russel Wallace, Buckminster Fuller, cognitive dissonance, Columbine, combinatorial explosion, complexity theory, computer age, computer vision, Daniel Kahneman / Amos Tversky, delayed gratification, double helix, experimental subject, feminist movement, four colour theorem, Gordon Gekko, greed is good, hedonic treadmill, Henri Poincaré, income per capita, information retrieval, invention of agriculture, invention of the wheel, Johannes Kepler, John von Neumann, lake wobegon effect, lateral thinking, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, Mikhail Gorbachev, Murray Gell-Mann, mutually assured destruction, Necker cube, out of africa, pattern recognition, phenotype, plutocrats, Plutocrats, random walk, Richard Feynman, Ronald Reagan, Rubik’s Cube, Saturday Night Live, scientific worldview, Search for Extraterrestrial Intelligence, sexual politics, social intelligence, Steven Pinker, theory of mind, Thorstein Veblen, Turing machine, urban decay, Yogi Berra

Perhaps, then, one could dedicate a node to each combination of concepts and roles. There would be a baby-eats-slug node and a slug-eats-baby node. The brain contains a massive number of neurons, one might think, so why not do it that way? One reason not to is that there is massive and then there is really massive. The number of combinations grows exponentially with their allowable size, setting off a combinatorial explosion whose numbers surpass even our most generous guess of the brain’s capacity. According to legend, the vizier Sissa Ben Dahir claimed a humble reward from King Shirham of India for inventing the game of chess. All he asked for was a grain of wheat to be placed on the first square of a chessboard, two grains of wheat on the second, four on the third, and so on. Well before they reached the sixty-fourth square the king discovered he had unwittingly committed all the wheat in his kingdom.

One cost is space: the hardware to hold the information. The limitation is all too clear to microcomputer owners deciding whether to invest in more RAM. Of course the brain, unlike a computer, comes with vast amounts of parallel hardware for storage. Sometimes theorists infer that the brain can store all contingencies in advance and that thought can be reduced to one-step pattern recognition. But the mathematics of a combinatorial explosion bring to mind the old slogan of MTV: Too much is never enough. Simple calculations show that the number of humanly graspable sentences, sentence meanings, chess games, melodies, seeable objects, and so on can exceed the number of particles in the universe. For example, there are thirty to thirty-five possible moves at each point in a chess game, each of which can be followed by thirty to thirty-five responses, defining about a thousand complete turns.
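A rough continuation count using the excerpt's figures, taking roughly a thousand move-and-reply pairs per complete turn and a nominal forty-turn game:

    per_turn = 35 * 35
    print(per_turn)                       # 1,225 move-and-reply pairs, about a thousand
    print(f"{float(1000 ** 40):.0e}")     # 1e+120 continuations for a forty-turn game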

One is that memory cannot hold all the events that bombard our senses; by storing only their categories, we cut down on the load. But the brain, with its trillion synapses, hardly seems short of storage space. It’s reasonable to say that entities cannot fit in memory when the entities are combinatorial—English sentences, chess games, all shapes in all colors and sizes at all locations—because the numbers from combinatorial explosions can exceed the number of particles in the universe and overwhelm even the most generous reckoning of the brain’s capacity. But people live for a paltry two billion seconds, and there is no known reason why the brain could not record every object and event we experience if it had to. Also, we often remember both a category and its members, such as months, family members, continents, and baseball teams, so the category adds to the memory load.


pages: 202 words: 62,901

The People's Republic of Walmart: How the World's Biggest Corporations Are Laying the Foundation for Socialism by Leigh Phillips, Michal Rozworski

Berlin Wall, Bernie Sanders, call centre, carbon footprint, central bank independence, Colonization of Mars, combinatorial explosion, complexity theory, computer age, corporate raider, decarbonisation, discovery of penicillin, Elon Musk, G4S, Georg Cantor, germ theory of disease, Gordon Gekko, greed is good, hiring and firing, index fund, Intergovernmental Panel on Climate Change (IPCC), Internet of things, inventory management, invisible hand, Jeff Bezos, Joseph Schumpeter, linear programming, liquidity trap, mass immigration, Mont Pelerin Society, new economy, Norbert Wiener, oil shock, passive investing, Paul Samuelson, post scarcity, profit maximization, profit motive, purchasing power parity, recommendation engine, Ronald Coase, Ronald Reagan, sharing economy, Silicon Valley, Skype, sovereign wealth fund, strikebreaker, supply-chain management, technoutopianism, The Nature of the Firm, The Wealth of Nations by Adam Smith, theory of mind, transaction costs, Turing machine, union organizing

The unexpected development requires a revision of the projections and schedule of the factory that produces the viscose machines, and this in turn forces an alteration of the projections and schedule of all the factories that produce the parts that make the machine, and in turn the raw materials that make those parts. Waves of impact ripple out across the entire economy in what one reviewer called a “nightmare combinatorial explosion.” And the episode is only there to illustrate what occurs, moment to moment, as a result of what happens to every single one of billions of commodities throughout the economy. Everything affects everything. How is it possible to gather all of these variables? And then, even if it were somehow possible to track all of this, using thousands of the most modern supercomputers with our early twenty-first-century processing speeds, how could we calculate all of that, and constantly reassess it on a daily or even moment to moment basis?


pages: 1,331 words: 163,200

Hands-On Machine Learning With Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems by Aurélien Géron

Amazon Mechanical Turk, Anton Chekhov, combinatorial explosion, computer vision, constrained optimization, correlation coefficient, crowdsourcing, don't repeat yourself, Elon Musk, en.wikipedia.org, friendly AI, ImageNet competition, information retrieval, iterative process, John von Neumann, Kickstarter, natural language processing, Netflix Prize, NP-complete, optical character recognition, P = NP, p-value, pattern recognition, pull request, recommendation engine, self-driving car, sentiment analysis, SpamAssassin, speech recognition, stochastic process

This is made possible by the fact that PolynomialFeatures also adds all combinations of features up to the given degree. For example, if there were two features a and b, PolynomialFeatures with degree=3 would not only add the features a^2, a^3, b^2, and b^3, but also the combinations ab, a^2b, and ab^2.

Warning: PolynomialFeatures(degree=d) transforms an array containing n features into an array containing (n + d)! / (d! n!) features, where n! is the factorial of n, equal to 1 × 2 × 3 × ⋯ × n. Beware of the combinatorial explosion of the number of features!

Learning Curves

If you perform high-degree Polynomial Regression, you will likely fit the training data much better than with plain Linear Regression. For example, Figure 4-14 applies a 300-degree polynomial model to the preceding training data, and compares the result with a pure linear model and a quadratic model (2nd-degree polynomial). Notice how the 300-degree polynomial model wiggles around to get as close as possible to the training instances.
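Relating back to the warning above, a quick way to watch the feature count grow with scikit-learn's PolynomialFeatures; the ten-feature input and the degrees chosen here are arbitrary, and the counts include the bias column:

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures

    X = np.random.rand(5, 10)                       # 10 input features
    for degree in (2, 3, 4):
        n_out = PolynomialFeatures(degree=degree).fit_transform(X).shape[1]
        print(degree, n_out)                        # 66, 286, 1001 output features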

Linear SVM classifier using polynomial features

Polynomial Kernel

Adding polynomial features is simple to implement and can work great with all sorts of Machine Learning algorithms (not just SVMs), but at a low polynomial degree it cannot deal with very complex datasets, and with a high polynomial degree it creates a huge number of features, making the model too slow. Fortunately, when using SVMs you can apply an almost miraculous mathematical technique called the kernel trick (it is explained in a moment). It makes it possible to get the same result as if you added many polynomial features, even with very high-degree polynomials, without actually having to add them. So there is no combinatorial explosion of the number of features since you don’t actually add any features. This trick is implemented by the SVC class. Let’s test it on the moons dataset:

    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # X, y hold the moons dataset prepared earlier in the chapter
    poly_kernel_svm_clf = Pipeline([
        ("scaler", StandardScaler()),
        ("svm_clf", SVC(kernel="poly", degree=3, coef0=1, C=5))
    ])
    poly_kernel_svm_clf.fit(X, y)

This code trains an SVM classifier using a 3rd-degree polynomial kernel.


pages: 237 words: 64,411

Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence by Jerry Kaplan

Affordable Care Act / Obamacare, Amazon Web Services, asset allocation, autonomous vehicles, bank run, bitcoin, Bob Noyce, Brian Krebs, business cycle, buy low sell high, Capital in the Twenty-First Century by Thomas Piketty, combinatorial explosion, computer vision, corporate governance, crowdsourcing, en.wikipedia.org, Erik Brynjolfsson, estate planning, Flash crash, Gini coefficient, Goldman Sachs: Vampire Squid, haute couture, hiring and firing, income inequality, index card, industrial robot, information asymmetry, invention of agriculture, Jaron Lanier, Jeff Bezos, job automation, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, Loebner Prize, Mark Zuckerberg, mortgage debt, natural language processing, Own Your Own Home, pattern recognition, Satoshi Nakamoto, school choice, Schrödinger's Cat, Second Machine Age, self-driving car, sentiment analysis, Silicon Valley, Silicon Valley startup, Skype, software as a service, The Chicago School, The Future of Employment, Turing test, Watson beat the top human players on Jeopardy!, winner-take-all economy, women in the workforce, working poor, Works Progress Administration

Ultimately, this style of AI came to be called the symbolic systems approach. But the early AI researchers quickly ran into a problem: the computers didn’t seem to be powerful enough to do very many interesting tasks. Formalists who studied the arcane field of theory of computation understood that building faster computers could not address this problem. No matter how speedy the computer, it could never tame what was called the “combinatorial explosion.” Solving real-world problems through step-wise analysis had this nasty habit of running out of steam the same way pressure in a city’s water supply drops when vast new tracts of land are filled with housing developments. Imagine finding the quickest driving route from San Francisco to New York by measuring each and every way you could possibly go; your trip would never get started. And even today, that’s not how contemporary mapping applications give you driving instructions, which is why you may notice that they don’t always take the most efficient route.


pages: 719 words: 181,090

Site Reliability Engineering: How Google Runs Production Systems by Betsy Beyer, Chris Jones, Jennifer Petoff, Niall Richard Murphy

Air France Flight 447, anti-pattern, barriers to entry, business intelligence, business process, Checklist Manifesto, cloud computing, combinatorial explosion, continuous integration, correlation does not imply causation, crowdsourcing, database schema, defense in depth, DevOps, en.wikipedia.org, fault tolerance, Flash crash, George Santayana, Google Chrome, Google Earth, information asymmetry, job automation, job satisfaction, Kubernetes, linear programming, load shedding, loose coupling, meta analysis, meta-analysis, microservices, minimum viable product, MVC pattern, performance metric, platform as a service, revision control, risk tolerance, side project, six sigma, the scientific method, Toyota Production System, trickle-down economics, web application, zero day

However, once Backend B determines that the request to the DB Frontend can’t be served (for example, because the request has already been attempted and rejected three times), Backend B has to return to Backend A either an “overloaded; don’t retry” error or a degraded response (assuming that it can produce some moderately useful response even when its request to the DB Frontend failed). Backend A has exactly the same options for the request it received from the Frontend, and proceeds accordingly.

Figure 21-2. A stack of dependencies

The key point is that a failed request from the DB Frontend should only be retried by Backend B, the layer immediately above it. If multiple layers retried, we’d have a combinatorial explosion.

Load from Connections

The load associated with connections is one last factor worth mentioning. We sometimes only take into account load at the backends that is caused directly by the requests they receive (which is one of the problems with approaches that model load based upon queries per second). However, doing so overlooks the CPU and memory costs of maintaining a large pool of connections or the cost of a fast rate of churn of connections.
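To make the retry point concrete, a toy model of the amplification; the layer count and retry budget here are invented for illustration:

    def worst_case_attempts(layers, retries_per_layer):
        # Worst-case requests reaching the lowest backend for a single user request
        # if every layer in the stack retries failures independently.
        return (retries_per_layer + 1) ** layers

    print(worst_case_attempts(layers=1, retries_per_layer=3))   # 4: only the layer above the DB retries
    print(worst_case_attempts(layers=4, retries_per_layer=3))   # 256: every layer retries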

By hosting the code that supports new functionality in the client application before we activate that feature, we greatly reduce the risk associated with a launch. Releasing a new version becomes much easier if we don’t need to maintain parallel release tracks for a version with the new functionality versus without the functionality. This holds particularly true if we’re not dealing with a single piece of new functionality, but a set of independent features that might be released on different schedules, which would necessitate maintaining a combinatorial explosion of different versions. Having this sort of dormant functionality also makes aborting launches easier when adverse effects are discovered during a rollout. In such cases, we can simply switch the feature off, iterate, and release an updated version of the app. Without this type of client configuration, we would have to provide a new version of the app without the feature, and update the app on all users’ phones.


pages: 301 words: 85,126

AIQ: How People and Machines Are Smarter Together by Nick Polson, James Scott

Air France Flight 447, Albert Einstein, Amazon Web Services, Atul Gawande, autonomous vehicles, availability heuristic, basic income, Bayesian statistics, business cycle, Cepheid variable, Checklist Manifesto, cloud computing, combinatorial explosion, computer age, computer vision, Daniel Kahneman / Amos Tversky, Donald Trump, Douglas Hofstadter, Edward Charles Pickering, Elon Musk, epigenetics, Flash crash, Grace Hopper, Gödel, Escher, Bach, Harvard Computers: women astronomers, index fund, Isaac Newton, John von Neumann, late fees, low earth orbit, Lyft, Magellanic Cloud, mass incarceration, Moneyball by Michael Lewis explains big data, Moravec's paradox, more computing power than Apollo, natural language processing, Netflix Prize, North Sea oil, p-value, pattern recognition, Pierre-Simon Laplace, ransomware, recommendation engine, Ronald Reagan, self-driving car, sentiment analysis, side project, Silicon Valley, Skype, smart cities, speech recognition, statistical model, survivorship bias, the scientific method, Thomas Bayes, Uber for X, uber lyft, universal basic income, Watson beat the top human players on Jeopardy!, young professional

The second issue is “missingness.” Most subscribers haven’t watched most films, so most of those trillion-plus entries in the ratings matrix are missing. Moreover, as in the case of the World War II bombers, that missingness pattern is informative. If you haven’t watched Fight Club, maybe you just haven’t gotten around to it—but then again, maybe films about nihilism just do nothing for you. The final issue is combinatorial explosion. Or, if you’d rather stick with Fight Club and philosophy over mathematics: each Netflix subscriber is a beautiful and unique phenomenological snowflake. In a database with only two films, millions of users will share identical like/dislike experiences, since only four such experiences are possible: liked both, liked neither, or liked one but not the other. Not so in a database with 10,000 films.
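A rough sense of the scale, treating each film as a bare like/dislike choice (which understates the real case, where star ratings and unwatched films add even more possibilities):

    print(2 ** 2)                  # 4 possible profiles over two films: collisions are guaranteed
    print(len(str(2 ** 10_000)))   # the profile count over 10,000 films runs to 3,011 digits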


pages: 301 words: 85,263

New Dark Age: Technology and the End of the Future by James Bridle

AI winter, Airbnb, Alfred Russel Wallace, Automated Insights, autonomous vehicles, back-to-the-land, Benoit Mandelbrot, Bernie Sanders, bitcoin, British Empire, Brownian motion, Buckminster Fuller, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, cognitive bias, cognitive dissonance, combinatorial explosion, computer vision, congestion charging, cryptocurrency, data is the new oil, Donald Trump, Douglas Engelbart, Douglas Engelbart, Douglas Hofstadter, drone strike, Edward Snowden, fear of failure, Flash crash, Google Earth, Haber-Bosch Process, hive mind, income inequality, informal economy, Internet of things, Isaac Newton, John von Neumann, Julian Assange, Kickstarter, late capitalism, lone genius, mandelbrot fractal, meta analysis, meta-analysis, Minecraft, mutually assured destruction, natural language processing, Network effects, oil shock, p-value, pattern recognition, peak oil, recommendation engine, road to serfdom, Robert Mercer, Ronald Reagan, self-driving car, Silicon Valley, Silicon Valley ideology, Skype, social graph, sorting algorithm, South China Sea, speech recognition, Spread Networks laid a new fibre optics cable between New York and Chicago, stem cell, Stuxnet, technoutopianism, the built environment, the scientific method, Uber for X, undersea cable, University of East Anglia, uranium enrichment, Vannevar Bush, WikiLeaks

In Arimaa, players can arrange their pieces in any configuration, and must move one of their weakest pieces – pawns renamed as rabbits – to the far side of the board to win. They can also use their stronger pieces to push and pull weaker pieces towards a series of trap squares, removing them from the board and clearing the way for the rabbits. The combination of many different initial setups, the ability of pieces to move other pieces, and the possibility of making up to four moves per turn results in combinatorial explosion: a vast increase in possibilities that rapidly becomes too great for a computer programme to handle – the Alterman Wall taken to exponential extremes. Or so it was hoped. The first computer Arimaa tournament was held in 2004, with the most successful programme winning the right to challenge a group of top human players for a cash prize. In the first few years, the humans easily beat their computer opponents, even increasing the margin of victory as their skills in the new game improved faster than the programmes challenging them.


pages: 339 words: 88,732

The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies by Erik Brynjolfsson, Andrew McAfee

"Robert Solow", 2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 3D printing, access to a mobile phone, additive manufacturing, Airbnb, Albert Einstein, Amazon Mechanical Turk, Amazon Web Services, American Society of Civil Engineers: Report Card, Any sufficiently advanced technology is indistinguishable from magic, autonomous vehicles, barriers to entry, basic income, Baxter: Rethink Robotics, British Empire, business cycle, business intelligence, business process, call centre, Charles Lindbergh, Chuck Templeton: OpenTable:, clean water, combinatorial explosion, computer age, computer vision, congestion charging, corporate governance, creative destruction, crowdsourcing, David Ricardo: comparative advantage, digital map, employer provided health coverage, en.wikipedia.org, Erik Brynjolfsson, factory automation, falling living standards, Filter Bubble, first square of the chessboard / second half of the chessboard, Frank Levy and Richard Murnane: The New Division of Labor, Freestyle chess, full employment, G4S, game design, global village, happiness index / gross national happiness, illegal immigration, immigration reform, income inequality, income per capita, indoor plumbing, industrial robot, informal economy, intangible asset, inventory management, James Watt: steam engine, Jeff Bezos, jimmy wales, job automation, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, Khan Academy, knowledge worker, Kodak vs Instagram, law of one price, low skilled workers, Lyft, Mahatma Gandhi, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Mars Rover, mass immigration, means of production, Narrative Science, Nate Silver, natural language processing, Network effects, new economy, New Urbanism, Nicholas Carr, Occupy movement, oil shale / tar sands, oil shock, pattern recognition, Paul Samuelson, payday loans, post-work, price stability, Productivity paradox, profit maximization, Ralph Nader, Ray Kurzweil, recommendation engine, Report Card for America’s Infrastructure, Robert Gordon, Rodney Brooks, Ronald Reagan, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Simon Kuznets, six sigma, Skype, software patent, sovereign wealth fund, speech recognition, statistical model, Steve Jobs, Steven Pinker, Stuxnet, supply-chain management, TaskRabbit, technological singularity, telepresence, The Bell Curve by Richard Herrnstein and Charles Murray, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, total factor productivity, transaction costs, Tyler Cowen: Great Stagnation, Vernor Vinge, Watson beat the top human players on Jeopardy!, winner-take-all economy, Y2K

In the early 1950s, machines were taught how to play checkers and could soon beat respectable amateurs.28 In January 1956, Herbert Simon returned to teaching his class and told his students, “Over Christmas, Al Newell and I invented a thinking machine.” Three years later, they created a computer program modestly called the “General Problem Solver,” which was designed to solve, in principle, any logic problem that could be described by a set of formal rules. It worked well on simple problems like Tic-Tac-Toe or the slightly harder Tower of Hanoi puzzle, although it didn’t scale up to most real-world problems because of the combinatorial explosion of possible options to consider. Cheered by their early successes and those of other artificial intelligence pioneers like Marvin Minsky, John McCarthy, and Claude Shannon, Simon and Newell were quite optimistic about how rapidly machines would master human skills, predicting in 1958 that a digital computer would be the world chess champion by 1968.29 In 1965, Simon went so far as to predict, “machines will be capable, within twenty years, of doing any work a man can do.”30 Simon won the Nobel Prize in Economics in 1978, but he was wrong about chess, not to mention all the other tasks that humans can do.


pages: 315 words: 92,151

Ten Billion Tomorrows: How Science Fiction Technology Became Reality and Shapes the Future by Brian Clegg

Albert Einstein, anthropic principle, Brownian motion, call centre, Carrington event, combinatorial explosion, don't be evil, Ernest Rutherford, experimental subject, game design, gravity well, hive mind, invisible hand, Isaac Newton, Johannes Kepler, John von Neumann, Kickstarter, nuclear winter, pattern recognition, RAND corporation, Ray Kurzweil, RFID, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, silicon-based life, speech recognition, stem cell, Stephen Hawking, Steve Jobs, Turing test

These memories are inserted into the doll’s brain using a chair with some kind of remote electromagnetic stimulus, transferring information stored on what appear to be computer hard drives (referred to in the show as “wedges”). The Dollhouse approach, which bears a resemblance to a whole-brain version of the learning process in The Matrix, seems to underestimate the complexity of what’s going on inside a human skull. The number of potential connections of all the neurons in the brain provides a combinatorial explosion that would require every atom in the universe if we were to try to map out every possible combination. Of course, if the brain can store the data, so can an electronic device, but even in the actual connections in any particular brain, we are talking far more storage than is feasible in a compact device at the moment. In a sense, the Dollhouse approach is more sensible than that in The Matrix, as it doesn’t require the programmer to pinpoint just where the expertise is recorded in order to be able to reproduce it.


The Art of Scalability: Scalable Web Architecture, Processes, and Organizations for the Modern Enterprise by Martin L. Abbott, Michael T. Fisher

always be closing, anti-pattern, barriers to entry, Bernie Madoff, business climate, business continuity plan, business intelligence, business process, call centre, cloud computing, combinatorial explosion, commoditize, Computer Numeric Control, conceptual framework, database schema, discounted cash flows, en.wikipedia.org, fault tolerance, finite state, friendly fire, hiring and firing, Infrastructure as a Service, inventory management, new economy, packet switching, performance metric, platform as a service, Ponzi scheme, RFC: Request For Comment, risk tolerance, Rubik’s Cube, Search for Extraterrestrial Intelligence, SETI@home, shareholder value, Silicon Valley, six sigma, software as a service, the scientific method, transaction costs, Vilfredo Pareto, web application, Y2K

After twenty and a half chapters, you probably can sense where we are going. You should implement just the right amount of fault isolation in your system to generate a positive shareholder return. “OK, thanks, how about telling me how to do that?” you might ask. The answer, unfortunately, is going to depend on your particular needs, the rate of growth and unavailability and causes of unavailability in your system, customer expectation with respect to availability, contractual availability commitments, and a whole host of things that result in a combinatorial explosion, which make it impossible for us to describe for you what you need to do in your environment. That said, there are some simple rules to apply to increase your scalability and availability. We present some of the most useful here to help you in your fault isolation endeavors.

Approach 1: Swim Lane the Money-Maker

Whatever you do, always make sure that the thing that is most closely related to making money is appropriately isolated from the failures and demand limitations of other systems.

is to build systems to answer “Where is the problem?” Often, these systems are out-of-the-box third-party or open source solutions that you install on systems to monitor resource utilization. Some application monitors might also be employed. The data collected by these systems help inform other processes such as our capacity planning process and problem resolution process. Care must be taken to avoid a combinatorial explosion of data, as that data is costly and the value of immense amounts of old data is very low. Finally, we move to answer the question of “What is the problem?” This very often requires us to rely heavily on our architectural principle Design to Be Monitored. Here, we are monitoring individual components, and often these are proprietary applications for which we are responsible. Again, the concerns of data explosion are present, and we must fight to ensure that we are keeping the right data and not diluting shareholder value.


pages: 311 words: 94,732

The Rapture of the Nerds by Cory Doctorow, Charles Stross

3D printing, Ayatollah Khomeini, butterfly effect, cognitive dissonance, combinatorial explosion, complexity theory, Credit Default Swap, dematerialisation, Drosophila, epigenetics, Extropian, gravity well, greed is good, haute couture, hive mind, margin call, negative equity, phenotype, plutocrats, Plutocrats, rent-seeking, Richard Feynman, telepresence, Turing machine, Turing test, union organizing

* * * When the limit is reached, it jars Huw’s self-sense like a long fall to a hard floor, every virtual bone and joint buckling and bending, spine compressing, jaws clacking together. It has been going so well, the end in sight, the time running fast but Huw and father-thing and ambassador running faster, and now— “I’m stuck,” Huw says. “Not a problem. We could play this game forever—the number of variables gives rise to such a huge combinatorial explosion that there isn’t enough mass in this universe to explore all the possible states. The objective of the exercise was to procure a representative sample of moves, played by a proficient emissary, and we’ve now delivered that.” “Hey, wait a minute! ...” Huw’s stomach does a backflip, followed by a triple somersault, and is preparing to unicycle across a tightrope across the Niagara Falls while carrying a drunken hippo on his back: “You mean that was it?”


pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, British Empire, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, low earth orbit, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

The success of such a network may be evaluated by examining the number of congressmen surviving an attack and comparing such number to the number of congressmen able to communicate with one another and vote via the communications network. Such an example is, of course, farfetched but not completely without utility.”51 The more alternative connection paths there are between the nodes of a communications net, the more resistant it is to damage from within or without. But there is a combinatorial explosion working the other way: the more you increase the connectivity, the more intelligence and memory is required to route messages efficiently through the net. In a conventional circuit-switched communications network, such as the telephone system, a central switching authority establishes an unbroken connection for every communication, mediating possible conflicts with other connections being made at the same time.


pages: 352 words: 120,202

Tools for Thought: The History and Future of Mind-Expanding Technology by Howard Rheingold

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, card file, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, conceptual framework, Conway's Game of Life, Douglas Engelbart, Dynabook, experimental subject, Hacker Ethic, Howard Rheingold, interchangeable parts, invention of movable type, invention of the printing press, Jacquard loom, John von Neumann, knowledge worker, Marshall McLuhan, Menlo Park, Norbert Wiener, packet switching, pattern recognition, popular electronics, post-industrial society, RAND corporation, Robert Metcalfe, Silicon Valley, speech recognition, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, telemarketer, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

Shannon pointed out that the way most people would design a machine to play chess -- to mechanically examine each alternative move and evaluate it, the so-called brute-force method -- would be virtually impossible, even on the fastest imaginable computer. He estimated that a typical chess game has about 10^120 possible moves, so "A machine calculating one variation each millionth of a second would require over 10^95 years to decide on its first move!" This "combinatorial explosion" -- the rapid and overwhelming buildup of alternatives in any system in which each level leads to two or more deeper levels -- was another one of those secrets of nature that Claude Shannon was in the habit of turning up. The explosive expansion of the number of alternative decisions is a barrier that confronts any attempt to exhaustively examine a branching structure, and continues to confront programmers who seek to emulate cognitive functions by performing searches through problem spaces.


pages: 390 words: 113,737

Someone comes to town, someone leaves town by Cory Doctorow

Burning Man, clean water, combinatorial explosion, dumpster diving

It was easy enough to understand why the arbiters of the system subdivided Motorized Land Vehicles (629.2) into several categories, but here in the 629.22s, where the books on automobiles were, you could see the planners' deficiencies. Automobiles divided into dozens of major subcategories (taxis and limousines, buses, light trucks, vans, lorries, tractor trailers, campers, motorcycles, racing cars, and so on), then ramified into a combinatorial explosion of sub-sub-sub categories. There were Dewey numbers on some of the automotive book spines that had twenty digits or more after the decimal, an entire Dewey Decimal system hidden between 629.2 and 629.3. To the librarian, this shelf-reading looked like your garden-variety screwing around, but what really made her nervous were Alan's excursions through the card catalogue, which required constant tending to replace the cards that errant patrons made unauthorized reorderings of.


pages: 410 words: 119,823

Radical Technologies: The Design of Everyday Life by Adam Greenfield

3D printing, Airbnb, augmented reality, autonomous vehicles, bank run, barriers to entry, basic income, bitcoin, blockchain, business intelligence, business process, call centre, cellular automata, centralized clearinghouse, centre right, Chuck Templeton: OpenTable:, cloud computing, collective bargaining, combinatorial explosion, Computer Numeric Control, computer vision, Conway's Game of Life, cryptocurrency, David Graeber, dematerialisation, digital map, disruptive innovation, distributed ledger, drone strike, Elon Musk, Ethereum, ethereum blockchain, facts on the ground, fiat currency, global supply chain, global village, Google Glasses, IBM and the Holocaust, industrial robot, informal economy, information retrieval, Internet of things, James Watt: steam engine, Jane Jacobs, Jeff Bezos, job automation, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Kevin Kelly, Kickstarter, late capitalism, license plate recognition, lifelogging, M-Pesa, Mark Zuckerberg, means of production, megacity, megastructure, minimum viable product, money: store of value / unit of account / medium of exchange, natural language processing, Network effects, New Urbanism, Occupy movement, Oculus Rift, Pareto efficiency, pattern recognition, Pearl River Delta, performance metric, Peter Eisenman, Peter Thiel, planetary scale, Ponzi scheme, post scarcity, post-work, RAND corporation, recommendation engine, RFID, rolodex, Satoshi Nakamoto, self-driving car, sentiment analysis, shareholder value, sharing economy, Silicon Valley, smart cities, smart contracts, social intelligence, sorting algorithm, special economic zone, speech recognition, stakhanovite, statistical model, stem cell, technoutopianism, Tesla Model S, the built environment, The Death and Life of Great American Cities, The Future of Employment, transaction costs, Uber for X, undersea cable, universal basic income, urban planning, urban sprawl, Whole Earth Review, WikiLeaks, women in the workforce

The problematic before us then actually would become the Keynesian (or Olympian) one of learning to live “wisely and agreeably and well” under conditions of absolute and universal freedom from want. In the end, what is it that people want from these technologies? As near as I can tell, a few want just exactly what some have always wanted from other human beings: a cheap, reliable, docile labor force. Others, though, are seeking something less tangible: sense, meaning, order, a ward against uncertainty. They’re looking for something that might help them master the combinatorial explosion of possibility on a planet where nine billion people are continually knitting their own world-lines; for just a little reassurance, in a world populated by so many conscious actors that it often feels like it’s spinning out of anyone’s control. These are impulses I think most of us can relate to, and intuitively react to with some sympathy. And it’s this class of desires that I think we should keep in mind as we explore the mechanics of machine learning, automated pattern recognition and decision-making.


pages: 303 words: 67,891

Advances in Artificial General Intelligence: Concepts, Architectures and Algorithms: Proceedings of the Agi Workshop 2006 by Ben Goertzel, Pei Wang

AI winter, artificial general intelligence, bioinformatics, brain emulation, combinatorial explosion, complexity theory, computer vision, conceptual framework, correlation coefficient, epigenetics, friendly AI, G4S, information retrieval, Isaac Newton, John Conway, Loebner Prize, Menlo Park, natural language processing, Occam's razor, p-value, pattern recognition, performance metric, Ray Kurzweil, Rodney Brooks, semantic web, statistical model, strong AI, theory of mind, traveling salesman, Turing machine, Turing test, Von Neumann architecture, Y2K

But we could move toward AGI a lot faster if there were a nicer programming language with anywhere near the same scalability as C++. Moving on: This is not quite a bottleneck, but I would say that if the Novamente system is going to fail to achieve AGI, which I think is quite unlikely, then it would be because of a failure in the aspect of the design wherein the different parts of the system all interact with each other dynamically, to stop each other from coming to horrible combinatorial explosions. A difficult thing is that AI is all about emergence and synergy, so that in order to really test your system, you have to test all the parts, put them together in combination, and look at the emergence effects. And that’s actually hard. The most basic bottleneck is that you are building an emergent system that has to be understood and tested as a whole, rather than a system that can be implemented and tested piece by piece.


pages: 528 words: 146,459

Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray, Nathan L. Ensmenger, Jeffrey R. Yost

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, borderless world, Buckminster Fuller, Build a better mousetrap, Byte Shop, card file, cashless society, cloud computing, combinatorial explosion, computer age, deskilling, don't be evil, Donald Davies, Douglas Engelbart, Douglas Engelbart, Dynabook, fault tolerance, Fellow of the Royal Society, financial independence, Frederick Winslow Taylor, game design, garden city movement, Grace Hopper, informal economy, interchangeable parts, invention of the wheel, Jacquard loom, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Kickstarter, light touch regulation, linked data, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, natural language processing, Network effects, New Journalism, Norbert Wiener, Occupy movement, optical character recognition, packet switching, PageRank, pattern recognition, Pierre-Simon Laplace, pirate software, popular electronics, prediction markets, pre–internet, QWERTY keyboard, RAND corporation, Robert X Cringely, Silicon Valley, Silicon Valley startup, Steve Jobs, Steven Levy, Stewart Brand, Ted Nelson, the market place, Turing machine, Vannevar Bush, Von Neumann architecture, Whole Earth Catalog, William Shockley: the traitorous eight, women in the workforce, young professional

The resulting rationalization of production processes and standardization of components had reduced manufacturing costs to such an extent that IBM had no effective competition in punched-card machines at all. The biggest problem, however, was not in hardware but in software. Because the number of software packages IBM offered to its customers was constantly increasing, the proliferation of computer models created a nasty gearing effect: given m different computer models, each requiring n different software packages, a total of m × n programs had to be developed and supported. This was a combinatorial explosion that threatened to overwhelm IBM at some point in the not-too-distant future. Just as great a problem was that of the software written by IBM’s customers. Because computers were so narrowly targeted at a specific market niche, it was not possible for a company to expand its computer system in size by more than a factor of about two without changing to a different computer model. If this was done, then all the user’s applications had to be reprogrammed.


pages: 834 words: 180,700

The Architecture of Open Source Applications by Amy Brown, Greg Wilson

8-hour work day, anti-pattern, bioinformatics, c2.com, cloud computing, collaborative editing, combinatorial explosion, computer vision, continuous integration, create, read, update, delete, David Heinemeier Hansson, Debian, domain-specific language, Donald Knuth, en.wikipedia.org, fault tolerance, finite state, Firefox, friendly fire, Guido van Rossum, linked data, load shedding, locality of reference, loose coupling, Mars Rover, MITM: man-in-the-middle, MVC pattern, peer-to-peer, Perl 6, premature optimization, recommendation engine, revision control, Ruby on Rails, side project, Skype, slashdot, social web, speech recognition, the scientific method, The Wisdom of Crowds, web application, WebSocket

For example, there is a JavascriptExecutor interface that provides the ability to execute arbitrary chunks of Javascript in the context of the current page. A successful cast of a WebDriver instance to that interface indicates that you can expect the methods on it to work.

Figure 16.1: Accountant and Stockist Depend on Shop
Figure 16.2: Shop Implements HasBalance and Stockable

16.4.2. Dealing with the Combinatorial Explosion

One of the first things that is apparent from a moment's thought about the wide range of browsers and languages that WebDriver supports is that unless care is taken it would quickly face an escalating cost of maintenance. With X browsers and Y languages, it would be very easy to fall into the trap of maintaining X×Y implementations. Reducing the number of languages that WebDriver supports would be one way to reduce this cost, but we don't want to go down this route for two reasons.


pages: 612 words: 187,431

The Art of UNIX Programming by Eric S. Raymond

A Pattern Language, Albert Einstein, barriers to entry, bioinformatics, Clayton Christensen, combinatorial explosion, commoditize, correlation coefficient, David Brooks, Debian, domain-specific language, don't repeat yourself, Donald Knuth, Everything should be made as simple as possible, facts on the ground, finite state, general-purpose programming language, George Santayana, Innovator's Dilemma, job automation, Larry Wall, MVC pattern, pattern recognition, Paul Graham, peer-to-peer, premature optimization, pre–internet, publish or perish, revision control, RFC: Request For Comment, Richard Stallman, Robert Metcalfe, Steven Levy, transaction costs, Turing complete, Valgrind, wage slave, web application

The very fact that no-commercial-use licenses create uncertainty about a redistributor's legal exposure is a serious strike against them. One of the objectives of the OSD is to ensure that people in the distribution chain of OSD-conforming software do not need to consult with intellectual-property lawyers to know what their rights are. OSD forbids complicated restrictions against persons, groups, and occupations partly so that people dealing with collections of software will not face a combinatorial explosion of slightly differing (and perhaps conflicting) restrictions on what they can do with it. This concern is not hypothetical, either. One important part of the open-source distribution chain is CD-ROM distributors who aggregate it in useful collections ranging from simple anthology CDs up to bootable operating systems. Restrictions that would make life prohibitively complicated for CD-ROM distributors, or others trying to spread open-source software commercially, have to be forbidden.


pages: 647 words: 43,757

Types and Programming Languages by Benjamin C. Pierce

Albert Einstein, combinatorial explosion, experimental subject, finite state, Henri Poincaré, Perl 6, Russell's paradox, sorting algorithm, Turing complete, Turing machine, type inference, Y Combinator

Our goal is to develop algorithms for checking membership in the least and greatest fixed points of a generating function F. The basic steps in these algorithms will involve "running F backwards": to check membership for an element x, we need to ask how x could have been generated by F. The advantage of an invertible F is that there is at most one way to generate a given x. For a non-invertible F, elements can be generated in multiple ways, leading to a combinatorial explosion in the number of paths that the algorithm must explore. From now on, we restrict our attention to invertible generating functions.

21.5.3 Definition: An element x is F-supported if support_F(x)↓; otherwise, x is F-unsupported. An F-supported element is called F-ground if support_F(x) = ∅. Note that an unsupported element x does not appear in F(X) for any X, while a ground x is in F(X) for every X.


pages: 746 words: 221,583

The Children of the Sky by Vernor Vinge

combinatorial explosion, epigenetics, indoor plumbing, megacity, MITM: man-in-the-middle, random walk, risk tolerance, technological singularity, the scientific method, Vernor Vinge

Similarly, Amdi had probably said that “someone” had betrayed “something”—but the software had generated the particular nouns from a long list of suspects. It was amazing that Jefri had even made it onto that list, much less coming out at the top. So what logic had put him there? She drilled down through the program’s reasoning, into depths she had never visited. As suspected, the “why I chose ‘this’ over ‘that’” led to a combinatorial explosion. She could spend centuries studying this—and get nowhere. Ravna leaned back in her chair, turning her head this way and that, trying to get the stress out of her neck. What am I missing? Of course, the program could simply be broken. Oobii’s emergency automation was specially designed to run in the Slow Zone, but the surveillance program was a bit of purely Beyonder software, not on the ship’s Usables manifest.


pages: 933 words: 205,691

Hadoop: The Definitive Guide by Tom White

Amazon Web Services, bioinformatics, business intelligence, combinatorial explosion, database schema, Debian, domain-specific language, en.wikipedia.org, fault tolerance, full text search, Grace Hopper, information retrieval, Internet Archive, Kickstarter, linked data, loose coupling, openstreetmap, recommendation engine, RFID, SETI@home, social graph, web application

It also specifies many of the other features of Avro that implementations should support. One area that the specification does not rule on, however, is APIs: implementations have complete latitude in the API they expose for working with Avro data, since each one is necessarily language-specific. The fact that there is only one binary format is significant, since it means the barrier for implementing a new language binding is lower, and avoids the problem of a combinatorial explosion of languages and formats, which would harm interoperability. Avro has rich schema resolution capabilities. Within certain carefully defined constraints, the schema used to read data need not be identical to the schema that was used to write the data. This is the mechanism by which Avro supports schema evolution. For example, a new, optional field may be added to a record by declaring it in the schema used to read the old data.


pages: 556 words: 46,885

The World's First Railway System: Enterprise, Competition, and Regulation on the Railway Network in Victorian Britain by Mark Casson

banking crisis, barriers to entry, Beeching cuts, British Empire, business cycle, combinatorial explosion, Corn Laws, corporate social responsibility, David Ricardo: comparative advantage, intermodal, iterative process, joint-stock company, joint-stock limited liability company, Kickstarter, knowledge economy, linear programming, Network effects, New Urbanism, performance metric, railway mania, rent-seeking, strikebreaker, the market place, transaction costs

The method by which his counterfactual canal system was derived is not fully explained in the Appendix, and his estimate of its performance is based on guesswork (p. 38). Network optimization cannot be effected by linear programming, as Fogel mistakenly suggests. Making a connection between two locations involves a binary decision: the two locations are either connected or they are not. Network optimization is therefore an integer programming problem, and problems of this kind encounter combinatorial explosion: the number of possible network structures increases at an accelerating rate as the number of locations to be served rises. An additional complexity arises from the fact that the optimal location for a railway junction may be in the middle of the countryside rather than at a town. Constraining all junctions to be at towns may reduce the performance of a network quite considerably. As indicated above, the actual network made extensive use of rural junctions, at places such as Crewe, Swindon, and Eastleigh, and lesser-known centres such as Evercreech, Broom, and Melton Constable.
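A rough count of candidate network structures, treating each pair of locations as either directly linked or not (rural junctions and routing are ignored for simplicity):

    def possible_networks(locations):
        pairs = locations * (locations - 1) // 2   # candidate direct connections
        return 2 ** pairs

    print(possible_networks(10))              # 35,184,372,088,832
    print(len(str(possible_networks(50))))    # a 369-digit count for just fifty towns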


pages: 798 words: 240,182

The Transhumanist Reader by Max More, Natasha Vita-More

23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Douglas Engelbart, Drosophila, en.wikipedia.org, endogenous growth, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, Louis Pasteur, Menlo Park, meta analysis, meta-analysis, moral hazard, Network effects, Norbert Wiener, pattern recognition, Pepto Bismol, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Ronald Reagan, scientific worldview, silicon-based life, Singularitarianism, social intelligence, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce, zero-sum game

Finally, there are limiting factors to fast growth, such as economic returns (if very few can afford a new technology it will be very expensive and not as profitable as a mass market technology), constraints on development speed (even advanced manufacturing processes need time for reconfiguration, development, and testing), human adaptability, and especially the need for knowledge. As the amount of knowledge grows, it becomes harder and harder to keep up and to get an overview, necessitating specialization. Even if information technologies can help somewhat, the basic problem remains, with the combinatorial explosion of possible combinations of different fields. This means that a development project might need specialists in many areas, which in turn means that there is a smaller set of groups able to do the development. In turn, this means that it is very hard for a small group to get far ahead of everybody else in all areas, simply because it will not have the necessary know-how in all necessary areas.


pages: 846 words: 232,630

Darwin's Dangerous Idea: Evolution and the Meanings of Life by Daniel C. Dennett

Albert Einstein, Alfred Russel Wallace, anthropic principle, assortative mating, buy low sell high, cellular automata, combinatorial explosion, complexity theory, computer age, conceptual framework, Conway's Game of Life, Danny Hillis, double helix, Douglas Hofstadter, Drosophila, finite state, Gödel, Escher, Bach, In Cold Blood by Truman Capote, invention of writing, Isaac Newton, Johann Wolfgang von Goethe, John von Neumann, Murray Gell-Mann, New Journalism, non-fiction novel, Peter Singer: altruism, phenotype, price mechanism, prisoner's dilemma, QWERTY keyboard, random walk, Richard Feynman, Rodney Brooks, Schrödinger's Cat, selection bias, Stephen Hawking, Steven Pinker, strong AI, the scientific method, theory of mind, Thomas Malthus, Turing machine, Turing test

I do not at all intend this to be a shocking indictment, just a reminder of something quite obvious: no remotely compelling system of ethics has ever been made computationally tractable, even indirectly, for real-world moral problems. So, even though there has been no dearth of utilitarian (and Kantian, and contractarian, etc.) arguments in favor of particular policies, institutions, practices, and acts, these have all been heavily hedged with ceteris paribus clauses and plausibility claims about their idealizing assumptions. These hedges are designed to overcome the combinatorial explosion of calculation that threatens if one actually attempts — as theory says one must — to consider all things. And as arguments — not derivations — they have all been controversial (which is not to say that none of them could be sound in the last analysis). To get a better sense of the difficulties that contribute to actual moral reasoning, let us give ourselves a smallish moral problem and see what we do with it.


pages: 923 words: 516,602

The C++ Programming Language by Bjarne Stroustrup

combinatorial explosion, conceptual framework, database schema, distributed generation, Donald Knuth, fault tolerance, general-purpose programming language, index card, iterative process, job-hopping, locality of reference, Menlo Park, Parkinson's law, premature optimization, sorting algorithm

Consequently, we could simply declare only one version of the equality operator for complex:

    bool operator==(complex, complex);

    void f(complex x, complex y)
    {
        x==y;    // means operator==(x,y)
        x==3;    // means operator==(x,complex(3))
        3==y;    // means operator==(complex(3),y)
    }

There can be reasons for preferring to define separate functions. For example, in some cases the conversion can impose overheads, and in other cases, a simpler algorithm can be used for specific argument types. Where such issues are not significant, relying on conversions and providing only the most general variant of a function – plus possibly a few critical variants – contains the combinatorial explosion of variants that can arise from mixed-mode arithmetic. Where several variants of a function or an operator exist, the compiler must pick ‘‘the right’’ variant based on the argument types and the available (standard and user-defined) conversions. Unless a best match exists, an expression is ambiguous and is an error (see §7.4). An object constructed by explicit or implicit use of a constructor is automatic and will be destroyed at the first opportunity (see §10.4.10).


Engineering Security by Peter Gutmann

active measures, algorithmic trading, Amazon Web Services, Asperger Syndrome, bank run, barriers to entry, bitcoin, Brian Krebs, business process, call centre, card file, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, Credit Default Swap, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, Debian, domain-specific language, Donald Davies, Donald Knuth, double helix, en.wikipedia.org, endowment effect, fault tolerance, Firefox, fundamental attribution error, George Akerlof, glass ceiling, GnuPG, Google Chrome, iterative process, Jacob Appelbaum, Jane Jacobs, Jeff Bezos, John Conway, John Markoff, John von Neumann, Kickstarter, lake wobegon effect, Laplace demon, linear programming, litecoin, load shedding, MITM: man-in-the-middle, Network effects, Parkinson's law, pattern recognition, peer-to-peer, Pierre-Simon Laplace, place-making, post-materialism, QR code, race to the bottom, random walk, recommendation engine, RFID, risk tolerance, Robert Metcalfe, Ruby on Rails, Sapir-Whorf hypothesis, Satoshi Nakamoto, security theater, semantic web, Skype, slashdot, smart meter, social intelligence, speech recognition, statistical model, Steve Jobs, Steven Pinker, Stuxnet, telemarketer, text mining, the built environment, The Death and Life of Great American Cities, The Market for Lemons, the payments system, Therac-25, too big to fail, Turing complete, Turing machine, Turing test, web application, web of trust, x509 certificate, Y2K, zero day, Zimmermann PGP

In the absence of this, it’s safer to not explicitly depend on any of it working as intended and build your security controls appropriately. The second option is to use the approach suggested earlier by Michael Ströder and try and field-qualify every version of every application that you plan to use for processing certificates to see how it’ll react to every certificate extension and feature that you care about. Obviously this is impossible to do in general because of the combinatorial explosion of certificate extensions and features (one survey that covered only SSL/TLS server certificates found 219 different combinations of just the keyUsage and basicConstraints.cA flags, including many that were completely illogical [504]), but if you’re using a very small number of features and only one or two applications and can control which versions get deployed and in which configurations they get used then it may be feasible.
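The arithmetic behind that explosion is straightforward to reproduce. The sketch below counts only the nine keyUsage bits defined in RFC 5280 plus the basicConstraints.cA flag, and treats “extension absent” as one additional state for each; that counting convention is an assumption about how such a survey might tally things, but even so the syntactically possible combinations run into the thousands, of which the 219 observed are a small yet still unmanageable subset.

    # Nine keyUsage bits defined in RFC 5280.
    KEY_USAGE_BITS = [
        "digitalSignature", "nonRepudiation", "keyEncipherment",
        "dataEncipherment", "keyAgreement", "keyCertSign",
        "cRLSign", "encipherOnly", "decipherOnly",
    ]

    key_usage_states = 2 ** len(KEY_USAGE_BITS) + 1  # every subset of bits, plus "extension absent"
    ca_states = 3                                    # cA TRUE, cA FALSE, or basicConstraints absent

    print(key_usage_states * ca_states)              # 1539 combinations from just these two extensions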