iterative process

89 results

pages: 893 words: 199,542

Structure and interpretation of computer programs by Harold Abelson, Gerald Jay Sussman, Julie Sussman


Andrew Wiles, conceptual framework, Douglas Hofstadter, Eratosthenes, Fermat's Last Theorem, Gödel, Escher, Bach, industrial robot, information retrieval, iterative process, loose coupling, probability theory / Blaise Pascal / Pierre de Fermat, Richard Stallman, Turing machine

By contrast, the second process does not grow and shrink. At each step, all we need to keep track of, for any n, are the current values of the variables product, counter, and max-count. We call this an iterative process. In general, an iterative process is one whose state can be summarized by a fixed number of state variables, together with a fixed rule that describes how the state variables should be updated as the process moves from state to state and an (optional) end test that specifies conditions under which the process should terminate. In computing n!, the number of steps required grows linearly with n. Such a process is called a linear iterative process. The contrast between the two processes can be seen in another way. In the iterative case, the program variables provide a complete description of the state of the process at any point.

It may seem disturbing that we refer to a recursive procedure such as fact-iter as generating an iterative process. However, the process really is iterative: Its state is captured completely by its three state variables, and an interpreter need keep track of only three variables in order to execute the process. One reason that the distinction between process and procedure may be confusing is that most implementations of common languages (including Ada, Pascal, and C) are designed in such a way that the interpretation of any recursive procedure consumes an amount of memory that grows with the number of procedure calls, even when the process described is, in principle, iterative. As a consequence, these languages can describe iterative processes only by resorting to special-purpose “looping constructs” such as do, repeat, until, for, and while.

The implementation of Scheme we shall consider in chapter 5 does not share this defect. It will execute an iterative process in constant space, even if the iterative process is described by a recursive procedure. An implementation with this property is called tail-recursive. With a tail-recursive implementation, iteration can be expressed using the ordinary procedure call mechanism, so that special iteration constructs are useful only as syntactic sugar. Exercise 1.9. Each of the following two procedures defines a method for adding two positive integers in terms of the procedures inc, which increments its argument by 1, and dec, which decrements its argument by 1. (define (+ a b) (if (= a 0) b (inc (+ (dec a) b)))) (define (+ a b) (if (= a 0) b (+ (dec a) (inc b)))) Using the substitution model, illustrate the process generated by each procedure in evaluating (+ 4 5).
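The contrast between the two factorial processes can be sketched outside Scheme as well. Here is a hypothetical Python rendering (not the book's code); the variables product, counter, and max-count from the passage become product, counter, and n. Note that Python, unlike the Scheme implementation described above, is not tail-recursive, so only the loop form actually runs in constant space:

```python
def fact_recursive(n):
    # Linear recursive process: the deferred multiplications build
    # up a chain whose length grows with n (stack space grows too).
    return 1 if n == 0 else n * fact_recursive(n - 1)

def fact_iter(n):
    # Linear iterative process: the complete state at any point is
    # the pair (product, counter) plus the fixed bound n, so the
    # loop runs in constant space.
    product, counter = 1, 1
    while counter <= n:
        product, counter = product * counter, counter + 1
    return product
```

Both compute the same values; the difference is in the shape of the process, not the result.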

pages: 1,387 words: 202,295

Structure and Interpretation of Computer Programs, Second Edition by Harold Abelson, Gerald Jay Sussman, Julie Sussman


Andrew Wiles, conceptual framework, Douglas Hofstadter, Eratosthenes, Gödel, Escher, Bach, industrial robot, information retrieval, iterative process, loose coupling, probability theory / Blaise Pascal / Pierre de Fermat, Richard Stallman, Turing machine, wikimedia commons

By contrast, the second process does not grow and shrink. At each step, all we need to keep track of, for any n, are the current values of the variables product, counter, and max-count. We call this an iterative process. In general, an iterative process is one whose state can be summarized by a fixed number of state variables, together with a fixed rule that describes how the state variables should be updated as the process moves from state to state and an (optional) end test that specifies conditions under which the process should terminate. In computing n!, the number of steps required grows linearly with n. Such a process is called a linear iterative process. The contrast between the two processes can be seen in another way. In the iterative case, the program variables provide a complete description of the state of the process at any point.

It may seem disturbing that we refer to a recursive procedure such as fact-iter as generating an iterative process. However, the process really is iterative: Its state is captured completely by its three state variables, and an interpreter need keep track of only three variables in order to execute the process. One reason that the distinction between process and procedure may be confusing is that most implementations of common languages (including Ada, Pascal, and C) are designed in such a way that the interpretation of any recursive procedure consumes an amount of memory that grows with the number of procedure calls, even when the process described is, in principle, iterative. As a consequence, these languages can describe iterative processes only by resorting to special-purpose “looping constructs” such as do, repeat, until, for, and while.

The implementation of Scheme we shall consider in Chapter 5 does not share this defect. It will execute an iterative process in constant space, even if the iterative process is described by a recursive procedure. An implementation with this property is called tail-recursive. With a tail-recursive implementation, iteration can be expressed using the ordinary procedure call mechanism, so that special iteration constructs are useful only as syntactic sugar. Exercise 1.9: Each of the following two procedures defines a method for adding two positive integers in terms of the procedures inc, which increments its argument by 1, and dec, which decrements its argument by 1. (define (+ a b) (if (= a 0) b (inc (+ (dec a) b)))) (define (+ a b) (if (= a 0) b (+ (dec a) (inc b)))) Using the substitution model, illustrate the process generated by each procedure in evaluating (+ 4 5).
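A sketch of how Exercise 1.9 plays out, ported to Python for illustration (this is not the book's code). The shapes of the two processes appear in the comments; note that since Python lacks tail-call optimization, the second version still consumes stack in Python even though the process it describes is iterative, which is exactly the process/procedure distinction the passage makes:

```python
def inc(x): return x + 1
def dec(x): return x - 1

def plus_recursive(a, b):
    # (inc (+ (dec a) b)): the inc calls are deferred, so the process
    # expands and then contracts:
    # (+ 4 5) -> (inc (+ 3 5)) -> ... -> (inc (inc (inc (inc 5)))) -> 9
    return b if a == 0 else inc(plus_recursive(dec(a), b))

def plus_iterative(a, b):
    # (+ (dec a) (inc b)): nothing is deferred; the complete state is
    # the pair (a, b):
    # (+ 4 5) -> (+ 3 6) -> (+ 2 7) -> (+ 1 8) -> (+ 0 9) -> 9
    return b if a == 0 else plus_iterative(dec(a), inc(b))
```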

pages: 132 words: 31,976

Getting Real by Jason Fried, David Heinemeier Hansson, Matthew Linderman, 37 Signals


call centre, collaborative editing, iterative process, John Gruber, knowledge worker, Merlin Mann, Metcalfe's law, performance metric, premature optimization, slashdot, Steve Jobs, web application

—Matt Hamer, developer and product manager, Kinja Table of contents | Essay list for this chapter | Next essay Rinse and Repeat Work in iterations Don't expect to get it right the first time. Let the app grow and speak to you. Let it morph and evolve. With web-based software there's no need to ship perfection. Design screens, use them, analyze them, and then start over again. Instead of banking on getting everything right upfront, the iterative process lets you continue to make informed decisions as you go along. Plus, you'll get an active app up and running quicker since you're not striving for perfection right out the gate. The result is real feedback and real guidance on what requires your attention. Iterations lead to liberation You don't need to aim for perfection on the first try if you know it's just going to be done again later anyway.

No One's Going to Read It I can't even count how many multi-page product specifications or business requirement documents that have languished, unread, gathering dust nearby my dev team while we coded away, discussing problems, asking questions and user testing as we went. I've even worked with developers who've spent hours writing long, descriptive emails or coding standards documents that also went unread. Webapps don't move forward with copious documentation. Software development is a constantly shifting, iterative process that involves interaction, snap decisions, and impossible-to-predict issues that crop up along the way. None of this can or should be captured on paper. Don't waste your time typing up that long visionary tome; no one's going to read it. Take consolation in the fact that if you give your product enough room to grow itself, in the end it won't resemble anything you wrote about anyway. —Gina Trapani, web developer and editor of Lifehacker, the productivity and software guide Table of contents | Essay list for this chapter | Next essay Tell Me a Quick Story Write stories, not details If you do find yourself requiring words to explain a new feature or concept, write a brief story about it.

pages: 1,758 words: 342,766

Code Complete (Developer Best Practices) by Steve McConnell


Ada Lovelace, Albert Einstein, Buckminster Fuller, call centre, choice architecture, continuous integration, data acquisition, database schema, fault tolerance, Grace Hopper, haute cuisine, if you see hoof prints, think horses—not zebras, index card, inventory management, iterative process, late fees, loose coupling, Menlo Park, place-making, premature optimization, revision control, slashdot, sorting algorithm, statistical model, Tacoma Narrows Bridge, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Turing machine, web application

The quality of the thinking that goes into a program largely determines the quality of the program, so paying attention to warnings about the quality of thinking directly affects the final product. 34.8. Iterate, Repeatedly, Again and Again Iteration is appropriate for many software-development activities. During your initial specification of a system, you work with the user through several versions of requirements until you're sure you agree on them. That's an iterative process. When you build flexibility into your process by building and delivering a system in several increments, that's an iterative process. If you use prototyping to develop several alternative solutions quickly and cheaply before crafting the final product, that's another form of iteration. Iterating on requirements is perhaps as important as any other aspect of the software-development process. Projects fail because they commit themselves to a solution before exploring alternatives.

—Scott Meyers The interface to a class should reveal as little as possible about its inner workings. As shown in Figure 5-9, a class is a lot like an iceberg: seven-eighths is under water, and you can see only the one-eighth that's above the surface. Figure 5-9. A good class interface is like the tip of an iceberg, leaving most of the class unexposed [View full size image] Designing the class interface is an iterative process just like any other aspect of design. If you don't get the interface right the first time, try a few more times until it stabilizes. If it doesn't stabilize, you need to try a different approach. An Example of Information Hiding Suppose you have a program in which each object is supposed to have a unique ID stored in a member variable called id. One design approach would be to use integers for the IDs and to store the highest ID assigned so far in a global variable called g_maxId.
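McConnell's ID example can be sketched as follows. The class and method names here are hypothetical; the point is only that callers see a narrow interface while the g_maxId-style counter stays hidden behind it:

```python
class IdAllocator:
    """The ID-assignment policy, hidden behind a narrow interface.
    Today it is an incrementing counter (the g_maxId idea); callers
    never see that, so the policy can change without touching them."""

    def __init__(self):
        self._max_id = 0

    def new_id(self):
        self._max_id += 1
        return self._max_id


class Widget:
    """A client class: it knows only that each instance gets a unique id."""
    _ids = IdAllocator()

    def __init__(self):
        self.id = Widget._ids.new_id()


a, b = Widget(), Widget()   # distinct ids, however they are produced
```

Switching to random IDs, or IDs drawn from a database sequence, would change only IdAllocator, which is the iceberg's seven-eighths under water.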

Iterate You might have had an experience in which you learned so much from writing a program that you wished you could write it again, armed with the insights you gained from writing it the first time. The same phenomenon applies to design, but the design cycles are shorter and the effects downstream are bigger, so you can afford to whirl through the design loop a few times. Design is an iterative process. You don't usually go from point A only to point B; you go from point A to point B and back to point A. As you cycle through candidate designs and try different approaches, you'll look at both high-level and low-level views. The big picture you get from working with high-level issues will help you to put the low-level details in perspective. The details you get from working with low-level issues will provide a foundation in solid reality for the high-level decisions.

pages: 416 words: 39,022

Asset and Risk Management: Risk Oriented Finance by Louis Esch, Robert Kieffer, Thierry Lopez


asset allocation, Brownian motion, business continuity plan, business process, capital asset pricing model, computer age, corporate governance, discrete time, diversified portfolio, implied volatility, index fund, interest rate derivative, iterative process, P = NP, p-value, random walk, risk/return, shareholder value, statistical model, stochastic process, transaction costs, value at risk, Wiener process, yield curve, zero-coupon bond

Let us now assume that we wish to determine a solution α with a degree of precision ε. We could stop the iterative process on the basis of the error estimation formula. These formulae, however, require a certain level of information on the derivative f′(x), information that is not easy to obtain. On the other hand, the limit precision ε_a will not generally be known beforehand. Consequently, we are running the risk of ε, the accuracy level sought, never being reached, as it is better than the limit precision ε_a (ε < ε_a). In this case, the iterative process will carry on indefinitely. This leads us to accept the following stop criterion: stop at iteration n when |x_n − x_{n−1}| < ε, or when |x_{n+1} − x_n| ≥ |x_n − x_{n−1}|. The second condition means that the iteration process is stopped when iteration n + 1 produces a variation in value no smaller than that of iteration n, i.e. when the limit precision has been reached.
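A minimal sketch of this stop criterion applied to a generic fixed-point iteration; the function g and the starting point are illustrative, not from the book:

```python
import math

def iterate(g, x0, eps=1e-12, max_steps=10_000):
    """Run x_{n+1} = g(x_n) with the two-part stop criterion above:
    stop when the latest step is smaller than eps, or when the next
    step is no smaller than the last one (limit precision reached).
    max_steps is an extra safety net against endless iteration."""
    x_prev, x = x0, g(x0)
    for _ in range(max_steps):
        step = abs(x - x_prev)
        if step < eps:
            break
        x_next = g(x)
        if abs(x_next - x) >= step:   # steps stopped shrinking
            break
        x_prev, x = x, x_next
    return x

# the fixed point of cos(x), near 0.739085...
root = iterate(math.cos, 1.0)
```

Whichever condition fires first, the returned value is as accurate as the achievable precision allows.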

In addition, if the Jacobian matrix J(x), defined by [J(x)]_ij = ∂g_j(x)/∂x_i, is such that for every x ∈ I, ||J(x)|| ≤ m with m < 1 for a compatible norm, Lipschitz's condition is satisfied. The order of convergence is defined by lim_{k→∞} ||e_{k+1}|| / ||e_k||^p = C, where C is the constant for the asymptotic error. 8.3.2 Principal methods If one chooses a constant matrix A as the value for A(x), the iterative process is the generalisation in n dimensions of the chord method. If the inverse of the Jacobian matrix of f is chosen as the value of A(x), we will obtain the generalisation in n dimensions of the Newton–Raphson method. Another approach to solving the equation f(x) = 0 involves using the i-th equation to determine the i-th component of the (k + 1)-th iterate. Therefore, for i = 1, 2, . . . , n, the following equations will be solved in succession: f_i(x_1^(k+1), . . . , x_{i−1}^(k+1), x_i, x_{i+1}^(k), . . . , x_n^(k)) = 0 with respect to x_i.
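A sketch of the n-dimensional Newton–Raphson method the passage describes, specialised to two equations so the linear solve can be done by hand with Cramer's rule. The test system and starting point are illustrative, not from the book:

```python
def newton2(f, jac, x, steps=50, tol=1e-12):
    """Newton-Raphson in two dimensions: each iteration solves
    J(x) * delta = -f(x) (here via Cramer's rule) and updates
    x <- x + delta, until the step is negligible."""
    for _ in range(steps):
        f1, f2 = f(x)
        (a, b), (c, d) = jac(x)
        det = a * d - b * c
        dx = (-f1 * d + f2 * b) / det
        dy = (-f2 * a + f1 * c) / det
        x = (x[0] + dx, x[1] + dy)
        if abs(dx) + abs(dy) < tol:
            return x
    return x

# Illustrative system: x^2 + y^2 = 2 and x = y, whose solution
# near the starting guess is (1, 1).
f = lambda p: (p[0] ** 2 + p[1] ** 2 - 2.0, p[0] - p[1])
jac = lambda p: ((2.0 * p[0], 2.0 * p[1]), (1.0, -1.0))
sol = newton2(f, jac, (2.0, 0.5))
```

Replacing jac with a fixed matrix A turns the same loop into the chord-method generalisation mentioned above.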

pages: 398 words: 31,161

Gnuplot in Action: Understanding Data With Graphs by Philipp Janert


bioinformatics, business intelligence, centre right, Debian, general-purpose programming language, iterative process, mandelbrot fractal, pattern recognition, random walk, Richard Stallman, six sigma

Here are just a few items of practical advice to get you started. You may also want to take a look at the gnuplot reference documentation for further discussion and additional features. Since the fitting algorithm is an iterative process, it’s not guaranteed to converge. If the iteration doesn’t converge, or converges to an obviously wrong solution, try to initialize the fitting parameters with better starting values. Unless the variables have been initialized explicitly, they’ll be equal to zero, which is often a particularly bad starting value. In special situations, you may also want to try hand-tuning the iteration process itself by fiddling with values of FIT_START_LAMBDA and FIT_LAMBDA_FACTOR. All fitting parameters should be of roughly equal scale. If some of the parameters differ wildly (by many orders of magnitude) from one another, the fitting function should be modified to take these factors into account explicitly.
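The role of starting values can be illustrated outside gnuplot with a plain one-parameter Gauss–Newton sketch (gnuplot's fit command itself uses a Levenberg–Marquardt-style algorithm, which the FIT_LAMBDA variables above tune; the data and model here are invented for illustration):

```python
from math import exp

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [exp(2.0 * x) for x in xs]        # synthetic data generated with c = 2

def fit_c(c0, steps=100, tol=1e-12):
    """Fit the single parameter c in the model y = exp(c*x) by
    Gauss-Newton iteration; whether (and how fast) it converges
    depends on the starting value c0."""
    c = c0
    for _ in range(steps):
        r = [exp(c * x) - y for x, y in zip(xs, ys)]   # residuals
        j = [x * exp(c * x) for x in xs]               # d(model)/dc
        step = sum(ri * ji for ri, ji in zip(r, j)) / sum(ji * ji for ji in j)
        c -= step
        if abs(step) < tol:
            break
    return c

# from a sensible starting value the iteration settles on c = 2
```

Starting far from the answer makes the early steps wild, which is exactly why initializing parameters well (rather than leaving them at zero) matters.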

It takes only one line to read and plot a data file, and most of the command syntax is straightforward and quite intuitive. Gnuplot does not require programming or any deeper understanding of its command syntax to get started. So this is the fundamental workflow of all work with gnuplot: plot, examine, repeat—until you have found out whatever you wanted to learn from the data. Gnuplot supports the iterative process model required for exploratory work perfectly. 1.3.1 Gnuplot isn’t GNU To dispel one common confusion right away: gnuplot isn’t GNU software, has nothing to do with the GNU project, and isn’t released under the GNU Public License (GPL). Gnuplot is released under a permissive open source license. Gnuplot has been around a long time—a very long time! It was started by Thomas Williams and Colin Kelley in 1986.

In general, the colors are distributed rather uniformly over the entire spectrum, because this matches up with the regularly varying function in this plot. 9.4.2 A complex figure As an example of a graph that includes a lot of fine detail, I’ve chosen a section from the edge of the Mandelbrot set. The Mandelbrot set is the set of all points in the complex plane for which a certain simple iteration process stays bounded. What’s noteworthy here is that the border between points inside the set and outside of it isn’t smooth—in fact the border is “infinitely” complicated, showing details at all levels of magnification.6 For points far from the Mandelbrot set, the iteration will diverge quickly (after just a few steps). But as we approach the border, the iteration will take many more steps before finally diverging.

pages: 410 words: 114,005

Black Box Thinking: Why Most People Never Learn From Their Mistakes--But Some Do by Matthew Syed


Alfred Russel Wallace, Arthur Eddington, Atul Gawande, Black Swan, British Empire, call centre, Captain Sullenberger Hudson, Checklist Manifesto, cognitive bias, cognitive dissonance, conceptual framework, corporate governance, credit crunch, deliberate practice, double helix, epigenetics, fear of failure, fundamental attribution error, Henri Poincaré, hindsight bias, Isaac Newton, iterative process, James Dyson, James Hargreaves, James Watt: steam engine, Joseph Schumpeter, Lean Startup, meta analysis, meta-analysis, minimum viable product, quantitative easing, randomized controlled trial, Silicon Valley, six sigma, spinning jenny, Steve Jobs, the scientific method, Thomas Kuhn: the structure of scientific revolutions, too big to fail, Toyota Production System, Wall-E, Yom Kippur War

One of the pit stops I witnessed was completed in an astonishing 1.95 seconds.* Vowles said: The secret to modern F1 is not really to do with big ticket items; it is about hundreds of thousands of small items, optimized to the nth degree. People think that things like engines are based upon high-level strategic decisions, but they are not. What is an engine except many iterations of small components? You start with a sensible design, but it is the iterative process that guides you to the best solution. Success is about creating the most effective optimization loop. I also spoke to Andy Cowell, the leader of the team that devised the engine. His attitude was a carbon copy of that of Vowles. We got our development engine up and running in late December [2012]. We didn’t design it to be car friendly. We didn’t try and figure out the perfect weight and aerodynamic design.

“A cyclone has a number of variables: size of entry, exit, angle, diameter, length: and the trying thing is that if you change one dimension, it affects all the others.” His discipline was astonishing. “I couldn’t afford a computer, so I would hand-write the results into a book,” he recalls. “In the first year alone, I conducted literally hundreds of experiments. It was a very, very thick book.” But as the intensive, iterative process gradually solved the problem of separating ultra-fine dust, Dyson came up against another problem: long pieces of hair and fluff. These were not being separated from the airflow by the cyclone dynamics. “They were just coming out of the top along with the air,” he says. “It was another huge problem and it didn’t seem as if a conventional cyclone could solve it.” The sheer scale of the problem set the stage for a second eureka moment: the dual cyclone.

A good storyline is an act of creative synthesis: bringing disparate narrative strands together in novel form. It is a crucial part of the Pixar process. But now consider what happens next. The story line is pulled apart. As the animation gets into operation, each frame, each strand of the story, each scene is subject to debate, dissent, and testing. All told, it takes around twelve thousand storyboard drawings to make one ninety-minute feature, and because of the iterative process, story teams often create more than 125,000 storyboards by the time the film is actually delivered. Monsters, Inc. is a perfect illustration of a creative idea adapted in the light of criticism. It started off with a plot centered on a middle-aged accountant who hates his job and who is given a sketchbook by his mother. As a child he had drawn some monsters in the sketchbook and that night they turn up in his bedroom, but only the accountant can see them.

pages: 396 words: 112,748

Chaos by James Gleick


Benoit Mandelbrot, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Murray Gell-Mann, Norbert Wiener, pattern recognition, Richard Feynman, Richard Feynman, Stephen Hawking, stochastic process, trade route

The Mandelbrot set became a kind of public emblem for chaos, appearing on the glossy covers of conference brochures and engineering quarterlies, forming the centerpiece of an exhibit of computer art that traveled internationally in 1985 and 1986. Its beauty was easy to feel from these pictures; harder to grasp was the meaning it had for the mathematicians who slowly understood it. Many fractal shapes can be formed by iterated processes in the complex plane, but there is just one Mandelbrot set. It started appearing, vague and spectral, when Mandelbrot tried to find a way of generalizing about a class of shapes known as Julia sets. These were invented and studied during World War I by the French mathematicians Gaston Julia and Pierre Fatou, laboring without the pictures that a computer could provide. Mandelbrot had seen their modest drawings and read their work—already obscure—when he was twenty years old.

Were the buglike, floating “molecules” isolated islands? Or were they attached to the main body by filaments too fine to be observed? It was impossible to tell. For a one-dimensional process, no one need actually resort to experimental trial. It is easy enough to establish that numbers greater than one lead to infinity and the rest do not. But in the two dimensions of the complex plane, to deduce a shape defined by an iterated process, knowing the equation is generally not enough. Unlike the traditional shapes of geometry, circles and ellipses and parabolas, the Mandelbrot set allows no shortcuts. The only way to see what kind of shape goes with a particular equation is by trial and error, and the trial-and-error style brought the explorers of this new terrain closer in spirit to Magellan than to Euclid. Joining the world of shapes to the world of numbers in this way represented a break with the past.
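The trial-and-error computation behind these pictures can be sketched in a few lines; this is the standard escape-time test (not Gleick's own code), using the conventional escape radius of 2:

```python
def escape_steps(c, limit=100):
    """Iterate z -> z*z + c from z = 0. Return the number of steps
    before |z| exceeds 2 (once it does, the orbit is known to
    diverge), or None if it stays bounded for `limit` steps --
    the usual escape-time test for Mandelbrot-set membership."""
    z = 0j
    for n in range(limit):
        z = z * z + c
        if abs(z) > 2:
            return n
    return None

# c = 0 is inside the set (the orbit stays at 0): never escapes.
# c = 2 + 2j is far outside: escapes immediately.
# c = 0.26 lies just past the set's rightmost point at 0.25:
# it escapes, but only after many iterations.
```

The escape counts are what the famous colored pictures encode: fast escapes far from the set, slow escapes near the infinitely complicated border.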

If you look at it now it seems to have passed. People don’t like it any more. In Germany they built huge apartment blocks in the Bauhaus style and people move out, they don’t like to live there. There are very deep reasons, it seems to me, in society right now to dislike some aspects of our conception of nature.” Peitgen had been helping a visitor select blowups of regions of the Mandelbrot set, Julia sets, and other complex iterative processes, all exquisitely colored. In his small California office he offered slides, large transparencies, even a Mandelbrot set calendar. “The deep enthusiasm we have has to do with this different perspective of looking at nature. What is the true aspect of the natural object? The tree, let’s say—what is important? Is it the straight line, or is it the fractal object?” At Cornell, meanwhile, John Hubbard was struggling with the demands of commerce.

pages: 287 words: 44,739

Guide to business modelling by John Tennent, Graham Friend, Economist Group


correlation coefficient, discounted cash flows, double entry bookkeeping, iterative process, purchasing power parity, RAND corporation, shareholder value, the market place, time value of money

Chart 15.10 Applying a discount rate As the discount rate increases so the NPV of the project will fall. The graph in Chart 15.11 shows the NPV for a range of discount rates. Chart 15.11 The NPV of a project with a range of discount rates The graph is a curved shape so the IRR has to be found by trial and error or interpolation between two known points. Even spreadsheets use a trial-and-error iterative process to find the breakeven point. In the example it was possible to find the point almost exactly. In spreadsheets the IRR function can be used to find the breakeven interest rate. The syntax is: =IRR(range,guess) The range is the cash flows in the model and the guess is the point near where the IRR is expected to be found. If no guess is used the formula assumes 10%. Note that the range for IRR is time 0 to time N, whereas with NPV above the range is time 1 to time N.
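The trial-and-error search the spreadsheet automates can be sketched directly. This bisection version is illustrative, not the spreadsheet's actual algorithm, and it assumes NPV falls as the rate rises, which holds for the usual outflow-then-inflows pattern:

```python
def npv(rate, cashflows):
    """Net present value of cashflows indexed from time 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Find the rate where NPV = 0 by bisection -- the same
    trial-and-error idea a spreadsheet's IRR function automates.
    Assumes NPV is positive at lo and negative at hi."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0:
            lo = mid        # rate still too low: NPV positive
        else:
            hi = mid        # rate too high: NPV negative
    return (lo + hi) / 2.0

# -100 today, +60 at the end of each of the next two years
rate = irr([-100.0, 60.0, 60.0])   # about 13.07%
```

The guess argument of =IRR plays the role of the starting bracket here: it tells the iteration roughly where to look.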

Even when a model contains no technical errors, however, it may still fail to deliver intuitive results because of a conceptual flaw. Conceptual errors These constitute a flaw in the logic, the rationale or the mechanisms depicted in the model. As the business modelling process map in Chapter 6 (Chart 6.1, page 34) indicated, developing an understanding of the logical flows and the relationships within the environment is an iterative process. The testing phase offers the modeller another opportunity to increase and test his or her understanding of the business. User errors These occur in poorly structured and badly documented models with limited checks on user inputs and inadequately trained users. The problems may arise as a result of human error, but the fundamental problem often lies with the design of the model. Types of errors 207 Allow time for testing A testing and debugging strategy must ensure the identification and removal of as many of the three types of errors as possible.

However, if a circular reference is an integral part of the design, such as in the case of interest calculations, then the spreadsheet package must be instructed to find a set of values that satisfy the circularity. The values represent an equilibrium that is effectively the solution to a set of simultaneous equations. To allow the spreadsheet to solve the circular reference, the modeller should select tools➞options➞calculation tab➞iteration. The model uses an iterative process where a range of values is used until a consistent set of results is found. Additional error handling may be required in the presence of circular references because if, for example, a #DIV/0!, #N/A! or #REF! occurs, the model will be unable to find a solution and the errors become compounded by the circularity. This is common in the case of items that influence all the financial statements such as taxation, cash and dividend calculations.
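A sketch of the kind of circular interest calculation described above, solved by the same iterate-until-stable idea the spreadsheet's Iteration option applies; the balance model and figures are illustrative:

```python
def solve_interest(opening, operating_cf, rate, tol=1e-10, max_iter=100):
    """Illustrative circularity: interest is earned on the average
    cash balance, but the closing balance itself includes that
    interest. Iterate until the numbers stop moving, i.e. until a
    consistent set of values satisfies both equations at once."""
    interest = 0.0
    for _ in range(max_iter):
        closing = opening + operating_cf + interest
        new_interest = rate * (opening + closing) / 2.0
        if abs(new_interest - interest) < tol:
            return closing, new_interest
        interest = new_interest
    raise ArithmeticError("circular reference failed to converge")

closing, interest = solve_interest(opening=1000.0, operating_cf=200.0, rate=0.05)
```

If an error value such as #DIV/0! appeared inside this loop, every subsequent pass would propagate it, which is why the passage recommends extra error handling around circular references.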

pages: 312 words: 35,664

The Mathematics of Banking and Finance by Dennis W. Cox, Michael A. A. Cox


barriers to entry, Brownian motion, call centre, correlation coefficient, inventory management, iterative process, linear programming, meta analysis, meta-analysis, P = NP, pattern recognition, random walk, traveling salesman, value at risk

[This search snippet landed in the book's back-of-book index rather than its prose. The matching entries are "iterative processes, linear programming 170" and, under "linear programming", "iterative processes 170"; the surrounding index pages (entries running from "design/approach to analysis, data" through "n!") are omitted here.]

The current cost can now be calculated as: 142.86 cost(x1) + 285.71 cost(x2) = 142.86 × 1 + 285.71 × 1 = 428.57. There are now no remaining negative opportunity costs, so the solution sets variable 1 to 142.86 and variable 2 to 285.71, with the remaining variables no longer used, since they are now zero. 17.4 THE CONCERNS WITH THE APPROACH In practice, when you input data into a system and then use some iterative process to find a better estimate of the best strategy, you are actually conducting the process laid out in this chapter – it is just that the actual work is normally embedded within a computer program. However, where possible there are real merits in carrying out the analysis manually, not least because of the relative complexity of the software solutions currently available.
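The closing arithmetic can be checked directly. A minimal sketch, assuming (as the excerpt states) that both remaining basic variables carry a unit cost of 1; the variable names are illustrative:

```python
# Check of the final objective value: two basic variables remain,
# each with a unit cost of 1, so the cost is simply their sum.
x1, x2 = 142.86, 285.71
cost = 1 * x1 + 1 * x2
print(round(cost, 2))  # 428.57
```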

pages: 372 words: 101,174

How to Create a Mind: The Secret of Human Thought Revealed by Ray Kurzweil


Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Albert Michelson, anesthesia awareness, anthropic principle, brain emulation, cellular automata, Claude Shannon: information theory, cloud computing, computer age, Dean Kamen, discovery of DNA, double helix, epigenetics, George Gilder, Google Earth, Isaac Newton, iterative process, Jacquard loom, John von Neumann, Law of Accelerating Returns, linear programming, Loebner Prize, mandelbrot fractal, Norbert Wiener, optical character recognition, pattern recognition, Peter Thiel, Ralph Waldo Emerson, random walk, Ray Kurzweil, reversible computing, self-driving car, speech recognition, Steven Pinker, strong AI, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Wall-E, Watson beat the top human players on Jeopardy!, X Prize

(For an algorithmic description of genetic algorithms, see endnote 11.) The key to a genetic algorithm is that the human designers don't directly program a solution; rather, we let one emerge through an iterative process of simulated competition and improvement. Biological evolution is smart but slow, so to enhance its intelligence we greatly speed up its ponderous pace. The computer is fast enough to simulate many generations in a matter of hours or days, and we've occasionally had them run for as long as weeks to simulate hundreds of thousands of generations. But we have to go through this iterative process only once; as soon as we have let this simulated evolution run its course, we can apply the evolved and highly refined rules to real problems in a rapid fashion. In the case of our speech recognition systems, we used them to evolve the initial topology of the network and other critical parameters.
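The competition-and-improvement loop described above can be sketched in a few lines. This is a toy instance (maximizing the number of 1-bits, the classic "OneMax" problem); the fitness function, population size, and mutation rate are illustrative assumptions, not the parameters Kurzweil's team actually evolved:

```python
import random

def fitness(bits):
    # toy objective: count of 1-bits ("OneMax")
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=100, mutation_rate=0.02, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # selection: keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = [bit ^ (rng.random() < mutation_rate)  # occasional bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # approaches the optimum of 20 after enough generations
```

As in the excerpt, the designer programs only the fitness test and the variation operators; the solution itself emerges from repeated selection.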

We then collapse the two (one-point) clusters that are closest together into a single cluster. We are thus still left with 1,024 clusters. After processing the 1,025th vector, one of those clusters now has more than one point. We keep processing points in this way, always maintaining 1,024 clusters. After we have processed all the points, we represent each multipoint cluster by the geometric center of the points in that cluster. We continue this iterative process until we have run through all the sample points. Typically we would process millions of points into 1,024 (2^10) clusters; we've also used 2,048 (2^11) or 4,096 (2^12) clusters. Each cluster is represented by one vector that is at the geometric center of all the points in that cluster. Thus the total of the distances of all the points in the cluster to the center point of the cluster is as small as possible.
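The procedure described above can be sketched directly: every new point starts as its own cluster, and whenever the cluster count exceeds k, the two clusters with the closest centroids are merged. A minimal sketch with k=4 instead of 1,024 so it stays small; the data and parameters are invented for illustration:

```python
import random

def quantize(points, k=4):
    # each cluster is a list of points; its representative is the centroid
    clusters = [[p] for p in points[:k]]

    def centroid(c):
        return tuple(sum(coord) / len(c) for coord in zip(*c))

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    for p in points[k:]:
        clusters.append([p])          # the new point is its own cluster ...
        i, j = min(                   # ... then merge the two closest clusters
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: dist2(centroid(clusters[ij[0]]), centroid(clusters[ij[1]])),
        )
        clusters[i].extend(clusters.pop(j))
    return [centroid(c) for c in clusters]

rng = random.Random(1)
pts = [(rng.gauss(cx, 0.1), rng.gauss(cy, 0.1))
       for cx, cy in [(0, 0), (0, 5), (5, 0), (5, 5)] for _ in range(50)]
rng.shuffle(pts)
centers = quantize(pts, k=4)
print(centers)  # one centroid lands near each of the four true centers
```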

pages: 193 words: 98,671

The Inmates Are Running the Asylum by Alan Cooper


Albert Einstein, delayed gratification, Donald Trump, Howard Rheingold, informal economy, iterative process, Jeff Bezos, Menlo Park, natural language processing, new economy, Robert X Cringely, Silicon Valley, Silicon Valley startup, skunkworks, Steve Jobs, Steven Pinker, telemarketer, urban planning

I'll talk a lot about goals in the next chapter, but we discover them in the same way we discover personas. We determine the relevant personas and their goals in a process of successive refinement during our initial investigation of the problem domain. Typically, we start with a reasonable approximation and quickly converge on a believable population of personas. Although this iterative process is similar to the iterative process used by software engineers during the implementation process, it is significantly different in one major respect. Iterating the design and its premises is quick and easy because we are working in paper and words. Iterating the implementation is slow and difficult because it requires code. The Cast of Characters We give every project its own cast of characters, which consists of anywhere from 3 to 12 unique personas.

They write any old program that can be built in the least time and then put it before their users. They then listen to the complaints and feedback, measure the patterns of the user's navigation clicks, change the weak parts, and then ship it again. Generally, programmers aren't thrilled about the iterative method because it means extra work for them. Typically, it's managers new to technology who like the iterative process because it relieves them of having to perform rigorous planning, thinking, and product due diligence (in other words, interaction design). Of course, it's the users who pay the dearest price. They have to suffer through one halfhearted attempt after another before they get a program that isn't too painful. Just because customer feedback improves your understanding of your product or service, you cannot then deduce that it is efficient, cheap, or even effective to toss random features at your customers and see which ones are liked and which are disliked.

Functional Programming in Scala by Paul Chiusano, Rúnar Bjarnason


domain-specific language, iterative process, loose coupling, sorting algorithm, type inference, web application

Though because this style of organization is so common in FP, we sometimes don't bother to distinguish between an ordinary functional library and a "combinator library". 7.2 Choosing data types and functions Our goal in this section is to discover a data type and a set of primitive functions for our domain, and derive some useful combinators. This will be a somewhat meandering journey. Functional design can be a messy, iterative process. We hope to show at least a stylized view of this messiness that nonetheless gives some insight into how functional design proceeds in the real world. Don't worry if you don't follow absolutely every bit of discussion throughout this process. This chapter is a bit like peering over the shoulder of someone as they think through possible designs. And because no two people approach this process the same way, the particular path we walk here might not strike you as the most natural one—perhaps it considers issues in what seems like an odd order, skips too fast or goes too slow.

And lastly, you can look at your implementation and come up with laws you expect to hold based on your implementation.17 (Footnote 17: This last way of generating laws is probably the weakest, since it can be a little too easy to just have the laws reflect the implementation, even if the implementation is buggy or requires all sorts of unusual side conditions that make composition difficult.) EXERCISE 13: Can you think of other laws that should hold for your implementation of unit, fork, and map2? Do any of them have interesting consequences? 7.2.4 Expressiveness and the limitations of an algebra Functional design is an iterative process. After you've written down your API and have at least a prototype implementation, try using it for progressively more complex or realistic scenarios. Often you'll find that these scenarios require only some combination of existing primitive or derived combinators, and this is a chance to factor out common usage patterns into other combinators; occasionally you'll find situations where your existing primitives are insufficient.

As a result, you might arrive at a design that's much better for your purposes. But even if you decide you like the existing library's solution, spending an hour or two of playing with designs and writing down some type signatures is a great way to learn more about a domain, understand the design tradeoffs, and improve your ability to think through design problems. 8.3 Choosing data types and functions In this section, we will embark on another messy, iterative process of discovering data types and a set of primitive functions and combinators for doing property-based testing. As before, this is a chance to peer over the shoulder of someone working through possible designs. The particular path we take and the library we arrive at isn't necessarily the same as what you would discover. If property-based testing is unfamiliar to you, even better; this is a chance to explore a new domain and its design space, and make your own discoveries about it.
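The core of such a property-testing design can be illustrated with two primitives: a random-input generator and a `forall` combinator that searches for a counterexample. A hand-rolled sketch in Python; the names and API are illustrative assumptions, not the Scala design the book derives:

```python
import random

def gen_int(rng, lo=-100, hi=100):
    # a primitive generator of random test inputs
    return rng.randint(lo, hi)

def forall(gen, prop, runs=200, seed=0):
    # run the property against many generated inputs,
    # reporting the first counterexample found
    rng = random.Random(seed)
    for _ in range(runs):
        x = gen(rng)
        if not prop(x):
            return ("falsified", x)
    return ("passed", runs)

def gen_list(rng):
    return [gen_int(rng) for _ in range(rng.randint(0, 5))]

# a true property: reversing a list twice gives back the original list
print(forall(gen_list, lambda xs: list(reversed(list(reversed(xs)))) == xs))

# a false property: n + n == n * n fails for most integers
print(forall(gen_int, lambda n: n + n == n * n))
```

Deriving richer combinators (sized generators, labeled properties, shrinking) from primitives like these is exactly the kind of iterative design exploration the chapter walks through.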

pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable by James Owen Weatherall


Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Asian financial crisis, bank run, Benoit Mandelbrot, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, butterfly effect, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, collateralized debt obligation, collective bargaining, dark matter, Edward Lorenz: Chaos theory, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial innovation, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, John Nash: game theory, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, martingale, new economy, Paul Lévy, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk-adjusted returns, Robert Gordon, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, The Chicago School, The Myth of the Rational Market, tulip mania, V2 rocket, volatility smile

Extreme events occur far more often than Bachelier and Osborne believed they would, and markets are wilder places than normal distributions can describe. To fully understand markets, and to model them as safely as possible, these facts must be accounted for. And Mandelbrot is singularly responsible for discovering the shortcomings of the Bachelier-Osborne approach, and for developing the mathematics necessary to study them. Getting the details right may be an ongoing project — indeed, we should never expect to finish the iterative process of improving our mathematical models — but there is no doubt that Mandelbrot took a crucially important step forward. After a decade of interest in the statistics of markets, Mandelbrot gave up on his crusade to replace normal distributions with other Lévy-stable distributions. By this time, his ideas on randomness and disorder had begun to find applications in a wide variety of other fields, from cosmology to meteorology.
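The point about extreme events can be illustrated numerically: with the same number of draws, a heavy-tailed sample produces many far-out observations while a normal sample produces essentially none. A sketch using a Pareto distribution as the heavy-tailed stand-in; the tail exponent and thresholds are illustrative choices, not Mandelbrot's actual Lévy-stable fits:

```python
import random

rng = random.Random(42)
n = 100_000
normal = [rng.normalvariate(0.0, 1.0) for _ in range(n)]
heavy = [rng.paretovariate(1.7) for _ in range(n)]   # alpha < 2: infinite variance

extreme_normal = sum(abs(x) > 5.0 for x in normal)   # beyond 5 standard deviations
extreme_heavy = sum(x > 50.0 for x in heavy)         # similarly deep into the tail
print(extreme_normal, extreme_heavy)  # the heavy-tailed count dwarfs the normal one
```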

A sledgehammer may be great for laying train rails, but you need to recognize that it won’t be very good for hammering in finishing nails on a picture frame. I believe the history that I have recounted in this book supports the closely related claims that models in finance are best thought of as tools for certain kinds of purposes, and also that these tools make sense only in the context of an iterative process of developing models and then figuring out when, why, and how they fail — so that the next generation of models are robust in ways that the older models were not. From this perspective, Bachelier represents a first volley, the initial attempt to apply new ideas from statistical physics to an entirely different set of problems. He laid the groundwork for a revolutionary way of thinking about markets.

This led to a feedback loop that wasn’t fully recognized until after the 1987 crash. As sociologist Donald MacKenzie has observed, financial models are as much the engine behind markets as they are a camera capable of describing them. This means that the markets financial models are trying to capture are a moving target. Far from undermining the usefulness of models in understanding markets, the fact that markets are constantly evolving only makes the iterative process I have emphasized more important. Suppose that Sornette’s model of market crashes is perfect for current markets. Even then, we have to remain ever vigilant. What would happen if investors around the world started using his methods to predict crashes? Would this prevent crashes from occurring? Or would it simply make them bigger, or harder to predict? I don’t think anyone knows the answer to this question, which means that it is just the kind of thing we should be studying.

pages: 343 words: 93,544

vN: The First Machine Dynasty (The Machine Dynasty Book 1) by Madeline Ashby


big-box store, iterative process, natural language processing, place-making, traveling salesman, urban planning

Dr Singh had suggested a vN variety of naan as a replacement. "You're going to be entering a deep game immersion. You won't eat for a few hours. So you'd better fuel up now." Amy re-examined the plates. They were the smart kind; if she'd asked, they would have told her how many ounces she was eating from each. But she didn't need to ask. "There's too much here," she said. "If I eat all this without having to repair myself, it'll trigger the iteration process." She leaned as far forward as the Cuddlebug would allow. "Will I have to repair myself?" "No. It'll just wear you out, that's all." "How do you know?" "I've seen it happen." Dr Singh stood. "I thought you'd be happy with the spread. Your mother says you were never allowed to eat as much as you wanted. She says you were always hungry." Amy shut her eyes. She was going to cry, and she didn't want Dr Singh or the others to see it.

Asimov's Frankenstein Complex notion isn't just an early version of Mori's Uncanny Valley hypothesis, it's a reasonable extension of the fear that when we create in our own image, we will inevitably re-create the worst parts of ourselves. In other words: "I'm fucked up – therefore my kids will be fucked up, too." When I completed the submission draft of this book, I had just finished the first year of my second Master's – a design degree in strategic foresight. So I had spent months listening to discussions about the iterative process. And I started to realize that a self-replicating species of machine wouldn't have the usual fears about its offspring repeating its signature mistakes, nor would it have that uncanny response to copying. Machines like that could consider their iterations as prototypes, and nothing more. Stephen King has a famous adage about killing your darlings, and they could do that – literally – without a flood of oxytocin or normative culture telling them different.

But even so, the book was rejected by a bunch of different publishers, and I still had to re-write the whole opening of the submission draft before the book became sale-able. David Nickle was invaluable for that – we watched A History of Violence together and suddenly everything clicked. Normally he gives me the end of all my stories, and this time he helped me see a new beginning. In short: it was an iterative process.

pages: 52 words: 14,333

Growth Hacker Marketing: A Primer on the Future of PR, Marketing, and Advertising by Ryan Holiday


Airbnb, iterative process, Kickstarter, Lean Startup, market design, minimum viable product, Paul Graham, Silicon Valley, slashdot, Steve Wozniak

I start and end with my own experiences in this book not because I am anyone special but because I think they illustrate a microcosm of the industry itself. The old way—where product development and marketing were two distinct and separate processes—has been replaced. We all find ourselves in the same position: needing to do more with less and finding, increasingly, that the old strategies no longer generate results. So in this book, I am going to take you through a new cycle, a much more fluid and iterative process. A growth hacker doesn’t see marketing as something one does, but rather as something one builds into the product itself. The product is then kick-started, shared, and optimized (with these steps repeated multiple times) on its way to massive and rapid growth. The chapters of this book follow that structure. But first, let’s make a clean break between the old and the new. What Is Growth Hacking?

pages: 205 words: 20,452

Data Mining in Time Series Databases by Mark Last, Abraham Kandel, Horst Bunke


4chan, call centre, computer vision, discrete time, information retrieval, iterative process, NP-complete, p-value, pattern recognition, random walk, sensor fusion, speech recognition, web application

For better computational efficiency a second genetic algorithm is designed using a more elaborate chromosome coding scheme of strings; see [17] for details. The time complexity becomes O(NnmpP), m << n, implying a substantial speedup. 5.2.3. Perturbation-Based Iterative Refinement The set median represents an approximation of the generalized median string. The greedy algorithms and the genetic search techniques also give approximate solutions. An approximate solution p̄ can be further improved by an iterative process of systematic perturbations. This idea was first suggested in [20], but no algorithmic details are specified there. A concrete algorithm for realizing systematic perturbations is given in [26]. For each position i, the following operations are performed: (i) Build perturbations • Substitution: Replace the i-th symbol of p̄ by each symbol of Σ in turn and choose the resulting string x with the smallest consensus error relative to S. • Insertion: Insert each symbol of Σ in turn at the i-th position of p̄ and choose the resulting string y with the smallest consensus error relative to S. • Deletion: Delete the i-th symbol of p̄ to generate z.
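The perturbation loop above can be sketched as a simple local search: starting from an approximate median, try substituting, inserting, and deleting symbols at each position, keeping any change that lowers the consensus error (the sum of edit distances to the sample set S). The sample strings and acceptance policy here are illustrative, not the exact algorithm of [26]:

```python
def edit_distance(a, b):
    # standard Levenshtein distance via dynamic programming
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[-1] + 1,            # insertion
                           prev[j - 1] + (ca != cb)))  # substitution / match
        prev = cur
    return prev[-1]

def consensus_error(p, S):
    return sum(edit_distance(p, s) for s in S)

def refine(p, S, alphabet):
    # repeat sweeps of single-symbol perturbations until no move improves
    improved = True
    while improved:
        improved = False
        for i in range(len(p) + 1):
            candidates = [p[:i] + c + p[i:] for c in alphabet]           # insertions
            if i < len(p):
                candidates += [p[:i] + c + p[i + 1:] for c in alphabet]  # substitutions
                candidates.append(p[:i] + p[i + 1:])                     # deletion
            best = min(candidates, key=lambda q: consensus_error(q, S))
            if consensus_error(best, S) < consensus_error(p, S):
                p, improved = best, True
    return p

S = ["banana", "banane", "ananas", "bananna"]
median = refine("banan", S, alphabet=set("".join(S)))
print(median, consensus_error(median, S))
```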

Note that the consensus errors of digit 6 are substantially larger than those of the other digits because of the definition of consensus error as the sum, but not the average, of the distances to all input samples. The best results are achieved by GA, followed by the dynamic approach. Except for digit 1, the greedy algorithm reveals some weakness. Looking at the median for digits 2, 3 and 6 it seems that the iterative process terminates too early, resulting in a string (digit) much shorter than it should be. The reason lies in the simple termination criterion defined in [4]. It works well for the (short) words used there, but obviously encounters difficulties in dealing with longer strings occurring in our study. At first glance, the dynamic approach needs more computation time than the greedy algorithm. But one has to take into account that the recorded time is the total time of the dynamic process of adding one sample to the existing set each time, starting from a set consisting of the first sample.

pages: 252 words: 73,131

The Inner Lives of Markets: How People Shape Them—And They Shape Us by Tim Sullivan


Airbnb, airport security, Al Roth, Andrei Shleifer, attribution theory, autonomous vehicles, barriers to entry, Brownian motion, centralized clearinghouse, clean water, conceptual framework, constrained optimization, continuous double auction, deferred acceptance, Donald Trump, Edward Glaeser, experimental subject, first-price auction, framing effect, frictionless, fundamental attribution error, George Akerlof, Goldman Sachs: Vampire Squid, helicopter parent, Internet of things, invisible hand, Isaac Newton, iterative process, Jean Tirole, Jeff Bezos, Johann Wolfgang von Goethe, John Nash: game theory, John von Neumann, Joseph Schumpeter, late fees, linear programming, Lyft, market clearing, market design, market friction, medical residency, multi-sided market, mutually assured destruction, Nash equilibrium, Occupy movement, Peter Thiel, pez dispenser, pre–internet, price mechanism, price stability, prisoner's dilemma, profit motive, proxy bid, RAND corporation, ride hailing / ride sharing, Robert Shiller, Ronald Coase, school choice, school vouchers, sealed-bid auction, second-price auction, second-price sealed-bid, sharing economy, Silicon Valley, spectrum auction, Steve Jobs, Tacoma Narrows Bridge, technoutopianism, telemarketer, The Market for Lemons, The Wisdom of Crowds, Thomas Malthus, Thorstein Veblen, trade route, transaction costs, two-sided market, uranium enrichment, Vickrey auction, winner-take-all economy

The pioneers of information economics set the profession on a path to better describing the nature of markets, which has in turn led the current generation to turn its attention outward to dabble in the design of markets and policy. A great many applied theorists and empirical economists are, together, able to match theories up to data they can use to evaluate how they perform in practice. We hope that the iterative process of theorizing, testing theories in the field, and reformulating them (a process of experimentation that we're in the midst of) will make it more likely that economics' increased influence on the world is ultimately for the better. 5 BUILDING AN AUCTION FOR EVERYTHING THE TALE OF THE ROLLER-SKATING ECONOMIST In the fall of 2006, the Japanese baseball phenom Daisuke Matsuzaka announced his interest in moving to the American big leagues.

These allocation problems all now have centralized clearinghouses, many designed with the basic deferred acceptance algorithm as their foundations. But that’s really all that Gale and Shapley provided: a conceptual framework that market designers have, for several decades now, been applying, evaluating, and refining. They’ve learned from its successes and, unfortunately, learned even more from its inevitable failures: modeling real-life exchanges is an imprecise, iterative process in which many of us find ourselves as experimental subjects. The Complicated Job of Engineering Matches Market designer Al Roth likes to use a bridge-building metaphor to explain the contrast between his own work and that of design pioneers like Shapley. Suppose you want to build a suspension bridge connecting Brooklyn and Manhattan. In confronting decisions like where to place the suspension cables and how thick each should be, you’d better have paid attention in physics class.
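The deferred-acceptance algorithm the excerpt credits to Gale and Shapley fits in a few lines. A minimal sketch with an invented toy instance (complete preference lists are assumed; the names are illustrative):

```python
def deferred_acceptance(proposer_prefs, reviewer_prefs):
    # proposer_prefs / reviewer_prefs: dict name -> ordered list of preferred names
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)          # proposers who still need a match
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                           # reviewer -> tentatively held proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                 # reviewer tentatively accepts
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])        # reviewer trades up; old proposer is free again
            match[r] = p
        else:
            free.append(p)               # rejected; will propose to the next choice
    return match

students = {"ann": ["x", "y"], "bob": ["x", "y"]}
schools = {"x": ["bob", "ann"], "y": ["ann", "bob"]}
print(deferred_acceptance(students, schools))  # {'x': 'bob', 'y': 'ann'}
```

The "deferred" part is the point: reviewers only hold offers tentatively until the last proposal arrives, which is what makes the final matching stable.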

pages: 462 words: 172,671

Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin


continuous integration, database schema, domain-specific language, Eratosthenes, finite state, Ignaz Semmelweis: hand washing, iterative process, place-making, web application, WebSocket

All the analysis functions appear first, and all the synthesis functions appear last. If you look carefully, you will notice that I reversed several of the decisions I made earlier in this chapter. For example, I inlined some extracted methods back into formatCompactedComparison, and I changed the sense of the shouldNotBeCompacted expression. This is typical. Often one refactoring leads to another that leads to the undoing of the first. Refactoring is an iterative process full of trial and error, inevitably converging on something that we feel is worthy of a professional. Conclusion And so we have satisfied the Boy Scout Rule. We have left this module a bit cleaner than we found it. Not that it wasn’t clean already. The authors had done an excellent job with it. But no module is immune from improvement, and each of us has the responsibility to leave the code a little better than we found it. 16 Refactoring SerialDate If you go to, you will find the JCommon library.

See Hungarian Notation horizontal alignment, of code, 87–88 horizontal formatting, 85–90 horizontal white space, 86 HTML, in source code, 69 Hungarian Notation (HN), 23–24, 295 Hunt, Andy, 8, 289 hybrid structures, 99 I if statements duplicate, 276 eliminating, 262 if-else chain appearing again and again, 290 eliminating, 233 ignored tests, 313 implementation duplication of, 173 encoding, 24 exposing, 94 hiding, 94 wrapping an abstraction, 11 Implementation Patterns, 3, 296 implicity, of code, 18 import lists avoiding long, 307 shortening in SerialDate, 270 imports, as hard dependencies, 307 imprecision, in code, 301 inaccurate comments, 54 inappropriate information, in comments, 286 inappropriate static methods, 296 include method, 48 inconsistency, in code, 292 inconsistent spellings, 20 incrementalism, 212–214 indent level, of a function, 35 indentation, of code, 88–89 indentation rules, 89 independent tests, 132 information inappropriate, 286 too much, 70, 291–292 informative comments, 56 inheritance hierarchy, 308 inobvious connection, between a comment and code, 70 input arguments, 41 instance variables in classes, 140 declaring, 81 hiding the declaration of, 81–82 passing as function arguments, 231 proliferation of, 140 instrumented classes, 342 insufficient tests, 313 integer argument(s) defining, 194 integrating, 224–225 integer argument functionality, moving into ArgumentMarshaler, 215–216 integer argument type, adding to Args, 212 integers, pattern of changes for, 220 IntelliJ, 26 intent explaining in code, 55 explanation of, 56–57 obscured, 295 intention-revealing function, 19 intention-revealing names, 18–19 interface(s) defining local or remote, 158–160 encoding, 24 implementing, 149–150 representing abstract concerns, 150 turning ArgumentMarshaler into, 237 well-defined, 291–292 writing, 119 internal structures, objects hiding, 97 intersection, of domains, 160 intuition, not relying on, 289 inventor of C++, 7 Inversion of Control (IoC), 157 
InvocationHandler object, 162 I/O bound, 318 isolating, from change, 149–150 isxxxArg methods, 221–222 iterative process, refactoring as, 265 J jar files, deploying derivatives and bases in, 291 Java aspects or aspect-like mechanisms, 161–166 heuristics on, 307–309 as a wordy language, 200 Java 5, improvements for concurrent development, 182–183 Java 5 Executor framework, 320–321 Java 5 VM, nonblocking solutions in, 327–328 Java AOP frameworks, 163–166 Java programmers, encoding not needed, 24 Java proxies, 161–163 Java source files, 76–77 javadocs as clutter, 276 in nonpublic code, 71 preserving formatting in, 270 in public APIs, 59 requiring for every function, 63 java.util.concurrent package, collections in, 182–183 JBoss AOP, proxies in, 163 JCommon library, 267 JCommon unit tests, 270 JDepend project, 76, 77 JDK proxy, providing persistence support, 161–163 Jeffries, Ron, 10–11, 289 jiggling strategies, 190 JNDI lookups, 157 journal comments, 63–64 JUnit, 34 JUnit framework, 252–265 Junit project, 76, 77 Just-In-Time Compiler, 180 K keyword form, of a function name, 43 L L, lower-case in variable names, 20 language design, art of programming as, 49 languages appearing to be simple, 12 level of abstraction, 2 multiple in one source file, 288 multiples in a comment, 270 last-in, first-out (LIFO) data structure, operand stack as, 324 Law of Demeter, 97–98, 306 LAZY INITIALIZATION/EVALUATION idiom, 154 LAZY-INITIALIZATION, 157 Lea, Doug, 182, 342 learning tests, 116, 118 LeBlanc’s law, 4 legacy code, 307 legal comments, 55–56 level of abstraction, 36–37 levels of detail, 99 lexicon, having a consistent, 26 lines of code duplicating, 173 width of, 85 list(s) of arguments, 43 meaning specific to programmers, 19 returning a predefined immutable, 110 literate code, 9 literate programming, 9 Literate Programming, 141 livelock, 183, 338 local comments, 69–70 local variables, 324 declaring, 292 at the top of each function, 80 lock & wait, 337, 338 locks, introducing, 185 
log4j package, 116–118 logical dependencies, 282, 298–299 LOGO language, 36 long descriptive names, 39 long names, for long scopes, 312 loop counters, single-letter names for, 25 M magic numbers obscuring intent, 295 replacing with named constants, 300–301 main function, moving construction to, 155, 156 managers, role of, 6 mandated comments, 63 manual control, over a serial ID, 272 Map adding for ArgumentMarshaler, 221 methods of, 114 maps, breaking the use of, 222–223 marshalling implementation, 214–215 meaningful context, 27–29 member variables f prefix for, 257 prefixing, 24 renaming for clarity, 259 mental mapping, avoiding, 25 messy code.

See POJOs platforms, running threaded code, 188 pleasing code, 7 pluggable thread-based code, 187 POJO system, agility provided by, 168 POJOs (Plain-Old Java Objects) creating, 187 implementing business logic, 162 separating threaded-aware code, 190 in Spring, 163 writing application domain logic, 166 polyadic argument, 40 polymorphic behavior, of functions, 296 polymorphic changes, 96–97 polymorphism, 37, 299 position markers, 67 positives as easier to understand, 258 expressing conditionals as, 302 of decisions, 301precision as the point of all naming, 30 predicates, naming, 25 preemption, breaking, 338 prefixes for member variables, 24 as useless in today’s environments, 312–313 pre-increment operator, ++, 324, 325, 326 “prequel”, this book as, 15 principle of least surprise, 288–289, 295 principles, of design, 15 PrintPrimes program, translation into Java, 141 private behavior, isolating, 148–149 private functions, 292 private method behavior, 147 problem domain names, 27 procedural code, 97 procedural shape example, 95–96 procedures, compared to objects, 101 process function, repartitioning, 319–320 process method, I/O bound, 319 processes, competing for resources, 184 processor bound, code as, 318 producer consumer execution model, 184 producer threads, 184 production environment, 127–130 productivity, decreased by messy code, 4 professional programmer, 25 professional review, of code, 268 programmers as authors, 13–14 conundrum faced by, 6 responsibility for messes, 5–6 unprofessional, 5–6 programming defined, 2 structured, 48–49 programs, getting them to work, 201 pronounceable names, 21–22 protected variables, avoiding, 80 proxies, drawbacks of, 163 public APIs, javadocs in, 59 puns, avoiding, 26–27 PUTFIELD instruction, as atomic, 325 Q queries, separating from commands, 45–46 R random jiggling, tests running, 190 range, including end-point dates in, 276 readability of clean tests, 124 of code, 76 Dave Thomas on, 9 improving using generics, 115 
readability perspective, 8 readers of code, 13–14 continuous, 184 readers-writers execution model, 184 reading clean code, 8 code from top to bottom, 37 versus writing, 14 reboots, as a lock up solution, 331 recommendations, in this book, 13 redesign, demanded by the team, 5 redundancy, of noise words, 21 redundant comments, 60–62, 272, 275, 286–287 ReentrantLock class, 183 refactored programs, as longer, 146 refactoring Args, 212 code incrementally, 172 as an iterative process, 265 putting things in to take out, 233 test code, 127 Refactoring (Fowler), 285 renaming, fear of, 30 repeatability, of concurrency bugs, 180 repeatable tests, 132 requirements, specifying, 2 resetId, byte-code generated for, 324–325 resources bound, 183 processes competing for, 184 threads agreeing on a global ordering of, 338 responsibilities counting in classes, 136 definition of, 138 identifying, 139 misplaced, 295–296, 299 splitting a program into main, 146 return codes, using exceptions instead, 103–105 reuse, 174 risk of change, reducing, 147 robust clear code, writing, 112 rough drafts, writing, 200 runnable interface, 326 run-on expressions, 295 run-on journal entries, 63–64 runtime logic, separating startup from, 154 S safety mechanisms, overridden, 289 scaling up, 157–161 scary noise, 66 schema, of a class, 194 schools of thought, about clean code, 12–13 scissors rule, in C++, 81 scope(s) defined by exceptions, 105 dummy, 90 envying, 293 expanding and indenting, 89 hierarchy in a source file, 88 limiting for data, 181 names related to the length of, 22–23, 312 of shared variables, 333 searchable names, 22–23 Second Law, of TDD, 122 sections, within functions, 36 selector arguments, avoiding, 294–295 self validating tests, 132 Semaphore class, 183 semicolon, making visible, 90 “serial number”, SerialDate using, 271 SerialDate class making it right, 270–284 naming of, 270–271 refactoring, 267–284 SerialDateTests class, 268 serialization, 272 server, threads created by, 319–321 
server application, 317–318, 343–344 server code, responsibilities of, 319 server-based locking, 329 as preferred, 332–333 with synchronized methods, 185 “Servlet” model, of Web applications, 178 Servlets, synchronization problems, 182 set functions, moving into appropriate derivatives, 232, 233–235 setArgument, changing, 232–233 setBoolean function, 217 setter methods, injecting dependencies, 157 setup strategy, 155 listing, 50–52 shape classes, 95–96 shared data, limiting access, 181 shared variables method updating, 328 reducing the scope of, 333 shotgun approach, hand-coded instrumentation as, 189 shut-down code, 186 shutdowns, graceful, 186 side effects having none, 44 names describing, 313 Simmons, Robert, 276 simple code, 10, 12 Simple Design, rules of, 171–176 simplicity, of code, 18, 19 single assert rule, 130–131 single concepts, in each test function, 131–132 Single Responsibility Principle (SRP), 15, 138–140 applying, 321 breaking, 155 as a concurrency defense principle, 181 recognizing violations of, 174 server violating, 320 Sql class violating, 147 supporting, 157 in test classes conforming to, 172 violating, 38 single value, ordered components of, 42 single-letter names, 22, 25 single-thread calculation, of throughput, 334 SINGLETON pattern, 274 small classes, 136 Smalltalk Best Practice Patterns, 296 smart programmer, 25 software project, maintenance of, 175 software systems.

pages: 556 words: 46,885

The World's First Railway System: Enterprise, Competition, and Regulation on the Railway Network in Victorian Britain by Mark Casson


banking crisis, barriers to entry, Beeching cuts, British Empire, combinatorial explosion, Corn Laws, corporate social responsibility, David Ricardo: comparative advantage, intermodal, iterative process, joint-stock company, joint-stock limited liability company, knowledge economy, linear programming, Network effects, New Urbanism, performance metric, railway mania, rent-seeking, strikebreaker, the market place, transaction costs

It is this ‘dominating’ counterfactual that is reported here. It should be emphasized that this counterfactual is not superior for every conceivable consignment of traffic, but only for a typical consignment of a certain type of traffic. The counterfactual system is developed from a ‘blank sheet of paper’—almost literally—and not by simply exploring variations to the configuration of the actual network. It is constructed using an iterative process, as explained below. To avoid the need for a full evaluation of the performance of the network after each iteration, a set of simple criteria were used to guide the initial formulation of the model. These criteria represent conditions that the counterfactual would almost certainly have to fulfil if it were to stand any chance of matching the performance of the actual system.

The destinations involved both large and small towns—because large towns typically act as hubs for small towns on the counterfactual, distinguishing between them is somewhat artificial. Finally, seven regional samples were constructed—reflecting the fact that most journeys on any railway system tend to be relatively short. Comparing the results for different samples illustrates how well the actual system served different types of traffic and different parts of the country. The counterfactual network was constructed using an iterative process. The performance of the actual system was first assessed. This proved to be a most illuminating process, indicating that the actual performance of the system for many categories of traffic was much inferior to what has often been suggested— particularly in the enthusiasts’ literature. An initial counterfactual system was then constructed, using only a limited number of local lines, and its performance compared with the actual system.

The second stage was to specify the local lines. Wherever possible, local lines feed into the hubs identified at the first stage. By concentrating interchange traffic at a limited number of hubs, the number of stops that long-distance trains need to make for connection purposes is reduced. At the same time, the power of hubs to act as a ‘one-stop shop’ for local connections is increased. An iterative process was then followed to fine-tune the interfaces between trunk and local networks. The final stage is based on the counterfactual timetable. The preparation of the timetable provides an opportunity to assess whether upgrading certain links to permit higher speeds would improve connections. Connections are improved when speeding up one link relative to others allows trains entering a hub from the accelerated link to make connections with trains that they would otherwise miss.

pages: 376 words: 110,796

Realizing Tomorrow: The Path to Private Spaceflight by Chris Dubbs, Emeline Paat-dahlstrom, Charles D. Walker


Berlin Wall, call centre, desegregation, Donald Trump, Doomsday Book, Elon Musk, high net worth, Iridium satellite, iterative process, Jeff Bezos, Mikhail Gorbachev, multiplanetary species, Richard Feynman, Ronald Reagan, Search for Extraterrestrial Intelligence, Silicon Valley, Skype, Steve Jobs, Steve Wozniak, technoutopianism, V2 rocket, X Prize, young professional

Having watched one of the CATS teams develop their rocket, Carmack then set about going through the same process. "They build a rocket for a year, go out into the desert, they press the button and hope it doesn't blow up. It really rarely works right." He wanted to follow the same "rapid iterative process" he used in developing software and apply it to his rocketry business. "The background that I came from in software is you compile and test maybe a dozen times a day. It's a cyclic thing where you try to make it right but much of the benefit you get is in the exploration of the process, not so much plan it out perfect, implement it perfect for it to work. It's an iterative process of exploring your options." Carmack taught himself aerospace engineering and became one of Armadillo's principal engineers for the project. Armadillo officially registered for the X PRIZE in October 2002 when Carmack was sure the prize was funded.

pages: 353 words: 104,146

European Founders at Work by Pedro Gairifo Santos


business intelligence, cloud computing, crowdsourcing, fear of failure, full text search, information retrieval, inventory management, iterative process, Jeff Bezos, Lean Startup, Mark Zuckerberg, natural language processing, pattern recognition, pre–internet, recommendation engine, Richard Stallman, Silicon Valley, Skype, slashdot, Steve Jobs, Steve Wozniak, subscription business, technology bubble, web application, Y Combinator

So that was a small signal that certain people had the kind of engagement with this product that they would probably pay for it. But it got me thinking around that time about monetizing it. I was actually more bothered about what happens when all these early adopters have the product, and perhaps some of them have donated generously. That’s very nice. What happens next? Is it literally just the iterative process of adding more and more features? Or is there something a bit more to this? I think over time it became very apparent that to keep up the iterative process, I needed more staff. I needed help to just keep doing this. We needed to integrate Facebook, and LinkedIn, and more recently Foursquare. To make it more of a hub than simply a Twitter client. So, yes, it didn’t take very long for me to start thinking in that way. It did take quite a while to actually move on it though, to actually start turning it into the company.

pages: 445 words: 105,255

Radical Abundance: How a Revolution in Nanotechnology Will Change Civilization by K. Eric Drexler


3D printing, additive manufacturing, agricultural Revolution, Bill Joy: nanobots, Brownian motion, carbon footprint, Cass Sunstein, conceptual framework, crowdsourcing, dark matter, double helix, failed state, global supply chain, industrial robot, iterative process, Mars Rover, means of production, Menlo Park, mutually assured destruction, New Journalism, performance metric, reversible computing, Richard Feynman, Silicon Valley, South China Sea, Thomas Malthus, V2 rocket, Vannevar Bush

These words are inscribed on the obelisk that marks his grave: “Man will not always stay on Earth; the pursuit of light and space will lead him to penetrate the bounds of the atmosphere, timidly at first but in the end to conquer the whole of solar space.”

139 Engineering with an Exploratory Twist: The structural contrasts between conventional and exploratory engineering can be laid out as follows:

Kind of engineering: Production-oriented | Exploratory
Basic purpose: provides products | provides knowledge
Basic constraint: accessible fabrication | valid modeling
Level of design: detailed specification | parametric model
Primary costs: production, operation | design, analysis
Design margins: enable robust products | enable robust analyses
Larger margins: increase costs | reduce costs

Chapter 10: The Machinery of Radical Abundance

150 successive layers will have thicknesses of 1, ½, ¼, ⅛, and so on: This neat, self-similar architecture was suggested by Ralph Merkle and improves on the functionally similar version described in Nanosystems.

153 inherently messy contact with the stuff of nature: In converting raw materials into purified feedstocks, the path to atomic precision leads through reducing molecular complexity, a natural task for conventional chemical processes like those used in industrial processes to dissolve minerals and convert their materials into simple molecular and ionic species. These can then be separated through cascade sorting processes well suited to atomically precise mechanisms, as discussed in Nanosystems.

153 with a site that binds a feedstock molecule: A mechanism downstream can probe each site to ensure that it’s full and push empties on a path that leads them around for another go. With an iterative process of this kind, the free energy requirement for reliable binding can approach the minimum required for reducing entropy, which can be thought of as the work required for compression in a configuration space with both positional and angular coordinates.
Computational chemists will note that free energy calculations involving oriented, non-solvated molecules in a rigid binding site are far less challenging than calculations that must probe the configuration space of mobile, solvated, conformationally flexible molecules.

A large slowdown factor from this base (to reduce phonon drag in bearings, for example) is compatible with a still-enormous product throughput.

154 chemical steps that prepare reactive bits of molecular structure: To work reliably with small reactive groups and fine-grained structures, placement mechanisms must be stiff enough to adequately constrain thermal fluctuations. (Appendix I and Appendix II place this requirement in perspective.)

156 each step typically must expend substantial chemical energy: As with binding, an iterated process with conditional repetition can in some instances avoid this constraint.

157 density functional methods . . . applied in conservative ways: Methods in quantum chemistry have limited accuracy and ranges of applicability that must be kept in mind when considering which methods to use and how far to trust their results. Density functional methods, for example, typically underestimate the energies of reaction transition states, and this may or may not be acceptable, depending on the intended application.

The Singularity Is Near: When Humans Transcend Biology by Ray Kurzweil


additive manufacturing, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, autonomous vehicles, Benoit Mandelbrot, Bill Joy: nanobots, bioinformatics, brain emulation, Brewster Kahle, Brownian motion, business intelligence, call centre, carbon-based life, cellular automata, Claude Shannon: information theory, complexity theory, conceptual framework, Conway's Game of Life, cosmological constant, cosmological principle, cuban missile crisis, data acquisition, Dava Sobel, David Brooks, Dean Kamen, disintermediation, double helix, Douglas Hofstadter, epigenetics, factory automation, friendly AI, George Gilder, Gödel, Escher, Bach, informal economy, information retrieval, invention of the telephone, invention of the telescope, invention of writing, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, linked data, Loebner Prize, Louis Pasteur, mandelbrot fractal, Mikhail Gorbachev, mouse model, Murray Gell-Mann, mutually assured destruction, natural language processing, Network effects, new economy, Norbert Wiener, oil shale / tar sands, optical character recognition, pattern recognition, phenotype, premature optimization, randomized controlled trial, Ray Kurzweil, remote working, reversible computing, Richard Feynman, Rodney Brooks, Search for Extraterrestrial Intelligence, semantic web, Silicon Valley, Singularitarianism, speech recognition, statistical model, stem cell, Stephen Hawking, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Kaczynski, telepresence, The Coming Technological Singularity, transaction costs, Turing machine, Turing test, Vernor Vinge, Y2K, Yogi Berra

When the improvement in the evaluation of the design creatures from one generation to the next becomes very small, we stop this iterative cycle of improvement and use the best design(s) in the last generation. (For an algorithmic description of genetic algorithms, see this note.¹⁷⁵) The key to a GA is that the human designers don't directly program a solution; rather, they let one emerge through an iterative process of simulated competition and improvement. As we discussed, biological evolution is smart but slow, so to enhance its intelligence we retain its discernment while greatly speeding up its ponderous pace. The computer is fast enough to simulate many generations in a matter of hours or days or weeks. But we have to go through this iterative process only once; once we have let this simulated evolution run its course, we can apply the evolved and highly refined rules to real problems in a rapid fashion. Like neural nets, GAs are a way to harness the subtle but profound patterns that exist in chaotic data.
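The loop described here (evaluate a population, keep the best designs, recombine and mutate, repeat until improvement stalls) can be sketched in a few lines of Python. This is a toy illustration only; the bit-string encoding, fitness function, and parameters are invented for this sketch, not taken from the book:

```python
import random

def evolve(fitness, length=16, pop_size=30, generations=200, seed=0):
    """Minimal genetic algorithm over fixed-length bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness and keep the better half as parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)   # single-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(length)        # point mutation: flip one bit
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Example: evolve an all-ones bit string ("count the ones" fitness).
best = evolve(sum)
print(sum(best))  # close to 16 after 200 generations
```

Because the better half of each generation is carried over unchanged, the best fitness never decreases from one generation to the next, which is what lets the loop stop once improvement becomes very small.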

But it certainly achieves vast levels of all of these qualities, including intelligence. With the reverse engineering of the human brain we will be able to apply the parallel, self-organizing, chaotic algorithms of human intelligence to enormously powerful computational substrates. This intelligence will then be in a position to improve its own design, both hardware and software, in a rapidly accelerating iterative process. But there still appears to be a limit. The capacity of the universe to support intelligence appears to be only about 10⁹⁰ calculations per second, as I discussed in chapter 6. There are theories such as the holographic universe that suggest the possibility of higher numbers (such as 10¹²⁰), but these levels are all decidedly finite. Of course, the capabilities of such an intelligence may appear infinite for all practical purposes to our current level of intelligence.

pages: 1,606 words: 168,061

Python Cookbook by David Beazley, Brian K. Jones


Firefox, iterative process, p-value, web application

For example:

>>> s = ' hello world \n'
>>> s = s.strip()
>>> s
'hello world'
>>>

If you needed to do something to the inner space, you would need to use another technique, such as using the replace() method or a regular expression substitution. For example:

>>> s.replace(' ', '')
'helloworld'
>>> import re
>>> re.sub('\s+', ' ', s)
'hello world'
>>>

It is often the case that you want to combine string stripping operations with some other kind of iterative processing, such as reading lines of data from a file. If so, this is one area where a generator expression can be useful. For example:

with open(filename) as f:
    lines = (line.strip() for line in f)
    for line in lines:
        ...

Here, the expression lines = (line.strip() for line in f) acts as a kind of data transform. It’s efficient because it doesn’t actually read the data into any kind of temporary list first.

_child_iter)
        return nextchild
    except StopIteration:
        self._child_iter = None
        return next(self)
# Advance to the next child and start its iteration
else:
    self._child_iter = next(self._children_iter).depth_first()
    return next(self)

The DepthFirstIterator class works in the same way as the generator version, but it’s a mess because the iterator has to maintain a lot of complex state about where it is in the iteration process. Frankly, nobody likes to write mind-bending code like that. Define your iterator as a generator and be done with it.

4.5. Iterating in Reverse

Problem

You want to iterate in reverse over a sequence.

Solution

Use the built-in reversed() function. For example:

>>> a = [1, 2, 3, 4]
>>> for x in reversed(a):
...     print(x)
...
4
3
2
1

Reversed iteration only works if the object in question has a size that can be determined or if the object implements a __reversed__() special method.
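A minimal class implementing __reversed__() might look like this; the Countdown class is a sketch for illustration, not code from the excerpt:

```python
class Countdown:
    def __init__(self, start):
        self.start = start

    def __iter__(self):
        # Forward iteration: start, start-1, ..., 1
        n = self.start
        while n > 0:
            yield n
            n -= 1

    def __reversed__(self):
        # Reverse iteration: 1, 2, ..., start
        n = 1
        while n <= self.start:
            yield n
            n += 1

print(list(Countdown(3)))            # [3, 2, 1]
print(list(reversed(Countdown(3))))  # [1, 2, 3]
```

Defining __reversed__() lets reversed() produce items lazily instead of requiring the sequence to be materialized and sized first.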

Nor does it perform any kind of validation of the inputs to check if they meet the ordering requirements. Instead, it simply examines the set of items from the front of each input sequence and emits the smallest one found. A new item from the chosen sequence is then read, and the process repeats itself until all input sequences have been fully consumed.

4.16. Replacing Infinite while Loops with an Iterator

Problem

You have code that uses a while loop to iteratively process data because it involves a function or some kind of unusual test condition that doesn’t fall into the usual iteration pattern.

Solution

A somewhat common scenario in programs involving I/O is to write code like this:

CHUNKSIZE = 8192

def reader(s):
    while True:
        data = s.recv(CHUNKSIZE)
        if data == b'':
            break
        process_data(data)

Such code can often be replaced using iter(), as follows:

def reader(s):
    for chunk in iter(lambda: s.recv(CHUNKSIZE), b''):
        process_data(chunk)

If you’re a bit skeptical that it might work, you can try a similar example involving files.

pages: 348 words: 39,850

Data Scientists at Work by Sebastian Gutierrez


Albert Einstein, algorithmic trading, bioinformatics, bitcoin, business intelligence, chief data officer, clean water, cloud computing, computer vision, continuous integration, correlation does not imply causation, crowdsourcing, data is the new oil, DevOps, domain-specific language, follow your passion, full text search, informal economy, information retrieval, Infrastructure as a Service, inventory management, iterative process, linked data, Mark Zuckerberg, microbiome, Moneyball by Michael Lewis explains big data, move fast and break things, natural language processing, Network effects, nuclear winter, optical character recognition, pattern recognition, Paul Graham, personalized medicine, Peter Thiel, pre–internet, quantitative hedge fund, quantitative trading / quantitative finance, recommendation engine, Renaissance Technologies, Richard Feynman, self-driving car, side project, Silicon Valley, Skype, software as a service, speech recognition, statistical model, Steve Jobs, stochastic process, technology bubble, text mining, the scientific method, web application

You have to think about what question you want to answer, as well as what question you can answer with the data. So many people, I think, neglect to think about how long that takes and what industry-specific knowledge, as well as knowledge of your own data that this takes. So that was an important lesson. Gutierrez: Once you arrived at the modeling stage, what was the process like? Hu: The modeling was definitely an iterative process. We started off with throwing theoretical models at it, and quickly realized that there were a lot of things we had not accounted for in the initial thinking. For example, most artists do not have all the social media networks set up and connected. So you get this unusual data artifact that, for each row of data about an artist, you only have a couple of metrics for that artist, and it varies across the whole universe of artists.

I realized it when I ran the model, and all of a sudden, all of these artists who did not have certain networks connected were showing up really low—like Kanye West did not have Facebook or a similar network connected, so his predictions were really low, and that obviously did not make any sense. We had to go back and figure out how to deal with that, so it was very much an iterative process. That was where a lot of the statistical testing comes in, and you can see that the fact that someone does not have a network connected actually does provide a lot of information. Eventually, I had to code that in—the presence of a network is one of the predictor variables. So that is one interesting and kind of unusual aspect to the music data that we discovered during the modeling process.

The chapter locations were each chosen for their unique combination of local technical data science expertise and the range of opportunities to work with mission-driven organizations tackling the world’s biggest problems. This means that we’re building on our track record of project work, where we’ve been functioning like a Data for Good consultancy, and taking the first steps to build a global Data for Good movement, where people are doing the same thing around the world in their own communities, on their own, with our help and our playbook. And of course it’s an iterative process. They’ll learn from our experience and framework, but they’ll also find ways that work better and help us improve our process. This year is going to be a really big year for us in terms of understanding this process, its impact, and helping scale it out to others. Gutierrez: Why is it important to scale up DataKind? Porway: We really feel that if we’re going to tackle the world’s biggest problems with data science, then we want as much of a movement and community as possible.

pages: 696 words: 143,736

The Age of Spiritual Machines: When Computers Exceed Human Intelligence by Ray Kurzweil


Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Buckminster Fuller, call centre, cellular automata, combinatorial explosion, complexity theory, computer age, computer vision, cosmological constant, cosmological principle, Danny Hillis, double helix, Douglas Hofstadter, first square of the chessboard / second half of the chessboard, fudge factor, George Gilder, Gödel, Escher, Bach, I think there is a world market for maybe five computers, information retrieval, invention of movable type, Isaac Newton, iterative process, Jacquard loom, John von Neumann, Lao Tzu, Law of Accelerating Returns, mandelbrot fractal, Marshall McLuhan, Menlo Park, natural language processing, Norbert Wiener, optical character recognition, pattern recognition, phenotype, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, speech recognition, Steven Pinker, Stewart Brand, stochastic process, technological singularity, Ted Kaczynski, telepresence, the medium is the message, traveling salesman, Turing machine, Turing test, Whole Earth Review, Y2K

This includes a majority stake in Advanced Investment Technologies, which runs a successful fund in which buy-and-sell decisions are made by a program combining these methods.³⁰ Evolutionary and related techniques guide a $95 billion fund managed by Barclays Global Investors, as well as funds run by Fidelity and PanAgora Asset Management. The above paradigm is called an evolutionary (sometimes called genetic) algorithm.³¹ The system designers don't directly program a solution; they let one emerge through an iterative process of simulated competition and improvement. Recall that evolution is smart but slow, so to enhance its intelligence we retain its discernment while greatly speeding up its ponderous pace. The computer is fast enough to simulate thousands of generations in a matter of hours or days or weeks. But we have only to go through this iterative process one time. Once we have let this simulated evolution run its course, we can apply the evolved and highly refined rules to real problems in a rapid fashion. Like neural nets, evolutionary algorithms are a way of harnessing the subtle but profound patterns that exist in chaotic data.

pages: 468 words: 124,573

How to Build a Billion Dollar App: Discover the Secrets of the Most Successful Entrepreneurs of Our Time by George Berkowski


Airbnb, Amazon Web Services, barriers to entry, Black Swan, business intelligence, call centre, crowdsourcing, game design, Google Glasses, Google Hangouts, Google X / Alphabet X, iterative process, Jeff Bezos, Jony Ive, Kickstarter, knowledge worker, Lean Startup, loose coupling, Mark Zuckerberg, minimum viable product, move fast and break things, Network effects, Oculus Rift, Paul Graham, self-driving car, Silicon Valley, Silicon Valley startup, Skype, Snapchat, social graph, software as a service, software is eating the world, Steve Jobs, Steven Levy, Y Combinator

We knew that, if we couldn’t generate enough interest from drivers to use the app, there wouldn’t be a large enough supply of taxis, which would mean no passengers using the service, which would then mean we didn’t have a business. The passenger app could clearly wait. Great. So Jay, Caspar and our taxi-driver cofounders – TRG (Terry, Russell and Gary) – were all full of great ideas about what features to include in the driver app to make it appeal to drivers. After a long debate we had agreed on three features – and they were baked into the very first ‘paper sketches’ of the driver app. Now came the long – and iterative – process of actually transforming those sketches into a piece of software. Every week I hosted a group of about 15 taxi drivers for breakfast at a café and restaurant called Smiths of Smithfields. They were all keen to give us a round of colourful weekly feedback: ‘These buttons are way too small, gov,’ said one driver. ‘I think your fingers are just too fat,’ replied another driver. I couldn’t disagree – he did seem to have pretty fat fingers.

We were able to build it in a matter of weeks, cutting plenty of corners along the way. We even added the ability for the customer to pay with a stored credit card (yes, we faked it again). The first time I summoned one of our test group drivers he actually drove quite a few miles to pick me up, and I have to say that I was truly wowed. I couldn’t believe it had worked.

Really Viable

Building your own prototype is a tricky and iterative process. What you are trying to do is create the bare bones of something – the very basic vision of your app – and see whether it can become something that people love. You need to get to wow as quickly, cheaply and efficiently as possible. There’s no point wasting time or money on any app that doesn’t get to wow. The point of doing it this way – using paper designs, testing the bare minimum – is to get real data, to get real validation.

pages: 170 words: 42,196

Don't Make Me Think!: A Common Sense Approach to Web Usability by Steve Krug


collective bargaining, iterative process, Silicon Valley, web application, Whole Earth Catalog

People like to think, for instance, that they can use testing to prove whether navigation system “a” is better than navigation system “b”, but you can’t. No one has the resources to set up the kind of controlled experiment you’d need. What testing can do is provide you with invaluable input which, taken together with your experience, professional judgment, and common sense, will make it easier for you to choose wisely—and with greater confidence—between “a” and “b.” > Testing is an iterative process. Testing isn’t something you do once. You make something, test it, fix it, and test it again. > Nothing beats a live audience reaction. One reason why the Marx Brothers’ movies are so wonderful is that before they started filming they would go on tour on the vaudeville circuit and perform scenes from the movie, doing five shows a day, improvising constantly and noting which lines got the best laughs.

pages: 199 words: 43,653

Hooked: How to Build Habit-Forming Products by Nir Eyal


Airbnb, AltaVista, Cass Sunstein, choice architecture, cognitive bias, cognitive dissonance, framing effect, game design, Google Glasses, Inbox Zero, invention of the telephone, iterative process, Jeff Bezos, Lean Startup, Mahatma Gandhi, Mark Zuckerberg, meta analysis, meta-analysis, Oculus Rift, Paul Buchheit, Paul Graham, Peter Thiel, QWERTY keyboard, Silicon Valley, Silicon Valley startup, Snapchat, TaskRabbit, telemarketer, Toyota Production System, Y Combinator

The process of developing successful habit-forming technologies requires patience and persistence. The Hook Model can be a helpful tool for filtering out bad ideas with low habit potential as well as a framework for identifying room for improvement in existing products. However, after the designer has formulated new hypotheses, there is no way to know which ideas will work without testing them with actual users. Building a habit-forming product is an iterative process and requires user behavior analysis and continuous experimentation. How can you implement the concepts in this book to measure your product’s effectiveness building user habits? Through my studies and discussions with entrepreneurs at today’s most successful habit-forming companies, I’ve distilled this process into what I call “Habit Testing.” It is a process inspired by the build-measure-learn methodology championed by the lean startup movement.

pages: 137 words: 44,363

Design Is a Job by Mike Monteiro


4chan, crowdsourcing, index card, iterative process, John Gruber, Kickstarter, late fees, Steve Jobs

Encourage them to stay in their own zone of expertise and they won’t attempt to hop on yours. Never apologize for what you’re not showing. By the time you’re presenting, you should be focused on presenting what you have, not making excuses for what you don’t. And you need to believe what you’re saying to convince the client of the same. If you think the work is on the way to meeting their goals then say that. Design is an iterative process, done with a client’s proper involvement at key points. The goal isn’t always to present finished work; it’s to present work at the right time. I’ve met a few designers over the years who feel like selling design is manipulation. Manipulation is convincing someone that the truth is different than what it seems. You’re familiar with the marketing phrase “Sell the sizzle, not the steak”?

pages: 202 words: 62,199

Essentialism: The Disciplined Pursuit of Less by Greg McKeown


Albert Einstein, Clayton Christensen, Daniel Kahneman / Amos Tversky, deliberate practice, double helix, endowment effect, Isaac Newton, iterative process, Jeff Bezos, Lao Tzu, loss aversion, Mahatma Gandhi, microcredit, minimum viable product, North Sea oil, Peter Thiel, Ralph Waldo Emerson, Richard Thaler, Rosa Parks, side project, Silicon Valley, Silicon Valley startup, sovereign wealth fund, Steve Jobs

We can ask ourselves, “What is the smallest amount of progress that will be useful and valuable to the essential task we are trying to get done?” I used this practice in writing this book. For example, when I was still in the exploratory mode of the book, before I’d even begun to put pen to paper (or fingers to keyboard), I would share a short idea (my minimal viable product) on Twitter. If it seemed to resonate with people there, I would write a blog piece on Harvard Business Review. Through this iterative process, which required very little effort, I was able to find where there seemed to be a connection between what I was thinking and what seemed to have the highest relevancy in other people’s lives. It is the process Pixar uses on their movies. Instead of starting with a script, they start with storyboards—or what have been described as the comic book version of a movie. They try ideas out and see what works.

pages: 222 words: 53,317

Overcomplicated: Technology at the Limits of Comprehension by Samuel Arbesman


3D printing, algorithmic trading, Anton Chekhov, Apple II, Benoit Mandelbrot, citation needed, combinatorial explosion, Danny Hillis, David Brooks, discovery of the americas, Erik Brynjolfsson, Flash crash, friendly AI, game design, Google X / Alphabet X, Googley, HyperCard, Inbox Zero, Isaac Newton, iterative process, Kevin Kelly, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, mandelbrot fractal, Minecraft, Netflix Prize, Nicholas Carr, Parkinson's law, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman: Challenger O-ring, Second Machine Age, self-driving car, software studies, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, superintelligent machines, Therac-25, Tyler Cowen: Great Stagnation, urban planning, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Y2K

Humility recognizes our own limitations but is not paralyzed by them, nor does it enshrine them. A humble approach to our technologies helps us strive to understand these human-made, messy constructions, yet still yield to our limits. And this humble approach to technology fits quite nicely with biological thinking. While at every moment an incremental approach to knowledge provides additional understanding of a system, this iterative process will always feel incomplete. And that’s okay. New York Times columnist David Brooks has noted, “Wisdom starts with epistemological modesty.” Humility, alongside an interest in the details of complex systems, can do what both fear and worship cannot: help us peer and poke around the backs of our systems, even if we never look them in the face with complete understanding. In many instances, an incomplete muddle of understanding may be the best that we can do.

pages: 189 words: 52,741

Lifestyle Entrepreneur: Live Your Dreams, Ignite Your Passions and Run Your Business From Anywhere in the World by Jesse Krieger


Airbnb, bounce rate, call centre, carbon footprint, Deng Xiaoping, financial independence, follow your passion, income inequality, iterative process, Ralph Waldo Emerson, search engine result page, Skype, software as a service, South China Sea, Steve Jobs

Ask for specific deliverables, but always ask for 1-2 creative ideas of their own. This encourages creativity, and you may be pleasantly surprised. Once the initial sketch designs are completed, I’ll look for various elements in the logo that I like and write feedback asking them to incorporate various aspects from the initial designs into a new round of logos based on my feedback. This is an iterative process where each round of designs helps clarify the idea I have in mind and informs the directions I give the designer for the next round of improvements. Generally going through this process 2-3 times gets me 80-90% of the way there, and then the final changes usually revolve around changing font styles, adjusting color schemes, and the placement of elements within the logo. Having a basic working knowledge of Photoshop allows me to try ideas out and play with the placement of elements, although describing the changes that need to be made accomplishes the same goal.

pages: 167 words: 50,652

Alternatives to Capitalism by Robin Hahnel, Erik Olin Wright


3D printing, affirmative action, crowdsourcing, inventory management, iterative process, Kickstarter, loose coupling, means of production, profit maximization, race to the bottom, transaction costs

If the proposals are rejected, households revise them.

5. Neighborhood consumption councils aggregate the approved individual consumption requests of all households in the neighborhood, append requests for whatever neighborhood public goods they want, and submit the total list as the neighborhood consumption council’s request in the planning process.
6. Higher-level federations of consumption councils make requests for whatever public goods are consumed by their membership.
7. On the basis of all of the consumption proposals along with the production proposals from worker councils, the IFB recalculates the indicative prices and, where necessary, sends proposals back to the relevant councils for revision.
8. This iterative process continues until no revisions are needed.

There are two issues that I would like to raise with this account about how household consumption planning would actually work in practice: (1) How useful is household consumption planning? (2) How marketish are “adjustments”? How Useful Is Household Consumption Planning? Robin argues that this planning process would not be especially demanding on people.

pages: 176 words: 54,784

The Subtle Art of Not Giving a F*ck: A Counterintuitive Approach to Living a Good Life by Mark Manson


false memory syndrome, fear of failure, iterative process, Parkinson's law

Just as Present Mark can look back on Past Mark’s every flaw and mistake, one day Future Mark will look back on Present Mark’s assumptions (including the contents of this book) and notice similar flaws. And that will be a good thing. Because that will mean I have grown. There’s a famous Michael Jordan quote about him failing over and over and over again, and that’s why he succeeded. Well, I’m always wrong about everything, over and over and over again, and that’s why my life improves. Growth is an endlessly iterative process. When we learn something new, we don’t go from “wrong” to “right.” Rather, we go from wrong to slightly less wrong. And when we learn something additional, we go from slightly less wrong to slightly less wrong than that, and then to even less wrong than that, and so on. We are always in the process of approaching truth and perfection without actually ever reaching truth or perfection. We shouldn’t seek to find the ultimate “right” answer for ourselves, but rather, we should seek to chip away at the ways that we’re wrong today so that we can be a little less wrong tomorrow.

pages: 307 words: 17,123

Behind the cloud: the untold story of how went from idea to billion-dollar company--and revolutionized an industry by Marc Benioff, Carlye Adler


Albert Einstein, Apple's 1984 Super Bowl advert, barriers to entry, Bay Area Rapid Transit, business continuity plan, call centre, carbon footprint, Clayton Christensen, cloud computing, corporate social responsibility, crowdsourcing, iterative process, Maui Hawaii, Nicholas Carr, platform as a service, Silicon Valley, software as a service, Steve Ballmer, Steve Jobs

In fact, we have determined a sequential process to growth that we initiated in the United States and adhere to in nearly every market we enter. The system includes entering a country, establishing a beachhead, gaining customers, earning local references, and then making hires. Next, we seek partners, build add-ons, and grow field sales. It is a system that operates as a machine with distinct cogs that work together. The best part is that it is an iterative process that works in almost all markets; or as Doug Farber, who’s built our markets in Australia and Asia, says, the ability to “rinse and repeat” is the key to global growth. Play #81: Uphold a One-Company Attitude Across Borders The Internet was making the world more homogeneous when it came to IT needs, and the services we were selling were not affected by global boundaries.

pages: 220 words: 73,451

Democratizing innovation by Eric von Hippel


additive manufacturing, correlation coefficient, Debian, hacker house, informal economy, inventory management, iterative process, James Watt: steam engine, knowledge economy, meta-analysis, Network effects, placebo effect, principal–agent problem, Richard Stallman, software patent, transaction costs, Vickrey auction

Repartitioning of Development Tasks To create the setting for a toolkit, one must partition the tasks of product development to concentrate need-related information in some and solution-related information in others. This can involve fundamental changes to the underlying architecture of a product or service. As illustration, I first discuss the repartitioning of the tasks involved in custom semiconductor chip development. Then, I show how the same principles can be applied in the less technical context of custom food design. Traditionally, fully customized integrated circuits were developed in an iterative process like that illustrated in figure 11.1. The process began with a user specifying the functions that the custom chip was to perform to a manufacturer of integrated circuits. The chip would then be designed by manufacturer employees, and an (expensive) prototype would be produced and sent to the user. Testing by the user would typically reveal faults in the chip and/or in the initial specification, responsive changes would be made, and a new prototype would be built.

pages: 411 words: 80,925

What's Mine Is Yours: How Collaborative Consumption Is Changing the Way We Live by Rachel Botsman, Roo Rogers


Airbnb, barriers to entry, Bernie Madoff, bike sharing scheme, Buckminster Fuller, carbon footprint, Cass Sunstein, collaborative consumption, collaborative economy, Community Supported Agriculture, credit crunch, crowdsourcing, dematerialisation, disintermediation, experimental economics, George Akerlof, global village, Hugh Fearnley-Whittingstall, information retrieval, iterative process, Kevin Kelly, Kickstarter, late fees, Mark Zuckerberg, market design, Menlo Park, Network effects, new economy, new new economy, out of africa, Parkinson's law, peer-to-peer lending, Ponzi scheme, pre–internet, recommendation engine, RFID, Richard Stallman, ride hailing / ride sharing, Robert Shiller, Ronald Coase, Search for Extraterrestrial Intelligence, SETI@home, Simon Kuznets, Skype, slashdot, smart grid, South of Market, San Francisco, Stewart Brand, The Nature of the Firm, The Spirit Level, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thorstein Veblen, Torches of Freedom, transaction costs, traveling salesman, ultimatum game, Victor Gruen, web of trust, women in the workforce, Zipcar

Consumption is no longer an asymmetrical activity of endless acquisition but a dynamic push and pull of giving and collaborating in order to get what you want. Along the way, the acts of collaboration and giving become an end in themselves. Collaborative Consumption shows consumers that their material wants and needs do not need to be in conflict with the responsibilities of a connected citizen. The idea of happiness being epitomized by the lone shopper surrounded by stuff becomes absurd, and happiness becomes a much broader, more iterative process. Reputation Bank Account Reputation is one of the most salient areas where the push and pull between the collective good and self-interest have real impact. Reputation is a personal reward that is intimately bound up with respecting and considering the needs of others. Undeniably, almost all of us wonder and care, at least a little bit, what other people—friends, family, coworkers, and people we have just met—think about us.

pages: 239 words: 64,812

Geek Sublime: The Beauty of Code, the Code of Beauty by Vikram Chandra


Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, British Empire, business process, conceptual framework, create, read, update, delete, crowdsourcing, East Village, European colonialism, finite state, Firefox, Flash crash, glass ceiling, Grace Hopper, haute couture, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce

The best-known assertion of this notion is the essay “Hackers and Painters” by programmer and venture capitalist Paul Graham. “Of all the different types of people I’ve known, hackers and painters are among the most alike,” writes Graham. “What hackers and painters have in common is that they’re both makers. Along with composers, architects, and writers, what hackers and painters are trying to do is make good things.”1 According to Graham, the iterative processes of programming—write, debug (discover and remove bugs, which are coding errors, mistakes), rewrite, experiment, debug, rewrite—exactly duplicate the methods of artists: “The way to create something beautiful is often to make subtle tweaks to something that already exists, or to combine existing ideas in a slightly new way … You should figure out programs as you’re writing them, just as writers and painters and architects do.”2 Attention to detail further marks good hackers with artist-like passion: All those unseen details [in a Leonardo da Vinci painting] combine to produce something that’s just stunning, like a thousand barely audible voices all singing in tune.

pages: 186 words: 50,651

Interactive Data Visualization for the Web by Scott Murray


barriers to entry, Firefox, iterative process, web application

So we usually employ the power of computation to speed things up. The increased speed enables us to work with much larger datasets of thousands or millions of values; what would have taken years of effort by hand can be mapped in a moment. Just as important, we can rapidly experiment with alternate mappings, tweaking our rules and seeing their output re-rendered immediately. This loop of write/render/evaluate is critical to the iterative process of refining a design. Sets of mapping rules function as design systems. The human hand no longer executes the visual output; the computer does. Our human role is to conceptualize, craft, and write out the rules of the system, which is then finally executed by software. Unfortunately, software (and computation generally) is extremely bad at understanding what, exactly, people want. (To be fair, many humans are also not good at this challenging task.)

pages: 202 words: 64,725

Designing Your Life: How to Build a Well-Lived, Joyful Life by Bill Burnett, Dave Evans


David Brooks, fear of failure, financial independence, game design, Haight Ashbury, invention of the printing press, iterative process, knowledge worker, market design, science of happiness, Silicon Valley, Silicon Valley startup, Skype, Steve Jobs

Knowing the current status of your health / work / play / love dashboard gives you a framework and some data about yourself, all in one place. Only you know what’s good enough or not good enough—right now. After a few more chapters and a few more tools and ideas, you may want to come back to this assessment and check the dashboard one more time, to see if anything has changed. Since life design is an iterative process of prototypes and experimentation, there are lots of on ramps and off ramps along the way. If you’re beginning to think like a designer, you will recognize that life is never done. Work is never done. Play is never done. Love and health are never done. We are only done designing our lives when we die. Until then, we’re involved in a constant iteration of the next big thing: life as we know it.

pages: 310 words: 82,592

Never Split the Difference: Negotiating as if Your Life Depended on It by Chris Voss, Tahl Raz


banking crisis, Black Swan, clean water, cognitive bias, Daniel Kahneman / Amos Tversky, Donald Trump, framing effect, friendly fire, iterative process, loss aversion, market fundamentalism, price anchoring, telemarketer, ultimatum game, uranium enrichment

All I knew about the techniques we used at the FBI was that they worked. In the twenty years I spent at the Bureau we’d designed a system that had successfully resolved almost every kidnapping we applied it to. But we didn’t have grand theories. Our techniques were the products of experiential learning; they were developed by agents in the field, negotiating through crisis and sharing stories of what succeeded and what failed. It was an iterative process, not an intellectual one, as we refined the tools we used day after day. And it was urgent. Our tools had to work, because if they didn’t someone died. But why did they work? That was the question that drew me to Harvard, to that office with Mnookin and Blum. I lacked confidence outside my narrow world. Most of all, I needed to articulate my knowledge and learn how to combine it with theirs—and they clearly had some—so I could understand, systematize, and expand it.

Data Mining: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei


bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, distributed generation, finite state, information retrieval, iterative process, knowledge worker, linked data, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, random walk, recommendation engine, RFID, semantic web, sentiment analysis, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, web application

For each of these K seeds, we find all the patterns within a ball of a size specified by τ. All the patterns in each “ball” are then fused together to generate a set of superpatterns. These superpatterns form a new pool. If the pool contains more than K patterns, the next iteration begins with this pool for the new round of random drawing. As the support set of every superpattern shrinks with each new iteration, the iteration process terminates. Note that Pattern-Fusion merges small subpatterns of a large pattern instead of incrementally-expanding patterns with single items. This gives the method an advantage to circumvent midsize patterns and progress on a path leading to a potential colossal pattern. The idea is illustrated in Figure 7.10. Each point shown in the metric space represents a core pattern. In comparison to a smaller pattern, a larger pattern has far more core patterns that are close to one another, all of which are bounded by a ball, as shown by the dotted lines.
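The random-draw-and-fuse iteration described above can be sketched in a few lines of Python. This is a toy simplification, not the book's algorithm: patterns are represented as item sets, the distance is Jaccard distance, the "fuse" step is a plain set union, and the names `k`, `tau`, and `max_iters` are illustrative.

```python
import random

def pattern_fusion(patterns, k, tau, max_iters=10, seed=0):
    """Toy sketch of the Pattern-Fusion loop: draw K seeds, fuse each
    seed's tau-ball into a superpattern, and iterate on the new pool."""
    rng = random.Random(seed)
    jaccard = lambda a, b: 1 - len(a & b) / len(a | b)
    pool = set(patterns)
    for _ in range(max_iters):
        if len(pool) <= k:          # pool small enough: stop iterating
            break
        seeds = rng.sample(sorted(pool, key=sorted), k)  # random draw of K seeds
        new_pool = set()
        for s in seeds:
            # All patterns within a "ball" of radius tau around the seed...
            ball = [p for p in pool if jaccard(s, p) <= tau]
            # ...are fused together into one superpattern.
            new_pool.add(frozenset().union(*ball))
        pool = new_pool             # superpatterns form the new pool
    return pool
```

Because each superpattern is a union over its ball, the pool shrinks quickly toward a few large candidate patterns, which mirrors the text's point about skipping midsize patterns on the way to a colossal one.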

Cross-validation techniques for accuracy estimation (described in Chapter 8) can be used to help decide when an acceptable network has been found. A number of automated techniques have been proposed that search for a “good” network structure. These typically use a hill-climbing approach that starts with an initial structure that is selectively modified. 9.2.3. Backpropagation “How does backpropagation work?” Backpropagation learns by iteratively processing a data set of training tuples, comparing the network's prediction for each tuple with the actual known target value. The target value may be the known class label of the training tuple (for classification problems) or a continuous value (for numeric prediction). For each training tuple, the weights are modified so as to minimize the mean-squared error between the network's prediction and the actual target value.
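The loop the excerpt describes, predict, compare against the known target, then modify the weights to shrink the mean-squared error, can be illustrated with a minimal one-hidden-layer network trained on XOR. The layer size, learning rate, and iteration count here are arbitrary choices for the sketch, not values from the book.

```python
import numpy as np

# Minimal backpropagation sketch: one hidden layer, trained on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # known target values

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(2000):
    # Forward pass: compute the network's prediction for each tuple.
    h = sigmoid(X @ W1 + b1)
    pred = sigmoid(h @ W2 + b2)
    err = pred - y                            # prediction vs. actual target
    losses.append(float((err ** 2).mean()))   # mean-squared error
    # Backward pass: propagate the error and modify the weights.
    d2 = err * pred * (1 - pred)
    d1 = (d2 @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
```

Tracking `losses` across iterations is the simplest way to see the "iteratively processing ... comparing ... minimize" cycle at work: the error shrinks as the weights are adjusted.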

The Partitioning Around Medoids (PAM) algorithm (see Figure 10.5 later) is a popular realization of k-medoids clustering. It tackles the problem in an iterative, greedy way. Like the k-means algorithm, the initial representative objects (called seeds) are chosen arbitrarily. We consider whether replacing a representative object by a nonrepresentative object would improve the clustering quality. All the possible replacements are tried out. The iterative process of replacing representative objects by other objects continues until the quality of the resulting clustering cannot be improved by any replacement. This quality is measured by a cost function of the average dissimilarity between an object and the representative object of its cluster. Specifically, let o1, …, ok be the current set of representative objects (i.e., medoids). To determine whether a nonrepresentative object, denoted by orandom, is a good replacement for a current medoid oj (1 ≤ j ≤ k), we calculate the distance from every object p to the closest object in the set {o1, …, oj−1, orandom, oj+1, …, ok}, and use the distance to update the cost function.
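The swap loop just described (try every replacement of a medoid by a non-medoid, keep it if the total dissimilarity drops, stop when no replacement helps) can be sketched directly. This assumes Euclidean distance and a toy dataset; the function and variable names are illustrative, not from any library.

```python
import numpy as np

def pam(points, k, seed=0):
    """Minimal PAM sketch: greedy iterative replacement of medoids."""
    rng = np.random.default_rng(seed)
    n = len(points)
    # Pairwise Euclidean distances between all objects.
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    medoids = list(rng.choice(n, size=k, replace=False))  # arbitrary seeds

    def cost(meds):
        # Total dissimilarity of each object to its closest medoid.
        return dist[:, meds].min(axis=1).sum()

    improved = True
    while improved:                      # iterate until no swap improves quality
        improved = False
        for j in range(k):               # try replacing each medoid...
            for o in range(n):           # ...with each non-representative object
                if o in medoids:
                    continue
                trial = medoids.copy()
                trial[j] = o
                if cost(trial) < cost(medoids):
                    medoids, improved = trial, True
    labels = dist[:, medoids].argmin(axis=1)
    return medoids, labels
```

Because the total cost strictly decreases at every accepted swap and there are finitely many medoid sets, the loop is guaranteed to terminate, which is why "no improving replacement" is a safe stopping test.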

pages: 728 words: 182,850

Cooking for Geeks by Jeff Potter


3D printing, A Pattern Language, carbon footprint, centre right, Community Supported Agriculture, crowdsourcing, double helix, European colonialism, fear of failure, food miles, hacker house, haute cuisine, helicopter parent, Internet Archive, iterative process, Parkinson's law, placebo effect, random walk, slashdot, stochastic process, the scientific method

Sure, to be proficient at something you do need the technical skill to be able to see where you want to go and to understand how to get there. And happy accidents do happen. However, the methodical approach is to look at A, wonder if maybe B would be better, and rework it until you have B. ("Hmm, seems a bit dull, needs a bit more zing, how about some lemon juice?") The real skill isn’t in getting to B, though: it’s in holding the memory of A in your head and judging whether B is actually an improvement. It’s an iterative process—taste, adjust, taste, adjust—with each loop either improving the dish or educating you about what guesses didn’t work out. Even the bad guesses are useful because they’ll help you build up a body of knowledge. Taste the dish. It’s your feedback mechanism both for checking if A is "good enough" and for determining if B is better than A. Don’t be afraid to burn dinner! Talking with other geeks, I realized how lucky I was as a kid to have parents who both liked to cook and made time to sit down with us every day over a good home-cooked meal.

It’s like learning to play the guitar: at first you strive just to hit the notes and play the chords, and it takes time to gain command of the basic techniques and to move on to the level where subtle improvisation and nuanced expression can occur. If your dream is to play in a band, don’t expect to get up on stage after a day or even a month; start by picking up a basic book on learning to play the guitar and practicing somewhere you’re comfortable. A beta tester for this book commented: While there are chefs with natural-born abilities, people have to be aware that learning to cook is an iterative process. They have to learn to expect not to get it right the first time, and proceed from there, doing it again and again. What about when you fubar (foobar?) a meal and can’t figure out why? Think of it like not solving a puzzle on the first try. When starting to cook, make sure you don’t pick puzzles that are too difficult. Start with simpler puzzles (recipes) that will allow you to gain the insights needed to solve the harder ones.

pages: 678 words: 216,204

The Wealth of Networks: How Social Production Transforms Markets and Freedom by Yochai Benkler


affirmative action, barriers to entry, bioinformatics, Brownian motion, call centre, Cass Sunstein, centre right, clean water, dark matter, desegregation, East Village, fear of failure, Firefox, game design, George Gilder, hiring and firing, Howard Rheingold, informal economy, invention of radio, Isaac Newton, iterative process, Jean Tirole, jimmy wales, market bubble, market clearing, Marshall McLuhan, New Journalism, optical character recognition, pattern recognition, pre–internet, price discrimination, profit maximization, profit motive, random walk, recommendation engine, regulatory arbitrage, rent-seeking, RFID, Richard Stallman, Ronald Coase, Search for Extraterrestrial Intelligence, SETI@home, shareholder value, Silicon Valley, Skype, slashdot, social software, software patent, spectrum auction, technoutopianism, The Fortune at the Bottom of the Pyramid, The Nature of the Firm, transaction costs

The fact that power law distributions of attention to Web sites result from random distributions of interests, not from formal or practical bottlenecks that cannot be worked around, means that whenever an individual chooses to search based on some mechanism other than the simplest, thinnest belief that individuals are all equally similar and dissimilar, a different type of site will emerge as highly visible. Topical sites cluster, unsurprisingly, around topical preference groups; one site does not account for all readers irrespective of their interests. We, as individuals, also go through an iterative process of assigning a likely relevance to the judgments of others. Through this process, we limit the information overload that would threaten to swamp our capacity to know; we diversify the sources of information to which we expose ourselves; and we avoid a stifling dependence on an editor whose judgments we cannot circumvent. We might spend some of our time using the most general, "human interest has some overlap" algorithm represented by Google for some things, but use political common interest, geographic or local interest, hobbyist, subject matter, or the like, to slice the universe of potential others with whose judgments we will choose to affiliate for any given search.

Without forming or requiring a formal hierarchy, and without creating single points of control, each cluster generates a set of sites that offer points of initial filtering, in ways that are still congruent with the judgments of participants in the highly connected small cluster. The process is replicated at larger and more general clusters, to the point where positions that have been synthesized "locally" and "regionally" can reach Web-wide visibility and salience. It turns out that we are not intellectual lemmings. We do not use the freedom that the network has made possible to plunge into the abyss of incoherent babble. Instead, through iterative processes of cooperative filtering and "transmission" through the high visibility nodes, the low-end thin tail turns out to be a peer-produced filter and transmission medium for a vastly larger number of speakers than was imaginable in the mass-media model. 459 The effects of the topology of the network are reinforced by the cultural forms of linking, e-mail lists, and the writable Web. The network topology literature treats every page or site as a node.

pages: 398 words: 86,855

Bad Data Handbook by Q. Ethan McCallum


Amazon Mechanical Turk, asset allocation, barriers to entry, Benoit Mandelbrot, business intelligence, cellular automata, chief data officer, cloud computing, cognitive dissonance, combinatorial explosion, conceptual framework, database schema, Firefox, Flash crash, Gini coefficient, illegal immigration, iterative process, labor-force participation, loose coupling, natural language processing, Netflix Prize, quantitative trading / quantitative finance, recommendation engine, sentiment analysis, statistical model, supply-chain management, text mining, too big to fail, web application

This was one of the areas where our data contest approach led to quite a different experience than you’d find with a more traditionally outsourced project. All of the previous preparation steps had created an extremely well-defined problem for the contestants to tackle, and on our end we couldn’t update any data or change the rules part-way through. Working with a consultant or part-time employee is a much more iterative process, because you can revise your requirements and inputs as you go. Because those changes are often costly in terms of time and resources, up-front preparation is still extremely effective. Several teams apparently tried to use external data sources to help improve their results, without much success. Their hope was that by adding in extra information about things like the geographic location where a photo was taken, they could produce better guesses about its quality.

pages: 502 words: 107,510

Natural Language Annotation for Machine Learning by James Pustejovsky, Amber Stubbs


Amazon Mechanical Turk, bioinformatics, cloud computing, computer vision, crowdsourcing, easy for humans, difficult for computers, finite state, game design, information retrieval, iterative process, natural language processing, pattern recognition, performance metric, sentiment analysis, social web, speech recognition, statistical model, text mining

In particular, we will look at: What makes a good annotation goal Where to find related research How your dataset reflects your annotation goals Preparing the data for annotators to use How much data you will need for your task What you should be able to take away from this chapter is a clear answer to the questions “What am I trying to do?”, “How am I trying to do it?”, and “Which resources best fit my needs?”. As you progress through the MATTER cycle, the answers to these questions will probably change—corpus creation is an iterative process—but having a stated goal will help keep you from getting off track. Defining Your Goal In terms of the MATTER cycle, at this point we’re right at the start of “M”—being able to clearly explain what you hope to accomplish with your corpus is the first step in creating your model. While you probably already have a good idea about what you want to do, in this section we’ll give you some pointers on how to create a goal definition that is useful and will help keep you focused in the later stages of the MATTER cycle.

pages: 411 words: 108,119

The Irrational Economist: Making Decisions in a Dangerous World by Erwann Michel-Kerjan, Paul Slovic


Andrei Shleifer, availability heuristic, bank run, Black Swan, Cass Sunstein, clean water, cognitive dissonance, collateralized debt obligation, complexity theory, conceptual framework, corporate social responsibility, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-subsidies, Daniel Kahneman / Amos Tversky, endowment effect, experimental economics, financial innovation, Fractional reserve banking, George Akerlof, hindsight bias, incomplete markets, invisible hand, Isaac Newton, iterative process, Loma Prieta earthquake, London Interbank Offered Rate, market bubble, market clearing, moral hazard, mortgage debt, placebo effect, price discrimination, price stability, RAND corporation, Richard Thaler, Robert Shiller, Ronald Reagan, statistical model, stochastic process, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, ultimatum game, University of East Anglia, urban planning

All rights reserved. Figure 6.2 provides a snapshot of the quality of the decision at a given point in time, in order to judge whether more work is needed. The practical challenge is to select decision-making methods that move the cursor in each link to the right efficiently, while periodically taking stock of the overall profile, without overshooting the optimal target. This is essentially a heuristic and iterative process, guided by intuition and decision coaching, in order to find the optimal position for each link. It is important to recognize that in the rational components of the model lie judgments and values that are behaviorally rooted and, thus, that deep biases may never fully surface or be completely eliminated. This is a key challenge for the aggregation assumption alluded to earlier. The DEF approach, illustrated in Figure 6.3, is a well-founded, proven, and practical way to integrate traditional decision analysis with behavioral insights about the psychology and sociology of choice.

pages: 364 words: 102,528

An Economist Gets Lunch: New Rules for Everyday Foodies by Tyler Cowen


agricultural Revolution, big-box store, business climate, carbon footprint, cognitive bias, cross-subsidies, East Village, food miles, guest worker program, haute cuisine, illegal immigration, informal economy, iterative process, oil shale / tar sands, out of africa, pattern recognition, Peter Singer: altruism, price discrimination, refrigerator car, The Wealth of Nations by Adam Smith, Tyler Cowen: Great Stagnation, Upton Sinclair, winner-take-all economy, women in the workforce

They are a constant challenge as to whether I have mastered various codes of Indian cooking and their lack of detail gives me room to improvise, learn, and make mistakes. Every now and then I go back to the more thorough cookbooks (another is 1,000 Indian Recipes by Neelam Batra) to learn new recipes and techniques, and then I do a batch more Indian cooking from the shorter guides. It’s an iterative process where I step back and forth between a food world where I am told what to do and a food world where I am immersed in the implicit codes of meaning and contributing to innovation within established structures. Some of your cookbooks, or more broadly your recipe sources, should have very short recipes for use in this manner. Maureen Evans posts recipes at @cookbook on Twitter and has a book called Eat Tweet.

pages: 398 words: 100,679

The Knowledge: How to Rebuild Our World From Scratch by Lewis Dartnell


agricultural Revolution, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, clean water, Dava Sobel, decarbonisation, discovery of penicillin, Dmitri Mendeleev, global village, Haber-Bosch Process, invention of movable type, invention of radio, invention of writing, iterative process, James Watt: steam engine, John Harrison: Longitude, lone genius, nuclear winter, Richard Feynman, technology bubble, the scientific method, Thomas Kuhn: the structure of scientific revolutions, trade route

Once you get fermentation, throw half of the culture away and replace with fresh flour and water in the same proportions, repeating this refill twice a day. This gives the culture more nutrients to reproduce and continually doubles the size of the microbial territory to expand into. After about a week, once you have a healthy-smelling culture reliably growing and frothing after every replenishment, like a microbial pet thriving on the feed left in its bowl, you are ready to extract some of the dough and bake bread. By running through this iterative process you have essentially created a rudimentary microbiological selection protocol—narrowing down to wild strains that can grow on the starch nutrients in the flour with the fastest cell division rates at a temperature of around 20°–30°C. Your resultant sourdough is not a pure culture of a single isolate, but actually a balanced community of lactobacillus bacteria, able to break down the complex storage molecules of the grain, and yeast living on the byproducts of the lactobacilli and releasing carbon dioxide gas to leaven the bread.

Data Mining the Web: Uncovering Patterns in Web Content, Structure, and Usage by Zdravko Markov, Daniel T. Larose


Firefox, information retrieval, Internet Archive, iterative process, natural language processing, pattern recognition, random walk, recommendation engine, semantic web, speech recognition, statistical model, William of Occam

The standard formulas for mean and standard deviation are adjusted to use the cluster membership probabilities wi as weights. Thus, the following weighted mean and standard deviation are computed:

μ_C = (Σ_{i=1}^{n} w_i x_i) / (Σ_{i=1}^{n} w_i)

σ_C² = (Σ_{i=1}^{n} w_i (x_i − μ_C)²) / (Σ_{i=1}^{n} w_i)

Note that the sums run over all values, not only over those belonging to the corresponding cluster. Thus, given a sample size n, we have an n-component weight vector for each cluster. The iterative process is similar to that of k-means; the data points are redistributed among clusters repeatedly until the process reaches a fixpoint. The k-means algorithm stops when the cluster membership does not change from one iteration to the next. k-Means uses “hard” cluster assignment, however, whereas the EM uses “soft” assignment—probability of membership. (In fact, there exist versions of k-means with soft assignment, which are special cases of EM.)
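The weighted formulas above can be sketched in a few lines (a hypothetical illustration of the formulas, not code from the book):

```python
import math

def weighted_stats(xs, ws):
    """Weighted mean and standard deviation, where ws are the
    cluster-membership probabilities; the sums run over ALL points."""
    total_w = sum(ws)
    mu = sum(w * x for w, x in zip(ws, xs)) / total_w
    var = sum(w * (x - mu) ** 2 for w, x in zip(ws, xs)) / total_w
    return mu, math.sqrt(var)

# Two points fully in the cluster, one with zero membership weight:
mu, sigma = weighted_stats([1.0, 3.0, 100.0], [1.0, 1.0, 0.0])
print(mu, sigma)  # → 2.0 1.0
```

With weights restricted to 0 or 1 this reduces to the ordinary per-cluster mean and standard deviation, which is exactly the “hard” versus “soft” assignment distinction the passage draws.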

pages: 336 words: 93,672

The Future of the Brain: Essays by the World's Leading Neuroscientists by Gary Marcus, Jeremy Freeman


23andMe, Albert Einstein, bioinformatics, bitcoin, brain emulation, cloud computing, complexity theory, computer age, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, dark matter, data acquisition, Drosophila, epigenetics, Google Glasses, iterative process, linked data, mouse model, optical character recognition, pattern recognition, personalized medicine, phenotype, race to the bottom, Richard Feynman, Ronald Reagan, semantic web, speech recognition, stem cell, Steven Pinker, supply-chain management, Turing machine, web application

Even if it is clear which kinds of measurements we want to make (for example, whole-brain calcium imaging of the larval zebrafish, two-photon imaging of multiple areas of mouse cortex), it is not clear which behaviors the organism should be performing while we collect those data, or which environment it should be experiencing. It is hard to imagine a single dataset, however massive, from which the truths we seek will emerge with only the right analysis, especially when we consider the nearly infinite set of alternative experiments we might have performed. Instead, we need an iterative process by which we move back and forth between using analytic tools to identify patterns in data and using the recovered patterns to inform and guide the next set of experiments. After many iterations, the patterns we identify may coalesce into rules and themes, perhaps even themes that extend across different systems and modalities. And with luck, we might ultimately arrive at theories of neural computation, which will shape not only the design of our experiments but also the very foundations of neuroscience.

pages: 398 words: 108,889

The Paypal Wars: Battles With Ebay, the Media, the Mafia, and the Rest of Planet Earth by Eric M. Jackson


bank run, business process, call centre, disintermediation, Elon Musk, index fund, Internet Archive, iterative process, Joseph Schumpeter, market design, Menlo Park, moral hazard, Network effects, new economy, offshore financial centre, Peter Thiel, Sand Hill Road, shareholder value, Silicon Valley, Silicon Valley startup, telemarketer, The Chicago School, Turing test

But the liberals are even worse—they always want to rely on regulation to make things better. Neither side is asking the right questions regarding the pressing needs of the day. “In our own way, at PayPal this is what we’ve been doing all along. We’ve been creating a system that enables global commerce for everyone. And we’ve been fighting the people who would do us and our users harm. It’s been a gradual, iterative process, and we’ve gotten plenty of stuff wrong along the way, but we’ve kept moving in the right direction to address these major issues while the rest of the world has been ignoring them. “And so I’d like to send a message back to planet Earth from Palo Alto. Life is good here in Palo Alto. We’ve been able to improve on many of the ways you do things. Come to Palo Alto for a visit sometime and learn something.

pages: 292 words: 94,324

How Doctors Think by Jerome Groopman


affirmative action, Atul Gawande, Daniel Kahneman / Amos Tversky, deliberate practice, fear of failure, framing effect, index card, iterative process, medical malpractice, medical residency, Menlo Park, pattern recognition, placebo effect, stem cell, theory of mind

In addition to his work with patients, Nimer oversees a large research program studying malignant blood diseases like lymphoma and leukemia. "I believe that my thinking in the clinic is helped by having a laboratory. If you do an experiment two times and you don't get results, then it doesn't make sense to do it the same way a third time. You have to ask yourself: What am I missing? How should I do it differently the next time? It is the same iterative process in the clinic. If you are taking care of someone and he is not getting better, then you have to think of a new way to treat him, not just keep giving him the same therapy. You also have to wonder whether you are missing something." This seemingly obvious set of statements is actually a profound realization, because it is much easier both psychologically and logistically for a doctor to keep treating a serious disease with a familiar therapy even when the disease is not responding.

pages: 323 words: 95,939

Present Shock: When Everything Happens Now by Douglas Rushkoff


algorithmic trading, Andrew Keen, bank run, Benoit Mandelbrot, big-box store, Black Swan, British Empire, Buckminster Fuller, cashless society, citizen journalism, clockwork universe, cognitive dissonance, Credit Default Swap, crowdsourcing, Danny Hillis, disintermediation, Donald Trump, double helix, East Village, Elliott wave, European colonialism, Extropian, facts on the ground, Flash crash, game design, global supply chain, global village, Howard Rheingold, hypertext link, Inbox Zero, invention of agriculture, invention of hypertext, invisible hand, iterative process, John Nash: game theory, Kevin Kelly, laissez-faire capitalism, Law of Accelerating Returns, loss aversion, mandelbrot fractal, Marshall McLuhan, Merlin Mann, Milgram experiment, mutually assured destruction, Network effects, New Urbanism, Nicholas Carr, Norbert Wiener, Occupy movement, passive investing, pattern recognition, peak oil, price mechanism, prisoner's dilemma, Ralph Nelson Elliott, RAND corporation, Ray Kurzweil, recommendation engine, Silicon Valley, Skype, social graph, South Sea Bubble, Steve Jobs, Steve Wozniak, Steven Pinker, Stewart Brand, supply-chain management, the medium is the message, The Wisdom of Crowds, theory of mind, Turing test, upwardly mobile, Whole Earth Catalog, WikiLeaks, Y2K

She achieved greater efficiency while also granting herself greater flow. The digital can be stacked; the human gets to live in real time. This experience is what makes us creative, intelligent, and capable of learning. As science and innovation writer Steven Johnson has shown, great ideas don’t really come out of sudden eureka moments, but after long, steady slogs through problems.31 They are slow, iterative processes. Great ideas, as Johnson explained it to a TED audience, “fade into view over long periods of time.” For instance, Charles Darwin described his discovery of evolution as a eureka moment that occurred while he was reading Malthus on a particular night in October of 1838. But Darwin’s notebooks reveal that he had the entire theory of evolution long before this moment; he simply hadn’t fully articulated it yet.

pages: 484 words: 104,873

Rise of the Robots: Technology and the Threat of a Jobless Future by Martin Ford


3D printing, additive manufacturing, Affordable Care Act / Obamacare, AI winter, algorithmic trading, Amazon Mechanical Turk, artificial general intelligence, autonomous vehicles, banking crisis, Baxter: Rethink Robotics, Bernie Madoff, Bill Joy: nanobots, call centre, Capital in the Twenty-First Century by Thomas Piketty, Chris Urmson, Clayton Christensen, clean water, cloud computing, collateralized debt obligation, computer age, debt deflation, deskilling, diversified portfolio, Erik Brynjolfsson, factory automation, financial innovation, Flash crash, Fractional reserve banking, Freestyle chess, full employment, Goldman Sachs: Vampire Squid, High speed trading, income inequality, indoor plumbing, industrial robot, informal economy, iterative process, Jaron Lanier, job automation, John Maynard Keynes: technological unemployment, John von Neumann, Khan Academy, knowledge worker, labor-force participation, labour mobility, liquidity trap, low skilled workers, low-wage service sector, Lyft, manufacturing employment, McJob, moral hazard, Narrative Science, Network effects, new economy, Nicholas Carr, Norbert Wiener, obamacare, optical character recognition, passive income, performance metric, Peter Thiel, plutocrats, post scarcity, precision agriculture, price mechanism, Ray Kurzweil, rent control, rent-seeking, reshoring, RFID, Richard Feynman, Rodney Brooks, secular stagnation, self-driving car, Silicon Valley, Silicon Valley startup, single-payer health, software is eating the world, sovereign wealth fund, speech recognition, Spread Networks laid a new fibre optics cable between New York and Chicago, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steven Levy, Steven Pinker, strong AI, Stuxnet, technological singularity, telepresence, telepresence robot, The Bell Curve by Richard Herrnstein and Charles Murray, The Coming Technological Singularity, Thomas L Friedman, too big to fail, Tyler Cowen: Great Stagnation, union organizing, Vernor Vinge, very high income, Watson beat the top human players on Jeopardy!, women in the workforce

Inevitably, we would soon share the planet with something entirely unprecedented: a genuinely alien—and superior—intellect. And that might well be only the beginning. It’s generally accepted by AI researchers that such a system would eventually be driven to direct its intelligence inward. It would focus its efforts on improving its own design, rewriting its software, or perhaps using evolutionary programming techniques to create, test, and optimize enhancements to its design. This would lead to an iterative process of “recursive improvement.” With each revision, the system would become smarter and more capable. As the cycle accelerated, the ultimate result would be an “intelligence explosion”—quite possibly culminating in a machine thousands or even millions of times smarter than any human being. As Hawking and his collaborators put it, it “would be the biggest event in human history.” If such an intelligence explosion were to occur, it would certainly have dramatic implications for humanity.

pages: 315 words: 93,522

How Music Got Free: The End of an Industry, the Turn of the Century, and the Patient Zero of Piracy by Stephen Witt


4chan, barriers to entry, Berlin Wall, big-box store, cloud computing, collaborative economy, crowdsourcing, game design, Internet Archive, invention of movable type, inventory management, iterative process, Jason Scott, job automation, late fees, mental accounting, packet switching, pattern recognition, pirate software, Ronald Reagan, security theater, sharing economy, side project, Silicon Valley, software patent, Steve Jobs, zero day

A NOTE ON SOURCES A private detective once explained to me the essence of the investigative method: “You start with a document. Then you take that document to a person, and ask them about it. Then that person tells you about another document. You repeat this process until you run out of people, or documents.” Starting with the Affinity e-zine interview quoted in this book, and following this iterative process for the next four years, I ended up with dozens of people and tens of thousands of documents. A comprehensive catalog would take pages—below is a selection. The key interview subjects for this book were Karlheinz Brandenburg, Robert Buchanan, Brad Buckles, Leonardo Chiariglione, Ernst Eberlein, Keith P. Ellison, Frank Foti, Harvey Geller, Bennie Lydell Glover, Bennie Glover, Jr., Loretta Glover, Iain Grant, Tom Grasso, Bernhard Grill, Bruce Hack, Jürgen Herre, Bruce Huckfeldt, James Johnston, Larry Kenswil, Carlos Linares, Henri Linde, Doug Morris, George Murphy, Tyler Newby, Harald Popp, Eileen Richardson, Domingo Rivera, Hilary Rosen, Johnny Ryan, Patrick Saunders, Dieter Seitzer, Jacob Stahler, Alex Stein, Simon Tai, Steve Van Buren, Terry Yates, and Elizabeth Young.

Writing Effective Use Cases by Alistair Cockburn


business process, create, read, update, delete, finite state, index card, information retrieval, iterative process, recommendation engine, Silicon Valley, web application

The most common use is when there are many asynchronous or interrupting services the user might use, which should not disturb the base use case. Often, they will be developed by different teams. These situations show up with shrink-wrapped software such as word processors, as illustrated above. The second situation is when you are writing additions to a locked requirements document. Susan Lilly writes, "You’re working on a project with an iterative process and multiple drops. You have baselined requirements for a drop. In a subsequent drop, you extend a baselined use case with new or additional functionality. You do not touch the baselined use case." If the base use case is not locked, then the extension is fragile: changing the base use case can mess up the condition mentioned in the extending use case. Be wary about using extension use cases in such situations.

pages: 325 words: 110,330

Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration by Ed Catmull, Amy Wallace


Albert Einstein, business climate, buy low sell high, complexity theory, fear of failure, Golden Gate Park, iterative process, Menlo Park, rolodex, Sand Hill Road, Silicon Valley, Silicon Valley startup, Steve Jobs, Wall-E

But think about how easy it would be for a movie about talking toys to feel derivative, sappy, or overtly merchandise-driven. Think about how off-putting a movie about rats preparing food could be, or how risky it must’ve seemed to start WALL-E with 39 dialogue-free minutes. We dare to attempt these stories, but we don’t get them right on the first pass. And this is as it should be. Creativity has to start somewhere, and we are true believers in the power of bracing, candid feedback and the iterative process—reworking, reworking, and reworking again, until a flawed story finds its throughline or a hollow character finds its soul. As I’ve discussed, first we draw storyboards of the script and then edit them together with temporary voices and music to make a crude mock-up of the film, known as reels. Then the Braintrust watches this version of the movie and discusses what’s not ringing true, what could be better, what’s not working at all.

pages: 378 words: 110,408

Peak: Secrets From the New Science of Expertise by Anders Ericsson, Robert Pool


Albert Einstein, deliberate practice, iterative process, meta-analysis, pattern recognition, randomized controlled trial, Richard Feynman, sensible shoes

Once you get to the edge of your field, you may not know exactly where you’re headed, but you know the general direction, and you have spent a good deal of your life building this ladder, so you have a good sense of what it takes to add on one more step. Researchers who study how the creative geniuses in any field—science, art, music, sports, and so on—come up with their innovations have found that it is always a long, slow, iterative process. Sometimes these pathbreakers know what they want to do but don’t know how to do it—like a painter trying to create a particular effect in the eye of the viewer—so they explore various approaches to find one that works. And sometimes they don’t know exactly where they’re going, but they recognize a problem that needs a solution or a situation that needs improving—like mathematicians trying to prove an intractable theorem—and again they try different things, guided by what has worked in the past.

pages: 366 words: 94,209

Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity by Douglas Rushkoff


3D printing, Airbnb, algorithmic trading, Amazon Mechanical Turk, Andrew Keen, bank run, banking crisis, barriers to entry, bitcoin, blockchain, Burning Man, business process, buy low sell high, California gold rush, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, centralized clearinghouse, citizen journalism, clean water, cloud computing, collaborative economy, collective bargaining, colonial exploitation, Community Supported Agriculture, corporate personhood, crowdsourcing, cryptocurrency, disintermediation, diversified portfolio, Elon Musk, Erik Brynjolfsson, ethereum blockchain, fiat currency, Firefox, Flash crash, full employment, future of work, gig economy, Gini coefficient, global supply chain, global village, Google bus, Howard Rheingold, IBM and the Holocaust, impulse control, income inequality, index fund, iterative process, Jaron Lanier, Jeff Bezos, jimmy wales, job automation, Joseph Schumpeter, Kickstarter, loss aversion, Lyft, Mark Zuckerberg, market bubble, market fundamentalism, Marshall McLuhan, means of production, medical bankruptcy, minimum viable product, Naomi Klein, Network effects, new economy, Norbert Wiener, Oculus Rift, passive investing, payday loans, peer-to-peer lending, Peter Thiel, post-industrial society, profit motive, quantitative easing, race to the bottom, recommendation engine, reserve currency, RFID, Richard Stallman, ride hailing / ride sharing, Ronald Reagan, Satoshi Nakamoto, Second Machine Age, shareholder value, sharing economy, Silicon Valley, Snapchat, social graph, software patent, Steve Jobs, TaskRabbit, trade route, transportation-network company, Turing test, Uber and Lyft, Uber for X, unpaid internship, Y Combinator, young professional, Zipcar

Indeed, the more that algorithms dominate the marketplace, the more the market begins to take on the properties of a dynamic system. It’s no longer a marketplace driven directly by supply and demand, business conditions, or commodity prices. Rather, prices, flows, and volatility are determined by the trading going on among all the algorithms. Each algorithm is a feedback loop, taking an action, observing the resulting conditions, and taking another action after that. Again, and again, and again. It’s an iterative process, in which the algorithms adjust themselves and their activity on every loop, responding less to the news on the ground than to one another. Such systems go out of control because the feedback of their own activity has become louder than the original signal. It’s like when a performer puts a microphone too close to an amplified speaker. It picks up its own feedback, sends it to the speaker, picks it up again, and sends it through again, ad infinitum.
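The runaway-feedback dynamic described here can be seen in a toy model (an illustrative sketch, not anything from the book): each round, an algorithm responds to the original signal plus an echo of its own previous output.

```python
def run_loop(signal, gain, steps):
    """Iterate a feedback loop: each output feeds back into the next input.
    With gain < 1 the echo dies out; with gain > 1 it drowns the signal."""
    output = 0.0
    history = []
    for _ in range(steps):
        output = signal + gain * output  # reacting to its own feedback
        history.append(output)
    return history

stable = run_loop(signal=1.0, gain=0.5, steps=20)   # settles near 2.0
runaway = run_loop(signal=1.0, gain=1.5, steps=20)  # grows without bound
```

The microphone-and-speaker analogy is exactly the gain > 1 case: the system's response to its own output comes to dominate the original input.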

pages: 345 words: 86,394

Frequently Asked Questions in Quantitative Finance by Paul Wilmott


Albert Einstein, asset allocation, Black-Scholes formula, Brownian motion, butterfly effect, capital asset pricing model, collateralized debt obligation, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, discrete time, diversified portfolio, Emanuel Derman, Eugene Fama: efficient market hypothesis, fixed income, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, iterative process, London Interbank Offered Rate, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, market bubble, martingale, Norbert Wiener, quantitative trading / quantitative finance, random walk, regulatory arbitrage, risk/return, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, transaction costs, urban planning, value at risk, volatility arbitrage, volatility smile, Wiener process, yield curve, zero-coupon bond

Since it is not traded on an exchange it must be priced using some mathematical model. See pages 305-325.

Expected loss: The average loss once a specified threshold has been breached. Used as a measure of Value at Risk. See page 48.

Finite difference: A numerical method for solving differential equations wherein derivatives are approximated by differences. The differential equation thus becomes a difference equation which can be solved numerically, usually by an iterative process.

Gamma: The sensitivity of an option’s delta to the underlying. Therefore it is the second derivative of an option price with respect to the underlying. See page 111.

GARCH: Generalized Auto Regressive Conditional Heteroscedasticity, an econometric model for volatility in which the current variance depends on the previous random increments.

Hedge: To reduce risk by exploiting correlations between financial instruments.
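The finite-difference entry can be made concrete with a minimal sketch (a hypothetical example, not from the book): the boundary-value problem u''(x) = 0 on [0, 1] with u(0) = 0, u(1) = 1 becomes a difference equation at each grid point, which is then solved by an iterative (Jacobi) sweep.

```python
def solve_laplace_1d(n, iters):
    """Finite-difference scheme: u'' = 0 becomes
    u[i] = (u[i-1] + u[i+1]) / 2 at each interior grid point,
    solved numerically by repeated (Jacobi) iteration."""
    u = [0.0] * (n + 1)
    u[n] = 1.0  # boundary conditions u(0) = 0, u(1) = 1
    for _ in range(iters):
        new = u[:]
        for i in range(1, n):
            new[i] = 0.5 * (u[i - 1] + u[i + 1])
        u = new
    return u

u = solve_laplace_1d(n=10, iters=2000)
# The iterates converge to the exact solution u(x) = x.
```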

pages: 597 words: 119,204

Website Optimization by Andrew B. King


AltaVista, bounce rate, don't be evil, Firefox, In Cold Blood by Truman Capote, information retrieval, iterative process, medical malpractice, Network effects, performance metric, search engine result page, second-price auction, second-price sealed-bid, semantic web, Silicon Valley, slashdot, social graph, Steve Jobs, web application

Acquire inbound links

Search engines use external factors such as inbound links, anchor text, surrounding text, and domain history, among others, to determine the relative importance of your site. Most of your rankings in search engines are determined by the number and popularity of your inbound links [19]. These concepts will come up again and again as you optimize for search-friendliness, and we'll discuss them in more detail shortly.

Step 1: Determine Your Keyword Phrases

Finding the best keyword phrases to target is an iterative process. First, start with a list of keywords that you want to target with your website. Next, expand that list by brainstorming about other phrases, looking at competitor sites and your logfiles, and including plurals, splits, stems, synonyms, and common misspellings. Then triage those phrases based on search demand and the number of result pages to find the most effective phrases. Finally, play the long tail by targeting multiword phrases to get more targeted traffic and higher conversion rates.

pages: 420 words: 124,202

The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention by William Rosen


Albert Einstein, All science is either physics or stamp collecting, barriers to entry, collective bargaining, computer age, Copley Medal, David Ricardo: comparative advantage, decarbonisation, delayed gratification, Fellow of the Royal Society, Flynn Effect, fudge factor, full employment, invisible hand, Isaac Newton, Islamic Golden Age, iterative process, Jacquard loom, James Hargreaves, James Watt: steam engine, John Harrison: Longitude, Joseph Schumpeter, Joseph-Marie Jacquard, knowledge economy, moral hazard, Network effects, Peace of Westphalia, Peter Singer: altruism, QWERTY keyboard, Ralph Waldo Emerson, rent-seeking, Ronald Coase, Simon Kuznets, spinning jenny, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, transaction costs, transcontinental railway, éminence grise

And not just a screw fastener; the reason lathes are frequently called history’s “first self-replicating machines” is that, beginning in the sixteenth century, they were used to produce their own leadscrews. A dozen inventors from all over Europe, including the Huguenots Jacques Besson and Salomon de Caus, the Italian clockmaker Torriano de Cremona, the German military engineer Konrad Keyser, and the Swede Christopher Polhem, mastered the iterative process by which a lathe could use one leadscrew to cut another, over and over again, each time achieving a higher order of accuracy. By connecting the lathe spindle and carriage to the leadscrew, the workpiece could be moved a set distance for every revolution of the spindle; if the workpiece revolved eight times while the cutting tool was moved a single inch, then eight spiral grooves would be cut on the metal for every inch: eight turns per inch.

Programming Android by Zigurd Mednieks, Laird Dornin, G. Blake Meike, Masumi Nakamura


anti-pattern, business process, conceptual framework, create, read, update, delete, database schema, Debian, domain-specific language, fault tolerance, Google Earth, interchangeable parts, iterative process, loose coupling, MVC pattern, revision control, RFID, web application

In the following text, we describe SQLite commands as they are used inside the sqlite3 command-line utility. Later we will show ways to achieve the same effects using the Android API. Although command-line SQL will not be part of the application you ship, it can certainly help to debug applications as you’re developing them. You will find that writing database code in Android is usually an iterative process of writing Java code to manipulate tables, and then peeking at created data using the command line.

SQL Data Definition Commands

Statements in the SQL language fall into two distinct categories: those used to create and modify tables—the locations where data is stored—and those used to create, read, update, and delete the data in those tables. In this section we’ll look at the former, the data definition commands:

CREATE TABLE

Developers start working with SQL by creating a table to store data.
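The same write-code-then-peek-at-data loop can be tried without an Android device, using Python's standard sqlite3 module against an in-memory database (a sketch for illustration only; the book itself uses the sqlite3 command-line utility and the Android Java API, and the table name here is invented):

```python
import sqlite3

# In-memory database: handy for experimenting with DDL statements.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE developers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO developers (name) VALUES (?)", ("Ada",))

# "Peek" at the created data, as you would from the command line:
rows = conn.execute("SELECT id, name FROM developers").fetchall()
print(rows)  # → [(1, 'Ada')]
conn.close()
```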

pages: 471 words: 124,585

The Ascent of Money: A Financial History of the World by Niall Ferguson


Admiral Zheng, Andrei Shleifer, Asian financial crisis, asset allocation, asset-backed security, Atahualpa, bank run, banking crisis, banks create money, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, BRICs, British Empire, capital asset pricing model, capital controls, Carmen Reinhart, Cass Sunstein, central bank independence, collateralized debt obligation, colonial exploitation, Corn Laws, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, currency manipulation / currency intervention, currency peg, Daniel Kahneman / Amos Tversky, deglobalization, diversification, diversified portfolio, double entry bookkeeping, Edmond Halley, Edward Glaeser, Edward Lloyd's coffeehouse, financial innovation, financial intermediation, fixed income, floating exchange rates, Fractional reserve banking, Francisco Pizarro, full employment, German hyperinflation, Hernando de Soto, high net worth, hindsight bias, Home mortgage interest deduction, Hyman Minsky, income inequality, interest rate swap, Isaac Newton, iterative process, joint-stock company, joint-stock limited liability company, Joseph Schumpeter, Kenneth Rogoff, knowledge economy, labour mobility, London Interbank Offered Rate, Long Term Capital Management, market bubble, market fundamentalism, means of production, Mikhail Gorbachev, money: store of value / unit of account / medium of exchange, moral hazard, mortgage debt, mortgage tax deduction, Naomi Klein, Nick Leeson, Northern Rock, pension reform, price anchoring, price stability, principal–agent problem, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, quantitative hedge fund, RAND corporation, random walk, rent control, rent-seeking, reserve currency, Richard Thaler, Robert Shiller, Ronald Reagan, savings glut, seigniorage, short selling, Silicon Valley, South Sea Bubble, sovereign wealth fund, spice trade, structural adjustment programs, technology bubble, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Malthus, Thorstein Veblen, too big to fail, transaction costs, value at risk, Washington Consensus, Yom Kippur War

This provides the basis for the concept of statistical significance and modern formulations of probabilities at specified confidence intervals (for example, the statement that 40 per cent of the balls in the jar are white, at a confidence interval of 95 per cent, implies that the precise value lies somewhere between 35 and 45 per cent - 40 plus or minus 5 per cent). 4. Normal distribution. It was Abraham de Moivre who showed that outcomes of any kind of iterated process could be distributed along a curve according to their variance around the mean or standard deviation. ‘Tho’ Chance produces Irregularities,’ wrote de Moivre in 1733, ‘still the Odds will be infinitely great, that in process of Time, those Irregularities will bear no proportion to recurrency of that Order which naturally results from Original Design.’ The bell curve that we encountered in Chapter 3 represents the normal distribution, in which 68.2 per cent of outcomes are within one standard deviation (plus or minus) of the mean. 5.
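De Moivre's point can be checked numerically: sums of many independent trials pile up in a bell curve, with roughly 68 per cent of outcomes falling within one standard deviation of the mean. A minimal simulation sketch in Python (the coin-flip setup and sample sizes are arbitrary choices, not from the text):

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

# Each outcome is the number of heads in 200 fair coin flips -- the
# binomial process de Moivre showed approaches the normal curve.
outcomes = [sum(random.random() < 0.5 for _ in range(200))
            for _ in range(5000)]

mean = statistics.mean(outcomes)
sd = statistics.pstdev(outcomes)

# Share of outcomes within one standard deviation of the mean; the
# normal distribution predicts about 68.2 per cent.
within_one_sd = sum(abs(x - mean) <= sd for x in outcomes) / len(outcomes)
print(f"mean={mean:.1f}  sd={sd:.1f}  within one sd: {within_one_sd:.1%}")
```

With more flips per trial, the empirical share moves closer to the theoretical 68.2 per cent, which is de Moivre's "Order which naturally results from Original Design" emerging from chance.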

pages: 1,064 words: 114,771

Tcl/Tk in a Nutshell by Paul Raines, Jeff Tranter


AltaVista, iterative process, place-making, Silicon Valley

Semicolons are another way to separate commands, in addition to newline characters:

set n {[0-9]} ;# regular expression to match a digit

Without the semicolon before the comment character, the set command would fail because it would receive too many arguments. Tcl treats "#" as an ordinary character if it is not at the beginning of a command.

A Symbolic Gesture

Much of Tcl's strength as a programming language lies in the manipulation of strings and lists. Compare the following two methods for printing each element of a list:

set cpu_types [list pentium sparc powerpc m88000 alpha mips hppa]

# "C-like" method of iterative processing
for {set i 0} {$i < [llength $cpu_types]} {incr i} {
    puts [lindex $cpu_types $i]
}

# "The Tcl Way" - using string symbols
foreach cpu $cpu_types {
    puts $cpu
}

The loop coded with for is similar to how a C program might be coded, iterating over the list by means of an integer index. The second loop, coded with foreach, is more natural for Tcl: it contains over 50 percent fewer characters, contributing to greater readability and leaving less code to maintain.

pages: 655 words: 141,257

Programming Android: Java Programming for the New Generation of Mobile Devices by Zigurd Mednieks, Laird Dornin, G. Blake Meike, Masumi Nakamura


anti-pattern, business process, conceptual framework, create, read, update, delete, database schema, Debian, domain-specific language, fault tolerance, Google Earth, interchangeable parts, iterative process, loose coupling, MVC pattern, revision control, RFID, web application

In the following text, we describe SQLite commands as they are used inside the sqlite3 command-line utility. Later we will show ways to achieve the same effects using the Android API. Although command-line SQL will not be part of the application you ship, it can certainly help to debug applications as you’re developing them. You will find that writing database code in Android is usually an iterative process of writing Java code to manipulate tables, and then peeking at created data using the command line. SQL Data Definition Commands Statements in the SQL language fall into two distinct categories: those used to create and modify tables—the locations where data is stored—and those used to create, read, update, and delete the data in those tables. In this section we’ll look at the former, the data definition commands: CREATE TABLE Developers start working with SQL by creating a table to store data.
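The same create-then-peek workflow can be sketched with Python's built-in sqlite3 module, standing in here for both the Android API and the sqlite3 command-line utility; the developers table and its columns are invented for illustration:

```python
import sqlite3

# An in-memory database stands in for an app's SQLite file.
conn = sqlite3.connect(":memory:")

# Data definition: create the table where data will be stored.
conn.execute("CREATE TABLE developers "
             "(id INTEGER PRIMARY KEY, name TEXT, language TEXT)")

# Data manipulation: create, read, update, and delete rows.
conn.execute("INSERT INTO developers (name, language) VALUES (?, ?)",
             ("Ada", "Java"))
conn.execute("UPDATE developers SET language = ? WHERE name = ?",
             ("Kotlin", "Ada"))
rows = conn.execute("SELECT name, language FROM developers").fetchall()
print(rows)

conn.execute("DELETE FROM developers WHERE name = ?", ("Ada",))
count = conn.execute("SELECT COUNT(*) FROM developers").fetchone()[0]
print(count)
```

On Android the equivalent calls go through SQLiteDatabase, but the SQL itself is the same, which is what makes debugging from the command line alongside Java code practical.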

pages: 503 words: 131,064

Liars and Outliers: How Security Holds Society Together by Bruce Schneier


airport security, barriers to entry, Berlin Wall, Bernie Madoff, Bernie Sanders, Brian Krebs, Broken windows theory, carried interest, Cass Sunstein, Chelsea Manning, corporate governance, crack epidemic, credit crunch, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, desegregation, don't be evil, Double Irish / Dutch Sandwich, Douglas Hofstadter, experimental economics, Fall of the Berlin Wall, financial deregulation, George Akerlof, hydraulic fracturing, impulse control, income inequality, invention of agriculture, invention of gunpowder, iterative process, Jean Tirole, John Nash: game theory, joint-stock company, Julian Assange, meta analysis, meta-analysis, microcredit, moral hazard, mutually assured destruction, Nate Silver, Network effects, Nick Leeson, offshore financial centre, patent troll, phenotype, pre–internet, principal–agent problem, prisoner's dilemma, profit maximization, profit motive, race to the bottom, Ralph Waldo Emerson, RAND corporation, rent-seeking, RFID, Richard Thaler, risk tolerance, Ronald Coase, security theater, shareholder value, slashdot, statistical model, Steven Pinker, Stuxnet, technological singularity, The Market for Lemons, The Nature of the Firm, The Spirit Level, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, too big to fail, traffic fines, transaction costs, ultimatum game, UNCLOS, union organizing, Vernor Vinge, WikiLeaks, World Values Survey, Y2K

Security systems are often what economists call an experiential good: something you don't understand the value of until you've already bought, installed, and experienced it.3 This holds true for other forms of societal pressure as well. If you're knowledgeable and experienced and perform a good analysis, you can make some good guesses, but it can be impossible to know the actual effects—or unintended consequences—of a particular societal pressure until you've already implemented it. This means that implementing societal pressures is always an iterative process. We try something, see how well it works, then fine-tune. Any society—a family, a business, a government—is constantly balancing its need for security with the side effects, unintended consequences, and other considerations. Can we afford this particular societal pressure system? Are our fundamental freedoms and liberties more important than more security?4 More onerous ATM security will result in fewer ATM transactions, costing a bank more than the ATM fraud.

pages: 560 words: 158,238

Fifty Degrees Below by Kim Stanley Robinson


airport security, bioinformatics, Burning Man, clean water, Donner party, full employment, invisible hand, iterative process, means of production, minimum wage unemployment, North Sea oil, Ralph Waldo Emerson, Richard Feynman, statistical model, Stephen Hawking, the scientific method

When Frank expressed doubt that any major climate mitigation was possible, either physically or politically, Wracke waved a hand. “The Corps has always done things on a big scale. Huge scale. Sometimes with huge blunders. All with the best intentions of course. That’s just the way things happen. We’re still gung-ho to try. Lots of things are reversible, in the long run. Hopefully this time around we’ll be working with better science. But, you know, it’s an iterative process. So, long story short, you get a project approved, and we’re good to go. We’ve got the expertise. The Corps’ esprit de corps is always high.” “What about budget?” Frank asked. “What about it? We’ll spend what we’re given.” “Well, but is there any kind of, you know, discretionary fund that you can tap into?” “We don’t seek funding, usually,” the general admitted. “But could you?” “Well, in tandem with a request for action.

pages: 574 words: 164,509

Superintelligence: Paths, Dangers, Strategies by Nick Bostrom


agricultural Revolution, AI winter, Albert Einstein, algorithmic trading, anthropic principle, anti-communist, artificial general intelligence, autonomous vehicles, barriers to entry, bioinformatics, brain emulation, cloud computing, combinatorial explosion, computer vision, cosmological constant, dark matter, DARPA: Urban Challenge, data acquisition, delayed gratification, demographic transition, Douglas Hofstadter, Drosophila, Elon Musk, epigenetics, fear of failure, Flash crash, Flynn Effect, friendly AI, Gödel, Escher, Bach, income inequality, industrial robot, informal economy, information retrieval, interchangeable parts, iterative process, job automation, John von Neumann, knowledge worker, Menlo Park, meta analysis, meta-analysis, mutually assured destruction, Nash equilibrium, Netflix Prize, new economy, Norbert Wiener, NP-complete, nuclear winter, optical character recognition, pattern recognition, performance metric, phenotype, prediction markets, price stability, principal–agent problem, race to the bottom, random walk, Ray Kurzweil, recommendation engine, reversible computing, social graph, speech recognition, Stanislav Petrov, statistical model, stem cell, Stephen Hawking, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, The Coming Technological Singularity, The Nature of the Firm, Thomas Kuhn: the structure of scientific revolutions, transaction costs, Turing machine, Vernor Vinge, Watson beat the top human players on Jeopardy!, World Values Survey

The idea of using learning as a means of bootstrapping a simpler system to human-level intelligence can be traced back at least to Alan Turing’s notion of a “child machine,” which he wrote about in 1950: Instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s? If this were then subjected to an appropriate course of education one would obtain the adult brain.3 Turing envisaged an iterative process to develop such a child machine: We cannot expect to find a good child machine at the first attempt. One must experiment with teaching one such machine and see how well it learns. One can then try another and see if it is better or worse. There is an obvious connection between this process and evolution…. One may hope, however, that this process will be more expeditious than evolution. The survival of the fittest is a slow method for measuring advantages.
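Turing's experiment loop (teach a machine, see how well it learns, try another, keep the better one) is essentially generate-and-test. A toy sketch of that loop, in which the "machine" is a parameter vector and the scoring function is an invented stand-in, not anything Turing specified:

```python
import random

random.seed(0)  # reproducible run

def teach_and_score(machine):
    # Stand-in for "a course of education": reward parameters close to a
    # fixed target skill vector (purely illustrative).
    target = [0.9, 0.1, 0.5]
    return -sum((m - t) ** 2 for m, t in zip(machine, target))

# Start with an arbitrary child machine.
best = [random.random() for _ in range(3)]
initial_score = teach_and_score(best)

# "One can then try another and see if it is better or worse."
for _ in range(200):
    candidate = [p + random.gauss(0, 0.1) for p in best]
    if teach_and_score(candidate) > teach_and_score(best):
        best = candidate

final_score = teach_and_score(best)
print(f"score improved from {initial_score:.3f} to {final_score:.3f}")
```

Discarding bad variants immediately, rather than breeding them out over generations, is one reading of why Turing hoped the process would be "more expeditious than evolution."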

pages: 1,025 words: 150,187

ZeroMQ by Pieter Hintjens


anti-pattern, carbon footprint, cloud computing, Debian, distributed revision control, domain-specific language, factory automation, fault tolerance, fear of failure, finite state, Internet of things, iterative process, premature optimization, profit motive, pull request, revision control, RFC: Request For Comment, Richard Stallman, Skype, smart transportation, software patent, Steve Jobs, Valgrind, WebSocket

This was how we built (or grew, or gently steered) the ØMQ community into existence. Your goal as leader of a community is to motivate people to get out there and explore; to ensure they can do so safely and without disturbing others; to reward them when they make successful discoveries; and to ensure they share their knowledge with everyone else (and not because we ask them, not because they feel generous, but because it’s The Law). It is an iterative process. You make a small product, at your own cost, but in public view. You then build a small community around that product. If you have a small but real hit, the community then helps design and build the next version, and grows larger. And then that community builds the next version, and so on. It’s evident that you remain part of the community, maybe even a majority contributor, but the more control you try to assert over the material results, the less people will want to participate.

Poking a Dead Frog: Conversations With Today's Top Comedy Writers by Mike Sacks


Bernie Madoff, Columbine, hive mind, index card, iterative process, Ponzi scheme, pre–internet, Saturday Night Live, Upton Sinclair

They’re overly clever and jump around a lot, and have more conversational fill in them—clichés and empty phrases and so on. And they meander in terms of their causality. Things happen for no reason, and lead to nothing, or lead to something, but with weak causation. But in revision they get tighter and funnier and also gentler. And one thing leads to the next in a tighter, more undeniable way—a way that seems to “mean.” Which, I guess, makes sense, if we think of revision as just an iterative process of exerting one’s taste. Gradually the story comes to feel more like “you” than you could have imagined at the outset, and starts to manifest a sort of superlogic—an internal logic that is more direct and “caused” than mere real-life logic. The thing is, writing is really just the process of charming someone via prose—compelling them to keep reading. So, as with actual personality, part of the process is learning what it is that you’ve got to work with: How do I keep that reader reading?

pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence by John Brockman


3D printing, agricultural Revolution, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic trading, artificial general intelligence, augmented reality, autonomous vehicles, bitcoin, blockchain, clean water, cognitive dissonance, Colonization of Mars, complexity theory, computer age, computer vision, constrained optimization, corporate personhood, cosmological principle, cryptocurrency, cuban missile crisis, Danny Hillis, dark matter, discrete time, Elon Musk, Emanuel Derman, endowment effect, epigenetics, Ernest Rutherford, experimental economics, Flash crash, friendly AI, Google Glasses, hive mind, income inequality, information trail, Internet of things, invention of writing, iterative process, Jaron Lanier, job automation, John von Neumann, Kevin Kelly, knowledge worker, loose coupling, microbiome, Moneyball by Michael Lewis explains big data, natural language processing, Network effects, Norbert Wiener, pattern recognition, Peter Singer: altruism, phenotype, planetary scale, Ray Kurzweil, recommendation engine, Republic of Letters, RFID, Richard Thaler, Rory Sutherland, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, strong AI, Stuxnet, superintelligent machines, supervolcano, the scientific method, The Wisdom of Crowds, theory of mind, Thorstein Veblen, too big to fail, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

AI can easily look like the real thing but still be a million miles away from being the real thing—like kissing through a pane of glass: It looks like a kiss but is only a faint shadow of the actual concept. I concede to AI proponents all of the semantic prowess of Shakespeare, the symbol juggling they do perfectly. Missing is the direct relationship with the ideas the symbols represent. Much of what is certain to come soon would have belonged in the old-school “Strong AI” territory. Anything that can be approached in an iterative process can and will be achieved, sooner than many think. On this point I reluctantly side with the proponents: exaflops in CPU+GPU performance, 10K resolution immersive VR, personal petabyte databases . . . here in a couple of decades. But it is not all “iterative.” There’s a huge gap between that and the level of conscious understanding that truly deserves to be called Strong, as in “Alive AI.” The big elusive question: Is consciousness an emergent behavior?

Commodity Trading Advisors: Risk, Performance Analysis, and Selection by Greg N. Gregoriou, Vassilios Karavas, François-Serge Lhabitant, Fabrice Douglas Rouah


Asian financial crisis, asset allocation, backtesting, capital asset pricing model, collateralized debt obligation, commodity trading advisor, compound rate of return, constrained optimization, corporate governance, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, discrete time, distributed generation, diversification, diversified portfolio, dividend-yielding stocks, fixed income, high net worth, implied volatility, index arbitrage, index fund, interest rate swap, iterative process, linear programming, London Interbank Offered Rate, Long Term Capital Management, market fundamentalism, merger arbitrage, Mexican peso crisis / tequila crisis, p-value, Ponzi scheme, quantitative trading / quantitative finance, random walk, risk-adjusted returns, risk/return, Sharpe ratio, short selling, stochastic process, systematic trading, technology bubble, transaction costs, value at risk

1. Following Chang, Pinegar, and Schachter (1997), the volume and volatility relationship is modeled without including past volatility.
2. Following Irwin and Yoshimaru (1999), volatility lags are included as independent variables to account for the time series persistence of volatility.
3. Following Bessembinder and Seguin (1993), the persistence in volume and volatility is modeled through specification of an iterative process.5

Since estimation results for the different model specifications are quite similar, only results for a modified version of Chang, Pinegar, and Schachter’s specification are reported here.6 Chang, Pinegar, and Schachter (1997) regress futures price volatility on volume associated with large speculators (as provided by the CFTC large trader reports) and all other market volume. Including two additional sets

5 Another approach would be to use a model with a mean equation and a volatility equation that has both volume and GARCH (generalized autoregressive conditional heteroskedasticity) terms.
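The persistence term in these specifications is, at its core, a regression of volatility on its own lag. A drastically simplified, pure-Python sketch on synthetic data (single regressor, no volume terms; the coefficients 0.2 and 0.6 are invented):

```python
import random

random.seed(1)

# Synthetic volatility series with persistence: v_t = 0.2 + 0.6*v_{t-1} + noise.
v = [1.0]
for _ in range(2000):
    v.append(0.2 + 0.6 * v[-1] + random.gauss(0, 0.1))

# OLS of v_t on v_{t-1}: slope = cov(x, y) / var(x).
y, x = v[1:], v[:-1]
mx, my = sum(x) / len(x), sum(y) / len(y)
slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
intercept = my - slope * mx
print(f"estimated persistence {slope:.2f}, intercept {intercept:.2f}")
```

With enough observations the estimated slope lands near the true 0.6; that estimated coefficient is the persistence that the lagged-volatility terms in the cited models are designed to capture.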

pages: 397 words: 110,130

Smarter Than You Think: How Technology Is Changing Our Minds for the Better by Clive Thompson


3D printing, 4chan, A Declaration of the Independence of Cyberspace, augmented reality, barriers to entry, Benjamin Mako Hill, butterfly effect, citizen journalism, Claude Shannon: information theory, conceptual framework, corporate governance, crowdsourcing, Deng Xiaoping, discovery of penicillin, Douglas Engelbart, Edward Glaeser, experimental subject, Filter Bubble, Freestyle chess, Galaxy Zoo, Google Earth, Google Glasses, Henri Poincaré, hindsight bias, hive mind, Howard Rheingold, information retrieval, iterative process, jimmy wales, Kevin Kelly, Khan Academy, knowledge worker, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Netflix Prize, Nicholas Carr, patent troll, pattern recognition, pre–internet, Richard Feynman, Ronald Coase, Ronald Reagan, sentiment analysis, Silicon Valley, Skype, Snapchat, Socratic dialogue, spaced repetition, telepresence, telepresence robot, The Nature of the Firm, the scientific method, The Wisdom of Crowds, theory of mind, transaction costs, Vannevar Bush, Watson beat the top human players on Jeopardy!, WikiLeaks, X Prize, éminence grise

Young chess enthusiasts could buy CD-ROMs filled with hundreds of thousands of chess games. Chess-playing software could show you how an artificial opponent would respond to any move. This dramatically increased the pace at which young chess players built up intuition. If you were sitting at lunch and had an idea for a bold new opening move, you could instantly find out which historic players had tried it, then war-game it yourself by playing against software. The iterative process of thought experiments—“If I did this, then what would happen?”—sped up exponentially. Chess itself began to evolve. “Players became more creative and daring,” as Frederic Friedel, the publisher of the first popular chess databases and software, tells me. Before computers, grand masters would stick to lines of attack they’d long studied and honed. Since it took weeks or months for them to research and mentally explore the ramifications of a new move, they stuck with what they knew.

pages: 624 words: 127,987

The Personal MBA: A World-Class Business Education in a Single Volume by Josh Kaufman


Albert Einstein, Atul Gawande, Black Swan, business process, buy low sell high, capital asset pricing model, Checklist Manifesto, cognitive bias, correlation does not imply causation, Credit Default Swap, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, Dean Kamen, delayed gratification, discounted cash flows, double entry bookkeeping, Douglas Hofstadter, Frederick Winslow Taylor, Gödel, Escher, Bach, high net worth, hindsight bias, index card, inventory management, iterative process, job satisfaction, Johann Wolfgang von Goethe, Kevin Kelly, Lao Tzu, loose coupling, loss aversion, market bubble, Network effects, Parkinson's law, Paul Buchheit, Paul Graham, place-making, premature optimization, Ralph Waldo Emerson, rent control, side project, statistical model, stealth mode startup, Steve Jobs, Steve Wozniak, subscription business, telemarketer, the scientific method, time value of money, Toyota Production System, tulip mania, Upton Sinclair, Walter Mischel, Y Combinator, Yogi Berra

Even the most discouraging Feedback contains crucial pieces of information that can help you make your offering better. The worst response you can get when asking for Feedback isn’t emphatic dislike: it’s total apathy. If no one seems to care about what you’ve created, you don’t have a viable business idea. 5. Give potential customers the opportunity to preorder. One of the most important pieces of Feedback you can receive during the iteration process is the other person’s willingness to actually purchase what you’re creating. It’s one thing for a person to say that they’d purchase something and quite another for them to be willing to pull out their wallet or credit card and place a real order. You can do this even if the offer isn’t ready yet—a tactic called Shadow Testing (discussed later). Whenever possible, give the people who are giving you Feedback the opportunity to preorder the offering.

pages: 923 words: 516,602

The C++ Programming Language by Bjarne Stroustrup


combinatorial explosion, conceptual framework, database schema, distributed generation, fault tolerance, general-purpose programming language, index card, iterative process, job-hopping, locality of reference, Menlo Park, Parkinson's law, premature optimization, sorting algorithm

The purpose of ‘‘design’’ is to create a clean and relatively simple internal structure, sometimes also called an architecture, for a program. In other words, we want to create a framework into which the individual pieces of code can fit and thereby guide the writing of those individual pieces of code. A design is the end product of the design process (as far as there is an end product of an iterative process). It is the focus of the communication between the designer and the programmer and between programmers. It is important to have a sense of proportion here. If I – as an individual programmer – design a small program that I’m going to implement tomorrow, the appropriate level of precision and detail may be some scribbles on the back of an envelope. At the other extreme, the development of a system involving hundreds of designers and programmers may require books of specifications carefully written using formal or semi-formal notations.

In particular, consider the needs for construction, copying, and destruction. – Consider minimalism, completeness, and convenience. [3] Refine the classes by specifying their dependencies. – Consider parameterization, inheritance, and use dependencies. [4] Specify the interfaces. – Separate functions into public and protected operations. – Specify the exact type of the operations on the classes. Note that these are steps in an iterative process. Typically, several loops through this sequence are needed to produce a design one can comfortably use for an initial implementation or a reimplementation. One advantage of well-done analysis and data abstraction as described here is that it becomes relatively easy to reshuffle class relationships even after code has been written. This is never a trivial task, though. After that, we implement the classes and go back and review the design based on what was learned from implementing them.

pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden


business intelligence, business process, cellular automata, cloud computing, complexity theory, conceptual framework, continuous integration, data acquisition, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, general-purpose programming language, HyperCard, information retrieval, iterative process, John von Neumann, linear programming, loose coupling, Mars Rover, millennium bug, NP-complete, Paul Graham, performance metric, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

Well, all of a sudden I’ve just made some decisions like, “Wow, it’s maybe like a 2D language as opposed to a 3D language.” Maybe I make the decision that color is important for me and all of a sudden I realize, “Wow, I’ve just alienated the whole community of color-blind programmers.” Every one of those things becomes a constraint that I have to work out, and I have to deal with the consequences of those constraints. That argues for an iterative process. Grady: Absolutely. All of life is iterative. It goes back to the point I made earlier, which is you can’t a priori know enough to even ask the right questions. One has to take a leap of faith and move forward in the presence of imperfect information. Is it likely we’ll see a break-out visual programming language or system in the next 10 years? Grady: Oh, it already exists. It’s National Instruments’ Lab View.

pages: 1,156 words: 229,431

The IDA Pro Book by Chris Eagle


barriers to entry, business process, information retrieval, iterative process

However, the process seldom runs that smoothly. Note The sigmake documentation file, sigmake.txt, recommends that signature filenames follow the MS-DOS 8.3 name-length convention. This is not a hard-and-fast requirement, however. When longer filenames are used, only the first eight characters of the base filename are displayed in the signature-selection dialog. Signature generation is often an iterative process, as it is during this phase when collisions must be handled. A collision occurs anytime two functions have identical patterns. If collisions are not resolved in some manner, it is not possible to determine which function is actually being matched during the signature-application process. Therefore, sigmake must be able to resolve each generated signature to exactly one function name. When this is not possible, based on the presence of identical patterns for one or more functions, sigmake refuses to generate a .sig file and instead generates an exclusions file (.exc).
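The collision rule can be pictured as grouping candidate signatures by pattern: any pattern shared by two or more functions cannot be resolved to a single name. A schematic Python sketch, in which the byte patterns and function names are invented and which does not reproduce sigmake's actual file formats:

```python
from collections import defaultdict

# Hypothetical (pattern, function-name) pairs as a signature generator
# might extract them from a library.
signatures = [
    ("55 8B EC 83 EC 08", "strcpy_s"),
    ("55 8B EC 83 EC 08", "strncpy_s"),  # identical pattern: a collision
    ("56 57 8B 7C 24 0C", "memmove"),
]

by_pattern = defaultdict(list)
for pattern, name in signatures:
    by_pattern[pattern].append(name)

# Usable signatures resolve to exactly one name; colliding patterns would
# instead be written to an exclusions (.exc) file for manual resolution.
resolved = {p: names[0] for p, names in by_pattern.items() if len(names) == 1}
collisions = {p: names for p, names in by_pattern.items() if len(names) > 1}
print("resolved:", resolved)
print("collisions:", collisions)
```

Editing the exclusions file to pick one name per colliding pattern, then rerunning the generator, is what makes signature generation iterative.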

pages: 685 words: 203,949

The Organized Mind: Thinking Straight in the Age of Information Overload by Daniel J. Levitin


airport security, Albert Einstein, Amazon Mechanical Turk, Anton Chekhov, big-box store, business process, call centre, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, delayed gratification, Donald Trump, epigenetics, Eratosthenes, Exxon Valdez, framing effect, friendly fire, fundamental attribution error, Golden Gate Park, Google Glasses, haute cuisine, impulse control, index card, indoor plumbing, information retrieval, invention of writing, iterative process, jimmy wales, job satisfaction, Kickstarter, life extension, meta analysis, meta-analysis, more computing power than Apollo, Network effects, new economy, Nicholas Carr, optical character recognition, pattern recognition, phenotype, placebo effect, pre–internet, profit motive, randomized controlled trial, Skype, Snapchat, statistical model, Steve Jobs, supply-chain management, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Turing test, ultimatum game

And the drawers could not be allowed to get too full, since then papers would catch and tear as the drawers were opened. Letter boxes had to be taken down from a shelf and opened up, a time-consuming operation when large amounts of filing were done. As Yates notes, keeping track of whether a given document or pile of documents was deemed active or archival was not always made explicit. Moreover, if the user wanted to expand, this might require transferring the contents of one box to another in an iterative process that might require dozens of boxes being moved down in the cabinet, to make room for the new box. To help prevent document loss, and to keep documents in the order they were filed, a ring system was introduced around 1881, similar to the three-ring binders we now use. The advantages of ringed flat files were substantial, providing random access (like Phaedrus’s 3 x 5 index card system) and minimizing the risk of document loss.

The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling by Ralph Kimball, Margy Ross


Albert Einstein, business intelligence, business process, call centre, cloud computing, data acquisition, discrete time, inventory management, iterative process, job automation, knowledge worker, performance metric, platform as a service, side project, supply-chain management

The modeling effort typically works through the following sequence of tasks and deliverables, as illustrated in Figure 18-1:
■ High-level model defining the model’s scope and granularity
■ Detailed design with table-by-table attributes and metrics
■ Review and validation with IT and business representatives
■ Finalization of the design documentation
As with any data modeling effort, dimensional modeling is an iterative process. You will work back and forth between business requirements and source details to further refine the model, changing the model as you learn more. This section describes each of these major tasks. Depending on the design team’s experience and exposure to dimensional modeling concepts, you might begin with basic dimensional modeling education before kicking off the effort to ensure everyone is on the same page regarding standard dimensional vocabulary and best practices.

pages: 828 words: 232,188

Political Order and Political Decay: From the Industrial Revolution to the Globalization of Democracy by Francis Fukuyama


Affordable Care Act / Obamacare, Andrei Shleifer, Asian financial crisis, Atahualpa, banking crisis, barriers to entry, Berlin Wall, blood diamonds, British Empire, centre right, clean water, collapse of Lehman Brothers, colonial rule, conceptual framework, crony capitalism, deindustrialization, Deng Xiaoping, double entry bookkeeping, Edward Snowden, Erik Brynjolfsson, European colonialism, facts on the ground, failed state, Fall of the Berlin Wall, first-past-the-post, Francis Fukuyama: the end of history, Francisco Pizarro, Frederick Winslow Taylor, full employment, Gini coefficient, Hernando de Soto, Home mortgage interest deduction, income inequality, invention of the printing press, iterative process, knowledge worker, land reform, land tenure, life extension, low skilled workers, manufacturing employment, means of production, Menlo Park, Mohammed Bouazizi, Monroe Doctrine, moral hazard, new economy, open economy, out of africa, Peace of Westphalia, Port of Oakland, post-industrial society, Post-materialism, post-materialism, price discrimination, quantitative easing, RAND corporation, rent-seeking, road to serfdom, Ronald Reagan, Scientific racism, Scramble for Africa, Second Machine Age, Silicon Valley, special economic zone, stem cell, the scientific method, The Wealth of Nations by Adam Smith, Thomas L Friedman, Thomas Malthus, too big to fail, trade route, transaction costs, Tyler Cowen: Great Stagnation, women in the workforce, World Values Survey

But while variation in biological evolution is random, human beings exercise some degree of agency over the design of their institutions. It is true, as authors like Friedrich A. Hayek have argued, that human beings are never knowledgeable or wise enough to be able to predict the outcomes of their efforts to design institutions or plan policies with full ex ante knowledge of the results.1 But the exercise of human agency is not a one-shot affair: human beings learn from their mistakes and take actions to correct them in an iterative process. The constitution adopted by the Federal Republic of Germany in 1949 differed in significant ways from the constitution of the Weimar Republic, precisely because Germans had learned from the failure of democracy during the 1930s. In biological evolution, there are separate specific and general processes. Under specific evolution, organisms adapt to particular environments and diverge in their characteristics.

pages: 823 words: 220,581

Debunking Economics - Revised, Expanded and Integrated Edition: The Naked Emperor Dethroned? by Steve Keen


accounting loophole / creative accounting, banking crisis, banks create money, barriers to entry, Benoit Mandelbrot, Big bang: deregulation of the City of London, Black Swan, Bonfire of the Vanities, butterfly effect, capital asset pricing model, cellular automata, central bank independence, citizen journalism, clockwork universe, collective bargaining, complexity theory, correlation coefficient, credit crunch, David Ricardo: comparative advantage, debt deflation, diversification, double entry bookkeeping, Eugene Fama: efficient market hypothesis, experimental subject, Financial Instability Hypothesis, Fractional reserve banking, full employment, Henri Poincaré, housing crisis, Hyman Minsky, income inequality, invisible hand, iterative process, John von Neumann, laissez-faire capitalism, liquidity trap, Long Term Capital Management, mandelbrot fractal, margin call, market bubble, market clearing, market microstructure, means of production, minimum wage unemployment, open economy, place-making, Ponzi scheme, profit maximization, quantitative easing, RAND corporation, random walk, risk tolerance, risk/return, Robert Shiller, Ronald Coase, Schrödinger's Cat, scientific mainstream, seigniorage, six sigma, South Sea Bubble, stochastic process, The Great Moderation, The Wealth of Nations by Adam Smith, Thorstein Veblen, time value of money, total factor productivity, tulip mania, wage slave

The auctioneer then refuses to allow any sale to take place, and instead adjusts prices – increasing the price of those commodities where demand exceeded supply, and decreasing the price where demand was less than supply. This then results in a second set of prices, which are also highly unlikely to balance demand and supply for all commodities; so another round of price adjustments will take place, and another, and another. Walras called this iterative process of trying to find a set of prices which equates supply to demand for all commodities ‘tatonnement’ – which literally translates as ‘groping.’ He believed that this process would eventually converge to an equilibrium set of prices, where supply and demand are balanced in all markets (so long as trade at disequilibrium prices can be prevented). This was not necessarily the case, since adjusting one price so that supply and demand are balanced for one commodity could well push demand and supply farther apart for all other commodities.
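The tatonnement iteration can be sketched for the simplest possible case; the following is a toy single-market version with assumed linear demand and supply curves (not taken from the text), rather than Walras's multi-market formulation:

```python
# A minimal sketch of tatonnement in one market, with hypothetical
# linear demand and supply curves (illustrative assumptions only).
def tatonnement(price=1.0, step=0.1, tol=1e-6, max_rounds=10_000):
    demand = lambda p: 10.0 - p   # assumed demand curve
    supply = lambda p: 2.0 * p    # assumed supply curve
    for _ in range(max_rounds):
        excess = demand(price) - supply(price)
        if abs(excess) < tol:     # supply and demand balance: stop adjusting
            return price
        price += step * excess    # raise price when demand > supply, cut it otherwise
    return price                  # may not have converged, as the text warns
```

Here the iteration contracts toward the clearing price of 10/3. With many interdependent markets, as the passage notes, adjusting one price can push other markets further from balance, so convergence is not guaranteed in general.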

pages: 1,294 words: 210,361

The Emperor of All Maladies: A Biography of Cancer by Siddhartha Mukherjee


Barry Marshall: ulcers, conceptual framework, discovery of penicillin, experimental subject, iterative process, life extension, Louis Pasteur, medical residency, meta-analysis, mouse model, New Journalism, phenotype, randomized controlled trial, scientific mainstream, Silicon Valley, social web, statistical model, stem cell, women in the workforce, éminence grise

In other words, if you started off with 100,000 leukemia cells in a mouse and administered a drug that killed 99 percent of those cells in a single round, then every round would kill cells in a fractional manner, resulting in fewer and fewer cells after every round of chemotherapy: 100,000 . . . 1,000 . . . 10 . . . and so forth, until the number finally fell to zero after four rounds. Killing leukemia was an iterative process, like halving a monster’s body, then halving the half, and halving the remnant half. Second, Skipper found that by adding drugs in combination, he could often get synergistic effects on killing. Since different drugs elicited different resistance mechanisms, and produced different toxicities in cancer cells, using drugs in concert dramatically lowered the chance of resistance and increased cell killing.
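Skipper's fractional-kill arithmetic can be sketched in a few lines; the cell counts are the mouse-leukemia figures from the passage, while the function name is illustrative:

```python
# Sketch of fractional ("log") kill: each round of chemotherapy kills a
# fixed fraction of the remaining cells, not a fixed number of cells.
def cells_after_rounds(initial_cells, kill_fraction, rounds):
    cells = initial_cells
    history = [cells]
    for _ in range(rounds):
        cells *= (1 - kill_fraction)   # a 99% kill leaves 1% of cells each round
        history.append(cells)
    return history

# 100,000 cells, 99% kill per round, four rounds:
# roughly 100,000 -> 1,000 -> 10 -> 0.1 -> 0.001 (fewer than one cell left)
cells_after_rounds(100_000, 0.99, 4)
```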

pages: 892 words: 91,000

Valuation: Measuring and Managing the Value of Companies by Tim Koller, McKinsey & Company Inc., Marc Goedhart, David Wessels, Barbara Schwimmer, Franziska Manoury


air freight, barriers to entry, Basel III, BRICs, business climate, business process, capital asset pricing model, capital controls, cloud computing, compound rate of return, conceptual framework, corporate governance, corporate social responsibility, credit crunch, Credit Default Swap, discounted cash flows, distributed generation, diversified portfolio, energy security, equity premium, index fund, iterative process, Long Term Capital Management, market bubble, market friction, meta-analysis, new economy, p-value, performance metric, Ponzi scheme, price anchoring, purchasing power parity, quantitative easing, risk/return, Robert Shiller, shareholder value, six sigma, sovereign wealth fund, speech recognition, technology bubble, time value of money, too big to fail, transaction costs, transfer pricing, value at risk, yield curve, zero-coupon bond

EXHIBIT 8.4 UPS: Enterprise DCF Valuation ($ million)

Forecast year        Free cash flow (FCF)   Discount factor @ 8.0%   Present value of FCF
2014                   3,472                  0.926                     3,215
2015                   4,108                  0.857                     3,522
2016                   4,507                  0.794                     3,579
2017                   4,892                  0.735                     3,596
2018                   5,339                  0.681                     3,634
2019                   5,748                  0.630                     3,623
2020                   6,194                  0.584                     3,615
2021                   6,678                  0.541                     3,609
2022                   7,086                  0.501                     3,547
2023                   7,523                  0.463                     3,486
Continuing value     168,231                  0.463                    77,967

Present value of cash flow                                            113,395
Midyear adjustment factor                                               1.039
Value of operations                                                   117,840
Value of excess cash                                                    4,136
Value of investments                                                      148
Enterprise value                                                      122,124
Less: Value of debt                                                   (10,872)
Less: Value of after-tax unfunded retirement obligations               (5,042)
Less: Value of capitalized operating leases                            (5,841)
Less: Value of noncontrolling interest                                    (14)
Equity value                                                          100,355
Millions of shares outstanding (December 2013)                            923
Equity value per share ($)                                                109

amount of noncontrolling interests.5 Divide the resulting equity value of $100.4 billion by the number of shares outstanding (923 million) to estimate a per-share value of $109. During the middle part of 2014, when we performed this valuation, UPS’s stock traded between $95 and $105 per share, well within a reasonable range of the DCF valuation (reasonable changes in forecast assumptions or WACC estimates can easily move a company’s value by up to 15 percent). Although this chapter presents the enterprise DCF valuation sequentially, valuation is an iterative process. To value operations, first reorganize the company’s financial statements to separate operating items from nonoperating items and capital structure. Then analyze the company’s historical performance; define and project free cash flow over the short, medium, and long run; and discount the projected free cash flows at the weighted average cost of capital.

5. A noncontrolling interest arises when an outside investor owns a minority share of a subsidiary. Since this outside investor has a partial claim on cash flows, the claim’s value must be deducted from enterprise value to compute equity value.
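The arithmetic of the UPS exhibit can be reproduced in a few lines. This is a sketch assuming end-of-year discounting at an 8.0% WACC with a midyear adjustment of (1 + WACC)^0.5, which is what the printed discount factors imply; small rounding differences from the exhibit's figures are expected:

```python
# Sketch of the enterprise DCF arithmetic (figures in $ millions, from the exhibit).
fcf = [3472, 4108, 4507, 4892, 5339, 5748, 6194, 6678, 7086, 7523]  # 2014-2023
continuing_value = 168_231   # value of cash flows beyond the explicit forecast
wacc = 0.08

# Discount each year's FCF, and the continuing value at the year-10 factor.
pv_fcf = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcf, start=1))
pv_cv = continuing_value / (1 + wacc) ** len(fcf)
value_of_operations = (pv_fcf + pv_cv) * (1 + wacc) ** 0.5  # midyear adjustment

enterprise_value = value_of_operations + 4_136 + 148  # add excess cash, investments
equity_value = enterprise_value - 10_872 - 5_042 - 5_841 - 14  # deduct claims
per_share = equity_value / 923   # 923 million shares outstanding
```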

The Art of Computer Programming by Donald Ervin Knuth


Brownian motion, complexity theory, correlation coefficient, Eratosthenes, Georg Cantor, information retrieval, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Menlo Park, NP-complete, P = NP, Paul Erdős, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, random walk, sorting algorithm, Turing machine, Y2K

(On the other hand the constant of proportionality is such that N must be really large before Algorithms L and T lose out to this “high-speed” method.) Historical note: J. N. Bramhall and M. A. Chapple published the first O(N³) method for power series reversion in CACM 4 (1961), 317–318, 503. It was an offline algorithm essentially equivalent to the method of exercise 16, with running time approximately the same as that of Algorithms L and T.

Iteration of series. If we want to study the behavior of an iterative process x_n ← f(x_{n−1}), we are interested in studying the n-fold composition of a given function f with itself, namely x_n = f(f(... f(x_0) ...)). Let us define f^[0](x) = x and f^[n](x) = f(f^[n−1](x)), so that

    f^[m+n](x) = f^[m](f^[n](x))                    (18)

for all integers m, n ≥ 0. In many cases the notation f^[n](x) makes sense also when n is a negative integer, namely if f^[n] and f^[−n] are inverse functions such that x = f^[n](f^[−n](x)); if inverse functions are unique, (18) holds for all integers m and n.
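The n-fold composition can be sketched directly; a minimal version (the function name is illustrative, not Knuth's):

```python
def iterate(f, n, x):
    """Compute the n-fold composition f^[n](x) = f(f(...f(x)...)), n >= 0."""
    for _ in range(n):
        x = f(x)
    return x

# For f(x) = 2x + 1: f^[3](0) steps through 0 -> 1 -> 3 -> 7.
iterate(lambda x: 2 * x + 1, 3, 0)
```

The composition law f^[m+n](x) = f^[m](f^[n](x)) corresponds to `iterate(f, m + n, x) == iterate(f, m, iterate(f, n, x))`.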

pages: 1,797 words: 390,698

Power at Ground Zero: Politics, Money, and the Remaking of Lower Manhattan by Lynne B. Sagalyn


affirmative action, airport security, Bonfire of the Vanities, clean water, conceptual framework, corporate governance, deindustrialization, Donald Trump, Edward Glaeser, estate planning, Frank Gehry, Guggenheim Bilbao, high net worth, informal economy, intermodal, iterative process, Jane Jacobs, mortgage debt, New Urbanism, place-making, rent control, Rosa Parks, Silicon Valley, sovereign wealth fund, the built environment, the High Line, time value of money, too big to fail, Torches of Freedom, urban decay, urban planning, urban renewal, white flight, young professional

Two months later, the LMDC released both statements to the public and then launched “an aggressive public outreach campaign to solicit public input,” receiving some twenty-four hundred comments from public hearings, meetings with its advisory councils and Community Board 1, mailings to the families of victims and elected officials, and input from its official website, e-mail, and regular mail. The drafting committees convened again to review the public comments and make adjustments. During the process, Contini constantly went back to the Families Advisory Council to keep its members informed; there were always a few who didn’t agree, but most agreed with what was being formulated. The memorial, she said, “had to be about the individual and about the larger event.” The iterative process, Goldberger wrote, produced a final version “not nearly so genteel” as the initial attempt at a mission statement, which was “notable for its cautious, even hesitant language and sense of propriety.” The final version was “short, simpler, and blunter”:21 Remember and honor the thousands of innocent men, women, and children murdered by terrorists in the horrific attacks of February 26, 1993, and September 11, 2001.