finite state

61 results back to index

Syntactic Structures by Noam Chomsky

finite state, P = NP, statistical model

The point i s that there are processes of sentence formation that finite state grammars are i ntrinsically not equipped to handle. If these pro­ cesses have no finite limit, we can prove the literal inapplicabi l i ty of this elementary theory. I f the processes have a limit, then the construction of a finite state grammar will not be litera lly out of the question, since it will be possible to list the sentences, and a list is essent ially a trivial finite state grammar. But this grammar will be so complex that it will be of little use or i nterest. In general, the assumption that languages are infinite is made in order to simplify 24 SYNTACTIC STRUCTURES the description of these languages. If a grammar does not have recursive devices (closed loops, as in (8), in the finite state grammar) it will be prohibitively complex.

We can also define the grammatical relations in these languages in a formal way in terms of the associated diagrams. 4.2 In § 3 we considered languages, calJed "fin i tt: state languages", which were generated by finite state M arkov processes. Now we are considering terminal languages that are generated by systems of the form [L, Fl. These two types of languages are related in the fol lowing way Theorem : Every finite state language is a term inal language, but there are terminal languages which are not finite state languages.4 The import of this theorem is that description in terms of phrase structure is essentially more powerful than description in terms of the elementary theory presented above in § 3. As examples of terminal languages that are not finite state l anguages we have the languages ( ! o i), ( 1 0 ii) discussed in § 3. Thus the language ( ! o i), consisting of all and only the strings ab, aabb, aaabbb, . . . can be prod uced by the [L, F) grammar ( I S)

Suppose that the machine begins in the i nitial state, runs through a sequence of states (producing a word with each transition), and ends in the final state. Then we call the sequence of words that has been produced a "sentence". Each such machine thus defines a certain language ; namely, the set of sentences that can be produced i n this way. Any language that can be produced by a machine of this sort we call a finite state language ; and we can call the machine itself a finite state grammar. A finite state grammar can be represented graphically i n the form o f a "state diagram".l For example, the grammar that produces just the two sentences "the man comes" and "the men come" can be represented by the following state diagram : THE ( 7) We can extend this grammar to produce an i nfinite number of sen­ tences by adding closed loops. Thus the finite grammar of the subpart of English containing the above sentences in addition to "the old man comes", "the old old man comes", . . . , "the old men come", "the old old men come", . . . , can be represented by the following state diagram : OLD THE (8) 1 C.

pages: 496 words: 70,263

Erlang Programming by Francesco Cesarini

cloud computing, fault tolerance, finite state, loose coupling, revision control, RFC: Request For Comment, sorting algorithm, Turing test, type inference, web application

. • If the stop message is received, terminate is called, destroying the window and all the widgets associated with it. The loop function is not called, allowing the process to terminate normally. Finite State Machines Erlang processes can be used to implement finite state machines. A finite state machine, or FSM for short, is a model that consists of a finite number of states and events. You can think of an FSM as a model of the world which will contain abstractions from the details of the real system. At any one time, the FSM is in a specific state. Depending on 126 | Chapter 5: Process Design Patterns the incoming event and the current state of the FSM, a set of actions and a transition to a new state will occur (see Figure 5-5). Figure 5-5. A finite state machine In Erlang, each state is represented as a tail-recursive function, and each event is represented as an incoming message.

Behaviors we have not covered but which we briefly introduced in this chapter include finite state machines, event handlers, and special processes. All of these behavior library modules have manual pages that you can reference. In addition, the Erlang documentation has a section on OTP design principles that provides more details and examples of these behaviors. Finite state machines are a crucial component of telecom systems. In Chapter 5, we introduced the idea of modeling a phone as a finite state machine. If the phone is not being used, it is in state idle. If an incoming call arrives, it goes to state ringing. This does not have to be a phone; it could instead be an ATM cross-connect or the handling of data in a protocol stack. The gen_fsm module provides you with a finite state machine behavior that you can use to solve these problems.

State machines are very common in all sorts of processing applications. In telecom systems, they are used not only to handle the state of equipment, as in the preceding example, but also in complex protocol stacks. The fact that Erlang handles them † Or any other relative of your choice who tends to call you very early on a Saturday morning. Finite State Machines | 127 Figure 5-6. Fixed-line phone finite state machine graciously is not a surprise. When prototyping with the early versions of Erlang between 1987 and 1991, it was the Plain Old Telephony System (POTS) finite state machines described in this section that the development team used to test their ideas of what Erlang should look like. With a tail-recursive function for every state, actions implemented as function calls, and events represented as messages, this is what the code for the idle state would look like: idle() -> receive {Number, incoming} -> start_ringing(), ringing(Number); off_hook -> start_tone(), dial() end. ringing(Number) -> receive {Number, other_on_hook} -> stop_ringing(), idle(); {Number, off_hook} -> 128 | Chapter 5: Process Design Patterns stop_ringing(), connected(Number) end.

pages: 229 words: 67,599

The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age by Paul J. Nahin

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Claude Shannon: information theory, conceptual framework, Edward Thorp, Fellow of the Royal Society, finite state, four colour theorem, Georg Cantor, Grace Hopper, Isaac Newton, John von Neumann, knapsack problem, New Journalism, Pierre-Simon Laplace, reversible computing, Richard Feynman, Schrödinger's Cat, Steve Jobs, Steve Wozniak, thinkpad, Thomas Bayes, Turing machine, Turing test, V2 rocket

Depending both on its present state and the symbol just read, the fourth and final operation of a machine cycle occurs when the finite-state machine transitions to a new state (which may, in fact, be the present state). Then a new machine cycle begins. Figure 9.1.1 shows the connection of a finite-state machine, the read/write head, and the tape. The entire arrangement, all together, is what we call a Turing machine. When placed into operation we’ll imagine that the tape is initially blank (that is, the symbol 0 is on all of the tape’s squares)—except for some finite number of squares that have 1s. By convention, we’ll always take the finite-state machine as initially in state 1. And finally, we must specify over which square on the tape the read/write head is initially placed. The finite-state machine and the read/write head then move along the tape (we imagine the tape is motionless) according to the internal details of the finite-state machine and the particular sequence of symbols encountered on the tape.

—Claude Shannon, in a March 1952 talk at Bell Labs on creativity, during which he explained how he arrived at the logic circuitry for a machine that plays a perfect game of Nim 9.1 THE FIRST MODERN COMPUTER A Turing machine is the combination of a sequential, finite-state machine plus an external read/write memory storage medium called the tape (think of a ribbon of magnetic tape). The tape is a linear sequence of squares, with each square holding one of several possible symbols. Most generally, a Turing machine can have any number of different symbols it can recognize, but I’ll assume here that we are discussing the 2-symbol case (0 or 1). In 1956, Shannon showed that this in no way limits the power of what a Turing machine can do. The tape is arbitrarily long in at least one, perhaps both, directions. The finite-state machine is connected to a read/write head, which at each machine cycle (I’ll define what that is in just a moment) is located over a square on the tape.

The finite-state machine and the read/write head then move along the tape (we imagine the tape is motionless) according to the internal details of the finite-state machine and the particular sequence of symbols encountered on the tape. At some time after we turn the Turing machine on, it presumably completes its task (whatever that might be), and the finite-state machine enters state 0 (called the halting state) and stops. Figure 9.1.1. A Turing machine. As far as I know, nobody has ever actually constructed a Turing machine. It is a purely theoretical concept. Described in 1936 by the English mathematician Alan Turing (see Shannon’s mini-biography in Chapter 3)— that is, ten years before the first actual electronic computers began to be constructed—Turing’s machines are nevertheless as powerful as any modern machine in what they can compute. Compared to a modern, electronic-speed computer, a Turing machine is really, really slow (think of a race between a snail and a photon), but time is irrelevant.

pages: 1,025 words: 150,187

ZeroMQ by Pieter Hintjens

AGPL, anti-pattern, carbon footprint, cloud computing, Debian, distributed revision control, domain-specific language, factory automation, fault tolerance, fear of failure, finite state, Internet of things, iterative process, premature optimization, profit motive, pull request, revision control, RFC: Request For Comment, Richard Stallman, Skype, smart transportation, software patent, Steve Jobs, Valgrind, WebSocket

#include "czmq.h" // States in which we can be at any point in time typedef enum { STATE_PRIMARY = 1, // Primary, waiting for peer to connect STATE_BACKUP = 2, // Backup, waiting for peer to connect STATE_ACTIVE = 3, // Active - accepting connections STATE_PASSIVE = 4 // Passive - not accepting connections } state_t; // Events, which start with the states our peer can be in typedef enum { PEER_PRIMARY = 1, // HA peer is pending primary PEER_BACKUP = 2, // HA peer is pending backup PEER_ACTIVE = 3, // HA peer is active PEER_PASSIVE = 4, // HA peer is passive CLIENT_REQUEST = 5 // Client makes request } event_t; // Our finite-state machine typedef struct { state_t state; // Current state event_t event; // Current event int64_t peer_expiry; // When peer is considered "dead" } bstar_t; // We send state information this often // If peer doesn't respond in two heartbeats, it is "dead" #define HEARTBEAT 1000 // In msec The heart of the Binary Star design is its finite-state machine (FSM). The FSM runs one event at a time. We apply an event to the current state, which checks if the event is accepted, and if so sets a new state (Example 4-62). Example 4-62. Binary Star server (bstarsrv.c): Binary Star state machine static Bool s_state_machine (bstar_t *fsm) { Bool exception = FALSE; // These are the PRIMARY and BACKUP states; we're waiting to become // ACTIVE or PASSIVE depending on events we get from our peer if (fsm->state == STATE_PRIMARY) { if (fsm->event == PEER_BACKUP) { printf ("I: connected to backup (passive), ready as active\n"); fsm->state = STATE_ACTIVE; } else if (fsm->event == PEER_ACTIVE) { printf ("I: connected to backup (active), ready as passive\n"); fsm->state = STATE_PASSIVE; } // Accept client connections } else if (fsm->state == STATE_BACKUP) { if (fsm->event == PEER_ACTIVE) { printf ("I: connected to primary (active), ready as passive\n"); fsm->state = STATE_PASSIVE; } else // Reject client connections when acting as backup if (fsm->event == CLIENT_REQUEST) exception = TRUE; } else The ACTIVE and PASSIVE states are laid out in Example 4-63.

\n", server [server_nbr]); client = zsocket_new (ctx, ZMQ_REQ); zsocket_connect (client, server [server_nbr]); // Send request again, on new socket zstr_send (client, request); } } } zctx_destroy (&ctx); return 0; } To test our Binary Star implementation, start the servers and client in any order: bstarsrv -p # Start primary bstarsrv -b # Start backup bstarcli You can then provoke failover by killing the primary server, and recovery by restarting the primary and killing the backup. Note how it’s the client vote that triggers failover and recovery. Binary Star is driven by a finite-state machine (Figure 4-8). States in white accept client requests, and states in gray refuse them. Events are the peer state, so “Peer Active” means the other server has told us it’s active. “Client Request” means we’ve received a client request. “Client Vote” means we’ve received a client request and our peer has been inactive for two heartbeats. Figure 4-8. Binary Star finite-state machine Note that the servers use PUB-SUB sockets for state exchange. No other socket combination will work here. PUSH and DEALER block if there is no peer ready to receive a message. PAIR does not reconnect if the peer disappears and comes back.

This section isn’t really about GSL at all, but about a useful and little-known trick that’s handy for ambitious architects who want to scale themselves, as well as their work. Once you learn the trick, you can whip up your own code generators in a short time. The code generators most software engineers know about come with a single hard-coded model. For instance, Ragel “compiles executable finite state machines from regular languages” (i.e., Ragel’s model is a regular language). This certainly works for a good set of problems, but it’s far from universal. How do you describe an API in Ragel? Or a project makefile? Or even a finite-state machine like the one we used to design the Binary Star pattern in Chapter 4? All of these would benefit from code generation, but there’s no universal model. So, the trick is to design your own models as you need them, and then make code generators as cheap compilers for those models.

pages: 232

A Discipline of Programming by E. Dijkstra

finite state, Turing machine, Y Combinator

He recalls, only when the transition table has been completed, that he has only binary variables (he calls them "flip-flops") at his disposal and that, if the number of states is, say, between 33 and 64 (bounds included) he needs at least six of them. Those six binary variables span a state space of 64 points and the designer is free in his choice how to identify the different states of his finite-state automaton with points in the 64-point state space. That this choice is not irrelevant becomes clear as soon as we realize that a state transition of the finite-state automaton has to be translated in combinations of boolean variables being operated upon under control of boolean values. To circuit designers this choice is known as the "state assignment problem" and technical constraints or optimization goals may make it a hairy one-so hairy, as a matter of fact, that one may be tempted to challenge the adequacy of the design methodology evoking it.

From a very etheric point of view, we could dis- miss the question, saying that this is none of our business, that this is the implementer's concern, and that as long as we don't want our programs to be executed the question is irrelevant anyhow. But to dismiss this question is not my intention. Actual machines have stores of a finite capacity and, left to themselves, they behave in principle like finite state automata which after a sufficient large number of steps must return to a state in which they have been before. (If, in addition, the machine is deterministic, history will from then on repeat itself.) For the overall behaviour of today's modern computers, the theory of finite state automata is, however, of limited significance, because their number of possible states is so incredibly huge that they could go on working for years without returning to a state in which they have been before. As an alternative we can study "the infinite state automaton" and drop the restric- tion of a finite store capacity.

T is the predicate that is true in all points of the state space concerned: the corresponding set is the universe. F is the predicate that is false in all points of the state space: it corresponds to the empty set. 3 THE CHARACTERIZATION OF SEMANTICS We are primarily interested in systems that, when started in an "initial state", will end up in a "final state" which, as a rule, depends on the choice of the initial state. This is a view that is somewhat different from the idea of the finite state automaton that on the one hand absorbs a stream of input characters and on the other hand produces a stream of output characters. To translate that in our picture we must assume that the value of the input (i.e. the argument) is reflected in the choice of the initial state and that the value of the output (i.e. the answer) is reflected in the final state. Our view relieves us from all sorts of peripheral complications.

pages: 647 words: 43,757

Types and Programming Languages by Benjamin C. Pierce

Albert Einstein, combinatorial explosion, experimental subject, finite state, Henri Poincaré, Perl 6, Russell's paradox, sorting algorithm, Turing complete, Turing machine, type inference, Y Combinator

To describe the class, we need some additional terminology. 21.5.10 Definition Given an invertible generating function F and an element x Î U, the set predF(x) (or just pred(x)) of immediate predecessors of x is and its extension to sets X ⊆ U is The set reachableF(X) (or just reachable(X)) of all elements reachable from a set X via support is defined as and its extension to single elements x Î U is reachable(x) = reachable({x}). An element y Î U is reachable from an element x if y Î reachable(x). 21.5.11 Definition An invertible generating function F is said to be finite state if reachable(x) is finite for each x Î U. For a finite-state generating function, the search space explored by gfp is finite and gfp always terminates: 21.5.12 Theorem If reachableF(X) is finite, then gfpF(X) is defined. Consequently, if F is finite state, then gfpF(X) terminates for any finite X ⊆ U. Proof: For each recursive call gfp(Y) in the call graph generated by the original invocation gfp(X), we have Y ⊆ reachable(X). Moreover, Y strictly increases on each call. Since reachable(X) is finite, m(Y) = |reachable(X)| - |Y| serves as a termination measure for gfp. 21.5.13 Exercise [⋆⋆⋆] Suppose F is an invertible generating function.

Next, we want to characterize the generating functions F for which lfp is guaranteed to terminate on all finite inputs. For this, some new terminology is helpful. Given a finite-state generating function F Î P(U) → P(U), the partial function (or just height) is the least partial function satisfying the following condition:[2] (Note that height(x) is undefined if x either participates in a reachability cycle itself or depends on an element from a cycle.) A generating function F is said to be finite height if heightF is a total function. It is easy to check that, if y Î support(x) and both height(x) and height(y) are defined, then height(y) < height(x). Now, if F is finite state and finite height, then lfp(X) terminates for any finite input set X ⊆ U. To see this, observe that, since F is finite state, for every recursive call lfp(Y) descended from the original call lfp(X), the set Y is finite.

These can be used to express very general correctness properties but are often cumbersome to use and demand a good deal of sophistication on the part of programmers. At the other end are techniques of much more modest power—modest enough that automatic checkers can be built into compilers, linkers, or program analyzers and thus be applied even by programmers unfamiliar with the underlying theories. One well-known instance of this sort of lightweight formal methods is model checkers, tools that search for errors in finite-state systems such as chip designs or communication protocols. Another that is growing in popularity is run-time monitoring, a collection of techniques that allow a system to detect, dynamically, when one of its components is not behaving according to specification. But by far the most popular and best established lightweight formal methods are type systems, the central focus of this book. As with many terms shared by large communities, it is difficult to define "type system" in a way that covers its informal usage by programming language designers and implementors but is still specific enough to have any bite.

Reactive Messaging Patterns With the Actor Model: Applications and Integration in Scala and Akka by Vaughn Vernon

A Pattern Language, business intelligence, business process, cloud computing, cognitive dissonance, domain-specific language,, fault tolerance, finite state, Internet of things, Kickstarter, loose coupling, remote working, type inference, web application

True, there is a layer of indirection between the sending and receiving actor—a mailbox. Even so, the word direct is used because the programming model provides a worthy abstraction that makes it appear that messages are sent directly from one actor to another. • State machines: Actors support finite state machines. When an actor transitions to some expected state, it can modify its behavior in preparation for future messages. By becoming another kind of message handler, the actor implements a finite state machine. • Share nothing: Actors do not share their mutable state with any other actor, or any other component for that matter. • Lock-free concurrency: Since actors do not share their mutable state, and because they receive only one message at a time, actors never need to attempt to lock their state before reacting to a message.

In simple terms, an actor does this by exchanging its current receive function for a different one. Internally an actor can completely replace its current receive function for another, or it can keep a stack of previous receive functions, essentially pushing them and then popping them back into context.5 An actor may have a few or several different receive functions that it swaps among, but it can have only a finite number of them. This capability enables actors to be effective finite state machines. 5. Akka calls this feature become, enabling an actor to become a different kind of message receiver. Obviously, there are better situations than others to exchange your actor’s receive function. The feature needn’t be overused. Assuming that an actor can reach a given state only if it has received a given prerequisite message, when an actor has received a message in a specific receive block, it can assume that the prerequisite message has already been received within its current instance lifetime.

You probably imagined that it could be difficult since there are so many things that could happen on the outside of the actor and how the inside would have to be flexible enough to react to all those happenings. For one thing, an actor can’t determine whether messages are received out of order since the system outside is asynchronous and won’t generally react in exact, predetermined, serialized ways. Well, that’s all part of dealing with nondeterminism as discussed in Chapter 1. Remember, however, that an actor that holds and manages state may be designed as a finite state machine. This allows you to design your actor to understand a given state, which messages can be handled in that state, how the current state will be transitioned, and what to do about messages that can’t be handled while in that state. Yet, that’s design, not test. Gladly, the Akka team understands the ins and outs since they have to test actors themselves to ensure Akka’s robustness. When you have to eat your own dog food, you usually try to make it tasty; you know, simple and as usable as possible.

pages: 834 words: 180,700

The Architecture of Open Source Applications by Amy Brown, Greg Wilson

8-hour work day, anti-pattern, bioinformatics,, cloud computing, collaborative editing, combinatorial explosion, computer vision, continuous integration, create, read, update, delete, David Heinemeier Hansson, Debian, domain-specific language, Donald Knuth,, fault tolerance, finite state, Firefox, friendly fire, Guido van Rossum, linked data, load shedding, locality of reference, loose coupling, Mars Rover, MITM: man-in-the-middle, MVC pattern, peer-to-peer, Perl 6, premature optimization, recommendation engine, revision control, Ruby on Rails, side project, Skype, slashdot, social web, speech recognition, the scientific method, The Wisdom of Crowds, web application, WebSocket

Other Worker Behaviors A large number of other worker behaviors can and have been implemented using these same ideas. 15.4.1. Finite State Machines Finite state machines (FSMs), implemented in the gen_fsm behavior module, are a crucial component when implementing protocol stacks in telecom systems (the problem domain Erlang was originally invented for). States are defined as callback functions named after the state that return a tuple containing the next State and the updated loop data. You can send events to these states synchronously and asynchronously. The finite state machine callback module should also export the standard callback functions such as init, terminate, and handle_info. Of course, finite state machines are not telecom specific. In Riak, they are used in the request handlers. When a client issues a request such as get, put, or delete, the process listening to that request will spawn a process implementing the corresponding gen_fsm behavior.

If a process terminates, it sends an EXIT signal to processes in its link set. If the termination reason is non-normal, the process terminates itself, propagating the EXIT signal further. By calling the process_flag(trap_exit, true) BIF, processes can receive the EXIT signals as Erlang messages in their mailbox instead of terminating. Riak uses EXIT signals to monitor the well-being of helper processes performing non-critical work initiated by the request-driving finite state machines. When these helper processes terminate abnormally, the EXIT signal allows the parent to either ignore the error or restart the process. 15.2. Process Skeletons We previously introduced the notion that processes follow a common pattern regardless of the particular purpose for which the process was created. To start off, a process has to be spawned and then, optionally, have its alias registered.

When in the body of the receive-evaluate loop, processes will receive different messages and handle them in different ways. Finally, on termination, the cleanup will vary from process to process. So, even if a skeleton of generic actions exists, these actions are complemented by specific ones that are directly related to the tasks assigned to the process. Using this skeleton as a template, programmers can create Erlang processes that act as servers, finite state machines, event handlers and supervisors. But instead of re-implementing these patterns every time, they have been placed in library modules referred to as behaviors. They come as part as the OTP middleware. 15.3. OTP Behaviors The core team of developers committing to Riak is spread across nearly a dozen geographical locations. Without very tight coordination and templates to work from, the result would consist of different client/server implementations not handling special borderline cases and concurrency-related errors.

On Language: Chomsky's Classic Works Language and Responsibility and Reflections on Language in One Volume by Noam Chomsky, Mitsou Ronat

conceptual framework, finite state, Paul Samuelson, theory of mind

.: An intuition again founded on the same anti-empiricism. In my view, a finite state Markov source model might reasonably be considered as characterizing something like the outer limits of empiricist learning theory. In fact, a mathematical psychologist and logician, Patrick Suppes, gave precise expression to this intuition, or one version of it, a few years ago. He proved that a certain very rich version of stimulus-response learning theory must remain within the limits of finite state sources of the kind we have been discussing. He considered this to be a positive result. To me it seemed to be a negative result. The reason is this. As has been known for a long time, even elementary systems of knowledge cannot be represented in terms of finite state Markov sources—for example, our knowledge of English, or even much simpler systems, such as propositional calculus.

The viewpoint is very different. M.R.: Certain mathematical theories have seduced many linguists. I’m thinking of their “historic” encounters with telecommunications engineers . . . N.C.: Well, at the end of the forties and the beginning of the fifties, there were important developments in the mathematical theory of communication, information theory, and the theory of automata. Technically, models such as finite state Markov sources were proposedh . . . Very often it was supposed that these models were appropriate for the description of language. Jakobson referred to this vaguely, but Hockett utilized them quite explicitly. In 1955 he proposed a theory of language structure based on a Markov source model borrowed from the mathematical theory of communication. Similar theories were developed by psychologists, engineers, mathematicians.

.: Yes, and I hope that these studies10 will continue to be pursued, as well as the mathematical investigation of transformational grammars. There has been some interesting recent work by Stanley Peters and Robert Ritchie on this latter topic. Returning to the earlier point, it seems clear that empiricist learning theories are much too limited to be adequate; and it is even possible to demonstrate this if we accept the assumption that finite state Markov sources are the richest systems that can be attained by such theories, at the limit. This conclusion does not seem to be unreasonable to me, although naturally it is not a precise conclusion because the notion “empiricist theory” is not well defined. M.R.: Did you link your critique of-these theories right away to the critique of structural linguistics? N.C.: Well, indirectly. These theories were then very much in fashion, and they even aroused a certain degree of euphoria, I think it is fair to say.

pages: 661 words: 187,613

The Language Instinct: How the Mind Creates Language by Steven Pinker

Albert Einstein, cloud computing, David Attenborough, double helix, Drosophila, elephant in my pajamas, finite state, illegal immigration, Joan Didion, Loebner Prize, mass immigration, Maui Hawaii, meta analysis, meta-analysis, MITM: man-in-the-middle, natural language processing, out of africa, phenotype, rolodex, Ronald Reagan, Sapir-Whorf hypothesis, Saturday Night Live, speech recognition, Steven Pinker, theory of mind, transatlantic slave trade, Turing machine, Turing test, twin studies, Yogi Berra

Goldwasser shut his eyes to draw the next card. It turned out to read in these days when. He pondered whether to select it is fashionable to scoff at the traditional morality of marriage and family life or it is no longer fashionable to scoff at the traditional morality of marriage and family life. The latter had more of the form’s authentic baroque splendor, he decided. Let’s call this a word-chain device (the technical name is a “finite-state” or “Markov” model). A word-chain device is a bunch of lists of words (or prefabricated phrases) and a set of directors for going from list to list. A processor builds a sentence by selecting a word from one list, then a word from another list, and so on. (To recognize a sentence spoken by another person, one just checks the words against each list in order.) Word-chain systems are commonly used in satires like Frayn’s, usually as do-it-yourself recipes for composing examples of a kind of verbiage.

Sleeping esophagus: Twain, “Double-Barreled Detective Story.” Example from Lederer, 1990. Pobbles: Edward Lear, “The Pobble Who Has No Toes.” Jabber-wocky: Carroll, 1871/1981. Colorless green ideas: Chomsky, 1957. Automated news story: Frayn, 1965. Example from Miller, 1967. Gobbledygook generators: Brandreth, 1980; Bolinger, 1980; Spy magazine, January 1993. Approximations to English: Miller & Selfridge, 1950. Finite-state devices and their problems: Chomsky, 1957; Miller & Chomsky, 1963; Miller, 1967. TV Guide example from Gleitman, 1981. Cook with round bottom: Columbia Journalism Review, 1980; Lederer, 1987. Impenetrable Chomsky: Chomsky, 1986, p. 79. Textbooks on modern grammatical theory: Friedin, 1992; Radford, 1988; Riemsdijk & Williams, 1986. Sex between parked cars: Columbia Journalism Review, 1980.

A set of affixes, positions, or word forms that a language uses to distinguish the different roles of the participants in some event or state. Cases typically correspond to the subject, object, indirect object, and the objects of various kinds of prepositions. In English, case is what distinguishes between I, he, she, we, they, which are used for subjects, and me, him, her, us, them, which are used for objects of verbs, objects of prepositions, and everywhere else. chain device. See finite-state device. chromosome. A long strand of DNA, containing thousands of genes, in a protective package. There are twenty-three chromosomes in a human sperm or egg; there are twenty-three pairs of chromosomes (one from the mother, one from the father) in all other human cells. clause. A kind of phrase that is generally the same thing as a sentence, except that some kinds of clause can never occur on their own but only inside a bigger sentence: THE CAT IS ON THE MAT; John arranged FOR MARY TO GO; The spy WHO LOVED ME disappeared; He said THAT SHE LEFT.

When Computers Can Think: The Artificial Intelligence Singularity by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann, Michelle Estes

3D printing, AI winter, anthropic principle, artificial general intelligence, Asilomar, augmented reality, Automated Insights, autonomous vehicles, availability heuristic, blue-collar work, brain emulation, call centre, cognitive bias, combinatorial explosion, computer vision, create, read, update, delete, cuban missile crisis, David Attenborough, Elon Musk,, epigenetics, Ernest Rutherford, factory automation, feminist movement, finite state, Flynn Effect, friendly AI, general-purpose programming language, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, industrial robot, Isaac Newton, job automation, John von Neumann, Law of Accelerating Returns, license plate recognition, Mahatma Gandhi, mandelbrot fractal, natural language processing, Parkinson's law, patent troll, patient HM, pattern recognition, phenotype, ransomware, Ray Kurzweil, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, sorting algorithm, speech recognition, statistical model, stem cell, Stephen Hawking, Stuxnet, superintelligent machines, technological singularity, Thomas Malthus, Turing machine, Turing test, uranium enrichment, Von Neumann architecture, Watson beat the top human players on Jeopardy!, wikimedia commons, zero day

So a table of 3n3 states is required to hold all the phones, i.e. about 500 entries for 50 different phones. Hidden Markov models Modern speech understanding systems then feed the result into hidden Markov models, which are a generalization of finite state machines. To understand finite state machines, consider the following secret message, which was encrypted using a simple pen and paper cipher known as single transposition:SONERENEYDMMO The cipher can be decrypted to reveal the following plain text:SENDM OREMO NEY But pen and paper ciphers do not include a space character, so it is difficult to recognize the words that are within it. The following finite state machine addresses this problem. In it, states are represented as numbered circles and transitions as arrows. The system starts at node 0 and then looks for the first letter. If it is an “S” then we move to state 1, but if we see an “M” then we move to state 5, in which case an “O” would move it to state 6.

If it is an “S” then we move to state 1, but if we see an “M” then we move to state 5, in which case an “O” would move it to state 6. It can be seen that the letters “M”, “O”, “N”, “E”, and “Y” will result in the machine being in state 11, which represents the word “MONEY”. A large number of words can be loaded into a finite state machine which can then disambiguate them very efficiently. Finite state machine for recognizing words. Owned A hidden Markov model also has states and transitions. However, the transitions are probabilistic, so that a given input may produce several transitions. Further, the actual states and transitions are hidden, so all that is known is the input phones and the resulting words. There are clever algorithms that can learn these transitions from examples. It is also possible to train recurrent artificial neural networks to handle this type of problem.

pages: 480 words: 99,288

Mastering ElasticSearch by Rafal Kuc, Marek Rogozinski

Amazon Web Services, create, read, update, delete,, fault tolerance, finite state, full text search, information retrieval

This codec may give you performance boost on commonly used fields, but should be used with caution, as it is very memory intensive, because the terms and postings arrays need to be stored in the memory. Note Since all the terms are held in the byte array you can have upto 2.1 GB of memory used for this per segment. memory: As its name suggests, this codec writes all the data to disk, but reads the terms and post listings into the memory, using a structure called FST (Finite State Transducers). More information about this structure can be found in a great post by Mike McCandless at Because of storing the data in memory, this codec may result in performance boost for commonly used terms. bloom_default: It is an extension of the default codec that adds the functionality of a bloom filter that is written to the disk. When reading, the bloom filter is read and held into memory to allow very fast checking if a given value exists.

This allows us to create the autocomplete functionality in a very effective way, because complicated structures are stored in the index instead of being calculated during query time. Although this suggester is not about correcting user spelling mistakes we thought that it will be good to show at least a simple example of this highly efficient suggester. The logic behind completion suggester The prefix suggester is based on the data structure called FST (Finite State Transducer) ( Although it is highly efficient, it may require significant resources to build on systems with large amount of data in them, systems that ElasticSearch is perfectly suitable for. If we like to build these structures on the nodes after each restart or cluster state change we may lose performance. Because of that ElasticSearch creators decided to create FST-like structures during index time and store it in the index so that it can be loaded into memory when needed.

Principles of Protocol Design by Robin Sharp

accounting loophole / creative accounting, business process, discrete time, fault tolerance, finite state, Gödel, Escher, Bach, information retrieval, loose coupling, MITM: man-in-the-middle, packet switching, RFC: Request For Comment, stochastic process

For each of the N parties to an N-peer communication, the protocol defines a language, whose sentences are the legal sequences of messages received by that party, and whose alphabet of symbols is the set of all possible messages. A machine to obey the rules of the protocol must thus essentially be a recogniser for the protocol language. For simple protocols, this is a useful abstraction, as the language is regular or at most context-free, and standard compiler techniques [124] can be used to implement the machine as a finite state machine or push-down automaton respectively. A trivial example of a simple protocol described in this way is given below. It is a type of stop-and-wait protocol. The sender requests the receiver to indicate when it is ready to receive data, and waits for this indication. On receipt of the indication, the sender sends the data and waits for an acknowledgment. The exchange of messages is as shown in Figure 1.1.

The exchange of messages is as shown in Figure 1.1. The languages to be recognised by the sender and receiver 1.2 Protocols as Processes 3 respectively (and of course to be generated by the receiver and sender respectively) are defined by the BNF: sender ::= readyindication acknowledge sender receiver ::= requesttoaccept data receiver Each party must generate and recognise sentences of a regular language. This is a simple task for a finite state machine. Unfortunately, there are some important objections to this language-oriented view of a protocol. The first is a practical objection: Simple languages generally do not correspond to protocols which can tolerate faults, such as missing or duplicated messages. Protocols which are fault-tolerant often require the use of state machines with enormous numbers of states, or they may define context-dependent languages.

There exists no local state of the participant such that its concurrency set contains both an abort and a commit state. 2. All states whose concurrency set contains a commit state are committable states. A committable state is one in which the participant knows with certainty that all the other participants have committed or will do so. The major states of Protocol 16 can be illustrated by the finite-state machines shown in Figure 5.9(a). The concurrency set of the state wc in the FSM for the co the concurrency set of the states wi in the FSMs ordinator is i {qi , wi , ai }, while for the slaves is {wc , ac , cc } j=i {q j , w j , a j , c j }, so the states w fail to fulfil either of the conditions of Theorem 5.1. Thus Protocol 16 is a blocking protocol. The simplest way to create a non-blocking protocol is then to introduce an extra round of message exchange: After receiving only votes for COMMIT from the slaves, the coordinator sends a PREPARE, rather than a COMMIT, message to all of them.

pages: 130 words: 11,880

Optimization Methods in Finance by Gerard Cornuejols, Reha Tutuncu

asset allocation, call centre, constrained optimization, correlation coefficient, diversification, finite state, fixed income, frictionless, frictionless market, index fund, linear programming, Long Term Capital Management, passive investing, Sharpe ratio, transaction costs, value at risk

Furthermore, since we have a strictly feasible (and optimal) dual solution, any optimal solution of the primal must have tight constraints, indicating that there is no type-B arbitrage. 3.2 Arbitrage Detection Using Linear Programming The linear programming problems (3.4) and (3.4) we formulated for the proof of Theorem 3.2 can naturally be used for detection of arbitrage opportunities. However, as we discussed above, this argument works only for finite state spaces. In this section, we discuss how LP formulations can be used to detect arbitrage opportunities without limiting consideration to finite state spaces. The price we pay for this flexibility is the restriction on the selection of the securities: we only consider the prices of a set of derivative securities written on the same underlying with same maturity. This discussion is based on [7]. Consider an underlying security whose current (time 0) price is given by S0 and its (random) price at time 1 is denoted by S1 .

Writing Effective Use Cases by Alistair Cockburn

business process,, create, read, update, delete, finite state, index card, information retrieval, iterative process, recommendation engine, Silicon Valley, web application

While it is not for the requirements document to describe the UI design, it is useful to augment the requirements document with samples of the UI design as it evolves. This design information can add to the readability of the requirements document, since it give both textual (abstract) and visual (concrete) renditions of the system’s behavior. The UI design has three levels of precision, low, medium and high: • The low-precision description of the user interface is a screen-navigation diagram, drawn as a finite state machine or statechart. Each state is the name of a screen the user will encounter. The finite state machine shows what user events cause movement from one screen to another. • The medium-precision description is a drawing or reduced size snapshot of the screen. Place this at the end of the use case, so that readers can both see and read what design is being nominated. • The high-precision description lists all the field types, lengths and validation checks of each screen, and does not belong in the requirements document at all!

The programmer and the user interface designer need to know what exactly is meant by address, which fields it contains, the lengths of the fields, the validation rules for addresses, zip codes, phone numbers, and the like. All of this information belongs in the requirements somewhere - and not in the use case! Use cases are only "chapter three" of the requirements document, the behavioral requirements. They do not contain performance requirements, business rules, user interface design, data descriptions, finite state machine behavior, priority, and probably some other information. "Well, where are those requirements?!" the system developers cry! It is all very well to say the use cases shouldn’t contain those, but they have to get documented sometime. Some of the information can, in fact, be attached to each use case as associated information. This might include * use case priority * expected frequency of occurrence * performance needs * delivery date * list of secondary actors * business rules (possibly) * open issues Different projects adjust this collection of information to contain whatever they consider important.

pages: 706 words: 120,784

The Joy of Clojure by Michael Fogus, Chris Houser

cloud computing, domain-specific language, Donald Knuth, Douglas Hofstadter,, finite state, Gödel, Escher, Bach, haute couture, Larry Wall, Paul Graham, rolodex, traveling salesman

This benefit will likely cause recur to live on, even should the JVM acquire TCO. 16 The Scala 2.8 compiler recognizes a @tailrec annotation and triggers an error whenever a marked function or method can’t be optimized. The final benefit of recur is that it allows the forms fn and loop to act as anonymous recursion points. Why recur indeed. 7.3.3. Don’t forget your trampoline We touched briefly on the fact that Clojure can also optimize a mutually recursive function relationship, but like the tail-recursive case, it’s done explicitly. Mutually recursive functions are nice for implementing finite state machines (FSA), and in this section we’ll show an example of a simple state machine modeling the operation of an elevator (Mozgovoy 2009) for a two-story building. There are only four states that the elevator FSA allows: on the first floor with the doors open or closed and on the second floor with the door open or closed. The elevator can also take four distinct commands: open doors, close doors, go up, and go down.

We’d like to create a function elevator that starts in the ff-open state, takes a sequence of commands, and returns true or false if they correspond to a legal schedule according to the FSA. For example, the sequence [:close :open :done] would be legal, if not pointless, whereas [:open :open :done] wouldn’t be legal, because an open door can’t be reopened. The function elevator could be implemented as shown next. Listing 7.6. Using mutually recursive functions to implement a finite state machine Using letfn in this way allows you to create local functions that reference each other, whereas (let [ff-open #(...)] ...) wouldn’t, because it executes its bindings serially. Each state function contains a case macro that dispatches to the next state based on a contextually valid command. For example, the sf-open state will transition to the sf-closed state given a :close command, will return true on a :done command (corresponding to a legal schedule), or will otherwise return false.

encapsulation, 5th block-level encapsulation local encapsulation namespace encapsulation Enlive enumeration values enumerator env ephemeral equality, 2nd, 6th, 7th equality partitions, 2nd, 3rd equality semantics error handling, 2nd, 3rd escaped evaluation contextual-eval, 2nd eval, 2nd meta-circular exceptions, 5th, 9th, 10th, 20th exceptions exceptions exceptions exceptions catch, 2nd checked compile-time, 2nd ConcurrentModification-Exception finally, 2nd handling java.lang.ClassCastException java.lang.Exception java.lang.NullPointer-Exception java.lang.RuntimeException runtime runtime vs. compile-time throw, 2nd, 3rd expand-clause expansion expected case experimentation expression problem extend, 2nd, 3rd, 4th extend-protocol, 2nd extend-type, 2nd Extensible Markup Language (XML), 2nd, 3rd F Factor (programming language), 2nd factory methods fail, 2nd false, 3rd evil-false Fantom (programming language) fence post errors filter, 2nd, 3rd, 4th, 5th find-doc find-ns finite state machines first, 2nd, 3rd, 4th, 5th First In, First Out (FIFO), 2nd First In, Last Out (FILO) first-class, 2nd, 3rd fixed-size pool FIXO, 3rd, 5th fixo-peek fixo-push, 2nd, 3rd flexibility float, 2nd floating point, 2nd, 5th overflow rounding error underflow, 2nd floats fluent builder FluentMove fn, 2nd, 3rd, 4th, 5th, 6th for, 2nd force, 2nd, 3rd forever form free variables freedom to focus frequencies Frink (programming language), 2nd frustrating fully qualified, 2nd, 3rd fun functions, 6th anonymous, 2nd, 3rd, 4th arity Calling Functions dangerous function signatures local multiple function bodies named arguments G Gang of Four garbage collection, 2nd, 3rd gcd gen-class, 2nd, 3rd, 4th, 5th, 6th generalized tail-call optimization, 2nd generic genotype gensym get, 2nd, 3rd, 4th get-in getter global hierarchy map goal Gödel, Escher, Bach: An Eternal Golden Braid good-move Graham, Paul, 2nd graphic graphical user interface (GUI), 2nd, 3rd, 4th graphics context greatest common denominator, 2nd green thread Greenspun’s Tenth Rule, 2nd Groovy (programming language) H Halloway, Stuart, 2nd has hash maps hash-map, 2nd, 3rd, 4th Haskell (programming language), 2nd, 3rd, 4th, 5th, 7th, 9th, 11th, 13th out of order execution Template Haskell typeclasses, 2nd heuristic Hickey, Rich, 2nd hidden hierarchy history homoiconicity, 2nd hooks, 2nd hops host semantics Hoyte, Doug hyphens I I/O, 2nd, 3rd idempotent, 2nd identical?

Monte Carlo Simulation and Finance by Don L. McLeish

Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, discrete time, distributed generation, finite state, frictionless, frictionless market, implied volatility, incomplete markets, invention of the printing press, martingale, p-value, random walk, Sharpe ratio, short selling, stochastic process, stochastic volatility, survivorship bias, the market place, transaction costs, value at risk, Wiener process, zero-coupon bond, zero-sum game

This uses the simplest form (1) of the binomial generator and is not computationally efficient for large n. In R, rbinom(m,n,p) will generate a vector of length m of Binomial(n, p) variates. Random Samples Associated with Markov Chains Consider a finite state Markov Chain, a sequence of (discrete) random variables X1 , X2 , . . .each of which takes integer values 1, 2, . . . N (called states). The number of states of a Markov chain may be large or even infinite and it is not always convenient to label them with the positive integers and so it is common to define the state space as the set of all possible states of a Markov chain, but we will give some examples of this later. For the present we restrict attention to the case of a finite state space. The transition probability matrix is a matrix P describing the conditional probability of moving between possible states of the chain, so that P [Xn+1 = j|Xn = i] = Pij , i = 1, . . .

Of course the real power of Gibbs Sampling is achieved in problems that are not two-dimensional such as the example above, but have dimension sufficiently high that calculating the sums or integrals in the denominator of expressions like (3.31) is not computationally feasible. Coupling From the Past: Sampling from the stationary distribution of a Markov Chain All of the above methods assume that we generate from the stationary distribution of a Markov chain by the “until Hele freezes over” method, i.e. wait until run the chain from an arbitrary starting value and then delete the initial transient. An alternative elegant method that is feasible at least for some finite state Markov chains is the method of “coupling from the past” due to Propp and Wilson (1996). We assume that we are able to generate transitions in the Markov Chain. In other words if the chain is presently in state i at time n we are able to generate a random variable Xn+1 from the distribution proportional to Pij , j = 1, ...K. Suppose F (x|i) is the cumulative distribution function P (Xn+1 · x|Xn = i) and let us denote its inverse by F −1 (y|i).

Applied Cryptography: Protocols, Algorithms, and Source Code in C by Bruce Schneier

active measures, cellular automata, Claude Shannon: information theory, complexity theory, dark matter, Donald Davies, Donald Knuth, dumpster diving, Exxon Valdez, fault tolerance, finite state, invisible hand, John von Neumann, knapsack problem, MITM: man-in-the-middle, NP-complete, P = NP, packet switching, RAND corporation, RFC: Request For Comment, software patent, telemarketer, traveling salesman, Turing machine, web of trust, Zimmermann PGP

Put the same stuff into two identical computers, and the same stuff comes out of both of them. A computer can only be in a finite number of states (a large finite number, but a finite number nonetheless), and the stuff that comes out will always be a deterministic function of the stuff that went in and the computer’s current state. That means that any random-number generator on a computer (at least, on a finite-state machine) is, by definition, periodic. Anything that is periodic is, by definition, predictable. And if something is predictable, it can’t be random. A true random-number generator requires some random input; a computer can’t provide that. Pseudo-Random Sequences The best a computer can produce is a pseudo-random-sequence generator. What’s that? Many people have taken a stab at defining this formally, but I’ll hand-wave here.

Real Random Sequences Now we’re drifting into the domain of philosophers. Is there such a thing as randomness? What is a random sequence? How do you know if a sequence is random? Is “101110100” more random than “101010101”? Quantum mechanics tells us that there is honest-to-goodness randomness in the real world. But can we preserve that randomness in the deterministic world of computer chips and finite-state machines? Philosophy aside, from our point of view a sequence generator is real random if it has this additional third property: 3. It cannot be reliably reproduced. If you run the sequence generator twice with the exact same input (at least as exact as humanly possible), you will get two completely unrelated random sequences. The output of a generator satisfying these three properties will be good enough for a one-time pad, key generation, and any other cryptographic applications that require a truly random sequence generator.

A full discussion on these four approaches and the research surrounding them is well beyond the scope of this book. See [1047,1355] for a good introduction to the topic; I am only going to touch on the major contributions to the field. The first approach treats a cryptographic protocol as any other computer program and attempts to prove correctness. Some researchers represent a protocol as a finite-state machine [1449,1565], others use extensions of first-order predicate calculus [822], and still others use specification languages to analyze protocols [1566]. However, proving correctness is not the same as proving security and this approach fails to detect many flawed protocols. Although it was widely studied at first, most of the work in this area has been redirected as the third approach gained popularity.

pages: 239 words: 64,812

Geek Sublime: The Beauty of Code, the Code of Beauty by Vikram Chandra

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, British Empire, business process, conceptual framework, create, read, update, delete, crowdsourcing, don't repeat yourself, Donald Knuth, East Village, European colonialism, finite state, Firefox, Flash crash, glass ceiling, Grace Hopper, haute couture, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Norman Mailer, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, supercomputer in your pocket, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce

The embodied language of websites, apps, and networks writes itself into us. One of the pioneers of the newly minted field of Aesthetic Computing, Paul Fishwick, points out that digital watches and DVRs surface the abstract computer-science notion of “finite state machines” in their menus—each time you press the “mode” button you move from one state to another: The way in which our thinking is changing culturally surfaces deep abstract concepts in computing to us as we use these devices: from number, to information structure, to process … It is not just that the finite state machine is embedded within the watch’s silicon, but also that the human wearing the watch becomes aware of this virtual machine’s structure and its components through the experience of using the watch. The state machine changes how the wearer thinks, even though the wearer is probably unaware of the formal mathematical notation of a state machine.

pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden

Benevolent Dictator For Life (BDFL), business intelligence, business process, cellular automata, cloud computing, commoditize, complexity theory, conceptual framework, continuous integration, data acquisition, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, general-purpose programming language, Guido van Rossum, HyperCard, information retrieval, iterative process, John von Neumann, Larry Wall, linear programming, loose coupling, Mars Rover, millennium bug, NP-complete, Paul Graham, performance metric, Perl 6, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Ruby on Rails, Sapir-Whorf hypothesis, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

We didn’t take it out of that domain. How do you make the idea of syntax-driven transformations accessible to users who might not know very much or anything at all about finite-state machines and push-down automata? Al: Certainly as a user of AWK, you don’t need to know about these concepts. On the other hand, if you’re into language design and implementation, knowledge of finite-state machines and context-free grammars is essential. Should a user of lex or yacc understand the context-free grammar even if the programs they produce don’t require their users to understand them? Al: Most users of lex can use lex without understanding what a finite-state machine is. A user of yacc is really writing a context-free grammar, so from that perspective, the user of yacc certainly gets to appreciate grammars, but the user doesn’t have to become a formal language theorist to use yacc.

Al: Perhaps the greatest surprise has been its broad applicability. Let me interpret automata theory as formal languages and the automata that recognize them. Automata theory provides useful notations, particularly regular expressions and context-free grammars, for describing the important syntactic features of programming languages. The automata that recognize these formal languages, such as finite-state machines and push-down automata, can serve as models for the algorithms used by compilers to scan and parse programs. Perhaps the greatest benefit of automata theory to compiling comes from being able to build compiler-construction tools such as lex and yacc that automate the construction of efficient scanners and parsers based on these automata. What is preventing us from building a compiler (and/or a language) that identify all potential bugs?

My long-term vision is that through the use of stronger languages, more powerful verification tools, and better software-engineering practices, software will improve in reliability and quality. How can we design pattern-matching algorithms that take advantage of concurrency in multicore hardware? Al: This is currently an active research area. Many researchers are exploring parallel hardware and software implementations of pattern-matching algorithms like the Aho-Corasick algorithm or finite-state algorithms. Some of the strong motivators are genomic analyses and intrusion detection systems. What motivated you and Corasick to develop the Aho-Corasick algorithm? Al: The origin has a very interesting story behind it. I was working on the book The Design and Analysis of Computer Algorithms [Addison-Wesley] with John Hopcroft and Jeffrey Ullman back in the early 70s. I was giving a lecture at Bell Labs on algorithm design techniques.

The Art of Scalability: Scalable Web Architecture, Processes, and Organizations for the Modern Enterprise by Martin L. Abbott, Michael T. Fisher

always be closing, anti-pattern, barriers to entry, Bernie Madoff, business climate, business continuity plan, business intelligence, business process, call centre, cloud computing, combinatorial explosion, commoditize, Computer Numeric Control, conceptual framework, database schema, discounted cash flows,, fault tolerance, finite state, friendly fire, hiring and firing, Infrastructure as a Service, inventory management, new economy, packet switching, performance metric, platform as a service, Ponzi scheme, RFC: Request For Comment, risk tolerance, Rubik’s Cube, Search for Extraterrestrial Intelligence, SETI@home, shareholder value, Silicon Valley, six sigma, software as a service, the scientific method, transaction costs, Vilfredo Pareto, web application, Y2K

If it was a duration calculation for a query, this information would be passed to it upon invocation. You may recall from a computer science computational theory class the description of Mealy and Moore machines, which are known as state machines or finite state machines. A state machine is an abstract model of states and actions that is used to model behavior; these can be implemented in the real world in either hardware or software. There are other ways to model or describe behavior of an application, but the state machine is one of the most common. Mealy Moore Machines A Mealy machine is a finite state machine that generates output based on the input and the current state of the machine. A Moore machine, on the other hand, is a finite state machine that generates output based solely on the current state. A very simple example of a Moore machine is a turn signal that alternates on and off.

The input is a car at the intersection waiting on the light. The output is determined by the current state of the light as well as the input. If a car is waiting and the current state is red, the signal gets turned to green. Obviously, these are both overly simplified examples, but you get the point that there are different ways of modeling behavior using states, inputs, outputs, and actions. Given that finite state machines are one of the fundamental aspects of theoretical computer science as mathematically modeled by automatons, it is no wonder why this is a fundamental structure of our system designs. But why exactly do we see state in almost all of our programs, and are there alternatives? The reason that most applications rely on state is that the languages used for Web based or Software as a Service (SaaS) development are almost all imperative based.

Additionally, we discussed why it is important to have individuals like architects and managers overseeing the entire system design to help point out to engineers when asynchronous calls could be warranted. Another topic that we covered in this chapter was the use of state in an application. We started with what is state within application development. We then dove into a discussion in computational theory on finite state machines and concluded with a distinction between imperative and declarative languages. We finished the stateful versus stateless conversation with one of the most commonly used implementations of state: that being the session state. Session as we defined it was an established communication between the client, typically the user’s browser, and the server, that gets maintained during the life of the session for that user.

pages: 593 words: 118,995

Relevant Search: With Examples Using Elasticsearch and Solr by Doug Turnbull, John Berryman

commoditize, crowdsourcing, domain-specific language, finite state, fudge factor, full text search, information retrieval, natural language processing, premature optimization, recommendation engine, sentiment analysis

Building fast completions via specialized search indexes Because completion is such an important element of relevance feedback, Elasticsearch introduced a specialized component called the completion suggester. This component circumvents the performance problem referenced previously and allows for custom sorting of completion results (rather than sorting by occurrence). Effectively, the completion suggester is a specialized search index that’s stored in parallel with the normal search index. It’s backed by a compact data structure (a finite state transducer) that provides a fast prefix-lookup capability. In many ways, this approach is the ideal solution for completion, but as you’ll see in a moment, it introduces a couple of problems of its own. Setting up the completion suggester is simple: you just declare one of the fields to be of type completion. Listing 8.5. Setting up Elasticsearch’s completion suggester In principle, you could copy the title field to the completion field and index the documents in the same way as demonstrated in the preceding section.

The completion implementation then moves on to find completions beginning with just star. Here, based on the context of the entire query, it would be unreasonable to return anything but Star Trek movies. But in the case of the Elasticsearch completion suggester, the top completion result is Star Wars: Episode IV—A New Hope. This demonstrates that the completion suggester is unaware of the search context. Another unfortunate consequence with using a finite state transducer to back completion is that the data structure is immutable. If a document is deleted from the index, the completions for that document will still exist. Currently, the only remedy for this situation is a full-index optimization, which effectively rebuilds the completion index from scratch. Despite the possible pitfalls, the completion suggester is an important tool for relevance feedback.

simple signals, 2nd Solr additive, with Boolean queries boosting feature mappings multiplicative, with function queries user ratings vs. filtering breadcrumb navigation browse experience browse interface, Yowl buckets section building signals bulk index API bulkMovies string business and domain awareness business concerns group business weight business-ranking logic BusinessScore C field, 2nd, 3rd, 4th, 5th scores field, 2nd, 3rd, 4th character filtering, 2nd, 3rd character offsets classic similarity classification features cleaning click-through rate co-occurrence counting cold-start problem COLLAB_FILTER filter, 2nd collaboration filtering, using co-occurrence counting search relevance and collation collocation extraction combining fields committed documents common words, removing completion field, 2nd completion suggester completion_analyzer completion_prefix variable complexphrase query parser compound queries, 2nd, 3rd concept search basic methods for building augmenting content with synonyms concept signals building using machine learning personalized search and configurations conflate tokens constant_score query, 2nd content augmentation curation engineer/curator pairing risk of miscommunication with content curator role of content curator exploring extracting into documents providing to search engine searching content group content weight, 2nd ContentScore control analysis controlling field matching converge conversion rate coord (coordinating factor), 2nd, 3rd, 4th, 5th, 6th, 7th copyField, 2nd, 3rd copy_to option, 2nd cosine similarity cross_fields, 2nd, 3rd searching, 2nd, 3rd, 4th Solr solving signal discordance with cuisine field cuisine_hifi field, 2nd cuisine_lofi field curation, search relevance and custom all field custom score query D data-driven culture debugging example search application Elasticsearch first searches with The Movie Database Python matching query matching analysis to solve matching issues comparing query to inverted index fixing by changing analyzers query parsing underlying strategy ranking computing weight explain feature scoring matches to measure relevance search term importance similarity vector-space model decay functions, 2nd deep paging default analyzer defType parameter delimiters acronyms modeling specificity phone numbers synonyms tokenizing geographic data tokenizing integers tokenizing melodies deployment, relevance-focused search application description field, 2nd, 3rd, 4th, 5th descriptive query directors field field, 2nd, 3rd, 4th score, 2nd, 3rd disable_coord option disabling tokenization discriminating fields DisjunctionMaximumQuery dismax, 2nd doc frequency, 2nd doc values document search and retrieval aggregations Boolean search facets filtering Lucene-based search positional and phrase matching ranked results relevance sorting document-ranking system documents analysis enhancement enrichment extraction flattening nested grouping similar matching meaning of scored search completion from documents being searched tokens as features of matching process meaning of documents dot character dot product, 2nd down-boosting title DSL (domain-specific language) E e-commerce search, 2nd easy_install utility edismax query parser, 2nd Elasticsearch example search application overview end sentinels engaged field engaged restaurants English analyzer overview reindexing with english_* filters english_bigrams analyzer english_keywords filter english_possessive_stemmer filter english_stemmer filter english_stop filter enrichment, 2nd ETL (extract, transform, load), 2nd every field gets a vote exact matching, 2nd, 3rd, 4th expert search, 2nd, 3rd explanation field external sources extract function, 2nd, 3rd extracting features extraction F faceted browsing overview Solr facet.prefix option facets, 2nd, 3rd fail fast, 2nd, 3rd, 4th fast vector highlighter feature modeling, 2nd feature selection feature space features creation of overview, 2nd, 3rd feedback at search box search completion search suggestions search-as-you-type business and domain awareness content curation risk of miscommunication with content curator role of content curator in search results listing grouping similar documents information presented snippet highlighting when there are no results search relevance and Solr faceted browsing field collapsing match phrase prefix relevance feedback feature mappings suggestion and highlighting components while browsing alternative results ordering breadcrumb navigation faceted browsing field boosts field collapsing overview Solr field discordance field mappings field normalization field scores, 2nd field synchronicity, signal modeling and field-by-field dismax field-centric methods, 2nd field-centric search, combining term-centric search and combining greedy search and conservative amplifiers like fields precision vs. recall Solr fieldNorms, 2nd, 3rd fields fieldType field_value_factor function fieldWeight, 2nd, 3rd, 4th filter clause filter element filter queries filtering Amazon-style collaborative overview using co-occurrence counting score shaping vs. boosting finite state transducer fire token first_name field floating-point numbers fragment_size parameter fudge factors full bulk command full search string full-text search full_name field function decay function queries, multiplicative boosting with Boolean queries vs. combining high-value tiers scored with simple Solr function_score query, 2nd, 3rd, 4th G garbage features Gaussian decay generalizing matches generate_word_parts genres aggregation field geographic data, tokenizing geohashing geolocation, 2nd getCastAndCrew function GitHub repository granular fields grouping fields H has_discount field high-quality signals highlighted snippets highlights HTMLStripCharFilter HTTP commands, 2nd I ideal document IDF (inverse document frequency) ignoring when ranking overview, 2nd inconsistent scoring index-time analysis, 2nd index-time personalization indexing documents information and requirements gathering business needs required and available information users and information needs information retrieval, creating relevance solutions through inner objects innermost calculation integers, tokenizing inventory-related files inventory_dir configuration inverse document frequency.

The Science of Language by Noam Chomsky

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, British Empire, Brownian motion, dark matter, Drosophila, epigenetics, finite state, Howard Zinn, phenotype, statistical model, stem cell, Steven Pinker, theory of mind

I briefly address two questions: (i) where do linguistic hierarchical structures ‘come from,’ and (ii) what is their significance for understanding the way that language ‘works’? On the first question, it is important to recognize that it is difficult to avoid hierarchy. The successor function for generating the natural numbers introduces it, and in fact any finite state grammar introduces hierarchy: each additional element added in generating a string of words or other elements yields a larger set of which the prior-generated set is a subset. If one adds the operation of associativity, one can make the hierarchy disappear and end up with a string, but that requires an additional operation – and justification for introducing it. I do not mention a finite state grammar because it introduces the right hierarchy for natural languages. Chomsky demonstrated long ago that it did not (1957; see also Chomsky & Miller 1963). The issue is not how some kind of hierarchy is introduced, but rather how the ‘right’ kind is – the kind of hierarchy found in natural languages.

R. 26, 33, 197, 203, 268 Gandhi, Mahatma 114, 144 garden path sentences 50 Gauss, Carl F. 127 Gehring, Walter 46, 171, 258, 279 genes 46–49, 148, 173master genes 279, 280 and Merge 49 PAX-6 46, 280 universal genome 53 Gleitman, Lila 196 Godwin, Richard 123 Golinkoff, Roberta 196 Goodman, Nelson 81, 83, 88, 261, 285behaviorism 89, 285 and Chomsky 86–92 constructivism 285 nominalism 87, 91 personal relationship with Chomsky 91–92 Gould, Stephen J. 158, 170, 172, 173 grammar 277–278 and acquisition of language 24, 60 artifacts in theories of 238 extensional equivalence of 153 finite state 232 generative 63, 85, 91, 96, 99 generative capacity 236 phrase structure 233, 235 structure and hierarchy 236 transformational 25 ‘great leap forward' 13, 70, 179 growth 40, 73, 77cognitive growth 121 developmental constraints on 41, 45, 158 Haldane, J. B. S. 51, 53 Hale, Kenneth 17, 62 Halle, Morris 21 Hamilton, William D. 104 Harman, Gilbert 100 Harris, Zellig 38, 80, 81, 86 Hauser, Marc 100, 109, 286evolution of communication 12, 58 faculty of language 60, 170, 172, 268, 269 hearing 48 Helmholtz, Hermann von 73, 97 Herbert of Cherbury 181 Higginbotham, Jim 129, 130 Hirsh-Pasek, Kathy 196 homunculus 37, 290 Hornstein, Norbert 29, 183, 265 human behavior 138–151, 286 human evolution 2, 13, 71developmental constraints on 41 ‘great leap forward' 13, 70, 77 human nature 95–102, 108–112 and biological capacities 95 Chomsky on 95–102 determined and uniform 95, 99 distinctiveness of 176–179 enlightenment conception of 142 and evolution 103–107 ‘great leap forward' 179 moral agency 101 plasticity of 121 humanitarian intervention 121, 122, 287 humans, genetic variation 13 Hume, David 26, 90, 99, 106, 179color problem 247–248, 286 theory of moral nature 63, 99, 109 Huxley, Thomas 23 I-beliefs 153–156 definition of 156 I-concepts 153–156 definition of 155 I-language 81, 153–156, 164, 239, 258, 266intensional specification of 167 imagination 70, 161 inclusiveness 62, 281 induction 88, 90, 95 inference 73, 165, 221 information 208, 213, 218, 228, 229, 254pragmatic 30 semantic 29, 260 innateness 39–45, 60, 89, 91, 255, 267, 284 innatism 123 innovation 71, 74, 95, 177, 178, 185, 282technological 145 insects, study of 147 instinct 96, 143, 178, 181, 247, 248, 287 instrumentalism 211 intention (see also nativism) 163 internalism 6, 228, 248, 262–263, 269, 287and concepts 188, 190, 209, 255–257, 260, 272 intuitions 125, 126 island sentences 50 Jackendoff, Ray 170, 172 Jacob, François 24, 53, 60, 243 Joos, Martin 145 justice 120 Kahneman, Daniel 140 Kant, Immanuel 90 Kauffman, Stuart 21, 22, 266 Kayne, Richard 55, 84, 241 Keller, Helen 45 Kissinger, Henry 101, 107, 113, 287 Klein, Ralph 111 knowledge 70, 193See also information Kripke, Saul 126 Kropotkin, Peter 103, 111 languageand agency 124–128 as an animal instinct 178 and arithmetical capacities 16 and biology 21–30, 80, 235, 284 biophysical explanations of 208 and brain morphology 46 capacity for 70, 164 characteristic uses of 11–12 cognitive benefits of 2 competence and use 63 and complex thought 1 complexity of 52, 146 compositional character of 37 computational theory of 174, 272 and concepts 71, 198 conceptual resources of 212 displacement property 16 distinctive features 22 domination 232–238 expectations for 54 externalization of 52, 78, 79, 153, 222, 278 flexibility 95, 162, 197, 210, 224, 227 formal languages 16, 17, 289 formal theory of 21–30 functions of 11–20, 164, 165 generative capacity 49 head-first 240 hierarchical structure 232–238 I-language 153–156, 164, 239, 258, 266 interface conditions 25 internal 37 internal, individual and intensional 37, 154, 167 internal use of 52, 69, 124, 153, 160, 197, 262–263, 272–274 a ‘knowledge' system 187, 193 localization of 46, 59, 69–74 and mathematics 181 modularity of 59 movement property 16, 85, 108, 264–265 as a natural object 2, 7 nominalizing languages 155 open texture of 273 and other cognitive systems 271 phonetic features 42 phonological features 42, 57 precursors of 43, 77 properties of 22, 37, 60, 62 public language 153, 288 purposes of 224 and reason 181 result of historical events 84 rules of 165, 221, 223, 224, 225, 283, 284 and science 124–128 sounds available in 282 structural features of 42 structure of 236, 277–278 study of 36, 76, 79, 154See also linguistics theories of 164, 193, 239, 243, 285 unboundedness 177, 262 uniqueness to humans 150 variation in the use of 164, 239–242 language faculty 74, 172, 177, 243, 260, 261, 270adicity requirements of 198, 199 perfection of 50 language of thought 27, 71, 189, 190, 220, 230, 269 Lasnik, Howard 85 learning 95, 180, 200, 226, 281, 282empiricism and 173, 179 learning a language 187, 225, 226 Lenneberg, Eric 21, 43, 47, 59 Lepore, E. 195 Lewis, David 153, 165, 220, 222, 223, 224 Lewontin, Richard 58, 157, 170, 172, 173, 175, 231 lexical items 62categories of 234 origin of 46 liberalism 98 linguistic communities 222 linguistic development 39See also development linguistic practices 221, 223 linguistic principles 237, 276 linguistics 19, 36, 82, 145and biology 150 first factor considerations 45, 96, 148 and natural science 38 and politics 152 procedural theories in 149 second factor considerations 148, 277 structural 80 theories of 87, 265 third factor considerations:separate entry Locke, John 26, 125, 267personal identity 31, 271 secondary qualities 256 logic, formal 251 Logical Structure of Linguistic Theory 84–85 Lohndal, Terje 57 Lorenz, Konrad 21 Marx, Karl 122 mathematics 127, 165, 214, 215, 266capacity for 15, 136 formal functions in 166–169 and language 181 semantics for 251, 252 Mayr, Ernst 174 meaning 29, 98, 199, 206, 250, 252, 270, 273computational theory of 213 construction of a science of 226–230 externalist science of 209–220 methodology for a theory of 226, 227 study of 261 theories of 221 theory of 212, 214, 217, 226 Mehler, Jacques 55 Merge 16, 77, 91, 181, 236, 243, 263, 279–280 centrality of 41, 60, 62, 176, 245 consequences of 17 and edge properties 17, 41 Merge, external 17, 166, 201, 238, 263 Merge, internal 16, 25, 29, 85, 201, 238, 264 mutation giving rise to 43, 52 origin of 14, 15 Pair Merge 201, 264 and psychic identity 28 uniqueness to humans 25, 200, 205 metaphor 195 metaphysics 125, 157 Mikhail, John 63, 99, 100, 109, 129, 286 Mill, John Stuart 121, 122, 287 Miller, George 81 mindas a causal mechanism 138 computational sciences of 247 computational theory of 280 philosophy of 186, 255 place of language in 69–74 representational theory of 162, 188 science of 138–151, 212, 288 theory of 14 Minimalist Program 24, 83, 84, 233, 235–236, 237, 245, 246, 264and adaptationism 172 aim of 42, 199 simplicity and 80, 243, 285 modes of presentation (MOPs) 187, 190, 217, 219, 275roles of 218 morality 99, 100, 109, 287character of 110 conflicting systems 114 generation of action or judgment 110 moral truisms 101, 102 theories of 110, 135 trolley problems 109 and universalization 113–117 Moravcsik, Julius 164 morphemes 81, 149 morphology 52, 54, 195distributed 27 and syntax 200 Morris, Charles 250 Move 108 mutations 14, 43, 170, 171survival of 51, 53 mysterianism 97 Nagel, Thomas 98 Narita, Hiroki 57 nativism 187, 217, 283 natural numbers 204 natural sciences 18, 38 natural selection 58, 76, 104, 143, 157 Navajo language 277 neural networks 225 neurophysiology 74 Newton, Isaac 66, 67, 72, 88, 127, 134alchemy 67 nominalism 87, 91 non-violence 114 Norman Conquest 84 objective existence 169 optimism 118–123, 288 parameters 39–45, 54, 239–242, 277, 282, 283and acquisition of language 241 choice of 45, 83 developmental constraints in 243 functional categories 240 head-final 55, 240 headedness macroparameter 241, 276 linearization parameter 55 macroparameters 55 microparameters 55, 84, 241 polysynthesis 55 and simplicity 80 Peck, James 288 Peirce, Charles Sanders 96, 132, 184, 250abduction 168, 183, 246, 248 truth 133, 136 perfection 50–58, 172, 175, 263–264, 279 person, concept of 125, 126, 271, 284‘forensic' notion of 125 persuasion 114, 116 Pesetsky, David 30 Petitto, Laura-Ann 48, 78 phenomenalism 211 philosophers 129–131, 282, 283contribution of 129 contribution to science 129 philosophy 181accounts of visual sensations 255–257 of language 35, 273 of mind 186, 255 problems in 286 and psychology 140 phonemes 81 phonetic/phonological interfaces 161, 194, 253, 278 phonology 28, 40, 52, 54, 57, 109, 208 physicalism 187 physics 19, 65, 106, 144and chemistry 65 folk physics 72 theoretical 18, 65, 73, 100 Piattelli-Palmarini, Massimo 140, 246, 279 Pietroski, Paulconcepts 47, 199, 200, 209 semantics 198, 211, 223, 229, 254 Pinker, Steven 166, 170, 172, 176 Pirahã language 30 Plato 115 Plato's Problem 23, 195, 236, 244, 246, 266 Poincaré, Henri 65 politics 116, 119, 145, 146, 152 poverty of the stimulus observations 5, 23, 40, 177, 200, 227, 233, 262 power 120 pragmatic information 30 pragmatics 36, 130, 250–254, 289definition of 250 and reference 253 principles and parameters approach to linguistic theory 24, 53, 235, 236, 240, 245, 276language acquisition 60, 82, 83, 149 and simplicity 246 progress 118, 145, 183 projection problem 83, 89 prosody 37 psychic continuity 26, 205, 207, 271 psychology 219of belief and desire 138, 141 comparative 21 evolutionary 103–107, 111 folk psychology 72, 141 and philosophy 140 rationalistic 255 scientific 140 psychology, comparative 21 public intellectuals 122 Pustejovsky, James 164, 195 Putnam, Hilary 95, 126, 138 Quine, W.

pages: 721 words: 197,134

Data Mining: Concepts, Models, Methods, and Algorithms by Mehmed Kantardzić

Albert Einstein, bioinformatics, business cycle, business intelligence, business process, butter production in bangladesh, combinatorial explosion, computer vision, conceptual framework, correlation coefficient, correlation does not imply causation, data acquisition, discrete time, El Camino Real, fault tolerance, finite state, Gini coefficient, information retrieval, Internet Archive, inventory management, iterative process, knowledge worker, linked data, loose coupling, Menlo Park, natural language processing, Netflix Prize, NP-complete, PageRank, pattern recognition, peer-to-peer, phenotype, random walk, RFID, semantic web, speech recognition, statistical model, Telecommunications Act of 1996, telemarketer, text mining, traveling salesman, web application

In bioinformatics, genes are known to appear as local patterns interspersed between chunks of noncoding DNA. Matching and discovery of such patterns are very useful in many applications, not only in bioinformatics. Due to their readily interpretable structure, patterns play a particularly dominant role in data mining. There have been many techniques used to model global or local temporal events. We will introduce only some of the most popular modeling techniques. Finite State Machine (FSM) has a set of states and a set of transitions. A state may have transitions to other states that are caused by fulfilling some conditions within the state. An FSM must have an initial state, usually drawn with an arrow, and it is a state that provides a starting point of the model. Inputs to the states, in our case representing symbols in a sequence, act as triggers for the transition from one state to another state.

The machine reaches the final state when it has performed the procedure successfully, or in our case recognized a sequence pattern. An FSM can be represented using a state-transition table or state-transition diagram. Figures 12.20a,b shows both of these representations for a modeling recognition of a binary number with an even number of ones. FSM does not work very well when the transitions are not precise and does not scale well when the set of symbols for sequence representation is large. Figure 12.20. Finite-state machine. (a) State-transition table; (b) state-transition diagram. Markov Model (MM) extends the basic idea behind FSM. Both FSM and MM are directed graphs. As with FSM, MM always has a current state. Start and end nodes are drawn for illustrative purposes and need not be present. Unlike in FSM, transitions are not associated with specific input values. Arcs carry a probability value for transition from one state to another.

This process is continued till frequent sequences of all lengths are found. The task of a sequence mining is to systematically discover all sequential patterns in database D. Counting frequencies of parallel itemsets is straightforward and described in traditional algorithms for frequent itemsets detection. Counting serial itemsets, on the other hand, requires more computational resources. For example, unlike for parallel itemsets, we need finite-state automata to recognize serial episodes. More specifically, an appropriate l-state automaton can be used to recognize occurrences of an l-node serial sequence. For example, for the sequence (A → B → A → A), there would be a five-state automaton (FSA) given in Figure 12.23. It transits from its first state on seeing an event of type A and then waits for an event of type B to transit to its next state, and so on.

pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI by John Brockman

AI winter, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, artificial general intelligence, Asilomar, autonomous vehicles, basic income, Benoit Mandelbrot, Bill Joy: nanobots, Buckminster Fuller, cellular automata, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, Danny Hillis, David Graeber, easy for humans, difficult for computers, Elon Musk, Eratosthenes, Ernest Rutherford, finite state, friendly AI, future of work, Geoffrey West, Santa Fe Institute, gig economy, income inequality, industrial robot, information retrieval, invention of writing, James Watt: steam engine, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kickstarter, Laplace demon, Loebner Prize, market fundamentalism, Marshall McLuhan, Menlo Park, Norbert Wiener, optical character recognition, pattern recognition, personalized medicine, Picturephone, profit maximization, profit motive, RAND corporation, random walk, Ray Kurzweil, Richard Feynman, Rodney Brooks, self-driving car, sexual politics, Silicon Valley, Skype, social graph, speech recognition, statistical model, Stephen Hawking, Steven Pinker, Stewart Brand, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, telemarketer, telerobotics, the scientific method, theory of mind, Turing machine, Turing test, universal basic income, Upton Sinclair, Von Neumann architecture, Whole Earth Catalog, Y2K, zero-sum game

This is starting to change: from the bottom up, as the threefold drivers of drone warfare, autonomous vehicles, and cell phones push the development of neuromorphic microprocessors that implement actual neural networks, rather than simulations of neural networks, directly in silicon (and other potential substrates); and from the top down, as our largest and most successful enterprises increasingly turn to analog computation in their infiltration and control of the world. While we argue about the intelligence of digital computers, analog computing is quietly supervening upon the digital, in the same way that analog components like vacuum tubes were repurposed to build digital computers in the aftermath of World War II. Individually deterministic finite-state processors, running finite codes, are forming large-scale, nondeterministic, non-finite-state metazoan organisms running wild in the real world. The resulting hybrid analog/digital systems treat streams of bits collectively, the way the flow of electrons is treated in a vacuum tube, rather than individually, as bits are treated by the discrete-state devices generating the flow. Bits are the new electrons. Analog is back, and its nature is to assume control.

Design Patterns: Elements of Reusable Object-Oriented Software (Joanne Romanovich's Library) by Erich Gamma, Richard Helm, Ralph Johnson, John Vlissides

A Pattern Language, Donald Knuth, finite state, loose coupling, MVC pattern, yield curve

Decorator Pattern The Decorator (175) pattern captures class and object relationships that support embellishment by transparent enclosure. The term “embellishment” actually has broader meaning than what we’ve considered here. In the Decorator pattern, embellishment refers to anything that adds responsibilities to an object. We can think for example of embellishing an abstract syntax tree with semantic actions, a finite state automaton with new transitions, or a network of persistent objects with attribute tags. Decorator generalizes the approach we’ve used in Lexi to make it more widely applicable. 2.5 Supporting Multiple Look-and-Feel Standards Achieving portability across hardware and software platforms is a major problem in system design. Retargeting Lexi to a new platform shouldn’t require a major overhaul, or it wouldn’t be worth retargeting.

Each of the classes defining the abstract syntax tree implements this operation. It takes inputState as an argument representing the current state of the matching process, having read part of the input string. This current state is characterized by a set of input streams representing the set of inputs that the regular expression could have accepted so far. (This is roughly equivalent to recording all states that the equivalent finite state automata would be in, having recognized the input stream to this point). The current state is most important to the repeat operation. For example, if the regular expression were 'a' repeat then the interpreter could match "a", "aa", "aaa", and so on. If it were 'a' repeat & 'be' then it could match "abc", "aabc", "aaabc", and so on. But if the regular expression were 'a' repeat & 'abc' then matching the input "aabc" against the subexpression "'a' repeat" would yield two input streams, one having matched one character of the input, and the other having matched two characters.

pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, British Empire, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, low earth orbit, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

Newman said, “was like catching mice just as they were entering a hole in the wall.”38 A series of electrical pulses, about a microsecond apart, were converted to a train of sound waves circulating in a long tube of mercury equipped with crystal transducers at both ends. About a thousand digits could be stored in the millisecond it took a train of pulses to travel the length of a five-foot “tank.” Viewed as part of a finite-state Turing machine, the delay line represented a continuous loop of tape, a thousand squares in length and making a thousand complete passes per second under the read-write head. Turing specified some two hundred tubes, each storing thirty-two words of 32 bits, for a total, “comparable with the memory capacity of a minnow,” of about 200,000 bits.39 “The property of being digital,” announced Turing to the London Mathematical Society in a 1947 lecture on his design, “should be of more interest than that of being electronic.”40 Whether memory took the form of paper tape, vacuum-tube flip-flops, mercury pulse trains, or even papyrus scrolls did not matter as long as discrete symbols could be freely read, written, relocated, and, when so instructed, erased.

See Institute for Advanced Study IAS (Institute for Advanced Study) computer, xii, 78–79, 91–92, 93–107 as ancestor of the microprocessor, 98, 203 and artificial life, xii, 111–18, 121, 124–26, 129, 192. see also Barricelli construction and operation, 97–107, 111 and digital computing at RAND, 104, 148, 178 duplication of, 97, 98, 107 logical and physical architecture, 98, 99–107, 157 and nuclear weapons, 78–79, 91–92, 107, 111 and origins of IBM model 701, 91, 106 origins of, and weather prediction, 87–88 peripheral equipment, 98, 101–102, 106, 144 programming of, 102, 106–107, 114, 121, 130 progress reports, and impact of, 98, 99, 121 and random-access memory, 98, 103–105, 113 shakedown run, 78–79, 111 siblings and offspring, listed, 97 and von Neumann, 78–79, 87–88, 91–92, 97, 98–102, 106–108, 125, 153 IBM (International Business Machines) 12, 91, 103–104, 106, 122, 144, 148, 179. see also SAGE and evolution of operating systems, 122, 189 and IAS computer project, 91, 106 and punched-card computing, 60, 78, 81, 82, 83, 106, 122, 144 and von Neumann, 91 IBM computers: model 650, 122; model 701, 91, 106, 178; model 704, 118, 184; model 7090, 151, 182 iconoscope, 85, 104 ideas. see also consciousness; meaning; mind Darwinian evolution of, 28, 184 and formal logic, 38, 43, 46, 49, 129 nature of, 136, 158, 225 IFF (Identification Friend or Foe) radar, 104 Illinois, University of, 107 immortality, and composite organisms, 175, 191, 210 and non-Darwinian evolution, 31 improbability, and origins of life, 29–30, 112, 177 incompleteness (mathematical), 49–50, 53–54, 70, 72, 78, 120, 167, 228 Industrial Revolution, 21–22, 134 infinity, and finite-state machines, 10, 35, 43, 56, 130, 190 information. see also bandwidth; bits; communication; cybernetics; telecommunication and cybernetics, 6, 98, 101 defined, by Bateson, 167 flow, in data networks, 12, 110, 150, 158–59, 205 mathematical theory of, 110, 153, 155 and meaning, 8, 155, 158, 167, 171, 184–85 and money, 162, 165 and origins of life, 12, 29 insects, 8, 13, 129, 170, 174, 210 Instinct and Reason (Smee), 45, 48 Institute for Advanced Study (IAS), Princeton, N.J.

pages: 249 words: 45,639

Learn Python the Hard Way by Zed Shaw

complexity theory, finite state, index card, web application

Write the room description as doc comments, and change the runner to print them. Once you have doc comments as the room description, do you need to have the function prompt even? Have the runner prompt the user, and pass that in to each function. Your functions should just be if-statements printing the result and returning the next room. This is actually a small version of something called a "finite state machine". Read about them. They might not make sense but try anyway. © Copyright 2010, Zed A. Shaw. Last updated on Jun 24, 2011. Exercise 42: Gothons Are Getting Classy While it's fun to put functions inside of dictionaries, you'd think there'd be something in Python that does this for you. There is: the class keyword. Using class is how you create an even more awesome "dict with functions" than the one you made in the last exercise.

pages: 214 words: 14,382

Monadic Design Patterns for the Web by L.G. Meredith

barriers to entry, domain-specific language, don't repeat yourself, finite state, Georg Cantor, ghettoisation, John von Neumann, Kickstarter, semantic web, social graph, type inference, web application, WebSocket

Cover · Overview · Contents · Discuss · Suggest · Glossary · Index Section 7.6 Chapter 7 · A Review of Collections as Monads Kleene star and other connectives Now, since these sets of concatenations of symbols (aka words) are often infinitary, some means of representing them without having to explicitly or extensionally enumerate their constituents have developed. Most programmers are familiar with the idea of regular expressions and the corresponding regular languages built from very simple operators: concatenation (ab), choice (a + b), and iteration (a∗ ) – sometimes called Kleene star. These operators actually denote many things at once: • Finite state machines that can recognize when a concatenation symbol has a corresponding form • That form’s sets of words; thus, ab∗ , for example, denotes the set of all words {ε, ab, abab, ababab, ...} Download from Wow! eBook <> • The means of generating this set of words The operators form a kind of algebra over regular expressions and the algebraic operations correspond to operations on the things the regular expressions denote – thus, for example, the choice operator corresponds to the union of sets.

Speaking Code: Coding as Aesthetic and Political Expression by Geoff Cox, Alex McLean

4chan, Amazon Mechanical Turk, augmented reality, bash_history, bitcoin, cloud computing, computer age, computer vision, crowdsourcing, dematerialisation, Donald Knuth, Douglas Hofstadter,, Everything should be made as simple as possible, finite state, Gödel, Escher, Bach, Jacques de Vaucanson, Larry Wall, late capitalism, means of production, natural language processing, new economy, Norbert Wiener, Occupy movement, packet switching, peer-to-peer, Richard Stallman, Ronald Coase, Slavoj Žižek, social software, social web, software studies, speech recognition, stem cell, Stewart Brand, The Nature of the Firm, Turing machine, Turing test, Vilfredo Pareto, We are Anonymous. We are Legion, We are the 99%, WikiLeaks

Chomsky assumes that somehow grammar is given in advance, hard-wired or preprogrammed: that humans possess innate grammatical competence that is presocial (and one of the controversies of Chomsky’s system is its separation of consciousness from the outside social world). Thus he explains the deepseated rules by which language operates and how principles and processes are set by which sentences are generated. In procedural terms, a natural language can be simply described as a “finite state Markov process that produces sentences from left to right.”10 In mathematics and other formalized systems, the logic adheres to similar grammars that generate their own grammatical sequences. Chomsky’s conclusion is that grammar is independent of meaning, and that the separation from semantics is essential to the study of syntactic structure. Clearly syntax is also fundamental for the structure of a statement in a computer language, such as with the use of parentheses to produce subclauses while retaining the overall flow of logic.

pages: 504 words: 89,238

Natural language processing with Python by Steven Bird, Ewan Klein, Edward Loper

bioinformatics, business intelligence, conceptual framework, Donald Knuth, elephant in my pajamas,, finite state, Firefox, Guido van Rossum, information retrieval, Menlo Park, natural language processing, P = NP, search inside the book, speech recognition, statistical model, text mining, Turing test

Check http: // for news about developments after the publication date of this book. Contributions in the following areas are particularly encouraged: 444 | Afterword: The Language Challenge Phonology and morphology Computational approaches to the study of sound patterns and word structures typically use a finite-state toolkit. Phenomena such as suppletion and non-concatenative morphology are difficult to address using the string-processing methods we have been studying. The technical challenge is not only to link NLTK to a highperformance finite-state toolkit, but to avoid duplication of lexical data and to link the morphosyntactic features needed by morph analyzers and syntactic parsers. High-performance components Some NLP tasks are too computationally intensive for pure Python implementations to be feasible. However, in some cases the expense arises only when training models, not when using them to label inputs.

Turing's Cathedral by George Dyson

1919 Motor Transport Corps convoy, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Benoit Mandelbrot, British Empire, Brownian motion, cellular automata, cloud computing, computer age, Danny Hillis, dark matter, double helix, fault tolerance, Fellow of the Royal Society, finite state, Georg Cantor, Henri Poincaré, housing crisis, IFF: identification friend or foe, indoor plumbing, Isaac Newton, Jacquard loom, John von Neumann, mandelbrot fractal, Menlo Park, Murray Gell-Mann, Norbert Wiener, Norman Macrae, packet switching, pattern recognition, Paul Erdős, Paul Samuelson, phenotype, planetary scale, RAND corporation, random walk, Richard Feynman, SETI@home, social graph, speech recognition, Thorstein Veblen, Turing complete, Turing machine, Von Neumann architecture

He delivered a complete description of a million-cycle-per-second Automatic Computing Engine (ACE), accompanied by circuit diagrams, a detailed physical and logical analysis of the internal storage system, sample programs, detailed (if bug-ridden) subroutines, and even an estimated cost of £11,200.41 As Sara Turing later explained, her son’s goal was “to see his logical theory of a universal machine, previously set out in his paper ‘Computable Numbers,’ take concrete form.”42 After a comparison of available forms of storage, ranging from punched paper tape through “cerebral cortex” to electrostatic storage tubes, Turing specified mercury-filled acoustic delay lines for high-speed storage. He estimated cost, access time, and “spacial economy” (in digits/liter) for all forms of storage, putting the cost of a cerebral cortex at £300 per annum—his King’s College fellowship for the year. Viewed as part of a finite-state Turing machine, the delay line represented a continuous loop of tape, 1,000 squares in length and making 1,000 complete passes per second under the read/write head. Turing specified some 200 tubes, each storing 32 words of 32 bits each, for a total, “comparable with the memory capacity of a minnow,” of about 200,000 bits.43 Taking ten pages of the proposal to do so, Turing worked out the storage capacity, attenuation, noise, temperature sensitivity, and regenerative requirements, all from first principles.

Data that are widely replicated, or associated frequently by search requests, establish physical proximity that is manifested as proximity in time. More meaningful results appear higher on the list not only because of some mysterious, top-down, weighting algorithm, but because when microseconds count, they are closer, from the bottom up, in time. Meaning just seems to “come to mind” first. An Internet search engine is a finite-state, deterministic machine, except at those junctures where people, individually and collectively, make a nondeterministic choice as to which results are selected as meaningful and given a click. These clicks are then immediately incorporated into the state of the deterministic machine, which grows ever so incrementally more knowledgeable with every click. This is what Turing defined as an oracle machine.

pages: 259 words: 67,456

The Mythical Man-Month by Brooks, Jr. Frederick P.

finite state, HyperCard, Menlo Park, sorting algorithm, speech recognition, Steve Jobs, Tacoma Narrows Bridge, Turing machine

Harel usefully defines a prototype as [A version of a program that] reflects only the design decisions made in the process of preparing the conceptual model, and not decisions driven by implementation concerns.12 It is possible to build a prototype that is not at all part of a product growing toward shipment. For example, one may build an interface prototype that has no real program function behind it, merely the finite-state machine that makes it appear to go through its paces. One can even prototype and test interfaces by the Wizard-of-Oz technique, with a concealed human simulating the system's responses. Such prototyping can be very useful for getting early user feedback, but it is quite apart from testing a product growing toward shipment. Similarly, implementers may well undertake to build a vertical slice of a product, in which a very limited function set is constructed in full, so as to let early sunlight into places where performance snakes may lurk.

Introducing Elixir by Simon St.Laurent, J. David Eisenberg

Debian, finite state, Pluto: dwarf planet, Ruby on Rails, web application

If needed, they set up other resources or processes to get started. Once running, they listen for mes‐ sages and process them, collapsing if they fail. Some of them might restart a failed process if needed. OTP formalizes those activities, and a few more, into a set of behaviors (or behav‐ iours—the original spelling was British). The most common behaviors are GenServer (generic server) and Supervisor. Through Erlang, you can use the gen_fsm (finite state machine) and gen_event behaviors. Elixir provides the Mix build tool for creat‐ 169 ing applications so that you can package your OTP code into a single runnable (and updatable) system. The behaviors predefine the mechanisms you’ll use to create and interact with pro‐ cesses, and the compiler will warn you if you’re missing some of them. Your code will handle the callbacks, specifying how to respond to particular kinds of events, and you will need to decide upon a structure for your application.

pages: 999 words: 194,942

Clojure Programming by Chas Emerick, Brian Carper, Christophe Grand

Amazon Web Services, Benoit Mandelbrot, cloud computing, continuous integration, database schema, domain-specific language, don't repeat yourself,, failed state, finite state, Firefox, game design, general-purpose programming language, Guido van Rossum, Larry Wall, mandelbrot fractal, Paul Graham, platform as a service, premature optimization, random walk, Ruby on Rails, Schrödinger's Cat, semantic web, software as a service, sorting algorithm, Turing complete, type inference, web application

This count will be stored in a map of words to their respective counts, held in an atom called word-freqs: (def url-queue (LinkedBlockingQueue.)) (def crawled-urls (atom #{})) (def word-freqs (atom {})) We’ll set up a bunch of agents in order to fully utilize all the resources we have available,[157] but we need to think through what state they’ll hold and what actions will be used to transition those states. In many cases, it is useful to think about agent state and the actions applied to it as forming a finite state machine; we’ve already walked through the workflow of our crawler, but we should formalize it. Figure 4-8. A web crawler’s primary state transitions The state of an agent at each point in this process should be obvious: prior to retrieving a URL, an agent will need a URL to retrieve (or some source of such URLs); prior to scraping, an agent will need a page’s content to scrape; and, prior to updating the cumulative crawler state, it will need the results of scraping.

It will leave the state of the agent set to a map containing the URL it pulls off the queue and its content: (declare run process handle-results) (defn ^::blocking get-url [{:keys [^BlockingQueue queue] :as state}] (let [url (as-url (.take queue))] (try (if (@crawled-urls url) state {:url url :content (slurp url) ::t #'process}) (catch Exception e ;; skip any URL we failed to load state) (finally (run *agent*))))) We’ll show run and explain what it’s doing in a little bit. If we’ve already crawled the URL pulled off the queue (or if we encounter an error while loading the URL’s content), we leave the agent’s state untouched. This implementation detail in our finite state machine adds a cycle to it where get-url will sometimes be invoked on a single agent multiple times before it transitions states. process will parse a URL’s content, using the links-from and words-from functions to obtain the URL’s content’s links and build a map containing the frequencies of each word found in the content. It will leave the state of the agent set to a map containing these values as well as the originating URL: (defn process [{:keys [url content]}] (try (let [html (enlive/html-resource ( content))] {::t #'handle-results :url url :links (links-from url html) :words (reduce (fn [m word] (update-in m [word] (fnil inc 0))) {} (words-from html))}) (finally (run *agent*)))) The :words map associates found words with their count within the retrieved page, which we produce by reducing a map through the seq of those words. fnil is a HOF that returns a function that uses some default value(s) (here, 0) in place of any nil arguments when calling the function provided to fnil; this keeps us from having to explicitly check if the value in the words map is nil, and returning 1 if so.

pages: 294 words: 81,292

Our Final Invention: Artificial Intelligence and the End of the Human Era by James Barrat

AI winter, AltaVista, Amazon Web Services, artificial general intelligence, Asilomar, Automated Insights, Bayesian statistics, Bernie Madoff, Bill Joy: nanobots, brain emulation, cellular automata, Chuck Templeton: OpenTable:, cloud computing, cognitive bias, commoditize, computer vision, cuban missile crisis, Daniel Kahneman / Amos Tversky, Danny Hillis, data acquisition, don't be evil, drone strike, Extropian, finite state, Flash crash, friendly AI, friendly fire, Google Glasses, Google X / Alphabet X, Isaac Newton, Jaron Lanier, John Markoff, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, Loebner Prize, lone genius, mutually assured destruction, natural language processing, Nicholas Carr, optical character recognition, PageRank, pattern recognition, Peter Thiel, prisoner's dilemma, Ray Kurzweil, Rodney Brooks, Search for Extraterrestrial Intelligence, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, smart grid, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, superintelligent machines, technological singularity, The Coming Technological Singularity, Thomas Bayes, traveling salesman, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, zero day

He said the halting problem is unsolvable because if the debugger encounters a halting problem in the target program, it will succumb to the infinite loop while analyzing it, and never determine if the halting problem was there. You, the programmer, would be waiting for it to come up with the answer for the same amount of time you’d wait for the original program to halt. That is, a very long time, perhaps forever. Marvin Minsky, one of the fathers of artificial intelligence, pointed out that “any finite-state machine, if left completely to itself, will fall eventually into a perfectly periodic repetitive pattern. The duration of this repeating pattern cannot exceed the number of internal states of the machine.” Translated, that means a computer of average memory, while running a program with a halting problem, would take a very long time to fall into a pattern of repetition, which could then be detected by a diagnostic program.

pages: 336 words: 88,320

Being Geek: The Software Developer's Career Handbook by Michael Lopp

finite state, game design, job satisfaction, John Gruber, knowledge worker, remote working, rolodex, Silicon Valley, Silicon Valley startup, Skype, sorting algorithm, web application

It's your inner dialog, and it's often full of shit. I'm not saying you deliberately lie to yourself. OK, maybe I am, but we're all doing it. We're all gathering data through the course of the day and creating a story based on that data, our experience, and our moods. It's a perfectly natural phenomenon to guide the narrative in our favor. We see the world how we want. A carpenter sees all problems as a nail. I see problems as finite state machines. As we edit our days into these stories, there is always a risk of fiction. This is why you need to identify and nurture Your People. You tell these stories to Your People without reservation. Your People love your stories—fiction and all. They love how you tell them, they laugh about the lies you tell yourself, and then they stop and they tell you the truth. Networking is the art of finding those who are willing to listen to and critique your stories, so go look at your Inbox.

pages: 722 words: 90,903

Practical Vim: Edit Text at the Speed of Thought by Drew Neil

Bram Moolenaar, don't repeat yourself,, fault tolerance, finite state, place-making, QWERTY keyboard, web application

Meet Operator-Pending Mode Normal, Insert, and Visual modes are readily identified, but Vim has other modes that are easy to overlook. Operator-Pending mode is a case in point. We use it dozens of times daily, but it usually lasts for just a fraction of a second. For example, we invoke it when we run the command dw. It lasts during the brief interval between pressing d and w keys. Blink and you’ll miss it! If we think of Vim as a finite-state machine, then Operator-Pending mode is a state that accepts only motion commands. It is activated when we invoke an operator command, and then nothing happens until we provide a motion, which completes the operation. While Operator-Pending mode is active, we can return to Normal mode in the usual manner by pressing escape, which aborts the operation. Many commands are invoked by two or more keystrokes (for examples, look up g ​, z ​, ctrl-w ​, or [ ​), but in most cases, the first keystroke merely acts as a prefix for the second.

High-Frequency Trading by David Easley, Marcos López de Prado, Maureen O'Hara

algorithmic trading, asset allocation, backtesting, Brownian motion, capital asset pricing model, computer vision, continuous double auction, dark matter, discrete time, finite state, fixed income, Flash crash, High speed trading, index arbitrage, information asymmetry, interest rate swap, latency arbitrage, margin call, market design, market fragmentation, market fundamentalism, market microstructure, martingale, natural language processing, offshore financial centre, pattern recognition, price discovery process, price discrimination, price stability, quantitative trading / quantitative finance, random walk, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, Tobin tax, transaction costs, two-sided market, yield curve

. • Signed transaction volume: a signed quantity indicating the number of shares bought in the last 15 seconds minus the number of shares sold in the last 15 seconds. • Immediate market order cost: the cost we would pay for purchasing our remaining shares immediately with a market order. All of the features above were normalised in a standard fashion by subtracting their means, dividing by their standard deviations and time-averaging over a recent interval. In order to obtain a finite state space, features were discretised into bins in multiples of standard deviation units. Experiments can also be performed using continuous features and a parametric model representation, but are beyond the scope of this chapter. Along with our original state variables (v, t), the features above provide a rich language for dynamically conditioning our order placement on potentially relevant properties of the order book.

pages: 487 words: 95,085

JPod by Douglas Coupland

Asperger Syndrome, Drosophila, finite state, G4S, game design, Maui Hawaii, McMansion, neurotypical, pez dispenser, pre–internet, QWERTY keyboard, Ronald Reagan, special economic zone, wage slave, Y2K

God is an Xkb state indicator God is a Window Maker docked application God is a multi-platform Z80cross-assembler God is a lightweight XML encoding library for Java God is a programmatic APIwritten in C++ God is Oracle's OCI8 and OCI9 APIs God is a configuration backup utility God is Web-based group ware and collaboration software God is a graphical editor for drawing finite state machines . . . Kaitlin was on the phone again, trying to extract herself from jPod. Cowboy was over by a ventilation unit, having a smoke. One of jPod's quirks is an air intake duct in front of which you can puff away on anything. Hell, you could let off an Exocet missile, and it'd suck everything up and away in a jiffy. "If that had been my mother who showed up today, she'd have made a big deal of telling people she doesn't shave her armpits," said John Doe.

pages: 268 words: 109,447

The Cultural Logic of Computation by David Golumbia

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, American ideology, Benoit Mandelbrot, borderless world, business process, cellular automata, citizen journalism, Claude Shannon: information theory, computer age, corporate governance, creative destruction,, finite state, future of work, Google Earth, Howard Zinn, IBM and the Holocaust, iterative process, Jaron Lanier, jimmy wales, John von Neumann, Joseph Schumpeter, late capitalism, means of production, natural language processing, Norbert Wiener, packet switching, RAND corporation, Ray Kurzweil, RFID, Richard Stallman, semantic web, Shoshana Zuboff, Slavoj Žižek, social web, stem cell, Stephen Hawking, Steve Ballmer, Stewart Brand, strong AI, supply-chain management, supply-chain management software, Ted Nelson, telemarketer, The Wisdom of Crowds, theory of mind, Turing machine, Turing test, Vannevar Bush, web application

New Horizons in the Study of Language and Mind. New York: Cambridge University Press. ———. 2004. “Beyond Explanatory Adequacy.” In Adriana Belletti, ed., The Cartography of Syntactic Structures. Volume 3: Structures and Beyond. New York: Cambridge University Press, 104–131. ———. 2005. “Three Factors in Language Design.” Linguistic Inquiry 36 (Winter), 1–22. Chomsky, Noam, and George Miller. 1958. “Finite State Languages.” Information and Control 1, 91–112. Chomsky, Noam, and M. P. Schützenberger. 1963. “The Algebraic Theory of Context-Free Languages.” In P. Braffort and D. Hirschberg, eds., Computer Programming and Formal Systems. Amsterdam: North-Holland, 118–161. Chun, Wendy H. K. 2006. Control and Freedom: Power and Paranoia in the Age of Fiber Optics. Cambridge, MA: The MIT Press. Chun, Wendy H.

Lessons-Learned-in-Software-Testing-A-Context-Driven-Approach by Anson-QA

anti-pattern, Chuck Templeton: OpenTable:, finite state, framing effect, full employment, information retrieval, job automation, knowledge worker, lateral thinking, Ralph Nader, Richard Feynman, side project, Silicon Valley, statistical model, web application

PowerQuest Corporation. 2001. Drive Image. Available at -RRational Software Corporation. 2001. Rational Purify for Unix. Available at Rational Software Corporation. 2001. Rational Test Foundation for Windows 2000. Available at Robinson, Harry. 1999. Finite State Model-Based Testing on a Shoestring. Star West 1999. Available at -SSchneier, Bruce. 2000a. Computer Security: Will We Ever Learn? Crypto-Gram, May 15. Available at Schneier, Bruce. 2000b. Secrets & Lies. New York: John Wiley & Sons. Simmonds, Erik. 2000. When Will We Be Done Testing?

Language and Mind by Noam Chomsky

Alfred Russel Wallace, finite state, John von Neumann, lateral thinking, pattern recognition, phenotype, theory of mind

In fact, it can be argued that the somewhat similar but not formally related concept of real-time deterministic automation is far more “natural” in terms of time and space conditions on computation.16 However, it is pointless to pursue this topic, because what is at stake is not the “simplicity” of phrase structure grammars but rather of transformational grammars with a phrase structure component that plays a role in generating deep structures. And there is absolutely no mathematical concept of “ease of computation” or “simplicity of algorithm” that even vaguely suggests that such systems may have some advantage over the kinds of automata that have been seriously investigated from this point of view – for example, finite state automata, linear bounded automata, and so on. The basic concept of “structure-dependent operation” has never even been considered in a strictly mathematical concept. The source of this confusion is a misconception on Putnam’s part as to the nature of grammatical transformations. They are not rules that “abbreviate” sentences; rather, they are operations that form surface structures from underlying deep structures, in such ways as are illustrated in the preceding lecture and the references there cited.17 Hence, to show that transformational grammars are the “simplest possible,” one would have to demonstrate that the “optimal” computing system would take a string of symbols as input and determine its surface structure, its underlying deep structure, and the sequence of transformational operations that relates them.

pages: 462 words: 172,671

Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin

continuous integration, database schema, domain-specific language, don't repeat yourself, Donald Knuth,, Eratosthenes, finite state, G4S, Ignaz Semmelweis: hand washing, iterative process, place-making, Rubik’s Cube, web application

G7: Base Classes Depending on Their Derivatives The most common reason for partitioning concepts into base and derivative classes is so that the higher level base class concepts can be independent of the lower level derivative class concepts. Therefore, when we see base classes mentioning the names of their derivatives, we suspect a problem. In general, base classes should know nothing about their derivatives. There are exceptions to this rule, of course. Sometimes the number of derivatives is strictly fixed, and the base class has code that selects between the derivatives. We see this a lot in finite state machine implementations. However, in that case the derivatives and base class are strongly coupled and always deploy together in the same jar file. In the general case we want to be able to deploy derivatives and bases in different jar files. Deploying derivatives and bases in different jar files and making sure the base jar files know nothing about the contents of the derivative jar files allow us to deploy our systems in discrete and independent components.

pages: 502 words: 107,510

Natural Language Annotation for Machine Learning by James Pustejovsky, Amber Stubbs

Amazon Mechanical Turk, bioinformatics, cloud computing, computer vision, crowdsourcing, easy for humans, difficult for computers, finite state, game design, information retrieval, iterative process, natural language processing, pattern recognition, performance metric, sentiment analysis, social web, speech recognition, statistical model, text mining

It has been used successfully in Speech Recognition, handwriting recognition, POS tagging, and many other NLP tasks. Because the number of possible label sequences gets unmanageable with long sequences, the model makes the same assumption that the sequence predictor from Chapter 3 did; namely, look at only the previous n labels for calculating the probability (see Language Models). The underlying assumption is that the elements transition from state to state via a finite-state automaton behavior, with probabilistic transitions between states. Then, this can be transformed into a new model, with hidden states, where all observation tokens are emitted from each state with a finite probability. Of particular interest to us is the fact that the richer the encoding of features (e.g., from annotations) the better one can calculate the transition probabilities from state to state in the model.

pages: 366 words: 107,145

Fuller Memorandum by Stross, Charles

Any sufficiently advanced technology is indistinguishable from magic, Beeching cuts, British Empire, cognitive dissonance, complexity theory, congestion charging, dumpster diving, finite state, Firefox, HyperCard, invisible hand, land reform, linear programming, MITM: man-in-the-middle, peak oil, post-work, security theater, sensible shoes, side project, Sloane Ranger, telemarketer, Turing machine

He's been in his grave for nearly half a century, and would doubtless be spinning in it fast enough to qualify for carbon credits as an environmentally friendly power source if he could see us today in all our multi-ethnic anti-discriminatory splendor. But who cares? That is, indeed, the big-ticket question. Before the Laundry, things were a bit confused. You can do magic by hand, without computers, but magic performed by ritual without finite state automata in the loop--calculating machines, in other words--tends to be haphazard, unreliable, uncontrollable, prone to undesirable side effects, and difficult to repeat. It also tends to fuck with causality, the logical sequence of events, in a most alarming way. We've unintentionally rewritten our history over the centuries, would-be sorcerers unwinding chaos and pinning down events with the dead hand of consistency--always tending towards a more stable ground state because chaos is unstable; entropy is magic's great enemy.

Wireless by Charles Stross

anthropic principle, back-to-the-land, Benoit Mandelbrot, Buckminster Fuller, Cepheid variable, cognitive dissonance, colonial exploitation, cosmic microwave background, epigenetics, finite state, Georg Cantor, gravity well, hive mind, jitney, Khyber Pass, lifelogging, Magellanic Cloud, mandelbrot fractal, MITM: man-in-the-middle, peak oil, phenotype, Pluto: dwarf planet, security theater, sensible shoes, Turing machine, undersea cable

There’s an odd kind of electrical tingle you get when you make contact with certain types of summoning grid. I get a powerful jolt of it right now, sizzling up my arm and locking my fingers in place around the head of the chess piece. I try to pull it away from the board, but it’s no good: it only wants to move up or down, left or right . . . Left or right? I blink. It’s a state machine all right: one that’s locked by the law of sympathy to some other finite state automaton, one that grinds down slow and hard. I move the piece forward one square. It’s surprisingly heavy, the magnet a solid weight in its base—but more than magnetism holds it in contact with the board. As soon as I stop moving I feel a sharp sting in my fingertips. “Ouch!” I raise them to my mouth just as there’s a crash from outside. “InMATE! InMATE!” I begin to turn as a shadow falls across the board.

pages: 523 words: 112,185

Doing Data Science: Straight Talk From the Frontline by Cathy O'Neil, Rachel Schutt

Amazon Mechanical Turk, augmented reality, Augustin-Louis Cauchy, barriers to entry, Bayesian statistics, bioinformatics, computer vision, correlation does not imply causation, crowdsourcing, distributed generation, Edward Snowden, Emanuel Derman, fault tolerance, Filter Bubble, finite state, Firefox, game design, Google Glasses, index card, information retrieval, iterative process, John Harrison: Longitude, Khan Academy, Kickstarter, Mars Rover, Nate Silver, natural language processing, Netflix Prize, p-value, pattern recognition, performance metric, personalized medicine, pull request, recommendation engine, rent-seeking, selection bias, Silicon Valley, speech recognition, statistical model, stochastic process, text mining, the scientific method, The Wisdom of Crowds, Watson beat the top human players on Jeopardy!, X Prize

Each bucket has a leak, so some water escapes into whatever bucket is directly below the drip. Depending on the rate of the water coming in, this system exhibits a chaotic process that depends on molecular-level interactions of water molecules on the sides of the buckets. Read more about it in this associated Wikipedia article. Many systems can exhibit inherent chaos. Philippe M. Binder and Roderick V. Jensen have written a paper entitled “Simulating chaotic behavior with finite-state machines”, which is about digital computer simulations of chaos. An interdisciplinary program involving M.I.T., Harvard, and Tufts involved teaching a technique that was entitled “Simulating chaos to teach order”. They simulated an emergency on the border between Chad and Sudan’s troubled Darfur region, with students acting as members of Doctors Without Borders, International Medical Corps, and other humanitarian agencies.

pages: 570 words: 115,722

The Tangled Web: A Guide to Securing Modern Web Applications by Michal Zalewski

barriers to entry, business process, defense in depth, easy for humans, difficult for computers, fault tolerance, finite state, Firefox, Google Chrome, information retrieval, RFC: Request For Comment, semantic web, Steve Jobs, telemarketer, Turing test, Vannevar Bush, web application, WebRTC, WebSocket

One could argue that practitioners are not the ones to be asked for nuanced definitions, but go ahead and pose the same question to a group of academics and they’ll offer you roughly the same answer. For example, the following common academic definition traces back to the Bell-La Padula security model, published in the 1960s. (This was one of about a dozen attempts to formalize the requirements for secure systems, in this case in terms of a finite state machine;[86] it is also one of the most notable ones.) A system is secure if and only if it starts in a secure state and cannot enter an insecure state. Definitions along these lines are fundamentally true, of course, and may serve as the basis for dissertations or even a couple of government grants. But in practice, models built on these foundations are bound to be nearly useless for generalized, real-world software engineering for at least three reasons: There is no way to define desirable behavior for a sufficiently complex computer system.

Pragmatic.Programming.Erlang.Jul.2007 by Unknown

Debian,, fault tolerance, finite state, full text search, RFC: Request For Comment

Module:terminate(Arg, State) -> term() Clean up before deletion. add_handler(EventMgrRef, Handler, Args) -> Result Add an event handler to a generic event manager. add_sup_handler(EventMgrRef, Handler, Args) -> Result Add a supervised event handler to a generic event manager. call(EventMgrRef, Handler, Request, Timeout) -> Result Make a synchronous call to a generic event manager. delete_handler(EventMgrRef, Handler, Args) -> Result Delete an event handler from a generic event manager. start(EventMgrName) -> Result Create a stand-alone event manager process. start_link(EventMgrName) -> Result Create a generic event manager process in a supervision tree. stop(EventMgrRef) -> ok Terminate a generic event manager. 475 M ODULE : GEN _ FSM swap_handler(EventMgrRef, {Handler1,Args1}, {Handler2,Args2}) -> Result Replace an event handler in a generic event manager. swap_sup_handler(EventMgrRef, {Handler1,Args1}, {Handler2,Args2}) -> Result Replace an event handler in a generic event manager. sync_notify(EventMgrRef, Event) -> ok Notify an event manager about an event. which_handlers(EventMgrRef) -> [Handler] Return all event handlers installed in a generic event manager. F.29 Module: gen_fsm Generic finite state machine behavior. Module:StateName(Event, StateData) -> Result Handle an asynchronous event. Module:StateName(Event, From, StateData) -> Result Handle a synchronous event. Module:code_change(OldVsn, StateName, StateData, Extra) -> {ok, NextStateName, NewStateData} Update the internal state data during upgrade/downgrade. Module:handle_event(Event, StateName, StateData) -> Result Handle an asynchronous event.

Elixir in Action by Saša Jurić

demand response,, fault tolerance, finite state, general-purpose programming language, place-making, Ruby on Rails, WebSocket

For details, see the official documentation ( The Erlang standard library includes the following OTP behaviours: ¡ gen_server — Generic implementation of a stateful server process ¡ supervisor — Provides error handling and recovery in concurrent systems ¡ application — Generic implementation of components and libraries ¡ gen_event — Provides event-handling support ¡ gen_statem — Runs a finite state machine in a stateful server process Elixir provides its own wrappers for the most frequently used behaviours via the modules GenServer, Supervisor, and Application. This book focuses on these behaviours. The GenServer behaviour receives detailed treatment in this chapter and chapter 7, Supervisor is discussed in chapters 8 and 9, and Application is presented in chapter 11. 168 Chapter 6 Generic server processes The remaining behaviours, although useful, are used less often and won’t be discussed in this book.

pages: 489 words: 148,885

Accelerando by Stross, Charles

business cycle, call centre, carbon-based life, cellular automata, cognitive dissonance, commoditize, Conway's Game of Life, dark matter, dumpster diving, Extropian, finite state, Flynn Effect, glass ceiling, gravity well, John von Neumann, Kickstarter, knapsack problem, Kuiper Belt, Magellanic Cloud, mandelbrot fractal, market bubble, means of production, MITM: man-in-the-middle, orbital mechanics / astrodynamics, packet switching, performance metric, phenotype, planetary scale, Pluto: dwarf planet, reversible computing, Richard Stallman, SETI@home, Silicon Valley, Singularitarianism, slashdot, South China Sea, stem cell, technological singularity, telepresence, The Chicago School, theory of mind, Turing complete, Turing machine, Turing test, upwardly mobile, Vernor Vinge, Von Neumann architecture, web of trust, Y2K, zero-sum game

"Oh, one of Johnny's toys – a micromechanical digital phonograph player," Gianni says dismissively. "He used to design Babbage engines for the Pentagon – stealth computers. (No van Eck radiation, you know.) Look." He carefully pulls a fabric-bound document out of the obsolescent data wall and shows the spine to Manfred: "On the Theory of Games, by John von Neumann. Signed first edition." Aineko meeps and dumps a slew of confusing purple finite state automata into Manfred's left eye. The hardback is dusty and dry beneath his fingertips as he remembers to turn the pages gently. "This copy belonged to the personal library of Oleg Kordiovsky. A lucky man is Oleg: He bought it in 1952, while on a visit to New York, and the MVD let him to keep it." "He must be –" Manfred pauses. More data, historical time lines. "Part of GosPlan?" "Correct."

pages: 818 words: 153,952

C++ Concurrency in Action: Practical Multithreading by Anthony Williams

car-free, finite state, SETI@home

Synchronizing operations with message passing The idea of CSP is simple: if there’s no shared data, each thread can be reasoned about entirely independently, purely on the basis of how it behaves in response to the messages that it received. Each thread is therefore effectively a state machine: when it receives a message, it updates its state in some manner and maybe sends one or more messages to other threads, with the processing performed depending on the initial state. One way to write such threads would be to formalize this and implement a Finite State Machine model, but this isn’t the only way; the state machine can be implicit in the structure of the application. Which method works better in any given scenario depends on the exact behavioral requirements of the situation and the expertise of the programming team. However you choose to implement each thread, the separation into independent processes has the potential to remove much of the complication from shared-data concurrency and therefore make programming easier, lowering the bug rate.

pages: 757 words: 193,541

The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services, Volume 2 by Thomas A. Limoncelli, Strata R. Chalup, Christina J. Hogan

active measures, Amazon Web Services, anti-pattern, barriers to entry, business process, cloud computing, commoditize, continuous integration, correlation coefficient, database schema, Debian, defense in depth, delayed gratification, DevOps, domain-specific language,, fault tolerance, finite state, Firefox, Google Glasses, information asymmetry, Infrastructure as a Service, intermodal, Internet of things, job automation, job satisfaction, Kickstarter, load shedding, longitudinal study, loose coupling, Malcom McLean invented shipping containers, Marc Andreessen, place-making, platform as a service, premature optimization, recommendation engine, revision control, risk tolerance, side project, Silicon Valley, software as a service, sorting algorithm, standardized shipping container, statistical model, Steven Levy, supply-chain management, Toyota Production System, web application, Yogi Berra

This is generally a good practice anyway, because it makes your code more robust. 11.9 Live Code Changes Some systems permit live code changes. This makes performing upgrades much easier. Generally we frown on the technique of modifying a live system but some languages are designed specifically to support it. Erlang is one such language. A service written in Erlang can be upgraded while it is running. Properly structured Erlang programs are designed as event-driven finite-state machines (FSM). For each event received, a specific function is called. One event that the service can receive is a notification that code has been upgraded. The function that is called is responsible for upgrading or converting any data structures. All subsequent events will trigger the new versions of the functions. No processes survive a restart of the Erlang engine itself, but carefully planned upgrades can be arranged with zero downtime.

pages: 612 words: 187,431

The Art of UNIX Programming by Eric S. Raymond

A Pattern Language, Albert Einstein, barriers to entry, bioinformatics, Clayton Christensen, combinatorial explosion, commoditize, correlation coefficient, David Brooks, Debian, domain-specific language, don't repeat yourself, Donald Knuth, Everything should be made as simple as possible, facts on the ground, finite state, general-purpose programming language, George Santayana, Innovator's Dilemma, job automation, Larry Wall, MVC pattern, pattern recognition, Paul Graham, peer-to-peer, premature optimization, pre–internet, publish or perish, revision control, RFC: Request For Comment, Richard Stallman, Robert Metcalfe, Steven Levy, transaction costs, Turing complete, Valgrind, wage slave, web application

If it had lacked this consistent mathematical model, it would probably look like the design of the original glob(1) facility in the oldest Unixes, a handful of ad-hoc wildcards that can't be combined. The yacc(1) utility for generating language parsers is a thin wrapper around the formal theory of LR(1) grammars. Its partner, the lexical analyzer generator lex(1), is a similarly thin wrapper around the theory of nondeterministic finite-state automata. All three of these programs are so bug-free that their correct functioning is taken utterly for granted, and compact enough to fit easily in a programmer's hand. Only a part of these good qualities are due to the polishing that comes with a long service life and frequent use; most of it is that, having been constructed around a strong and provably correct algorithmic core, they never needed much polishing in the first place.

pages: 1,085 words: 219,144

Solr in Action by Trey Grainger, Timothy Potter

business intelligence, cloud computing, commoditize, conceptual framework, crowdsourcing, data acquisition,, failed state, fault tolerance, finite state, full text search, glass ceiling, information retrieval, natural language processing, openstreetmap, performance metric, premature optimization, recommendation engine, web application

In a large index, there can be many terms matching a prefix, so we also need a way to sort the suggestions by popularity, as we don’t want low-frequency, rare terms polluting our suggestion lists. Think of it in terms of doing the most good for the most users; if a term only occurs in a few documents out of millions, it’s probably not a good suggestion. In listing 10.11, we’re using the org.apache.solr.spelling.suggest.fst.FSTLookup class, which uses a data structure based on finite state automaton (FST) to provide fast, constant-time lookups regardless of prefix length. The FSTLookupFactory implementation is slower to build but requires a smaller memory footprint, which makes it a good choice for large-term dictionaries. The suggestion dictionary must be rebuilt to incorporate new terms as new documents are indexed. In our configuration, we chose to rebuild the dictionary after every commit (buildOnCommit=true).

Martin Kleppmann-Designing Data-Intensive Applications. The Big Ideas Behind Reliable, Scalable and Maintainable Systems-O’Reilly (2017) by Unknown

active measures, Amazon Web Services, bitcoin, blockchain, business intelligence, business process,, cloud computing, collaborative editing, commoditize, conceptual framework, cryptocurrency, database schema, DevOps, distributed ledger, Donald Knuth, Edward Snowden, Ethereum, ethereum blockchain, fault tolerance, finite state, Flash crash, full text search, general-purpose programming language, informal economy, information retrieval, Internet of things, iterative process, John von Neumann, Kubernetes, loose coupling, Marc Andreessen, microservices, natural language processing, Network effects, packet switching, peer-to-peer, performance metric, place-making, premature optimization, recommendation engine, Richard Feynman, self-driving car, semantic web, Shoshana Zuboff, social graph, social web, software as a service, software is eating the world, sorting algorithm, source of truth, SPARQL, speech recognition, statistical model, undersea cable, web application, WebSocket, wikimedia commons

To cope with typos in documents or queries, Lucene is able to search text for words within a certain edit distance (an edit distance of 1 means that one letter has been added, removed, or replaced) [37]. As mentioned in “Making an LSM-tree out of SSTables” on page 78, Lucene uses a SSTable-like structure for its term dictionary. This structure requires a small inmemory index that tells queries at which offset in the sorted file they need to look for a key. In LevelDB, this in-memory index is a sparse collection of some of the keys, but in Lucene, the in-memory index is a finite state automaton over the characters in the keys, similar to a trie [38]. This automaton can be transformed into a Levenshtein automaton, which supports efficient search for words within a given edit distance [39]. Other fuzzy search techniques go in the direction of document classification and machine learning. See an information retrieval textbook for more detail [e.g., 40]. Keeping everything in memory The data structures discussed so far in this chapter have all been answers to the limi‐ tations of disks.

pages: 846 words: 232,630

Darwin's Dangerous Idea: Evolution and the Meanings of Life by Daniel C. Dennett

Albert Einstein, Alfred Russel Wallace, anthropic principle, assortative mating, buy low sell high, cellular automata, combinatorial explosion, complexity theory, computer age, conceptual framework, Conway's Game of Life, Danny Hillis, double helix, Douglas Hofstadter, Drosophila, finite state, Gödel, Escher, Bach, In Cold Blood by Truman Capote, invention of writing, Isaac Newton, Johann Wolfgang von Goethe, John von Neumann, Murray Gell-Mann, New Journalism, non-fiction novel, Peter Singer: altruism, phenotype, price mechanism, prisoner's dilemma, QWERTY keyboard, random walk, Richard Feynman, Rodney Brooks, Schrödinger's Cat, selection bias, Stephen Hawking, Steven Pinker, strong AI, the scientific method, theory of mind, Thomas Malthus, Turing machine, Turing test

Chomsky's work followed closely in the path of Turing's purely logical investigations into the powers of what we now call computers. Chomsky eventually defined an ascending scale of types of grammars or types of languages — the Chomsky Hierarchy, on which all students of computation theory still cut their teeth — and showed how these grammars were interdefinable with an ascending scale of types of automata or computers — from "finite state machines" through "push-down automata" and "linear bounded machines" to "Turing machines." I can vividly remember the shock wave that rolled through philosophy when Chomsky's work first came to our attention a few years later. In 1960, my sophomore year at Harvard, I asked Professor Quine what critics of his views I should be reading. (I considered myself at the time to be an anti-Quinian of ferocious conviction, and was already beginning to develop the arguments for my senior thesis, attacking him.

pages: 1,237 words: 227,370

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems by Martin Kleppmann

active measures, Amazon Web Services, bitcoin, blockchain, business intelligence, business process,, cloud computing, collaborative editing, commoditize, conceptual framework, cryptocurrency, database schema, DevOps, distributed ledger, Donald Knuth, Edward Snowden, Ethereum, ethereum blockchain, fault tolerance, finite state, Flash crash, full text search, general-purpose programming language, informal economy, information retrieval, Infrastructure as a Service, Internet of things, iterative process, John von Neumann, Kubernetes, loose coupling, Marc Andreessen, microservices, natural language processing, Network effects, packet switching, peer-to-peer, performance metric, place-making, premature optimization, recommendation engine, Richard Feynman, self-driving car, semantic web, Shoshana Zuboff, social graph, social web, software as a service, software is eating the world, sorting algorithm, source of truth, SPARQL, speech recognition, statistical model, undersea cable, web application, WebSocket, wikimedia commons

To cope with typos in documents or queries, Lucene is able to search text for words within a certain edit distance (an edit distance of 1 means that one letter has been added, removed, or replaced) [37]. As mentioned in “Making an LSM-tree out of SSTables”, Lucene uses a SSTable-like structure for its term dictionary. This structure requires a small in-memory index that tells queries at which offset in the sorted file they need to look for a key. In LevelDB, this in-memory index is a sparse collection of some of the keys, but in Lucene, the in-memory index is a finite state automaton over the characters in the keys, similar to a trie [38]. This automaton can be transformed into a Levenshtein automaton, which supports efficient search for words within a given edit distance [39]. Other fuzzy search techniques go in the direction of document classification and machine learning. See an information retrieval textbook for more detail [e.g., 40]. Keeping everything in memory The data structures discussed so far in this chapter have all been answers to the limitations of disks.

Data Mining: Concepts and Techniques: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei

bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, distributed generation, finite state, information retrieval, iterative process, knowledge worker, linked data, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, random walk, recommendation engine, RFID, semantic web, sentiment analysis, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, Thomas Bayes, web application

If the behavior attribute values of the object significantly deviate from the values predicted by the model, then the object can be declared a contextual outlier. By using a prediction model that links the contexts and behavior, these methods avoid the explicit identification of specific contexts. A number of classification and prediction techniques can be used to build such models such as regression, Markov models, and finite state automaton. Interested readers are referred to Chapter 8 and Chapter 9 on classification and the bibliographic notes for further details (Section 12.11). In summary, contextual outlier detection enhances conventional outlier detection by considering contexts, which are important in many applications. We may be able to detect outliers that cannot be detected otherwise. Consider a credit card user whose income level is low but whose expenditure patterns are similar to those of millionaires.

Europe: A History by Norman Davies

agricultural Revolution, Albert Einstein, anti-communist, Berlin Wall, Bretton Woods, British Empire, business climate, centre right, charter city, clean water, Columbian Exchange, conceptual framework, continuation of politics by other means, Corn Laws, cuban missile crisis, Defenestration of Prague, discovery of DNA, double entry bookkeeping, Edmond Halley, Edward Lloyd's coffeehouse, equal pay for equal work, Eratosthenes, Etonian, European colonialism, experimental economics, financial independence, finite state, Francis Fukuyama: the end of history, Francisco Pizarro, full employment, global village, Honoré de Balzac, Index librorum prohibitorum, interchangeable parts, invention of agriculture, invention of movable type, Isaac Newton, James Hargreaves, James Watt: steam engine, Johann Wolfgang von Goethe, Johannes Kepler, John Harrison: Longitude, joint-stock company, Joseph-Marie Jacquard, land reform, liberation theology, long peace, Louis Blériot, Louis Daguerre, Mahatma Gandhi, mass immigration, Mikhail Gorbachev, Monroe Doctrine, Murano, Venice glass, music of the spheres, New Urbanism, North Sea oil, offshore financial centre, Peace of Westphalia, popular capitalism, Potemkin village, purchasing power parity, Ralph Waldo Emerson, road to serfdom, sceptred isle, Scramble for Africa, spinning jenny, Thales of Miletus, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, trade route, transatlantic slave trade, Transnistria, urban planning, urban sprawl

These downtrodden Ostjuden were the butt of much prejudice, not only from the locals but also from their fellow-Jews who had made the grade in Germany and Austria, and who had left the old Jewish world completely behind.55 European Imperialism in the late nineteenth century differed from earlier forms of the phenomenon in several important ways. It was part of a world-wide scramble for control of the last remaining countries suitable for exploitation. It was evident that the world’s resources were finite: states which set up a colonial empire quickly stood to gain a permanent advantage; those who delayed might be excluded from the ‘First Division’ forever. In the two decades starting in 1875, over one-quarter of the land surface of the globe was seized by half-a-dozen European powers. Colonies were viewed as an integral part of the advanced industrial economies. The supply of raw materials, cheap labour, and semi-finished products was planned to maximize the benefit to the ‘mother country’.