finite state



Syntactic Structures by Noam Chomsky

finite state, language acquisition, P = NP, statistical model

We can also define the grammatical relations in these languages in a formal way in terms of the associated diagrams. 4.2 In § 3 we considered languages, called "finite state languages", which were generated by finite state Markov processes. Now we are considering terminal languages that are generated by systems of the form [Σ, F]. These two types of languages are related in the following way. Theorem: Every finite state language is a terminal language, but there are terminal languages which are not finite state languages.4 The import of this theorem is that description in terms of phrase structure is essentially more powerful than description in terms of the elementary theory presented above in § 3. As examples of terminal languages that are not finite state languages we have the languages (…

Then we call the sequence of words that has been produced a "sentence". Each such machine thus defines a certain language; namely, the set of sentences that can be produced in this way. Any language that can be produced by a machine of this sort we call a finite state language; and we can call the machine itself a finite state grammar. A finite state grammar can be represented graphically in the form of a "state diagram".1 For example, the grammar that produces just the two sentences "the man comes" and "the men come" can be represented by the following state diagram (7). We can extend this grammar to produce an infinite number of sentences by adding closed loops.
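The two-sentence grammar in (7) can be rendered as a small transition table. The state names below are hypothetical (the excerpt gives only the diagram), but the accepted sentences are exactly the two from the text:

```python
# A minimal finite state grammar for "the man comes" / "the men come".
# State names (q1, q2_sg, ...) are illustrative labels for diagram (7).
TRANSITIONS = {
    ("start", "the"): "q1",
    ("q1", "man"): "q2_sg",
    ("q1", "men"): "q2_pl",
    ("q2_sg", "comes"): "accept",
    ("q2_pl", "come"): "accept",
}

def accepts(words):
    """Return True if the word sequence is a sentence of the grammar."""
    state = "start"
    for w in words:
        state = TRANSITIONS.get((state, w))
        if state is None:
            return False
    return state == "accept"

print(accepts(["the", "man", "comes"]))  # True
print(accepts(["the", "men", "comes"]))  # False
```

Adding a closed loop (say, an adjective state that returns to itself) would make the language infinite, as the passage describes.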

Such arbitrary limitations serve no useful purpose, however. The point is that there are processes of sentence formation that finite state grammars are intrinsically not equipped to handle. If these processes have no finite limit, we can prove the literal inapplicability of this elementary theory. If the processes have a limit, then the construction of a finite state grammar will not be literally out of the question, since it will be possible to list the sentences, and a list is essentially a trivial finite state grammar. But this grammar will be so complex that it will be of little use or interest. In general, the assumption that languages are infinite is made in order to simplify the description of these languages.


pages: 496 words: 70,263

Erlang Programming by Francesco Cesarini

cloud computing, fault tolerance, finite state, functional programming, higher-order functions, loose coupling, revision control, RFC: Request For Comment, social bookmarking, sorting algorithm, Turing test, type inference, web application

The loop function is not called, allowing the process to terminate normally. Finite State Machines Erlang processes can be used to implement finite state machines. A finite state machine, or FSM for short, is a model that consists of a finite number of states and events. You can think of an FSM as a model of the world which will contain abstractions from the details of the real system. At any one time, the FSM is in a specific state. Depending on the incoming event and the current state of the FSM, a set of actions and a transition to a new state will occur (see Figure 5-5). Figure 5-5. A finite state machine In Erlang, each state is represented as a tail-recursive function, and each event is represented as an incoming message.

In telecom systems, they are used not only to handle the state of equipment, as in the preceding example, but also in complex protocol stacks. The fact that Erlang handles them graciously is not a surprise.† When prototyping with the early versions of Erlang between 1987 and 1991, it was the Plain Old Telephony System (POTS) finite state machines described in this section (Figure 5-6, a fixed-line phone finite state machine) that the development team used to test their ideas of what Erlang should look like. With a tail-recursive function for every state, actions implemented as function calls, and events represented as messages, this is what the code for the idle state would look like:

    idle() ->
        receive
            {Number, incoming} ->
                start_ringing(),
                ringing(Number);
            off_hook ->
                start_tone(),
                dial()
        end.

    ringing(Number) ->
        receive
            {Number, other_on_hook} ->
                stop_ringing(),
                idle();
            {Number, off_hook} ->
                stop_ringing(),
                connected(Number)
        end.

† Or any other relative of your choice who tends to call you very early on a Saturday morning.

However, you can go into more detail when working with generic servers, supervisors, and applications. Behaviors we have not covered but which we briefly introduced in this chapter include finite state machines, event handlers, and special processes. All of these behavior library modules have manual pages that you can reference. In addition, the Erlang documentation has a section on OTP design principles that provides more details and examples of these behaviors. Finite state machines are a crucial component of telecom systems. In Chapter 5, we introduced the idea of modeling a phone as a finite state machine. If the phone is not being used, it is in state idle. If an incoming call arrives, it goes to state ringing.
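The state-per-function idea translates readily to other languages. Here is a rough Python sketch of the same phone FSM, with each state as a function returning the next state function; the start_*/stop_* actions from the Erlang version are elided to comments, and the state names follow the excerpt:

```python
# Each state is a function returning the next state function, mirroring
# the Erlang tail-recursive style (a sketch; names are illustrative).
def idle(event, number=None):
    if event == "incoming":
        return ringing          # start_ringing() would go here
    if event == "off_hook":
        return dial             # start_tone() would go here
    return idle

def ringing(event, number=None):
    if event == "other_on_hook":
        return idle             # stop_ringing()
    if event == "off_hook":
        return connected        # stop_ringing()
    return ringing

def dial(event, number=None):
    return dial                 # dialling logic elided

def connected(event, number=None):
    return connected            # call in progress

state = idle
for event in ["incoming", "off_hook"]:
    state = state(event)
print(state.__name__)  # connected
```

Unknown events leave the state unchanged, which is one reasonable policy; the Erlang version instead leaves them queued in the mailbox.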


pages: 229 words: 67,599

The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age by Paul J. Nahin

air gap, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Charles Babbage, Claude Shannon: information theory, Computing Machinery and Intelligence, conceptual framework, Edward Thorp, Fellow of the Royal Society, finite state, four colour theorem, Georg Cantor, Grace Hopper, Isaac Newton, John von Neumann, knapsack problem, New Journalism, Pierre-Simon Laplace, reversible computing, Richard Feynman, Schrödinger's Cat, Steve Jobs, Steve Wozniak, thinkpad, Thomas Bayes, Turing machine, Turing test, V2 rocket

Depending both on its present state and the symbol just read, the fourth and final operation of a machine cycle occurs when the finite-state machine transitions to a new state (which may, in fact, be the present state). Then a new machine cycle begins. Figure 9.1.1 shows the connection of a finite-state machine, the read/write head, and the tape. The entire arrangement, all together, is what we call a Turing machine. When placed into operation we’ll imagine that the tape is initially blank (that is, the symbol 0 is on all of the tape’s squares)—except for some finite number of squares that have 1s. By convention, we’ll always take the finite-state machine as initially in state 1. And finally, we must specify over which square on the tape the read/write head is initially placed.

And finally, we must specify over which square on the tape the read/write head is initially placed. The finite-state machine and the read/write head then move along the tape (we imagine the tape is motionless) according to the internal details of the finite-state machine and the particular sequence of symbols encountered on the tape. At some time after we turn the Turing machine on, it presumably completes its task (whatever that might be), and the finite-state machine enters state 0 (called the halting state) and stops. Figure 9.1.1. A Turing machine. As far as I know, nobody has ever actually constructed a Turing machine.

—Claude Shannon, in a March 1952 talk at Bell Labs on creativity, during which he explained how he arrived at the logic circuitry for a machine that plays a perfect game of Nim 9.1 THE FIRST MODERN COMPUTER A Turing machine is the combination of a sequential, finite-state machine plus an external read/write memory storage medium called the tape (think of a ribbon of magnetic tape). The tape is a linear sequence of squares, with each square holding one of several possible symbols. Most generally, a Turing machine can have any number of different symbols it can recognize, but I’ll assume here that we are discussing the 2-symbol case (0 or 1). In 1956, Shannon showed that this in no way limits the power of what a Turing machine can do. The tape is arbitrarily long in at least one, perhaps both, directions. The finite-state machine is connected to a read/write head, which at each machine cycle (I’ll define what that is in just a moment) is located over a square on the tape.
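The machine cycle described above (read, write, move, transition) is short enough to simulate directly. The table format below is my own; following the text's conventions, state 0 is the halting state and state 1 the initial state, and blank squares hold 0:

```python
# A minimal 2-symbol Turing machine simulator (sketch).
def run(program, tape, head=0, state=1, max_steps=10_000):
    """program maps (state, symbol) -> (write, move, next_state)."""
    for _ in range(max_steps):
        if state == 0:                    # halting state, per the text
            break
        symbol = tape.get(head, 0)        # blank squares hold 0
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += move                      # +1 = right, -1 = left, 0 = stay
    return tape

# Example machine: scan right over 1s, write one more 1, then halt.
append_one = {
    (1, 1): (1, +1, 1),   # keep moving right over 1s
    (1, 0): (1, 0, 0),    # first blank: write a 1 and halt
}
tape = run(append_one, {0: 1, 1: 1, 2: 1})
print(sorted(k for k, v in tape.items() if v == 1))  # [0, 1, 2, 3]
```

Representing the tape as a dictionary keeps it "arbitrarily long in both directions" without allocating it up front.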


pages: 1,025 words: 150,187

ZeroMQ by Pieter Hintjens

AGPL, anti-pattern, behavioural economics, carbon footprint, cloud computing, Debian, distributed revision control, domain-specific language, eat what you kill, Eben Moglen, exponential backoff, factory automation, fail fast, fault tolerance, fear of failure, finite state, Internet of things, iterative process, no silver bullet, power law, premature optimization, profit motive, pull request, revision control, RFC: Request For Comment, Richard Stallman, Skype, smart transportation, software patent, Steve Jobs, Valgrind, WebSocket

#include "czmq.h"

// States in which we can be at any point in time
typedef enum {
    STATE_PRIMARY = 1,          // Primary, waiting for peer to connect
    STATE_BACKUP = 2,           // Backup, waiting for peer to connect
    STATE_ACTIVE = 3,           // Active - accepting connections
    STATE_PASSIVE = 4           // Passive - not accepting connections
} state_t;

// Events, which start with the states our peer can be in
typedef enum {
    PEER_PRIMARY = 1,           // HA peer is pending primary
    PEER_BACKUP = 2,            // HA peer is pending backup
    PEER_ACTIVE = 3,            // HA peer is active
    PEER_PASSIVE = 4,           // HA peer is passive
    CLIENT_REQUEST = 5          // Client makes request
} event_t;

// Our finite-state machine
typedef struct {
    state_t state;              // Current state
    event_t event;              // Current event
    int64_t peer_expiry;        // When peer is considered "dead"
} bstar_t;

// We send state information this often
// If peer doesn't respond in two heartbeats, it is "dead"
#define HEARTBEAT 1000          // In msec

The heart of the Binary Star design is its finite-state machine (FSM). The FSM runs one event at a time. We apply an event to the current state, which checks if the event is accepted, and if so sets a new state (Example 4-62).

    …\n", server [server_nbr]);
            client = zsocket_new (ctx, ZMQ_REQ);
            zsocket_connect (client, server [server_nbr]);
            // Send request again, on new socket
            zstr_send (client, request);
            }
        }
    }
    zctx_destroy (&ctx);
    return 0;
    }

To test our Binary Star implementation, start the servers and client in any order:

    bstarsrv -p     # Start primary
    bstarsrv -b     # Start backup
    bstarcli

You can then provoke failover by killing the primary server, and recovery by restarting the primary and killing the backup. Note how it’s the client vote that triggers failover and recovery. Binary Star is driven by a finite-state machine (Figure 4-8). States in white accept client requests, and states in gray refuse them. Events are the peer state, so “Peer Active” means the other server has told us it’s active. “Client Request” means we’ve received a client request. “Client Vote” means we’ve received a client request and our peer has been inactive for two heartbeats. Figure 4-8. Binary Star finite-state machine Note that the servers use PUB-SUB sockets for state exchange. No other socket combination will work here.
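The "apply an event to the current state" step can be sketched in a few lines. The constants mirror the C enums above; the transition rules shown are only a subset of the real bstar logic, picked to show the shape of the FSM:

```python
# Simplified sketch of Binary Star event handling (not the full logic).
STATE_PRIMARY, STATE_BACKUP, STATE_ACTIVE, STATE_PASSIVE = 1, 2, 3, 4
PEER_PRIMARY, PEER_BACKUP, PEER_ACTIVE, PEER_PASSIVE, CLIENT_REQUEST = 1, 2, 3, 4, 5

def apply_event(state, event):
    """Return the new state, or raise on a dual-active error."""
    if state == STATE_PRIMARY and event == PEER_BACKUP:
        return STATE_ACTIVE        # peer connected as backup; we go active
    if state == STATE_PRIMARY and event == PEER_ACTIVE:
        return STATE_PASSIVE       # peer is already active; we stand by
    if state == STATE_BACKUP and event == PEER_ACTIVE:
        return STATE_PASSIVE
    if state == STATE_ACTIVE and event == PEER_ACTIVE:
        raise RuntimeError("dual actives detected")  # split-brain, fatal
    return state                   # otherwise, no transition

print(apply_event(STATE_PRIMARY, PEER_BACKUP) == STATE_ACTIVE)  # True
```

The real implementation also consults `peer_expiry` before accepting a client request in the passive state; that timing logic is omitted here.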

Once you learn the trick, you can whip up your own code generators in a short time. The code generators most software engineers know about come with a single hard-coded model. For instance, Ragel “compiles executable finite state machines from regular languages” (i.e., Ragel’s model is a regular language). This certainly works for a good set of problems, but it’s far from universal. How do you describe an API in Ragel? Or a project makefile? Or even a finite-state machine like the one we used to design the Binary Star pattern in Chapter 4? All of these would benefit from code generation, but there’s no universal model. So, the trick is to design your own models as you need them, and then make code generators as cheap compilers for those models.


pages: 232

A Discipline of Programming by E. Dijkstra

finite state, Turing machine, Y Combinator

He recalls, only when the transition table has been completed, that he has only binary variables (he calls them "flip-flops") at his disposal and that, if the number of states is, say, between 33 and 64 (bounds included) he needs at least six of them. Those six binary variables span a state space of 64 points and the designer is free in his choice of how to identify the different states of his finite-state automaton with points in the 64-point state space. That this choice is not irrelevant becomes clear as soon as we realize that a state transition of the finite-state automaton has to be translated in combinations of boolean variables being operated upon under control of boolean values. To circuit designers this choice is known as the "state assignment problem" and technical constraints or optimization goals may make it a hairy one; so hairy, as a matter of fact, that one may be tempted to challenge the adequacy of the design methodology evoking it.
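The flip-flop count in the passage is just the least k with 2^k ≥ n. A quick sketch (function names are mine), including one arbitrary state assignment of the kind the designer is free to choose:

```python
import math

# Number of flip-flops (binary state variables) needed for n states:
# the least k with 2**k >= n.
def flip_flops_needed(n_states):
    return max(1, math.ceil(math.log2(n_states)))

# One possible (entirely arbitrary) state assignment: state i -> its
# binary code. Picking a *good* assignment is the hairy part.
def assign(n_states):
    k = flip_flops_needed(n_states)
    return {i: format(i, f"0{k}b") for i in range(n_states)}

print(flip_flops_needed(33), flip_flops_needed(64))  # 6 6
```

With 33 to 64 states, six flip-flops are needed and 64 codes are available, so many states-to-codes identifications exist; the state assignment problem is choosing among them.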

But to dismiss this question is not my intention. Actual machines have stores of a finite capacity and, left to themselves, they behave in principle like finite state automata which after a sufficiently large number of steps must return to a state in which they have been before. (If, in addition, the machine is deterministic, history will from then on repeat itself.) For the overall behaviour of today's modern computers, the theory of finite state automata is, however, of limited significance, because their number of possible states is so incredibly huge that they could go on working for years without returning to a state in which they have been before.

F is the predicate that is false in all points of the state space: it corresponds to the empty set. 3 THE CHARACTERIZATION OF SEMANTICS We are primarily interested in systems that, when started in an "initial state", will end up in a "final state" which, as a rule, depends on the choice of the initial state. This is a view that is somewhat different from the idea of the finite state automaton that on the one hand absorbs a stream of input characters and on the other hand produces a stream of output characters. To translate that in our picture we must assume that the value of the input (i.e. the argument) is reflected in the choice of the initial state and that the value of the output (i.e. the answer) is reflected in the final state.


pages: 647 words: 43,757

Types and Programming Languages by Benjamin C. Pierce

Albert Einstein, combinatorial explosion, experimental subject, finite state, functional programming, Henri Poincaré, higher-order functions, Perl 6, power law, Russell's paradox, sorting algorithm, Turing complete, Turing machine, type inference, Y Combinator

To describe the class, we need some additional terminology. 21.5.10 Definition Given an invertible generating function F and an element x ∈ U, the set predF(x) (or just pred(x)) of immediate predecessors of x is pred(x) = support(x) if support(x) is defined, and pred(x) = ∅ otherwise; its extension to sets X ⊆ U is pred(X) = ⋃x∈X pred(x). The set reachableF(X) (or just reachable(X)) of all elements reachable from a set X via support is defined as reachable(X) = ⋃n≥0 predⁿ(X), and its extension to single elements x ∈ U is reachable(x) = reachable({x}). An element y ∈ U is reachable from an element x if y ∈ reachable(x). 21.5.11 Definition An invertible generating function F is said to be finite state if reachable(x) is finite for each x ∈ U. For a finite-state generating function, the search space explored by gfp is finite and gfp always terminates: 21.5.12 Theorem If reachableF(X) is finite, then gfpF(X) is defined. Consequently, if F is finite state, then gfpF(X) terminates for any finite X ⊆ U. Proof: For each recursive call gfp(Y) in the call graph generated by the original invocation gfp(X), we have Y ⊆ reachable(X).

For this, some new terminology is helpful. Given a finite-state generating function F ∈ P(U) → P(U), the partial function heightF ∈ U ⇀ ℕ (or just height) is the least partial function satisfying the following condition:[2] height(x) = 1 + max{height(y) | y ∈ pred(x)}, where the max of the empty set is taken to be 0. (Note that height(x) is undefined if x either participates in a reachability cycle itself or depends on an element from a cycle.) A generating function F is said to be finite height if heightF is a total function. It is easy to check that, if y ∈ support(x) and both height(x) and height(y) are defined, then height(y) < height(x). Now, if F is finite state and finite height, then lfp(X) terminates for any finite input set X ⊆ U.
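The reachable-set computation that gfp's termination argument relies on is a plain transitive closure. A small illustration (the support map here is a made-up example, not from the book; reachable(X) includes X itself, matching the n = 0 case of the union):

```python
# Sketch: reachable(X) as the closure of X under a support/pred function.
def reachable(X, support):
    seen = set(X)
    frontier = set(X)
    while frontier:
        # pred of the frontier, minus what we have already visited
        frontier = {y for x in frontier for y in support.get(x, set())} - seen
        seen |= frontier
    return seen

# Example: a depends on b and c, b depends on c, c on nothing.
support = {"a": {"b", "c"}, "b": {"c"}, "c": set()}
print(sorted(reachable({"a"}, support)))  # ['a', 'b', 'c']
```

When every such closure is finite (the finite-state condition), this loop terminates, which is exactly why the search space explored by gfp is finite.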

At the other end are techniques of much more modest power—modest enough that automatic checkers can be built into compilers, linkers, or program analyzers and thus be applied even by programmers unfamiliar with the underlying theories. One well-known instance of this sort of lightweight formal methods is model checkers, tools that search for errors in finite-state systems such as chip designs or communication protocols. Another that is growing in popularity is run-time monitoring, a collection of techniques that allow a system to detect, dynamically, when one of its components is not behaving according to specification. But by far the most popular and best established lightweight formal methods are type systems, the central focus of this book.


Reactive Messaging Patterns With the Actor Model: Applications and Integration in Scala and Akka by Vaughn Vernon

A Pattern Language, business intelligence, business logic, business process, cloud computing, cognitive dissonance, domain-specific language, en.wikipedia.org, fault tolerance, finite state, functional programming, Internet of things, Kickstarter, loose coupling, remote working, type inference, web application

Even so, the word direct is used because the programming model provides a worthy abstraction that makes it appear that messages are sent directly from one actor to another. • State machines: Actors support finite state machines. When an actor transitions to some expected state, it can modify its behavior in preparation for future messages. By becoming another kind of message handler, the actor implements a finite state machine. • Share nothing: Actors do not share their mutable state with any other actor, or any other component for that matter. • Lock-free concurrency: Since actors do not share their mutable state, and because they receive only one message at a time, actors never need to attempt to lock their state before reacting to a message.

Internally an actor can completely replace its current receive function for another, or it can keep a stack of previous receive functions, essentially pushing them and then popping them back into context.5 An actor may have a few or several different receive functions that it swaps among, but it can have only a finite number of them. This capability enables actors to be effective finite state machines. 5. Akka calls this feature become, enabling an actor to become a different kind of message receiver. Obviously, there are better situations than others to exchange your actor’s receive function. The feature needn’t be overused. Assuming that an actor can reach a given state only if it has received a given prerequisite message, when an actor has received a message in a specific receive block, it can assume that the prerequisite message has already been received within its current instance lifetime.
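The receive-function swap can be sketched outside Akka too. The following is a rough Python analogue in the spirit of become (the class and method names are illustrative, not Akka's API), with a two-state lock as the finite state machine:

```python
# Sketch of an actor swapping its receive function between states.
class Actor:
    def __init__(self):
        self.receive = self.locked       # start in the "locked" state

    def locked(self, msg):
        if msg == "unlock":
            self.receive = self.unlocked # become another message handler
            return "unlocked"
        return "ignored"

    def unlocked(self, msg):
        if msg == "lock":
            self.receive = self.locked
            return "locked"
        return "ignored"

    def tell(self, msg):                 # deliver one message at a time
        return self.receive(msg)

actor = Actor()
print(actor.tell("lock"))    # ignored: "lock" is not valid while locked
print(actor.tell("unlock"))  # unlocked
```

Because only one message is handled at a time, the current receive function can safely assume any prerequisite messages have already been processed, as the passage describes.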

For one thing, an actor can’t determine whether messages are received out of order since the system outside is asynchronous and won’t generally react in exact, predetermined, serialized ways. Well, that’s all part of dealing with nondeterminism as discussed in Chapter 1. Remember, however, that an actor that holds and manages state may be designed as a finite state machine. This allows you to design your actor to understand a given state, which messages can be handled in that state, how the current state will be transitioned, and what to do about messages that can’t be handled while in that state. Yet, that’s design, not test. Gladly, the Akka team understands the ins and outs since they have to test actors themselves to ensure Akka’s robustness.


pages: 834 words: 180,700

The Architecture of Open Source Applications by Amy Brown, Greg Wilson

8-hour work day, anti-pattern, bioinformatics, business logic, c2.com, cloud computing, cognitive load, collaborative editing, combinatorial explosion, computer vision, continuous integration, Conway's law, create, read, update, delete, David Heinemeier Hansson, Debian, domain-specific language, Donald Knuth, en.wikipedia.org, fault tolerance, finite state, Firefox, Free Software Foundation, friendly fire, functional programming, Guido van Rossum, Ken Thompson, linked data, load shedding, locality of reference, loose coupling, Mars Rover, MITM: man-in-the-middle, MVC pattern, One Laptop per Child (OLPC), peer-to-peer, Perl 6, premature optimization, recommendation engine, revision control, Ruby on Rails, side project, Skype, slashdot, social web, speech recognition, the scientific method, The Wisdom of Crowds, web application, WebSocket

Other Worker Behaviors A large number of other worker behaviors can and have been implemented using these same ideas. 15.4.1. Finite State Machines Finite state machines (FSMs), implemented in the gen_fsm behavior module, are a crucial component when implementing protocol stacks in telecom systems (the problem domain Erlang was originally invented for). States are defined as callback functions named after the state that return a tuple containing the next State and the updated loop data. You can send events to these states synchronously and asynchronously. The finite state machine callback module should also export the standard callback functions such as init, terminate, and handle_info.

If the termination reason is non-normal, the process terminates itself, propagating the EXIT signal further. By calling the process_flag(trap_exit, true) BIF, processes can receive the EXIT signals as Erlang messages in their mailbox instead of terminating. Riak uses EXIT signals to monitor the well-being of helper processes performing non-critical work initiated by the request-driving finite state machines. When these helper processes terminate abnormally, the EXIT signal allows the parent to either ignore the error or restart the process. 15.2. Process Skeletons We previously introduced the notion that processes follow a common pattern regardless of the particular purpose for which the process was created.

Finally, on termination, the cleanup will vary from process to process. So, even if a skeleton of generic actions exists, these actions are complemented by specific ones that are directly related to the tasks assigned to the process. Using this skeleton as a template, programmers can create Erlang processes that act as servers, finite state machines, event handlers and supervisors. But instead of re-implementing these patterns every time, they have been placed in library modules referred to as behaviors. They come as part as the OTP middleware. 15.3. OTP Behaviors The core team of developers committing to Riak is spread across nearly a dozen geographical locations.


On Language: Chomsky's Classic Works Language and Responsibility and Reflections on Language in One Volume by Noam Chomsky, Mitsou Ronat

conceptual framework, finite state, language acquisition, machine translation, Paul Samuelson, tacit knowledge, theory of mind

An intuition? N.C.: An intuition again founded on the same anti-empiricism. In my view, a finite state Markov source model might reasonably be considered as characterizing something like the outer limits of empiricist learning theory. In fact, a mathematical psychologist and logician, Patrick Suppes, gave precise expression to this intuition, or one version of it, a few years ago. He proved that a certain very rich version of stimulus-response learning theory must remain within the limits of finite state sources of the kind we have been discussing. He considered this to be a positive result. To me it seemed to be a negative result.

I’m thinking of their “historic” encounters with telecommunications engineers . . . N.C.: Well, at the end of the forties and the beginning of the fifties, there were important developments in the mathematical theory of communication, information theory, and the theory of automata. Technically, models such as finite state Markov sources were proposed . . . Very often it was supposed that these models were appropriate for the description of language. Jakobson referred to this vaguely, but Hockett utilized them quite explicitly. In 1955 he proposed a theory of language structure based on a Markov source model borrowed from the mathematical theory of communication.

He considered this to be a positive result. To me it seemed to be a negative result. The reason is this. As has been known for a long time, even elementary systems of knowledge cannot be represented in terms of finite state Markov sources—for example, our knowledge of English, or even much simpler systems, such as propositional calculus. As a consequence, Suppes’s result showed that knowledge which we possess cannot even be approached at the limit (a fortiori, not attained) by the learning theory he was considering. This constituted a final step in a complete refutation of this learning theory, and consequently, less powerful theories. I did not believe in theories of language based on the Markov source model, which seemed to me to inherit the defects of empiricist learning theory.


pages: 661 words: 187,613

The Language Instinct: How the Mind Creates Language by Steven Pinker

Albert Einstein, Boeing 747, cloud computing, Computing Machinery and Intelligence, David Attenborough, double helix, Drosophila, elephant in my pajamas, finite state, Gregor Mendel, illegal immigration, Joan Didion, language acquisition, Loebner Prize, mass immigration, Maui Hawaii, meta-analysis, MITM: man-in-the-middle, natural language processing, out of africa, phenotype, rolodex, Ronald Reagan, Sapir-Whorf hypothesis, Saturday Night Live, speech recognition, Steven Pinker, Strategic Defense Initiative, tacit knowledge, theory of mind, transatlantic slave trade, Turing machine, Turing test, twin studies, Yogi Berra

He pondered whether to select it is fashionable to scoff at the traditional morality of marriage and family life or it is no longer fashionable to scoff at the traditional morality of marriage and family life. The latter had more of the form’s authentic baroque splendor, he decided. Let’s call this a word-chain device (the technical name is a “finite-state” or “Markov” model). A word-chain device is a bunch of lists of words (or prefabricated phrases) and a set of directions for going from list to list. A processor builds a sentence by selecting a word from one list, then a word from another list, and so on. (To recognize a sentence spoken by another person, one just checks the words against each list in order.)
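A word-chain device is only a few lines of code: lists of words plus directions from list to list. The example lists below are mine, chosen to echo the Chomsky excerpt earlier on this page:

```python
import random

# A toy word-chain (finite-state / Markov) device.
LISTS = {
    "Det": ["the", "a"],
    "Noun": ["man", "men"],
    "Verb": ["comes", "come"],
}
NEXT = {"Det": "Noun", "Noun": "Verb", "Verb": None}  # list-to-list directions

def generate(seed=0):
    rng = random.Random(seed)
    words, slot = [], "Det"
    while slot is not None:
        words.append(rng.choice(LISTS[slot]))  # pick a word from this list
        slot = NEXT[slot]                      # follow the direction
    return " ".join(words)

print(len(generate().split()))  # 3
```

Note that nothing links the noun chosen to the verb chosen, so the device will happily produce "the man come"; this inability to enforce agreement at a distance is among the problems with finite-state devices that the cited Chomsky and Miller discussions develop.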

Pobbles: Edward Lear, “The Pobble Who Has No Toes.” Jabber-wocky: Carroll, 1871/1981. Colorless green ideas: Chomsky, 1957. Automated news story: Frayn, 1965. Example from Miller, 1967. Gobbledygook generators: Brandreth, 1980; Bolinger, 1980; Spy magazine, January 1993. Approximations to English: Miller & Selfridge, 1950. Finite-state devices and their problems: Chomsky, 1957; Miller & Chomsky, 1963; Miller, 1967. TV Guide example from Gleitman, 1981. Cook with round bottom: Columbia Journalism Review, 1980; Lederer, 1987. Impenetrable Chomsky: Chomsky, 1986, p. 79. Textbooks on modern grammatical theory: Friedin, 1992; Radford, 1988; Riemsdijk & Williams, 1986.

Cases typically correspond to the subject, object, indirect object, and the objects of various kinds of prepositions. In English, case is what distinguishes between I, he, she, we, they, which are used for subjects, and me, him, her, us, them, which are used for objects of verbs, objects of prepositions, and everywhere else. chain device. See finite-state device. chromosome. A long strand of DNA, containing thousands of genes, in a protective package. There are twenty-three chromosomes in a human sperm or egg; there are twenty-three pairs of chromosomes (one from the mother, one from the father) in all other human cells. clause. A kind of phrase that is generally the same thing as a sentence, except that some kinds of clause can never occur on their own but only inside a bigger sentence: THE CAT IS ON THE MAT; John arranged FOR MARY TO GO; The spy WHO LOVED ME disappeared; He said THAT SHE LEFT.


When Computers Can Think: The Artificial Intelligence Singularity by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann, Michelle Estes

3D printing, Abraham Maslow, AI winter, air gap, anthropic principle, artificial general intelligence, Asilomar, augmented reality, Automated Insights, autonomous vehicles, availability heuristic, backpropagation, blue-collar work, Boston Dynamics, brain emulation, call centre, cognitive bias, combinatorial explosion, computer vision, Computing Machinery and Intelligence, create, read, update, delete, cuban missile crisis, David Attenborough, DeepMind, disinformation, driverless car, Elon Musk, en.wikipedia.org, epigenetics, Ernest Rutherford, factory automation, feminist movement, finite state, Flynn Effect, friendly AI, general-purpose programming language, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, industrial robot, Isaac Newton, job automation, John von Neumann, Law of Accelerating Returns, license plate recognition, Mahatma Gandhi, mandelbrot fractal, natural language processing, Nick Bostrom, Parkinson's law, patent troll, patient HM, pattern recognition, phenotype, ransomware, Ray Kurzweil, Recombinant DNA, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, sorting algorithm, speech recognition, statistical model, stem cell, Stephen Hawking, Stuxnet, superintelligent machines, technological singularity, Thomas Malthus, Turing machine, Turing test, uranium enrichment, Von Neumann architecture, Watson beat the top human players on Jeopardy!, wikimedia commons, zero day

Hidden Markov models Modern speech understanding systems then feed the result into hidden Markov models, which are a generalization of finite state machines. To understand finite state machines, consider the following secret message, which was encrypted using a simple pen and paper cipher known as single transposition:

SONERENEYDMMO

The cipher can be decrypted to reveal the following plain text:

SENDM OREMO NEY

But pen and paper ciphers do not include a space character, so it is difficult to recognize the words that are within it. The following finite state machine addresses this problem. In it, states are represented as numbered circles and transitions as arrows.

If it is an “S” then we move to state 1, but if we see an “M” then we move to state 5, in which case an “O” would move it to state 6. It can be seen that the letters “M”, “O”, “N”, “E”, and “Y” will result in the machine being in state 11, which represents the word “MONEY”. A large number of words can be loaded into a finite state machine which can then disambiguate them very efficiently. Finite state machine for recognizing words. A hidden Markov model also has states and transitions. However, the transitions are probabilistic, so that a given input may produce several transitions. Further, the actual states and transitions are hidden, so all that is known is the input phones and the resulting words.
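The word-recognizing machine in the figure amounts to a trie walked letter by letter. A rough sketch of using it to restore the missing spaces (the dictionary and the greedy matching strategy are mine; the book only shows the diagram):

```python
# Sketch: segment the decrypted text back into dictionary words by
# walking a small word list, longest match first.
WORDS = ["SEND", "MORE", "MONEY"]

def segment(text, words=WORDS):
    """Greedily split text into dictionary words; None if it fails."""
    out, i = [], 0
    while i < len(text):
        for w in sorted(words, key=len, reverse=True):
            if text.startswith(w, i):     # follow this word's state path
                out.append(w)
                i += len(w)
                break
        else:
            return None                   # no transition from this state
    return out

print(segment("SENDMOREMONEY"))  # ['SEND', 'MORE', 'MONEY']
```

A hidden Markov model generalizes this by making the transitions probabilistic, so several segmentations could be entertained at once and the most probable one chosen.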


pages: 480 words: 99,288

Mastering ElasticSearch by Rafal Kuc, Marek Rogozinski

Amazon Web Services, book value, business logic, create, read, update, delete, en.wikipedia.org, fault tolerance, finite state, full text search, information retrieval

Note Since all the terms are held in the byte array you can have up to 2.1 GB of memory used for this per segment. memory: As its name suggests, this codec writes all the data to disk, but reads the terms and postings lists into memory, using a structure called FST (Finite State Transducers). More information about this structure can be found in a great post by Mike McCandless at http://blog.mikemccandless.com/2010/12/using-finite-state-transducers-in.html. Because of storing the data in memory, this codec may result in a performance boost for commonly used terms. bloom_default: It is an extension of the default codec that adds the functionality of a bloom filter that is written to the disk.

Although this suggester is not about correcting user spelling mistakes, we thought it would be good to show at least a simple example of this highly efficient suggester.

The logic behind the completion suggester

The prefix suggester is based on a data structure called an FST (Finite State Transducer) (http://en.wikipedia.org/wiki/Finite_state_transducer). Although it is highly efficient, it may require significant resources to build on systems with large amounts of data in them, systems that ElasticSearch is perfectly suitable for. If we had to build these structures on the nodes after each restart or cluster state change, we would lose performance.


Principles of Protocol Design by Robin Sharp

accounting loophole / creative accounting, business process, discrete time, exponential backoff, fault tolerance, finite state, functional programming, Gödel, Escher, Bach, information retrieval, loose coupling, MITM: man-in-the-middle, OSI model, packet switching, quantum cryptography, RFC: Request For Comment, stochastic process

For each of the N parties to an N-peer communication, the protocol defines a language, whose sentences are the legal sequences of messages received by that party, and whose alphabet of symbols is the set of all possible messages. A machine to obey the rules of the protocol must thus essentially be a recogniser for the protocol language. For simple protocols, this is a useful abstraction, as the language is regular or at most context-free, and standard compiler techniques [124] can be used to implement the machine as a finite state machine or push-down automaton respectively. A trivial example of a simple protocol described in this way is given below. It is a type of stop-and-wait protocol. The sender requests the receiver to indicate when it is ready to receive data, and waits for this indication. On receipt of the indication, the sender sends the data and waits for an acknowledgment.

The languages to be recognised by the sender and receiver respectively (and of course to be generated by the receiver and sender respectively) are defined by the BNF:

sender ::= readyindication acknowledge sender
receiver ::= requesttoaccept data receiver

Each party must generate and recognise sentences of a regular language. This is a simple task for a finite state machine. Unfortunately, there are some important objections to this language-oriented view of a protocol. The first is a practical objection: simple languages generally do not correspond to protocols which can tolerate faults, such as missing or duplicated messages. Protocols which are fault-tolerant often require the use of state machines with enormous numbers of states, or they may define context-dependent languages.

There exists no local state of the participant such that its concurrency set contains both an abort and a commit state. 2. All states whose concurrency set contains a commit state are committable states. A committable state is one in which the participant knows with certainty that all the other participants have committed or will do so. The major states of Protocol 16 can be illustrated by the finite-state machines shown in Figure 5.9(a). The concurrency set of the state wc in the FSM for the coordinator is ∪i {qi, wi, ai}, while the concurrency set of the states wi in the FSMs for the slaves is {wc, ac, cc} ∪ ∪j≠i {qj, wj, aj, cj}, so the states w fail to fulfil either of the conditions of Theorem 5.1.


pages: 130 words: 11,880

Optimization Methods in Finance by Gerard Cornuejols, Reha Tutuncu

asset allocation, call centre, constrained optimization, correlation coefficient, diversification, financial engineering, finite state, fixed income, frictionless, frictionless market, index fund, linear programming, Long Term Capital Management, passive investing, Sharpe ratio, transaction costs, value at risk

Furthermore, since we have a strictly feasible (and optimal) dual solution, any optimal solution of the primal must have tight constraints, indicating that there is no type-B arbitrage. 3.2 Arbitrage Detection Using Linear Programming The linear programming problems (3.4) and (3.4) we formulated for the proof of Theorem 3.2 can naturally be used for detection of arbitrage opportunities. However, as we discussed above, this argument works only for finite state spaces. In this section, we discuss how LP formulations can be used to detect arbitrage opportunities without limiting consideration to finite state spaces. The price we pay for this flexibility is the restriction on the selection of the securities: we only consider the prices of a set of derivative securities written on the same underlying with the same maturity. This discussion is based on [7].


Writing Effective Use Cases by Alistair Cockburn

business process, c2.com, create, read, update, delete, finite state, index card, information retrieval, iterative process, operational security, recommendation engine, Silicon Valley, web application, work culture

This design information can add to the readability of the requirements document, since it gives both textual (abstract) and visual (concrete) renditions of the system’s behavior. The UI design has three levels of precision: low, medium, and high:

• The low-precision description of the user interface is a screen-navigation diagram, drawn as a finite state machine or statechart. Each state is the name of a screen the user will encounter. The finite state machine shows what user events cause movement from one screen to another.
• The medium-precision description is a drawing or reduced-size snapshot of the screen. Place this at the end of the use case, so that readers can both see and read what design is being nominated.
• The high-precision description lists all the field types, lengths, and validation checks of each screen, and does not belong in the requirements document at all!
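A low-precision screen-navigation diagram of this kind is easy to sketch as a table of (screen, event) → screen transitions. The screens and events below are invented for illustration, not taken from the book.

```python
# Screen-navigation statechart: each state is a screen name, each
# transition a user event that moves to another screen.
SCREEN_NAV = {
    ("Login", "login_ok"): "Home",
    ("Home", "open_search"): "Search",
    ("Search", "select_result"): "Detail",
    ("Detail", "back"): "Search",
}

def navigate(start, events, nav=SCREEN_NAV):
    """Follow a sequence of user events; raise if an event is not allowed."""
    screen = start
    for event in events:
        nxt = nav.get((screen, event))
        if nxt is None:
            raise ValueError(f"event {event!r} not allowed on screen {screen!r}")
        screen = nxt
    return screen
```

Walking the table doubles as a cheap consistency check on the UI design: any event a use case mentions that is missing from the table is a gap in the navigation diagram.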

The programmer and the user interface designer need to know what exactly is meant by address, which fields it contains, the lengths of the fields, the validation rules for addresses, zip codes, phone numbers, and the like. All of this information belongs in the requirements somewhere - and not in the use case! Use cases are only "chapter three" of the requirements document, the behavioral requirements. They do not contain performance requirements, business rules, user interface design, data descriptions, finite state machine behavior, priority, and probably some other information. "Well, where are those requirements?!" the system developers cry! It is all very well to say the use cases shouldn’t contain those, but they have to get documented sometime. Some of the information can, in fact, be attached to each use case as associated information.


pages: 706 words: 120,784

The Joy of Clojure by Michael Fogus, Chris Houser

cloud computing, Dennis Ritchie, domain-specific language, Donald Knuth, Douglas Hofstadter, duck typing, en.wikipedia.org, finite state, functional programming, Gödel, Escher, Bach, haute couture, higher-order functions, Larry Wall, Paul Graham, rolodex, SQL injection, traveling salesman

The final benefit of recur is that it allows the forms fn and loop to act as anonymous recursion points. Why recur indeed. 7.3.3. Don’t forget your trampoline We touched briefly on the fact that Clojure can also optimize a mutually recursive function relationship, but like the tail-recursive case, it’s done explicitly. Mutually recursive functions are nice for implementing finite state automata (FSAs), and in this section we’ll show an example of a simple state machine modeling the operation of an elevator (Mozgovoy 2009) for a two-story building. There are only four states that the elevator FSA allows: on the first floor with the doors open or closed and on the second floor with the doors open or closed.

We’d like to create a function elevator that starts in the ff-open state, takes a sequence of commands, and returns true or false if they correspond to a legal schedule according to the FSA. For example, the sequence [:close :open :done] would be legal, if not pointless, whereas [:open :open :done] wouldn’t be legal, because an open door can’t be reopened. The function elevator could be implemented as shown next. Listing 7.6. Using mutually recursive functions to implement a finite state machine Using letfn in this way allows you to create local functions that reference each other, whereas (let [ff-open #(...)] ...) wouldn’t, because it executes its bindings serially. Each state function contains a case macro that dispatches to the next state based on a contextually valid command.
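The book's Clojure listing is not reproduced above, so here is a sketch of the same idea in Python, assuming the four states and commands described in the text (the transition set is our reading of the prose, not the original listing): each state is a function, mutual recursion is expressed by returning the next state function as a value, and a small trampoline loop keeps calling the returned thunks, so the call stack never grows.

```python
def elevator(commands):
    """Return True if the command sequence is a legal elevator schedule."""
    cmds = iter(commands)

    def ff_open():                      # first floor, doors open
        cmd = next(cmds, None)
        if cmd == ":close":
            return ff_closed
        return cmd == ":done"           # True on :done, False otherwise

    def ff_closed():                    # first floor, doors closed
        cmd = next(cmds, None)
        if cmd == ":open":
            return ff_open
        if cmd == ":up":
            return sf_closed
        return False

    def sf_closed():                    # second floor, doors closed
        cmd = next(cmds, None)
        if cmd == ":open":
            return sf_open
        if cmd == ":down":
            return ff_closed
        return False

    def sf_open():                      # second floor, doors open
        cmd = next(cmds, None)
        if cmd == ":close":
            return sf_closed
        return cmd == ":done"

    state = ff_open                     # trampoline: call until non-callable
    while callable(state):
        state = state()
    return state
```

The trampoline plays the role of Clojure's `trampoline` function: because each state returns rather than calls the next one, an arbitrarily long schedule uses constant stack.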

encapsulation, 5th block-level encapsulation local encapsulation namespace encapsulation Enlive enumeration values enumerator env ephemeral equality, 2nd, 6th, 7th equality partitions, 2nd, 3rd equality semantics error handling, 2nd, 3rd escaped evaluation contextual-eval, 2nd eval, 2nd meta-circular exceptions, 5th, 9th, 10th, 20th exceptions exceptions exceptions exceptions catch, 2nd checked compile-time, 2nd ConcurrentModification-Exception finally, 2nd handling java.lang.ClassCastException java.lang.Exception java.lang.NullPointer-Exception java.lang.RuntimeException runtime runtime vs. compile-time throw, 2nd, 3rd expand-clause expansion expected case experimentation expression problem extend, 2nd, 3rd, 4th extend-protocol, 2nd extend-type, 2nd Extensible Markup Language (XML), 2nd, 3rd F Factor (programming language), 2nd factory methods fail, 2nd false, 3rd evil-false Fantom (programming language) fence post errors filter, 2nd, 3rd, 4th, 5th find-doc find-ns finite state machines first, 2nd, 3rd, 4th, 5th First In, First Out (FIFO), 2nd First In, Last Out (FILO) first-class, 2nd, 3rd fixed-size pool FIXO, 3rd, 5th fixo-peek fixo-push, 2nd, 3rd flexibility float, 2nd floating point, 2nd, 5th overflow rounding error underflow, 2nd floats fluent builder FluentMove fn, 2nd, 3rd, 4th, 5th, 6th for, 2nd force, 2nd, 3rd forever form free variables freedom to focus frequencies Frink (programming language), 2nd frustrating fully qualified, 2nd, 3rd fun functions, 6th anonymous, 2nd, 3rd, 4th arity Calling Functions dangerous function signatures local multiple function bodies named arguments G Gang of Four garbage collection, 2nd, 3rd gcd gen-class, 2nd, 3rd, 4th, 5th, 6th generalized tail-call optimization, 2nd generic genotype gensym get, 2nd, 3rd, 4th get-in getter global hierarchy map goal Gödel, Escher, Bach: An Eternal Golden Braid good-move Graham, Paul, 2nd graphic graphical user interface (GUI), 2nd, 3rd, 4th graphics context greatest common denominator, 2nd 
green thread Greenspun’s Tenth Rule, 2nd Groovy (programming language) H Halloway, Stuart, 2nd has hash maps hash-map, 2nd, 3rd, 4th Haskell (programming language), 2nd, 3rd, 4th, 5th, 7th, 9th, 11th, 13th out of order execution Template Haskell typeclasses, 2nd heuristic Hickey, Rich, 2nd hidden hierarchy history homoiconicity, 2nd hooks, 2nd hops host semantics Hoyte, Doug hyphens I I/O, 2nd, 3rd idempotent, 2nd identical?


Monte Carlo Simulation and Finance by Don L. McLeish

algorithmic bias, Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, discrete time, distributed generation, finite state, frictionless, frictionless market, implied volatility, incomplete markets, invention of the printing press, martingale, p-value, random walk, risk free rate, Sharpe ratio, short selling, stochastic process, stochastic volatility, survivorship bias, the market place, transaction costs, value at risk, Wiener process, zero-coupon bond, zero-sum game

Random Samples Associated with Markov Chains

Consider a finite state Markov chain, a sequence of (discrete) random variables X1, X2, . . . each of which takes integer values 1, 2, . . . , N (called states). The number of states of a Markov chain may be large or even infinite, and it is not always convenient to label them with the positive integers, so it is common to define the state space as the set of all possible states of a Markov chain; we will give some examples of this later. For the present we restrict attention to the case of a finite state space. The transition probability matrix is a matrix P describing the conditional probability of moving between possible states of the chain, so that P[Xn+1 = j | Xn = i] = Pij, for i, j = 1, . . . , N.
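As a concrete sketch, simulating such a finite state chain only requires sampling each step from the row of P corresponding to the current state. The 2-state transition matrix below is invented for illustration.

```python
import random

def simulate_chain(P, start, steps, rng=random.Random(0)):
    """Simulate a finite state Markov chain with states 1..N.

    P[i-1][j-1] is the conditional probability of moving from state i
    to state j; each row of P must sum to 1.
    """
    path = [start]
    state = start
    for _ in range(steps):
        r, cum = rng.random(), 0.0
        for j, p in enumerate(P[state - 1], start=1):   # inverse-CDF sampling
            cum += p
            if r < cum:
                state = j
                break
        path.append(state)
    return path

P = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_chain(P, start=1, steps=100)
```

With a fixed seed the path is reproducible, which is convenient when checking Monte Carlo estimates against each other.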

Coupling From the Past: Sampling from the stationary distribution of a Markov Chain All of the above methods assume that we generate from the stationary distribution of a Markov chain by the “until Hell freezes over” method, i.e. we run the chain from an arbitrary starting value and then delete the initial transient. An alternative elegant method that is feasible at least for some finite state Markov chains is the method of “coupling from the past” due to Propp and Wilson (1996). We assume that we are able to generate transitions in the Markov chain. In other words, if the chain is presently in state i at time n, we are able to generate a random variable Xn+1 from the distribution proportional to Pij, j = 1, . . . , K.
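A minimal sketch of the Propp-Wilson idea for a small chain (states indexed 0..K-1 here for convenience): run coupled copies of the chain from every state, starting further and further in the past while reusing the same randomness for each time step, until all copies have coalesced by time 0; the common value is then an exact draw from the stationary distribution. The transition matrix is illustrative.

```python
import random

def update(state, u, P):
    """Advance one step using uniform u (inverse-CDF coupling: same u for all copies)."""
    cum = 0.0
    for j, p in enumerate(P[state]):
        cum += p
        if u < cum:
            return j
    return len(P) - 1

def coupling_from_the_past(P, rng=random.Random(1)):
    us = []                                  # randomness for times -1, -2, ...
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())          # extend further into the past
        states = list(range(len(P)))         # one copy per starting state
        for t in range(T - 1, -1, -1):       # step from time -T up to time 0
            u = us[t]                        # reused across increasing T
            states = [update(s, u, P) for s in states]
        if len(set(states)) == 1:            # all copies coalesced
            return states[0]
        T *= 2                               # go further back and try again

P = [[0.9, 0.1],
     [0.5, 0.5]]
sample = coupling_from_the_past(P)
```

The crucial detail is that the random numbers for recent time steps are reused as T grows; restarting with fresh randomness each time would bias the sample.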


Applied Cryptography: Protocols, Algorithms, and Source Code in C by Bruce Schneier

active measures, cellular automata, Claude Shannon: information theory, complexity theory, dark matter, Donald Davies, Donald Knuth, dumpster diving, Dutch auction, end-to-end encryption, Exxon Valdez, fault tolerance, finite state, heat death of the universe, information security, invisible hand, John von Neumann, knapsack problem, MITM: man-in-the-middle, Multics, NP-complete, OSI model, P = NP, packet switching, quantum cryptography, RAND corporation, RFC: Request For Comment, seminal paper, software patent, telemarketer, traveling salesman, Turing machine, web of trust, Zimmermann PGP

A computer can only be in a finite number of states (a large finite number, but a finite number nonetheless), and the stuff that comes out will always be a deterministic function of the stuff that went in and the computer’s current state. That means that any random-number generator on a computer (at least, on a finite-state machine) is, by definition, periodic. Anything that is periodic is, by definition, predictable. And if something is predictable, it can’t be random. A true random-number generator requires some random input; a computer can’t provide that. Pseudo-Random Sequences The best a computer can produce is a pseudo-random-sequence generator.
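The periodicity argument is easy to see concretely with a toy generator whose state space is deliberately tiny. The linear congruential generator below is for illustration only; it is neither secure nor even a good generator, but its finite state forces a repeat within at most m steps.

```python
def lcg(seed, a=5, c=3, m=16):
    """Yield the endless stream x -> (a*x + c) mod m; only m states exist."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def find_period(seed, n=64):
    """Return the distance between the first repeated output, i.e. the period."""
    gen = lcg(seed)
    seen = {}
    for i in range(n):
        x = next(gen)
        if x in seen:
            return i - seen[x]
        seen[x] = i
    return None

period = find_period(seed=1)
```

With only 16 possible states the generator must revisit one within 16 steps, and from then on the output repeats forever; the same argument applies, with an astronomically larger bound, to any generator running on a finite-state machine.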

Is there such a thing as randomness? What is a random sequence? How do you know if a sequence is random? Is “101110100” more random than “101010101”? Quantum mechanics tells us that there is honest-to-goodness randomness in the real world. But can we preserve that randomness in the deterministic world of computer chips and finite-state machines? Philosophy aside, from our point of view a sequence generator is real random if it has this additional third property: 3. It cannot be reliably reproduced. If you run the sequence generator twice with the exact same input (at least as exact as humanly possible), you will get two completely unrelated random sequences.

A full discussion on these four approaches and the research surrounding them is well beyond the scope of this book. See [1047,1355] for a good introduction to the topic; I am only going to touch on the major contributions to the field. The first approach treats a cryptographic protocol as any other computer program and attempts to prove correctness. Some researchers represent a protocol as a finite-state machine [1449,1565], others use extensions of first-order predicate calculus [822], and still others use specification languages to analyze protocols [1566]. However, proving correctness is not the same as proving security and this approach fails to detect many flawed protocols. Although it was widely studied at first, most of the work in this area has been redirected as the third approach gained popularity.


pages: 239 words: 64,812

Geek Sublime: The Beauty of Code, the Code of Beauty by Vikram Chandra

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple II, barriers to entry, Berlin Wall, Big Tech, British Empire, business process, Californian Ideology, Charles Babbage, conceptual framework, create, read, update, delete, crowdsourcing, don't repeat yourself, Donald Knuth, East Village, European colonialism, finite state, Firefox, Flash crash, functional programming, glass ceiling, Grace Hopper, Hacker News, haute couture, hype cycle, iterative process, Jaron Lanier, John von Neumann, land reform, London Whale, Norman Mailer, Paul Graham, pink-collar, revision control, Silicon Valley, Silicon Valley ideology, Skype, Steve Jobs, Steve Wozniak, supercomputer in your pocket, synthetic biology, tech worker, the Cathedral and the Bazaar, theory of mind, Therac-25, Turing machine, wikimedia commons, women in the workforce

One of the pioneers of the newly minted field of Aesthetic Computing, Paul Fishwick, points out that digital watches and DVRs surface the abstract computer-science notion of “finite state machines” in their menus—each time you press the “mode” button you move from one state to another: The way in which our thinking is changing culturally surfaces deep abstract concepts in computing to us as we use these devices: from number, to information structure, to process … It is not just that the finite state machine is embedded within the watch’s silicon, but also that the human wearing the watch becomes aware of this virtual machine’s structure and its components through the experience of using the watch.


pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden

Benevolent Dictator For Life (BDFL), business intelligence, business logic, business process, cellular automata, cloud computing, cognitive load, commoditize, complexity theory, conceptual framework, continuous integration, data acquisition, Dennis Ritchie, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, functional programming, general-purpose programming language, Guido van Rossum, higher-order functions, history of Unix, HyperCard, industrial research laboratory, information retrieval, information security, iterative process, Ivan Sutherland, John von Neumann, Ken Thompson, Larry Ellison, Larry Wall, linear programming, loose coupling, machine readable, machine translation, Mars Rover, millennium bug, Multics, NP-complete, Paul Graham, performance metric, Perl 6, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Ruby on Rails, Sapir-Whorf hypothesis, seminal paper, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, SQL injection, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

How do you make the idea of syntax-driven transformations accessible to users who might not know very much or anything at all about finite-state machines and push-down automata? Al: Certainly as a user of AWK, you don’t need to know about these concepts. On the other hand, if you’re into language design and implementation, knowledge of finite-state machines and context-free grammars is essential. Should a user of lex or yacc understand the context-free grammar even if the programs they produce don’t require their users to understand them? Al: Most users of lex can use lex without understanding what a finite-state machine is. A user of yacc is really writing a context-free grammar, so from that perspective, the user of yacc certainly gets to appreciate grammars, but the user doesn’t have to become a formal language theorist to use yacc.

Let me interpret automata theory as formal languages and the automata that recognize them. Automata theory provides useful notations, particularly regular expressions and context-free grammars, for describing the important syntactic features of programming languages. The automata that recognize these formal languages, such as finite-state machines and push-down automata, can serve as models for the algorithms used by compilers to scan and parse programs. Perhaps the greatest benefit of automata theory to compiling comes from being able to build compiler-construction tools such as lex and yacc that automate the construction of efficient scanners and parsers based on these automata.

How can we design pattern-matching algorithms that take advantage of concurrency in multicore hardware? Al: This is currently an active research area. Many researchers are exploring parallel hardware and software implementations of pattern-matching algorithms like the Aho-Corasick algorithm or finite-state algorithms. Some of the strong motivators are genomic analyses and intrusion detection systems. What motivated you and Corasick to develop the Aho-Corasick algorithm? Al: The origin has a very interesting story behind it. I was working on the book The Design and Analysis of Computer Algorithms [Addison-Wesley] with John Hopcroft and Jeffrey Ullman back in the early 70s.


The Art of Scalability: Scalable Web Architecture, Processes, and Organizations for the Modern Enterprise by Martin L. Abbott, Michael T. Fisher

always be closing, anti-pattern, barriers to entry, Bernie Madoff, business climate, business continuity plan, business intelligence, business logic, business process, call centre, cloud computing, combinatorial explosion, commoditize, Computer Numeric Control, conceptual framework, database schema, discounted cash flows, Dunning–Kruger effect, en.wikipedia.org, fault tolerance, finite state, friendly fire, functional programming, hiring and firing, Infrastructure as a Service, inventory management, machine readable, new economy, OSI model, packet switching, performance metric, platform as a service, Ponzi scheme, power law, RFC: Request For Comment, risk tolerance, Rubik’s Cube, Search for Extraterrestrial Intelligence, SETI@home, shareholder value, Silicon Valley, six sigma, software as a service, the scientific method, transaction costs, Vilfredo Pareto, web application, Y2K

You may recall from a computer science computational theory class the description of Mealy and Moore machines, which are known as state machines or finite state machines. A state machine is an abstract model of states and actions that is used to model behavior; these can be implemented in the real world in either hardware or software. There are other ways to model or describe behavior of an application, but the state machine is one of the most common.

Mealy and Moore Machines

A Mealy machine is a finite state machine that generates output based on the input and the current state of the machine. A Moore machine, on the other hand, is a finite state machine that generates output based solely on the current state.

The output is determined by the current state of the light as well as the input. If a car is waiting and the current state is red, the signal gets turned to green. Obviously, these are both overly simplified examples, but you get the point that there are different ways of modeling behavior using states, inputs, outputs, and actions. Given that finite state machines are one of the fundamental aspects of theoretical computer science, mathematically modeled by automata, it is no wonder that this is a fundamental structure of our system designs. But why exactly do we see state in almost all of our programs, and are there alternatives? The reason that most applications rely on state is that the languages used for Web-based or Software as a Service (SaaS) development are almost all imperative.
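The traffic-signal example can be sketched as both machine types; the state and output names below are invented for illustration. The Mealy step consumes the current state and the input, while the Moore output is a function of the state alone.

```python
def mealy_step(state, car_waiting):
    """Mealy machine: (next_state, output) from current state AND input."""
    if state == "red" and car_waiting:
        return "green", "switch_to_green"
    if state == "green" and not car_waiting:
        return "red", "switch_to_red"
    return state, "no_change"

# Moore machine: output depends solely on the current state.
MOORE_OUTPUT = {"red": "stop", "green": "go"}

def moore_output(state):
    return MOORE_OUTPUT[state]
```

The contrast is visible in the signatures: the Mealy output cannot be computed without the input, whereas the Moore output is just a lookup on the state.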

Additionally, we discussed why it is important to have individuals like architects and managers overseeing the entire system design to help point out to engineers when asynchronous calls could be warranted. Another topic that we covered in this chapter was the use of state in an application. We started with a definition of state within application development. We then dove into a discussion in computational theory on finite state machines and concluded with a distinction between imperative and declarative languages. We finished the stateful versus stateless conversation with one of the most commonly used implementations of state: the session state. A session, as we defined it, is an established communication between the client (typically the user’s browser) and the server that is maintained during the life of the session for that user.


pages: 593 words: 118,995

Relevant Search: With Examples Using Elasticsearch and Solr by Doug Turnbull, John Berryman

business logic, cognitive load, commoditize, crowdsourcing, data science, domain-specific language, Dr. Strangelove, fail fast, finite state, fudge factor, full text search, heat death of the universe, information retrieval, machine readable, natural language processing, premature optimization, recommendation engine, sentiment analysis, the long tail

This component circumvents the performance problem referenced previously and allows for custom sorting of completion results (rather than sorting by occurrence). Effectively, the completion suggester is a specialized search index that’s stored in parallel with the normal search index. It’s backed by a compact data structure (a finite state transducer) that provides a fast prefix-lookup capability. In many ways, this approach is the ideal solution for completion, but as you’ll see in a moment, it introduces a couple of problems of its own. Setting up the completion suggester is simple: you just declare one of the fields to be of type completion.
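The FST behind the completion suggester is, for lookup purposes, a compact prefix automaton. A plain trie (without the FST's output sharing and compression) is enough to sketch the behaviour, including the custom sorting by weight rather than by occurrence; the titles and weights below are invented.

```python
class Trie:
    """Prefix-lookup structure standing in for the suggester's FST."""

    def __init__(self):
        self.children = {}
        self.entries = []                     # (weight, suggestion) pairs

    def insert(self, text, weight):
        node = self
        for ch in text.lower():
            node = node.children.setdefault(ch, Trie())
            node.entries.append((weight, text))   # every prefix node sees it

    def complete(self, prefix, n=3):
        node = self
        for ch in prefix.lower():
            node = node.children.get(ch)
            if node is None:
                return []
        # custom sorting by weight, not by occurrence
        return [t for _, t in sorted(node.entries, reverse=True)[:n]]

trie = Trie()
trie.insert("star trek", 10)
trie.insert("star wars", 8)
trie.insert("stargate", 5)
```

Storing candidates at every prefix node trades memory for O(len(prefix)) lookup, which is the same trade the real suggester makes, just without the FST's compression.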

Here, based on the context of the entire query, it would be unreasonable to return anything but Star Trek movies. But in the case of the Elasticsearch completion suggester, the top completion result is Star Wars: Episode IV—A New Hope. This demonstrates that the completion suggester is unaware of the search context. Another unfortunate consequence with using a finite state transducer to back completion is that the data structure is immutable. If a document is deleted from the index, the completions for that document will still exist. Currently, the only remedy for this situation is a full-index optimization, which effectively rebuilds the completion index from scratch.

simple signals, 2nd Solr additive, with Boolean queries boosting feature mappings multiplicative, with function queries user ratings vs. filtering breadcrumb navigation browse experience browse interface, Yowl buckets section building signals bulk index API bulkMovies string business and domain awareness business concerns group business weight business-ranking logic BusinessScore C cast.name field, 2nd, 3rd, 4th, 5th cast.name scores cast.name.bigrammed field, 2nd, 3rd, 4th character filtering, 2nd, 3rd character offsets classic similarity classification features cleaning click-through rate co-occurrence counting cold-start problem COLLAB_FILTER filter, 2nd collaboration filtering, using co-occurrence counting search relevance and collation collocation extraction combining fields committed documents common words, removing completion field, 2nd completion suggester completion_analyzer completion_prefix variable complexphrase query parser compound queries, 2nd, 3rd concept search basic methods for building augmenting content with synonyms concept signals building using machine learning personalized search and configurations conflate tokens constant_score query, 2nd content augmentation curation engineer/curator pairing risk of miscommunication with content curator role of content curator exploring extracting into documents providing to search engine searching content group content weight, 2nd ContentScore control analysis controlling field matching converge conversion rate coord (coordinating factor), 2nd, 3rd, 4th, 5th, 6th, 7th copyField, 2nd, 3rd copy_to option, 2nd cosine similarity cross_fields, 2nd, 3rd searching, 2nd, 3rd, 4th Solr solving signal discordance with cuisine field cuisine_hifi field, 2nd cuisine_lofi field curation, search relevance and custom all field custom score query D data-driven culture debugging example search application Elasticsearch first searches with The Movie Database Python matching query matching analysis to solve matching issues 
comparing query to inverted index fixing by changing analyzers query parsing underlying strategy ranking computing weight explain feature scoring matches to measure relevance search term importance similarity vector-space model decay functions, 2nd deep paging default analyzer defType parameter delimiters acronyms modeling specificity phone numbers synonyms tokenizing geographic data tokenizing integers tokenizing melodies deployment, relevance-focused search application description field, 2nd, 3rd, 4th, 5th descriptive query directors field directors.name field, 2nd, 3rd, 4th directors.name score directors.name.bigrammed, 2nd, 3rd disable_coord option disabling tokenization discriminating fields DisjunctionMaximumQuery dismax, 2nd doc frequency, 2nd doc values document search and retrieval aggregations Boolean search facets filtering Lucene-based search positional and phrase matching ranked results relevance sorting document-ranking system documents analysis enhancement enrichment extraction flattening nested grouping similar matching meaning of scored search completion from documents being searched tokens as features of matching process meaning of documents dot character dot product, 2nd down-boosting title DSL (domain-specific language) E e-commerce search, 2nd easy_install utility edismax query parser, 2nd Elasticsearch example search application overview end sentinels engaged field engaged restaurants English analyzer overview reindexing with english_* filters english_bigrams analyzer english_keywords filter english_possessive_stemmer filter english_stemmer filter english_stop filter enrichment, 2nd ETL (extract, transform, load), 2nd every field gets a vote exact matching, 2nd, 3rd, 4th expert search, 2nd, 3rd explanation field external sources extract function, 2nd, 3rd extracting features extraction F faceted browsing overview Solr facet.prefix option facets, 2nd, 3rd fail fast, 2nd, 3rd, 4th fast vector highlighter feature modeling, 2nd feature selection 
feature space features creation of overview, 2nd, 3rd feedback at search box search completion search suggestions search-as-you-type business and domain awareness content curation risk of miscommunication with content curator role of content curator in search results listing grouping similar documents information presented snippet highlighting when there are no results search relevance and Solr faceted browsing field collapsing match phrase prefix relevance feedback feature mappings suggestion and highlighting components while browsing alternative results ordering breadcrumb navigation faceted browsing field boosts field collapsing overview Solr field discordance field mappings field normalization field scores, 2nd field synchronicity, signal modeling and field-by-field dismax field-centric methods, 2nd field-centric search, combining term-centric search and combining greedy search and conservative amplifiers like fields precision vs. recall Solr fieldNorms, 2nd, 3rd fields fieldType field_value_factor function fieldWeight, 2nd, 3rd, 4th filter clause filter element filter queries filtering Amazon-style collaborative overview using co-occurrence counting score shaping vs. boosting finite state transducer fire token first_name field floating-point numbers fragment_size parameter fudge factors full bulk command full search string full-text search full_name field function decay function queries, multiplicative boosting with Boolean queries vs. 
combining high-value tiers scored with simple Solr function_score query, 2nd, 3rd, 4th G garbage features Gaussian decay generalizing matches generate_word_parts genres aggregation genres.name field geographic data, tokenizing geohashing geolocation, 2nd getCastAndCrew function GitHub repository granular fields grouping fields H has_discount field high-quality signals highlighted snippets highlights HTMLStripCharFilter HTTP commands, 2nd I ideal document IDF (inverse document frequency) ignoring when ranking overview, 2nd inconsistent scoring index-time analysis, 2nd index-time personalization indexing documents information and requirements gathering business needs required and available information users and information needs information retrieval, creating relevance solutions through inner objects innermost calculation integers, tokenizing inventory-related files inventory_dir configuration inverse document frequency.


The Science of Language by Noam Chomsky

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, backpropagation, British Empire, Brownian motion, Computing Machinery and Intelligence, dark matter, Drosophila, epigenetics, finite state, Great Leap Forward, Howard Zinn, language acquisition, phenotype, public intellectual, statistical model, stem cell, Steven Pinker, Stuart Kauffman, theory of mind, trolley problem

On the first question, it is important to recognize that it is difficult to avoid hierarchy. The successor function for generating the natural numbers introduces it, and in fact any finite state grammar introduces hierarchy: each additional element added in generating a string of words or other elements yields a larger set of which the prior-generated set is a subset. If one adds the operation of associativity, one can make the hierarchy disappear and end up with a string, but that requires an additional operation – and justification for introducing it. I do not mention a finite state grammar because it introduces the right hierarchy for natural languages. Chomsky demonstrated long ago that it did not (1957; see also Chomsky & Miller 1963).

R. 26, 33, 197, 203, 268 Gandhi, Mahatma 114, 144 garden path sentences 50 Gauss, Carl F. 127 Gehring, Walter 46, 171, 258, 279 genes 46–49, 148, 173; master genes 279, 280 and Merge 49 PAX-6 46, 280 universal genome 53 Gleitman, Lila 196 Godwin, Richard 123 Golinkoff, Roberta 196 Goodman, Nelson 81, 83, 88, 261, 285; behaviorism 89, 285 and Chomsky 86–92 constructivism 285 nominalism 87, 91 personal relationship with Chomsky 91–92 Gould, Stephen J. 158, 170, 172, 173 grammar 277–278 and acquisition of language 24, 60 artifacts in theories of 238 extensional equivalence of 153 finite state 232 generative 63, 85, 91, 96, 99 generative capacity 236 phrase structure 233, 235 structure and hierarchy 236 transformational 25 ‘great leap forward' 13, 70, 179 growth 40, 73, 77; cognitive growth 121 developmental constraints on 41, 45, 158 Haldane, J. B. S. 51, 53 Hale, Kenneth 17, 62 Halle, Morris 21 Hamilton, William D. 104 Harman, Gilbert 100 Harris, Zellig 38, 80, 81, 86 Hauser, Marc 100, 109, 286; evolution of communication 12, 58 faculty of language 60, 170, 172, 268, 269 hearing 48 Helmholtz, Hermann von 73, 97 Herbert of Cherbury 181 Higginbotham, Jim 129, 130 Hirsh-Pasek, Kathy 196 homunculus 37, 290 Hornstein, Norbert 29, 183, 265 human behavior 138–151, 286 human evolution 2, 13, 71; developmental constraints on 41 ‘great leap forward' 13, 70, 77 human nature 95–102, 108–112 and biological capacities 95 Chomsky on 95–102 determined and uniform 95, 99 distinctiveness of 176–179 enlightenment conception of 142 and evolution 103–107 ‘great leap forward' 179 moral agency 101 plasticity of 121 humanitarian intervention 121, 122, 287 humans, genetic variation 13 Hume, David 26, 90, 99, 106, 179; color problem 247–248, 286 theory of moral nature 63, 99, 109 Huxley, Thomas 23 I-beliefs 153–156 definition of 156 I-concepts 153–156 definition of 155 I-language 81, 153–156, 164, 239, 258, 266; intensional specification of 167 imagination 70, 161 inclusiveness 62, 281 induction 88, 90, 95
inference 73, 165, 221 information 208, 213, 218, 228, 229, 254; pragmatic 30 semantic 29, 260 innateness 39–45, 60, 89, 91, 255, 267, 284 innatism 123 innovation 71, 74, 95, 177, 178, 185, 282; technological 145 insects, study of 147 instinct 96, 143, 178, 181, 247, 248, 287 instrumentalism 211 intention (see also nativism) 163 internalism 6, 228, 248, 262–263, 269, 287; and concepts 188, 190, 209, 255–257, 260, 272 intuitions 125, 126 island sentences 50 Jackendoff, Ray 170, 172 Jacob, François 24, 53, 60, 243 Joos, Martin 145 justice 120 Kahneman, Daniel 140 Kant, Immanuel 90 Kauffman, Stuart 21, 22, 266 Kayne, Richard 55, 84, 241 Keller, Helen 45 Kissinger, Henry 101, 107, 113, 287 Klein, Ralph 111 knowledge 70, 193. See also information Kripke, Saul 126 Kropotkin, Peter 103, 111 language; and agency 124–128 as an animal instinct 178 and arithmetical capacities 16 and biology 21–30, 80, 235, 284 biophysical explanations of 208 and brain morphology 46 capacity for 70, 164 characteristic uses of 11–12 cognitive benefits of 2 competence and use 63 and complex thought 1 complexity of 52, 146 compositional character of 37 computational theory of 174, 272 and concepts 71, 198 conceptual resources of 212 displacement property 16 distinctive features 22 domination 232–238 expectations for 54 externalization of 52, 78, 79, 153, 222, 278 flexibility 95, 162, 197, 210, 224, 227 formal languages 16, 17, 289 formal theory of 21–30 functions of 11–20, 164, 165 generative capacity 49 head-first 240 hierarchical structure 232–238 I-language 153–156, 164, 239, 258, 266 interface conditions 25 internal 37 internal, individual and intensional 37, 154, 167 internal use of 52, 69, 124, 153, 160, 197, 262–263, 272–274 a ‘knowledge' system 187, 193 localization of 46, 59, 69–74 and mathematics 181 modularity of 59 movement property 16, 85, 108, 264–265 as a natural object 2, 7 nominalizing languages 155 open texture of 273 and other cognitive systems 271 phonetic features 42 phonological
features 42, 57 precursors of 43, 77 properties of 22, 37, 60, 62 public language 153, 288 purposes of 224 and reason 181 result of historical events 84 rules of 165, 221, 223, 224, 225, 283, 284 and science 124–128 sounds available in 282 structural features of 42 structure of 236, 277–278 study of 36, 76, 79, 154. See also linguistics theories of 164, 193, 239, 243, 285 unboundedness 177, 262 uniqueness to humans 150 variation in the use of 164, 239–242 language faculty 74, 172, 177, 243, 260, 261, 270; adicity requirements of 198, 199 perfection of 50 language of thought 27, 71, 189, 190, 220, 230, 269 Lasnik, Howard 85 learning 95, 180, 200, 226, 281, 282; empiricism and 173, 179 learning a language 187, 225, 226 Lenneberg, Eric 21, 43, 47, 59 Lepore, E. 195 Lewis, David 153, 165, 220, 222, 223, 224 Lewontin, Richard 58, 157, 170, 172, 173, 175, 231 lexical items 62; categories of 234 origin of 46 liberalism 98 linguistic communities 222 linguistic development 39. See also development linguistic practices 221, 223 linguistic principles 237, 276 linguistics 19, 36, 82, 145; and biology 150 first factor considerations 45, 96, 148 and natural science 38 and politics 152 procedural theories in 149 second factor considerations 148, 277 structural 80 theories of 87, 265 third factor considerations: separate entry Locke, John 26, 125, 267; personal identity 31, 271 secondary qualities 256 logic, formal 251 Logical Structure of Linguistic Theory 84–85 Lohndal, Terje 57 Lorenz, Konrad 21 Marx, Karl 122 mathematics 127, 165, 214, 215, 266; capacity for 15, 136 formal functions in 166–169 and language 181 semantics for 251, 252 Mayr, Ernst 174 meaning 29, 98, 199, 206, 250, 252, 270, 273; computational theory of 213 construction of a science of 226–230 externalist science of 209–220 methodology for a theory of 226, 227 study of 261 theories of 221 theory of 212, 214, 217, 226 Mehler, Jacques 55 Merge 16, 77, 91, 181, 236, 243, 263, 279–280 centrality of 41, 60, 62, 176, 245 consequences of
17 and edge properties 17, 41 Merge, external 17, 166, 201, 238, 263 Merge, internal 16, 25, 29, 85, 201, 238, 264 mutation giving rise to 43, 52 origin of 14, 15 Pair Merge 201, 264 and psychic identity 28 uniqueness to humans 25, 200, 205 metaphor 195 metaphysics 125, 157 Mikhail, John 63, 99, 100, 109, 129, 286 Mill, John Stuart 121, 122, 287 Miller, George 81 mind; as a causal mechanism 138 computational sciences of 247 computational theory of 280 philosophy of 186, 255 place of language in 69–74 representational theory of 162, 188 science of 138–151, 212, 288 theory of 14 Minimalist Program 24, 83, 84, 233, 235–236, 237, 245, 246, 264; and adaptationism 172 aim of 42, 199 simplicity and 80, 243, 285 modes of presentation (MOPs) 187, 190, 217, 219, 275; roles of 218 morality 99, 100, 109, 287; character of 110 conflicting systems 114 generation of action or judgment 110 moral truisms 101, 102 theories of 110, 135 trolley problems 109 and universalization 113–117 Moravcsik, Julius 164 morphemes 81, 149 morphology 52, 54, 195; distributed 27 and syntax 200 Morris, Charles 250 Move 108 mutations 14, 43, 170, 171; survival of 51, 53 mysterianism 97 Nagel, Thomas 98 Narita, Hiroki 57 nativism 187, 217, 283 natural numbers 204 natural sciences 18, 38 natural selection 58, 76, 104, 143, 157 Navajo language 277 neural networks 225 neurophysiology 74 Newton, Isaac 66, 67, 72, 88, 127, 134; alchemy 67 nominalism 87, 91 non-violence 114 Norman Conquest 84 objective existence 169 optimism 118–123, 288 parameters 39–45, 54, 239–242, 277, 282, 283; and acquisition of language 241 choice of 45, 83 developmental constraints in 243 functional categories 240 head-final 55, 240 headedness macroparameter 241, 276 linearization parameter 55 macroparameters 55 microparameters 55, 84, 241 polysynthesis 55 and simplicity 80 Peck, James 288 Peirce, Charles Sanders 96, 132, 184, 250; abduction 168, 183, 246, 248 truth 133, 136 perfection 50–58, 172, 175, 263–264, 279 person, concept of 125, 126, 271,
284; ‘forensic' notion of 125 persuasion 114, 116 Pesetsky, David 30 Petitto, Laura-Ann 48, 78 phenomenalism 211 philosophers 129–131, 282, 283; contribution of 129 contribution to science 129 philosophy 181; accounts of visual sensations 255–257 of language 35, 273 of mind 186, 255 problems in 286 and psychology 140 phonemes 81 phonetic/phonological interfaces 161, 194, 253, 278 phonology 28, 40, 52, 54, 57, 109, 208 physicalism 187 physics 19, 65, 106, 144; and chemistry 65 folk physics 72 theoretical 18, 65, 73, 100 Piattelli-Palmarini, Massimo 140, 246, 279 Pietroski, Paul; concepts 47, 199, 200, 209 semantics 198, 211, 223, 229, 254 Pinker, Steven 166, 170, 172, 176 Pirahã language 30 Plato 115 Plato's Problem 23, 195, 236, 244, 246, 266 Poincaré, Henri 65 politics 116, 119, 145, 146, 152 poverty of the stimulus observations 5, 23, 40, 177, 200, 227, 233, 262 power 120 pragmatic information 30 pragmatics 36, 130, 250–254, 289; definition of 250 and reference 253 principles and parameters approach to linguistic theory 24, 53, 235, 236, 240, 245, 276; language acquisition 60, 82, 83, 149 and simplicity 246 progress 118, 145, 183 projection problem 83, 89 prosody 37 psychic continuity 26, 205, 207, 271 psychology 219; of belief and desire 138, 141 comparative 21 evolutionary 103–107, 111 folk psychology 72, 141 and philosophy 140 rationalistic 255 scientific 140 psychology, comparative 21 public intellectuals 122 Pustejovsky, James 164, 195 Putnam, Hilary 95, 126, 138 Quine, W.


pages: 721 words: 197,134

Data Mining: Concepts, Models, Methods, and Algorithms by Mehmed Kantardzić

Albert Einstein, algorithmic bias, backpropagation, bioinformatics, business cycle, business intelligence, business process, butter production in bangladesh, combinatorial explosion, computer vision, conceptual framework, correlation coefficient, correlation does not imply causation, data acquisition, discrete time, El Camino Real, fault tolerance, finite state, Gini coefficient, information retrieval, Internet Archive, inventory management, iterative process, knowledge worker, linked data, loose coupling, Menlo Park, natural language processing, Netflix Prize, NP-complete, PageRank, pattern recognition, peer-to-peer, phenotype, random walk, RFID, semantic web, speech recognition, statistical model, Telecommunications Act of 1996, telemarketer, text mining, traveling salesman, web application

Matching and discovery of such patterns are useful in many applications, not only in bioinformatics. Because of their readily interpretable structure, patterns play a particularly prominent role in data mining. Many techniques have been used to model global or local temporal events; we will introduce only some of the most popular. A Finite State Machine (FSM) has a set of states and a set of transitions. A state may have transitions to other states, triggered when certain conditions are fulfilled within the state. An FSM must have an initial state, usually drawn with an incoming arrow, which provides the starting point of the model.

An FSM can be represented using a state-transition table or a state-transition diagram. Figure 12.20a,b shows both representations for a machine that recognizes binary numbers containing an even number of ones. An FSM does not work well when the transitions are imprecise, and it does not scale well when the set of symbols used to represent sequences is large. Figure 12.20. Finite-state machine. (a) State-transition table; (b) state-transition diagram. The Markov Model (MM) extends the basic idea behind the FSM. Both FSM and MM are directed graphs. As with an FSM, an MM always has a current state. Start and end nodes are drawn for illustrative purposes and need not be present. Unlike in an FSM, transitions are not associated with specific input values.
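The table-driven machine just described can be sketched in a few lines of Python. This is a minimal sketch, assuming two states that track the parity of the ones read so far; the state names S0/S1 and the dictionary layout are illustrative, not taken from Figure 12.20.

```python
# State-transition table for "even number of ones", as a dictionary:
# (current state, input symbol) -> next state.
TRANSITIONS = {
    ("S0", "0"): "S0",  # even count of ones, read 0 -> parity unchanged
    ("S0", "1"): "S1",  # even -> odd
    ("S1", "0"): "S1",  # odd count of ones, read 0 -> parity unchanged
    ("S1", "1"): "S0",  # odd -> even
}
START = "S0"
ACCEPTING = {"S0"}  # accept iff the number of ones is even

def accepts(binary_string):
    state = START
    for symbol in binary_string:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

print(accepts("1001"))  # True: two ones
print(accepts("1011"))  # False: three ones
```

The dictionary plays the role of the state-transition table; the state-transition diagram is the same information drawn as a directed graph.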

The task of sequence mining is to systematically discover all sequential patterns in database D. Counting frequencies of parallel itemsets is straightforward and is described in traditional algorithms for frequent-itemset detection. Counting serial itemsets, on the other hand, requires more computational resources. For example, unlike for parallel itemsets, we need finite-state automata to recognize serial episodes. More specifically, an appropriate l-state automaton can be used to recognize occurrences of an l-node serial sequence. For example, for the sequence (A → B → A → A), there would be a five-state automaton (FSA), given in Figure 12.23. It transits from its first state on seeing an event of type A, then waits for an event of type B to transit to its next state, and so on.
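A minimal sketch of such an episode-recognizing automaton, where state i means the first i episode events have been seen in order and reaching the final state recognizes one occurrence. The reset-and-continue counting below is a simplifying assumption, not necessarily the chapter's exact counting semantics.

```python
EPISODE = ["A", "B", "A", "A"]  # the serial episode A -> B -> A -> A

def count_occurrences(events, episode=EPISODE):
    state = 0   # how many episode events have been matched so far
    count = 0
    for event in events:
        if event == episode[state]:
            state += 1
            if state == len(episode):   # final (fifth) state reached
                count += 1
                state = 0               # assumption: restart for the next occurrence
        # otherwise the automaton simply waits in its current state
    return count

print(count_occurrences(list("ABCAABAA")))  # 1
```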


pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI by John Brockman

AI winter, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alignment Problem, AlphaGo, artificial general intelligence, Asilomar, autonomous vehicles, basic income, Benoit Mandelbrot, Bill Joy: nanobots, Bletchley Park, Buckminster Fuller, cellular automata, Claude Shannon: information theory, Computing Machinery and Intelligence, CRISPR, Daniel Kahneman / Amos Tversky, Danny Hillis, data science, David Graeber, deep learning, DeepMind, Demis Hassabis, easy for humans, difficult for computers, Elon Musk, Eratosthenes, Ernest Rutherford, fake news, finite state, friendly AI, future of work, Geoffrey Hinton, Geoffrey West, Santa Fe Institute, gig economy, Hans Moravec, heat death of the universe, hype cycle, income inequality, industrial robot, information retrieval, invention of writing, it is difficult to get a man to understand something, when his salary depends on his not understanding it, James Watt: steam engine, Jeff Hawkins, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kickstarter, Laplace demon, Large Hadron Collider, Loebner Prize, machine translation, market fundamentalism, Marshall McLuhan, Menlo Park, military-industrial complex, mirror neurons, Nick Bostrom, Norbert Wiener, OpenAI, optical character recognition, paperclip maximiser, pattern recognition, personalized medicine, Picturephone, profit maximization, profit motive, public intellectual, quantum cryptography, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, Richard Feynman, Rodney Brooks, self-driving car, sexual politics, Silicon Valley, Skype, social graph, speech recognition, statistical model, Stephen Hawking, Steven Pinker, Stewart Brand, strong AI, superintelligent machines, supervolcano, synthetic biology, systems thinking, technological determinism, technological singularity, technoutopianism, TED Talk, 
telemarketer, telerobotics, The future is already here, the long tail, the scientific method, theory of mind, trolley problem, Turing machine, Turing test, universal basic income, Upton Sinclair, Von Neumann architecture, Whole Earth Catalog, Y2K, you are the product, zero-sum game

While we argue about the intelligence of digital computers, analog computing is quietly supervening upon the digital, in the same way that analog components like vacuum tubes were repurposed to build digital computers in the aftermath of World War II. Individually deterministic finite-state processors, running finite codes, are forming large-scale, nondeterministic, non-finite-state metazoan organisms running wild in the real world. The resulting hybrid analog/digital systems treat streams of bits collectively, the way the flow of electrons is treated in a vacuum tube, rather than individually, as bits are treated by the discrete-state devices generating the flow.


Design Patterns: Elements of Reusable Object-Oriented Software (Joanne Romanovich's Library) by Erich Gamma, Richard Helm, Ralph Johnson, John Vlissides

A Pattern Language, Donald Knuth, financial engineering, finite state, Ivan Sutherland, L Peter Deutsch, loose coupling, MVC pattern, yield curve

Decorator Pattern The Decorator (175) pattern captures class and object relationships that support embellishment by transparent enclosure. The term “embellishment” actually has broader meaning than what we’ve considered here. In the Decorator pattern, embellishment refers to anything that adds responsibilities to an object. We can think for example of embellishing an abstract syntax tree with semantic actions, a finite state automaton with new transitions, or a network of persistent objects with attribute tags. Decorator generalizes the approach we’ve used in Lexi to make it more widely applicable. 2.5 Supporting Multiple Look-and-Feel Standards Achieving portability across hardware and software platforms is a major problem in system design.
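The "embellishing a finite state automaton with new transitions" example can be sketched concretely: a decorator conforms to the automaton's interface and adds transitions by transparent enclosure, leaving the wrapped object unchanged. The class and method names below are illustrative, not the book's C++ interfaces.

```python
class Automaton:
    def __init__(self, transitions):
        self._transitions = transitions   # {(state, symbol): next_state}

    def step(self, state, symbol):
        return self._transitions.get((state, symbol))

class WithExtraTransitions:
    """Decorator: same interface, consults its own table before delegating."""
    def __init__(self, component, extra):
        self._component = component
        self._extra = extra

    def step(self, state, symbol):
        nxt = self._extra.get((state, symbol))
        return nxt if nxt is not None else self._component.step(state, symbol)

base = Automaton({("q0", "a"): "q1"})
decorated = WithExtraTransitions(base, {("q1", "b"): "q2"})
print(decorated.step("q0", "a"))  # q1 (delegated to the wrapped automaton)
print(decorated.step("q1", "b"))  # q2 (added by the decorator)
```

Because the decorator and the automaton share one interface, clients cannot tell whether they hold the embellished or the plain object.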

It takes inputState as an argument representing the current state of the matching process, having read part of the input string. This current state is characterized by a set of input streams representing the set of inputs that the regular expression could have accepted so far. (This is roughly equivalent to recording all the states that the equivalent finite state automaton would be in, having recognized the input stream to this point.) The current state is most important to the repeat operation. For example, if the regular expression were 'a' repeat, then the interpreter could match "a", "aa", "aaa", and so on. If it were 'a' repeat & 'bc', then it could match "abc", "aabc", "aaabc", and so on.
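The set-of-states idea can be sketched in Python: each expression becomes a function from a set of input positions to the set of positions reachable after consuming it, so the "current state" of a match is literally a set. The combinator names below are illustrative; the book's Smalltalk interface (stream-based inputState objects) is simplified away.

```python
def literal(ch):
    """Match one character: advance every position where `ch` occurs."""
    def match(positions, text):
        return {p + 1 for p in positions if p < len(text) and text[p] == ch}
    return match

def seq(first, second):
    """Sequencing: feed first's result states into second."""
    return lambda positions, text: second(first(positions, text), text)

def repeat(expr):
    """Kleene-style repeat: accumulate states over 0, 1, 2, ... repetitions."""
    def match(positions, text):
        reached = set(positions)      # zero repetitions are allowed
        frontier = set(positions)
        while frontier:
            frontier = expr(frontier, text) - reached
            reached |= frontier
        return reached
    return match

# 'a' repeat & 'bc': matches "abc", "aabc", "aaabc", ...
pattern = seq(repeat(literal("a")), seq(literal("b"), literal("c")))
text = "aabc"
print(len(text) in pattern({0}, text))  # True: some parse consumed the whole string
```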


pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, backpropagation, Bletchley Park, British Empire, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, computer age, Computing Machinery and Intelligence, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, independent contractor, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, launch on warning, low earth orbit, machine readable, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, synthetic biology, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

Newman said, “was like catching mice just as they were entering a hole in the wall.”38 A series of electrical pulses, about a microsecond apart, were converted to a train of sound waves circulating in a long tube of mercury equipped with crystal transducers at both ends. About a thousand digits could be stored in the millisecond it took a train of pulses to travel the length of a five-foot “tank.” Viewed as part of a finite-state Turing machine, the delay line represented a continuous loop of tape, a thousand squares in length and making a thousand complete passes per second under the read-write head. Turing specified some two hundred tubes, each storing thirty-two words of 32 bits, for a total, “comparable with the memory capacity of a minnow,” of about 200,000 bits.39 “The property of being digital,” announced Turing to the London Mathematical Society in a 1947 lecture on his design, “should be of more interest than that of being electronic.”40 Whether memory took the form of paper tape, vacuum-tube flip-flops, mercury pulse trains, or even papyrus scrolls did not matter as long as discrete symbols could be freely read, written, relocated, and, when so instructed, erased.
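The "about 200,000 bits" can be checked directly from the figures Turing specified:

```python
# 200 delay-line tubes, each storing 32 words of 32 bits.
tubes, words_per_tube, bits_per_word = 200, 32, 32
total_bits = tubes * words_per_tube * bits_per_word
print(total_bits)  # 204800, i.e. "about 200,000 bits"
```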

See Institute for Advanced Study IAS (Institute for Advanced Study) computer, xii, 78–79, 91–92, 93–107 as ancestor of the microprocessor, 98, 203 and artificial life, xii, 111–18, 121, 124–26, 129, 192. see also Barricelli construction and operation, 97–107, 111 and digital computing at RAND, 104, 148, 178 duplication of, 97, 98, 107 logical and physical architecture, 98, 99–107, 157 and nuclear weapons, 78–79, 91–92, 107, 111 and origins of IBM model 701, 91, 106 origins of, and weather prediction, 87–88 peripheral equipment, 98, 101–102, 106, 144 programming of, 102, 106–107, 114, 121, 130 progress reports, and impact of, 98, 99, 121 and random-access memory, 98, 103–105, 113 shakedown run, 78–79, 111 siblings and offspring, listed, 97 and von Neumann, 78–79, 87–88, 91–92, 97, 98–102, 106–108, 125, 153 IBM (International Business Machines) 12, 91, 103–104, 106, 122, 144, 148, 179. see also SAGE and evolution of operating systems, 122, 189 and IAS computer project, 91, 106 and punched-card computing, 60, 78, 81, 82, 83, 106, 122, 144 and von Neumann, 91 IBM computers: model 650, 122; model 701, 91, 106, 178; model 704, 118, 184; model 7090, 151, 182 iconoscope, 85, 104 ideas. see also consciousness; meaning; mind Darwinian evolution of, 28, 184 and formal logic, 38, 43, 46, 49, 129 nature of, 136, 158, 225 IFF (Identification Friend or Foe) radar, 104 Illinois, University of, 107 immortality, and composite organisms, 175, 191, 210 and non-Darwinian evolution, 31 improbability, and origins of life, 29–30, 112, 177 incompleteness (mathematical), 49–50, 53–54, 70, 72, 78, 120, 167, 228 Industrial Revolution, 21–22, 134 infinity, and finite-state machines, 10, 35, 43, 56, 130, 190 information. 
see also bandwidth; bits; communication; cybernetics; telecommunication and cybernetics, 6, 98, 101 defined, by Bateson, 167 flow, in data networks, 12, 110, 150, 158–59, 205 mathematical theory of, 110, 153, 155 and meaning, 8, 155, 158, 167, 171, 184–85 and money, 162, 165 and origins of life, 12, 29 insects, 8, 13, 129, 170, 174, 210 Instinct and Reason (Smee), 45, 48 Institute for Advanced Study (IAS), Princeton, N.J.


pages: 1,172 words: 114,305

New Laws of Robotics: Defending Human Expertise in the Age of AI by Frank Pasquale

affirmative action, Affordable Care Act / Obamacare, Airbnb, algorithmic bias, Amazon Mechanical Turk, Anthropocene, augmented reality, Automated Insights, autonomous vehicles, basic income, battle of ideas, Bernie Sanders, Big Tech, Bill Joy: nanobots, bitcoin, blockchain, Brexit referendum, call centre, Cambridge Analytica, carbon tax, citizen journalism, Clayton Christensen, collective bargaining, commoditize, computer vision, conceptual framework, contact tracing, coronavirus, corporate social responsibility, correlation does not imply causation, COVID-19, critical race theory, cryptocurrency, data is the new oil, data science, decarbonisation, deep learning, deepfake, deskilling, digital divide, digital twin, disinformation, disruptive innovation, don't be evil, Donald Trump, Douglas Engelbart, driverless car, effective altruism, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, fake news, Filter Bubble, finite state, Flash crash, future of work, gamification, general purpose technology, Google Chrome, Google Glasses, Great Leap Forward, green new deal, guns versus butter model, Hans Moravec, high net worth, hiring and firing, holacracy, Ian Bogost, independent contractor, informal economy, information asymmetry, information retrieval, interchangeable parts, invisible hand, James Bridle, Jaron Lanier, job automation, John Markoff, Joi Ito, Khan Academy, knowledge economy, late capitalism, lockdown, machine readable, Marc Andreessen, Mark Zuckerberg, means of production, medical malpractice, megaproject, meta-analysis, military-industrial complex, Modern Monetary Theory, Money creation, move fast and break things, mutually assured destruction, natural language processing, new economy, Nicholas Carr, Nick Bostrom, Norbert Wiener, nuclear winter, obamacare, One Laptop per Child (OLPC), open immigration, OpenAI, opioid epidemic / opioid crisis, paperclip maximiser, paradox of thrift, pattern recognition, payday loans, personalized medicine, Peter 
Singer: altruism, Philip Mirowski, pink-collar, plutocrats, post-truth, pre–internet, profit motive, public intellectual, QR code, quantitative easing, race to the bottom, RAND corporation, Ray Kurzweil, recommendation engine, regulatory arbitrage, Robert Shiller, Rodney Brooks, Ronald Reagan, self-driving car, sentiment analysis, Shoshana Zuboff, Silicon Valley, Singularitarianism, smart cities, smart contracts, software is eating the world, South China Sea, Steve Bannon, Strategic Defense Initiative, surveillance capitalism, Susan Wojcicki, tacit knowledge, TaskRabbit, technological solutionism, technoutopianism, TED Talk, telepresence, telerobotics, The Future of Employment, The Turner Diaries, Therac-25, Thorstein Veblen, too big to fail, Turing test, universal basic income, unorthodox policies, wage slave, Watson beat the top human players on Jeopardy!, working poor, workplace surveillance , Works Progress Administration, zero day

Xi’s “use of military power has undermined the larger strategic goal it was meant to achieve: a stable neighborhood. Countries like Japan and Vietnam have not backed down when threatened by fishing militias and aircraft carriers. Instead, China’s forcefulness seems to have spurred their quests for military power.” 75. Henry Farrell, “Seeing Like a Finite State Machine,” Crooked Timber (blog), November 25, 2019, http://crookedtimber.org/2019/11/25/seeing-like-a-finite-state-machine/. 76. Robert A. Burton, “Donald Trump, Our A.I. President,” New York Times, May 22, 2017, https://www.nytimes.com/2017/05/22/opinion/donald-trump-our-ai-president.html. 77. John Glaser, Christopher H. Preble, and A. Trevor Thrall, Fuel to the Fire: How Trump Made America’s Broken Foreign Policy Even Worse (and How We Can Recover) (Washington, DC: Cato Institute, 2019). 78.


pages: 249 words: 45,639

Learn Python the Hard Way by Zed Shaw

complexity theory, finite state, functional programming, index card, web application

Once you have doc comments as the room description, do you need to have the function prompt even? Have the runner prompt the user, and pass that in to each function. Your functions should just be if-statements printing the result and returning the next room. This is actually a small version of something called a "finite state machine". Read about them. They might not make sense but try anyway. © Copyright 2010, Zed A. Shaw. Last updated on Jun 24, 2011. Exercise 42: Gothons Are Getting Classy While it's fun to put functions inside of dictionaries, you'd think there'd be something in Python that does this for you.
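A hedged sketch of the exercise's idea: each room is a function of the player's choice that returns the next room function, which is exactly a small finite state machine. The room names and choices here are illustrative, not the book's Gothon game.

```python
def central_corridor(choice):
    if choice == "left":
        print("You head to the armory.")
        return armory
    print("You head for the bridge.")
    return bridge

def armory(choice):
    print("You grab a weapon and move on to the bridge.")
    return bridge

def bridge(choice):
    print("The game ends here.")
    return None  # terminal state

def runner(room, choices):
    # In the exercise the runner would prompt the user; here choices are a list.
    for choice in choices:
        if room is None:
            break
        room = room(choice)
    return room

end = runner(central_corridor, ["left", "go"])
print(end is bridge)  # True: the walk stopped at the bridge
```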


pages: 214 words: 14,382

Monadic Design Patterns for the Web by L.G. Meredith

barriers to entry, domain-specific language, don't repeat yourself, finite state, functional programming, Georg Cantor, ghettoisation, higher-order functions, John von Neumann, Kickstarter, semantic web, seminal paper, social graph, type inference, web application, WebSocket

Section 7.6, Chapter 7 · A Review of Collections as Monads: Kleene star and other connectives. Now, since these sets of concatenations of symbols (aka words) are often infinitary, some means of representing them without having to explicitly or extensionally enumerate their constituents have developed. Most programmers are familiar with the idea of regular expressions and the corresponding regular languages built from very simple operators: concatenation (ab), choice (a + b), and iteration (a∗) – sometimes called Kleene star. These operators actually denote many things at once: • Finite state machines that can recognize when a concatenation of symbols has a corresponding form • That form's sets of words; thus, (ab)∗, for example, denotes the set of all words {ε, ab, abab, ababab, ...} • The means of generating this set of words. The operators form a kind of algebra over regular expressions, and the algebraic operations correspond to operations on the things the regular expressions denote – thus, for example, the choice operator corresponds to the union of sets.
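A small illustration of that triple denotation: one expression, (ab)∗, serves as a recognizer (here the finite-state matcher compiled by Python's `re` module), names the set of words {ε, ab, abab, ...}, and gives a means of generating that set.

```python
import re

PATTERN = r"(ab)*"

def generate(n):
    """The first n words of the language denoted by (ab)*."""
    return ["ab" * i for i in range(n)]

words = generate(4)  # ['', 'ab', 'abab', 'ababab']
print(all(re.fullmatch(PATTERN, w) is not None for w in words))  # True
print(re.fullmatch(PATTERN, "aba") is not None)                  # False: not in the language
```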


Speaking Code: Coding as Aesthetic and Political Expression by Geoff Cox, Alex McLean

4chan, Amazon Mechanical Turk, augmented reality, bash_history, bitcoin, Charles Babbage, cloud computing, commons-based peer production, computer age, computer vision, Computing Machinery and Intelligence, crowdsourcing, dematerialisation, Donald Knuth, Douglas Hofstadter, en.wikipedia.org, Everything should be made as simple as possible, finite state, Free Software Foundation, Gabriella Coleman, Gödel, Escher, Bach, Hacker Conference 1984, Ian Bogost, Jacques de Vaucanson, language acquisition, Larry Wall, late capitalism, means of production, natural language processing, Neal Stephenson, new economy, Norbert Wiener, Occupy movement, packet switching, peer-to-peer, power law, Richard Stallman, Ronald Coase, Slavoj Žižek, social software, social web, software studies, speech recognition, SQL injection, stem cell, Stewart Brand, systems thinking, The Nature of the Firm, Turing machine, Turing test, Vilfredo Pareto, We are Anonymous. We are Legion, We are the 99%, WikiLeaks, Yochai Benkler

Chomsky assumes that somehow grammar is given in advance, hard-wired or preprogrammed: that humans possess innate grammatical competence that is presocial (and one of the controversies of Chomsky’s system is its separation of consciousness from the outside social world). Thus he explains the deep-seated rules by which language operates, and the principles and processes by which sentences are generated. In procedural terms, a natural language can be simply described as a “finite state Markov process that produces sentences from left to right.”10 In mathematics and other formalized systems, the logic adheres to similar grammars that generate their own grammatical sequences. Chomsky’s conclusion is that grammar is independent of meaning, and that the separation from semantics is essential to the study of syntactic structure.
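A toy finite-state Markov process of the kind the passage describes: the machine moves left to right from state to state, emitting one word per transition. The tiny grammar below, producing only "the man comes" and "the men come", echoes Chomsky's own two-sentence finite-state example; the state names are illustrative.

```python
import random

TRANSITIONS = {
    "start": [("the", "q1")],
    "q1":    [("man", "sg"), ("men", "pl")],
    "sg":    [("comes", "end")],
    "pl":    [("come", "end")],
}

def generate_sentence(rng=random):
    state, words = "start", []
    while state != "end":
        word, state = rng.choice(TRANSITIONS[state])  # move left to right
        words.append(word)
    return " ".join(words)

print(generate_sentence())  # either "the man comes" or "the men come"
```

Adding a loop back from a later state to an earlier one is what turns this finite machine into a generator of infinitely many sentences.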


pages: 504 words: 89,238

Natural language processing with Python by Steven Bird, Ewan Klein, Edward Loper

bioinformatics, business intelligence, business logic, Computing Machinery and Intelligence, conceptual framework, Donald Knuth, duck typing, elephant in my pajamas, en.wikipedia.org, finite state, Firefox, functional programming, Guido van Rossum, higher-order functions, information retrieval, language acquisition, lolcat, machine translation, Menlo Park, natural language processing, P = NP, search inside the book, sparse data, speech recognition, statistical model, text mining, Turing test, W. E. B. Du Bois

Contributions in the following areas are particularly encouraged (Afterword: The Language Challenge). Phonology and morphology: Computational approaches to the study of sound patterns and word structures typically use a finite-state toolkit. Phenomena such as suppletion and non-concatenative morphology are difficult to address using the string-processing methods we have been studying. The technical challenge is not only to link NLTK to a high-performance finite-state toolkit, but to avoid duplication of lexical data and to link the morphosyntactic features needed by morph analyzers and syntactic parsers. High-performance components: Some NLP tasks are too computationally intensive for pure Python implementations to be feasible.


Turing's Cathedral by George Dyson

1919 Motor Transport Corps convoy, Abraham Wald, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Benoit Mandelbrot, Bletchley Park, British Empire, Brownian motion, cellular automata, Charles Babbage, cloud computing, computer age, Computing Machinery and Intelligence, Danny Hillis, dark matter, double helix, Dr. Strangelove, fault tolerance, Fellow of the Royal Society, finite state, Ford Model T, Georg Cantor, Henri Poincaré, Herman Kahn, housing crisis, IFF: identification friend or foe, indoor plumbing, Isaac Newton, Jacquard loom, John von Neumann, machine readable, mandelbrot fractal, Menlo Park, Murray Gell-Mann, Neal Stephenson, Norbert Wiener, Norman Macrae, packet switching, pattern recognition, Paul Erdős, Paul Samuelson, phenotype, planetary scale, RAND corporation, random walk, Richard Feynman, SETI@home, social graph, speech recognition, The Theory of the Leisure Class by Thorstein Veblen, Thorstein Veblen, Turing complete, Turing machine, Von Neumann architecture

He delivered a complete description of a million-cycle-per-second Automatic Computing Engine (ACE), accompanied by circuit diagrams, a detailed physical and logical analysis of the internal storage system, sample programs, detailed (if bug-ridden) subroutines, and even an estimated cost of £11,200.41 As Sara Turing later explained, her son’s goal was “to see his logical theory of a universal machine, previously set out in his paper ‘Computable Numbers,’ take concrete form.”42 After a comparison of available forms of storage, ranging from punched paper tape through “cerebral cortex” to electrostatic storage tubes, Turing specified mercury-filled acoustic delay lines for high-speed storage. He estimated cost, access time, and “spacial economy” (in digits/liter) for all forms of storage, putting the cost of a cerebral cortex at £300 per annum—his King’s College fellowship for the year. Viewed as part of a finite-state Turing machine, the delay line represented a continuous loop of tape, 1,000 squares in length and making 1,000 complete passes per second under the read/write head. Turing specified some 200 tubes, each storing 32 words of 32 bits each, for a total, “comparable with the memory capacity of a minnow,” of about 200,000 bits.43 Taking ten pages of the proposal to do so, Turing worked out the storage capacity, attenuation, noise, temperature sensitivity, and regenerative requirements, all from first principles.

Data that are widely replicated, or associated frequently by search requests, establish physical proximity that is manifested as proximity in time. More meaningful results appear higher on the list not only because of some mysterious, top-down, weighting algorithm, but because when microseconds count, they are closer, from the bottom up, in time. Meaning just seems to “come to mind” first. An Internet search engine is a finite-state, deterministic machine, except at those junctures where people, individually and collectively, make a nondeterministic choice as to which results are selected as meaningful and given a click. These clicks are then immediately incorporated into the state of the deterministic machine, which grows ever so incrementally more knowledgeable with every click.


pages: 259 words: 67,456

The Mythical Man-Month by Brooks, Jr. Frederick P.

Boeing 747, Conway's law, finite state, HyperCard, Ken Thompson, machine readable, Menlo Park, Multics, no silver bullet, seminal paper, sorting algorithm, speech recognition, Steve Jobs, Strategic Defense Initiative, Tacoma Narrows Bridge, Turing machine, work culture

Harel usefully defines a prototype as [A version of a program that] reflects only the design decisions made in the process of preparing the conceptual model, and not decisions driven by implementation concerns.12 It is possible to build a prototype that is not at all part of a product growing toward shipment. For example, one may build an interface prototype that has no real program function behind it, merely the finite-state machine that makes it appear to go through its paces. One can even prototype and test interfaces by the Wizard-of-Oz technique, with a concealed human simulating the system's responses. Such prototyping can be very useful for getting early user feedback, but it is quite apart from testing a product growing toward shipment.


Introducing Elixir by Simon St.Laurent, J. David Eisenberg

Debian, finite state, functional programming, higher-order functions, Pluto: dwarf planet, Ruby on Rails, web application

Some of them might restart a failed process if needed. OTP formalizes those activities, and a few more, into a set of behaviors (or behaviours—the original spelling was British). The most common behaviors are GenServer (generic server) and Supervisor. Through Erlang, you can use the gen_fsm (finite state machine) and gen_event behaviors. Elixir provides the Mix build tool for creating applications so that you can package your OTP code into a single runnable (and updatable) system. The behaviors predefine the mechanisms you’ll use to create and interact with processes, and the compiler will warn you if you’re missing some of them.


pages: 999 words: 194,942

Clojure Programming by Chas Emerick, Brian Carper, Christophe Grand

Amazon Web Services, Benoit Mandelbrot, cloud computing, cognitive load, continuous integration, database schema, domain-specific language, don't repeat yourself, drop ship, duck typing, en.wikipedia.org, failed state, finite state, Firefox, functional programming, game design, general-purpose programming language, Guido van Rossum, higher-order functions, Larry Wall, mandelbrot fractal, no silver bullet, Paul Graham, platform as a service, premature optimization, random walk, Ruby on Rails, Schrödinger's Cat, semantic web, software as a service, sorting algorithm, SQL injection, Turing complete, type inference, web application

(def crawled-urls (atom #{})) (def word-freqs (atom {})) We’ll set up a bunch of agents in order to fully utilize all the resources we have available,[157] but we need to think through what state they’ll hold and what actions will be used to transition those states. In many cases, it is useful to think about agent state and the actions applied to it as forming a finite state machine; we’ve already walked through the workflow of our crawler, but we should formalize it (Figure 4-8: a web crawler’s primary state transitions). The state of an agent at each point in this process should be obvious: prior to retrieving a URL, an agent will need a URL to retrieve (or some source of such URLs); prior to scraping, an agent will need a page’s content to scrape; and, prior to updating the cumulative crawler state, it will need the results of scraping.

It will leave the state of the agent set to a map containing the URL it pulls off the queue and its content:

(declare run process handle-results)

(defn ^::blocking get-url
  [{:keys [^BlockingQueue queue] :as state}]
  (let [url (as-url (.take queue))]
    (try
      (if (@crawled-urls url)
        state
        {:url url :content (slurp url) ::t #'process})
      (catch Exception e
        ;; skip any URL we failed to load
        state)
      (finally (run *agent*)))))

We’ll show run and explain what it’s doing in a little bit. If we’ve already crawled the URL pulled off the queue (or if we encounter an error while loading the URL’s content), we leave the agent’s state untouched. This implementation detail in our finite state machine adds a cycle to it where get-url will sometimes be invoked on a single agent multiple times before it transitions states. process will parse a URL’s content, using the links-from and words-from functions to obtain the URL’s content’s links and build a map containing the frequencies of each word found in the content.


pages: 294 words: 81,292

Our Final Invention: Artificial Intelligence and the End of the Human Era by James Barrat

AI winter, air gap, AltaVista, Amazon Web Services, artificial general intelligence, Asilomar, Automated Insights, Bayesian statistics, Bernie Madoff, Bill Joy: nanobots, Bletchley Park, brain emulation, California energy crisis, cellular automata, Chuck Templeton: OpenTable:, cloud computing, cognitive bias, commoditize, computer vision, Computing Machinery and Intelligence, cuban missile crisis, Daniel Kahneman / Amos Tversky, Danny Hillis, data acquisition, don't be evil, drone strike, dual-use technology, Extropian, finite state, Flash crash, friendly AI, friendly fire, Google Glasses, Google X / Alphabet X, Hacker News, Hans Moravec, Isaac Newton, Jaron Lanier, Jeff Hawkins, John Markoff, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, Loebner Prize, lone genius, machine translation, mutually assured destruction, natural language processing, Neil Armstrong, Nicholas Carr, Nick Bostrom, optical character recognition, PageRank, PalmPilot, paperclip maximiser, pattern recognition, Peter Thiel, precautionary principle, prisoner's dilemma, Ray Kurzweil, Recombinant DNA, Rodney Brooks, rolling blackouts, Search for Extraterrestrial Intelligence, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, smart grid, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Jurvetson, Steve Wozniak, strong AI, Stuxnet, subprime mortgage crisis, superintelligent machines, technological singularity, The Coming Technological Singularity, Thomas Bayes, traveling salesman, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, zero day

He said the halting problem is unsolvable because if the debugger encounters a halting problem in the target program, it will succumb to the infinite loop while analyzing it, and never determine if the halting problem was there. You, the programmer, would be waiting for it to come up with the answer for the same amount of time you’d wait for the original program to halt. That is, a very long time, perhaps forever. Marvin Minsky, one of the fathers of artificial intelligence, pointed out that “any finite-state machine, if left completely to itself, will fall eventually into a perfectly periodic repetitive pattern. The duration of this repeating pattern cannot exceed the number of internal states of the machine.” Translated, that means a computer of average memory, while running a program with a halting problem, would take a very long time to fall into a pattern of repetition, which could then be detected by a diagnostic program.
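Minsky's observation can be demonstrated directly: iterating any deterministic transition function over a finite state set must eventually revisit a state, after which the machine repeats forever, with a period no longer than the number of states. A minimal sketch (the particular transition function is an arbitrary example, not from the book):

```python
def find_cycle(step, start):
    """Iterate a deterministic transition function from `start` until a
    state repeats; return (steps_until_repeat, cycle_length)."""
    seen = {}  # state -> index at which it first appeared
    state, i = start, 0
    while state not in seen:
        seen[state] = i
        state = step(state)
        i += 1
    return i, i - seen[state]

# Arbitrary machine over 10 states: the repeating pattern cannot exceed 10.
steps, cycle = find_cycle(lambda s: (3 * s + 1) % 10, start=0)
print(steps, cycle)  # prints: 4 4
```

Because the state space is finite, `find_cycle` always terminates, and the cycle length it reports is bounded by the number of internal states, exactly as Minsky states.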


pages: 708 words: 223,211

The Friendly Orange Glow: The Untold Story of the PLATO System and the Dawn of Cyberculture by Brian Dear

air traffic controllers' union, AltaVista, Alvin Toffler, Apple II, Apple Newton, Buckminster Fuller, Charles Babbage, cloud computing, complexity theory, computer age, Computer Lib, conceptual framework, corporate social responsibility, disruptive innovation, Douglas Engelbart, Douglas Engelbart, Dynabook, Elon Musk, en.wikipedia.org, Fairchild Semiconductor, finite state, Future Shock, game design, Hacker News, Howard Rheingold, Ivan Sutherland, John Markoff, lateral thinking, linear programming, machine readable, Marc Andreessen, Marshall McLuhan, Menlo Park, Metcalfe’s law, Mitch Kapor, Mother of all demos, natural language processing, Neal Stephenson, Palm Treo, Plato's cave, pre–internet, publish or perish, Ralph Nader, Robert Metcalfe, Ronald Reagan, Silicon Valley, Silicon Valley startup, Skinner box, Skype, software is eating the world, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, the medium is the message, The Soul of a New Machine, three-martini lunch, Watson beat the top human players on Jeopardy!, Whole Earth Catalog

Not even CDC had been able to build such a thing on the 1604, but such was Bitzer’s unstoppable optimism that he was confident Hanson, all of nineteen, would be able to pull it off. “My work on the resident program,” says Hanson, “was built around Bitzer’s explaining a finite-state machine to me, and then expecting me to figure out the rest.” A “finite-state machine” is a concept used in electrical engineering and computer science to describe a “machine” or program that can be in one of several states until something changes causing the state to change. A simple analogy would be a traffic light, with its red, yellow, and green lights.
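The traffic-light analogy above can be sketched as a minimal finite-state machine; the particular states and transition order (red, then green, then yellow) are illustrative assumptions, not from the book:

```python
# Minimal finite-state machine for the traffic-light analogy.
# The states and their cyclic order are illustrative assumptions.
TRANSITIONS = {
    "red": "green",
    "green": "yellow",
    "yellow": "red",
}

class TrafficLight:
    def __init__(self, state="red"):
        assert state in TRANSITIONS
        self.state = state

    def tick(self):
        """Advance to the next state when the timer event fires."""
        self.state = TRANSITIONS[self.state]
        return self.state

light = TrafficLight()
print(light.tick())  # green
print(light.tick())  # yellow
print(light.tick())  # red
```

The machine is always in exactly one of a fixed, finite set of states, and an event (here, a timer tick) is the only thing that moves it to another state.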


pages: 336 words: 88,320

Being Geek: The Software Developer's Career Handbook by Michael Lopp

do what you love, finite state, game design, job satisfaction, John Gruber, knowledge worker, reality distortion field, remote working, rolodex, Silicon Valley, Silicon Valley startup, Skype, sorting algorithm, systems thinking, web application

OK, maybe I am, but we're all doing it. We're all gathering data through the course of the day and creating a story based on that data, our experience, and our moods. It's a perfectly natural phenomenon to guide the narrative in our favor. We see the world how we want. A carpenter sees all problems as a nail. I see problems as finite state machines. As we edit our days into these stories, there is always a risk of fiction. This is why you need to identify and nurture Your People. You tell these stories to Your People without reservation. Your People love your stories—fiction and all. They love how you tell them, they laugh about the lies you tell yourself, and then they stop and they tell you the truth.


High-Frequency Trading by David Easley, Marcos López de Prado, Maureen O'Hara

algorithmic trading, asset allocation, backtesting, Bear Stearns, Brownian motion, capital asset pricing model, computer vision, continuous double auction, dark matter, discrete time, finite state, fixed income, Flash crash, High speed trading, index arbitrage, information asymmetry, interest rate swap, Large Hadron Collider, latency arbitrage, margin call, market design, market fragmentation, market fundamentalism, market microstructure, martingale, National best bid and offer, natural language processing, offshore financial centre, pattern recognition, power law, price discovery process, price discrimination, price stability, proprietary trading, quantitative trading / quantitative finance, random walk, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, Tobin tax, transaction costs, two-sided market, yield curve

• Immediate market order cost: the cost we would pay for purchasing our remaining shares immediately with a market order. All of the features above were normalised in a standard fashion by subtracting their means, dividing by their standard deviations, and time-averaging over a recent interval. In order to obtain a finite state space, features were discretised into bins in multiples of standard-deviation units. Experiments can also be performed using continuous features and a parametric model representation, but these are beyond the scope of this chapter. Along with our original state variables (v, t), the features above provide a rich language for dynamically conditioning our order placement on potentially relevant properties of the order book.
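The normalise-then-discretise step described above can be sketched as follows; the clipping bound and the choice of integer truncation are assumptions for illustration, not the chapter's exact scheme:

```python
import statistics

def discretise(values, max_bin=3):
    """Z-score each value against the sample, then bin into integer
    multiples of one standard deviation, clipped to [-max_bin, max_bin].
    int() truncates toward zero, so |z| < 1 falls in bin 0."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    bins = []
    for v in values:
        z = (v - mean) / sd
        bins.append(max(-max_bin, min(max_bin, int(z))))
    return bins

# A feature series with one extreme observation lands it in a higher bin.
print(discretise([1.0, 2.0, 3.0, 2.0, 12.0]))  # [0, 0, 0, 0, 1]
```

Binning each feature into a small number of integer buckets is what makes the overall state space finite, so that tabular methods can be applied over it.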


pages: 722 words: 90,903

Practical Vim: Edit Text at the Speed of Thought by Drew Neil

Bram Moolenaar, don't repeat yourself, en.wikipedia.org, fault tolerance, finite state, fizzbuzz, off-by-one error, place-making, QWERTY keyboard, web application

Operator-Pending mode is a case in point. We use it dozens of times daily, but it usually lasts for just a fraction of a second. For example, we invoke it when we run the command dw. It lasts during the brief interval between pressing d and w keys. Blink and you’ll miss it! If we think of Vim as a finite-state machine, then Operator-Pending mode is a state that accepts only motion commands. It is activated when we invoke an operator command, and then nothing happens until we provide a motion, which completes the operation. While Operator-Pending mode is active, we can return to Normal mode in the usual manner by pressing escape, which aborts the operation.
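Viewed as a state machine, the d-then-w interaction can be sketched like this; it is a toy model of the two modes, not Vim's actual implementation, and the operator and motion sets are trimmed to a few examples:

```python
# Toy model of Normal vs. Operator-Pending mode; not Vim's real internals.
NORMAL, OPERATOR_PENDING = "normal", "operator-pending"
OPERATORS = {"d"}          # e.g. delete
MOTIONS = {"w", "b", "$"}  # e.g. word, back-word, end-of-line

def feed(state, pending, key):
    """Return (new_state, pending_operator, completed_operation)."""
    if state == NORMAL and key in OPERATORS:
        return OPERATOR_PENDING, key, None       # wait for a motion
    if state == OPERATOR_PENDING:
        if key in MOTIONS:
            return NORMAL, None, pending + key   # e.g. "dw" completes
        return NORMAL, None, None                # escape/invalid key aborts
    return NORMAL, None, None

state, pending = NORMAL, None
state, pending, op = feed(state, pending, "d")   # enters Operator-Pending
state, pending, op = feed(state, pending, "w")   # completes the operation
print(op)  # dw
```

The sketch captures the point in the excerpt: Operator-Pending is a short-lived state that accepts only motions, and any other key drops the machine back to Normal with nothing done.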


pages: 487 words: 95,085

JPod by Douglas Coupland

Asperger Syndrome, Drosophila, finite state, G4S, game design, Maui Hawaii, McMansion, neurotypical, pez dispenser, pre–internet, QWERTY keyboard, Ronald Reagan, special economic zone, sugar pill, tech worker, wage slave, Y2K

God is an Xkb state indicator God is a Window Maker docked application God is a multi-platform Z80 cross-assembler God is a lightweight XML encoding library for Java God is a programmatic API written in C++ God is Oracle's OCI8 and OCI9 APIs God is a configuration backup utility God is Web-based groupware and collaboration software God is a graphical editor for drawing finite state machines . . . Kaitlin was on the phone again, trying to extract herself from jPod. Cowboy was over by a ventilation unit, having a smoke. One of jPod's quirks is an air intake duct in front of which you can puff away on anything. Hell, you could let off an Exocet missile, and it'd suck everything up and away in a jiffy.


Practical Vim, Second Edition (for Stefano Alcazi) by Drew Neil

Bram Moolenaar, don't repeat yourself, en.wikipedia.org, fault tolerance, finite state, fizzbuzz, off-by-one error, place-making, QWERTY keyboard, web application

Operator-Pending mode is a case in point. We use it dozens of times daily, but it usually lasts for just a fraction of a second. For example, we invoke it when we run the command dw. It lasts during the brief interval between pressing d and w keys. Blink and you’ll miss it! If we think of Vim as a finite-state machine, then Operator-Pending mode is a state that accepts only motion commands. It is activated when we invoke an operator command, and then nothing happens until we provide a motion, which completes the operation. While Operator-Pending mode is active, we can return to Normal mode in the usual manner by pressing escape, which aborts the operation.


Practical Vim by Drew Neil

Bram Moolenaar, don't repeat yourself, en.wikipedia.org, fault tolerance, finite state, fizzbuzz, off-by-one error, place-making, QWERTY keyboard, web application

Operator-Pending mode is a case in point. We use it dozens of times daily, but it usually lasts for just a fraction of a second. For example, we invoke it when we run the command dw. It lasts during the brief interval between pressing d and w keys. Blink and you’ll miss it! If we think of Vim as a finite-state machine, then Operator-Pending mode is a state that accepts only motion commands. It is activated when we invoke an operator command, and then nothing happens until we provide a motion, which completes the operation. While Operator-Pending mode is active, we can return to Normal mode in the usual manner by pressing escape, which aborts the operation.


Lessons-Learned-in-Software-Testing-A-Context-Driven-Approach by Anson-QA

anti-pattern, Chuck Templeton: OpenTable:, finite state, framing effect, full employment, independent contractor, information retrieval, job automation, knowledge worker, lateral thinking, Ralph Nader, Richard Feynman, side project, Silicon Valley, statistical model, systems thinking, tacit knowledge, web application

-R- Rational Software Corporation. 2001. Rational Purify for Unix. Available at http://www.rational.com/products/purify_unix/index.jsp. Rational Software Corporation. 2001. Rational Test Foundation for Windows 2000. Available at http://www.rational.com/products/testfoundation/w2k_ds.jsp. Robinson, Harry. 1999. Finite State Model-Based Testing on a Shoestring. Star West 1999. Available at http://www.geocities.com/model_based_testing/shoestring.htm. -S- Schneier, Bruce. 2000a. Computer Security: Will We Ever Learn? Crypto-Gram, May 15. Available at http://www.counterpane.com/crypto-gram-0005.html. Schneier, Bruce. 2000b.


Language and Mind by Noam Chomsky

Alfred Russel Wallace, classic study, finite state, Great Leap Forward, John von Neumann, language acquisition, lateral thinking, machine translation, pattern recognition, phenotype, tacit knowledge, theory of mind

In fact, it can be argued that the somewhat similar but not formally related concept of real-time deterministic automaton is far more “natural” in terms of time and space conditions on computation.16 However, it is pointless to pursue this topic, because what is at stake is not the “simplicity” of phrase structure grammars but rather of transformational grammars with a phrase structure component that plays a role in generating deep structures. And there is absolutely no mathematical concept of “ease of computation” or “simplicity of algorithm” that even vaguely suggests that such systems may have some advantage over the kinds of automata that have been seriously investigated from this point of view – for example, finite state automata, linear bounded automata, and so on. The basic concept of “structure-dependent operation” has never even been considered in a strictly mathematical context. The source of this confusion is a misconception on Putnam’s part as to the nature of grammatical transformations. They are not rules that “abbreviate” sentences; rather, they are operations that form surface structures from underlying deep structures, in such ways as are illustrated in the preceding lecture and the references there cited.17 Hence, to show that transformational grammars are the “simplest possible,” one would have to demonstrate that the “optimal” computing system would take a string of symbols as input and determine its surface structure, its underlying deep structure, and the sequence of transformational operations that relates them.


pages: 268 words: 109,447

The Cultural Logic of Computation by David Golumbia

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, American ideology, Benoit Mandelbrot, Bletchley Park, borderless world, business process, cellular automata, citizen journalism, Claude Shannon: information theory, computer age, Computing Machinery and Intelligence, corporate governance, creative destruction, digital capitalism, digital divide, en.wikipedia.org, finite state, folksonomy, future of work, Google Earth, Howard Zinn, IBM and the Holocaust, iterative process, Jaron Lanier, jimmy wales, John von Neumann, Joseph Schumpeter, late capitalism, Lewis Mumford, machine readable, machine translation, means of production, natural language processing, Norbert Wiener, One Laptop per Child (OLPC), packet switching, RAND corporation, Ray Kurzweil, RFID, Richard Stallman, semantic web, Shoshana Zuboff, Slavoj Žižek, social web, stem cell, Stephen Hawking, Steve Ballmer, Stewart Brand, strong AI, supply-chain management, supply-chain management software, technological determinism, Ted Nelson, telemarketer, The Wisdom of Crowds, theory of mind, Turing machine, Turing test, Vannevar Bush, web application, Yochai Benkler

“Beyond Explanatory Adequacy.” In Adriana Belletti, ed., The Cartography of Syntactic Structures. Volume 3: Structures and Beyond. New York: Cambridge University Press, 104–131. ———. 2005. “Three Factors in Language Design.” Linguistic Inquiry 36 (Winter), 1–22. Chomsky, Noam, and George Miller. 1958. “Finite State Languages.” Information and Control 1, 91–112. Chomsky, Noam, and M. P. Schützenberger. 1963. “The Algebraic Theory of Context-Free Languages.” In P. Braffort and D. Hirschberg, eds., Computer Programming and Formal Systems. Amsterdam: North-Holland, 118–161. Chun, Wendy H. K. 2006. Control and Freedom: Power and Paranoia in the Age of Fiber Optics.


pages: 462 words: 172,671

Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin

business logic, continuous integration, database schema, disinformation, domain-specific language, don't repeat yourself, Donald Knuth, en.wikipedia.org, Eratosthenes, finite state, G4S, Ignaz Semmelweis: hand washing, iterative process, place-making, Rubik’s Cube, web application

Therefore, when we see base classes mentioning the names of their derivatives, we suspect a problem. In general, base classes should know nothing about their derivatives. There are exceptions to this rule, of course. Sometimes the number of derivatives is strictly fixed, and the base class has code that selects between the derivatives. We see this a lot in finite state machine implementations. However, in that case the derivatives and base class are strongly coupled and always deploy together in the same jar file. In the general case we want to be able to deploy derivatives and bases in different jar files. Deploying derivatives and bases in different jar files and making sure the base jar files know nothing about the contents of the derivative jar files allow us to deploy our systems in discrete and independent components.


pages: 502 words: 107,510

Natural Language Annotation for Machine Learning by James Pustejovsky, Amber Stubbs

Amazon Mechanical Turk, bioinformatics, cloud computing, computer vision, crowdsourcing, easy for humans, difficult for computers, finite state, Free Software Foundation, game design, information retrieval, iterative process, language acquisition, machine readable, machine translation, natural language processing, pattern recognition, performance metric, power law, sentiment analysis, social web, sparse data, speech recognition, statistical model, text mining

Because the number of possible label sequences gets unmanageable with long sequences, the model makes the same assumption that the sequence predictor from Chapter 3 did; namely, look at only the previous n labels for calculating the probability (see Language Models). The underlying assumption is that the elements transition from state to state via a finite-state automaton behavior, with probabilistic transitions between states. Then, this can be transformed into a new model, with hidden states, where all observation tokens are emitted from each state with a finite probability. Of particular interest to us is the fact that the richer the encoding of features (e.g., from annotations) the better one can calculate the transition probabilities from state to state in the model.
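The state-to-state transition behaviour described above can be sketched with toy numbers: the probability of a (state sequence, observation sequence) pair is the product of a start probability, the probabilistic transitions between states, and a per-state emission probability for each token. The states, words, and probabilities below are invented for illustration:

```python
# Toy hidden-state model. All states, words, and probabilities are
# invented for illustration; a real model estimates these from data.
trans = {("NOUN", "VERB"): 0.4, ("VERB", "NOUN"): 0.3,
         ("NOUN", "NOUN"): 0.2, ("VERB", "VERB"): 0.1}
emit = {("NOUN", "dog"): 0.5, ("VERB", "runs"): 0.6}
start = {"NOUN": 0.7, "VERB": 0.3}

def joint_prob(states, words):
    """P(states, words) = start * product of transition * emission terms."""
    p = start[states[0]] * emit[(states[0], words[0])]
    for prev, cur, w in zip(states, states[1:], words[1:]):
        p *= trans[(prev, cur)] * emit[(cur, w)]
    return p

print(joint_prob(["NOUN", "VERB"], ["dog", "runs"]))
```

Richer features sharpen the `trans` estimates, which is exactly the point the excerpt makes about better-encoded annotations improving the transition probabilities.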


pages: 366 words: 107,145

Fuller Memorandum by Stross, Charles

Any sufficiently advanced technology is indistinguishable from magic, Beeching cuts, Bletchley Park, British Empire, carbon credits, cognitive dissonance, complexity theory, congestion charging, Crossrail, death from overwork, dumpster diving, escalation ladder, false flag, finite state, Firefox, Herman Kahn, HyperCard, invisible hand, land reform, linear programming, messenger bag, MITM: man-in-the-middle, operational security, peak oil, Plato's cave, post-work, prosperity theology / prosperity gospel / gospel of success, quantum entanglement, reality distortion field, security theater, sensible shoes, side project, Sloane Ranger, telemarketer, Turing machine

He's been in his grave for nearly half a century, and would doubtless be spinning in it fast enough to qualify for carbon credits as an environmentally friendly power source if he could see us today in all our multi-ethnic anti-discriminatory splendor. But who cares? That is, indeed, the big-ticket question. Before the Laundry, things were a bit confused. You can do magic by hand, without computers, but magic performed by ritual without finite state automata in the loop--calculating machines, in other words--tends to be haphazard, unreliable, uncontrollable, prone to undesirable side effects, and difficult to repeat. It also tends to fuck with causality, the logical sequence of events, in a most alarming way. We've unintentionally rewritten our history over the centuries, would-be sorcerers unwinding chaos and pinning down events with the dead hand of consistency--always tending towards a more stable ground state because chaos is unstable; entropy is magic's great enemy.


Wireless by Charles Stross

air gap, anthropic principle, back-to-the-land, Benoit Mandelbrot, Buckminster Fuller, Cepheid variable, cognitive dissonance, colonial exploitation, cosmic microwave background, Easter island, epigenetics, finite state, Georg Cantor, gravity well, hive mind, hydroponic farming, jitney, Khyber Pass, Late Heavy Bombardment, launch on warning, lifelogging, Magellanic Cloud, mandelbrot fractal, MITM: man-in-the-middle, Neil Armstrong, peak oil, phenotype, Pluto: dwarf planet, security theater, sensible shoes, Turing machine, undersea cable

I get a powerful jolt of it right now, sizzling up my arm and locking my fingers in place around the head of the chess piece. I try to pull it away from the board, but it’s no good: it only wants to move up or down, left or right . . . Left or right? I blink. It’s a state machine all right: one that’s locked by the law of sympathy to some other finite state automaton, one that grinds down slow and hard. I move the piece forward one square. It’s surprisingly heavy, the magnet a solid weight in its base—but more than magnetism holds it in contact with the board. As soon as I stop moving I feel a sharp sting in my fingertips. “Ouch!” I raise them to my mouth just as there’s a crash from outside.


pages: 523 words: 112,185

Doing Data Science: Straight Talk From the Frontline by Cathy O'Neil, Rachel Schutt

Amazon Mechanical Turk, augmented reality, Augustin-Louis Cauchy, barriers to entry, Bayesian statistics, bike sharing, bioinformatics, computer vision, confounding variable, correlation does not imply causation, crowdsourcing, data science, distributed generation, Dunning–Kruger effect, Edward Snowden, Emanuel Derman, fault tolerance, Filter Bubble, finite state, Firefox, game design, Google Glasses, index card, information retrieval, iterative process, John Harrison: Longitude, Khan Academy, Kickstarter, machine translation, Mars Rover, Nate Silver, natural language processing, Netflix Prize, p-value, pattern recognition, performance metric, personalized medicine, pull request, recommendation engine, rent-seeking, selection bias, Silicon Valley, speech recognition, statistical model, stochastic process, tacit knowledge, text mining, the scientific method, The Wisdom of Crowds, Watson beat the top human players on Jeopardy!, X Prize

Depending on the rate of the water coming in, this system exhibits a chaotic process that depends on molecular-level interactions of water molecules on the sides of the buckets. Read more about it in this associated Wikipedia article. Many systems can exhibit inherent chaos. Philippe M. Binder and Roderick V. Jensen have written a paper entitled “Simulating chaotic behavior with finite-state machines”, which is about digital computer simulations of chaos. An interdisciplinary program involving M.I.T., Harvard, and Tufts involved teaching a technique that was entitled “Simulating chaos to teach order”. They simulated an emergency on the border between Chad and Sudan’s troubled Darfur region, with students acting as members of Doctors Without Borders, International Medical Corps, and other humanitarian agencies.


pages: 570 words: 115,722

The Tangled Web: A Guide to Securing Modern Web Applications by Michal Zalewski

barriers to entry, business process, defense in depth, easy for humans, difficult for computers, fault tolerance, finite state, Firefox, Google Chrome, information retrieval, information security, machine readable, Multics, RFC: Request For Comment, semantic web, Steve Jobs, telemarketer, Tragedy of the Commons, Turing test, Vannevar Bush, web application, WebRTC, WebSocket

One could argue that practitioners are not the ones to be asked for nuanced definitions, but go ahead and pose the same question to a group of academics and they’ll offer you roughly the same answer. For example, the following common academic definition traces back to the Bell-La Padula security model, published in the 1960s. (This was one of about a dozen attempts to formalize the requirements for secure systems, in this case in terms of a finite state machine;[86] it is also one of the most notable ones.) A system is secure if and only if it starts in a secure state and cannot enter an insecure state. Definitions along these lines are fundamentally true, of course, and may serve as the basis for dissertations or even a couple of government grants.
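The quoted definition has a direct computational reading for a finite state machine: enumerate every state reachable from the initial state and check that none of them is insecure. A minimal sketch, with invented states and transitions:

```python
from collections import deque

def stays_secure(start, transitions, insecure):
    """Breadth-first search over a finite state machine: the system is
    secure iff no insecure state is reachable from the start state."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        if s in insecure:
            return False
        for nxt in transitions.get(s, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Invented example: "leak" is insecure but unreachable from "boot".
transitions = {"boot": ["run"], "run": ["audit", "boot"], "stale": ["leak"]}
print(stays_secure("boot", transitions, insecure={"leak"}))  # True
```

This is only tractable because the state space is finite; the practical difficulty the chapter goes on to discuss is that real systems do not come with such a neatly enumerable model.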


Pragmatic.Programming.Erlang.Jul.2007 by Unknown

Debian, en.wikipedia.org, fault tolerance, finite state, full text search, functional programming, higher-order functions, Planet Labs, RFC: Request For Comment

Module:terminate(Arg, State) -> term()
  Clean up before deletion.
add_handler(EventMgrRef, Handler, Args) -> Result
  Add an event handler to a generic event manager.
add_sup_handler(EventMgrRef, Handler, Args) -> Result
  Add a supervised event handler to a generic event manager.
call(EventMgrRef, Handler, Request, Timeout) -> Result
  Make a synchronous call to a generic event manager.
delete_handler(EventMgrRef, Handler, Args) -> Result
  Delete an event handler from a generic event manager.
start(EventMgrName) -> Result
  Create a stand-alone event manager process.
start_link(EventMgrName) -> Result
  Create a generic event manager process in a supervision tree.
stop(EventMgrRef) -> ok
  Terminate a generic event manager.
swap_handler(EventMgrRef, {Handler1,Args1}, {Handler2,Args2}) -> Result
  Replace an event handler in a generic event manager.
swap_sup_handler(EventMgrRef, {Handler1,Args1}, {Handler2,Args2}) -> Result
  Replace an event handler in a generic event manager, supervising the new handler.
sync_notify(EventMgrRef, Event) -> ok
  Notify an event manager about an event.
which_handlers(EventMgrRef) -> [Handler]
  Return all event handlers installed in a generic event manager.

F.29 Module: gen_fsm

Generic finite state machine behavior.

Module:StateName(Event, StateData) -> Result
  Handle an asynchronous event.
Module:StateName(Event, From, StateData) -> Result
  Handle a synchronous event.
Module:code_change(OldVsn, StateName, StateData, Extra) -> {ok, NextStateName, NewStateData}
  Update the internal state data during upgrade/downgrade.


Elixir in Action by Saša Jurić

demand response, en.wikipedia.org, fail fast, fault tolerance, finite state, functional programming, general-purpose programming language, higher-order functions, place-making, reproducible builds, Ruby on Rails, WebSocket

The Erlang standard library includes the following OTP behaviours:

- gen_server: Generic implementation of a stateful server process
- supervisor: Provides error handling and recovery in concurrent systems
- application: Generic implementation of components and libraries
- gen_event: Provides event-handling support
- gen_statem: Runs a finite state machine in a stateful server process

Elixir provides its own wrappers for the most frequently used behaviours via the modules GenServer, Supervisor, and Application. This book focuses on these behaviours. The GenServer behaviour receives detailed treatment in this chapter and chapter 7, Supervisor is discussed in chapters 8 and 9, and Application is presented in chapter 11. The remaining behaviours, although useful, are used less often and won’t be discussed in this book.


pages: 489 words: 148,885

Accelerando by Stross, Charles

book value, business cycle, call centre, carbon-based life, cellular automata, cognitive dissonance, commoditize, Conway's Game of Life, dark matter, disinformation, dumpster diving, Extropian, financial engineering, finite state, flag carrier, Flynn Effect, Future Shock, glass ceiling, gravity well, John von Neumann, junk bonds, Kickstarter, knapsack problem, Kuiper Belt, machine translation, Magellanic Cloud, mandelbrot fractal, market bubble, means of production, military-industrial complex, MITM: man-in-the-middle, Neal Stephenson, orbital mechanics / astrodynamics, packet switching, performance metric, phenotype, planetary scale, Pluto: dwarf planet, quantum entanglement, reversible computing, Richard Stallman, satellite internet, SETI@home, Silicon Valley, Singularitarianism, Skinner box, slashdot, South China Sea, stem cell, technological singularity, telepresence, The Chicago School, theory of mind, Turing complete, Turing machine, Turing test, upwardly mobile, Vernor Vinge, Von Neumann architecture, warehouse robotics, web of trust, Y2K, zero-sum game

"He used to design Babbage engines for the Pentagon – stealth computers. (No van Eck radiation, you know.) Look." He carefully pulls a fabric-bound document out of the obsolescent data wall and shows the spine to Manfred: "On the Theory of Games, by John von Neumann. Signed first edition." Aineko meeps and dumps a slew of confusing purple finite state automata into Manfred's left eye. The hardback is dusty and dry beneath his fingertips as he remembers to turn the pages gently. "This copy belonged to the personal library of Oleg Kordiovsky. A lucky man is Oleg: He bought it in 1952, while on a visit to New York, and the MVD let him keep it."


pages: 818 words: 153,952

C++ Concurrency in Action: Practical Multithreading by Anthony Williams

car-free, finite state, functional programming, SETI@home

Each thread is therefore effectively a state machine: when it receives a message, it updates its state in some manner and maybe sends one or more messages to other threads, with the processing performed depending on the initial state. One way to write such threads would be to formalize this and implement a Finite State Machine model, but this isn’t the only way; the state machine can be implicit in the structure of the application. Which method works better in any given scenario depends on the exact behavioral requirements of the situation and the expertise of the programming team. However you choose to implement each thread, the separation into independent processes has the potential to remove much of the complication from shared-data concurrency and therefore make programming easier, lowering the bug rate.
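The explicit formalization the excerpt mentions can be sketched as a thread that pulls messages from a queue and looks up transitions in a table. The states and messages below ("idle", "running", "start", "stop") are hypothetical, chosen only to show the shape of such a state machine thread.

```python
import queue
import threading

class Worker(threading.Thread):
    """A minimal explicit state machine driven by a message queue."""

    def __init__(self):
        super().__init__()
        self.inbox = queue.Queue()
        self.state = "idle"
        self.log = []

    def run(self):
        # Transition table: (current state, message) -> next state
        table = {("idle", "start"): "running",
                 ("running", "stop"): "idle"}
        while True:
            msg = self.inbox.get()
            if msg == "quit":
                break
            next_state = table.get((self.state, msg))
            if next_state is not None:
                self.log.append((self.state, msg, next_state))
                self.state = next_state

w = Worker()
w.start()
for msg in ["start", "stop", "quit"]:
    w.inbox.put(msg)
w.join()
print(w.log)  # [('idle', 'start', 'running'), ('running', 'stop', 'idle')]
```

Because each worker owns its state and communicates only through its inbox, no locking of shared data is needed, which is the point the excerpt makes about separating threads into independent processes.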


pages: 757 words: 193,541

The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services, Volume 2 by Thomas A. Limoncelli, Strata R. Chalup, Christina J. Hogan

active measures, Amazon Web Services, anti-pattern, barriers to entry, business process, cloud computing, commoditize, continuous integration, correlation coefficient, database schema, Debian, defense in depth, delayed gratification, DevOps, domain-specific language, en.wikipedia.org, fault tolerance, finite state, Firefox, functional programming, Google Glasses, information asymmetry, Infrastructure as a Service, intermodal, Internet of things, job automation, job satisfaction, Ken Thompson, Kickstarter, level 1 cache, load shedding, longitudinal study, loose coupling, machine readable, Malcom McLean invented shipping containers, Marc Andreessen, place-making, platform as a service, premature optimization, recommendation engine, revision control, risk tolerance, Salesforce, scientific management, seminal paper, side project, Silicon Valley, software as a service, sorting algorithm, standardized shipping container, statistical model, Steven Levy, supply-chain management, systems thinking, The future is already here, Toyota Production System, vertical integration, web application, Yogi Berra

This makes performing upgrades much easier. Generally we frown on the technique of modifying a live system, but some languages are designed specifically to support it. Erlang is one such language. A service written in Erlang can be upgraded while it is running. Properly structured Erlang programs are designed as event-driven finite-state machines (FSM). For each event received, a specific function is called. One event that the service can receive is a notification that code has been upgraded. The function that is called is responsible for upgrading or converting any data structures. All subsequent events will trigger the new versions of the functions.
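The mechanism described above can be sketched language-neutrally: an event loop dispatches events to the current handler, and a dedicated upgrade event swaps in the new handler and converts the state data. The handler names and events below are illustrative, not Erlang's actual release-handling API.

```python
def handle_v1(state, event):
    # v1 keeps a bare counter
    return state + 1

def handle_v2(state, event):
    # v2 keeps a dict and also records the last event seen
    return {"count": state["count"] + 1, "last": event}

def run(events):
    handler, state = handle_v1, 0
    for event in events:
        if event == "upgrade":
            # The upgrade event swaps the handler and converts the
            # old data structure into the shape the new code expects.
            state = {"count": state, "last": None}
            handler = handle_v2
        else:
            state = handler(state, event)
    return state

print(run(["tick", "tick", "upgrade", "tick"]))
# {'count': 3, 'last': 'tick'}
```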


pages: 612 words: 187,431

The Art of UNIX Programming by Eric S. Raymond

A Pattern Language, Albert Einstein, Apple Newton, barriers to entry, bioinformatics, Boeing 747, Clayton Christensen, combinatorial explosion, commoditize, Compatible Time-Sharing System, correlation coefficient, David Brooks, Debian, Dennis Ritchie, domain-specific language, don't repeat yourself, Donald Knuth, end-to-end encryption, Everything should be made as simple as possible, facts on the ground, finite state, Free Software Foundation, general-purpose programming language, George Santayana, history of Unix, Innovator's Dilemma, job automation, Ken Thompson, Larry Wall, level 1 cache, machine readable, macro virus, Multics, MVC pattern, Neal Stephenson, no silver bullet, OSI model, pattern recognition, Paul Graham, peer-to-peer, premature optimization, pre–internet, publish or perish, revision control, RFC: Request For Comment, Richard Stallman, Robert Metcalfe, Steven Levy, the Cathedral and the Bazaar, transaction costs, Turing complete, Valgrind, wage slave, web application

If it had lacked this consistent mathematical model, it would probably look like the design of the original glob(1) facility in the oldest Unixes, a handful of ad-hoc wildcards that can't be combined. The yacc(1) utility for generating language parsers is a thin wrapper around the formal theory of LR(1) grammars. Its partner, the lexical analyzer generator lex(1), is a similarly thin wrapper around the theory of nondeterministic finite-state automata. All three of these programs are so bug-free that their correct functioning is taken utterly for granted, and compact enough to fit easily in a programmer's hand. Only a part of these good qualities are due to the polishing that comes with a long service life and frequent use; most of it is that, having been constructed around a strong and provably correct algorithmic core, they never needed much polishing in the first place.


pages: 1,085 words: 219,144

Solr in Action by Trey Grainger, Timothy Potter

business intelligence, cloud computing, commoditize, conceptual framework, crowdsourcing, data acquisition, data science, en.wikipedia.org, failed state, fault tolerance, finite state, full text search, functional programming, glass ceiling, information retrieval, machine readable, natural language processing, openstreetmap, performance metric, premature optimization, recommendation engine, web application

In a large index, there can be many terms matching a prefix, so we also need a way to sort the suggestions by popularity, as we don’t want low-frequency, rare terms polluting our suggestion lists. Think of it in terms of doing the most good for the most users; if a term only occurs in a few documents out of millions, it’s probably not a good suggestion. In listing 10.11, we’re using the org.apache.solr.spelling.suggest.fst.FSTLookup class, which uses a data structure based on a finite state automaton (FST) to provide fast, constant-time lookups regardless of prefix length. The FSTLookupFactory implementation is slower to build but requires a smaller memory footprint, which makes it a good choice for large-term dictionaries. The suggestion dictionary must be rebuilt to incorporate new terms as new documents are indexed.
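The lookup-then-rank idea can be shown with a sorted term dictionary: find the prefix range, then order matches by frequency so rare terms are filtered out. A real FST-based lookup like Solr's FSTLookup is far more memory-efficient; the terms and frequencies below are invented for illustration.

```python
from bisect import bisect_left

# Hypothetical (term, document frequency) dictionary, kept sorted by term.
terms = sorted([("solar", 120), ("solid", 950), ("solo", 40), ("solr", 3000)])

def suggest(prefix, k=2):
    """Return the k most frequent terms starting with `prefix`."""
    words = [t for t, _ in terms]
    i = bisect_left(words, prefix)          # start of the prefix range
    matches = []
    while i < len(terms) and terms[i][0].startswith(prefix):
        matches.append(terms[i])
        i += 1
    matches.sort(key=lambda tf: -tf[1])     # most frequent first
    return [t for t, _ in matches[:k]]

print(suggest("sol"))   # ['solr', 'solid']
```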


pages: 846 words: 232,630

Darwin's Dangerous Idea: Evolution and the Meanings of Life by Daniel C. Dennett

Albert Einstein, Alfred Russel Wallace, anthropic principle, assortative mating, buy low sell high, cellular automata, Charles Babbage, classic study, combinatorial explosion, complexity theory, computer age, Computing Machinery and Intelligence, conceptual framework, Conway's Game of Life, Danny Hillis, double helix, Douglas Hofstadter, Drosophila, finite state, Garrett Hardin, Gregor Mendel, Gödel, Escher, Bach, heat death of the universe, In Cold Blood by Truman Capote, invention of writing, Isaac Newton, Johann Wolfgang von Goethe, John von Neumann, junk bonds, language acquisition, Murray Gell-Mann, New Journalism, non-fiction novel, Peter Singer: altruism, phenotype, price mechanism, prisoner's dilemma, QWERTY keyboard, random walk, Recombinant DNA, Richard Feynman, Rodney Brooks, Schrödinger's Cat, selection bias, Stephen Hawking, Steven Pinker, strong AI, Stuart Kauffman, the scientific method, theory of mind, Thomas Malthus, Tragedy of the Commons, Turing machine, Turing test

Chomsky eventually defined an ascending scale of types of grammars or types of languages — the Chomsky Hierarchy, on which all students of computation theory still cut their teeth — and showed how these grammars were interdefinable with an ascending scale of types of automata or computers — from "finite state machines" through "push-down automata" and "linear bounded machines" to "Turing machines." I can vividly remember the shock wave that rolled through philosophy when Chomsky's work first came to our attention a few years later. In 1960, my sophomore year at Harvard, I asked Professor Quine what critics of his views I should be reading.


Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems by Martin Kleppmann

active measures, Amazon Web Services, billion-dollar mistake, bitcoin, blockchain, business intelligence, business logic, business process, c2.com, cloud computing, collaborative editing, commoditize, conceptual framework, cryptocurrency, data science, database schema, deep learning, DevOps, distributed ledger, Donald Knuth, Edward Snowden, end-to-end encryption, Ethereum, ethereum blockchain, exponential backoff, fake news, fault tolerance, finite state, Flash crash, Free Software Foundation, full text search, functional programming, general-purpose programming language, Hacker News, informal economy, information retrieval, Internet of things, iterative process, John von Neumann, Ken Thompson, Kubernetes, Large Hadron Collider, level 1 cache, loose coupling, machine readable, machine translation, Marc Andreessen, microservices, natural language processing, Network effects, no silver bullet, operational security, packet switching, peer-to-peer, performance metric, place-making, premature optimization, recommendation engine, Richard Feynman, self-driving car, semantic web, Shoshana Zuboff, social graph, social web, software as a service, software is eating the world, sorting algorithm, source of truth, SPARQL, speech recognition, SQL injection, statistical model, surveillance capitalism, systematic bias, systems thinking, Tragedy of the Commons, undersea cable, web application, WebSocket, wikimedia commons

As mentioned in “Making an LSM-tree out of SSTables” on page 78, Lucene uses a SSTable-like structure for its term dictionary. This structure requires a small in-memory index that tells queries at which offset in the sorted file they need to look for a key. In LevelDB, this in-memory index is a sparse collection of some of the keys, but in Lucene, the in-memory index is a finite state automaton over the characters in the keys, similar to a trie [38]. This automaton can be transformed into a Levenshtein automaton, which supports efficient search for words within a given edit distance [39]. Other fuzzy search techniques go in the direction of document classification and machine learning.


pages: 1,237 words: 227,370

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems by Martin Kleppmann

active measures, Amazon Web Services, billion-dollar mistake, bitcoin, blockchain, business intelligence, business logic, business process, c2.com, cloud computing, collaborative editing, commoditize, conceptual framework, cryptocurrency, data science, database schema, deep learning, DevOps, distributed ledger, Donald Knuth, Edward Snowden, end-to-end encryption, Ethereum, ethereum blockchain, exponential backoff, fake news, fault tolerance, finite state, Flash crash, Free Software Foundation, full text search, functional programming, general-purpose programming language, Hacker News, informal economy, information retrieval, Infrastructure as a Service, Internet of things, iterative process, John von Neumann, Ken Thompson, Kubernetes, Large Hadron Collider, level 1 cache, loose coupling, machine readable, machine translation, Marc Andreessen, microservices, natural language processing, Network effects, no silver bullet, operational security, packet switching, peer-to-peer, performance metric, place-making, premature optimization, recommendation engine, Richard Feynman, self-driving car, semantic web, Shoshana Zuboff, social graph, social web, software as a service, software is eating the world, sorting algorithm, source of truth, SPARQL, speech recognition, SQL injection, statistical model, surveillance capitalism, systematic bias, systems thinking, Tragedy of the Commons, undersea cable, web application, WebSocket, wikimedia commons

As mentioned in “Making an LSM-tree out of SSTables”, Lucene uses a SSTable-like structure for its term dictionary. This structure requires a small in-memory index that tells queries at which offset in the sorted file they need to look for a key. In LevelDB, this in-memory index is a sparse collection of some of the keys, but in Lucene, the in-memory index is a finite state automaton over the characters in the keys, similar to a trie [38]. This automaton can be transformed into a Levenshtein automaton, which supports efficient search for words within a given edit distance [39]. Other fuzzy search techniques go in the direction of document classification and machine learning.
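The matching criterion behind the Levenshtein automaton can be illustrated with a brute-force scan: return every dictionary term within a given edit distance of the query. Lucene instead intersects a Levenshtein automaton with the FST term index so it never has to scan every term; the dynamic-programming scan and the term list below are only a sketch of what "within edit distance" means.

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def fuzzy_lookup(query, terms, max_edits=1):
    return [t for t in terms if edit_distance(query, t) <= max_edits]

terms = ["lucene", "lucerne", "levels", "leveldb"]
print(fuzzy_lookup("lucene", terms))  # ['lucene', 'lucerne']
```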


Data Mining: Concepts and Techniques: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei

backpropagation, bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, disinformation, distributed generation, finite state, industrial research laboratory, information retrieval, information security, iterative process, knowledge worker, linked data, machine readable, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, power law, random walk, recommendation engine, RFID, search costs, semantic web, seminal paper, sentiment analysis, sparse data, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, Thomas Bayes, web application

If the behavior attribute values of the object significantly deviate from the values predicted by the model, then the object can be declared a contextual outlier. By using a prediction model that links the contexts and behavior, these methods avoid the explicit identification of specific contexts. A number of classification and prediction techniques, such as regression, Markov models, and finite state automata, can be used to build such models. Interested readers are referred to Chapter 8 and Chapter 9 on classification and the bibliographic notes for further details (Section 12.11). In summary, contextual outlier detection enhances conventional outlier detection by considering contexts, which are important in many applications.


Europe: A History by Norman Davies

agricultural Revolution, Albert Einstein, anti-communist, Berlin Wall, bread and circuses, Bretton Woods, British Empire, business climate, centre right, charter city, classic study, clean water, Columbian Exchange, conceptual framework, continuation of politics by other means, Corn Laws, cuban missile crisis, Defenestration of Prague, discovery of DNA, disinformation, double entry bookkeeping, Dr. Strangelove, Edmond Halley, Edward Lloyd's coffeehouse, equal pay for equal work, Eratosthenes, Etonian, European colonialism, experimental economics, financial independence, finite state, Francis Fukuyama: the end of history, Francisco Pizarro, full employment, gentleman farmer, global village, Gregor Mendel, Honoré de Balzac, Index librorum prohibitorum, interchangeable parts, invention of agriculture, invention of movable type, Isaac Newton, James Hargreaves, James Watt: steam engine, Johann Wolfgang von Goethe, Johannes Kepler, John Harrison: Longitude, joint-stock company, Joseph-Marie Jacquard, Korean Air Lines Flight 007, land reform, liberation theology, long peace, Louis Blériot, Louis Daguerre, Mahatma Gandhi, mass immigration, Mikhail Gorbachev, military-industrial complex, Monroe Doctrine, Murano, Venice glass, music of the spheres, New Urbanism, North Sea oil, offshore financial centre, Peace of Westphalia, Plato's cave, popular capitalism, Potemkin village, purchasing power parity, Ralph Waldo Emerson, road to serfdom, sceptred isle, Scramble for Africa, spinning jenny, Suez canal 1869, Suez crisis 1956, Thales of Miletus, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, trade route, transatlantic slave trade, Transnistria, urban planning, urban sprawl, W. E. B. Du Bois

These downtrodden Ostjuden were the butt of much prejudice, not only from the locals but also from their fellow-Jews who had made the grade in Germany and Austria, and who had left the old Jewish world completely behind.55 European Imperialism in the late nineteenth century differed from earlier forms of the phenomenon in several important ways. It was part of a world-wide scramble for control of the last remaining countries suitable for exploitation. It was evident that the world’s resources were finite: states which set up a colonial empire quickly stood to gain a permanent advantage; those who delayed might be excluded from the ‘First Division’ forever. In the two decades starting in 1875, over one-quarter of the land surface of the globe was seized by half-a-dozen European powers. Colonies were viewed as an integral part of the advanced industrial economies.