stochastic process

68 results


Mathematical Finance: Theory, Modeling, Implementation by Christian Fries

Black-Scholes formula, Brownian motion, continuous integration, discrete time, financial engineering, fixed income, implied volatility, interest rate derivative, martingale, quantitative trading / quantitative finance, random walk, short selling, Steve Jobs, stochastic process, stochastic volatility, volatility smile, Wiener process, zero-coupon bond

In this sense C may be interpreted as an information set and X|C as a filtered version of X. If it is only possible to make statements upon events in C, then one may only make statements about X which could also be made about X|C. 2.2. Stochastic Processes. Definition 17 (Stochastic Process): A family X = {Xt | 0 ≤ t < ∞} of random variables Xt : (Ω, F) → (S, S) is called a (time-continuous) stochastic process. If (S, S) = (Rd, B(Rd)), we say that X is a d-dimensional stochastic process. The family X may also be interpreted as a map X : [0, ∞) × Ω → S with X(t, ω) := Xt(ω) for all (t, ω) ∈ [0, ∞) × Ω. If the range (S, S) is not given explicitly, we assume (S, S) = (Rd, B(Rd)).

Integral of a random variable Z with respect to a measure P (cf. expectation). Stochastic Processes:

• ∫_Ω X(t1, ω) dP(ω) – Lebesgue integral. Integral of the random variable X(t1) with respect to a measure P.
• ∫_{t1}^{t2} X(t) dt – Lebesgue integral or Riemann integral. The (pathwise) integral of the stochastic process X with respect to t.
• ∫_0^T X(t) dW(t) – Itô integral. The (pathwise) integral of the stochastic process X with respect to a Brownian motion W.

[Figure 2.10: Integration of stochastic processes]

The notion of a stochastic integral may be extended to more general integrands and/or more general integrators.
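To make the last of these integrals concrete, here is a minimal numerical sketch (my illustration, not from the book): it approximates the Itô integral ∫_0^T W dW by left-endpoint sums over a simulated Brownian path and compares the result with the closed form (W_T² − T)/2 known from Itô calculus.

import numpy as np

rng = np.random.default_rng(42)
T, n = 1.0, 100_000
dt = T / n

# Brownian path on a grid: W_0 = 0, increments ~ N(0, dt)
dW = rng.normal(0.0, np.sqrt(dt), n)
W = np.concatenate(([0.0], np.cumsum(dW)))

# Ito integral of W dW: left-endpoint (non-anticipating) Riemann sums
ito_sum = np.sum(W[:-1] * dW)

# Closed form from Ito calculus: int_0^T W dW = (W_T**2 - T) / 2
closed_form = 0.5 * (W[-1] ** 2 - T)
print(ito_sum, closed_form)  # agree up to discretization error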

List of Symbols. The following list summarizes the most important notions from Chapter 2 (symbol – object – interpretation):

• ω – element of Ω – State. In the context of stochastic processes: path.
• Ω – set – State space.
• X – random variable – Map which assigns an event / outcome (e.g. a number) to a state. Example: the payoff of a financial product (this may be interpreted as a snapshot of the financial product itself).
• X – stochastic process – Sequence (in time) of random variables (e.g. the evolution of a financial product; this could be its payoffs but also its value).
• X(t), Xt – stochastic process evaluated at time t (≡ random variable) – see above.
• X(ω) – stochastic process evaluated in state ω – Path of X in state ω.
• W – Brownian motion – Model for a continuous (random) movement of a particle with independent increments (position changes).


pages: 416 words: 39,022

Asset and Risk Management: Risk Oriented Finance by Louis Esch, Robert Kieffer, Thierry Lopez

asset allocation, Brownian motion, business continuity plan, business process, capital asset pricing model, computer age, corporate governance, discrete time, diversified portfolio, fixed income, implied volatility, index fund, interest rate derivative, iterative process, P = NP, p-value, random walk, risk free rate, risk/return, shareholder value, statistical model, stochastic process, transaction costs, value at risk, Wiener process, yield curve, zero-coupon bond

This is a distribution symmetrical with respect to 0, which corresponds to a normal distribution for n = 2 and gives rise to a leptokurtic distribution (resp. negative kurtosis distribution) for n < 2 (n > 2). 2.3 STOCHASTIC PROCESSES 2.3.1 General considerations The term stochastic process is applied to a random variable that is a function of the time variable: {Xt : t ∈ T}. [Figure A2.15: Generalised error distribution] If the set T of times is discrete, the stochastic process is simply a sequence of random variables. However, in a number of financial applications such as Black and Scholes' model, it will be necessary to consider stochastic processes in continuous time. For each possible result ω ∈ Ω, the function Xt(ω) of the variable t is known as the path of the stochastic process.

A stochastic process is said to have independent increments when, regardless of the times t1 < t2 < … < tn, the r.v.s Xt1, Xt2 − Xt1, Xt3 − Xt2, … are independent. In the same way, a stochastic process is said to have stationary increments when for every t and h the r.v.s Xt+h − Xt and Xh are identically distributed. 2.3.2 Particular stochastic processes 2.3.2.1 The Poisson process We consider a process of random occurrences of an event in time, corresponding to the set [0; +∞[. Here, the principal interest does not lie directly in the occurrence times, but in the number of occurrences within given intervals.
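A short simulation makes the last point concrete. The sketch below is an illustration, not taken from the book; the rate lam and horizon t are arbitrary choices. It builds a Poisson process from exponential inter-arrival times and checks that the count of occurrences in [0, t] has mean and variance both close to lam * t.

import numpy as np

rng = np.random.default_rng(0)
lam, t, n_paths = 3.0, 2.0, 50_000

# Inter-arrival times of a Poisson process are i.i.d. Exponential(lam);
# the count of occurrences in [0, t] should then be Poisson(lam * t).
counts = np.empty(n_paths, dtype=int)
for i in range(n_paths):
    arrivals = np.cumsum(rng.exponential(1.0 / lam, size=40))  # 40 >> lam * t
    counts[i] = np.searchsorted(arrivals, t)  # occurrences no later than t

print(counts.mean(), counts.var(), lam * t)  # mean ≈ variance ≈ 6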

[Table of contents excerpt]
8.1.2 The data in the example
8.2 Calculations: 8.2.1 Treasury portfolio case; 8.2.2 Bond portfolio case
8.3 The normality hypothesis
PART IV: FROM RISK MANAGEMENT TO ASSET MANAGEMENT; Introduction
9 Portfolio Risk Management: 9.1 General principles; 9.2 Portfolio risk management method (9.2.1 Investment strategy; 9.2.2 Risk framework)
10 Optimising the Global Portfolio via VaR: 10.1 Taking account of VaR in Sharpe's simple index method (10.1.1 The problem of minimisation; 10.1.2 Adapting the critical line algorithm to VaR; 10.1.3 Comparison of the two methods); 10.2 Taking account of VaR in the EGP method (10.2.1 Maximising the risk premium; 10.2.2 Adapting the EGP method algorithm to VaR; 10.2.3 Comparison of the two methods; 10.2.4 Conclusion); 10.3 Optimising a global portfolio via VaR (10.3.1 Generalisation of the asset model; 10.3.2 Construction of an optimal global portfolio; 10.3.3 Method of optimisation of global portfolio)
11 Institutional Management: APT Applied to Investment Funds: 11.1 Absolute global risk; 11.2 Relative global risk/tracking error; 11.3 Relative fund risk vs. benchmark abacus; 11.4 Allocation of systematic risk
2.2 Theoretical distributions: 2.2.1 Normal distribution and associated ones; 2.2.2 Other theoretical distributions
2.3 Stochastic processes: 2.3.1 General considerations; 2.3.2 Particular stochastic processes; 2.3.3 Stochastic differential equations
Appendix 3 Statistical Concepts: 3.1 Inferential statistics (3.1.1 Sampling; 3.1.2 Two problems of inferential statistics); 3.2 Regressions (3.2.1 Simple regression; 3.2.2 Multiple regression; 3.2.3 Nonlinear regression)
Appendix 4 Extreme Value Theory: 4.1 Exact result; 4.2 Asymptotic results (4.2.1 Extreme value theorem; 4.2.2 Attraction domains; 4.2.3 Generalisation)
Appendix 5 Canonical Correlations: 5.1 Geometric presentation of the method; 5.2 Search for canonical characters
Appendix 6 Algebraic Presentation of Logistic Regression
Appendix 7 Time Series Models: ARCH-GARCH and EGARCH: 7.1 ARCH-GARCH models; 7.2 EGARCH models
Appendix 8 Numerical Methods for Solving Nonlinear Equations: 8.1 General principles for iterative methods (8.1.1 Convergence; 8.1.2 Order of convergence; 8.1.3 Stop criteria); 8.2 Principal methods (8.2.1 First order methods; 8.2.2 Newton–Raphson method; 8.2.3 Bisection method); 8.3 Nonlinear equation systems (8.3.1 General theory of n-dimensional iteration; 8.3.2 Principal methods)
Bibliography; Index
Collaborators: Christian Berbé, Civil engineer from Université libre de Bruxelles and ABAF financial analyst.


pages: 320 words: 33,385

Market Risk Analysis, Quantitative Methods in Finance by Carol Alexander

asset allocation, backtesting, barriers to entry, Brownian motion, capital asset pricing model, constrained optimization, credit crunch, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, financial engineering, fixed income, implied volatility, interest rate swap, low interest rates, market friction, market microstructure, p-value, performance metric, power law, proprietary trading, quantitative trading / quantitative finance, random walk, risk free rate, risk tolerance, risk-adjusted returns, risk/return, seminal paper, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, systematic bias, Thomas Bayes, transaction costs, two and twenty, value at risk, volatility smile, Wiener process, yield curve, zero-sum game

Readers interested in estimating the parameters of a GARCH model when they come to Chapter II.4 will need to understand maximum likelihood estimation. Section I.3.7 shows how to model the evolution of financial asset prices and returns using a stochastic process in both discrete and continuous time. The translation between discrete and continuous time, and the relationship between the continuous time representation and the discrete time representation of a stochastic process, is very important indeed. The theory of finance requires an understanding of both discrete time and continuous time stochastic processes. Section I.3.8 summarizes and concludes. Some prior knowledge of basic calculus and elementary linear algebra is required to understand this chapter.

Then, using σ̂ in place of σ, we have

est.se(X̄) = σ̂ / √n   (I.3.135)

and

est.se(σ̂²) = σ̂² √(2/n)   (I.3.136)

I.3.7 STOCHASTIC PROCESSES IN DISCRETE AND CONTINUOUS TIME A stochastic process is a sequence of identically distributed random variables. For most of our purposes random variables are continuous, indeed they are often assumed to be normal, but the sequence may be over continuous or discrete time. That is, we consider continuous state processes in both continuous and discrete time. • The study of discrete time stochastic processes is called time series analysis. In the time domain the simplest time series models are based on regression analysis, which is introduced in the next chapter.

In efficient markets a time series of prices or log prices will follow a random walk. More generally, even in the presence of market frictions and inefficiencies, prices and log prices of tradable assets are integrated stochastic processes. These are fundamentally different from the associated returns, which are generated by stationary stochastic processes. Figures I.3.28 and I.3.29 illustrate the fact that prices and returns are generated by very different types of stochastic process. Figure I.3.28 shows time series of daily prices (left-hand scale) and log prices (right-hand scale) of the Dow Jones Industrial Average (DJIA), from January 1998 to September 2001. [Figure I.3.28: Daily prices and log prices of DJIA index] (Footnote 56: This is not the only possible discretization of a continuous increment.)
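The distinction between integrated prices and stationary returns is easy to reproduce numerically. A minimal sketch, with made-up parameters: simulate i.i.d. returns, cumulate them into a log price random walk, and observe that the sample variance of the level grows with the window while the variance of the returns stays roughly constant.

import numpy as np

rng = np.random.default_rng(1)
n, mu, sigma = 2500, 0.0003, 0.01  # roughly ten years of daily steps, invented values

# Stationary returns; integrated (non-stationary) log prices
returns = mu + sigma * rng.standard_normal(n)
log_price = np.cumsum(returns)

# Variance of the level grows with the window; that of returns does not
half = n // 2
print(log_price[:half].var(), log_price.var())  # second is much larger
print(returns[:half].var(), returns.var())      # roughly equal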


pages: 313 words: 34,042

Tools for Computational Finance by Rüdiger Seydel

bioinformatics, Black-Scholes formula, Brownian motion, commoditize, continuous integration, discrete time, financial engineering, implied volatility, incomplete markets, interest rate swap, linear programming, London Interbank Offered Rate, mandelbrot fractal, martingale, random walk, risk free rate, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process, zero-coupon bond

The easiest way to consider stochastic movements is via an additive term,

dx/dt = a(x, t) + b(x, t) ξt.

Here we use the notations a: deterministic part, b ξt: stochastic part; ξt denotes a generalized stochastic process. An example of a generalized stochastic process is white noise. For a brief definition of white noise we note that to each stochastic process a generalized version can be assigned [Ar74]. For generalized stochastic processes derivatives of any order can be defined. Suppose that Wt is the generalized version of a Wiener process; then Wt can be differentiated. White noise ξt is then defined as ξt = Ẇt = dWt/dt, or, vice versa,

Wt = ∫₀ᵗ ξs ds.

That is, a Wiener process is obtained by smoothing the white noise.
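A quick numerical sketch of this smoothing idea (my illustration, with arbitrary grid choices): on a discrete grid, white noise enters through independent N(0, Δt) increments, and cumulating them produces an approximate Wiener path whose terminal mean and variance are close to 0 and T.

import numpy as np

rng = np.random.default_rng(7)
T, n, n_paths = 1.0, 10_000, 2_000
dt = T / n

# On a grid, xi_s ds is approximated by independent N(0, dt) increments;
# cumulating them ("smoothing the white noise") gives a Wiener path.
dW = rng.normal(0.0, np.sqrt(dt), (n_paths, n))
W_T = dW.sum(axis=1)  # terminal values W_T over many paths

print(W_T.mean(), W_T.var(), T)  # mean ≈ 0, Var(W_T) ≈ T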

A stochastic process is a family of random variables Xt, which are defined for a set of parameters t (−→ Appendix B1). Here we consider the continuous-time situation. That is, t ∈ IR varies continuously in a time interval I, which typically represents 0 ≤ t ≤ T. A more complete notation for a stochastic process is {Xt, t ∈ I}, or (Xt)0≤t≤T. Letting chance play for all t in the interval 0 ≤ t ≤ T, the resulting function Xt is called a realization or path of the stochastic process. Special properties of stochastic processes have led to the following names: Gaussian process: All finite-dimensional distributions (Xt1, …, Xtk) are Gaussian. Hence specifically Xt is distributed normally for all t. Markov process: Only the present value of Xt is relevant for its future motion.

The general definition is Δ = Δ(S, t) = ∂V(S, t)/∂S; the expression (1.16) is a discretized version. 1.6 Stochastic Processes Brownian motion originally meant the erratic motion of a particle (pollen) on the surface of a fluid, caused by tiny impulses of molecules. Wiener suggested a mathematical model for this motion, the Wiener process. But earlier Bachelier had applied Brownian motion to model the motion of stock prices, which instantly respond to the numerous items of incoming information, much as pollen reacts to the impacts of molecules. The illustration of the Dow in Figure 1.14 may serve as motivation. A stochastic process is a family of random variables Xt, which are defined for a set of parameters t (−→ Appendix B1).


Analysis of Financial Time Series by Ruey S. Tsay

Asian financial crisis, asset allocation, backpropagation, Bayesian statistics, Black-Scholes formula, Brownian motion, business cycle, capital asset pricing model, compound rate of return, correlation coefficient, data acquisition, discrete time, financial engineering, frictionless, frictionless market, implied volatility, index arbitrage, inverted yield curve, Long Term Capital Management, market microstructure, martingale, p-value, pattern recognition, random walk, risk free rate, risk tolerance, short selling, statistical model, stochastic process, stochastic volatility, telemarketer, transaction costs, value at risk, volatility smile, Wiener process, yield curve

CHAPTER 6 Continuous-Time Models and Their Applications. The price of a financial asset evolves over time and forms a stochastic process, which is a statistical term used to describe the evolution of a random variable over time. The observed prices are a realization of the underlying stochastic process. The theory of stochastic processes is the basis on which the observed prices are analyzed and statistical inference is made. There are two types of stochastic process for modeling the price of an asset. The first type is called the discrete-time stochastic process, in which the price changes at discrete time points. All the processes discussed in the previous chapters belong to this category.

For the price of an asset at time t, the range of x(η, t) is the set of non-negative real numbers. For a given η, {x(η, t)} is a time series with values depending on the time t. For simplicity, we write a continuous-time stochastic process as {xt} with the understanding that, for a given t, xt is a random variable. In the literature, some authors use x(t) instead of xt to emphasize that t is continuous. However, we use the same notation xt, but call it a continuous-time stochastic process. 6.2.1 The Wiener Process In a discrete-time econometric model, we assume that the shocks form a white noise process, which is not predictable.

This is the purpose of discussing Ito’s calculus in the next section. 6.2.2 Generalized Wiener Processes The Wiener process is a special stochastic process with zero drift and variance proportional to the length of time interval. This means that the rate of change in expectation is zero and the rate of change in variance is 1. In practice, the mean and variance of a stochastic process can evolve over time in a more complicated manner. Hence, further generalization of stochastic process is needed. To this end, we consider the generalized Wiener process in which the expectation has a drift rate µ and the rate of variance change is σ 2 .
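As a hedged illustration of the generalized Wiener process dx = μ dt + σ dW (the parameter values below are my own arbitrary choices, not the book's), the following sketch simulates many paths and checks that the expectation drifts at rate μ and the variance grows at rate σ².

import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 0.2, 0.5             # arbitrary drift rate and volatility
T, n, n_paths = 1.0, 252, 20_000
dt = T / n

# Generalized Wiener process: dx = mu dt + sigma dW, starting at x_0 = 0
dx = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n))
x_T = dx.sum(axis=1)

print(x_T.mean(), mu * T)          # expectation drifts at rate mu
print(x_T.var(), sigma ** 2 * T)   # variance grows at rate sigma^2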


pages: 447 words: 104,258

Mathematics of the Financial Markets: Financial Instruments and Derivatives Modelling, Valuation and Risk Issues by Alain Ruttiens

algorithmic trading, asset allocation, asset-backed security, backtesting, banking crisis, Black Swan, Black-Scholes formula, Bob Litterman, book value, Brownian motion, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, currency risk, delta neutral, discounted cash flows, discrete time, diversification, financial engineering, fixed income, implied volatility, interest rate derivative, interest rate swap, low interest rates, managed futures, margin call, market microstructure, martingale, p-value, passive investing, proprietary trading, quantitative trading / quantitative finance, random walk, risk free rate, risk/return, Satyajit Das, seminal paper, Sharpe ratio, short selling, statistical model, stochastic process, stochastic volatility, time value of money, transaction costs, value at risk, volatility smile, Wiener process, yield curve, zero-coupon bond

• F – forward price, or futures price (depends on the context)
• FV – future value
• -ibor – generic for LIBOR, EURIBOR, or any other inter-bank market rate
• K – strike price of an option
• κ – kurtosis
• M – month or million, depending on context
• MD – modified duration
• MtM – "Marked to Market" (= valued at the observed current market price)
• μ – drift of a stochastic process
• N – total number of a series (integer number), or nominal (notional) amount (depends on the context)
• (.) – Gaussian (normal) density distribution function
• N(.) – Gaussian (normal) cumulative distribution function
• P – put price
• P{.} – probability of {.}
• PV – present value
• (.) – Poisson density distribution function
• r – generic symbol for a rate of return
• rf – risk-free return
• ρ(.) – correlation of (.)
• skew – skewness
• S – spot price of an asset (equity, currency, etc.), as specified by the context
• STD(.) – standard deviation of (.)
• σ – volatility of a stochastic process
• t – current time, or time in general (depends on the context)
• t0 – initial time
• T – maturity time
• τ – tenor, that is, time interval between current time t and maturity T
• V(.) – variance of (.)

For example, see Figure 8.1 for the case of a cumulative normal distribution. [Figure 8.1: A cumulative normal distribution] In this example of a normal distribution (μ = 0, σ = 0.40), F(x) = P(X ≤ x) = 0.7734. Provided F(x) is continuously differentiable, we can determine the corresponding density function f(x) associated with the random variable X. Stochastic Processes A stochastic process can be defined as a collection of random variables defined on the same probability space (Ω, ℱ, P) and "indexed" by a set of parameters T, that is, {Xt, t ∈ T}. Within the framework of our chapter, t is the time. For a given outcome or sample ω, Xt(ω) for t ∈ T is called a sample path, realization or trajectory of the process.

…, NEFTCI in the further reading at the end of the chapter). 9 Other Financial Models: from ARMA to the GARCH Family. The previous chapter dealt with stochastic processes, which consist of (returns) models involving a mixture of deterministic and stochastic components. By contrast, the models developed here present three major differences: These models are deterministic; since they aim to model a non-deterministic variable such as a return, the difference between the model output and the actual observed value is a probabilistic error term. By contrast with stochastic processes described by differential equations, these models are built in discrete time, in practice at the periodicity of the modeled return (daily, for example).


pages: 443 words: 51,804

Handbook of Modeling High-Frequency Data in Finance by Frederi G. Viens, Maria C. Mariani, Ionut Florescu

algorithmic trading, asset allocation, automated trading system, backtesting, Bear Stearns, Black-Scholes formula, book value, Brownian motion, business process, buy and hold, continuous integration, corporate governance, discrete time, distributed generation, fear index, financial engineering, fixed income, Flash crash, housing crisis, implied volatility, incomplete markets, linear programming, machine readable, mandelbrot fractal, market friction, market microstructure, martingale, Menlo Park, p-value, pattern recognition, performance metric, power law, principal–agent problem, random walk, risk free rate, risk tolerance, risk/return, short selling, statistical model, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process

New York: Chapman and Hall; 1994. 6. Levy P. Calcul des probabilités. Paris: Gauthier-Villars; 1925. 7. Khintchine AYa, Levy P. Sur les lois stables. C R Acad Sci Paris 1936;202:374. 8. Mantegna RN, Stanley HE. Stochastic process with ultra-slow convergence to a Gaussian: the truncated Levy flight. Phys Rev Lett 1994;73:2946–2949. 9. Koponen I. Analytic approach to the problem of convergence of truncated Levy flights towards the Gaussian stochastic process. Phys Rev E 1995;52:1197–1199. 10. Weron R. Levy-stable distributions revisited: tail index > 2 does not exclude the Levy-stable regime. Int J Mod Phys C 2001;12:209–223. Chapter Thirteen: Solutions to Integro-Differential Parabolic Problem Arising on Financial Mathematics. MARIA C.

However, several models proposed in recent years, such as the model found in Ref. 13, have allowed the volatility to be nonconstant or a stochastic variable. In this model, the underlying security S follows, as in the Black–Scholes model, a stochastic process

dSt = μ St dt + σt St dZt,

where Z is a standard Brownian motion. Unlike the classical model, the variance v(t) = σ²(t) also follows a stochastic process, given by

dvt = κ(θ − vt) dt + γ √vt dWt,

where W is another standard Brownian motion. The correlation coefficient between W and Z is denoted by ρ: Cov(dZt, dWt) = ρ dt. This leads to a generalized Black–Scholes equation

(1/2) v S² ∂²F/∂S² + ρ γ v S ∂²F/∂v∂S + (1/2) γ² v ∂²F/∂v² + r S ∂F/∂S + [κ(θ − v) − λv] ∂F/∂v − rF + ∂F/∂t = 0.
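A minimal simulation sketch of this stochastic volatility model (my illustration: the parameter values are invented, and the full-truncation Euler scheme used to keep the variance nonnegative is one common discretization choice, not the chapter's method):

import numpy as np

rng = np.random.default_rng(11)
S0, v0 = 100.0, 0.04
mu, kappa, theta, gamma, rho = 0.05, 2.0, 0.04, 0.3, -0.7  # invented values
T, n = 1.0, 252
dt = T / n

S, v = S0, v0
for _ in range(n):
    # Correlated Brownian increments with Cov(dZ, dW) = rho * dt
    z1, z2 = rng.standard_normal(2)
    dZ = np.sqrt(dt) * z1
    dW = np.sqrt(dt) * (rho * z1 + np.sqrt(1.0 - rho ** 2) * z2)

    v_plus = max(v, 0.0)  # full truncation keeps sqrt(v) well defined
    S += mu * S * dt + np.sqrt(v_plus) * S * dZ      # dS = mu S dt + sqrt(v) S dZ
    v += kappa * (theta - v_plus) * dt + gamma * np.sqrt(v_plus) * dW

print(S, v)  # one simulated terminal price and variance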

The iterative method we will use for this problem was developed by Chadam and Yin in Ref. 22 to study a similar partial integro-differential problem. 13.3.1 STATEMENT OF THE PROBLEM As pointed out in Ref. 17, when modeling high frequency data in applications, a Lévy-like stochastic process appears to be the best fit. When using these models, option prices are found by solving the resulting PIDE. For example, integro-differential equations appear in exponential Lévy models, where the market price of an asset is represented as the exponential of a Lévy stochastic process. These models have been discussed in several published works such as Refs 17 and 23. In this section, we consider the following integro-differential model for a European call option:

∂C/∂t (S, t) + rS ∂C/∂S (S, t) + (σ²S²/2) ∂²C/∂S² (S, t) − rC(S, t) + ∫ ν(dy) [C(Se^y, t) − C(S, t) − S(e^y − 1) ∂C/∂S (S, t)] = 0,   (13.28)

where the market price of an asset is represented as the exponential of a Lévy stochastic process (see Chapter 12 of Ref. 17).


Mathematical Finance: Core Theory, Problems and Statistical Algorithms by Nikolai Dokuchaev

Black-Scholes formula, Brownian motion, buy and hold, buy low sell high, discrete time, electricity market, fixed income, implied volatility, incomplete markets, martingale, random walk, risk free rate, short selling, stochastic process, stochastic volatility, transaction costs, volatility smile, Wiener process, zero-coupon bond

We consider a random straight line L in R² such that … with probability 1, and that the angle between L and the vector (1, 0) has the uniform distribution on [0, π). Find the probability that the set … is finite. 2 Basics of stochastic processes In this chapter, some basic facts and definitions from the theory of stochastic (random) processes are given, including filtrations, martingales, Markov times, and Markov processes. 2.1 Definitions of stochastic processes Sometimes it is necessary to consider random variables or vectors that depend on time. Definition 2.1 A sequence of random variables ξt, t = 0, 1, 2, …, is said to be a discrete time stochastic (or random) process.

In particular, it follows that …, where … is the probability density function for N(x, T − s). Note that this function is also well known in the theory of parabolic equations: it is the so-called fundamental solution of the heat equation. The representation of functions of stochastic processes via solutions of parabolic partial differential equations (PDEs) helps to study stochastic processes: one can use numerical methods developed for PDEs (e.g., finite differences, fundamental solutions, etc.). On the other hand, the probabilistic representation of a solution of parabolic PDEs can also help to study PDEs. For instance, one can use Monte Carlo simulation for numerical solution of PDEs.
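As a hedged illustration of this two-way link (my sketch; the terminal function g and the evaluation point are arbitrary choices, not from the book), the code below estimates u(s, x) = E[g(x + W_{T−s})] by Monte Carlo and checks it against numerical integration of g against the fundamental solution, the N(x, T − s) density.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

def g(x):                        # arbitrary terminal condition, for illustration
    return np.maximum(x, 0.0)

x, s, T = 0.3, 0.0, 1.0
tau = T - s                      # time to maturity

# Probabilistic representation: u(s, x) = E[ g(x + W_tau) ]
mc = g(x + np.sqrt(tau) * rng.standard_normal(1_000_000)).mean()

# Same value via the fundamental solution: integrate g against the N(x, tau) density
grid = np.linspace(x - 8.0 * np.sqrt(tau), x + 8.0 * np.sqrt(tau), 4001)
quad = np.sum(g(grid) * norm.pdf(grid, loc=x, scale=np.sqrt(tau))) * (grid[1] - grid[0])

print(mc, quad)  # the two estimates agree up to Monte Carlo error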

Contents: Preface; 1 Review of probability theory; 2 Basics of stochastic processes; 3 Discrete time market models; 4 Basics of Ito calculus and stochastic analysis; 5 Continuous time market models; 6 American options and binomial trees; 7 Implied and historical volatility; 8 Review of statistical estimation; 9 Estimation of models for stock prices; Legend of notations and abbreviations; Selected answers and key figures; Bibliography. Preface. Dedicated to Natalia, Lidia, and Mikhail. This book gives a systematic, self-sufficient, and yet short presentation of the mainstream topics of Mathematical Finance and the related parts of Stochastic Analysis and Statistical Finance that cover typical university programs.


Risk Management in Trading by Davis Edwards

Abraham Maslow, asset allocation, asset-backed security, backtesting, Bear Stearns, Black-Scholes formula, Brownian motion, business cycle, computerized trading, correlation coefficient, Credit Default Swap, discrete time, diversified portfolio, financial engineering, fixed income, Glass-Steagall Act, global macro, implied volatility, intangible asset, interest rate swap, iterative process, John Meriwether, junk bonds, London Whale, Long Term Capital Management, low interest rates, margin call, Myron Scholes, Nick Leeson, p-value, paper trading, pattern recognition, proprietary trading, random walk, risk free rate, risk tolerance, risk/return, selection bias, shareholder value, Sharpe ratio, short selling, statistical arbitrage, statistical model, stochastic process, systematic trading, time value of money, transaction costs, value at risk, Wiener process, zero-coupon bond

Randomness, in finance, is typically described using notation from probability. Probability is the branch of mathematics that studies how likely or unlikely something is to occur. The probability that an event will occur is represented as a value between a 0 percent chance of occurrence (something will not occur) and a 100 percent chance of occurrence (something will definitely occur). KEY CONCEPT: STOCHASTIC PROCESSES. Stochastic is a term that describes a type of random process that evolves over time. In a stochastic process, prices might be modeled as a series whose next value depends on the current value plus a random component. This is slightly different than a completely random process (like the series of numbers obtained by rolling a pair of dice).

In finance, the term stochastic is often used as a synonym for random. Stochastic describes a type of random sequence that evolves over time. In this type of sequence, the value of the next item depends on the value of the previous item plus or minus a random value. In finance, stochastic processes are particularly important because prices are often modeled as stochastic processes, and prices are a fundamental input into trading decisions. Common examples of random numbers are the results of throwing dice or flipping a coin. Each roll of the dice or flip of a coin generates a realization of a defined process. The probability of the coin landing on either a head or a tail is 50 percent, and the probability of any single number on a regular, six-sided die is 1/6 (assuming a fair die and a fair coin flip).

The time that has passed …

[Figure 3.9: Dispersion in a Random Series — probability tree of the cumulative result of a 50/50 chance of +1 or −1 at each step, over 10 time steps]

For financial mathematics, the Wiener process is often generalized to include a constant drift term that pushes prices upward. The constant drift term is due to risk-free inflation (described later in the chapter in the "time value of money" discussion). Continuous-time versions of this process are called the Generalized Wiener Process or the Ito Process. (See Equation 3.8, A Stochastic Process.) A stochastic process with discrete time steps can be described as:

ΔSt / St = μΔt + σΔWt,  or equivalently  ΔSt = μSt Δt + σSt ΔWt,   (3.8)

where ΔSt is the change in price that will occur, St is the price of the asset at time t, and μ is the drift term that pushes prices upward.
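A minimal sketch of Equation 3.8 in action (the parameter values are my own illustrative choices, not the book's): step a price forward with ΔSt = μSt Δt + σSt ΔWt.

import numpy as np

rng = np.random.default_rng(9)
S0, mu, sigma = 100.0, 0.08, 0.20   # invented parameter values
T, n = 1.0, 252
dt = T / n

# Discrete-time steps of Equation 3.8: dS_t = mu S_t dt + sigma S_t dW_t
S = np.empty(n + 1)
S[0] = S0
for i in range(n):
    dW = np.sqrt(dt) * rng.standard_normal()
    S[i + 1] = S[i] + mu * S[i] * dt + sigma * S[i] * dW

print(S[-1])  # one simulated terminal price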


pages: 571 words: 105,054

Advances in Financial Machine Learning by Marcos Lopez de Prado

algorithmic trading, Amazon Web Services, asset allocation, backtesting, behavioural economics, bioinformatics, Brownian motion, business process, Claude Shannon: information theory, cloud computing, complexity theory, correlation coefficient, correlation does not imply causation, data science, diversification, diversified portfolio, en.wikipedia.org, financial engineering, fixed income, Flash crash, G4S, Higgs boson, implied volatility, information asymmetry, latency arbitrage, margin call, market fragmentation, market microstructure, martingale, NP-complete, P = NP, p-value, paper trading, pattern recognition, performance metric, profit maximization, quantitative trading / quantitative finance, RAND corporation, random walk, risk free rate, risk-adjusted returns, risk/return, selection bias, Sharpe ratio, short selling, Silicon Valley, smart cities, smart meter, statistical arbitrage, statistical model, stochastic process, survivorship bias, transaction costs, traveling salesman

After Hosking's paper, the literature on this subject has been surprisingly scarce, adding up to eight journal articles written by only nine authors: Hosking, Johansen, Nielsen, MacKinnon, Jensen, Jones, Popiel, Cavaliere, and Taylor. See the references for details. Most of those papers relate to technical matters, such as fast algorithms for the calculation of fractional differentiation in continuous stochastic processes (e.g., Jensen and Nielsen [2014]). Differentiating the stochastic process is a computationally expensive operation. In this chapter we will take a practical, alternative, and novel approach to recover stationarity: we will generalize the difference operator to non-integer steps. 5.4 The Method Consider the backshift operator, B, applied to a matrix of real-valued features {Xt}, where B^k Xt = Xt−k for any integer k ≥ 0.
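For intuition, (1 − B)^d expands by the binomial series into weights on lagged values, with w_0 = 1 and w_k = −w_{k−1}(d − k + 1)/k. Below is a minimal sketch of that expansion (my illustration, not the book's listing):

import numpy as np

def frac_diff_weights(d, size):
    """Weights of (1 - B)**d from the binomial series:
    w_0 = 1 and w_k = -w_{k-1} * (d - k + 1) / k."""
    w = [1.0]
    for k in range(1, size):
        w.append(-w[-1] * (d - k + 1) / k)
    return np.array(w)

# Integer d reproduces the ordinary difference operator: [1, -1, 0, 0, ...]
print(frac_diff_weights(1.0, 5))
# Non-integer d keeps a slowly decaying memory of all past values
print(frac_diff_weights(0.4, 5))   # [1.0, -0.4, -0.12, -0.064, -0.0416]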

While assessing the probability of backtest overfitting is a useful tool to discard superfluous investment strategies, it would be better to avoid the risk of overfitting, at least in the context of calibrating a trading rule. In theory this could be accomplished by deriving the optimal parameters for the trading rule directly from the stochastic process that generates the data, rather than engaging in historical simulations. This is the approach we take in this chapter. Using the entire historical sample, we will characterize the stochastic process that generates the observed stream of returns, and derive the optimal values for the trading rule's parameters without requiring a historical simulation. 13.3 The Problem Suppose an investment strategy S invests in i = 1, …, I opportunities or bets.

Chapters 10 and 16 are dedicated to this station, with the understanding that it would be unreasonable for a book to reveal specific investment strategies. 1.3.1.4 Backtesters This station assesses the profitability of an investment strategy under various scenarios. One of the scenarios of interest is how the strategy would perform if history repeated itself. However, the historical path is merely one of the possible outcomes of a stochastic process, and not necessarily the most likely going forward. Alternative scenarios must be evaluated, consistent with the knowledge of the weaknesses and strengths of a proposed strategy. Team members are data scientists with a deep understanding of empirical and experimental techniques. A good backtester incorporates in his analysis meta-information regarding how the strategy came about.


pages: 153 words: 12,501

Mathematics for Economics and Finance by Michael Harrison, Patrick Waldron

Brownian motion, buy low sell high, capital asset pricing model, compound rate of return, discrete time, incomplete markets, law of one price, market clearing, Myron Scholes, Pareto efficiency, risk tolerance, riskless arbitrage, short selling, stochastic process

It can also be thought of as a vector-valued function on the sample space Ω. A stochastic process is a collection of random variables or random vectors indexed by time, e.g. {x̃t : t ∈ T } or just {x̃t } if the time interval is clear from the context. For the purposes of this part of the course, we will assume that the index set consists of just a finite number of times i.e. that we are dealing with discrete time stochastic processes. Then a stochastic process whose elements are N -dimensional random vectors is equivalent to an N |T |-dimensional random vector. The (joint) c.d.f. of a random vector or stochastic process is the natural extension of the one-dimensional concept.

When we consider consumer choice under uncertainty, consumption plans will have to specify a fixed consumption vector for each possible state of nature or state of the world. This just means that each consumption plan is a random vector. Let us review the associated concepts from basic probability theory: probability space; random variables and vectors; and stochastic processes. Let Ω denote the set of all possible states of the world, called the sample space. A collection of states of the world, A ⊆ Ω, is called an event. Let 𝒜 be a collection of events in Ω. The function P : 𝒜 → [0, 1] is a probability function if 1. (a) Ω ∈ 𝒜 (b) A ∈ 𝒜 ⇒ Ω − A ∈ 𝒜 (c) Ai ∈ 𝒜 for i = 1, …, ∞ ⇒ ∪_{i=1}^∞ Ai ∈ 𝒜 (i.e.

If there are k physical commodities, a consumption plan must specify a k-dimensional vector, x ∈ ℝᵏ, for each time and state of the world. We assume a finite number of times, denoted by the set T. The possible states of the world are denoted by the set Ω. So a consumption plan or lottery is just a collection of |T| k-dimensional random vectors, i.e. a stochastic process. Again to distinguish the certainty and uncertainty cases, we let L denote the collection of lotteries under consideration; X will now denote the set of possible values of the lotteries in L. 5.5 THE EXPECTED UTILITY PARADIGM Preferences are now described by a relation on L.


The Concepts and Practice of Mathematical Finance by Mark S. Joshi

Black-Scholes formula, Brownian motion, correlation coefficient, Credit Default Swap, currency risk, delta neutral, discrete time, Emanuel Derman, financial engineering, fixed income, implied volatility, incomplete markets, interest rate derivative, interest rate swap, London Interbank Offered Rate, martingale, millennium bug, power law, quantitative trading / quantitative finance, risk free rate, short selling, stochastic process, stochastic volatility, the market place, time value of money, transaction costs, value at risk, volatility smile, yield curve, zero-coupon bond

We shall call such a family of random variables an Ito process, or sometimes just a stochastic process. Note that if σ is identically zero, we have that

Xt+h − Xt − h μ(t, Xt)   (5.9)

is of mean and variance o(h). We have thus essentially recovered the differential equation

dXt/dt = μ(t, Xt).   (5.10)

The essential aspect of this definition is that if we know X0 and that Xt satisfies the stochastic differential equation (5.8), then Xt is fully determined. In other terms, the stochastic differential equation has a unique solution. An important corollary of this is that μ and σ together with X0 are the only quantities we need to know in order to define a stochastic process. Equally important is the issue of existence: it is not immediately obvious that a family Xt satisfying a given stochastic differential equation exists.
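To make the point that μ, σ, and X0 pin the process down, here is a hedged Euler–Maruyama sketch (the scheme and the parameter choices are mine, not the book's): the same three inputs always generate the path, and setting σ ≡ 0 collapses the scheme to an ODE solver for (5.10).

import numpy as np

def euler_maruyama(mu, sigma, x0, T=1.0, n=1_000, seed=0):
    """Simulate dX_t = mu(t, X_t) dt + sigma(t, X_t) dW_t on [0, T].
    The coefficients mu and sigma plus x0 fully determine the process."""
    rng = np.random.default_rng(seed)
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        dW = np.sqrt(dt) * rng.standard_normal()
        x[i + 1] = x[i] + mu(i * dt, x[i]) * dt + sigma(i * dt, x[i]) * dW
    return x

# sigma ≡ 0 collapses the scheme to an ODE solver for dX/dt = mu(t, X),
# i.e. equation (5.10); sigma > 0 gives a genuinely random path.
path = euler_maruyama(lambda t, x: 0.05 * x, lambda t, x: 0.2 * x, x0=1.0)
print(path[-1])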

We therefore have a very powerful alternative method for pricing options. Justifying this procedure requires an excursion into some deep and powerful mathematics. 6.4 The concept of information. Before we can proceed to a better understanding of option pricing, we need a better understanding of the nature of stochastic processes. In particular, we need to think a little more deeply about what a stochastic process is. We have talked about a continuous family of processes, Xt, such that Xt − Xs has a certain distribution. As long as we only look at a finite number of values of t and s this is conceptually fairly clear, but once we start looking at all values at once it is a lot less obvious what these statements mean.

Hence, as before arbitrage is impossible. This is still not particularly useful however, as we know that a risky asset will in general grow faster than a riskless bond on average due to the risk aversion of market participants. To get round this problem, we ask what the rate of growth means for a stochastic process. The stochastic process is determined by a probability measure on the sample space which is the space of paths. However, the definition of an arbitrage barely mentions the probability measure. All it says is that it is impossible to set up a portfolio with zero value today which has a positive probability of being of positive value in the future, and a zero probability of being of negative value.


The Volatility Smile by Emanuel Derman,Michael B.Miller

Albert Einstein, Asian financial crisis, Benoit Mandelbrot, Black Monday: stock market crash in 1987, book value, Brownian motion, capital asset pricing model, collateralized debt obligation, continuous integration, Credit Default Swap, credit default swaps / collateralized debt obligations, discrete time, diversified portfolio, dividend-yielding stocks, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial engineering, fixed income, implied volatility, incomplete markets, law of one price, London Whale, mandelbrot fractal, market bubble, market friction, Myron Scholes, prediction markets, quantitative trading / quantitative finance, risk tolerance, riskless arbitrage, Sharpe ratio, statistical arbitrage, stochastic process, stochastic volatility, transaction costs, volatility arbitrage, volatility smile, Wiener process, yield curve, zero-coupon bond

If in addition to shares of stock you can also use other options to hedge the stochastic volatility of the target option, and if you know the stochastic process for option prices (i.e., volatility) as well as stock prices, then you can hedge your option’s exposure to volatility with another option, and derive an arbitrage-free formula for the option’s value, which we will do in the following chapter. In reality, we understand the stochastic process for option prices and volatility even less well than we understand the stochastic process for stock prices (which is to say, not very well at all). In the next chapter we will nevertheless assume that we know both processes, and analyze the results.

Relying on the market price of risk or a utility function, both of which require theoretical assumptions, is less reliable than either static or dynamic hedging, but there are times when we may have to do that in order to come up with a value estimate. If, however, you can trade options, and if you know (or, rather, assume that you know) the stochastic process for volatility in addition to the stochastic process for stock prices, then you can hedge an option’s exposure to volatility with another option. By doing this, you can derive an 168 THE VOLATILITY SMILE arbitrage-free formula for option values. We will do exactly this in a later chapter. The main problem with stochastic volatility models is that we don’t really know the appropriate stochastic differential equation for volatility.

This brings us to financial science, the putative study of the fundamental laws of financial objects, be they stocks, interest rates, or whatever else your theory uses as constituents. Here, unfortunately, be dragons. Financial engineering rests upon the mathematical fields of calculus, probability theory, stochastic processes, simulation, and Brownian motion. These fields can capture some of the essential features of the uncertainty we deal with in markets, but they don’t accurately describe the characteristic behavior of financial objects. Markets are plagued with anomalies that violate standard financial theories (or, more accurately, theories are plagued by their inability to systematically account for the actual behavior of markets).


pages: 819 words: 181,185

Derivatives Markets by David Goldenberg

Black-Scholes formula, Brownian motion, capital asset pricing model, commodity trading advisor, compound rate of return, conceptual framework, correlation coefficient, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, financial engineering, financial innovation, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, law of one price, locking in a profit, London Interbank Offered Rate, Louis Bachelier, margin call, market microstructure, martingale, Myron Scholes, Norbert Wiener, Paul Samuelson, price mechanism, random walk, reserve currency, risk free rate, risk/return, riskless arbitrage, Sharpe ratio, short selling, stochastic process, stochastic volatility, time value of money, transaction costs, volatility smile, Wiener process, yield curve, zero-coupon bond, zero-sum game

Even under risk neutrality (which doesn't mean zero interest rates), the martingale requirement that Er(S1(ω)|S0) = S0 is clearly violated. Stock prices under risk neutrality are not martingales. However, they aren't very far from martingales. Definition of a Sub (Super) Martingale: 1. A discrete-time stochastic process (Xn(ω))n=0,1,2,3,… is called a sub-martingale if E(Xn) < ∞ and E(Xn+1(ω)|Xn) > Xn for all n = 0, 1, 2, 3, … 2. A discrete-time stochastic process (Xn(ω))n=0,1,2,3,… is called a super-martingale if E(Xn) < ∞ and E(Xn+1(ω)|Xn) < Xn for all n = 0, 1, 2, 3, … We expect stock prices to be sub-martingales, not martingales, for two separate and different reasons: 1. All assets, risky or not, have to provide a reward for time and waiting.
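A quick numerical check of the sub-martingale claim (my sketch, not the book's: a one-step geometric Brownian motion under the risk-neutral measure, with invented r and σ): the expected price grows at the risk-free rate, so the price itself is a sub-martingale while the discounted price is a martingale.

import numpy as np

rng = np.random.default_rng(2)
S0, r, sigma, dt = 100.0, 0.05, 0.2, 1.0   # invented values

# One-step geometric Brownian motion under the risk-neutral measure
Z = rng.standard_normal(1_000_000)
S1 = S0 * np.exp((r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * Z)

print(S1.mean(), S0 * np.exp(r * dt))   # E[S1 | S0] ≈ 105.13 > S0: sub-martingale
print((np.exp(-r * dt) * S1).mean())    # ≈ 100.00: discounted price is a martingale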

ABM is the most basic and important stochastic process in continuous time and continuous space, and it has many desirable properties including the strong Markov property, the martingale property, independent increments, normality, and continuous sample paths. Of course, here we want to focus on options pricing rather than the pure mathematical theory. The idea here is to partially prepare you for courses in mathematical finance. The details we have to leave out are usually covered in such courses. 16.1 ARITHMETIC BROWNIAN MOTION (ABM) ABM is a stochastic process {Wt(ω)}t≥0 defined on a sample space (Ω,ℑW,℘W).

If we want to take the total differential of f(x, t), we have to incorporate both ∂f/∂x, the partial derivative of f(x, t) with respect to x, and its time dimension via ∂f/∂t, the partial derivative of f(x, t) with respect to t. So far so good. Now for a huge jump. How do we take derivatives of smooth functions of stochastic processes, say F(Xt, t), where the process Xt is the solution of a stochastic differential equation (the GBM SDE) dXt = μXt dt + σXt dWt with initial value X0? We start with the observation that we can expect to end up with another stochastic process that is also the solution to another stochastic differential equation. This new stochastic differential equation for the total differential of F(Xt, t) will have a new set of drift and diffusion coefficients.


pages: 345 words: 86,394

Frequently Asked Questions in Quantitative Finance by Paul Wilmott

Abraham Wald, Albert Einstein, asset allocation, beat the dealer, Black-Scholes formula, Brownian motion, butterfly effect, buy and hold, capital asset pricing model, collateralized debt obligation, Credit Default Swap, credit default swaps / collateralized debt obligations, currency risk, delta neutral, discrete time, diversified portfolio, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial engineering, fixed income, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, iterative process, lateral thinking, London Interbank Offered Rate, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, market bubble, martingale, Myron Scholes, Norbert Wiener, Paul Samuelson, power law, quantitative trading / quantitative finance, random walk, regulatory arbitrage, risk free rate, risk/return, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, transaction costs, urban planning, value at risk, volatility arbitrage, volatility smile, Wiener process, yield curve, zero-coupon bond

Maths. 2 84-90 Hammersley, JM & Handscomb, DC 1964 Monte Carlo Methods. Methuen, London Harrison, JM & Kreps, D 1979 Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory 20 381-408 Harrison, JM & Pliska, SR 1981 Martingales and stochastic integrals in the theory of continuous trading. Stochastic Processes and their Applications 11 215-260 Haselgrove, CB 1961 A method for numerical integration. Mathematics of Computation 15 323-337 Heath, D, Jarrow, R & Morton, A 1992 Bond pricing and the term structure of interest rates: a new methodology. Econometrica 60 77-105 Ho, T & Lee, S 1986 Term structure movements and pricing interest rate contingent claims.

Figure 2-5: Certainty equivalent as a function of the risk-aversion parameter for example in the text. References and Further Reading Ingersoll, JE Jr 1987 Theory of Financial Decision Making. Rowman & Littlefield What is Brownian Motion and What are its Uses in Finance? Short Answer Brownian Motion is a stochastic process with stationary independent normally distributed increments and which also has continuous sample paths. It is the most common stochastic building block for random walks in finance. Example Pollen in water, smoke in a room, pollution in a river, are all examples of Brownian motion. And this is the common model for stock prices as well.

Long Answer Brownian motion (BM) is named after the Scottish botanist who first described the random motions of pollen grains suspended in water. The mathematics of this process were formalized by Bachelier, in an option-pricing context, and by Einstein. The mathematics of BM is also that of heat conduction and diffusion. Mathematically, BM is a continuous, stationary, stochastic process with independent normally distributed increments. If Wt is the BM at time t then for every t, τ ≥ 0, Wt+τ − Wt is independent of {Wu : 0 ≤ u ≤ t}, and has a normal distribution with zero mean and variance τ. The important properties of BM are as follows. • Finiteness: the scaling of the variance with the time step is crucial to BM remaining finite
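These defining properties are easy to verify by simulation. A minimal sketch (my illustration, with arbitrary t and τ): draw W_t and a later increment, then check that the increment has mean 0 and variance τ and is uncorrelated with the path value at time t.

import numpy as np

rng = np.random.default_rng(4)
t, tau, n = 1.0, 0.5, 200_000

# Simulate the path value at t and the later increment for many paths
W_t = np.sqrt(t) * rng.standard_normal(n)
increment = np.sqrt(tau) * rng.standard_normal(n)   # W_{t+tau} - W_t

print(increment.mean(), increment.var())            # ≈ 0 and ≈ tau
print(np.corrcoef(W_t, increment)[0, 1])            # ≈ 0: independent of the past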


pages: 206 words: 70,924

The Rise of the Quants: Marschak, Sharpe, Black, Scholes and Merton by Colin Read

Abraham Wald, Albert Einstein, Bayesian statistics, Bear Stearns, Black-Scholes formula, Bretton Woods, Brownian motion, business cycle, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, David Ricardo: comparative advantage, discovery of penicillin, discrete time, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial engineering, financial innovation, fixed income, floating exchange rates, full employment, Henri Poincaré, implied volatility, index fund, Isaac Newton, John Meriwether, John von Neumann, Joseph Schumpeter, Kenneth Arrow, Long Term Capital Management, Louis Bachelier, margin call, market clearing, martingale, means of production, moral hazard, Myron Scholes, Paul Samuelson, price stability, principal–agent problem, quantitative trading / quantitative finance, RAND corporation, random walk, risk free rate, risk tolerance, risk/return, Robert Solow, Ronald Reagan, shareholder value, Sharpe ratio, short selling, stochastic process, Thales and the olive presses, Thales of Miletus, The Chicago School, the scientific method, too big to fail, transaction costs, tulip mania, Works Progress Administration, yield curve

While he had demonstrated that an options price depends on the underlying stock price mean and volatility, and the risk-free interest rate, the overall market for interest rates is much more multi-dimensional. The interest rate yield curve, which graphs rates against maturities, depends on many markets and instruments, each of which is subject to stochastic processes. His interest and collaboration with Emanuel Derman and Bill Toy resulted in a model of interest rates that was first used profitably by Goldman Sachs through the 1980s, but eventually entered the public domain when they published their work in the Financial Analysts Journal in 1990.2 Their model provided reasonable estimates for both the prices and volatilities of treasury bonds, and is still used today.

Black-Scholes model – a model that can determine the price of a European call option based on the assumption that the underlying security follows a geometric Brownian motion with constant drift and volatility. Bond – a financial instrument that provides periodic (typically semi-annual) interest payments and the return of the paid-in capital upon maturity in exchange for a fixed price. Brownian motion – the simplest of the class of continuous-time stochastic processes that describes the random motion of a particle or a security that is buffeted by forces that are normally distributed in strength. Calculus of variations – a mathematical technique that can determine the optimal path of a variable, like savings or consumption, over time. Call – an option to purchase a specified security at a specified future time and price.

Kurtosis – a statistical measure of the distribution of observations about the expected mean as a deviation from that predicted by the normal distribution. Life cycle – the characterization of a process from its birth to death. Life Cycle Model – a model of household consumption behavior from the beginning of its earning capacity to the end of the household. Markov process – a stochastic process with the memorylessness property for which the present state, future state, and past observations are independent. Markowitz bullet – the upper boundary of the efficient frontier of various portfolios when graphed according to risk and return. Martingale – a model of a process for which past events cannot predict future outcomes.


pages: 209 words: 13,138

Empirical Market Microstructure: The Institutions, Economics and Econometrics of Securities Trading by Joel Hasbrouck

Alvin Roth, barriers to entry, business cycle, conceptual framework, correlation coefficient, discrete time, disintermediation, distributed generation, experimental economics, financial intermediation, index arbitrage, information asymmetry, interest rate swap, inventory management, market clearing, market design, market friction, market microstructure, martingale, payment for order flow, power law, price discovery process, price discrimination, quantitative trading / quantitative finance, random walk, Richard Thaler, second-price auction, selection bias, short selling, statistical model, stochastic process, stochastic volatility, transaction costs, two-sided market, ultimatum game, zero-sum game

For reasons that will be discussed shortly, the drift can be dropped in most microstructure analyses. When µ = 0, pt cannot be forecast beyond its most recent value: E[pt+1 | pt , pt−1 , . . .] = pt . A process with this property is generally described as a martingale. One definition of a martingale is a discrete stochastic process {xt } where E|xt | < ∞ for all t, and E(xt+1 | xt , xt−1 , . . . ) = xt (see Karlin and Taylor (1975) or Ross (1996)). Martingale behavior of asset prices is a classic result arising in many economic models with individual optimization, absence of arbitrage, or security market equilibrium (Cochrane (2005)).

Placing the price change first is simply an expositional simplification and carries no implications that this variable is first in any causal sense. The chapter treats the general case but uses a particular structural model for purposes of illustration. The structural model is a bivariate model of price changes and trade directions: yt = [pt qt]′. 9.1 Modeling Vector Time Series The basic descriptive statistics of a vector stochastic process {yt} are the process mean µ = E[yt] and the vector autocovariances. The vector autocovariances are defined as the matrices

Γk = E[(yt − E[yt])(yt−k − E[yt])′]   for k = …, −2, −1, 0, +1, +2, …   (9.1)

In suppressing the dependence of µ and Γk on t, we have implicitly invoked an assumption of covariance stationarity.
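As a hedged illustration of (9.1) (my sketch, not the book's code; the bivariate data-generating choices below are invented), the sample analogue of Γk for a series of price changes and trade directions can be computed as:

import numpy as np

def vector_autocov(y, k):
    """Sample analogue of Gamma_k = E[(y_t - mu)(y_{t-k} - mu)'] for a
    (T x m) array y, under covariance stationarity."""
    yc = y - y.mean(axis=0)
    if k == 0:
        return yc.T @ yc / len(yc)
    return yc[k:].T @ yc[:-k] / (len(yc) - k)

# Invented bivariate example: price changes respond to trade directions
rng = np.random.default_rng(6)
q = rng.choice([-1.0, 1.0], size=10_000)               # trade direction
dp = 0.5 * q + 0.1 * rng.standard_normal(10_000)       # price change
y = np.column_stack([dp, q])

print(vector_autocov(y, 0))   # Gamma_0: contemporaneous covariance matrix
print(vector_autocov(y, 1))   # Gamma_1: first-order vector autocovariance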

If at time t, pt ≥ Lt , then the agent has effectively submitted a marketable limit order, which achieves immediate execution. A limit order priced at Lt < pt will be executed during period t if pτ ≤ Lt for any time t < τ < t + 1. The situation is depicted in figure 15.2. A limit order priced at Lt executes if the stock price follows path B but not path A. This is a standard problem in stochastic processes, and many exact results are available. The diffusion-barrier notion of execution is at best a first approximation. In many markets, a buy limit order might be executed by a market (or marketable) sell order while the best ask is still well above the limit price. We will subsequently generalize the execution mechanism to allow this.


pages: 855 words: 178,507

The Information: A History, a Theory, a Flood by James Gleick

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, bank run, bioinformatics, Bletchley Park, Brownian motion, butterfly effect, Charles Babbage, citation needed, classic study, Claude Shannon: information theory, clockwork universe, computer age, Computing Machinery and Intelligence, conceptual framework, crowdsourcing, death of newspapers, discovery of DNA, Donald Knuth, double helix, Douglas Hofstadter, en.wikipedia.org, Eratosthenes, Fellow of the Royal Society, Gregor Mendel, Gödel, Escher, Bach, Henri Poincaré, Honoré de Balzac, index card, informal economy, information retrieval, invention of the printing press, invention of writing, Isaac Newton, Jacquard loom, Jaron Lanier, jimmy wales, Johannes Kepler, John von Neumann, Joseph-Marie Jacquard, Lewis Mumford, lifelogging, Louis Daguerre, machine translation, Marshall McLuhan, Menlo Park, microbiome, Milgram experiment, Network effects, New Journalism, Norbert Wiener, Norman Macrae, On the Economy of Machinery and Manufactures, PageRank, pattern recognition, phenotype, Pierre-Simon Laplace, pre–internet, quantum cryptography, Ralph Waldo Emerson, RAND corporation, reversible computing, Richard Feynman, Rubik’s Cube, Simon Singh, Socratic dialogue, Stephen Hawking, Steven Pinker, stochastic process, talking drums, the High Line, The Wisdom of Crowds, transcontinental railway, Turing machine, Turing test, women in the workforce, yottabyte

.♦ To illuminate the structure of the message Shannon turned to some methodology and language from the physics of stochastic processes, from Brownian motion to stellar dynamics. (He cited a landmark 1943 paper by the astrophysicist Subrahmanyan Chandrasekhar in Reviews of Modern Physics.♦) A stochastic process is neither deterministic (the next event can be calculated with certainty) nor random (the next event is totally free). It is governed by a set of probabilities. Each event has a probability that depends on the state of the system and perhaps also on its previous history. If for event we substitute symbol, then a natural written language like English or Chinese is a stochastic process. So is digitized speech; so is a television signal.
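
Shannon's point is easy to demonstrate in code: treat text as a process in which each word's probability depends on the current state. A minimal word-level Markov sketch (the sample sentence is, of course, made up):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Word-level Markov chain: the probability of the next word depends on
    the current state (the last `order` words), a stochastic process in
    Shannon's sense, neither deterministic nor purely random."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, n_words=10, seed=3):
    random.seed(seed)
    order = len(next(iter(chain)))
    out = list(random.choice(list(chain)))
    for _ in range(n_words):
        out.append(random.choice(chain.get(tuple(out[-order:]), ['.'])))
    return ' '.join(out)

sample = "the cat sat on the mat and the dog sat on the cat"
print(generate(build_chain(sample)))
```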

“It is true,” he said, “that Shannon left to his successors the rigorous ‘justification’ of his ideas in some difficult cases. However, his mathematical intuition was amazingly precise.” Kolmogorov was not as enthusiastic about cybernetics. Norbert Wiener felt a kinship with him—they had both done early work on stochastic processes and Brownian motion. On a visit to Moscow, Wiener said, “When I read the works of Academician Kolmogorov, I feel that these are my thoughts as well, this is what I wanted to say. And I know that Academician Kolmogorov has the same feeling when reading my works.”♦ But the feeling was evidently not shared.



pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable by James Owen Weatherall

Alan Greenspan, Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Apollo 11, Asian financial crisis, bank run, Bear Stearns, beat the dealer, behavioural economics, Benoit Mandelbrot, Black Monday: stock market crash in 1987, Black Swan, Black-Scholes formula, Bonfire of the Vanities, book value, Bretton Woods, Brownian motion, business cycle, butterfly effect, buy and hold, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, coastline paradox / Richardson effect, collateralized debt obligation, collective bargaining, currency risk, dark matter, Edward Lorenz: Chaos theory, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial engineering, financial innovation, Financial Modelers Manifesto, fixed income, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, Jim Simons, John Nash: game theory, junk bonds, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, Market Wizards by Jack D. Schwager, martingale, Michael Milken, military-industrial complex, Myron Scholes, Neil Armstrong, new economy, Nixon triggered the end of the Bretton Woods system, Paul Lévy, Paul Samuelson, power law, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk free rate, risk-adjusted returns, Robert Gordon, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, Stuart Kauffman, The Chicago School, The Myth of the Rational Market, tulip mania, Vilfredo Pareto, volatility smile

Although discussing such debates is far from the scope of this book, I should note that the arguments offered here for how one should think of the status of mathematical models in finance are closely connected to more general discussions concerning the status of mathematical or physical theories quite generally. “. . . named after Scottish botanist Robert Brown . . .”: Brown’s observations were published as Brown (1828). “The mathematical treatment of Brownian motion . . .”: More generally, Brownian motion is an example of a random or “stochastic” process. For an overview of the mathematics of stochastic processes, see Karlin and Taylor (1975, 1981). “. . . it was his 1905 paper that caught Perrin’s eye”: Einstein published four papers in 1905. One of them was the one I refer to here (Einstein 1905b), but the other three were equally remarkable. In Einstein (1905a), he first suggests that light comes in discrete packets, now called quanta or photons; in Einstein (1905c), he introduces his special theory of relativity; and in Einstein (1905d), he proposes the famous equation e = mc2

“The Predictors by Thomas A. Bass: A Retrospective.” This is a comment on The Predictors by a former employee of the Prediction Company. Available at http://www.bearcave.com/bookrev/predictors2.html. Karlin, Samuel, and Howard M. Taylor. 1975. A First Course in Stochastic Processes. 2nd ed. San Diego, CA: Academic Press. — — — . 1981. A Second Course in Stochastic Processes. San Diego, CA: Academic Press. Katzmann, Robert A. 2008. Daniel Patrick Moynihan: The Intellectual in Public Life. Washington, DC: Woodrow Wilson Center Press. Kelly, J., Jr. 1956. “A New Interpretation of Information Rate.” IRE Transactions on Information Theory 2 (3, September): 185–89.

Journal of Economic Perspectives 12 (1, Winter): 3–26. Bosworth, Barry P. 1997. “The Politics of Immaculate Conception.” The Brookings Review, June, 43–44. Bouchaud, Jean-Philippe, and Didier Sornette. 1994. “The Black-Scholes Option Pricing Problem in Mathematical Finance: Generalization and Extensions for a Large Class of Stochastic Processes.” Journal de Physique 4 (6): 863–81. Bower, Tom. 1984. Klaus Barbie, Butcher of Lyons. London: M. Joseph. Bowman, D. D., G. Ouillion, C. G. Sammis, A. Sornette, and D. Sornette. 1998. “An Observational Test of the Critical Earthquake Concept.” Journal of Geophysical Research 103: 24359–72.


pages: 695 words: 194,693

Money Changes Everything: How Finance Made Civilization Possible by William N. Goetzmann

Albert Einstein, Andrei Shleifer, asset allocation, asset-backed security, banking crisis, Benoit Mandelbrot, Black Swan, Black-Scholes formula, book value, Bretton Woods, Brownian motion, business cycle, capital asset pricing model, Cass Sunstein, classic study, collective bargaining, colonial exploitation, compound rate of return, conceptual framework, Cornelius Vanderbilt, corporate governance, Credit Default Swap, David Ricardo: comparative advantage, debt deflation, delayed gratification, Detroit bankruptcy, disintermediation, diversified portfolio, double entry bookkeeping, Edmond Halley, en.wikipedia.org, equity premium, equity risk premium, financial engineering, financial independence, financial innovation, financial intermediation, fixed income, frictionless, frictionless market, full employment, high net worth, income inequality, index fund, invention of the steam engine, invention of writing, invisible hand, James Watt: steam engine, joint-stock company, joint-stock limited liability company, laissez-faire capitalism, land bank, Louis Bachelier, low interest rates, mandelbrot fractal, market bubble, means of production, money market fund, money: store of value / unit of account / medium of exchange, moral hazard, Myron Scholes, new economy, passive investing, Paul Lévy, Ponzi scheme, price stability, principal–agent problem, profit maximization, profit motive, public intellectual, quantitative trading / quantitative finance, random walk, Richard Thaler, Robert Shiller, shareholder value, short selling, South Sea Bubble, sovereign wealth fund, spice trade, stochastic process, subprime mortgage crisis, Suez canal 1869, Suez crisis 1956, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, time value of money, tontine, too big to fail, trade liberalization, trade route, transatlantic slave trade, tulip mania, wage slave

Mandelbrot was a student of Paul Lévy’s—the son of the man who gave Bachelier bad marks at his examination at the École Polytechnique in 1900. Lévy’s research focused on “stochastic processes”: mathematical models that describe the behavior of some variable through time. For example, we saw in Chapter 15 that Jules Regnault proposed and tested a stochastic process that varied randomly, which resulted in a rule about risk increasing with the square root of time. Likewise, Louis Bachelier more formally developed a random-walk stochastic process. Paul Lévy formalized these prior random walk models into a very general family of stochastic processes referred to as Lévy processes. Brownian motion was just one process in the family of Lévy processes—and perhaps the best behaved of them.

Other stochastic processes have such things as discontinuous jumps and unusually large shocks (which might, for example, explain the crash of 1987, when the US stock market lost 22.6% of its value in a single day). In the 1960s, Benoit Mandelbrot began to investigate whether Lévy processes described economic time series like cotton prices and stock prices. He found that the ones that generated jumps and extreme events better described financial markets. He developed a mathematics around these unusual Lévy processes that he called “fractal geometry.”
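
A minimal sketch of the contrast, assuming a simple Merton-style jump diffusion with hypothetical parameters: Poisson-arriving jumps added to Brownian increments produce the fat-tailed moves that pure Brownian motion lacks.

```python
import numpy as np

rng = np.random.default_rng(4)

def jump_diffusion_path(T=1.0, n=1_000, sigma=0.2, lam=5.0, jump_scale=0.05):
    """Brownian increments plus Poisson-arriving jumps: the jumps supply
    the discontinuities and extreme moves that Brownian motion cannot."""
    dt = T / n
    diffusive = sigma * np.sqrt(dt) * rng.normal(size=n)
    n_jumps = rng.poisson(lam * dt, size=n)      # almost always 0, rarely 1+
    jumps = jump_scale * rng.normal(size=n) * n_jumps
    return np.cumsum(diffusive + jumps)

increments = np.diff(jump_diffusion_path())
# Positive excess kurtosis flags the fat tails absent from pure diffusion.
k = np.mean(increments**4) / np.var(increments)**2 - 3
print(f"excess kurtosis: {k:.1f}")
```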

One of his major contributions to the literature on finance (published in 1966) was a proof that an efficient market implies that stock prices may not follow a random walk, but that they must be unpredictable. It was a nice refinement of Regnault’s hypothesis articulated almost precisely a century prior. Although Mandelbrot ultimately developed a fractal-based option-pricing model with two of his students that allowed for extreme events and a more general stochastic process, for various reasons Mandelbrot never saw it adopted in practice to any great extent. I suspect that this is because the solution, while potentially useful, is complicated and contradicts most other tools that quantitative financiers use. With Mandelbrot’s models, it is all or nothing. You have to take a leap beyond the world of Brownian motion and throw out old friends like Bernoulli’s law of large numbers.


High-Frequency Trading by David Easley, Marcos López de Prado, Maureen O'Hara

algorithmic trading, asset allocation, backtesting, Bear Stearns, Brownian motion, capital asset pricing model, computer vision, continuous double auction, dark matter, discrete time, finite state, fixed income, Flash crash, High speed trading, index arbitrage, information asymmetry, interest rate swap, Large Hadron Collider, latency arbitrage, margin call, market design, market fragmentation, market fundamentalism, market microstructure, martingale, National best bid and offer, natural language processing, offshore financial centre, pattern recognition, power law, price discovery process, price discrimination, price stability, proprietary trading, quantitative trading / quantitative finance, random walk, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, Tobin tax, transaction costs, two-sided market, yield curve

Pedersen, 2005, “Predatory Trading”, Journal of Finance 60(4), pp. 1825–63. Carlin, B., M. Sousa Lobo and S. Viswanathan, 2007, “Episodic Liquidity Crises: Cooperative and Predatory Trading”, Journal of Finance 62(5), pp. 2235–74. Clark, P. K., 1970, “A Subordinated Stochastic Process Model of Cotton Futures Prices”, PhD Dissertation, Harvard University. Clark, P. K., 1973, “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices”, Econometrica 41(1), pp. 135–55. Donefer, B. S., 2010, “Algos Gone Wild: Risk in the World of Automated Trading Strategies”, The Journal of Trading 5, pp. 31–4. Easley, D., N. Kiefer, M.

O’Hara, 2012, “The Impact of Computer Trading on Liquidity, Price Efficiency/Discovery and Transactions Costs”, in Foresight: The Future of Computer Trading in Financial Markets. An International Perspective, Final Project Report. The Government Office for Science, London. Mandelbrot, B., 1973, “Comments on ‘A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices’ by Peter K. Clark”, Econometrica 41(1), pp. 157–59. Mandelbrot, B., and M. Taylor, 1967, “On the Distribution of Stock Price Differences”, Operations Research 15(6), pp. 1057–62. NANEX, 2010, “Analysis of the ‘Flash Crash’”, June 18. URL: http://www.nanex.net/20100506/FlashCrashAnalysis_CompleteText.html.

The aforementioned papers seek to solve problems similar to ours, ie, to execute a certain number of shares over some fixed period as cheaply as possible, but approach it from another direction. They typically start with an assumption that the underlying “true” stock price is generated by some known stochastic process. There is also a known impact function that specifies how arriving liquidity demand pushes market prices away from this true value. Having this information, as well as time and volume constraints, it is then possible to compute the optimal strategy explicitly. This can be done either in closed form or numerically (often using dynamic programming, the basis of reinforcement learning).
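
A toy version of the cost side of such a problem, assuming a hypothetical linear temporary-impact function and a driftless true price (so the expected cost is the impact cost alone); this is a sketch of the setup, not any one paper's model.

```python
import numpy as np

def expected_cost(schedule, eta=0.1):
    """Expected impact cost of an execution schedule under a hypothetical
    linear temporary-impact rule: a slice of v shares moves the price by
    eta * v, so the slice costs eta * v**2. With a driftless true price,
    the expected cost is the impact cost alone."""
    schedule = np.asarray(schedule, dtype=float)
    return eta * np.sum(schedule**2)

total = 10_000
print(expected_cost([total]))            # one block trade
print(expected_cost([total / 10] * 10))  # ten equal slices
```

Because the impact cost is convex in slice size, splitting the order evenly cuts the quadratic cost by the number of slices; adding urgency constraints or price risk, as the papers above do, is what makes the optimal schedule nontrivial.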


Monte Carlo Simulation and Finance by Don L. McLeish

algorithmic bias, Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, discrete time, distributed generation, finite state, frictionless, frictionless market, implied volatility, incomplete markets, invention of the printing press, martingale, p-value, random walk, risk free rate, Sharpe ratio, short selling, stochastic process, stochastic volatility, survivorship bias, the market place, transaction costs, value at risk, Wiener process, zero-coupon bond, zero-sum game

[Figure 2.6: A sample path of the Wiener process] Models in Continuous Time We begin with some oversimplified rules of stochastic calculus which can be omitted by those with a background in Brownian motion and diffusion. First, we define a stochastic process W_t called the standard Brownian motion or Wiener process having the following properties: 1. For each h > 0, the increment W(t+h) − W(t) has a N(0, h) distribution and is independent of all preceding increments W(u) − W(v), t > u > v > 0. 2. W(0) = 0. The fact that such a process exists is by no means easy to see.
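
Property 1 translates directly into a simulation recipe; a minimal Python sketch (the step size and horizon are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)

# Build a Wiener path from the definition: W(0) = 0 and independent
# increments W(t+h) - W(t) ~ N(0, h).
T, n = 9.0, 900
h = T / n
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(h), size=n))])
t = np.linspace(0.0, T, n + 1)            # time grid, e.g. for plotting
print(W[-1], np.var(np.diff(W)) / h)      # increment variance / h is ~ 1
```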

And when the drift term a(X_t, t) is linear in X_t, the solution of an ordinary differential equation will allow the calculation of the expected value of the process, and this is the first and most basic description of its behaviour. The appendix provides an elementary review of techniques for solving partial and ordinary differential equations. However, the information about a stochastic process obtained from a deterministic object such as an ordinary or partial differential equation is necessarily limited. For example, while we can sometimes obtain the marginal distribution of the process at time t, it is more difficult to obtain quantities such as the joint distribution of variables which depend on the path of the process, and these are important in valuing certain types of exotic options such as lookback and barrier options.

Solving deterministic differential equations can sometimes provide a solution to a specific problem such as finding the arbitrage-free price of a derivative. In general, for more complex features of the derivative such as the distribution of the return, important for considerations such as the Value at Risk, we need to obtain a solution {X_t, 0 < t < T} to an equation of the above form which is a stochastic process. Typically this can only be done by simulation. One of the simplest methods of simulating such a process is motivated through a crude interpretation of the above equation in terms of discrete time steps: a small increment X_{t+h} − X_t in the process is approximately normally distributed with mean given by a(X_t, t)h and variance given by σ²(X_t, t)h.
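
That crude interpretation is the Euler–Maruyama scheme; a minimal sketch, with hypothetical geometric-Brownian drift and diffusion functions passed in:

```python
import numpy as np

rng = np.random.default_rng(6)

def euler_maruyama(a, sigma, x0, T=1.0, n=1_000):
    """Simulate dX = a(X, t) dt + sigma(X, t) dW by drawing each increment
    X_{t+h} - X_t as N(a(X_t, t) h, sigma(X_t, t)^2 h)."""
    h = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        t = i * h
        x[i + 1] = x[i] + a(x[i], t) * h + sigma(x[i], t) * np.sqrt(h) * rng.normal()
    return x

# Example: geometric Brownian motion, a(x, t) = mu * x, sigma(x, t) = s * x.
path = euler_maruyama(lambda x, t: 0.05 * x, lambda x, t: 0.2 * x, x0=100.0)
print(path[-1])
```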


Mathematics for Finance: An Introduction to Financial Engineering by Marek Capinski, Tomasz Zastawniak

Black-Scholes formula, Brownian motion, capital asset pricing model, cellular automata, delta neutral, discounted cash flows, discrete time, diversified portfolio, financial engineering, fixed income, interest rate derivative, interest rate swap, locking in a profit, London Interbank Offered Rate, margin call, martingale, quantitative trading / quantitative finance, random walk, risk free rate, short selling, stochastic process, time value of money, transaction costs, value at risk, Wiener process, zero-coupon bond

(The latter is the same as for the par bond.) Expectation with respect to the risk-neutral probability gives the initial bond price 100.05489, so the floor is worth 0.05489. Bibliography Background Reading: Probability and Stochastic Processes Ash, R. B. (1970), Basic Probability Theory, John Wiley & Sons, New York. Brzeźniak, Z. and Zastawniak, T. (1999), Basic Stochastic Processes, Springer Undergraduate Mathematics Series, Springer-Verlag, London. Capiński, M. and Kopp, P. E. (1999), Measure, Integral and Probability, Springer Undergraduate Mathematics Series, Springer-Verlag, London. Capiński, M. and Zastawniak, T. (2001), Probability Through Problems, Springer-Verlag, New York.

Toland University of Bath Other books in this series A First Course in Discrete Mathematics I. Anderson Analytic Methods for Partial Differential Equations G. Evans, J. Blackledge, P. Yardley Applied Geometry for Computer Graphics and CAD D. Marsh Basic Linear Algebra, Second Edition T.S. Blyth and E.F. Robertson Basic Stochastic Processes Z. Brzeźniak and T. Zastawniak Elementary Differential Geometry A. Pressley Elementary Number Theory G.A. Jones and J.M. Jones Elements of Abstract Analysis M. Ó Searcóid Elements of Logic via Numbers and Sets D.L. Johnson Essential Mathematical Biology N.F. Britton Fields, Flows and Waves: An Introduction to Continuum Models D.F.


pages: 407 words: 104,622

The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution by Gregory Zuckerman

affirmative action, Affordable Care Act / Obamacare, Alan Greenspan, Albert Einstein, Andrew Wiles, automated trading system, backtesting, Bayesian statistics, Bear Stearns, beat the dealer, behavioural economics, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, Black Monday: stock market crash in 1987, blockchain, book value, Brownian motion, butter production in bangladesh, buy and hold, buy low sell high, Cambridge Analytica, Carl Icahn, Claude Shannon: information theory, computer age, computerized trading, Credit Default Swap, Daniel Kahneman / Amos Tversky, data science, diversified portfolio, Donald Trump, Edward Thorp, Elon Musk, Emanuel Derman, endowment effect, financial engineering, Flash crash, George Gilder, Gordon Gekko, illegal immigration, index card, index fund, Isaac Newton, Jim Simons, John Meriwether, John Nash: game theory, John von Neumann, junk bonds, Loma Prieta earthquake, Long Term Capital Management, loss aversion, Louis Bachelier, mandelbrot fractal, margin call, Mark Zuckerberg, Michael Milken, Monty Hall problem, More Guns, Less Crime, Myron Scholes, Naomi Klein, natural language processing, Neil Armstrong, obamacare, off-the-grid, p-value, pattern recognition, Peter Thiel, Ponzi scheme, prediction markets, proprietary trading, quantitative hedge fund, quantitative trading / quantitative finance, random walk, Renaissance Technologies, Richard Thaler, Robert Mercer, Ronald Reagan, self-driving car, Sharpe ratio, Silicon Valley, sovereign wealth fund, speech recognition, statistical arbitrage, statistical model, Steve Bannon, Steve Jobs, stochastic process, the scientific method, Thomas Bayes, transaction costs, Turing machine, Two Sigma

Members of Axcom’s team viewed investing through a math prism and understood financial markets to be complicated and evolving, with behavior that is difficult to predict, at least over long stretches—just like a stochastic process. It’s easy to see why they saw similarities between stochastic processes and investing. For one thing, Simons, Ax, and Straus didn’t believe the market was truly a “random walk,” or entirely unpredictable, as some academics and others argued. Though it clearly had elements of randomness, much like the weather, mathematicians like Simons and Ax would argue that a probability distribution could capture futures prices as well as any other stochastic process. That’s why Ax thought employing such a mathematical representation could be helpful to their trading models.


pages: 571 words: 124,448

Building Habitats on the Moon: Engineering Approaches to Lunar Settlements by Haym Benaroya

3D printing, anti-fragile, Apollo 11, Apollo 13, biofilm, Black Swan, Brownian motion, Buckminster Fuller, carbon-based life, centre right, clean water, Colonization of Mars, Computer Numeric Control, conceptual framework, data acquisition, dual-use technology, Elon Musk, fault tolerance, Gene Kranz, gravity well, inventory management, Johannes Kepler, low earth orbit, Neil Armstrong, orbital mechanics / astrodynamics, performance metric, RAND corporation, restrictive zoning, risk tolerance, Ronald Reagan, stochastic process, tacit knowledge, telepresence, telerobotics, the scientific method, Two Sigma, urban planning, Virgin Galactic, X Prize, zero-sum game

A measure of the reliability (probability of failure) is given by the overlapped area, shown hatched. The random variable is a static property – the shape of the density function does not change with time. Where the density function is time-dependent, the variable is called a random, or stochastic, process. Before examining some commonly used densities, we define an averaging procedure known as the mathematical expectation for probabilistic variables. 10.3 Mathematical Expectation The single most important descriptor of a random variable is its mean or expected value. This defines the most likely value of a variable.
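
The overlapped-area idea can be made concrete with a stress-strength sketch, assuming hypothetical independent normal load and capacity (all numbers illustrative):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Failure occurs when the random load exceeds the random capacity; the
# probability corresponds to the overlap of the two density functions.
load = rng.normal(100.0, 15.0, size=1_000_000)      # hypothetical load
capacity = rng.normal(150.0, 20.0, size=1_000_000)  # hypothetical capacity
p_fail = np.mean(load > capacity)

# Exact value for independent normals: load - capacity ~ N(-50, 15^2 + 20^2).
print(p_fail, norm.cdf(-50.0 / np.sqrt(15**2 + 20**2)))
```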

Our reliability estimates are guesses about the future, not extrapolations from past data. More on this later. Now that we have an understanding of the autocorrelation, we proceed to study its Fourier transform, the spectral density. 10.6 Power Spectrum A measure of the ‘energy’ of the stochastic process X(t) is given by its power spectrum, or spectral density, S_XX(ω), which is the Fourier transform of its autocorrelation function: S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ, and thus: R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{iωτ} dω. (10.14) These equations are known as the Wiener-Khintchine formulas. Since R_XX(−τ) = R_XX(τ), S_XX(ω) is not a complex function but a real and even function.
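
The transform pair can be verified numerically; the sketch below checks, for a sample of coloured noise, that the Fourier transform of the (circular) sample autocovariance matches the periodogram:

```python
import numpy as np

rng = np.random.default_rng(8)

n = 4096
x = rng.normal(size=n)
x = np.convolve(x, np.ones(8) / 8, mode='same')  # coloured (correlated) noise
x -= x.mean()

# Periodogram: squared magnitude of the signal's Fourier transform.
periodogram = np.abs(np.fft.fft(x))**2 / n
# Circular sample autocovariance, computed via the FFT...
acov = np.fft.ifft(np.abs(np.fft.fft(x))**2).real / n
# ...whose Fourier transform recovers the power spectrum.
spectrum = np.fft.fft(acov).real
print(np.allclose(periodogram, spectrum))        # True up to rounding
```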

Footnotes 1 From the Greek we also have the stochastic (στόχος) process. 2 An axiom is a rule that is assumed to be true, and upon which further rules and facts are deduced. For engineering, the deduced facts must conform to reality. An excellent book on the basics of probabilistic modeling is Probability, Random Variables, and Stochastic Processes, A. Papoulis, McGraw-Hill, 1965. © Springer International Publishing AG 2018 Haym Benaroya, Building Habitats on the Moon, Springer Praxis Books, https://doi.org/10.1007/978-3-319-68244-0_11 11. Reliability and damage Haym Benaroya1 (1) Professor of Mechanical & Aerospace Engineering, Rutgers University, New Brunswick, New Jersey, USA “We need to make sure it survives for a while.”


Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing (Writing Science) by Thierry Bardini

Apple II, augmented reality, Bill Duvall, Charles Babbage, classic study, Compatible Time-Sharing System, Computing Machinery and Intelligence, conceptual framework, Donald Davies, Douglas Engelbart, Douglas Engelbart, Dynabook, experimental subject, Grace Hopper, hiring and firing, hypertext link, index card, information retrieval, invention of hypertext, Ivan Sutherland, Jaron Lanier, Jeff Rulifson, John von Neumann, knowledge worker, Leonard Kleinrock, Menlo Park, military-industrial complex, Mother of all demos, Multics, new economy, Norbert Wiener, Norman Mailer, packet switching, Project Xanadu, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, RFC: Request For Comment, Sapir-Whorf hypothesis, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, stochastic process, Ted Nelson, the medium is the message, theory of mind, Turing test, unbiased observer, Vannevar Bush, Whole Earth Catalog, work culture

In the conceptual world, both the transmission and the transformation of what Whorf called "culturally ordained forms and categories" is the process by which people learn. The crucial point in Bateson's synthesis lay in the characterization of all such processes as "stochastic": Both genetic change and the process called learning (including the somatic changes induced by the environment) are stochastic processes. In each case there is, I believe, a stream of events that is random in certain aspects and in each case there is a nonrandom selective process which causes certain of the random components to "survive" longer than others. Without the random, there can be no new thing. . . . We face, then, two great stochastic systems that are partly in interaction and partly isolated from each other.

And things have been this way ever since" (ibid., 336).6 Thus, it was by institutionalization as an incorporating practice that the QWERTY standard became established. The establishment of a commercial education network favoring the QWERTY was the decisive factor, the source of the "historical accident" that governed the stochastic process that secured forever the supremacy of the QWERTY. It is indeed because of such an "accident" that the six or seven years during which Remington enjoyed the early advantage of being the sole owner of the typewriter patent also saw its selling agents establish profitable and durable business associations with the commercial education business.



How I Became a Quant: Insights From 25 of Wall Street's Elite by Richard R. Lindsey, Barry Schachter

Albert Einstein, algorithmic trading, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, asset allocation, asset-backed security, backtesting, bank run, banking crisis, Bear Stearns, Black-Scholes formula, Bob Litterman, Bonfire of the Vanities, book value, Bretton Woods, Brownian motion, business cycle, business process, butter production in bangladesh, buy and hold, buy low sell high, capital asset pricing model, centre right, collateralized debt obligation, commoditize, computerized markets, corporate governance, correlation coefficient, creative destruction, Credit Default Swap, credit default swaps / collateralized debt obligations, currency manipulation / currency intervention, currency risk, discounted cash flows, disintermediation, diversification, Donald Knuth, Edward Thorp, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial engineering, financial innovation, fixed income, full employment, George Akerlof, global macro, Gordon Gekko, hiring and firing, implied volatility, index fund, interest rate derivative, interest rate swap, Ivan Sutherland, John Bogle, John von Neumann, junk bonds, linear programming, Loma Prieta earthquake, Long Term Capital Management, machine readable, margin call, market friction, market microstructure, martingale, merger arbitrage, Michael Milken, Myron Scholes, Nick Leeson, P = NP, pattern recognition, Paul Samuelson, pensions crisis, performance metric, prediction markets, profit maximization, proprietary trading, purchasing power parity, quantitative trading / quantitative finance, QWERTY keyboard, RAND corporation, random walk, Ray Kurzweil, Reminiscences of a Stock Operator, Richard Feynman, Richard Stallman, risk free rate, risk-adjusted returns, risk/return, seminal paper, shareholder value, Sharpe ratio, short selling, Silicon Valley, six sigma, sorting algorithm, statistical arbitrage, statistical model, stem cell, Steven Levy, stochastic process, subscription business, systematic trading, technology bubble, The Great Moderation, the scientific method, too big to fail, trade route, transaction costs, transfer pricing, value at risk, volatility smile, Wiener process, yield curve, young professional

For starters, after years of specializing in pure mathematics, I was starting from scratch in a totally new area. It allowed me to start to learn basic mathematics instead of delving deeper and deeper into advanced subjects. I literally had to start from scratch and learn probability theory and then the basics of stochastic processes, things I knew nothing at all about. Not to mention I knew nothing about financial markets, derivatives, or anything at all to do with finance. It was exciting to learn so much from scratch. In the midst of reading about Black-Scholes, I was also deeply involved with writing the book with Victor Ginzburg from the University of Chicago.

This represents some of the best academic advice I have ever received since I am not sure that I would have immediately realized the model’s importance and potential for further work by myself. The rest, in some sense, is history. I really enjoyed the paper because I was struggling to understand some of the rather abstract questions in stochastic process theory that it dealt with, and I quickly decided to work on the HJM model for my dissertation. Broadly speaking, the HJM paradigm still represents the state of the art in interest rate derivatives pricing, so having been working with it from the very beginning is definitely high on my list of success factors later in life.

At Columbia College, I decided to enroll in its three-two program, which meant that I spent three years studying the contemporary civilization and humanities core curriculum, as well as the hard sciences, and then two years at the Columbia School of Engineering. There, I found a home in operations research, which allowed me to study computer science and applied mathematics, including differential equations, stochastic processes, statistical quality control, and mathematical programming. While studying for my master’s in operations research at Columbia, I had the opportunity to work at the Rand Institute, where math and computer science were applied to real-world problems. There I was involved in developing a large-scale simulation model designed to optimize response times for the New York City Fire Department.


Commodity Trading Advisors: Risk, Performance Analysis, and Selection by Greg N. Gregoriou, Vassilios Karavas, François-Serge Lhabitant, Fabrice Douglas Rouah

Asian financial crisis, asset allocation, backtesting, buy and hold, capital asset pricing model, collateralized debt obligation, commodity trading advisor, compound rate of return, constrained optimization, corporate governance, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, currency risk, discrete time, distributed generation, diversification, diversified portfolio, dividend-yielding stocks, financial engineering, fixed income, global macro, high net worth, implied volatility, index arbitrage, index fund, interest rate swap, iterative process, linear programming, London Interbank Offered Rate, Long Term Capital Management, managed futures, market fundamentalism, merger arbitrage, Mexican peso crisis / tequila crisis, p-value, Pareto efficiency, Performance of Mutual Funds in the Period, Ponzi scheme, proprietary trading, quantitative trading / quantitative finance, random walk, risk free rate, risk-adjusted returns, risk/return, selection bias, Sharpe ratio, short selling, stochastic process, survivorship bias, systematic trading, tail risk, technology bubble, transaction costs, value at risk, zero-sum game

Faff and Hallahan (2001) argue that survivorship bias is more likely to cause performance reversals than performance persistence. The data used show considerable kurtosis (see Table 3.1). However, this kurtosis may be caused by heteroskedasticity (returns of some funds are more variable than others). REGRESSION TEST OF PERFORMANCE PERSISTENCE To measure performance persistence, a model of the stochastic process that generates returns is required. The process considered is: r_it = α_i + β_i r̄_t + ε_it, ε_it ~ N(0, σ_i²), i = 1, …, n and t = 1, …, T (3.1) where r_it = return of fund (or CTA) i in month t, r̄_t = average fund returns in month t, and the slope parameter β_i captures differences in leverage. The model allows each fund to have a different variance, which is consistent with past research.

MONTE CARLO STUDY In their method, EGR ranked funds by their mean return or modified Sharpe ratio in a first period, and then determined whether the funds that ranked high in the first period also ranked high in the second period. We use Monte Carlo simulation to determine the power and size of hypothesis tests with EGR’s method when data follow the stochastic process given in equation 3.1. Data were generated by specifying values of α, β, and σ. The simulation used 1,000 replications and 120 simulated funds. The mean return over all funds, r̄_t, is derived from the values of α and β as: r̄_t = (Σα_i/n + Σε_it/n) / (1 − Σβ_i/n) where all sums are from i = 1 to n.
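
One replication of such a design in Python; the distributional choices for α, β, and σ below are hypothetical stand-ins, and the last line checks the displayed identity for r̄_t:

```python
import numpy as np

rng = np.random.default_rng(9)

n, T = 120, 120                                   # funds and months
alpha = rng.normal(0.005, 0.002, size=n)          # hypothetical intercepts
beta = rng.normal(0.8, 0.2, size=n)               # hypothetical leverages
sigma = rng.uniform(0.01, 0.05, size=n)           # heteroskedastic funds
eps = rng.normal(0.0, sigma[:, None], size=(n, T))

# r_bar solves the cross-sectional average of equation (3.1) for itself.
r_bar = (alpha.mean() + eps.mean(axis=0)) / (1.0 - beta.mean())
r = alpha[:, None] + beta[:, None] * r_bar[None, :] + eps
print(np.allclose(r.mean(axis=0), r_bar))         # True: the identity holds
```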

M584/10M/1299. www.cve.com. Christoffersen, P. (2003) Elements of Financial Risk Management. San Diego, CA: Academic Press. Chung, S. Y. (1999) “Portfolio Risk Measurement: A Review of Value at Risk.” Journal of Alternative Investments, Vol. 2, No. 1, pp. 34–42. Clark, P. K. (1973) “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices.” Econometrica, Vol. 41, No. 1, pp. 135–155. Clayton, U. (2003) A Guide to the Law of Securitisation in Australia. Sydney, Australia: Clayton Company. Cooley, P. L., R. L. Roenfeldt, and N. K. Modani. (1977) “Interdependence of Market Risk Measures.”


pages: 105 words: 18,832

The Collapse of Western Civilization: A View From the Future by Naomi Oreskes, Erik M. Conway

Anthropocene, anti-communist, correlation does not imply causation, creative destruction, en.wikipedia.org, energy transition, Great Leap Forward, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Kim Stanley Robinson, laissez-faire capitalism, Lewis Mumford, market fundamentalism, mass immigration, means of production, military-industrial complex, oil shale / tar sands, Pierre-Simon Laplace, precautionary principle, road to serfdom, Ronald Reagan, stochastic process, the built environment, the market place

This was consistent with the expectation—based on physical theory—that warmer sea surface temperatures in regions of cyclogenesis could, and likely would, drive either more hurricanes or more intense ones. However, they backed away from this conclusion under pressure from their scientific colleagues. Much of the argument surrounded the concept of statistical significance. Given what we now know about the dominance of nonlinear systems and the distribution of stochastic processes, the then-dominant notion of a 95 percent confidence limit is hard to fathom. Yet overwhelming evidence suggests that twentieth-century scientists believed that a claim could be accepted only if, by the standards of Fisherian statistics, the possibility that an observed event could have happened by chance was less than 1 in 20.


pages: 306 words: 82,765

Skin in the Game: Hidden Asymmetries in Daily Life by Nassim Nicholas Taleb

anti-fragile, availability heuristic, behavioural economics, Benoit Mandelbrot, Bernie Madoff, Black Swan, Brownian motion, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, cellular automata, Claude Shannon: information theory, cognitive dissonance, complexity theory, data science, David Graeber, disintermediation, Donald Trump, Edward Thorp, equity premium, fake news, financial independence, information asymmetry, invisible hand, knowledge economy, loss aversion, mandelbrot fractal, Mark Spitznagel, mental accounting, microbiome, mirror neurons, moral hazard, Murray Gell-Mann, offshore financial centre, p-value, Paradox of Choice, Paul Samuelson, Ponzi scheme, power law, precautionary principle, price mechanism, principal–agent problem, public intellectual, Ralph Nader, random walk, rent-seeking, Richard Feynman, Richard Thaler, Ronald Coase, Ronald Reagan, Rory Sutherland, Rupert Read, Silicon Valley, Social Justice Warrior, Steven Pinker, stochastic process, survivorship bias, systematic bias, tail risk, TED Talk, The Nature of the Firm, Tragedy of the Commons, transaction costs, urban planning, Yogi Berra

Adaptation of Theorem 1 to Brownian Motion The implications of simplified discussion do not change whether one uses richer models, such as a full stochastic process subjected to an absorbing barrier. And of course in a natural setting the eradication of all previous life can happen (i.e., Xt can take extreme negative value), not just a stopping condition. The Peters and Gell-Mann argument also cancels the so-called equity premium puzzle if you add fat tails (hence outcomes vastly more severe pushing some level equivalent to ruin) and absence of the fungibility of temporal and ensemble. There is no puzzle. The problem is invariant in real life if one uses a Brownian-motion-style stochastic process subjected to an absorbing barrier.
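
A minimal simulation of the ensemble/time distinction with an absorbing barrier, using a hypothetical multiplicative gamble whose ensemble expectation per round is positive:

```python
import numpy as np

rng = np.random.default_rng(10)

def play(wealth=1.0, rounds=1_000):
    """One path of a multiplicative gamble with positive ensemble
    expectation (E[factor] = 1.05) but negative time-average growth,
    absorbed when wealth is effectively wiped out."""
    for _ in range(rounds):
        wealth *= 1.5 if rng.random() < 0.5 else 0.6
        if wealth < 1e-10:   # absorbing barrier: ruin is irreversible
            return 0.0
    return wealth

paths = np.array([play() for _ in range(5_000)])
print("fraction of paths ruined:", np.mean(paths == 0.0))
print("median terminal wealth:", np.median(paths))
```

Almost every individual path is ruined even though the gamble looks favorable on average across the ensemble, which is the non-fungibility of temporal and ensemble averages in miniature.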


Learn Algorithmic Trading by Sebastien Donadio

active measures, algorithmic trading, automated trading system, backtesting, Bayesian statistics, behavioural economics, buy and hold, buy low sell high, cryptocurrency, data science, deep learning, DevOps, en.wikipedia.org, fixed income, Flash crash, Guido van Rossum, latency arbitrage, locking in a profit, market fundamentalism, market microstructure, martingale, natural language processing, OpenAI, p-value, paper trading, performance metric, prediction markets, proprietary trading, quantitative trading / quantitative finance, random walk, risk tolerance, risk-adjusted returns, Sharpe ratio, short selling, sorting algorithm, statistical arbitrage, statistical model, stochastic process, survivorship bias, transaction costs, type inference, WebSocket, zero-sum game

He specializes in statistical arbitrage market-making, and pairs trading strategies for the most liquid global futures contracts. He works as a Senior Quantitative Developer at a trading firm in Chicago. He holds a Masters in Computer Science from the University of Southern California. His areas of interest include Computer Architecture, FinTech, Probability Theory and Stochastic Processes, Statistical Learning and Inference Methods, and Natural Language Processing. About the reviewers Nataraj Dasgupta is the VP of Advanced Analytics at RxDataScience Inc. He has been in the IT industry for more than 19 years and has worked in the technical & analytics divisions of Philip Morris, IBM, UBS Investment Bank, and Purdue Pharma.

Predicting the Markets with Basic Machine Learning In the last chapter, we learned how to design trading strategies, create trading signals, and implement advanced concepts, such as seasonality in trading instruments. Understanding those concepts in greater detail is a vast field comprising stochastic processes, random walks, martingales, and time series analysis, which we leave to you to explore at your own pace. So what's next? Let's look at an even more advanced method of prediction and forecasting: statistical inference and prediction. This is known as machine learning, the fundamentals of which were developed in the 1800s and early 1900s and have been worked on ever since.


pages: 111 words: 1

Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Nicholas Taleb

Alan Greenspan, Antoine Gombaud: Chevalier de Méré, availability heuristic, backtesting, behavioural economics, Benoit Mandelbrot, Black Swan, commoditize, complexity theory, corporate governance, corporate raider, currency peg, Daniel Kahneman / Amos Tversky, discounted cash flows, diversified portfolio, endowment effect, equity premium, financial engineering, fixed income, global village, hedonic treadmill, hindsight bias, junk bonds, Kenneth Arrow, Linda problem, Long Term Capital Management, loss aversion, mandelbrot fractal, Mark Spitznagel, Market Wizards by Jack D. Schwager, mental accounting, meta-analysis, Michael Milken, Myron Scholes, PalmPilot, Paradox of Choice, Paul Samuelson, power law, proprietary trading, public intellectual, quantitative trading / quantitative finance, QWERTY keyboard, random walk, Richard Feynman, risk free rate, road to serfdom, Robert Shiller, selection bias, shareholder value, Sharpe ratio, Steven Pinker, stochastic process, survivorship bias, too big to fail, Tragedy of the Commons, Turing test, Yogi Berra

I will outline them next. Monte Carlo methods, in brief, consist of creating artificial history using the following concepts. First, consider the sample path. The invisible histories have a scientific name, alternative sample paths, a name borrowed from the field of mathematics of probability called stochastic processes. The notion of path, as opposed to outcome, indicates that it is not a mere MBA-style scenario analysis, but the examination of a sequence of scenarios along the course of time. We are not just concerned with where a bird can end up tomorrow night, but rather with all the various places it can possibly visit during the time interval.

Starting at $100, in one scenario it can end up at $20 having seen a high of $220; in another it can end up at $145 having seen a low of $10. Another example is the evolution of your wealth during an evening at a casino. You start with $1,000 in your pocket, and measure it every fifteen minutes. In one sample path you have $2,200 at midnight; in another you barely have $20 left for a cab fare. Stochastic processes refer to the dynamics of events unfolding with the course of time. Stochastic is a fancy Greek name for random. This branch of probability concerns itself with the study of the evolution of successive random events—one could call it the mathematics of history. The key about a process is that it has time in it.
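
The casino example translates directly into code; a minimal sketch with hypothetical stakes and a slight house edge, recording each path's high, low, and end rather than the outcome alone:

```python
import numpy as np

rng = np.random.default_rng(11)

def evening(bankroll=1_000, bets=60, stake=50, p_win=0.49):
    """One sample path of a night at the casino: even-money bets with a
    slight house edge, recorded every bet rather than only at the end."""
    path = [bankroll]
    for _ in range(bets):
        if path[-1] < stake:   # bust: no more betting
            break
        path.append(path[-1] + (stake if rng.random() < p_win else -stake))
    return path

for i in range(3):
    p = evening()
    print(f"path {i}: high {max(p)}, low {min(p)}, end {p[-1]}")
```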


Fifty Challenging Problems in Probability With Solutions by Frederick Mosteller

Isaac Newton, John von Neumann, prisoner's dilemma, RAND corporation, stochastic process

As is well known, no strategy can give him a higher probability of achieving his goal, and the probability is this high if and only if he makes sure either to lose x or win y eventually. The Lesser Paradise The Lesser Paradise resembles the Golden Paradise with the important difference that before leaving the hall the gambler must pay an income tax* of 100t% (0 < t < 1) on any net positive income that he has won there. It is therefore no harder or easier for him to win y dollars with an initial fortune of x than it is for his brother in the Golden Paradise to win y/(1 − t) dollars. The greatest probability with which he can achieve his goal is therefore (1 − t)x / ((1 − t)x + y) (1) The Paradise Lost Here, the croupier collects the tax of *First published 1965. Reprinted by Dover Publications, Inc. in 1976 under the title Inequalities for Stochastic Processes.
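
The fair-game benchmark behind both Paradises, a probability of x/(x + y) of winning y before losing x, can be checked by simulation (small stakes keep the run fast):

```python
import numpy as np

rng = np.random.default_rng(12)

def reaches_goal(x=3, y=7):
    """Fair even-money betting of one unit per round: returns True if the
    gambler reaches x + y (wins y) before hitting 0 (loses x)."""
    wealth = x
    while 0 < wealth < x + y:
        wealth += 1 if rng.random() < 0.5 else -1
    return wealth == x + y

trials = 20_000
estimate = np.mean([reaches_goal() for _ in range(trials)])
print(estimate, "vs exact", 3 / (3 + 7))
```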


pages: 518 words: 107,836

How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy) by Benjamin Peters

Albert Einstein, American ideology, Andrei Shleifer, Anthropocene, Benoit Mandelbrot, bitcoin, Brownian motion, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive dissonance, commons-based peer production, computer age, conceptual framework, continuation of politics by other means, crony capitalism, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, disinformation, Dissolution of the Soviet Union, Donald Davies, double helix, Drosophila, Francis Fukuyama: the end of history, From Mathematics to the Technologies of Life and Death, Gabriella Coleman, hive mind, index card, informal economy, information asymmetry, invisible hand, Jacquard loom, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, Lewis Mumford, linear programming, mandelbrot fractal, Marshall McLuhan, means of production, megaproject, Menlo Park, Mikhail Gorbachev, military-industrial complex, mutually assured destruction, Network effects, Norbert Wiener, packet switching, Pareto efficiency, pattern recognition, Paul Erdős, Peter Thiel, Philip Mirowski, power law, RAND corporation, rent-seeking, road to serfdom, Ronald Coase, scientific mainstream, scientific management, Steve Jobs, Stewart Brand, stochastic process, surveillance capitalism, systems thinking, technoutopianism, the Cathedral and the Bazaar, the strength of weak ties, The Structural Transformation of the Public Sphere, transaction costs, Turing machine, work culture , Yochai Benkler

During World War II, Wiener researched ways to integrate human gunner and analog computer agency in antiaircraft artillery fire-control systems, vaulting his wartime research on the feedback processes among humans and machines into a general science of communication and control, with the gun and gunner ensemble (the man and the antiaircraft gun cockpit) as the original image of the cyborg.5 To designate this new science of control and feedback mechanisms, Wiener coined the neologism cybernetics from the Greek word for steersman, which is a predecessor to the English term governor (there is a common consonant-vowel structure between cybern- and govern—k/g + vowel + b/v + ern). Wiener’s popular masterworks ranged further still, commingling complex mathematical analysis (especially noise and stochastic processes), exposition on the promise and threat associated with automated information technology, and various speculations of social, political, and religious natures.6 For Wiener, cybernetics was a working out of the implications of “the theory of messages” and the ways that information systems organized life, the world, and the cosmos.

Because the coauthors were sensitive to how language, especially foreign terms, packs in questions of international competition, the coauthors attempted to keep their language as technical and abstract as possible, reminding the reader that the cybernetic mind-machine analogy was central to the emerging science but should be understood only “from a functional point of view,” not a philosophical one.76 The technical and abstract mathematical language of Wiener’s cybernetics thus served as a political defense against Soviet philosopher-critics and as ballast for generalizing the coauthors’ ambitions for scientists in other fields. They employed a full toolbox of cybernetic terminology, including signal words such as homeostasis, feedback, entropy, reflex, and the binary digit. They also repeated Wiener and Shannon’s emphases on probabilistic, stochastic processes as the preferred mathematical medium for scripting behavioral patterns onto abstract logical systems, including a whole section that elaborated on the mind-machine analogy with special emphasis on the central processor as capable of memory, responsiveness, and learning.77 Wiener’s call for cyberneticists with “Leibnizian catholicity” of scientific interests was tempered into its negative form—a warning against disciplinary isolationism.78 On the last page of the article, the coauthors smoothed over the adoption of Wiener, an American, as foreign founder of Soviet cybernetics by summarizing and stylizing Wiener’s “sharp critique of capitalist society,” his pseudo-Marxist prediction of a “new industrial revolution” that would arise out of the “chaotic conditions of the capitalist market,” and his widely publicized postwar fear of “the replacement of common workers with mechanical robots.”79 A word play in Russian animates this last phrase: the Russian word for worker, or rabotnik, differs only by a vowel transformation from robot, the nearly universal term coined in 1927 by the playwright Karel Čapek from the Czech word for “forced labor.”80 The first industrial revolution replaced the hand with the machine, or the rabotnik with the robot, and Wiener’s science, the coauthors dreamed, would help usher in a “second industrial revolution” in which the labor of the human mind could be carried out by intelligent machines, thus freeing, as Marx had intimated a century earlier, the mind to higher pursuits.


pages: 396 words: 112,748

Chaos: Making a New Science by James Gleick

Benoit Mandelbrot, business cycle, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Herbert Marcuse, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, military-industrial complex, Murray Gell-Mann, Norbert Wiener, pattern recognition, power law, Richard Feynman, scientific management, Stephen Hawking, stochastic process, trade route

Now, nobody would have thought that the right background for this problem was to know particle physics, to know something about quantum field theory, and to know that in quantum field theory you have these structures known as the renormalization group. Nobody knew that you would need to understand the general theory of stochastic processes, and also fractal structures. “Mitchell had the right background. He did the right thing at the right time, and he did it very well. Nothing partial. He cleaned out the whole problem.” Feigenbaum brought to Los Alamos a conviction that his science had failed to understand hard problems—nonlinear problems.

But everything evolves in the direction of specialization, and strictly speaking, “chaos” is now a very particular thing. When Yaneer Bar-Yam wrote a kilopage textbook, Dynamics of Complex Systems, in 2003, he took care of chaos proper in the first section of the first chapter. (“The first chapter, I have to admit, is 300 pages, okay?” he says.) Then came Stochastic Processes, Modeling Simulation, Cellular Automata, Computation Theory and Information Theory, Scaling, Renormalization, and Fractals, Neural Networks, Attractor Networks, Homogenous Systems, Inhomogenous Systems, and so on. Bar-Yam, the son of a high-energy physicist, had studied condensed matter physics and become an engineering professor at Boston University, but he left in 1997 to found the New England Complex Systems Institute.


pages: 119 words: 10,356

Topics in Market Microstructure by Ilija I. Zovko

Brownian motion, computerized trading, continuous double auction, correlation coefficient, financial intermediation, Gini coefficient, information asymmetry, market design, market friction, market microstructure, Murray Gell-Mann, p-value, power law, quantitative trading / quantitative finance, random walk, stochastic process, stochastic volatility, transaction costs

Estimating the permanent and transitory components of the bid/ask spread. In C.-F. Lee, editor, Advances in Investment Analysis and Portfolio Management, Volume 5. Elsevier, 1998. T. Chordia and B. Swaminathan. Trading volume and cross-autocorrelations in stock returns. Journal of Finance, LV(2), April 2000. P. K. Clark. Subordinated stochastic process model with finite variance for speculative prices. Econometrica, 41(1):135–155, 1973. K. J. Cohen, S. F. Maier, R. A. Schwartz, and D. K. Whitcomb. Transaction costs, order placement strategy, and existence of the bid-ask spread. Journal of Political Economy, 89(2):287–305, 1981. K. J. Cohen, R.


pages: 425 words: 122,223

Capital Ideas: The Improbable Origins of Modern Wall Street by Peter L. Bernstein

Albert Einstein, asset allocation, backtesting, Benoit Mandelbrot, Black Monday: stock market crash in 1987, Black-Scholes formula, Bonfire of the Vanities, Brownian motion, business cycle, buy and hold, buy low sell high, capital asset pricing model, corporate raider, debt deflation, diversified portfolio, Eugene Fama: efficient market hypothesis, financial innovation, financial intermediation, fixed income, full employment, Glass-Steagall Act, Great Leap Forward, guns versus butter model, implied volatility, index arbitrage, index fund, interest rate swap, invisible hand, John von Neumann, Joseph Schumpeter, junk bonds, Kenneth Arrow, law of one price, linear programming, Louis Bachelier, mandelbrot fractal, martingale, means of production, Michael Milken, money market fund, Myron Scholes, new economy, New Journalism, Paul Samuelson, Performance of Mutual Funds in the Period, profit maximization, Ralph Nader, RAND corporation, random walk, Richard Thaler, risk free rate, risk/return, Robert Shiller, Robert Solow, Ronald Reagan, stochastic process, Thales and the olive presses, the market place, The Predators' Ball, the scientific method, The Wealth of Nations by Adam Smith, Thorstein Veblen, transaction costs, transfer pricing, zero-coupon bond, zero-sum game

Paul Cootner, one of the leading finance scholars of the 1960s, once delivered this accolade: “So outstanding is his work that we can say that the study of speculative prices has its moment of glory at its moment of conception.”1 Bachelier laid the groundwork on which later mathematicians constructed a full-fledged theory of probability. He derived a formula that anticipated Einstein’s research into the behavior of particles subject to random shocks in space. And he developed the now universally used concept of stochastic processes, the analysis of random movements among statistical variables. Moreover, he made the first theoretical attempt to value such financial instruments as options and futures, which had active markets even in 1900. And he did all this in an effort to explain why prices in capital markets are impossible to predict!

(LOR) Leland-Rubinstein Associates
Leverage
Leveraged buyouts
Liquidity: management; market; money; Preference theory; stock
"Liquidity Preference as Behavior Toward Risk" (Tobin)
Linear programming
Loading charges: see Brokerage commissions
London School of Economics (LSE)
London Stock Exchange
Macroeconomics
Management Science
Marginal utility concept
"Market and Industry Factors in Stock Price Performance" (King)
Market theories (general discussion). See also specific theories and types of securities: competitive; disaster avoidance; invisible hand; linear regression/econometric; seasonal fluctuations; stochastic process
Mathematical economics
Mathematical Theory of Non-Uniform Gases, The
Maximum expected return concept
McCormick Harvester
Mean-Variance Analysis
Mean-Variance Analysis in Portfolio Choice and Capital Markets (Markowitz)
"Measuring the Investment Performance of Pension Funds," report
Mellon Bank
Merck
Merrill Lynch
Minnesota Mining
MIT
MM Theory
"Modern Portfolio Theory.


pages: 247 words: 43,430

Think Complexity by Allen B. Downey

Benoit Mandelbrot, cellular automata, Conway's Game of Life, Craig Reynolds: boids flock, discrete time, en.wikipedia.org, Frank Gehry, Gini coefficient, Guggenheim Bilbao, Laplace demon, mandelbrot fractal, Occupy movement, Paul Erdős, peer-to-peer, Pierre-Simon Laplace, power law, seminal paper, sorting algorithm, stochastic process, strong AI, Thomas Kuhn: the structure of scientific revolutions, Turing complete, Turing machine, Vilfredo Pareto, We are the 99%

, Stanley Milgram
sorting, Analysis of Basic Python Operations
source node, Dijkstra
spaceships, Structures, Life Patterns
spanning cluster, Percolation
special creation, Falsifiability
spectral density, Spectral Density
spherical cow, The Axes of Scientific Models
square, Fractals
stable sort, Analysis of Basic Python Operations
Stanford Large Network Dataset Collection, Zipf, Pareto, and Power Laws
state, Cellular Automata, Stephen Wolfram, Sand Piles
stochastic process, The Axes of Scientific Models
stock market, SOC, Causation, and Prediction
StopIteration, Iterators
__str__, Representing Graphs, Representing Graphs
strategy, Prisoner's Dilemma
string concatenation, Analysis of Basic Python Operations
string methods, Analysis of Basic Python Operations
Strogatz, Steven, Paradigm Shift?


pages: 523 words: 143,139

Algorithms to Live By: The Computer Science of Human Decisions by Brian Christian, Tom Griffiths

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, algorithmic bias, algorithmic trading, anthropic principle, asset allocation, autonomous vehicles, Bayesian statistics, behavioural economics, Berlin Wall, Big Tech, Bill Duvall, bitcoin, Boeing 747, Charles Babbage, cognitive load, Community Supported Agriculture, complexity theory, constrained optimization, cosmological principle, cryptocurrency, Danny Hillis, data science, David Heinemeier Hansson, David Sedaris, delayed gratification, dematerialisation, diversification, Donald Knuth, Donald Shoup, double helix, Dutch auction, Elon Musk, exponential backoff, fault tolerance, Fellow of the Royal Society, Firefox, first-price auction, Flash crash, Frederick Winslow Taylor, fulfillment center, Garrett Hardin, Geoffrey Hinton, George Akerlof, global supply chain, Google Chrome, heat death of the universe, Henri Poincaré, information retrieval, Internet Archive, Jeff Bezos, Johannes Kepler, John Nash: game theory, John von Neumann, Kickstarter, knapsack problem, Lao Tzu, Leonard Kleinrock, level 1 cache, linear programming, martingale, multi-armed bandit, Nash equilibrium, natural language processing, NP-complete, P = NP, packet switching, Pierre-Simon Laplace, power law, prediction markets, race to the bottom, RAND corporation, RFC: Request For Comment, Robert X Cringely, Sam Altman, scientific management, sealed-bid auction, second-price auction, self-driving car, Silicon Valley, Skype, sorting algorithm, spectrum auction, Stanford marshmallow experiment, Steve Jobs, stochastic process, Thomas Bayes, Thomas Malthus, Tragedy of the Commons, traveling salesman, Turing machine, urban planning, Vickrey auction, Vilfredo Pareto, Walter Mischel, Y Combinator, zero-sum game

Like the famous Heisenberg uncertainty principle of particle physics, which says that the more you know about a particle’s momentum the less you know about its position, the so-called bias-variance tradeoff expresses a deep and fundamental bound on how good a model can be—on what it’s possible to know and to predict. This notion is found in various places in the machine-learning literature. See, for instance, Geman, Bienenstock, and Doursat, “Neural Networks and the Bias/Variance Dilemma,” and Grenander, “On Empirical Spectral Analysis of Stochastic Processes.” in the Book of Kings: The bronze snake, known as Nehushtan, gets destroyed in 2 Kings 18:4. “pay good money to remove the tattoos”: Gilbert, Stumbling on Happiness. duels less than fifty years ago: If you’re not too fainthearted, you can watch video of a duel fought in 1967 at http://passerelle-production.u-bourgogne.fr/web/atip_insulte/Video/archive_duel_france.swf.

Discover 6, no. 6 (1985): 40–42. Graham, Ronald L., Eugene L. Lawler, Jan Karel Lenstra, and Alexander H. G. Rinnooy Kan. “Optimization and Approximation in Deterministic Sequencing and Scheduling: A Survey.” Annals of Discrete Mathematics 5 (1979): 287–326. Grenander, Ulf. “On Empirical Spectral Analysis of Stochastic Processes.” Arkiv för Matematik 1, no. 6 (1952): 503–531. Gridgeman, T. “Geometric Probability and the Number π.” Scripta Mathematika 25, no. 3 (1960): 183–195. Griffiths, Thomas L., Charles Kemp, and Joshua B. Tenenbaum. “Bayesian Models of Cognition.” In The Cambridge Handbook of Computational Cognitive Modeling.


pages: 665 words: 146,542

Money: 5,000 Years of Debt and Power by Michel Aglietta

accelerated depreciation, Alan Greenspan, bank run, banking crisis, Basel III, Berlin Wall, bitcoin, blockchain, Bretton Woods, British Empire, business cycle, capital asset pricing model, capital controls, cashless society, central bank independence, circular economy, collapse of Lehman Brothers, collective bargaining, corporate governance, David Graeber, debt deflation, dematerialisation, Deng Xiaoping, double entry bookkeeping, energy transition, eurozone crisis, Fall of the Berlin Wall, falling living standards, financial deregulation, financial innovation, Financial Instability Hypothesis, financial intermediation, floating exchange rates, forward guidance, Francis Fukuyama: the end of history, full employment, German hyperinflation, income inequality, inflation targeting, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), invention of writing, invisible hand, joint-stock company, Kenneth Arrow, Kickstarter, land bank, liquidity trap, low interest rates, margin call, means of production, Money creation, money market fund, moral hazard, Nash equilibrium, Network effects, Northern Rock, oil shock, planetary scale, plutocrats, precautionary principle, price stability, purchasing power parity, quantitative easing, race to the bottom, reserve currency, secular stagnation, seigniorage, shareholder value, special drawing rights, special economic zone, stochastic process, Suez crisis 1956, the payments system, the scientific method, tontine, too big to fail, trade route, transaction costs, transcontinental railway, Washington Consensus

This mimetic model’s strength is that it reveals the emergence, from amid this general confusion, of a polarisation around one single object of desire recognised by all (see Box 1.1).21

Box 1.1 Theorem of mimetic convergence

In a population of N agents (i = 1, …, N), on date t each agent has a belief u_i(t) regarding the debt that represents absolute liquidity. Agent i chooses his belief in t+1 by copying an agent j at random, with probability p_ij for j = 1, …, N. So we have Pr{u_i(t+1) = u_j(t)} = p_ij, with Σ_j p_ij = 1 for each i. The mimetic interdependency is formalised as a Markovian stochastic process defined by the transition matrix P = (p_ij), such that the dynamic process is written

    U(t+1) = P U(t)

The theorem shows that if the graph associated with P is strongly connected (matrix P does not break down into independent sub-matrices) and aperiodic (the process of revising beliefs is not cyclical), then the mimetic contagion converges towards unanimity around a belief, which can be any of the initial beliefs.
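To make the convergence concrete, here is a minimal simulation of the copying dynamics (a sketch: the uniform copying probabilities p_ij = 1/N and the ten initial candidate beliefs are illustrative assumptions, not Aglietta's):

    import numpy as np

    rng = np.random.default_rng(0)
    N = 50
    beliefs = rng.integers(0, 10, size=N)   # u_i(0): ten candidate beliefs about absolute liquidity

    # Each period, every agent i copies a uniformly chosen agent j (p_ij = 1/N).
    # This copying structure is strongly connected and aperiodic, so the theorem applies.
    t = 0
    while not np.all(beliefs == beliefs[0]):
        beliefs = beliefs[rng.integers(0, N, size=N)]
        t += 1
    print(f"unanimity on belief {beliefs[0]} after {t} periods")

Which belief wins varies from run to run, matching the theorem's conclusion that the consensus can be any of the initial beliefs.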

Box 6.1 Interest rate rules

1) The Wicksellian norm which the Riksbank used to break out of inflation in the 1930s set a target for price levels and not inflation rates. It is associated with the rates rule

    i_t = ī_t + φ p_t

in which p_t is the log of the price index that is to be stabilised. ī_t follows a stochastic process that is independent of price movements but is correlated to the exogenous fluctuations in the natural rate r_t. The relationship defining the equilibrium nominal rate is

    i_t = r_t + E_t p_{t+1} − p_t.

Eliminating i_t between these two relations gives an expectational difference equation for p_t; if we separate out the processes followed by r_t and ī_t, then p_t has a single solution, and it follows that prices fluctuate around a long-term level. The long-term value of the general level of prices is independent of the demand for money.
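A sketch of the algebra these statements compress, assuming a constant coefficient φ > 0 (the notation here is ours, not necessarily Aglietta's):

    \bar{i}_t + \varphi p_t = r_t + E_t p_{t+1} - p_t
    \;\Longrightarrow\;
    E_t p_{t+1} = (1 + \varphi)\, p_t - (r_t - \bar{i}_t)

    p_t = \sum_{j=0}^{\infty} (1 + \varphi)^{-(j+1)} E_t\!\left[ r_{t+j} - \bar{i}_{t+j} \right]
    \qquad \text{(the unique non-explosive solution, since } (1+\varphi)^{-1} < 1 \text{)}

    \bar{p} = \frac{\bar{r} - \bar{i}}{\varphi}
    \qquad \text{if } r_t - \bar{i}_t \text{ fluctuates around the mean } \bar{r} - \bar{i},
    \text{ using } \sum_{j \ge 0} (1+\varphi)^{-(j+1)} = 1/\varphi

On this reading the price level is pinned down by the gap between the natural rate and the rule's intercept, which is the sense in which it is independent of the demand for money.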


pages: 442 words: 39,064

Why Stock Markets Crash: Critical Events in Complex Financial Systems by Didier Sornette

Alan Greenspan, Asian financial crisis, asset allocation, behavioural economics, Berlin Wall, Black Monday: stock market crash in 1987, Bretton Woods, Brownian motion, business cycle, buy and hold, buy the rumour, sell the news, capital asset pricing model, capital controls, continuous double auction, currency peg, Deng Xiaoping, discrete time, diversified portfolio, Elliott wave, Erdős number, experimental economics, financial engineering, financial innovation, floating exchange rates, frictionless, frictionless market, full employment, global village, implied volatility, index fund, information asymmetry, intangible asset, invisible hand, John von Neumann, joint-stock company, law of one price, Louis Bachelier, low interest rates, mandelbrot fractal, margin call, market bubble, market clearing, market design, market fundamentalism, mental accounting, moral hazard, Network effects, new economy, oil shock, open economy, pattern recognition, Paul Erdős, Paul Samuelson, power law, quantitative trading / quantitative finance, random walk, risk/return, Ronald Reagan, Schrödinger's Cat, selection bias, short selling, Silicon Valley, South Sea Bubble, statistical model, stochastic process, stocks for the long run, Tacoma Narrows Bridge, technological singularity, The Coming Technological Singularity, The Wealth of Nations by Adam Smith, Tobin tax, total factor productivity, transaction costs, tulip mania, VA Linux, Y2K, yield curve

Samuelson has proved a general theorem showing that the concept that prices are unpredictable can actually be deduced rigorously [357] from a model that hypothesizes that a stock’s present price p_t is set at the expected discounted value of its future dividends d_t, d_{t+1}, d_{t+2}, … (which are supposed to be random variables generated according to any general (but known) stochastic process):

    p_t = d_t + δ_1 d_{t+1} + δ_1 δ_2 d_{t+2} + δ_1 δ_2 δ_3 d_{t+3} + ⋯    (3)

where the factors δ_i = 1 − r < 1, which can fluctuate from one time period to the next, account for the depreciation of a future price calculated at present due to the nonzero consumption price index r. We see that p_t = d_t + δ_1 p_{t+1}, and thus the expectation E[p_{t+1}] of p_{t+1} conditioned on the knowledge of the present price p_t is

    E[p_{t+1}] = (p_t − d_t) / δ_1    (4)

This shows that, barring the drift due to the inflation and the dividend, the price increment does not have a systematic component or memory of the past and is thus random.
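A quick Monte Carlo check of relations (3) and (4) under simplifying assumptions of our own (a constant discount factor δ and i.i.d. dividends, which are not Sornette's assumptions but make the expectations computable in closed form):

    import numpy as np

    rng = np.random.default_rng(1)
    delta, mu, T, n_paths = 0.95, 1.0, 200, 20_000

    # i.i.d. dividends with mean mu; with a constant factor delta, (3) collapses to
    # p_t = d_t + delta * mu / (1 - delta), since E[d_{t+k}] = mu for every k >= 1.
    d = rng.exponential(mu, size=(n_paths, T))
    p = d + delta * mu / (1 - delta)

    # Relation (4): E[p_{t+1}] should equal (p_t - d_t) / delta.
    lhs = p[:, 1].mean()                        # Monte Carlo estimate of E[p_{t+1}]
    rhs = ((p[:, 0] - d[:, 0]) / delta).mean()  # (p_t - d_t) / delta
    print(lhs, rhs)                             # both ≈ mu / (1 - delta) = 20.0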

The economy as an evolving complex system II (Addison-Wesley, Redwood City). 19. Arthur, W. B. (1987). Self-reinforcing mechanisms in economics, Center for Economic Policy Research 111, 1–20. 20. Arthur, W. B., Ermoliev, Y. M., and Kaniovsky, Y. M. (1984). Strong laws for a class of path-dependent stochastic processes with applications, in Proceedings of the International Conference on Stochastic Optimization, A. Shiryaev and R. Wets, editors (Springer-Verlag, New York), pp. 287–300. 21. Arthur, W. B., Holland, J. H., LeBaron, B., Palmer, R., and Taylor, P. (1997). Asset pricing under endogenous expectations in an artificial stock market, in The Economy as an Evolving Complex System II, W.


pages: 651 words: 180,162

Antifragile: Things That Gain From Disorder by Nassim Nicholas Taleb

"World Economic Forum" Davos, Air France Flight 447, Alan Greenspan, Andrei Shleifer, anti-fragile, banking crisis, Benoit Mandelbrot, Berlin Wall, biodiversity loss, Black Swan, business cycle, caloric restriction, caloric restriction, Chuck Templeton: OpenTable:, commoditize, creative destruction, credit crunch, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, discrete time, double entry bookkeeping, Emanuel Derman, epigenetics, fail fast, financial engineering, financial independence, Flash crash, flying shuttle, Gary Taubes, George Santayana, Gini coefficient, Helicobacter pylori, Henri Poincaré, Higgs boson, high net worth, hygiene hypothesis, Ignaz Semmelweis: hand washing, informal economy, invention of the wheel, invisible hand, Isaac Newton, James Hargreaves, Jane Jacobs, Jim Simons, joint-stock company, joint-stock limited liability company, Joseph Schumpeter, Kenneth Arrow, knowledge economy, language acquisition, Lao Tzu, Long Term Capital Management, loss aversion, Louis Pasteur, mandelbrot fractal, Marc Andreessen, Mark Spitznagel, meta-analysis, microbiome, money market fund, moral hazard, mouse model, Myron Scholes, Norbert Wiener, pattern recognition, Paul Samuelson, placebo effect, Ponzi scheme, Post-Keynesian economics, power law, principal–agent problem, purchasing power parity, quantitative trading / quantitative finance, Ralph Nader, random walk, Ray Kurzweil, rent control, Republic of Letters, Ronald Reagan, Rory Sutherland, Rupert Read, selection bias, Silicon Valley, six sigma, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, stochastic process, stochastic volatility, synthetic biology, tacit knowledge, tail risk, Thales and the olive presses, Thales of Miletus, The Great Moderation, the new new thing, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Malthus, too big to fail, transaction costs, urban planning, Vilfredo Pareto, Yogi Berra, Zipf's Law

Alas, according to a simple test: no, sorry. 5 Set a simple filtering rule: all members of a species need to have a neck at least forty centimeters long in order to survive. After a few generations, the surviving population would have, on average, a neck longer than forty centimeters. (More technically, a stochastic process subjected to an absorbing barrier will have an observed mean higher than the barrier; a numerical sketch follows below.) 6 The French have a long series of authors who owe part of their status to their criminal record—which includes the poet Ronsard, the writer Jean Genet, and many others.

CHAPTER 3 The Cat and the Washing Machine

Stress is knowledge (and knowledge is stress)—The organic and the mechanical—No translator needed, for now—Waking up the animal in us, after two hundred years of modernity

The bold conjecture made here is that everything that has life in it is to some extent antifragile (but not the reverse).
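Picking up footnote 5's parenthetical: a minimal numerical check of the barrier effect (a static truncation version of the claim; the population parameters are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(2)
    barrier = 40.0

    # Neck lengths drawn around a population mean sitting exactly at the barrier.
    necks = rng.normal(loc=40.0, scale=5.0, size=100_000)

    survivors = necks[necks >= barrier]   # the filtering rule: below 40 cm does not survive
    print(necks.mean())      # ≈ 40.0 for the full population
    print(survivors.mean())  # ≈ 44.0 — the observed mean sits above the barrier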

My dream—the solution—is that we would have a National Entrepreneur Day, with the following message: Most of you will fail, disrespected, impoverished, but we are grateful for the risks you are taking and the sacrifices you are making for the sake of the economic growth of the planet and pulling others out of poverty. You are at the source of our antifragility. Our nation thanks you. 1 A technical comment on why the adaptability criterion is innocent of probability (the nontechnical reader should skip the rest of this note). The property in a stochastic process of not seeing at any time period t what would happen in time after t, that is, any period higher than t, hence reacting with a lag, an incompressible lag, is called a nonanticipative strategy, a requirement of stochastic integration. The incompressibility of the lag is central and unavoidable.


pages: 607 words: 185,487

Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed by James C. Scott

agricultural Revolution, Boeing 747, business cycle, classic study, clean water, colonial rule, commoditize, company town, deskilling, facts on the ground, germ theory of disease, Great Leap Forward, informal economy, invention of writing, invisible hand, Jane Jacobs, Kenneth Arrow, land reform, land tenure, Lewis Mumford, Louis Pasteur, megaproject, new economy, New Urbanism, post-Fordism, Potemkin village, price mechanism, profit maximization, Recombinant DNA, road to serfdom, scientific management, Silicon Valley, stochastic process, Suez canal 1869, the built environment, The Death and Life of Great American Cities, the scientific method, Thorstein Veblen, urban decay, urban planning, urban renewal, vertical integration, working poor

For both of them, such tasks are voyages in uncharted waters. There may be some rules of thumb, but there can be no blueprints or battle plans drawn up in advance; the numerous unknowns in the equation make a one-step solution inconceivable. In more technical language, such goals can be approached only by a stochastic process of successive approximations, trial and error, experiment, and learning through experience. The kind of knowledge required in such endeavors is not deductive knowledge from first principles but rather what Greeks of the classical period called metis, a concept to which we shall return. Usually translated, inadequately, as "cunning," metis is better understood as the kind of knowledge that can be acquired only by long practice at similar but rarely identical tasks, which requires constant adaptation to changing circumstances.

Taking language as a parallel, I believe that the rule of thumb is akin to formal grammar, whereas metis is more like actual speech. Metis is no more derivative of general rules than speech is derivative of grammar. Speech develops from the cradle by imitation, use, trial and error. Learning a mother tongue is a stochastic process: a process of successive, self-correcting approximations. We do not begin by learning the alphabet, individual words, parts of speech, and rules of grammar and then trying to use them all in order to produce a grammatically correct sentence. Moreover, as Oakeshott indicates, a knowledge of the rules of speech by themselves is compatible with a complete inability to speak intelligible sentences.


Longevity: To the Limits and Beyond (Research and Perspectives in Longevity) by Jean-Marie Robine, James W. Vaupel, Bernard Jeune, Michel Allard

caloric restriction, caloric restriction, classic study, computer age, conceptual framework, confounding variable, demographic transition, Drosophila, epigenetics, life extension, longitudinal study, phenotype, stem cell, stochastic process

Finch

Summary

In this essay, I inquire about little explored sources of non-genetic factors in individual life spans that are displayed between individuals with identical genotypes in controlled laboratory environments. The numbers of oocytes found in the ovaries of inbred mice, for example, show a > 5-fold range between individuals. Smaller, but still extensive variations are also indicated for hippocampal neurons. These variations in cell number can be attributed to stochastic processes during organogenesis, i.e. "developmental noise in cell fate determination." They may be of general importance to functional changes during aging, as argued for reproductive senescence in females which is strongly linked to the time of oocyte depletion. More generally, I hypothesize that variations in cell numbers during development result in individual differences in the reserve cell numbers which, in turn, set critical thresholds for dysfunctions and sources of morbidity during aging.


pages: 321

Finding Alphas: A Quantitative Approach to Building Trading Strategies by Igor Tulchinsky

algorithmic trading, asset allocation, automated trading system, backpropagation, backtesting, barriers to entry, behavioural economics, book value, business cycle, buy and hold, capital asset pricing model, constrained optimization, corporate governance, correlation coefficient, credit crunch, Credit Default Swap, currency risk, data science, deep learning, discounted cash flows, discrete time, diversification, diversified portfolio, Eugene Fama: efficient market hypothesis, financial engineering, financial intermediation, Flash crash, Geoffrey Hinton, implied volatility, index arbitrage, index fund, intangible asset, iterative process, Long Term Capital Management, loss aversion, low interest rates, machine readable, market design, market microstructure, merger arbitrage, natural language processing, passive investing, pattern recognition, performance metric, Performance of Mutual Funds in the Period, popular capitalism, prediction markets, price discovery process, profit motive, proprietary trading, quantitative trading / quantitative finance, random walk, Reminiscences of a Stock Operator, Renaissance Technologies, risk free rate, risk tolerance, risk-adjusted returns, risk/return, selection bias, sentiment analysis, shareholder value, Sharpe ratio, short selling, Silicon Valley, speech recognition, statistical arbitrage, statistical model, stochastic process, survivorship bias, systematic bias, systematic trading, text mining, transaction costs, Vanguard fund, yield curve

Trend analysis is an example of applications of statistical models in alpha research. In particular, a hidden Markov model is frequently utilized for that purpose, based on the belief that price movements of the stock market are not totally random. In a statistics framework, the hidden Markov model is a composition of two or more stochastic processes: a hidden Markov chain, which accounts for the temporal variability, and an observable process, which accounts for the spectral variability. In this approach, the pattern of the stock market behavior is determined based on these probability values at a particular time. The goal is to figure out the hidden state sequence given the observation sequence, extract the long-term probability distribution, and identify the current trend relative to that distribution.
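As a concrete sketch, here is the forward-filtering step for a two-state Gaussian hidden Markov model written out in NumPy/SciPy; the regime labels and every parameter value are illustrative assumptions, not anything from the book:

    import numpy as np
    from scipy.stats import norm

    # Two hidden regimes: 0 = "up trend", 1 = "down trend" (illustrative labels).
    A  = np.array([[0.95, 0.05],    # hidden-chain transition probabilities
                   [0.10, 0.90]])
    pi = np.array([0.5, 0.5])       # initial state distribution
    mu, sigma = np.array([0.001, -0.002]), np.array([0.01, 0.02])  # per-state return parameters

    def forward_filter(returns):
        """P(state_t | returns_{1..t}) for each t, via the normalized forward algorithm."""
        alpha = pi * norm.pdf(returns[0], mu, sigma)
        alpha /= alpha.sum()
        out = [alpha]
        for r in returns[1:]:
            alpha = (alpha @ A) * norm.pdf(r, mu, sigma)   # propagate, then weight by likelihood
            alpha /= alpha.sum()
            out.append(alpha)
        return np.array(out)

    rng = np.random.default_rng(3)
    fake_returns = rng.normal(0.001, 0.01, 250)   # stand-in for observed daily returns
    print(forward_filter(fake_returns)[-1])       # current regime probabilities

The last line is the "current trend relative to that distribution" step: the filtered probabilities say which hidden regime most plausibly generated the recent observations.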


pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy by George Gilder

23andMe, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AlphaGo, AltaVista, Amazon Web Services, AOL-Time Warner, Asilomar, augmented reality, Ben Horowitz, bitcoin, Bitcoin Ponzi scheme, Bletchley Park, blockchain, Bob Noyce, British Empire, Brownian motion, Burning Man, business process, butterfly effect, carbon footprint, cellular automata, Claude Shannon: information theory, Clayton Christensen, cloud computing, computer age, computer vision, crony capitalism, cross-subsidies, cryptocurrency, Danny Hillis, decentralized internet, deep learning, DeepMind, Demis Hassabis, disintermediation, distributed ledger, don't be evil, Donald Knuth, Donald Trump, double entry bookkeeping, driverless car, Elon Musk, Erik Brynjolfsson, Ethereum, ethereum blockchain, fake news, fault tolerance, fiat currency, Firefox, first square of the chessboard, first square of the chessboard / second half of the chessboard, floating exchange rates, Fractional reserve banking, game design, Geoffrey Hinton, George Gilder, Google Earth, Google Glasses, Google Hangouts, index fund, inflation targeting, informal economy, initial coin offering, Internet of things, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, Jim Simons, Joan Didion, John Markoff, John von Neumann, Julian Assange, Kevin Kelly, Law of Accelerating Returns, machine translation, Marc Andreessen, Mark Zuckerberg, Mary Meeker, means of production, Menlo Park, Metcalfe’s law, Money creation, money: store of value / unit of account / medium of exchange, move fast and break things, Neal Stephenson, Network effects, new economy, Nick Bostrom, Norbert Wiener, Oculus Rift, OSI model, PageRank, pattern recognition, Paul Graham, peer-to-peer, Peter Thiel, Ponzi scheme, prediction markets, quantitative easing, random walk, ransomware, Ray Kurzweil, reality distortion field, Recombinant DNA, Renaissance Technologies, Robert Mercer, Robert Metcalfe, Ronald Coase, Ross Ulbricht, Ruby on Rails, Sand Hill Road, Satoshi Nakamoto, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Singularitarianism, Skype, smart contracts, Snapchat, Snow Crash, software is eating the world, sorting algorithm, South Sea Bubble, speech recognition, Stephen Hawking, Steve Jobs, Steven Levy, Stewart Brand, stochastic process, Susan Wojcicki, TED Talk, telepresence, Tesla Model S, The Soul of a New Machine, theory of mind, Tim Cook: Apple, transaction costs, tulip mania, Turing complete, Turing machine, Vernor Vinge, Vitalik Buterin, Von Neumann architecture, Watson beat the top human players on Jeopardy!, WikiLeaks, Y Combinator, zero-sum game

., Proceedings of the Markov Anniversary Meeting (Altadena, Calif.: Boson Books, 2006), 156–57. 3. Claude Elwood Shannon, “A Mathematical Theory of Communication” in The Bell System Technical Journal, October 1948, section 4, “Graphical Representation of a Markoff Process,” in Collected Papers (Piscataway, N.J.: IEEE Press, 1993), 15. “Stochastic processes of the type described above (“The Discrete Noiseless Channel”) are known mathematically as discrete Markov processes. . . . [A] discrete [information] source, for our purposes can be considered to be represented by a Markoff process. The general case can be described as follows: There exist a finite number of possible ‘states’ of a system. . . .


pages: 357 words: 98,854

Epigenetics Revolution: How Modern Biology Is Rewriting Our Understanding of Genetics, Disease and Inheritance by Nessa Carey

Albert Einstein, British Empire, Build a better mousetrap, conceptual framework, discovery of penicillin, double helix, Drosophila, epigenetics, Fellow of the Royal Society, life extension, mouse model, phenotype, selective serotonin reuptake inhibitor (SSRI), stem cell, stochastic process, Thomas Kuhn: the structure of scientific revolutions, twin studies

But over decades all these mild abnormalities in gene expression, resulting from a slightly inappropriate set of chromatin modifications, may lead to a gradually increasing functional impairment. Clinically, we don’t recognise this until it passes some invisible threshold and the patient begins to show symptoms. The epigenetic variation that occurs in developmental programming is at heart a predominantly random process, normally referred to as ‘stochastic’. This stochastic process may account for a significant amount of the variability that develops between the MZ twins who opened this chapter. Random fluctuations in epigenetic modifications during early development lead to non-identical patterns of gene expression. These become epigenetically set and exaggerated over the years, until eventually the genetically identical twins become phenotypically different, sometimes in the most dramatic of ways.


pages: 367 words: 97,136

Beyond Diversification: What Every Investor Needs to Know About Asset Allocation by Sebastien Page

Andrei Shleifer, asset allocation, backtesting, Bernie Madoff, bitcoin, Black Swan, Bob Litterman, book value, business cycle, buy and hold, Cal Newport, capital asset pricing model, commodity super cycle, coronavirus, corporate governance, COVID-19, cryptocurrency, currency risk, discounted cash flows, diversification, diversified portfolio, en.wikipedia.org, equity risk premium, Eugene Fama: efficient market hypothesis, fixed income, future of work, Future Shock, G4S, global macro, implied volatility, index fund, information asymmetry, iterative process, loss aversion, low interest rates, market friction, mental accounting, merger arbitrage, oil shock, passive investing, prediction markets, publication bias, quantitative easing, quantitative trading / quantitative finance, random walk, reserve currency, Richard Feynman, Richard Thaler, risk free rate, risk tolerance, risk-adjusted returns, risk/return, Robert Shiller, robo advisor, seminal paper, shareholder value, Sharpe ratio, sovereign wealth fund, stochastic process, stochastic volatility, stocks for the long run, systematic bias, systematic trading, tail risk, transaction costs, TSMC, value at risk, yield curve, zero-coupon bond, zero-sum game

Chow, George, Eric Jacquier, Mark Kritzman, and Kenneth Lowry. 1999. “Optimal Portfolios in Good Times and Bad,” Financial Analysts Journal, vol. 55, no. 3, pp. 65–73. Chua, David B., Mark Kritzman, and Sébastien Page. 2009. “The Myth of Diversification,” Journal of Portfolio Management, vol. 36, no. 1, pp. 26–35. Clark, Peter K. 1973. “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices,” Econometrica: Journal of the Econometric Society, vol. 41, no. 1, pp. 135–155. Clewell, David, Chris Faulkner-MacDonagh, David Giroux, Sébastien Page, and Charles Shriver. 2017. “Macroeconomic Dashboards for Tactical Asset Allocation,” Journal of Portfolio Management, vol. 44, no. 2, pp. 50–61.


pages: 375 words: 102,166

The Genetic Lottery: Why DNA Matters for Social Equality by Kathryn Paige Harden

23andMe, Affordable Care Act / Obamacare, assortative mating, autism spectrum disorder, Bayesian statistics, Berlin Wall, Black Lives Matter, classic study, clean water, combinatorial explosion, coronavirus, correlation coefficient, correlation does not imply causation, COVID-19, CRISPR, crowdsourcing, delayed gratification, deliberate practice, desegregation, double helix, epigenetics, game design, George Floyd, Gregor Mendel, impulse control, income inequality, Jeff Bezos, longitudinal study, low skilled workers, Mark Zuckerberg, meritocracy, meta-analysis, Monkeys Reject Unequal Pay, phenotype, randomized controlled trial, replication crisis, Scientific racism, stochastic process, surveillance capitalism, TED Talk, The Bell Curve by Richard Herrnstein and Charles Murray, twin studies, War on Poverty, zero-sum game

On average, we are expected to share 50 percent of DNA segments. But that’s on average. If you flip a coin 1,000 times, the expectation is that it will land on heads 50 percent of the time, or 500 times. But in reality, it might land on heads 501 times. Or, even weirder, 545 times. Like flipping a coin, reproduction is a stochastic process. Two siblings are expected to share 50 percent of their DNA segments, but in reality, they might share a little bit more or a little bit less. Micah and I share a little bit less than the expectation—44.6 percent. In 2006, the statistical geneticist Peter Visscher and his colleagues conducted a study that took advantage of the fact that there is random variation in the extent of identity-by-descent sharing between siblings—sometimes it’s lower than 50 percent, sometimes it’s higher.1 For each pair of siblings, they divided the genome into segments one centimorgan (cM) long and calculated the actual number of 1-cM segments that were shared between the siblings.
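A quick simulation of the coin-flip analogy (the segment count of 3,500 is our own stand-in, and treating segments as independent understates the real variance, which is driven by linkage):

    import numpy as np

    rng = np.random.default_rng(4)

    # 1,000 coin flips: the expectation is 500 heads; realizations scatter around it.
    flips = rng.binomial(n=1000, p=0.5, size=5)
    print(flips)            # e.g. [501, 545, 488, ...], rarely exactly 500

    # Crude analogue for sibling sharing: ~3,500 independent 1-cM segments,
    # each shared with probability 0.5 (independence is a simplification).
    shared = rng.binomial(n=3500, p=0.5, size=5) / 3500
    print(shared)           # fractions near 0.50, not exactly 0.50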


pages: 411 words: 108,119

The Irrational Economist: Making Decisions in a Dangerous World by Erwann Michel-Kerjan, Paul Slovic

"World Economic Forum" Davos, Alan Greenspan, An Inconvenient Truth, Andrei Shleifer, availability heuristic, bank run, behavioural economics, Black Swan, business cycle, Cass Sunstein, classic study, clean water, cognitive dissonance, collateralized debt obligation, complexity theory, conceptual framework, corporate social responsibility, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-subsidies, Daniel Kahneman / Amos Tversky, endowment effect, experimental economics, financial innovation, Fractional reserve banking, George Akerlof, hindsight bias, incomplete markets, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Isaac Newton, iterative process, Kenneth Arrow, Loma Prieta earthquake, London Interbank Offered Rate, market bubble, market clearing, money market fund, moral hazard, mortgage debt, Oklahoma City bombing, Pareto efficiency, Paul Samuelson, placebo effect, precautionary principle, price discrimination, price stability, RAND corporation, Richard Thaler, Robert Shiller, Robert Solow, Ronald Reagan, Savings and loan crisis, social discount rate, source of truth, statistical model, stochastic process, subprime mortgage crisis, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, ultimatum game, University of East Anglia, urban planning, Vilfredo Pareto

The existing literature is based on a completely standard expected utility modelling, whereby the welfare of each future generation is evaluated by computing its expected utility based on a probability distribution for the GDP per capita that it will enjoy. A major difficulty, however, is that these probability distributions are ambiguous, in the sense that they are not based on scientific arguments, or on a database large enough to make them completely objective. Indeed, more than one stochastic process is compatible with existing methods for describing economic growth. The Ellsberg paradox tells us that most human beings are averse to ambiguity, which means that they tend to overestimate the probability of the worst-case scenario when computing their subjective expected utility. This suggests that agents systematically violate Savage’s “Sure Thing Principle” (Savage, 1954).


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

"World Economic Forum" Davos, Ada Lovelace, adjacent possible, affirmative action, AI winter, Alfred Russel Wallace, algorithmic bias, algorithmic management, AlphaGo, Amazon Mechanical Turk, animal electricity, autonomous vehicles, behavioural economics, Black Swan, Brexit referendum, British Empire, Cambridge Analytica, cellular automata, Charles Babbage, citizen journalism, Claude Shannon: information theory, combinatorial explosion, Computing Machinery and Intelligence, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, desegregation, discovery of DNA, disinformation, Douglas Hofstadter, Elon Musk, fake news, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Geoffrey Hinton, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, Linda problem, low skilled workers, Mark Zuckerberg, mass immigration, meta-analysis, mutually assured destruction, natural language processing, new economy, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, post-truth, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, Stuart Kauffman, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, warehouse robotics, women in the workforce, Yochai Benkler

v=GGgiGtJk7MA
2. The Pew Research Center, 2017, Public Trust in Government: 1958–2017, www.people-press.org/2017/12/14/public-trust-in-government-1958-2017/
3. Gallup, 2018, Confidence in Institutions, https://news.gallup.com/poll/1597/confidence-institutions.aspx
4. This in fact led to a protracted conversation on the difference between UK, European and American methods of presenting odds, which led to a wasted afternoon of my graduate studies, a sleepless night working all the relationships out, an inferior mid-term exam score in my Stochastic Processes course and hard work to get an A in the end. So I have omitted this for the reader’s benefit.
5. Colin E. Beech, 2008, The Grail and the Golem: The Sociology of Aleatory Artifacts. PhD dissertation. Rensselaer Polytechnic Institute, Troy, NY. Advisor(s) Sal Restivo. AAI3342844. https://dl.acm.org/citation.cfm?


pages: 354 words: 26,550

High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems by Irene Aldridge

algorithmic trading, asset allocation, asset-backed security, automated trading system, backtesting, Black Swan, Brownian motion, business cycle, business process, buy and hold, capital asset pricing model, centralized clearinghouse, collapse of Lehman Brothers, collateralized debt obligation, collective bargaining, computerized trading, diversification, equity premium, fault tolerance, financial engineering, financial intermediation, fixed income, global macro, high net worth, implied volatility, index arbitrage, information asymmetry, interest rate swap, inventory management, Jim Simons, law of one price, Long Term Capital Management, Louis Bachelier, machine readable, margin call, market friction, market microstructure, martingale, Myron Scholes, New Journalism, p-value, paper trading, performance metric, Performance of Mutual Funds in the Period, pneumatic tube, profit motive, proprietary trading, purchasing power parity, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk free rate, risk tolerance, risk-adjusted returns, risk/return, Sharpe ratio, short selling, Small Order Execution System, statistical arbitrage, statistical model, stochastic process, stochastic volatility, systematic trading, tail risk, trade route, transaction costs, value at risk, yield curve, zero-sum game

The market maker is responsible for deciding on and then setting bid and ask prices, receiving all orders, and clearing trades. The market maker’s objective is to maximize profits while avoiding bankruptcy or failure. The latter arise whenever the market maker has no inventory or cash. Both buy and sell orders arrive as independent stochastic processes. The model solution for optimal bid and ask prices lies in the estimation of the rates at which a unit of cash (e.g., a dollar or a “clip” of 10 million in FX) “arrives” to the market maker when a customer comes in to buy securities (pays money to the dealer) and “departs” the market maker when a customer comes in to sell (the dealer pays the customer).
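A toy version of the arrival mechanics just described (a sketch in the model's spirit, not its solution: Poisson arrivals are the standard choice for this setup, and the rates, quotes, and endowments below are made up):

    import numpy as np

    rng = np.random.default_rng(5)
    lam_buy, lam_sell = 10.0, 10.0      # independent Poisson arrival rates per unit time
    bid, ask = 99.0, 101.0              # the market maker's quotes (illustrative)
    cash, inventory = 1_000.0, 10       # starting endowments

    # Merging two independent Poisson streams: each event is a customer buy
    # with probability lam_buy / (lam_buy + lam_sell), otherwise a customer sell.
    for _ in range(100_000):
        if cash <= 0 or inventory <= 0:
            print("failure: ran out of", "cash" if cash <= 0 else "inventory")
            break
        if rng.random() < lam_buy / (lam_buy + lam_sell):
            inventory -= 1; cash += ask   # customer buys at the ask
        else:
            inventory += 1; cash -= bid   # customer sells at the bid
    else:
        print("survived; cash =", cash, "inventory =", inventory)

With symmetric arrival rates the inventory performs a zero-drift random walk, so runs typically end in failure through inventory depletion even as the spread earns cash; that ruin risk is what the model's choice of optimal bid and ask prices is meant to manage.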


pages: 356 words: 105,533

Dark Pools: The Rise of the Machine Traders and the Rigging of the U.S. Stock Market by Scott Patterson

Alan Greenspan, algorithmic trading, automated trading system, banking crisis, bash_history, Bear Stearns, Bernie Madoff, Black Monday: stock market crash in 1987, butterfly effect, buttonwood tree, buy and hold, Chuck Templeton: OpenTable:, cloud computing, collapse of Lehman Brothers, computerized trading, creative destruction, Donald Trump, financial engineering, fixed income, Flash crash, Ford Model T, Francisco Pizarro, Gordon Gekko, Hibernia Atlantic: Project Express, High speed trading, information security, Jim Simons, Joseph Schumpeter, junk bonds, latency arbitrage, Long Term Capital Management, machine readable, Mark Zuckerberg, market design, market microstructure, Michael Milken, military-industrial complex, pattern recognition, payment for order flow, pets.com, Ponzi scheme, popular electronics, prediction markets, quantitative hedge fund, Ray Kurzweil, Renaissance Technologies, seminal paper, Sergey Aleynikov, Small Order Execution System, South China Sea, Spread Networks laid a new fibre optics cable between New York and Chicago, stealth mode startup, stochastic process, three-martini lunch, Tragedy of the Commons, transaction costs, uptick rule, Watson beat the top human players on Jeopardy!, zero-sum game

The following ad for Getco, for instance, appeared in January 2012: CHICAGO, IL: Work with inter-disciplinary teams of traders & technologists & use trading models to trade profitably on major electronic exchanges; use statistical & mathematical approaches & develop new models to leverage trading capabilities. Must have Master’s in Math, Statistics, Physical Science, Computer Science, or Engineering w/min GPA of 3.4/4.0. Must have proven graduate level coursework in 2 or more of the following: Stochastic Processes, Statistical Methods, Mathematical Finance, Applied Numerical Methods, Machine Learning. Then, in the summer of 2011, a new contender for the high-frequency crown had emerged. Virtu Financial, the computer trading outfit that counted former Island attorney and Nasdaq executive Chris Concannon as a partner, merged with EWT, a California speed-trading operation that operated on exchanges around the world.


pages: 363 words: 109,834

The Crux by Richard Rumelt

activist fund / activist shareholder / activist investor, air gap, Airbnb, AltaVista, AOL-Time Warner, Bayesian statistics, behavioural economics, biodiversity loss, Blue Ocean Strategy, Boeing 737 MAX, Boeing 747, Charles Lindbergh, Clayton Christensen, cloud computing, cognitive bias, commoditize, coronavirus, corporate raider, COVID-19, creative destruction, crossover SUV, Crossrail, deep learning, Deng Xiaoping, diversified portfolio, double entry bookkeeping, drop ship, Elon Musk, en.wikipedia.org, financial engineering, Ford Model T, Herman Kahn, income inequality, index card, Internet of things, Jeff Bezos, Just-in-time delivery, Larry Ellison, linear programming, lockdown, low cost airline, low earth orbit, Lyft, Marc Benioff, Mark Zuckerberg, Masayoshi Son, meta-analysis, Myron Scholes, natural language processing, Neil Armstrong, Network effects, packet switching, PageRank, performance metric, precision agriculture, RAND corporation, ride hailing / ride sharing, Salesforce, San Francisco homelessness, search costs, selection bias, self-driving car, shareholder value, sharing economy, Silicon Valley, Skype, Snapchat, social distancing, SoftBank, software as a service, statistical model, Steve Ballmer, Steve Jobs, stochastic process, Teledyne, telemarketer, TSMC, uber lyft, undersea cable, union organizing, vertical integration, WeWork

He noted, “Engineering schools gradually became schools of physics and mathematics; medical schools became schools of biological science; business schools became schools of finite mathematics.”4 My own life experience supports Simon’s comment about the replacement of design with deduction in professional schools. For the academics who currently populate top professional schools, design is a bit like shop class, akin to automobile repair or welding, and residing at a far remove from respectable activities like the mathematical modeling of stochastic processes and the statistical analysis of selection bias. Study marketing in most master’s in business administration (MBA) programs and you will be exposed to theory about consumer behavior and the concept of market segments, but will have little insight into the wide variety of actual company marketing programs.


Data Mining: Concepts and Techniques: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei

backpropagation, bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, disinformation, distributed generation, finite state, industrial research laboratory, information retrieval, information security, iterative process, knowledge worker, linked data, machine readable, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, power law, random walk, recommendation engine, RFID, search costs, semantic web, seminal paper, sentiment analysis, sparse data, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, Thomas Bayes, web application

Clustering-based outlier detection methods are discussed in detail in Section 12.5. 12.3. Statistical Approaches As with statistical methods for clustering, statistical methods for outlier detection make assumptions about data normality. They assume that the normal objects in a data set are generated by a stochastic process (a generative model). Consequently, normal objects occur in regions of high probability for the stochastic model, and objects in the regions of low probability are outliers. The general idea behind statistical methods for outlier detection is to learn a generative model fitting the given data set, and then identify those objects in low-probability regions of the model as outliers.

The kernel density approximation of the probability density function is

    \hat{f}(x) = \frac{1}{n h} \sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right)    (12.9)

where K() is a kernel and h is the bandwidth serving as a smoothing parameter. Once the probability density function of a data set is approximated through kernel density estimation, we can use the estimated density function to detect outliers. For an object o, \hat{f}(o) gives the estimated probability that the object is generated by the stochastic process. If \hat{f}(o) is high, then the object is likely normal. Otherwise, o is likely an outlier. This step is often similar to the corresponding step in parametric methods. In summary, statistical methods for outlier detection learn models from data to distinguish normal data objects from outliers. An advantage of using statistical methods is that the outlier detection may be statistically justifiable.
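A sketch of this kernel-density detection step using SciPy's Gaussian KDE (the library choice and the 1% density threshold are ours, not the book's):

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(6)

    # Mostly "normal" 1-D data from the generative process, plus two planted outliers.
    data = np.concatenate([rng.normal(0.0, 1.0, 500), [8.0, -9.5]])

    kde = gaussian_kde(data)            # bandwidth chosen by Scott's rule by default
    density = kde(data)                 # estimated density \hat{f}(o) at each object

    # Flag the lowest-density objects as outliers (the cutoff is an arbitrary choice here).
    threshold = np.quantile(density, 0.01)
    print(data[density <= threshold])   # includes the planted points 8.0 and -9.5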


pages: 298 words: 43,745

Understanding Sponsored Search: Core Elements of Keyword Advertising by Jim Jansen

AltaVista, AOL-Time Warner, barriers to entry, behavioural economics, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, content marketing, correlation does not imply causation, data science, en.wikipedia.org, first-price auction, folksonomy, Future Shock, information asymmetry, information retrieval, intangible asset, inventory management, life extension, linear programming, longitudinal study, machine translation, megacity, Nash equilibrium, Network effects, PageRank, place-making, power law, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search costs, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social bookmarking, social web, software as a service, stochastic process, tacit knowledge, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, Vilfredo Pareto, yield management

Web Analytics Demystified: A Marketer’s Guide to Understanding How Your Web Site Affects Your Business. New York: Celilo Group Media.
[3] Pedrick, J. H. and Zufryden, F. S. 1991. “Evaluating the Impact of Advertising Media Plans: A Model of Consumer Purchase Dynamics Using Single Source Data.” Marketing Science, vol. 10(2), pp. 111–130.
[4] Penniman, W. D. 1975. “A Stochastic Process Analysis of Online User Behavior.” In The Annual Meeting of the American Society for Information Science, Washington, DC, pp. 147–148.
[5] Meister, D. and Sullivan, D. 1967. “Evaluation of User Reactions to a Prototype On-Line Information Retrieval System: Report to NASA by the Bunker-Ramo Corporation.


pages: 523 words: 112,185

Doing Data Science: Straight Talk From the Frontline by Cathy O'Neil, Rachel Schutt

Amazon Mechanical Turk, augmented reality, Augustin-Louis Cauchy, barriers to entry, Bayesian statistics, bike sharing, bioinformatics, computer vision, confounding variable, correlation does not imply causation, crowdsourcing, data science, distributed generation, Dunning–Kruger effect, Edward Snowden, Emanuel Derman, fault tolerance, Filter Bubble, finite state, Firefox, game design, Google Glasses, index card, information retrieval, iterative process, John Harrison: Longitude, Khan Academy, Kickstarter, machine translation, Mars Rover, Nate Silver, natural language processing, Netflix Prize, p-value, pattern recognition, performance metric, personalized medicine, pull request, recommendation engine, rent-seeking, selection bias, Silicon Valley, speech recognition, statistical model, stochastic process, tacit knowledge, text mining, the scientific method, The Wisdom of Crowds, Watson beat the top human players on Jeopardy!, X Prize

Keep iterating on this, adding degrees of neighbors one further step out each time. In the limit as this iterative process goes on forever, we’ll get the eigenvalue centrality vector.

A First Example of Random Graphs: The Erdős-Rényi Model

Let’s work out a simple example where a network can be viewed as a single realization of an underlying stochastic process. Namely, where the existence of a given edge follows a probability distribution, and all the edges are considered independently. Say we start with n nodes. Then there are n(n − 1)/2 pairs of nodes, or dyads, which can either be connected by an (undirected) edge or not. Then there are 2^{n(n−1)/2} possible observed networks.
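A minimal sketch of drawing one realization from this model (pure NumPy; the values of n and p are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(7)
    n, p = 20, 0.15

    # Each of the n(n-1)/2 dyads gets an edge independently with probability p.
    upper = np.triu(rng.random((n, n)) < p, k=1)   # strict upper triangle = one coin flip per dyad
    adjacency = upper | upper.T                    # symmetrize: undirected graph

    n_dyads = n * (n - 1) // 2
    print(n_dyads, adjacency.sum() // 2)           # 190 dyads; edge count ≈ p * 190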


pages: 561 words: 120,899

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy by Sharon Bertsch McGrayne

Abraham Wald, Alan Greenspan, Bayesian statistics, bioinformatics, Bletchley Park, British Empire, classic study, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, data science, double helix, Dr. Strangelove, driverless car, Edmond Halley, Fellow of the Royal Society, full text search, government statistician, Henri Poincaré, Higgs boson, industrial research laboratory, Isaac Newton, Johannes Kepler, John Markoff, John Nash: game theory, John von Neumann, linear programming, longitudinal study, machine readable, machine translation, meta-analysis, Nate Silver, p-value, Pierre-Simon Laplace, placebo effect, prediction markets, RAND corporation, recommendation engine, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, Robert Mercer, Ronald Reagan, seminal paper, speech recognition, statistical model, stochastic process, Suez canal 1869, Teledyne, the long tail, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, uranium enrichment, We are all Keynesians now, Yom Kippur War

From Tool to Theology Armitage P. (1994) Dennis Lindley: The first 70 years. In Aspects of Uncertainty: A Tribute to D. V. Lindley, eds., PR Freeman and AFM Smith. John Wiley and Sons. Banks, David L. (1996) A Conversation with I. J. Good. Statistical Science (11) 1–19. Dubins LE, Savage LJ. (1976) Inequalities for Stochastic Processes (How to Gamble If You Must). Dover. Box, George EP, et al. (2006) Improving Almost Anything. Wiley. Box GEP, Tiao GC. (1973) Bayesian Inference in Statistical Analysis. Addison-Wesley. Cramér, H. (1976). Half of a century of probability theory: Some personal recollections. Annals of Probability (4) 509–46.


The Trade Lifecycle: Behind the Scenes of the Trading Process (The Wiley Finance Series) by Robert P. Baker

asset-backed security, bank run, banking crisis, Basel III, Black-Scholes formula, book value, Brownian motion, business continuity plan, business logic, business process, collapse of Lehman Brothers, corporate governance, credit crunch, Credit Default Swap, diversification, financial engineering, fixed income, functional programming, global macro, hiring and firing, implied volatility, interest rate derivative, interest rate swap, locking in a profit, London Interbank Offered Rate, low interest rates, margin call, market clearing, millennium bug, place-making, prediction markets, proprietary trading, short selling, statistical model, stochastic process, the market place, the payments system, time value of money, too big to fail, transaction costs, value at risk, Wiener process, yield curve, zero-coupon bond

Some simplifications have to be made because each piece of market data is technically a random variable and its connection (or correlation) to other market data is very hard, if not impossible, to determine.

3. Decide the calculation methodology

There are two basic approaches – stochastic or historical.

Stochastic processes

If a piece of market data is assumed to follow a normal distribution then we can ascribe different probabilities to different values. For example: 1% probability of 140, 5% probability of 155, 50% probability of 182 and so on. This removes the need for a large amount of data but ignores correlation between different market data.
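A small sketch of where such probability/value pairs come from under the normal assumption (the mean and standard deviation below are reverse-engineered from the 1% and 50% figures in the example, so they are our assumption and only approximately reproduce the 5% figure):

    from scipy.stats import norm

    mu, sigma = 182.0, 18.0   # illustrative parameters implied by the 50% and 1% values

    # Values below which the market datum falls with the given probability:
    for prob in (0.01, 0.05, 0.50):
        print(f"{prob:.0%} probability of {norm.ppf(prob, loc=mu, scale=sigma):.0f}")
    # -> 1% of ~140, 5% of ~152, 50% of 182, close to the passage's 140 / 155 / 182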


pages: 611 words: 130,419

Narrative Economics: How Stories Go Viral and Drive Major Economic Events by Robert J. Shiller

agricultural Revolution, Alan Greenspan, Albert Einstein, algorithmic trading, Andrei Shleifer, autism spectrum disorder, autonomous vehicles, bank run, banking crisis, basic income, behavioural economics, bitcoin, blockchain, business cycle, butterfly effect, buy and hold, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, central bank independence, collective bargaining, computerized trading, corporate raider, correlation does not imply causation, cryptocurrency, Daniel Kahneman / Amos Tversky, debt deflation, digital divide, disintermediation, Donald Trump, driverless car, Edmond Halley, Elon Musk, en.wikipedia.org, Ethereum, ethereum blockchain, fake news, financial engineering, Ford Model T, full employment, George Akerlof, germ theory of disease, German hyperinflation, Great Leap Forward, Gunnar Myrdal, Gödel, Escher, Bach, Hacker Ethic, implied volatility, income inequality, inflation targeting, initial coin offering, invention of radio, invention of the telegraph, Jean Tirole, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, litecoin, low interest rates, machine translation, market bubble, Modern Monetary Theory, money market fund, moral hazard, Northern Rock, nudge unit, Own Your Own Home, Paul Samuelson, Philip Mirowski, plutocrats, Ponzi scheme, public intellectual, publish or perish, random walk, Richard Thaler, Robert Shiller, Ronald Reagan, Rubik’s Cube, Satoshi Nakamoto, secular stagnation, shareholder value, Silicon Valley, speech recognition, Steve Jobs, Steven Pinker, stochastic process, stocks for the long run, superstar cities, The Rise and Fall of American Growth, The Theory of the Leisure Class by Thorstein Veblen, The Wealth of Nations by Adam Smith, theory of mind, Thorstein Veblen, traveling salesman, trickle-down economics, tulip mania, universal basic income, Watson beat the top human players on Jeopardy!, We are the 99%, yellow journalism, yield curve, Yom Kippur War

JSTOR catalogs over nine million scholarly articles and books in all fields, and 7% of these are in business or economics, but 25% of the articles with “ARIMA,” “ARMA,” or “autoregressive” are in business or economics. 9. Moving average models are sometimes justified by reference to the Wold decomposition theorem (1954), which shows that any covariance stationary stochastic process can be modeled as a moving average of noise terms plus a deterministic component. But there is no justification for assuming that simple variants of ARIMA models are so general. We may be better able to do economic forecasting in some cases if we represent these error terms or driving variables as the result of co-epidemics of narratives about which we have some information. 10.
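A minimal NumPy illustration of the moving-average representation invoked in note 9 (the MA(2) coefficients are arbitrary):

    import numpy as np

    rng = np.random.default_rng(8)
    eps = rng.normal(size=10_000)          # white-noise innovations
    theta = np.array([1.0, 0.6, 0.3])      # arbitrary MA(2) coefficients

    # Covariance-stationary process built as a moving average of noise terms:
    # x_t = eps_t + 0.6 * eps_{t-1} + 0.3 * eps_{t-2}
    x = np.convolve(eps, theta, mode="valid")

    # Autocorrelations vanish beyond lag 2, as the finite MA structure implies.
    for lag in range(4):
        print(lag, np.corrcoef(x[:-lag or None], x[lag:])[0, 1].round(3))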


pages: 1,535 words: 337,071

Networks, Crowds, and Markets: Reasoning About a Highly Connected World by David Easley, Jon Kleinberg

Albert Einstein, AltaVista, AOL-Time Warner, Apollo 13, classic study, clean water, conceptual framework, Daniel Kahneman / Amos Tversky, Douglas Hofstadter, Dutch auction, Erdős number, experimental subject, first-price auction, fudge factor, Garrett Hardin, George Akerlof, Gerard Salton, Gerard Salton, Gödel, Escher, Bach, incomplete markets, information asymmetry, information retrieval, John Nash: game theory, Kenneth Arrow, longitudinal study, market clearing, market microstructure, moral hazard, Nash equilibrium, Network effects, Pareto efficiency, Paul Erdős, planetary scale, power law, prediction markets, price anchoring, price mechanism, prisoner's dilemma, random walk, recommendation engine, Richard Thaler, Ronald Coase, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, seminal paper, Simon Singh, slashdot, social contagion, social web, Steve Jobs, Steve Jurvetson, stochastic process, Ted Nelson, the long tail, The Market for Lemons, the strength of weak ties, The Wisdom of Crowds, trade route, Tragedy of the Commons, transaction costs, two and twenty, ultimatum game, Vannevar Bush, Vickrey auction, Vilfredo Pareto, Yogi Berra, zero-sum game

Emergence of scaling in random networks. Science, 286:509–512, 1999. [42] Albert-László Barabási and Zoltan Oltvai. Network biology: Understanding the cell’s functional organization. Nature Reviews Genetics, 5:101–113, 2004. [43] A. D. Barbour and D. Mollison. Epidemics and random graphs. In Stochastic Processes in Epidemic Theory, volume 86 of Lecture Notes in Biomathematics, pages 86–89. Springer, 1990. [44] John A. Barnes. Social Networks. Number 26 in Modules in Anthropology. Addison Wesley, 1972. [45] Chris Barrett and E. Mutambatsere. Agricultural markets in developing countries. In Lawrence E.

Reverse small world experiment. Social Networks, 1:159–192, 1978. [240] Peter D. Killworth, Eugene C. Johnsen, H. Russell Bernard, Gene Ann Shelley, and Christopher McCarty. Estimating the size of personal networks. Social Networks, 12(4):289–312, December 1990. [241] John F. C. Kingman. The coalescent. Stochastic Processes and their Applications, 13:235–248, 1982. [242] Aniket Kittur and Robert E. Kraut. Harnessing the wisdom of crowds in Wikipedia: Quality through coordination. In Proc. CSCW’08: ACM Conference on Computer-Supported Cooperative Work, 2008. [243] Jon Kleinberg. Authoritative sources in a hyperlinked environment.


Principles of Protocol Design by Robin Sharp

accounting loophole / creative accounting, business process, discrete time, exponential backoff, fault tolerance, finite state, functional programming, Gödel, Escher, Bach, information retrieval, loose coupling, MITM: man-in-the-middle, OSI model, packet switching, quantum cryptography, RFC: Request For Comment, stochastic process

This will lead you into the area of other proof techniques for protocols, as well as illustrating how new mechanisms develop as time goes by. Finally, you might like to investigate quantitative properties of some protocols, such as their throughput and delay in the presence of varying loads of traffic. Generally speaking, this requires a knowledge of queueing theory and the theory of stochastic processes. This is not a subject which we pay more than passing attention to in this book. However, some protocols, especially multiplexing protocols, have been the subject of intensive investigation from this point of view. Good discussions of the general theory required are found in [73], while [11] relates the theory more explicitly to the analysis of network protocols.
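To make the quantitative angle concrete, here is a minimal sketch of the sort of queueing-theory calculation involved, using the textbook M/M/1 model (my choice for illustration; the book itself only points to [73] and [11]):

```python
# Minimal sketch of the throughput/delay quantities mentioned above, for
# an M/M/1 queue (a model chosen for illustration, not from the book):
# Poisson arrivals at rate lam, exponential service at rate mu.

def mm1_metrics(lam: float, mu: float) -> dict:
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    rho = lam / mu                               # link utilization
    return {
        "utilization": rho,
        "mean_in_system": rho / (1 - rho),       # average number queued/in service
        "mean_delay": 1 / (mu - lam),            # average time in system
        "throughput": lam,                       # all offered traffic is carried
    }

# Example: packets arrive at 80/s on a link that can serve 100/s.
print(mm1_metrics(lam=80.0, mu=100.0))
```

Note how the mean delay 1/(mu - lam) blows up as the load approaches capacity, which is the behavior such analyses of multiplexing protocols are designed to expose.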


pages: 696 words: 143,736

The Age of Spiritual Machines: When Computers Exceed Human Intelligence by Ray Kurzweil

Ada Lovelace, Alan Greenspan, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Alvin Toffler, Any sufficiently advanced technology is indistinguishable from magic, backpropagation, Buckminster Fuller, call centre, cellular automata, Charles Babbage, classic study, combinatorial explosion, complexity theory, computer age, computer vision, Computing Machinery and Intelligence, cosmological constant, cosmological principle, Danny Hillis, double helix, Douglas Hofstadter, Everything should be made as simple as possible, financial engineering, first square of the chessboard / second half of the chessboard, flying shuttle, fudge factor, functional programming, George Gilder, Gödel, Escher, Bach, Hans Moravec, I think there is a world market for maybe five computers, information retrieval, invention of movable type, Isaac Newton, iterative process, Jacquard loom, John Gilmore, John Markoff, John von Neumann, Lao Tzu, Law of Accelerating Returns, mandelbrot fractal, Marshall McLuhan, Menlo Park, natural language processing, Norbert Wiener, optical character recognition, ought to be enough for anybody, pattern recognition, phenotype, punch-card reader, quantum entanglement, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Robert Metcalfe, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, social intelligence, speech recognition, Steven Pinker, Stewart Brand, stochastic process, Stuart Kauffman, technological singularity, Ted Kaczynski, telepresence, the medium is the message, The Soul of a New Machine, There's no reason for any individual to have a computer in his home - Ken Olsen, traveling salesman, Turing machine, Turing test, Whole Earth Review, world market for maybe five computers, Y2K

.: Smithsonian Institution Press, 1986. Hoage, R. J. and Larry Goldman. Animal Intelligence: Insights into the Animal Mind. Washington, D.C.: Smithsonian Institution Press, 1986. Hodges, Andrew. Alan Turing: The Enigma. New York: Simon and Schuster, 1983. Hoel, Paul G., Sidney C. Port, and Charles J. Stone. Introduction to Stochastic Processes. Boston: Houghton-Mifflin, 1972. Hofstadter, Douglas R. Gödel, Escher, Bach: An Eternal Golden Braid. New York: Basic Books, 1979. _________. Metamagical Themas: Questing for the Essence of Mind and Pattern. New York: Basic Books, 1985. Hofstadter, Douglas R. and Daniel C. Dennett. The Mind’s I: Fantasies and Reflections on Self and Soul.


pages: 542 words: 145,022

In Pursuit of the Perfect Portfolio: The Stories, Voices, and Key Insights of the Pioneers Who Shaped the Way We Invest by Andrew W. Lo, Stephen R. Foerster

Alan Greenspan, Albert Einstein, AOL-Time Warner, asset allocation, backtesting, behavioural economics, Benoit Mandelbrot, Black Monday: stock market crash in 1987, Black-Scholes formula, Bretton Woods, Brownian motion, business cycle, buy and hold, capital asset pricing model, Charles Babbage, Charles Lindbergh, compound rate of return, corporate governance, COVID-19, credit crunch, currency risk, Daniel Kahneman / Amos Tversky, diversification, diversified portfolio, Donald Trump, Edward Glaeser, equity premium, equity risk premium, estate planning, Eugene Fama: efficient market hypothesis, fake news, family office, fear index, fiat currency, financial engineering, financial innovation, financial intermediation, fixed income, hiring and firing, Hyman Minsky, implied volatility, index fund, interest rate swap, Internet Archive, invention of the wheel, Isaac Newton, Jim Simons, John Bogle, John Meriwether, John von Neumann, joint-stock company, junk bonds, Kenneth Arrow, linear programming, Long Term Capital Management, loss aversion, Louis Bachelier, low interest rates, managed futures, mandelbrot fractal, margin call, market bubble, market clearing, mental accounting, money market fund, money: store of value / unit of account / medium of exchange, Myron Scholes, new economy, New Journalism, Own Your Own Home, passive investing, Paul Samuelson, Performance of Mutual Funds in the Period, prediction markets, price stability, profit maximization, quantitative trading / quantitative finance, RAND corporation, random walk, Richard Thaler, risk free rate, risk tolerance, risk-adjusted returns, risk/return, Robert Shiller, Robert Solow, Ronald Reagan, Savings and loan crisis, selection bias, seminal paper, shareholder value, Sharpe ratio, short selling, South Sea Bubble, stochastic process, stocks for the long run, survivorship bias, tail risk, Thales and the olive presses, Thales of Miletus, The Myth of the Rational Market, The Wisdom of Crowds, Thomas Bayes, time value of money, transaction costs, transfer pricing, tulip mania, Vanguard fund, yield curve, zero-coupon bond, zero-sum game

In one paper, he tackled the important issue of the decision faced by every investor, known formally as the “portfolio selection problem”: deciding how much to consume today versus saving for tomorrow and allocating those savings between risky and risk-free investments (for example, buying Treasury bills), all the while trying to maximize lifetime utility or satisfaction.22 In another paper, published just after completing his dissertation, he examined the same problem using a more realistic “continuous-time” framework, in which prices are constantly changing.23 Merton’s experience and knowledge of the financial markets inspired many of the assumptions that were incorporated into his models: “Because I traded markets, I knew something about the idea that even if you were watching the [ticker] tape very, very close to it, it’s still the case that you couldn’t predict the next price. So, if AT&T was trading, its next trade could be the same. It could be down or up. That was very hard to forecast, no matter how short the interval. And anything I did had to capture that.”24 His pathbreaking work on continuous-time stochastic processes would eventually culminate in a highly regarded book, Continuous-Time Finance.25 As his eventual student Robert Jarrow noted years later, “I consider Bob the father of mathematical finance.… Bob invented continuous-time finance. And, continuous-time finance is the heart of mathematical finance.… Nowhere else in business is the flow of ideas between industry and academics as fluid as it is in mathematical finance.”26 In the final chapter of his dissertation, Merton performed an empirical investigation of Samuelson’s warrant pricing model.27 He looked at three “perpetual” warrants (warrants lacking a maturity date) to purchase shares, issued by the companies Tri-Continental, Allegheny, and Atlas, and found that the Samuelson model generally performed better than some of the proposed alternatives.
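Merton's “you couldn't predict the next price” is the defining property of the models he built. A minimal simulation (mine, not from the book) of geometric Brownian motion, the workhorse process of continuous-time finance, shows that successive moves carry essentially no information about one another:

```python
# Sketch (not from the book) of the continuous-time price model behind
# Merton's remark: geometric Brownian motion, whose log-increments are
# independent, so the next move is unforecastable from the path so far.
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, dt, n = 0.05, 0.20, 1 / 252, 252   # illustrative parameters

# S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z)
z = rng.standard_normal(n)
log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
prices = 100.0 * np.exp(np.cumsum(log_increments))

# Correlation between successive moves is ~0: yesterday tells you nothing.
moves = np.diff(np.log(prices))
print("autocorrelation of successive moves:",
      np.corrcoef(moves[:-1], moves[1:])[0, 1])
```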


pages: 1,331 words: 163,200

Hands-On Machine Learning With Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems by Aurélien Géron

AlphaGo, Amazon Mechanical Turk, Anton Chekhov, backpropagation, combinatorial explosion, computer vision, constrained optimization, correlation coefficient, crowdsourcing, data science, deep learning, DeepMind, don't repeat yourself, duck typing, Elon Musk, en.wikipedia.org, friendly AI, Geoffrey Hinton, ImageNet competition, information retrieval, iterative process, John von Neumann, Kickstarter, machine translation, natural language processing, Netflix Prize, NP-complete, OpenAI, optical character recognition, P = NP, p-value, pattern recognition, pull request, recommendation engine, self-driving car, sentiment analysis, SpamAssassin, speech recognition, stochastic process

Whereas PG algorithms directly try to optimize the policy to increase rewards, the algorithms we will look at now are less direct: the agent learns to estimate the expected sum of discounted future rewards for each state, or the expected sum of discounted future rewards for each action in each state, then uses this knowledge to decide how to act. To understand these algorithms, we must first introduce Markov decision processes (MDP). Markov Decision Processes In the early 20th century, the mathematician Andrey Markov studied stochastic processes with no memory, called Markov chains. Such a process has a fixed number of states, and it randomly evolves from one state to another at each step. The probability for it to evolve from a state s to a state s′ is fixed, and it depends only on the pair (s,s′), not on past states (the system has no memory).
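A Markov chain of the kind described takes only a few lines to simulate; the transition matrix below is illustrative, not one of the book's examples:

```python
# Minimal Markov chain as described above: a fixed set of states and a
# transition matrix whose row s gives P(next = s' | current = s).
# The matrix is illustrative, not taken from the book.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.7, 0.2, 0.1],    # from state 0
              [0.0, 0.5, 0.5],    # from state 1
              [0.8, 0.1, 0.1]])   # from state 2

state, path = 0, [0]
for _ in range(10):
    # Memorylessness: the draw depends only on the current state,
    # never on how the chain got there.
    state = rng.choice(3, p=P[state])
    path.append(int(state))
print(path)
```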


pages: 634 words: 185,116

From eternity to here: the quest for the ultimate theory of time by Sean M. Carroll

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, cellular automata, Claude Shannon: information theory, Columbine, cosmic microwave background, cosmological constant, cosmological principle, dark matter, dematerialisation, double helix, en.wikipedia.org, gravity well, Great Leap Forward, Harlow Shapley and Heber Curtis, heat death of the universe, Henri Poincaré, Isaac Newton, Johannes Kepler, John von Neumann, Lao Tzu, Laplace demon, Large Hadron Collider, lone genius, low earth orbit, New Journalism, Norbert Wiener, pets.com, Pierre-Simon Laplace, Richard Feynman, Richard Stallman, Schrödinger's Cat, Slavoj Žižek, Stephen Hawking, stochastic process, synthetic biology, the scientific method, time dilation, wikimedia commons

Once the system is in that eigenstate, you can keep making the same kind of observation, and you’ll keep getting the same answer (unless something kicks the system out of the eigenstate into another superposition). We can’t say with certainty which eigenstate the system will fall into when an observation is made; it’s an inherently stochastic process, and the best we can do is assign a probability to different outcomes. We can apply this idea to the story of Miss Kitty. According to the Copenhagen interpretation, our choice to observe whether she stopped by the food bowl or the scratching post had a dramatic effect on her wave function, no matter how sneaky we were about it.
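The stochastic rule described here fits in a few lines: outcome probabilities are the squared magnitudes of the amplitudes, and once an outcome is obtained, repeating the observation returns the same answer. A toy sketch (mine, not Carroll's):

```python
# Toy sketch (not from the book) of the stochastic measurement rule:
# outcome probabilities are squared amplitude magnitudes, and once the
# state has "collapsed", repeating the measurement repeats the answer.
import numpy as np

rng = np.random.default_rng(1)
amplitudes = np.array([0.6, 0.8j])        # |food bowl>, |scratching post>
probs = np.abs(amplitudes) ** 2           # Born rule: [0.36, 0.64]
probs = probs / probs.sum()               # guard against float round-off

outcome = rng.choice(["food bowl", "scratching post"], p=probs)
print("first observation:", outcome)
# The system is now in that eigenstate; an identical observation repeats it.
print("repeat observation:", outcome)
```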


pages: 728 words: 182,850

Cooking for Geeks by Jeff Potter

3D printing, A Pattern Language, air gap, carbon footprint, centre right, Community Supported Agriculture, Computer Numeric Control, crowdsourcing, Donald Knuth, double helix, en.wikipedia.org, European colonialism, fear of failure, food miles, functional fixedness, hacker house, haute cuisine, helicopter parent, Internet Archive, iterative process, Kickstarter, lolcat, Parkinson's law, placebo effect, random walk, Rubik’s Cube, slashdot, stochastic process, TED Talk, the scientific method

Note For fun, try marinating a chunk of meat in papaya, which contains an enzyme, papain, that acts as a meat tenderizer by hydrolyzing collagen. One piece of information that is critical to understand in the kitchen, however, is that hydrolysis takes time. The structure has to literally untwist and break up, and due to the amount of energy needed to break the bonds and the stochastic processes involved, this reaction takes longer than simply denaturing the protein. Hydrolyzing collagen not only breaks down the rubbery texture of the denatured structure, but also converts a portion of it to gelatin. When the collagen hydrolyzes, it breaks into variously sized pieces, the smaller of which are able to dissolve into the surrounding liquid, creating gelatin.


pages: 733 words: 179,391

Adaptive Markets: Financial Evolution at the Speed of Thought by Andrew W. Lo

Alan Greenspan, Albert Einstein, Alfred Russel Wallace, algorithmic trading, Andrei Shleifer, Arthur Eddington, Asian financial crisis, asset allocation, asset-backed security, backtesting, bank run, barriers to entry, Bear Stearns, behavioural economics, Berlin Wall, Bernie Madoff, bitcoin, Bob Litterman, Bonfire of the Vanities, bonus culture, break the buck, Brexit referendum, Brownian motion, business cycle, business process, butterfly effect, buy and hold, capital asset pricing model, Captain Sullenberger Hudson, carbon tax, Carmen Reinhart, collapse of Lehman Brothers, collateralized debt obligation, commoditize, computerized trading, confounding variable, corporate governance, creative destruction, Credit Default Swap, credit default swaps / collateralized debt obligations, cryptocurrency, Daniel Kahneman / Amos Tversky, delayed gratification, democratizing finance, Diane Coyle, diversification, diversified portfolio, do well by doing good, double helix, easy for humans, difficult for computers, equity risk premium, Ernest Rutherford, Eugene Fama: efficient market hypothesis, experimental economics, experimental subject, Fall of the Berlin Wall, financial deregulation, financial engineering, financial innovation, financial intermediation, fixed income, Flash crash, Fractional reserve banking, framing effect, Glass-Steagall Act, global macro, Gordon Gekko, greed is good, Hans Rosling, Henri Poincaré, high net worth, housing crisis, incomplete markets, index fund, information security, interest rate derivative, invention of the telegraph, Isaac Newton, it's over 9,000, James Watt: steam engine, Jeff Hawkins, Jim Simons, job satisfaction, John Bogle, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Meriwether, Joseph Schumpeter, Kenneth Rogoff, language acquisition, London Interbank Offered Rate, Long Term Capital Management, longitudinal study, loss aversion, Louis Pasteur, mandelbrot fractal, margin call, Mark Zuckerberg, market fundamentalism, martingale, megaproject, merger arbitrage, meta-analysis, Milgram experiment, mirror neurons, money market fund, moral hazard, Myron Scholes, Neil Armstrong, Nick Leeson, old-boy network, One Laptop per Child (OLPC), out of africa, p-value, PalmPilot, paper trading, passive investing, Paul Lévy, Paul Samuelson, Paul Volcker talking about ATMs, Phillips curve, Ponzi scheme, predatory finance, prediction markets, price discovery process, profit maximization, profit motive, proprietary trading, public intellectual, quantitative hedge fund, quantitative trading / quantitative finance, RAND corporation, random walk, randomized controlled trial, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, risk tolerance, Robert Shiller, Robert Solow, Sam Peltzman, Savings and loan crisis, seminal paper, Shai Danziger, short selling, sovereign wealth fund, Stanford marshmallow experiment, Stanford prison experiment, statistical arbitrage, Steven Pinker, stochastic process, stocks for the long run, subprime mortgage crisis, survivorship bias, systematic bias, Thales and the olive presses, The Great Moderation, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Malthus, Thorstein Veblen, Tobin tax, too big to fail, transaction costs, Triangle Shirtwaist Factory, ultimatum game, uptick rule, Upton Sinclair, US Airways Flight 1549, Walter Mischel, Watson beat the top human players on Jeopardy!, WikiLeaks, Yogi Berra, zero-sum game

But my widespread lecturing on him over the last 20 years has not elicited any information on the subject. How much Poincaré, to whom he dedicates the thesis, contributed to it, I have no knowledge. Finally, as Bachelier’s cited works suggest, he seems to have had something of a one-track mind. But what a track! The rather supercilious references to him, as an unrigorous pioneer in stochastic processes and stimulator of work in that area by more rigorous mathematicians such as Kolmogorov, hardly do Bachelier justice. His methods can hold their own in rigor with the best scientific work of his time, and his fertility was outstanding. Einstein is properly revered for his basic, and independent, discovery of the theory of Brownian motion 5 years after Bachelier.


pages: 823 words: 220,581

Debunking Economics - Revised, Expanded and Integrated Edition: The Naked Emperor Dethroned? by Steve Keen

accounting loophole / creative accounting, Alan Greenspan, banking crisis, banks create money, barriers to entry, behavioural economics, Benoit Mandelbrot, Big bang: deregulation of the City of London, Black Swan, Bonfire of the Vanities, book value, business cycle, butterfly effect, capital asset pricing model, cellular automata, central bank independence, citizen journalism, clockwork universe, collective bargaining, complexity theory, correlation coefficient, creative destruction, credit crunch, David Ricardo: comparative advantage, debt deflation, diversification, double entry bookkeeping, en.wikipedia.org, equity risk premium, Eugene Fama: efficient market hypothesis, experimental subject, Financial Instability Hypothesis, fixed income, Fractional reserve banking, full employment, Glass-Steagall Act, Greenspan put, Henri Poincaré, housing crisis, Hyman Minsky, income inequality, information asymmetry, invisible hand, iterative process, John von Neumann, Kickstarter, laissez-faire capitalism, liquidity trap, Long Term Capital Management, low interest rates, mandelbrot fractal, margin call, market bubble, market clearing, market microstructure, means of production, minimum wage unemployment, Money creation, money market fund, open economy, Pareto efficiency, Paul Samuelson, Phillips curve, place-making, Ponzi scheme, Post-Keynesian economics, power law, profit maximization, quantitative easing, RAND corporation, random walk, risk free rate, risk tolerance, risk/return, Robert Shiller, Robert Solow, Ronald Coase, Savings and loan crisis, Schrödinger's Cat, scientific mainstream, seigniorage, six sigma, South Sea Bubble, stochastic process, The Great Moderation, The Wealth of Nations by Adam Smith, Thorstein Veblen, time value of money, total factor productivity, tulip mania, wage slave, zero-sum game

I therefore suggest that the economists revise their curriculum and require that the following topics be taught: calculus through the advanced level, ordinary differential equations (including advanced), partial differential equations (including Green functions), classical mechanics through modern nonlinear dynamics, statistical physics, stochastic processes (including solving Smoluchowski–Fokker–Planck equations), computer programming (C, Pascal, etc.) and, for complexity, cell biology. Time for such classes can be obtained in part by eliminating micro- and macro-economics classes from the curriculum. The students will then face a much harder curriculum, and those who survive will come out ahead.


pages: 920 words: 233,102

Unelected Power: The Quest for Legitimacy in Central Banking and the Regulatory State by Paul Tucker

"Friedman doctrine" OR "shareholder theory", Alan Greenspan, Andrei Shleifer, bank run, banking crisis, barriers to entry, Basel III, battle of ideas, Bear Stearns, Ben Bernanke: helicopter money, Berlin Wall, Bretton Woods, Brexit referendum, business cycle, capital controls, Carmen Reinhart, Cass Sunstein, central bank independence, centre right, conceptual framework, corporate governance, diversified portfolio, electricity market, Fall of the Berlin Wall, financial innovation, financial intermediation, financial repression, first-past-the-post, floating exchange rates, forensic accounting, forward guidance, Fractional reserve banking, Francis Fukuyama: the end of history, full employment, George Akerlof, Greenspan put, incomplete markets, inflation targeting, information asymmetry, invisible hand, iterative process, Jean Tirole, Joseph Schumpeter, Kenneth Arrow, Kenneth Rogoff, liberal capitalism, light touch regulation, Long Term Capital Management, low interest rates, means of production, Money creation, money market fund, Mont Pelerin Society, moral hazard, Northern Rock, operational security, Pareto efficiency, Paul Samuelson, price mechanism, price stability, principal–agent problem, profit maximization, public intellectual, quantitative easing, regulatory arbitrage, reserve currency, risk free rate, risk tolerance, risk-adjusted returns, road to serfdom, Robert Bork, Ronald Coase, seigniorage, short selling, Social Responsibility of Business Is to Increase Its Profits, stochastic process, subprime mortgage crisis, tail risk, The Chicago School, The Great Moderation, The Market for Lemons, the payments system, too big to fail, transaction costs, Vilfredo Pareto, Washington Consensus, yield curve, zero-coupon bond, zero-sum game

That much is entailed by the first Design Precept, cast as a revived “nondelegation doctrine” in part III (chapter 14). The big questions are what it means in principle and in practice. Roughly speaking, policy makers need to determine the severity of shock that the system should be able to withstand. In principle, that would be driven by three things (combined numerically in the sketch after this list):
- A view of the underlying (stochastic) process generating the first-round losses from end borrowers that hit the system
- A picture (or model) of the structure of the financial system through which those losses and other shocks are transmitted around the system
- A tolerance for systemic crisis
The first and second are properly objects of scientific inquiry by technocrats and researchers.
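One illustrative way (an assumption-laden sketch, not the book's prescription) to combine the three ingredients: draw first-round losses from an assumed process, apply a crude transmission multiplier standing in for the system model, and read off the severity to withstand at the chosen tolerance.

```python
# Illustration (not from the book) of combining the three ingredients:
# an assumed loss process, a crude amplification factor standing in for
# the structure of the system, and a tolerance expressed as a quantile.
import numpy as np

rng = np.random.default_rng(7)
first_round = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # assumed loss process
amplification = 1.8        # stand-in for transmission through the system
system_losses = amplification * first_round

tolerance = 0.001          # accept a systemic crisis in 1 of 1,000 scenarios
severity = np.quantile(system_losses, 1 - tolerance)
print(f"system should withstand losses up to {severity:.1f}")
```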


pages: 798 words: 240,182

The Transhumanist Reader by Max More, Natasha Vita-More

"World Economic Forum" Davos, 23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, Computing Machinery and Intelligence, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Douglas Engelbart, Drosophila, en.wikipedia.org, endogenous growth, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, Future Shock, game design, germ theory of disease, Hans Moravec, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, Louis Pasteur, Menlo Park, meta-analysis, moral hazard, Network effects, Nick Bostrom, Norbert Wiener, pattern recognition, Pepto Bismol, phenotype, positional goods, power law, precautionary principle, prediction markets, presumed consent, Project Xanadu, public intellectual, radical life extension, Ray Kurzweil, reversible computing, RFID, Ronald Reagan, scientific worldview, silicon-based life, Singularitarianism, social intelligence, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, synthetic biology, systems thinking, technological determinism, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, VTOL, Whole Earth Review, women in the workforce, zero-sum game

One might ask: How do we get from 10^7 bytes that specify the brain in the genome to 10^16 bytes in the mature brain? This is not hard to understand, since we do this type of meaningful data expansion routinely in our self-organizing software paradigms. For example, a genetic algorithm can be efficiently coded, but in turn creates data far greater in size than itself using a stochastic process, which in turn self-organizes in response to a complex environment (the problem space). The result of this process is meaningful information far greater than the original program. We know that this is exactly how the creation of the brain works. The genome specifies initially semi-random interneuronal connection wiring patterns in specific regions of the brain (random within certain constraints and rules), and these patterns (along with the neurotransmitter-concentration levels) then undergo their own internal evolutionary process to self-organize to reflect the interactions of that person with their experiences and environment.
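A generic genetic-algorithm sketch (not Kurzweil's code) makes the point concrete: a program of a few dozen lines, driven by a stochastic process, generates and organizes a population whose total state dwarfs the program itself.

```python
# Generic genetic-algorithm sketch (not Kurzweil's): a small program plus
# randomness generates and organizes far more state than it contains.
import random

random.seed(0)
GENOME_LEN, POP, GENERATIONS = 64, 200, 100

def fitness(g):                    # toy objective: count of 1-bits
    return sum(g)

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[: POP // 2]            # selection
    children = []
    for _ in range(POP - len(survivors)):
        a, b = random.sample(survivors, 2)
        cut = random.randrange(GENOME_LEN)  # crossover
        child = a[:cut] + b[cut:]
        i = random.randrange(GENOME_LEN)    # point mutation
        child[i] ^= 1
        children.append(child)
    pop = survivors + children

print("best fitness:", fitness(max(pop, key=fitness)), "of", GENOME_LEN)
```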


pages: 764 words: 261,694

The Elements of Statistical Learning (Springer Series in Statistics) by Trevor Hastie, Robert Tibshirani, Jerome Friedman

algorithmic bias, backpropagation, Bayesian statistics, bioinformatics, computer age, conceptual framework, correlation coefficient, data science, G4S, Geoffrey Hinton, greed is good, higher-order functions, linear programming, p-value, pattern recognition, random walk, selection bias, sparse data, speech recognition, statistical model, stochastic process, The Wisdom of Crowds

Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images, IEEE Transactions on Pattern Analysis and Machine Intelligence 6: 721–741. Genkin, A., Lewis, D. and Madigan, D. (2007). Large-scale Bayesian logistic regression for text categorization, Technometrics 49(3): 291–304. Genovese, C. and Wasserman, L. (2004). A stochastic process approach to false discovery rates, Annals of Statistics 32(3): 1035–1061. Gersho, A. and Gray, R. (1992). Vector Quantization and Signal Compression, Kluwer Academic Publishers, Boston, MA. Girosi, F., Jones, M. and Poggio, T. (1995). Regularization theory and neural network architectures, Neural Computation 7: 219–269.


The Art of Computer Programming: Fundamental Algorithms by Donald E. Knuth

Charles Babbage, discrete time, distributed generation, Donald Knuth, fear of failure, Fermat's Last Theorem, G4S, Gerard Salton, Isaac Newton, Ivan Sutherland, Jacquard loom, Johannes Kepler, John von Neumann, linear programming, linked data, Menlo Park, probability theory / Blaise Pascal / Pierre de Fermat, sorting algorithm, stochastic process, Turing machine

Suppose each arc e of G has been assigned a probability p(e), where the probabilities satisfy the conditions 0 ≤ p(e) ≤ 1 and Σ p(e) = 1 (summed over all arcs e with init(e) = Vj), for 1 ≤ j < n. Consider a random path, which starts at V1 and subsequently chooses branch e of G with probability p(e), until Vn is reached; the choice of branch taken at each step is to be independent of all previous choices. For example, consider the graph of exercise 2.3.4.1-7, and assign suitable fractional probabilities to the arcs e1, e2, ..., e9. Then the path "Start-A-B-C-A-D-B-C-Stop" is chosen with probability equal to the product of the probabilities of the arcs it traverses. Such random paths are called Markov chains, after the Russian mathematician Andrei A. Markov, who first made extensive studies of stochastic processes of this kind. The situation serves as a model for certain algorithms, although our requirement that each choice must be independent of the others is a very strong assumption. The purpose of this exercise is to analyze the computation time for algorithms of this kind. The analysis is facilitated by considering the n × n matrix A = (aij), where aij = Σ p(e), summed over all arcs e that go from Vi to Vj.
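The matrix analysis Knuth describes can be reproduced numerically: when Vn is absorbing, the expected number of visits to each transient vertex starting from V1 is given by the fundamental matrix N = (I - Q)^(-1), where Q is A restricted to the transient vertices. A sketch with illustrative probabilities (not the exercise's):

```python
# Numerical version of the analysis above: for a random path on a graph
# with transition matrix A, the expected number of visits to each
# transient vertex, starting from V1, comes from the fundamental matrix
# N = (I - Q)^-1. The probabilities are illustrative, not the exercise's.
import numpy as np

# States: V1 (start), V2, V3, V4 (absorbing "Stop"); each row sums to 1.
A = np.array([[0.0, 0.5, 0.5, 0.0],
              [0.0, 0.0, 0.5, 0.5],
              [0.5, 0.0, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])

Q = A[:3, :3]                            # transient-to-transient part
N = np.linalg.inv(np.eye(3) - Q)         # expected visit counts
print("expected visits to V1..V3 starting from V1:", N[0])
print("expected number of steps before Stop:", N[0].sum())
```

The row sum of N gives the expected path length, which is exactly the "computation time" the exercise sets out to analyze.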


pages: 1,799 words: 532,462

The Codebreakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet by David Kahn

anti-communist, Bletchley Park, British Empire, Charles Babbage, classic study, Claude Shannon: information theory, computer age, cotton gin, cuban missile crisis, Easter island, end-to-end encryption, Fellow of the Royal Society, heat death of the universe, Honoré de Balzac, index card, interchangeable parts, invention of the telegraph, Isaac Newton, Johannes Kepler, John von Neumann, Louis Daguerre, machine translation, Maui Hawaii, Norbert Wiener, out of africa, pattern recognition, place-making, planned obsolescence, Plato's cave, pneumatic tube, popular electronics, positional goods, Republic of Letters, Searching for Interstellar Communications, stochastic process, Suez canal 1869, the scientific method, trade route, Turing machine, union organizing, yellow journalism, zero-sum game

The second section, STED (for “Standard Technical Equipment Development”), conducts basic cryptographic research. It looks for new principles of encipherment. It ascertains whether new developments in technology, such as the transistor and the tunnel diode, have cryptographic applications. Using such esoteric tools as Galois field theory, stochastic processes, and group, matrix, and number theory, it will construct a mathematical model of a proposed cipher machine and will simulate its operation on a computer, thus producing the cipher without having to build the hardware. Rotor principles have often been tested for cryptographic strength in this way.
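Simulating a cipher machine in software before building hardware, as the passage describes, is straightforward; a toy single-stepping-rotor machine (a generic sketch, in no way one of the designs STED actually studied) shows the idea:

```python
# Toy illustration of simulating a cipher machine in software rather than
# building hardware: one stepping rotor (a generic sketch; the wiring used
# here is the historical Enigma rotor I, chosen only for concreteness).
import string

ALPHA = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # fixed substitution wiring

def encipher(plaintext: str) -> str:
    out, offset = [], 0
    for ch in plaintext:
        if ch in ALPHA:
            # pass the letter through the rotor at its current offset
            i = (ALPHA.index(ch) + offset) % 26
            out.append(ROTOR[i])
            offset += 1                 # the rotor steps after each letter
        else:
            out.append(ch)
    return "".join(out)

print(encipher("ATTACK AT DAWN"))
```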