stochastic process

47 results


pages: 416 words: 39,022

Asset and Risk Management: Risk Oriented Finance by Louis Esch, Robert Kieffer, Thierry Lopez

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

asset allocation, Brownian motion, business continuity plan, business process, capital asset pricing model, computer age, corporate governance, discrete time, diversified portfolio, implied volatility, index fund, interest rate derivative, iterative process, P = NP, p-value, random walk, risk/return, shareholder value, statistical model, stochastic process, transaction costs, value at risk, Wiener process, yield curve, zero-coupon bond

This is a distribution symmetrical with respect to 0, which corresponds to a normal distribution for n = 2 and gives rise to a leptokurtic distribution (resp. negative kurtosis distribution) for n < 2 (resp. n > 2). [Figure A2.15: Generalised error distribution, density f(x) shown for v = 1, 2, 3] 2.3 STOCHASTIC PROCESSES 2.3.1 General considerations The term stochastic process is applied to a random variable that is a function of the time variable: {Xt : t ∈ T}. If the set T of times is discrete, the stochastic process is simply a sequence of random variables. However, in a number of financial applications such as Black and Scholes' model, it will be necessary to consider stochastic processes in continuous time. For each possible result ω ∈ Ω, the function Xt(ω) of the variable t is known as the path of the stochastic process. A stochastic process is said to have independent increments when, regardless of the times t1 < t2 < … < tn, the r.v.s Xt1, Xt2 − Xt1, Xt3 − Xt2, … are independent.
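A minimal Python sketch (my illustration, not from the book) of these definitions: each call below draws one path Xt(ω) of a discrete-time process whose increments are independent Gaussians.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_path(n_steps, dt=1.0):
        """Simulate one path of a process with independent increments:
        X_0 = 0 and X_{t_{k+1}} - X_{t_k} ~ N(0, dt), independent over k."""
        increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
        return np.concatenate(([0.0], np.cumsum(increments)))

    path = sample_path(250)   # one realization, i.e. one fixed omega
    print(path[:5])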

It is, however, possible to extend the definition to a concept of stochastic differential, through the theory of stochastic integral calculus.⁸ As the stochastic process zt is defined within the interval [a; b], the stochastic integral of zt within [a; b] with respect to the standard Brownian motion wt is defined by:

∫_a^b z_t dw_t = lim_{n→∞, δ→0} Σ_{k=0}^{n−1} z_{t_k} (w_{t_{k+1}} − w_{t_k})

where we have a = t0 < t1 < … < tn = b and δ = max_{k=1,…,n} (tk − tk−1). Let us now consider a stochastic process Zt (for which we wish to define the stochastic differential) and a standard Brownian motion wt. If there is a stochastic process zt such that Zt = Z0 + ∫_0^t z_s dw_s, then it is said that Zt admits the stochastic differential dZt = zt dwt.

⁷ The root function presents a vertical tangent at the origin.
⁸ The full development of this theory is outside the scope of this work.
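A numerical sketch (mine, following the definition above) that approximates the stochastic integral by its defining sum; note the integrand is evaluated at the left endpoint tk of each subinterval, which is essential for the Itô integral.

    import numpy as np

    rng = np.random.default_rng(1)

    def ito_integral(z, a=0.0, b=1.0, n=10_000):
        """Approximate the integral of z_t dw_t over [a, b] by
        sum_k z(t_k) * (w_{t_{k+1}} - w_{t_k}) with left endpoints."""
        t = np.linspace(a, b, n + 1)
        dw = rng.normal(0.0, np.sqrt((b - a) / n), size=n)
        return np.sum(z(t[:-1]) * dw)

    # Deterministic integrand z_t = t: the result is N(0, 1/3) in distribution.
    print(ito_integral(lambda t: t))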

Contents (excerpt):
8.1.2 The data in the example
8.2 Calculations
  8.2.1 Treasury portfolio case
  8.2.2 Bond portfolio case
8.3 The normality hypothesis
PART IV FROM RISK MANAGEMENT TO ASSET MANAGEMENT
Introduction
9 Portfolio Risk Management
  9.1 General principles
  9.2 Portfolio risk management method
    9.2.1 Investment strategy
    9.2.2 Risk framework
10 Optimising the Global Portfolio via VaR
  10.1 Taking account of VaR in Sharpe's simple index method
    10.1.1 The problem of minimisation
    10.1.2 Adapting the critical line algorithm to VaR
    10.1.3 Comparison of the two methods
  10.2 Taking account of VaR in the EGP method
    10.2.1 Maximising the risk premium
    10.2.2 Adapting the EGP method algorithm to VaR
    10.2.3 Comparison of the two methods
    10.2.4 Conclusion
  10.3 Optimising a global portfolio via VaR
    10.3.1 Generalisation of the asset model
    10.3.2 Construction of an optimal global portfolio
    10.3.3 Method of optimisation of global portfolio
11 Institutional Management: APT Applied to Investment Funds
  11.1 Absolute global risk
  11.2 Relative global risk/tracking error
  11.3 Relative fund risk vs. benchmark abacus
  11.4 Allocation of systematic risk
2.2 Theoretical distributions
  2.2.1 Normal distribution and associated ones
  2.2.2 Other theoretical distributions
2.3 Stochastic processes
  2.3.1 General considerations
  2.3.2 Particular stochastic processes
  2.3.3 Stochastic differential equations
Appendix 3 Statistical Concepts
  3.1 Inferential statistics
    3.1.1 Sampling
    3.1.2 Two problems of inferential statistics
  3.2 Regressions
    3.2.1 Simple regression
    3.2.2 Multiple regression
    3.2.3 Nonlinear regression
Appendix 4 Extreme Value Theory
  4.1 Exact result
  4.2 Asymptotic results
    4.2.1 Extreme value theorem
    4.2.2 Attraction domains
    4.2.3 Generalisation
Appendix 5 Canonical Correlations
  5.1 Geometric presentation of the method
  5.2 Search for canonical characters
Appendix 6 Algebraic Presentation of Logistic Regression
Appendix 7 Time Series Models: ARCH-GARCH and EGARCH
  7.1 ARCH-GARCH models
  7.2 EGARCH models
Appendix 8 Numerical Methods for Solving Nonlinear Equations
  8.1 General principles for iterative methods
    8.1.1 Convergence
    8.1.2 Order of convergence
    8.1.3 Stop criteria
  8.2 Principal methods
    8.2.1 First order methods
    8.2.2 Newton–Raphson method
    8.2.3 Bisection method
  8.3 Nonlinear equation systems
    8.3.1 General theory of n-dimensional iteration
    8.3.2 Principal methods
Bibliography
Index
Collaborators: Christian Berbé, Civil engineer from Université libre de Bruxelles and ABAF financial analyst.


pages: 313 words: 34,042

Tools for Computational Finance by Rüdiger Seydel

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

bioinformatics, Black-Scholes formula, Brownian motion, continuous integration, discrete time, implied volatility, incomplete markets, interest rate swap, linear programming, London Interbank Offered Rate, mandelbrot fractal, martingale, random walk, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process, zero-coupon bond

The easiest way to consider stochastic movements is via an additive term,

dx/dt = a(x, t) + b(x, t)ξ_t.

Here we use the notations a: deterministic part, bξ_t: stochastic part; ξ_t denotes a generalized stochastic process. An example of a generalized stochastic process is white noise. For a brief definition of white noise we note that to each stochastic process a generalized version can be assigned [Ar74]. For generalized stochastic processes derivatives of any order can be defined. Suppose that Wt is the generalized version of a Wiener process; then Wt can be differentiated. Then white noise ξ_t is defined as ξ_t = Ẇ_t = dW_t/dt, or vice versa,

W_t = ∫_0^t ξ_s ds.

That is, a Wiener process is obtained by smoothing the white noise. The smoother integral version dispenses with using generalized stochastic processes. Hence the integrated form of ẋ = a(x, t) + b(x, t)ξ_t is studied,

x(t) = x_0 + ∫_{t_0}^t a(x(s), s) ds + ∫_{t_0}^t b(x(s), s) ξ_s ds,

and we replace ξ_s ds = dW_s.
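Discretizing this integrated form with dW_s in place of ξ_s ds gives the Euler–Maruyama scheme; a minimal sketch (my own choice of drift, diffusion and parameters, not the book's):

    import numpy as np

    rng = np.random.default_rng(2)

    def euler_maruyama(a, b, x0, t0, t1, n):
        """Integrate dx = a(x,t) dt + b(x,t) dW, the simplest
        discretization of the integral form above."""
        dt = (t1 - t0) / n
        x = np.empty(n + 1)
        x[0] = x0
        t = t0
        for k in range(n):
            dw = rng.normal(0.0, np.sqrt(dt))
            x[k + 1] = x[k] + a(x[k], t) * dt + b(x[k], t) * dw
            t += dt
        return x

    # Mean-reverting drift with constant noise (Ornstein-Uhlenbeck type).
    path = euler_maruyama(lambda x, t: -x, lambda x, t: 0.3, 1.0, 0.0, 5.0, 500)
    print(path[-1])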

Here we consider the continuous-time situation. That is, t ∈ ℝ varies continuously in a time interval I, which typically represents 0 ≤ t ≤ T. A more complete notation for a stochastic process is {Xt, t ∈ I}, or (Xt)0≤t≤T. Let the chance play for all t in the interval 0 ≤ t ≤ T; then the resulting function Xt is called realization or path of the stochastic process. Special properties of stochastic processes have led to the following names: Gaussian process: All finite-dimensional distributions (Xt1, …, Xtk) are Gaussian. Hence specifically Xt is distributed normally for all t. Markov process: Only the present value of Xt is relevant for its future motion. That is, the past history is fully reflected in the present value.⁴ An example of a process that is both Gaussian and Markov is the Wiener process. ⁴ This assumption, together with the assumption of an immediate reaction of the market to arriving information, is called the hypothesis of the efficient market [Bo98]. [Fig. 1.14: a path of the Dow Jones index]

In multi-period models and continuous models ∆ must be adapted dynamically. The general definition is

∆ = ∆(S, t) = ∂V(S, t)/∂S;

the expression (1.16) is a discretized version. 1.6 Stochastic Processes Brownian motion originally meant the erratic motion of a particle (pollen) on the surface of a fluid, caused by tiny impulses of molecules. Wiener suggested a mathematical model for this motion, the Wiener process. But earlier Bachelier had applied Brownian motion to model the motion of stock prices, which instantly respond to the numerous arriving pieces of information, much as pollen reacts to the impacts of molecules. The illustration of the Dow in Figure 1.14 may serve as motivation. A stochastic process is a family of random variables Xt, which are defined for a set of parameters t (−→ Appendix B1). Here we consider the continuous-time situation.
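A tiny finite-difference sketch (mine) of the discretized delta, checked against a function whose S-derivative is known; V here stands for any pricing function, not a specific model.

    import math

    def delta_fd(V, S, t, h=1e-4):
        """Central-difference approximation of Delta = dV/dS at (S, t),
        a discretized version of the general definition above."""
        return (V(S + h, t) - V(S - h, t)) / (2.0 * h)

    # Check on V = S^2 * e^{-t}, whose exact derivative is 2*S*e^{-t}.
    V = lambda S, t: S * S * math.exp(-t)
    print(delta_fd(V, 100.0, 0.5), 2 * 100.0 * math.exp(-0.5))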


pages: 320 words: 33,385

Market Risk Analysis, Quantitative Methods in Finance by Carol Alexander

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

asset allocation, backtesting, barriers to entry, Brownian motion, capital asset pricing model, constrained optimization, credit crunch, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, implied volatility, interest rate swap, market friction, market microstructure, p-value, performance metric, quantitative trading / quantitative finance, random walk, risk tolerance, risk-adjusted returns, risk/return, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process, yield curve

Readers interested in estimating the parameters of a GARCH model when they come to Chapter II.4 will need to understand maximum likelihood estimation. Section I.3.7 shows how to model the evolution of financial asset prices and returns using a stochastic process in both discrete and continuous time. The translation between discrete and continuous time, and the relationship between the continuous time representation and the discrete time representation of a stochastic process, is very important indeed. The theory of finance requires an understanding of both discrete time and continuous time stochastic processes. Section I.3.8 summarizes and concludes. Some prior knowledge of basic calculus and elementary linear algebra is required to understand this chapter. Specifically, an understanding of Sections I.1.3 and I.2.4 is assumed.

But we do not know σ and so we need to estimate the variance using the maximum likelihood estimator σ̂² given by (I.3.132). Then, using σ̂ in place of σ, we have

est.se(X̄) = σ̂ / √n    (I.3.135)

and

est.se(σ̂²) = σ̂² √(2/n)    (I.3.136)

I.3.7 STOCHASTIC PROCESSES IN DISCRETE AND CONTINUOUS TIME A stochastic process is a sequence of identically distributed random variables. For most of our purposes random variables are continuous, indeed they are often assumed to be normal, but the sequence may be over continuous or discrete time. That is, we consider continuous state processes in both continuous and discrete time. • The study of discrete time stochastic processes is called time series analysis. In the time domain the simplest time series models are based on regression analysis, which is introduced in the next chapter. A simple example of a time series model is the first order autoregression, and this is defined below along with a basic test for stationarity.
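A short numerical sketch (my own) of the two standard errors as reconstructed above, applied to simulated returns:

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(0.05, 0.2, size=1000)       # simulated i.i.d. returns

    n = x.size
    var_mle = np.mean((x - x.mean()) ** 2)     # MLE of variance (divisor n)
    se_mean = np.sqrt(var_mle) / np.sqrt(n)    # est.se of the sample mean
    se_var = var_mle * np.sqrt(2.0 / n)        # est.se of the variance estimator
    print(se_mean, se_var)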

I.3.7.3 Stochastic Models for Asset Prices and Returns Time series of asset prices behave quite differently from time series of returns. In efficient markets a time series of prices or log prices will follow a random walk. More generally, even in the presence of market frictions and inefficiencies, prices and log prices of tradable assets are integrated stochastic processes. These are fundamentally different from the associated returns, which are generated by stationary stochastic processes. Figures I.3.28 and I.3.29 illustrate the fact that prices and returns are generated by very different types of stochastic process. Figure I.3.28 shows time series of daily prices (left-hand scale) and log prices (right-hand scale) of the Dow Jones Industrial Average (DJIA). [Figure I.3.28: Daily prices and log prices of DJIA index, January 1998 to September 2001] ⁵⁶ This is not the only possible discretization of a continuous increment.
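A minimal simulation (mine, not the book's) of the integrated/stationary distinction: returns are drawn i.i.d. (stationary), while the log price is their cumulative sum (integrated), so its sample variance keeps growing with the sample length.

    import numpy as np

    rng = np.random.default_rng(4)

    returns = rng.normal(0.0005, 0.01, size=1000)   # stationary process
    log_prices = 100.0 + np.cumsum(returns)         # integrated process

    print(returns[:500].var(), returns.var())       # roughly equal
    print(log_prices[:500].var(), log_prices.var()) # second is much larger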


pages: 447 words: 104,258

Mathematics of the Financial Markets: Financial Instruments and Derivatives Modelling, Valuation and Risk Issues by Alain Ruttiens

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

algorithmic trading, asset allocation, asset-backed security, backtesting, banking crisis, Black Swan, Black-Scholes formula, Brownian motion, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, discounted cash flows, discrete time, diversification, fixed income, implied volatility, interest rate derivative, interest rate swap, margin call, market microstructure, martingale, p-value, passive investing, quantitative trading / quantitative finance, random walk, risk/return, Sharpe ratio, short selling, statistical model, stochastic process, stochastic volatility, time value of money, transaction costs, value at risk, volatility smile, Wiener process, yield curve, zero-coupon bond

F  forward price, or future price (depends on the context)
FV  future value
-ibor  generic for LIBOR, EURIBOR, or any other inter-bank market rate
K  strike price of an option
κ  kurtosis
M  month or million, depending on context
MD  modified duration
MtM  "Marked to Market" (= valued to the observed current market price)
μ  drift of a stochastic process
N  total number of a series (integer number), or nominal (notional) amount (depends on the context)
(.)  Gaussian (normal) density distribution function
N(.)  Gaussian (normal) cumulative distribution function
P  put price
P{.}  probability of {.}
PV  present value
(.)  Poisson density distribution function
r  generic symbol for a rate of return
rf  risk-free return
ρ(.)  correlation of (.)
skew  skewness
S  spot price of an asset (equity, currency, etc.), as specified by the context
STD(.)  standard deviation of (.)
σ  volatility of a stochastic process
t  current time, or time in general (depends on the context)
t0  initial time
T  maturity time
τ  tenor, that is, time interval between current time t and maturity T
V(.)  variance of (.)
(.)  stochastic process of (.)

Provided F(x) is continuously differentiable, we can determine the corresponding density function f(x) associated to the random variable X as f(x) = dF(x)/dx. Stochastic Processes A stochastic process can be defined as a collection of random variables defined on the same probability space (Ω, ℱ, P) and "indexed" by a set of parameters T, that is, {Xt, t ∈ T}. Within the framework of our chapter, t is the time. For a given outcome or sample ω, Xt(ω) for t ∈ T is called a sample path, realization or trajectory of the process. The space containing all possible values of Xt is called the state space. Further in this chapter, we will only consider a one-dimensional state space, namely the set of real numbers ℝ. As regards T, random variables Xt involved in stochastic processes {Xt, t ∈ T} will be denoted by X̃t, where "∼" indicates their random nature over time t; these random variables will be such as a price, a rate or a return.

(see, e.g., NEFTCI in the further reading at the end of the chapter). 9 Other financial models: from ARMA to the GARCH family The previous chapter dealt with stochastic processes, which consist of (returns) models involving a mixture of deterministic and stochastic components. By contrast, the models developed here present three major differences:
• These models are deterministic; since they aim to model a non-deterministic variable such as a return, the difference between the model output and the actual observed value is a probabilistic error term.
• By contrast with stochastic processes described by differential equations, these models are built in discrete time, in practice the periodicity of the modeled return (daily, for example).
• By contrast with usual Markovian stochastic processes, these models incorporate in the general case a limited number of previous return values, so that they are not Markovian.
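A minimal sketch (mine, not from the book) of such a discrete-time model: an AR(2) recursion, whose two lagged return values already make it non-Markovian in the return alone.

    import numpy as np

    rng = np.random.default_rng(5)

    def simulate_ar2(phi1, phi2, n, sigma=0.01):
        """Simulate r_t = phi1*r_{t-1} + phi2*r_{t-2} + eps_t with
        Gaussian error terms eps_t."""
        r = np.zeros(n)
        eps = rng.normal(0.0, sigma, size=n)
        for t in range(2, n):
            r[t] = phi1 * r[t - 1] + phi2 * r[t - 2] + eps[t]
        return r

    returns = simulate_ar2(0.2, -0.1, 1000)
    print(returns.mean(), returns.std())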


pages: 443 words: 51,804

Handbook of Modeling High-Frequency Data in Finance by Frederi G. Viens, Maria C. Mariani, Ionut Florescu

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

algorithmic trading, asset allocation, automated trading system, backtesting, Black-Scholes formula, Brownian motion, business process, continuous integration, corporate governance, discrete time, distributed generation, fixed income, Flash crash, housing crisis, implied volatility, incomplete markets, linear programming, mandelbrot fractal, market friction, market microstructure, martingale, Menlo Park, p-value, pattern recognition, performance metric, principal–agent problem, random walk, risk tolerance, risk/return, short selling, statistical model, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process

The iterative method we will use for this problem was developed by Chadam and Yin in Ref. 22 to study a similar partial integro-differential problem. 13.3.1 STATEMENT OF THE PROBLEM As pointed out in Ref. 17, when modeling high frequency data in applications, a Lévy-like stochastic process appears to be the best fit. When using these models, option prices are found by solving the resulting PIDE. For example, integro-differential equations appear in exponential Lévy models, where the market price of an asset is represented as the exponential of a Lévy stochastic process. These models have been discussed in several published works such as Refs 17 and 23. In this section, we consider the following integro-differential model for a European call option:

∂C/∂t(S, t) + (σ²S²/2) ∂²C/∂S²(S, t) + rS ∂C/∂S(S, t) − rC(S, t) + ∫ ν(dy) [C(Se^y, t) − C(S, t) − S(e^y − 1) ∂C/∂S(S, t)] = 0,    (13.28)

where the market price of an asset is represented as the exponential of a Lévy stochastic process (see Chapter 12 of Ref. 17).

Physica A 2003;318:279–292 [Proceedings of International Statistical Physics Conference, Kolkata]. 19. Mantegna RN, Stanley HE. Stochastic process with ultra-slow convergence to a Gaussian: the truncated Levy flight. Phys Rev Lett 1994;73:2946–2949. 20. Peng CK, Mietus J, Hausdorff JM, Havlin S, Stanley HE, Goldberger AL. Long-range anticorrelations and non-Gaussian behavior of the heartbeat. Phys Rev Lett 1993;70:1343–1346. 21. Peng CK, Buldyrev SV, Havlin S, Simons M, Stanley HE, Goldberger AL. Mosaic organization of DNA nucleotides. Phys Rev E 1994;49:1685–1689. 22. Levy P. Calcul des probabilités. Paris: Gauthier-Villars; 1925. 23. Khintchine AYa, Levy P. Sur les lois stables. C R Acad Sci Paris 1936;202:374–376. 24. Koponen I. Analytic approach to the problem of convergence of truncated Levy flights towards the Gaussian stochastic process. Phys Rev E 1995;52:1197–1199. 25. Podobnik B, Ivanov PCh, Lee Y, Stanley HE.

Stable non-Gaussian random processes: stochastic models with infinite variance. New York: Chapman and Hall; 1994. 6. Levy P. Calcul des probabilités. Paris: Gauthier-Villars; 1925. 7. Khintchine AYa, Levy P. Sur les lois stables. C R Acad Sci Paris 1936;202:374. 8. Mantegna RN, Stanley HE. Stochastic process with ultra-slow convergence to a Gaussian: the truncated Levy flight. Phys Rev Lett 1994;73:2946–2949. 9. Koponen I. Analytic approach to the problem of convergence of truncated Levy flights towards the Gaussian stochastic process. Phys Rev E 1995;52:1197–1199. 10. Weron R. Levy-stable distributions revisited: tail index > 2 does not exclude the Levy-stable regime. Int J Mod Phys C 2001;12:209–223. Chapter Thirteen Solutions to Integro-Differential Parabolic Problem Arising in Financial Mathematics MARIA C.

Analysis of Financial Time Series by Ruey S. Tsay

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Asian financial crisis, asset allocation, Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, correlation coefficient, data acquisition, discrete time, frictionless, frictionless market, implied volatility, index arbitrage, Long Term Capital Management, market microstructure, martingale, p-value, pattern recognition, random walk, risk tolerance, short selling, statistical model, stochastic process, stochastic volatility, telemarketer, transaction costs, value at risk, volatility smile, Wiener process, yield curve

CHAPTER 6 Continuous-Time Models and Their Applications Price of a financial asset evolves over time and forms a stochastic process, which is a statistical term used to describe the evolution of a random variable over time. The observed prices are a realization of the underlying stochastic process. The theory of stochastic processes is the basis on which the observed prices are analyzed and statistical inference is made. There are two types of stochastic process for modeling the price of an asset. The first type is called the discrete-time stochastic process, in which the price changes at discrete time points. All the processes discussed in the previous chapters belong to this category. For example, the daily closing price of IBM stock on the New York Stock Exchange forms a discrete-time stochastic process. Here the price changes only at the closing of a trading day.

For more description on options, see Hull (1997). 6.2 SOME CONTINUOUS-TIME STOCHASTIC PROCESSES In mathematical statistics, a continuous-time continuous stochastic process is defined on a probability space (Ω, F, P), where Ω is a nonempty space, F is a σ-field consisting of subsets of Ω, and P is a probability measure; see Chapter 1 of Billingsley (1986). The process can be written as {x(η, t)}, where t denotes time and is continuous in [0, ∞). For a given t, x(η, t) is a real-valued continuous random variable (i.e., a mapping from Ω to the real line), and η is an element of Ω. For the price of an asset at time t, the range of x(η, t) is the set of non-negative real numbers. For a given η, {x(η, t)} is a time series with values depending on the time t. For simplicity, we write a continuous-time stochastic process as {xt} with the understanding that, for a given t, xt is a random variable.

As a result, we cannot use the usual integration in calculus to handle integrals involving a standard Brownian motion when we consider the value of an asset over time. Another approach must be sought. This is the purpose of discussing Ito's calculus in the next section. 6.2.2 Generalized Wiener Processes The Wiener process is a special stochastic process with zero drift and variance proportional to the length of time interval. This means that the rate of change in expectation is zero and the rate of change in variance is 1. In practice, the mean and variance of a stochastic process can evolve over time in a more complicated manner. Hence, further generalization of stochastic process is needed. To this end, we consider the generalized Wiener process in which the expectation has a drift rate µ and the rate of variance change is σ². Denote such a process by xt and use the notation dy for a small change in the variable y.
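A short simulation sketch (my own) of the generalized Wiener process dxt = µ dt + σ dWt just described:

    import numpy as np

    rng = np.random.default_rng(6)

    def generalized_wiener(mu, sigma, T=1.0, n=1000):
        """Simulate x_t with drift rate mu and variance rate sigma^2:
        dx = mu*dt + sigma*dW on n steps over [0, T], x_0 = 0."""
        dt = T / n
        dx = mu * dt + sigma * rng.normal(0.0, np.sqrt(dt), size=n)
        return np.concatenate(([0.0], np.cumsum(dx)))

    x = generalized_wiener(mu=0.1, sigma=0.25)
    print(x[-1])   # E[x_T] = mu*T and Var(x_T) = sigma^2 * T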

Mathematical Finance: Core Theory, Problems and Statistical Algorithms by Nikolai Dokuchaev

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Black-Scholes formula, Brownian motion, buy low sell high, discrete time, fixed income, implied volatility, incomplete markets, martingale, random walk, short selling, stochastic process, stochastic volatility, transaction costs, volatility smile, Wiener process, zero-coupon bond

If yes, give an example; if no, prove it. Problem 1.60 Let Q² be the set of all pairs (x, y) such that x and y are rational numbers. We consider a random straight line L in R² such that (0, 0) ∈ L with probability 1, and that the angle between L and the vector (1, 0) has the uniform distribution on [0, π). Find the probability that the set L ∩ Q² is finite. 2 Basics of stochastic processes In this chapter, some basic facts and definitions from the theory of stochastic (random) processes are given, including filtrations, martingales, Markov times, and Markov processes. 2.1 Definitions of stochastic processes Sometimes it is necessary to consider random variables or vectors that depend on time. Definition 2.1 A sequence of random variables ξt, t = 0, 1, 2,…, is said to be a discrete time stochastic (or random) process. Definition 2.2 Let a probability space be given. A mapping ξ: [0,T]×Ω→R is said to be a continuous time stochastic (random) process if ξ(t,ω) is a random variable for a.e.

It suffices to apply Theorem 4.42 with f ≡ 0, b ≡ 1; then y_{x,s}(t) = w(t) − w(s) + x, and the corresponding operator is that of the heat equation. In Example 4.49, representation (4.16) is said to be the probabilistic representation of the solution. In particular, it follows that the solution can be expressed via the probability density function for N(x, T−s). Note that this function is also well known in the theory of parabolic equations: it is the so-called fundamental solution of the heat equation. The representation of functions of stochastic processes via solutions of parabolic partial differential equations (PDEs) helps to study stochastic processes: one can use numerical methods developed for PDEs (i.e., finite differences, fundamental solutions, etc.). On the other hand, the probabilistic representation of a solution of parabolic PDEs can also help to study PDEs. For instance, one can use Monte Carlo simulation for numerical solution of PDEs. Some theoretical results can also be proved more easily with the probabilistic representation (for example, the so-called maximum principle for parabolic equations follows from this representation: if φ ≥ 0 and Ψ ≥ 0 in (4.15), then V ≥ 0).
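A sketch (my construction, not the book's code) of the Monte Carlo idea: estimate the heat-equation solution V(x, s) = E[φ(x + w(T) − w(s))] by simulating the Brownian increment directly.

    import numpy as np

    rng = np.random.default_rng(7)

    def heat_mc(phi, x, s, T, n_paths=100_000):
        """Monte Carlo estimate of V(x, s) = E[phi(x + w(T) - w(s))],
        the probabilistic representation of the heat equation solution
        with terminal condition V(., T) = phi."""
        w = rng.normal(0.0, np.sqrt(T - s), size=n_paths)  # w(T)-w(s) ~ N(0, T-s)
        return phi(x + w).mean()

    # Terminal condition phi(x) = x^2: the exact solution is x^2 + (T - s).
    print(heat_mc(lambda y: y**2, x=1.0, s=0.0, T=1.0))    # close to 2.0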

Contents: Preface; 1 Review of probability theory; 2 Basics of stochastic processes; 3 Discrete time market models; 4 Basics of Ito calculus and stochastic analysis; 5 Continuous time market models; 6 American options and binomial trees; 7 Implied and historical volatility; 8 Review of statistical estimation; 9 Estimation of models for stock prices; Legend of notations and abbreviations; Selected answers and key figures; Bibliography. Preface Dedicated to Natalia, Lidia, and Mikhail. This book gives a systematic, self-sufficient, and yet short presentation of the mainstream topics of Mathematical Finance and the related parts of Stochastic Analysis and Statistical Finance, covering typical university programs.


pages: 153 words: 12,501

Mathematics for Economics and Finance by Michael Harrison, Patrick Waldron

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Brownian motion, buy low sell high, capital asset pricing model, compound rate of return, discrete time, incomplete markets, law of one price, market clearing, risk tolerance, riskless arbitrage, short selling, stochastic process

A random vector is just a vector of random variables. It can also be thought of as a vector-valued function on the sample space Ω. A stochastic process is a collection of random variables or random vectors indexed by time, e.g. {x̃t : t ∈ T} or just {x̃t} if the time interval is clear from the context. For the purposes of this part of the course, we will assume that the index set consists of just a finite number of times, i.e. that we are dealing with discrete time stochastic processes. Then a stochastic process whose elements are N-dimensional random vectors is equivalent to an N|T|-dimensional random vector. The (joint) c.d.f. of a random vector or stochastic process is the natural extension of the one-dimensional concept. Random variables can be discrete, continuous or mixed. The expectation (mean, average) of a discrete r.v., x̃, with possible values x1, x2, x3, … is given by

E[x̃] ≡ Σ_{i=1}^∞ x_i Pr(x̃ = x_i).
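A two-line illustration (mine) of this expectation formula for a random variable with finite support:

    values = [1.0, 2.0, 5.0]          # possible values x_i
    probs = [0.5, 0.3, 0.2]           # Pr(x = x_i); must sum to 1

    expectation = sum(x * p for x, p in zip(values, probs))
    print(expectation)                # 0.5*1 + 0.3*2 + 0.2*5 = 2.1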

This framework is sufficient to illustrate the similarities and differences between the most popular approaches. When we consider consumer choice under uncertainty, consumption plans will have to specify a fixed consumption vector for each possible state of nature or state of the world. This just means that each consumption plan is a random vector. Let us review the associated concepts from basic probability theory: probability space; random variables and vectors; and stochastic processes. Let Ω denote the set of all possible states of the world, called the sample space. A collection of states of the world, A ⊆ Ω, is called an event. Let A be a collection of events in Ω. The function P: A → [0, 1] is a probability function if
1. (a) Ω ∈ A
   (b) A ∈ A ⇒ Ω − A ∈ A
   (c) Ai ∈ A for i = 1, …, ∞ ⇒ ∪_{i=1}^∞ Ai ∈ A
   (i.e. A is a sigma-algebra of events)

The prices of this, and the other elementary claims, must, by no arbitrage, equal the prices of the corresponding replicating portfolios. 5.5 The Expected Utility Paradigm 5.5.1 Further axioms The objects of choice with which we are concerned in a world with uncertainty could still be called consumption plans, but we will acknowledge the additional structure now described by terming them lotteries. If there are k physical commodities, a consumption plan must specify a k-dimensional vector, x ∈ ℝᵏ, for each time and state of the world. We assume a finite number of times, denoted by the set T. The possible states of the world are denoted by the set Ω. So a consumption plan or lottery is just a collection of |T| k-dimensional random vectors, i.e. a stochastic process. Again to distinguish the certainty and uncertainty cases, we let L denote the collection of lotteries under consideration; X will now denote the set of possible values of the lotteries in L. Preferences are now described by a relation on L. We will continue to assume that preference relations are complete, reflexive, transitive, and continuous.

The Concepts and Practice of Mathematical Finance by Mark S. Joshi

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Black-Scholes formula, Brownian motion, correlation coefficient, Credit Default Swap, delta neutral, discrete time, Emanuel Derman, implied volatility, incomplete markets, interest rate derivative, interest rate swap, London Interbank Offered Rate, martingale, millennium bug, quantitative trading / quantitative finance, short selling, stochastic process, stochastic volatility, the market place, time value of money, transaction costs, value at risk, volatility smile, yield curve, zero-coupon bond

We shall say that the family X of random variables Xt satisfies the stochastic differential equation

dXt = µ(t, Xt) dt + a(t, Xt) dWt,    (5.8)

if for any t, we have that

X_{t+h} − Xt − h µ(t, Xt) − a(t, Xt)(W_{t+h} − Wt)

is a random variable with mean and variance which are o(h). We shall call such a family of random variables an Ito process or sometimes just a stochastic process. Note that if a is identically zero, we have that

X_{t+h} − Xt − h µ(t, Xt)    (5.9)

is of mean and variance o(h). We have thus essentially recovered the differential equation

dXt/dt = µ(t, Xt).    (5.10)

The essential aspect of this definition is that if we know X0 and that Xt satisfies the stochastic differential equation (5.8), then Xt is fully determined. In other terms, the stochastic differential equation has a unique solution. An important corollary of this is that µ and a together with X0 are the only quantities we need to know in order to define a stochastic process. Equally important is the issue of existence - it is not immediately obvious that a family Xt satisfying a given stochastic differential equation exists.

Rather surprisingly, this leads to the Black-Scholes price. We therefore have a very powerful alternative method for pricing options. Justifying this procedure requires an excursion into some deep and powerful mathematics. 6.4 The concept of information Before we can proceed to a better understanding of option pricing, we need a better understanding of the nature of stochastic processes. In particular, we need to think a little more deeply about what a stochastic process is. We have talked about a continuous family of processes, Xt, such that Xt − Xs has a certain distribution. As long as we only look at a finite number of values of t and s this is conceptually fairly clear, but once we start looking at all values at once it is a lot less obvious what these statements mean. One way out is to take the view that each random variable Xt displays some aspect of a single more fundamental variable.

The argument we gave above still works; if a portfolio is of zero value and can be positive with positive probability tomorrow then to get the expectation to be zero, there must be a positive probability of negative value tomorrow. Hence, as before arbitrage is impossible. This is still not particularly useful however, as we know that a risky asset will in general grow faster than a riskless bond on average due to the risk aversion of market participants. To get round this problem, we ask what the rate of growth means for a stochastic process. The stochastic process is determined by a probability measure on the sample space which is the space of paths. However, the definition of an arbitrage barely mentions the probability measure. All it says is that it is impossible to set up a portfolio with zero value today which has a positive probability of being of positive value in the future, and a zero probability of being of negative value. The actual magnitude of the positive probability is not mentioned.


pages: 819 words: 181,185

Derivatives Markets by David Goldenberg

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Black-Scholes formula, Brownian motion, capital asset pricing model, commodity trading advisor, compound rate of return, conceptual framework, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, financial innovation, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, law of one price, locking in a profit, London Interbank Offered Rate, Louis Bachelier, margin call, market microstructure, martingale, Norbert Wiener, price mechanism, random walk, reserve currency, risk/return, riskless arbitrage, Sharpe ratio, short selling, stochastic process, stochastic volatility, time value of money, transaction costs, volatility smile, Wiener process, Y2K, yield curve, zero-coupon bond

How do we take derivatives of smooth functions of stochastic processes, say F(Xt, t), where the process is the solution of a stochastic differential equation such as the GBM SDE dXt = µXt dt + σXt dWt with initial value X0? We start with the observation that we can expect to end up with another stochastic process that is also the solution to another stochastic differential equation. This new stochastic differential equation for the total differential of F(Xt, t) will have a new set of drift and diffusion coefficients. The question is what exactly are the drift and diffusion coefficients of dF(Xt, t)? This is one of the problems that K. Itô solved in his famous formula called Itô's lemma. To understand Itô's lemma, keep in mind that there are two stochastic processes involved. The first is the underlying process (think of it as the stock).
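For reference, the standard answer Itô's lemma gives in this GBM case (a well-known result, stated here for orientation rather than quoted from this excerpt):

dF(Xt, t) = (∂F/∂t + µXt ∂F/∂x + ½ σ²Xt² ∂²F/∂x²) dt + σXt (∂F/∂x) dWt,

so the new drift coefficient is ∂F/∂t + µXt ∂F/∂x + ½σ²Xt² ∂²F/∂x², and the new diffusion coefficient is σXt ∂F/∂x.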

The second equation above says that E_r(S1(ω)|S0) = (1 + r′)S0 > S0 unless r′ = 0. Even under risk neutrality (which doesn't mean zero interest rates), the martingale requirement that E_r(S1(ω)|S0) = S0 is clearly violated. Stock prices under risk neutrality are not martingales. However, they aren't very far from martingales. Definition of a Sub (Super) Martingale 1. A discrete-time stochastic process (Xn(ω))n=0,1,2,3,… is called a sub-martingale if E(Xn) < ∞, E(Xn²) < ∞, and E(Xn+1(ω)|Xn) > Xn for all n = 0, 1, 2, 3, … 2. A discrete-time stochastic process (Xn(ω))n=0,1,2,3,… is called a super-martingale if E(Xn) < ∞, E(Xn²) < ∞, and E(Xn+1(ω)|Xn) < Xn for all n = 0, 1, 2, 3, … We expect stock prices to be sub-martingales, not martingales, for two separate and different reasons: 1. All assets, risky or not, have to provide a reward for time and waiting. This reward is the risk-free rate. 2.

We will begin with the prototype of all continuous time models, and that is arithmetic Brownian motion (ABM). ABM is the most basic and important stochastic process in continuous time and continuous space, and it has many desirable properties including the strong Markov property, the martingale property, independent increments, normality, and continuous sample paths. Of course, here we want to focus on options pricing rather than the pure mathematical theory. The idea here is to partially prepare you for courses in mathematical finance. The details we have to leave out are usually covered in such courses. 16.1 ARITHMETIC BROWNIAN MOTION (ABM) ABM is a stochastic process {Wt(ω)}t≥0 defined on a sample space (Ω, ℑW, ℘W). We won't go into all the details as to exactly what (Ω, ℑW, ℘W) represents, but you can think of the probability measure ℘W, which is called Wiener measure, to be defined in terms of the transition density function p(T, y; t, x) for τ = T − t,

p(T, y; t, x) = p(τ, x, y) = (1/√(2πτ)) e^{−(y−x)²/(2τ)}.

Norbert Wiener gave the first rigorous mathematical construction (existence proof) for ABM and, because of this, it is sometimes called the Wiener process.
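A quick check (my own sketch) that simulated ABM increments match this transition density: given Wt = x, the value WT is N(x, τ).

    import numpy as np

    rng = np.random.default_rng(8)

    tau, x, y = 0.5, 1.0, 1.2
    samples = x + rng.normal(0.0, np.sqrt(tau), size=1_000_000)

    density = np.exp(-(y - x) ** 2 / (2 * tau)) / np.sqrt(2 * np.pi * tau)
    empirical = np.mean(np.abs(samples - y) < 0.01) / 0.02  # P(|W_T - y| < h)/(2h)
    print(density, empirical)   # the two numbers should nearly agree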


pages: 345 words: 86,394

Frequently Asked Questions in Quantitative Finance by Paul Wilmott

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, asset allocation, Black-Scholes formula, Brownian motion, butterfly effect, capital asset pricing model, collateralized debt obligation, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, discrete time, diversified portfolio, Emanuel Derman, Eugene Fama: efficient market hypothesis, fixed income, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, iterative process, London Interbank Offered Rate, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, market bubble, martingale, Norbert Wiener, quantitative trading / quantitative finance, random walk, regulatory arbitrage, risk/return, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, transaction costs, urban planning, value at risk, volatility arbitrage, volatility smile, Wiener process, yield curve, zero-coupon bond

Short Answer Brownian Motion is a stochastic process with stationary independent normally distributed increments and which also has continuous sample paths. It is the most common stochastic building block for random walks in finance. Example Pollen in water, smoke in a room, pollution in a river, are all examples of Brownian motion. And this is the common model for stock prices as well. Long Answer Brownian motion (BM) is named after the Scottish botanist who first described the random motions of pollen grains suspended in water. The mathematics of this process were formalized by Bachelier, in an option-pricing context, and by Einstein. The mathematics of BM is also that of heat conduction and diffusion. Mathematically, BM is a continuous, stationary, stochastic process with independent normally distributed increments.

Wilmott magazine, September Halton, JH 1960 On the efficiency of certain quasi-random sequences of points in evaluating multi-dimensional integrals. Num. Maths. 2 84-90 Hammersley, JM & Handscomb, DC 1964 Monte Carlo Methods. Methuen, London Harrison, JM & Kreps, D 1979 Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory 20 381-408 Harrison, JM & Pliska, SR 1981 Martingales and stochastic integrals in the theory of continuous trading. Stochastic Processes and their Applications 11 215-260 Haselgrove, CB 1961 A method for numerical integration. Mathematics of Computation 15 323-337 Heath, D, Jarrow, R & Morton, A 1992 Bond pricing and the term structure of interest rates: a new methodology. Econometrica 60 77-105 Ho, T & Lee, S 1986 Term structure movements and pricing interest rate contingent claims. Journal of Finance 42 1129-1142 Itô, K 1951 On stochastic differential equations.

Journal of Financial Economics 3 167-79 Haug, EG 2003 Know your weapon, Parts 1 and 2. Wilmott magazine, May and July Haug, EG 2006 The complete Guide to Option Pricing Formulas. McGraw-Hill Lewis, A 2000 Option Valuation under Stochastic Volatility. Finance Press What are the Forward and Backward Equations? Short Answer Forward and backward equations usually refer to the differential equations governing the transition probability density function for a stochastic process. They are diffusion equations and must therefore be solved in the appropriate direction in time, hence the names. Example An exchange rate is currently 1.88. What is the probability that it will be over 2 by this time next year? If you have a stochastic differential equation model for this exchange rate then this question can be answered using the equations for the transition probability density function.
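A sketch (mine, with made-up parameters) of how the exchange-rate question can be answered once a model is assumed; here a driftless lognormal random walk with 10% annual volatility, so log X1 is normal and the probability comes from the Gaussian c.d.f. rather than from solving the forward equation directly.

    from math import log, sqrt, erf

    x0, sigma = 1.88, 0.10                  # spot and assumed volatility
    m = log(x0) - 0.5 * sigma ** 2          # mean of log X_1 under zero drift

    z = (log(2.0) - m) / sigma              # standardized threshold for X_1 > 2
    prob = 0.5 * (1.0 - erf(z / sqrt(2.0))) # P(X_1 > 2) = 1 - Phi(z)
    print(prob)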


pages: 209 words: 13,138

Empirical Market Microstructure: The Institutions, Economics and Econometrics of Securities Trading by Joel Hasbrouck

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

barriers to entry, conceptual framework, correlation coefficient, discrete time, disintermediation, distributed generation, experimental economics, financial intermediation, index arbitrage, interest rate swap, inventory management, market clearing, market design, market friction, market microstructure, martingale, price discovery process, price discrimination, quantitative trading / quantitative finance, random walk, Richard Thaler, second-price auction, short selling, statistical model, stochastic process, stochastic volatility, transaction costs, two-sided market, ultimatum game

Price discreteness, for example, reflects a tick size (minimum pricing increment) that is generally set in level units. For reasons that will be discussed shortly, the drift can be dropped in most microstructure analyses. When µ = 0, pt cannot be forecast beyond its most recent value: E[pt+1 | pt , pt−1 , . . .] = pt . A process with this property is generally described as a martingale. One definition of a martingale is a discrete stochastic process {xt } where E|xt | < ∞ for all t, and E(xt+1 | xt , xt−1 , . . . ) = xt (see Karlin and Taylor (1975) or Ross (1996)). Martingale behavior of asset prices is a classic result arising in many economic models with individual optimization, absence of arbitrage, or security market equilibrium (Cochrane (2005)). The result is generally contingent, however, on assumptions of frictionless trading opportunities, which are not appropriate in most microstructure applications.

Placing the price change first is simply an expositional simplification and carries no implications that this variable is first in any causal sense. The chapter treats the general case but uses a particular structural model for purposes of illustration. The structural model is a bivariate model of price changes and trade directions: yt = [pt qt]′. 9.1 Modeling Vector Time Series The basic descriptive statistics of a vector stochastic process {yt} are the process mean µ = E[yt] and the vector autocovariances. The vector autocovariances are defined as the matrices

Γk = E(yt − E[yt])(yt−k − E[yt])′ for k = …, −2, −1, 0, +1, +2, …    (9.1)

In suppressing the dependence of µ and Γk on t, we have implicitly invoked an assumption of covariance stationarity. Note that although a univariate autocorrelation has the property that γk = γ−k, the corresponding property in the multivariate case is Γk = Γ′−k.
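A small numpy sketch (mine) of the sample analogue of (9.1):

    import numpy as np

    def vector_autocov(y, k):
        """Sample vector autocovariance Gamma_k = E[(y_t - mu)(y_{t-k} - mu)']
        for a (T x n) array y, averaging over t (covariance stationarity)."""
        d = np.asarray(y, dtype=float) - np.mean(y, axis=0)
        T = d.shape[0]
        if k >= 0:
            return d[k:].T @ d[:T - k] / (T - k)
        return vector_autocov(y, -k).T      # Gamma_{-k} = Gamma_k'

    rng = np.random.default_rng(9)
    y = rng.normal(size=(500, 2))           # toy bivariate series [p_t, q_t]'
    print(vector_autocov(y, 1))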

The buy limit price is denoted Lt . If at time t, pt ≥ Lt , then the agent has effectively submitted a marketable limit order, which achieves immediate execution. A limit order priced at Lt < pt will be executed during period t if pτ ≤ Lt for any time t < τ < t + 1. The situation is depicted in figure 15.2. A limit order priced at Lt executes if the stock price follows path B but not path A. This is a standard problem in stochastic processes, and many exact results are available. The diffusion-barrier notion of execution is at best a first approximation. In many markets, a buy limit order might be executed by a market (or marketable) sell order while the best ask is still well above the limit price. We will subsequently generalize the execution mechanism to allow this. For the moment, though, it might be noted that the present situation is not without precedent.
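A Monte Carlo sketch (my own, under an assumed driftless Brownian price path) of the diffusion-barrier execution probability, i.e. the chance that the price touches the limit price during the period:

    import numpy as np

    rng = np.random.default_rng(10)

    def exec_prob(p0, limit, sigma, n_steps=1000, n_paths=20_000):
        """Estimate P(min of the price path over one period <= limit),
        the diffusion-barrier execution criterion for a buy limit order."""
        dt = 1.0 / n_steps
        steps = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
        paths = p0 + np.cumsum(steps, axis=1)
        return np.mean(paths.min(axis=1) <= limit)

    print(exec_prob(p0=100.0, limit=99.0, sigma=2.0))
    # For this driftless case the reflection principle gives the exact value
    # 2 * Phi((limit - p0) / sigma).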


pages: 206 words: 70,924

The Rise of the Quants: Marschak, Sharpe, Black, Scholes and Merton by Colin Read

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, Black-Scholes formula, Bretton Woods, Brownian motion, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, David Ricardo: comparative advantage, discovery of penicillin, discrete time, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, floating exchange rates, full employment, Henri Poincaré, implied volatility, index fund, Isaac Newton, John von Neumann, Joseph Schumpeter, Long Term Capital Management, Louis Bachelier, margin call, market clearing, martingale, means of production, moral hazard, naked short selling, price stability, principal–agent problem, quantitative trading / quantitative finance, RAND corporation, random walk, risk tolerance, risk/return, Ronald Reagan, shareholder value, Sharpe ratio, short selling, stochastic process, The Chicago School, the scientific method, too big to fail, transaction costs, tulip mania, Works Progress Administration, yield curve

Black saw the description and prediction of interest rates to be a multi-faceted and challenging problem. While he had demonstrated that an options price depends on the underlying stock price mean and volatility, and the risk-free interest rate, the overall market for interest rates is much more multi-dimensional. The interest rate yield curve, which graphs rates against maturities, depends on many markets and instruments, each of which is subject to stochastic processes. His interest and collaboration with Emanuel Derman and Bill Toy resulted in a model of interest rates that was first used profitably by Goldman Sachs through the 1980s, but eventually entered the public domain when they published their work in the Financial Analysts Journal in 1990.2 Their model provided reasonable estimates for both the prices and volatilities of treasury bonds, and is still used today.

Black-Scholes model – a model that can determine the price of a European call option based on the assumption that the underlying security follows a geometric Brownian motion with constant drift and volatility. Bond – a financial instrument that provides periodic (typically semi-annual) interest payments and the return of the paid-in capital upon maturity in exchange for a fixed price. Brownian motion – the simplest of the class of continuous-time stochastic processes that describes the random motion of a particle or a security that is buffeted by forces that are normally distributed in strength. Calculus of variations – a mathematical technique that can determine the optimal path of a variable, like savings or consumption, over time. Call – an option to purchase a specified security at a specified future time and price. Capital allocation line – a line drawn on the graph of all possible combinations of risky and risk-free assets that shows the best risk–reward horizon.

Keynesian model – a model developed by John Maynard Keynes that demonstrates savings may not necessarily be balanced with new investment and the gross domestic product may differ from that which would result in full employment. Kurtosis – a statistical measure of the distribution of observations about the expected mean as a deviation from that predicted by the normal distribution. Life cycle – the characterization of a process from its birth to death. Life Cycle Model – a model of household consumption behavior from the beginning of its earning capacity to the end of the household. Markov process – a stochastic process with the memorylessness property: given the present state, the future state is independent of past observations. Markowitz bullet – the upper boundary of the efficient frontier of various portfolios when graphed according to risk and return. Martingale – a model of a process for which past events cannot predict future outcomes. Mean – a mathematical technique that can be calculated based on a number of alternative weightings to produce an average for a set of numbers.


pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable by James Owen Weatherall

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Asian financial crisis, bank run, Benoit Mandelbrot, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, butterfly effect, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, collateralized debt obligation, collective bargaining, dark matter, Edward Lorenz: Chaos theory, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial innovation, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, John Nash: game theory, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, martingale, new economy, Paul Lévy, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk-adjusted returns, Robert Gordon, Robert Shiller, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, The Chicago School, The Myth of the Rational Market, tulip mania, V2 rocket, volatility smile

Although discussing such debates is far from the scope of this book, I should note that the arguments offered here for how one should think of the status of mathematical models in finance are closely connected to more general discussions concerning the status of mathematical or physical theories quite generally. “. . . named after Scottish botanist Robert Brown . . .”: Brown’s observations were published as Brown (1828). “The mathematical treatment of Brownian motion . . .”: More generally, Brownian motion is an example of a random or “stochastic” process. For an overview of the mathematics of stochastic processes, see Karlin and Taylor (1975, 1981). “. . . it was his 1905 paper that caught Perrin’s eye”: Einstein published four papers in 1905. One of them was the one I refer to here (Einstein 1905b), but the other three were equally remarkable. In Einstein (1905a), he first suggests that light comes in discrete packets, now called quanta or photons; in Einstein (1905c), he introduces his special theory of relativity; and in Einstein (1905d), he proposes the famous equation e = mc²

The Code-Breakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet. New York: Scribner. Kaplan, Ian. 2002. “The Predictors by Thomas A. Bass: A Retrospective.” This is a comment on The Predictors by a former employee of the Prediction Company. Available at http://www.bearcave.com/bookrev/predictors2.html. Karlin, Samuel, and Howard M. Taylor. 1975. A First Course in Stochastic Processes. 2nd ed. San Diego, CA: Academic Press. — — — . 1981. A Second Course in Stochastic Processes. San Diego, CA: Academic Press. Katzmann, Robert A. 2008. Daniel Patrick Moynihan: The Intellectual in Public Life. Washington, DC: Woodrow Wilson Center Press. Kelly, J., Jr. 1956. “A New Interpretation of Information Rate.” IRE Transactions on Information Theory 2 (3, September): 185–89. Kelly, Kevin. 1994a. “Cracking Wall Street.”

“Consumer Prices, the Consumer Price Index, and the Cost of Living.” Journal of Economic Perspectives 12 (1, Winter): 3–26. Bosworth, Barry P. 1997. “The Politics of Immaculate Conception.” The Brookings Review, June, 43–44. Bouchaud, Jean-Philippe, and Didier Sornette. 1994. “The Black-Scholes Option Pricing Problem in Mathematical Finance: Generalization and Extensions for a Large Class of Stochastic Processes.” Journal de Physique 4 (6): 863–81. Bower, Tom. 1984. Klaus Barbie, Butcher of Lyons. London: M. Joseph. Bowman, D. D., G. Ouillion, C. G. Sammis, A. Sornette, and D. Sornette. 1998. “An Observational Test of the Critical Earthquake Concept.” Journal of Geophysical Research 103: 24359–72. Broad, William J. 1992. “Defining the New Plowshares Those Old Swords Will Make.” The New York Times, February 5.


pages: 695 words: 194,693

Money Changes Everything: How Finance Made Civilization Possible by William N. Goetzmann

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, Andrei Shleifer, asset allocation, asset-backed security, banking crisis, Benoit Mandelbrot, Black Swan, Black-Scholes formula, Bretton Woods, Brownian motion, capital asset pricing model, Cass Sunstein, collective bargaining, colonial exploitation, compound rate of return, conceptual framework, corporate governance, Credit Default Swap, David Ricardo: comparative advantage, debt deflation, delayed gratification, Detroit bankruptcy, disintermediation, diversified portfolio, double entry bookkeeping, Edmond Halley, en.wikipedia.org, equity premium, financial independence, financial innovation, financial intermediation, fixed income, frictionless, frictionless market, full employment, high net worth, income inequality, index fund, invention of the steam engine, invention of writing, invisible hand, James Watt: steam engine, joint-stock company, joint-stock limited liability company, laissez-faire capitalism, Louis Bachelier, mandelbrot fractal, market bubble, means of production, money: store of value / unit of account / medium of exchange, moral hazard, new economy, passive investing, Paul Lévy, Ponzi scheme, price stability, principal–agent problem, profit maximization, profit motive, quantitative trading / quantitative finance, random walk, Richard Thaler, Robert Shiller, Robert Shiller, shareholder value, short selling, South Sea Bubble, sovereign wealth fund, spice trade, stochastic process, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, time value of money, too big to fail, trade liberalization, trade route, transatlantic slave trade, transatlantic slave trade, tulip mania, wage slave

Mandelbrot was a student of Paul Lévy’s—the son of the man who gave Bachelier bad marks at his examination at the École Polytechnique in 1900. Lévy’s research focused on “stochastic processes”: mathematical models that describe the behavior of some variable through time. For example, we saw in Chapter 15 that Jules Regnault proposed and tested a stochastic process that varied randomly, which resulted in a rule about risk increasing with the square root of time. Likewise, Louis Bachelier more formally developed a random-walk stochastic process. Paul Lévy formalized these prior random walk models into a very general family of stochastic processes referred to as Lévy processes. Brownian motion was just one process in the family of Lévy processes—and perhaps the best behaved of them. Other stochastic processes have such things as discontinuous jumps and unusually large shocks (which might, for example, explain the crash of 1987, when the US stock market lost 22.6% of its value in a single day).

One of his major contributions to the literature on finance (published in 1966) was a proof that an efficient market implies that stock prices need not follow a random walk, but that they must be unpredictable. It was a nice refinement of Regnault’s hypothesis, articulated almost precisely a century prior. Although Mandelbrot ultimately developed a fractal-based option-pricing model with two of his students that allowed for extreme events and a more general stochastic process, for various reasons he never saw it adopted in practice to any great extent. I suspect that this is because the solution, while potentially useful, is complicated and contradicts most other tools that quantitative financiers use. With Mandelbrot’s models, it is all or nothing. You have to take a leap beyond the world of Brownian motion and throw out old friends like Bernoulli’s law of large numbers.

Benoit Mandelbrot believed he had discovered a deep structure to the world in general and financial markets in particular. His insights, however, can be traced directly back to the special tradition of mathematical inquiry that has its roots in the Enlightenment. I think this is what most excited him about his work—thinking of it in historical context as a culmination of applications of probability to markets. Although not all quants are aware of it, when they use a stochastic process (like Brownian motion) to price a security or figure out a hedge, they are drawing from a very deep well of mathematical knowledge that would not have existed but for the emergence of financial markets in Europe. Yes, the models that modern quants have applied to markets can go wrong. Models are crude attempts to characterize a reality that is complex and continually evolving. Despite the crashes—or perhaps because of them—financial markets have continually challenged the best and brightest minds with puzzles that hold the promise of intellectual and pecuniary rewards.


pages: 855 words: 178,507

The Information: A History, a Theory, a Flood by James Gleick

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, bank run, bioinformatics, Brownian motion, butterfly effect, citation needed, Claude Shannon: information theory, clockwork universe, computer age, conceptual framework, crowdsourcing, death of newspapers, discovery of DNA, double helix, Douglas Hofstadter, en.wikipedia.org, Eratosthenes, Fellow of the Royal Society, Gödel, Escher, Bach, Henri Poincaré, Honoré de Balzac, index card, informal economy, information retrieval, invention of the printing press, invention of writing, Isaac Newton, Jacquard loom, Jaron Lanier, jimmy wales, John von Neumann, Joseph-Marie Jacquard, Louis Daguerre, Marshall McLuhan, Menlo Park, microbiome, Milgram experiment, Network effects, New Journalism, Norbert Wiener, On the Economy of Machinery and Manufactures, PageRank, pattern recognition, phenotype, pre–internet, Ralph Waldo Emerson, RAND corporation, reversible computing, Richard Feynman, Simon Singh, Socratic dialogue, Stephen Hawking, Steven Pinker, stochastic process, talking drums, the High Line, The Wisdom of Crowds, transcontinental railway, Turing machine, Turing test, women in the workforce

.♦ To illuminate the structure of the message Shannon turned to some methodology and language from the physics of stochastic processes, from Brownian motion to stellar dynamics. (He cited a landmark 1943 paper by the astrophysicist Subrahmanyan Chandrasekhar in Reviews of Modern Physics.♦) A stochastic process is neither deterministic (the next event can be calculated with certainty) nor random (the next event is totally free). It is governed by a set of probabilities. Each event has a probability that depends on the state of the system and perhaps also on its previous history. If for event we substitute symbol, then a natural written language like English or Chinese is a stochastic process. So is digitized speech; so is a television signal. Looking more deeply, Shannon examined statistical structure in terms of how much of a message influences the probability of the next symbol.
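
Shannon's "probability that depends on the state of the system" can be made concrete in a few lines; a sketch of a second-order character model (my toy illustration, not Shannon's own construction):

    import random
    from collections import Counter, defaultdict

    # A toy corpus; any longer English text works better. (Illustrative choice.)
    text = "the theory of messages governs the message and the messenger"

    # Second-order statistics: which characters follow each two-character state?
    table = defaultdict(Counter)
    for i in range(len(text) - 2):
        table[text[i:i + 2]][text[i + 2]] += 1

    random.seed(1)
    state = text[:2]
    out = state
    for _ in range(40):
        if not table[state]:          # dead end: no observed successor
            break
        chars, weights = zip(*table[state].items())
        nxt = random.choices(chars, weights=weights)[0]  # probability depends on state
        out += nxt
        state = out[-2:]
    print(out)

Each next symbol is drawn from the probabilities observed after the preceding pair, which is precisely a stochastic process that is neither deterministic nor totally free.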

His colleagues thought this was a bit “addled”—that Shannon’s work was “more technology than mathematics,”♦ as Kolmogorov recalled it afterward. “It is true,” he said, “that Shannon left to his successors the rigorous ‘justification’ of his ideas in some difficult cases. However, his mathematical intuition was amazingly precise.” Kolmogorov was not as enthusiastic about cybernetics. Norbert Wiener felt a kinship with him—they had both done early work on stochastic processes and Brownian motion. On a visit to Moscow, Wiener said, “When I read the works of Academician Kolmogorov, I feel that these are my thoughts as well, this is what I wanted to say. And I know that Academician Kolmogorov has the same feeling when reading my works.”♦ But the feeling was evidently not shared. Kolmogorov steered his colleagues toward Shannon instead. “It is easy to understand that as a mathematical discipline cybernetics in Wiener’s understanding lacks unity,” he said, “and it is difficult to imagine productive work in training a specialist, say a postgraduate student, in cybernetics in this sense.”♦ He already had real results to back up his instincts: a useful generalized formulation of Shannon entropy, and an extension of his information measure to processes in both discrete and continuous time.

classification, 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 3.1, 3.2 Clausius, Rudolf, 9.1, 9.2, 9.3 Clauson-Thue, William, 5.1, 5.2, 5.3 Clement, Joseph, 4.1, 4.2 clocks, synchronization of, 1.1, 5.1, 5.2, 5.3, 5.4 cloud, information, 14.1, 14.2 clustering Clytemnestra code attempts to reduce cost of telegraphy, 5.1, 5.2 Babbage’s interest in cipher and compression systems for telegraphy, 5.1, 5.2, 5.3, 5.4, 5.5, 5.6 Enigma, 7.1, 7.2, 7.3 genetic, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9, 10.10 in Jacquard loom operations Morse, prl.1, 1.1, 1.2, 1.3, 1.4, 5.1, 5.2, 5.3, 5.4, 6.1, 11.1 as noise for printing telegraph Shannon’s interest in, prl.1, 6.1, 7.1 telegraphy before Morse code, 5.1, 5.2, 5.3, 5.4, 5.5, 5.6, 5.7 see also cryptography coding theory, 8.1, 8.2, 10.1, 12.1 cognitive science, 8.1, 8.2, 8.3, 8.4 Colebrooke, Henry collective consciousness, epl.1, epl.2, epl.3, epl.4, epl.5 Colossus computing machine Columbus, Christopher combinatorial analysis, 6.1, 10.1, 10.2 communication by algorithm with alien life-form, 12.1, 12.2, 12.3, 12.4, 12.5 Babbage’s mechanical notation for describing, 4.1, 4.2, 5.1 constrained channels of, 2.1, 2.2 disruptive effects of new technologies in, 15.1, 15.2 emergence of global consciousness, epl.1, epl.2, epl.3 evolution of electrical technologies for, 5.1, 5.2, 6.1, 6.2 fundamental problem of, prl.1, 7.1, 7.2, 8.1 human evolution and, prl.1, prl.2 implications of technological evolution of, 15.1, 15.2 information overload and, epl.1, epl.2 knowledge needs for, 12.1, 12.2, 12.3 in origins of governance Shannon’s diagram of, 7.1, 7.2, 7.3 as stochastic process symbolic logic to describe systems of system elements, 7.1, 7.2 in Twitter, epl.1, epl.2 see also talking drums; telegraphy; telephony; transmission of information compact disc, prl.1, 8.1, epl.1 complexity, 12.1, 12.2, 12.3, 12.4, 12.5, 12.6, 12.7, 12.8, 12.9 compression of information; see data compression “Computable Numbers, On” (Turing), 7.1, 7.2, 12.1 computation in Babylonian mathematics, 2.1, 2.2 computable and uncomputable numbers, 7.1, 7.2, 7.3, 7.4, 12.1, 12.2, 12.3 of differential equations, 4.1, 4.2 in evolution of complex structures human computers, 4.1, 4.2, 4.3 thermodynamics of, 13.1, 13.2, 13.3, 13.4 Turing machine for, 7.1, 7.2, 7.3, 7.4, 7.5, 7.6 see also calculators; computers computer(s) analog and digital, 8.1, 8.2 chess-playing, 8.1, 8.2 comparison to humans, 8.1, 8.2 cost of memory storage cost of work of, 13.1, 13.2 early mechanical, prl.1, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 8.1 growth of memory and processing speed of, 14.1, 14.2, 14.3, 14.4 inductive learning in perception of thinking by, 8.1, 8.2, 8.3, 8.4 public awareness of quantum-based, 13.1, 13.2, 13.3, 13.4 Shannon’s information theory in, prl.1, 6.1, 7.1, 7.2, 8.1, 8.2 significance of information theory in development of spread of memes through Turing’s conceptualization of, 8.1, 8.2, 8.3 universe as, 14.1, 14.2 see also calculators; computation; programming Conference on Cybernetics, 8.1, 8.2, 8.3, 8.4, 8.5, 8.6, 8.7, 8.8, 8.9, 8.10, 8.11, 8.12 Connolly, Sean J.

Mathematics for Finance: An Introduction to Financial Engineering by Marek Capinski, Tomasz Zastawniak

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Black-Scholes formula, Brownian motion, capital asset pricing model, cellular automata, delta neutral, discounted cash flows, discrete time, diversified portfolio, interest rate derivative, interest rate swap, locking in a profit, London Interbank Offered Rate, margin call, martingale, quantitative trading / quantitative finance, random walk, short selling, stochastic process, time value of money, transaction costs, value at risk, Wiener process, zero-coupon bond

This results in the following bond prices at time 1: 101.14531 in the up state and 100.9999 in the down state. (The latter is the same as for the par bond.) Expectation with respect to the risk-neutral probability gives the initial bond price 100.05489, so the floor is worth 0.05489.

Bibliography
Background Reading: Probability and Stochastic Processes

Ash, R. B. (1970), Basic Probability Theory, John Wiley & Sons, New York. Brzeźniak, Z. and Zastawniak, T. (1999), Basic Stochastic Processes, Springer Undergraduate Mathematics Series, Springer-Verlag, London. Capiński, M. and Kopp, P. E. (1999), Measure, Integral and Probability, Springer Undergraduate Mathematics Series, Springer-Verlag, London. Capiński, M. and Zastawniak, T. (2001), Probability Through Problems, Springer-Verlag, New York. Chung, K. L. (1974), A Course in Probability Theory, Academic Press, New York.

Erdmann (Oxford University), L.C.G. Rogers (University of Cambridge), E. Süli (Oxford University), J.F. Toland (University of Bath). Other books in this series: A First Course in Discrete Mathematics (I. Anderson); Analytic Methods for Partial Differential Equations (G. Evans, J. Blackledge, P. Yardley); Applied Geometry for Computer Graphics and CAD (D. Marsh); Basic Linear Algebra, Second Edition (T.S. Blyth and E.F. Robertson); Basic Stochastic Processes (Z. Brzeźniak and T. Zastawniak); Elementary Differential Geometry (A. Pressley); Elementary Number Theory (G.A. Jones and J.M. Jones); Elements of Abstract Analysis (M. Ó Searcóid); Elements of Logic via Numbers and Sets (D.L. Johnson); Essential Mathematical Biology (N.F. Britton); Fields, Flows and Waves: An Introduction to Continuum Models (D.F. Parker); Further Linear Algebra (T.S. Blyth and E.F. Robertson); Geometry R.

Monte Carlo Simulation and Finance by Don L. McLeish

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, discrete time, distributed generation, finite state, frictionless, frictionless market, implied volatility, incomplete markets, invention of the printing press, martingale, p-value, random walk, Sharpe ratio, short selling, stochastic process, stochastic volatility, the market place, transaction costs, value at risk, Wiener process, zero-coupon bond

This process Zs is, both in discrete and continuous time, a martingale.

[Figure 2.6: A sample path of the Wiener process]

Models in Continuous Time

We begin with some oversimplified rules of stochastic calculus which can be omitted by those with a background in Brownian motion and diffusion. First, we define a stochastic process Wt called the standard Brownian motion or Wiener process having the following properties:

1. For each h > 0, the increment W(t + h) − W(t) has a N(0, h) distribution and is independent of all preceding increments W(u) − W(v), t > u > v > 0.

2. W(0) = 0.

The fact that such a process exists is by no means easy to see. It has been an important part of the literature in Physics, Probability and Finance at least since the papers of Bachelier and Einstein, about 100 years ago.
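
A sample path like the one in Figure 2.6 follows directly from the two defining properties; a minimal sketch (my code, not the book's):

    import numpy as np

    rng = np.random.default_rng(42)
    T, n = 9.0, 900        # horizon and step count (arbitrary choices)
    h = T / n

    # Property 1: increments over disjoint intervals are independent N(0, h).
    increments = rng.normal(0.0, np.sqrt(h), size=n)

    # Property 2: W(0) = 0, so the path is the running sum of the increments.
    W = np.concatenate([[0.0], np.cumsum(increments)])
    print(W[:3], "...", W[-1])   # one discretized sample path of the Wiener process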

And when the drift term a(Xt, t) is linear in Xt, the solution of an ordinary differential equation will allow the calculation of the expected value of the process, and this is the first and most basic description of its behaviour. The appendix provides an elementary review of techniques for solving partial and ordinary differential equations. Note, however, that the information about a stochastic process obtained from a deterministic object such as an ordinary or partial differential equation is necessarily limited. For example, while we can sometimes obtain the marginal distribution of the process at time t, it is more difficult to obtain quantities such as the joint distribution of variables which depend on the path of the process, and these are important in valuing certain types of exotic options such as lookback and barrier options.

Solving deterministic differential equations can sometimes provide a solution to a specific problem such as finding the arbitrage-free price of a derivative. In general, for more complex features of the derivative such as the distribution of return, important for considerations such as the Value at Risk, we need to obtain a solution {Xt, 0 < t < T} to an equation of the above form which is a stochastic process. Typically this can only be done by simulation. One of the simplest methods of simulating such a process is motivated through a crude interpretation of the above equation in terms of discrete time steps, that is that a small increment Xt+h − Xt in the process is approximately normally distributed with mean given by a(Xt, t)h and variance given by σ²(Xt, t)h. We generate these increments sequentially, beginning with an assumed value for X0, and then adding to obtain an approximation to the value of the process at discrete times t = 0, h, 2h, 3h, …
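
That crude interpretation is the standard Euler (Euler–Maruyama) discretization; a minimal sketch, with drift and volatility functions chosen by me for illustration:

    import numpy as np

    def euler_path(a, sigma, x0, T, n, rng):
        """Euler scheme: add increments with mean a(x, t)*h and
        variance sigma(x, t)**2 * h, starting from x0."""
        h = T / n
        x, t = x0, 0.0
        path = [x0]
        for _ in range(n):
            x += a(x, t) * h + sigma(x, t) * np.sqrt(h) * rng.standard_normal()
            t += h
            path.append(x)
        return np.array(path)

    # Illustrative coefficients (my choice): mean-reverting drift, constant volatility.
    rng = np.random.default_rng(0)
    path = euler_path(a=lambda x, t: -x, sigma=lambda x, t: 0.5,
                      x0=1.0, T=5.0, n=1000, rng=rng)
    print(path[-1])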


pages: 105 words: 18,832

The Collapse of Western Civilization: A View From the Future by Naomi Oreskes, Erik M. Conway

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

anti-communist, correlation does not imply causation, en.wikipedia.org, energy transition, invisible hand, laissez-faire capitalism, market fundamentalism, means of production, oil shale / tar sands, road to serfdom, Ronald Reagan, stochastic process, the built environment, the market place

This was consistent with the expectation—based on physical theory—that warmer sea surface temperatures in regions of cyclogenesis could, and likely would, drive either more hurricanes or more intense ones. However, they backed away from this conclusion under pressure from their scientific colleagues. Much of the argument surrounded the concept of statistical significance. Given what we now know about the dominance of nonlinear systems and the distribution of stochastic processes, the then-dominant notion of a 95 percent confidence limit is hard to fathom. Yet overwhelming evidence suggests that twentieth-century scientists believed that a claim could be accepted only if, by the standards of Fisherian statistics, the possibility that an observed event could have happened by chance was less than 1 in 20. Many phenomena whose causal mechanisms were physically, chemically, or biologically linked to warmer temperatures were dismissed as “unproven” because they did not adhere to this standard of demonstration.

Commodity Trading Advisors: Risk, Performance Analysis, and Selection by Greg N. Gregoriou, Vassilios Karavas, François-Serge Lhabitant, Fabrice Douglas Rouah

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Asian financial crisis, asset allocation, backtesting, capital asset pricing model, collateralized debt obligation, commodity trading advisor, compound rate of return, constrained optimization, corporate governance, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, discrete time, distributed generation, diversification, diversified portfolio, dividend-yielding stocks, fixed income, high net worth, implied volatility, index arbitrage, index fund, interest rate swap, iterative process, linear programming, London Interbank Offered Rate, Long Term Capital Management, market fundamentalism, merger arbitrage, Mexican peso crisis / tequila crisis, p-value, Ponzi scheme, quantitative trading / quantitative finance, random walk, risk-adjusted returns, risk/return, Sharpe ratio, short selling, stochastic process, systematic trading, technology bubble, transaction costs, value at risk

Faff and Hallahan (2001) argue that survivorship bias is more likely to cause performance reversals than performance persistence. The data used show considerable kurtosis (see Table 3.1). However, this kurtosis may be caused by heteroskedasticity (returns of some funds are more variable than others).

REGRESSION TEST OF PERFORMANCE PERSISTENCE

To measure performance persistence, a model of the stochastic process that generates returns is required. The process considered is:

r_it = α_i + β_i r̄_t + ε_it,   ε_it ~ N(0, σ_i²),   i = 1, …, n and t = 1, …, T   (3.1)

where r_it is the return of fund (or CTA) i in month t, r̄_t is the average fund return in month t, and the slope parameter β_i captures differences in leverage. The model allows each fund to have a different variance, which is consistent with past research. We also considered models that assumed that β_i is zero, with either fixed effects (dummy variables) for time or random effects instead.
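
Equation (3.1) can be estimated fund by fund with ordinary least squares; a sketch of mine, run on simulated returns rather than the chapter's CTA data:

    import numpy as np

    rng = np.random.default_rng(3)
    n_funds, n_months = 10, 120
    r = rng.normal(0.01, 0.05, size=(n_funds, n_months))  # placeholder returns
    r_bar = r.mean(axis=0)                                # average fund return, month t

    X = np.column_stack([np.ones(n_months), r_bar])
    for i in range(n_funds):
        # OLS of fund i's returns on the cross-sectional average gives alpha_i, beta_i.
        (alpha_i, beta_i), *_ = np.linalg.lstsq(X, r[i], rcond=None)
        resid = r[i] - X @ np.array([alpha_i, beta_i])
        print(f"fund {i}: alpha={alpha_i:+.4f}  beta={beta_i:.2f}  "
              f"sigma={resid.std(ddof=2):.4f}")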

This demonstrates that most of the nonnormality shown in Table 3.1 is due to heteroskedasticity.

MONTE CARLO STUDY

In their method, EGR ranked funds by their mean return or modified Sharpe ratio in a first period, and then determined whether the funds that ranked high in the first period also ranked high in the second period. We use Monte Carlo simulation to determine the power and size of hypothesis tests with EGR’s method when data follow the stochastic process given in equation 3.1. Data were generated by specifying values of α, β, and σ. The simulation used 1,000 replications and 120 simulated funds. The mean return over all funds, r̄_t, is derived from the values of α and β as:

r̄_t = (Σ α_i / n + Σ ε_it / n) / (1 − Σ β_i / n)

where all sums are from i = 1 to n. A constant value of α simulates no performance persistence. For the data sets generated with persistence present, α was generated randomly based on the mean and variance of β’s in each of the three data sets.
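
A sketch of that data-generating step (my reconstruction of the setup described above, with illustrative parameter values):

    import numpy as np

    def simulate_funds(alpha, beta, sigma, n_months=120, rng=None):
        """Generate returns following r_it = alpha_i + beta_i * rbar_t + eps_it."""
        rng = rng or np.random.default_rng()
        eps = rng.normal(0.0, sigma[:, None], size=(len(alpha), n_months))
        # Cross-sectional mean implied by the model (the displayed equation):
        rbar = (alpha.mean() + eps.mean(axis=0)) / (1.0 - beta.mean())
        return alpha[:, None] + beta[:, None] * rbar + eps, rbar

    rng = np.random.default_rng(7)
    n = 120                                  # simulated funds, as in the study
    alpha = np.full(n, 0.005)                # constant alpha: no persistence
    beta = rng.uniform(0.5, 1.5, size=n)     # leverage spread (my choice)
    sigma = rng.uniform(0.02, 0.08, size=n)  # per-fund volatilities (my choice)
    returns, rbar = simulate_funds(alpha, beta, sigma, rng=rng)
    print(returns.shape, np.allclose(returns.mean(axis=0), rbar))

The final check confirms that the cross-sectional average of the generated returns reproduces r̄_t, as the displayed formula requires.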

Chicago Mercantile Exchange. (1999) “Question and Answer Report: Managed Futures Accounts.” Report No. M584/10M/1299. www.cve.com. Christoffersen, P. (2003) Elements of Financial Risk Management. San Diego, CA: Academic Press. Chung, S. Y. (1999) “Portfolio Risk Measurement: A Review of Value at Risk.” Journal of Alternative Investments, Vol. 2, No. 1, pp. 34–42. Clark, P. K. (1973) “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices.” Econometrica, Vol. 41, No. 1, pp. 135–155. Clayton, U. (2003) A Guide to the Law of Securitisation in Australia. Sydney, Australia: Clayton Company. Cooley, P. L., R. L. Roenfeldt, and N. K. Modani. (1977) “Interdependence of Market Risk Measures.” Journal of Business, Vol. 50, No. 3, pp. 356–363. Cootner, P. (1967) “Speculation and Hedging.”

How I Became a Quant: Insights From 25 of Wall Street's Elite by Richard R. Lindsey, Barry Schachter

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, algorithmic trading, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, asset allocation, asset-backed security, backtesting, bank run, banking crisis, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, business process, buy low sell high, capital asset pricing model, centre right, collateralized debt obligation, corporate governance, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, currency manipulation / currency intervention, discounted cash flows, disintermediation, diversification, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, full employment, George Akerlof, Gordon Gekko, hiring and firing, implied volatility, index fund, interest rate derivative, interest rate swap, John von Neumann, linear programming, Loma Prieta earthquake, Long Term Capital Management, margin call, market friction, market microstructure, martingale, merger arbitrage, Nick Leeson, P = NP, pattern recognition, pensions crisis, performance metric, prediction markets, profit maximization, purchasing power parity, quantitative trading / quantitative finance, QWERTY keyboard, RAND corporation, random walk, Ray Kurzweil, Richard Feynman, Richard Stallman, risk-adjusted returns, risk/return, shareholder value, Sharpe ratio, short selling, Silicon Valley, six sigma, sorting algorithm, statistical arbitrage, statistical model, stem cell, Steven Levy, stochastic process, systematic trading, technology bubble, The Great Moderation, the scientific method, too big to fail, trade route, transaction costs, transfer pricing, value at risk, volatility smile, Wiener process, yield curve, young professional

Like many mathematicians and physicists, I found the mathematics of the Black-Scholes options pricing formula incredibly interesting. For starters, after years of specializing in pure mathematics, I was starting from scratch in a totally new area. It allowed me to start to learn basic mathematics instead of delving deeper and deeper into advanced subjects. I literally had to start from scratch and learn probability theory and then the basics of stochastic processes, things I knew nothing at all about. Not to mention I knew nothing about financial markets, derivatives, or anything at all to do with finance. It was exciting to learn so much from scratch. In the midst of reading about Black-Scholes, I was also deeply involved with writing the book with Victor Ginzburg from the University of Chicago.

Richard Grinold, who was my prethesis advisor, gave me a copy of the HJM paper a couple of weeks before the seminar and told me to dig into it. This represents some of the best academic advice I have ever received since I am not sure that I would have immediately realized the model’s importance and potential for further work by myself. The rest, in some sense, is history. I really enjoyed the paper because I was struggling to understand some of the rather abstract questions in stochastic process theory that it dealt with, and I quickly decided to work on the HJM model for my dissertation. Broadly speaking, the HJM paradigm still represents the state of the art in interest rate derivatives pricing, so having been working with it from the very beginning is definitely high on my list of success factors later in life. In my five years at Berkeley, I met a few other people of critical importance to my career path, and life in general.

At Columbia College, I decided to enroll in its three-two program, which meant that I spent three years studying the contemporary civilization and humanities core curriculum, as well as the hard sciences, and then two years at the Columbia School of Engineering. There, I found a home in operations research, which allowed me to study computer science and applied mathematics, including differential equations, stochastic processes, statistical quality control, and mathematical programming. While studying for my master’s in operations research at Columbia, I had the opportunity to work at the Rand Institute, where math and computer science were applied to real-world problems. There I was involved in developing a large-scale simulation model designed to optimize response times for the New York City Fire Department. My interest in applied math led me to Carnegie-Mellon’s Graduate School of Industrial Administration, which had a strong operations research faculty.


pages: 396 words: 112,748

Chaos by James Gleick

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Benoit Mandelbrot, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Murray Gell-Mann, Norbert Wiener, pattern recognition, Richard Feynman, Stephen Hawking, stochastic process, trade route

It seems to have been the issue on which many different fields of science were stuck—they were stuck on this aspect of the nonlinear behavior of systems. Now, nobody would have thought that the right background for this problem was to know particle physics, to know something about quantum field theory, and to know that in quantum field theory you have these structures known as the renormalization group. Nobody knew that you would need to understand the general theory of stochastic processes, and also fractal structures. “Mitchell had the right background. He did the right thing at the right time, and he did it very well. Nothing partial. He cleaned out the whole problem.” Feigenbaum brought to Los Alamos a conviction that his science had failed to understand hard problems—nonlinear problems. Although he had produced almost nothing as a physicist, he had accumulated an unusual intellectual background.

Astute readers, though, could tell that I preferred Joe Ford’s more freewheeling “cornucopia” style of definition—“Dynamics freed at last from the shackles of order and predictability…”—and still do. But everything evolves in the direction of specialization, and strictly speaking, “chaos” is now a very particular thing. When Yaneer Bar-Yam wrote a kilopage textbook, Dynamics of Complex Systems, in 2003, he took care of chaos proper in the first section of the first chapter. (“The first chapter, I have to admit, is 300 pages, okay?” he says.) Then came Stochastic Processes, Modeling Simulation, Cellular Automata, Computation Theory and Information Theory, Scaling, Renormalization, and Fractals, Neural Networks, Attractor Networks, Homogenous Systems, Inhomogenous Systems, and so on. Bar-Yam, the son of a high-energy physicist, had studied condensed matter physics and become an engineering professor at Boston University, but he left in 1997 to found the New England Complex Systems Institute.


pages: 518 words: 107,836

How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy) by Benjamin Peters

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, Andrei Shleifer, Benoit Mandelbrot, bitcoin, Brownian motion, Claude Shannon: information theory, cloud computing, cognitive dissonance, computer age, conceptual framework, crony capitalism, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, Dissolution of the Soviet Union, double helix, Drosophila, Francis Fukuyama: the end of history, From Mathematics to the Technologies of Life and Death, hive mind, index card, informal economy, invisible hand, Jacquard loom, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, linear programming, mandelbrot fractal, Marshall McLuhan, means of production, Menlo Park, Mikhail Gorbachev, mutually assured destruction, Network effects, Norbert Wiener, packet switching, pattern recognition, Paul Erdős, Peter Thiel, RAND corporation, rent-seeking, road to serfdom, Ronald Coase, scientific mainstream, Steve Jobs, Stewart Brand, stochastic process, technoutopianism, The Structural Transformation of the Public Sphere, transaction costs, Turing machine

During World War II, Wiener researched ways to integrate human gunner and analog computer agency in antiaircraft artillery fire-control systems, vaulting his wartime research on the feedback processes among humans and machines into a general science of communication and control, with the gun and gunner ensemble (the man and the antiaircraft gun cockpit) as the original image of the cyborg.5 To designate this new science of control and feedback mechanisms, Wiener coined the neologism cybernetics from the Greek word for steersman, which is a predecessor to the English term governor (there is a common consonant-vowel structure between cybern- and govern—k/g + vowel + b/v + ern). Wiener’s popular masterworks ranged further still, commingling complex mathematical analysis (especially noise and stochastic processes), exposition on the promise and threat associated with automated information technology, and various speculations of social, political, and religious natures.6 For Wiener, cybernetics was a working out of the implications of “the theory of messages” and the ways that information systems organized life, the world, and the cosmos. He found parallel structures in the communication and control systems operating in animal neural pathways, electromechanical circuits, and information flows in larger social systems.7 The fact that his work speaks in general mathematical terms also sped his work’s reception and eventual embrace by a wide range of readers, including Soviet philosopher-critics, as examined later.

Because the coauthors were sensitive to how language, especially foreign terms, packs in questions of international competition, the coauthors attempted to keep their language as technical and abstract as possible, reminding the reader that the cybernetic mind-machine analogy was central to the emerging science but should be understood only “from a functional point of view,” not a philosophical one.76 The technical and abstract mathematical language of Wiener’s cybernetics thus served as a political defense against Soviet philosopher-critics and as ballast for generalizing the coauthors’ ambitions for scientists in other fields. They employed a full toolbox of cybernetic terminology, including signal words such as homeostasis, feedback, entropy, reflex, and the binary digit. They also repeated Wiener and Shannon’s emphases on probabilistic, stochastic processes as the preferred mathematical medium for scripting behavioral patterns onto abstract logical systems, including a whole section that elaborated on the mind-machine analogy with special emphasis on the central processor as capable of memory, responsiveness, and learning.77Wiener’s call for cyberneticists with “Leibnizian catholicity” of scientific interests was tempered into its negative form—a warning against disciplinary isolationism.78 On the last page of the article, the coauthors smoothed over the adoption of Wiener, an American, as foreign founder of Soviet cybernetics by summarizing and stylizing Wiener’s “sharp critique of capitalist society,” his pseudo-Marxist prediction of a “new industrial revolution” that would arise out of the “chaotic conditions of the capitalist market,” and his widely publicized postwar fear of “the replacement of common workers with mechanical robots.”79 A word play in Russian animates this last phrase: the Russian word for worker, or rabotnik, differs only by a vowel transformation from robot, the nearly universal term coined in 1927 by the playwright Karel Capek from the Czech word for “forced labor.”80 The first industrial revolution replaced the hand with the machine, or the rabotnik with the robot, and Wiener’s science, the coauthors dreamed, would help usher in a “second industrial revolution” in which the labor of the human mind could be carried out by intelligent machines, thus freeing, as Marx had intimated a century earlier, the mind to higher pursuits.


pages: 111 words: 1

Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Nicholas Taleb

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Antoine Gombaud: Chevalier de Méré, availability heuristic, backtesting, Benoit Mandelbrot, Black Swan, complexity theory, corporate governance, currency peg, Daniel Kahneman / Amos Tversky, discounted cash flows, diversified portfolio, endowment effect, equity premium, global village, hindsight bias, Long Term Capital Management, loss aversion, mandelbrot fractal, mental accounting, meta analysis, meta-analysis, quantitative trading / quantitative finance, QWERTY keyboard, random walk, Richard Feynman, road to serfdom, Robert Shiller, shareholder value, Sharpe ratio, Steven Pinker, stochastic process, too big to fail, Turing test, Yogi Berra

The Tools The notion of alternative histories discussed in the last chapter can be extended considerably and subjected to all manner of technical refinement. This brings us to the tools used in my profession to toy with uncertainty. I will outline them next. Monte Carlo methods, in brief, consist of creating artificial history using the following concepts. First, consider the sample path. The invisible histories have a scientific name, alternative sample paths, a name borrowed from the field of mathematics of probability called stochastic processes. The notion of path, as opposed to outcome, indicates that it is not a mere MBA-style scenario analysis, but the examination of a sequence of scenarios along the course of time. We are not just concerned with where a bird can end up tomorrow night, but rather with all the various places it can possibly visit during the time interval. We are not concerned with what the investor’s worth would be in, say, a year, but rather of the heart-wrenching rides he may experience during that period.

Starting at $100, in one scenario it can end up at $20 having seen a high of $220; in another it can end up at $145 having seen a low of $10. Another example is the evolution of your wealth during an evening at a casino. You start with $1,000 in your pocket, and measure it every fifteen minutes. In one sample path you have $2,200 at midnight; in another you barely have $20 left for a cab fare. Stochastic processes refer to the dynamics of events unfolding with the course of time. Stochastic is a fancy Greek name for random. This branch of probability concerns itself with the study of the evolution of successive random events—one could call it the mathematics of history. The key about a process is that it has time in it. What is a Monte Carlo generator? Imagine that you can replicate a perfect roulette wheel in your attic without having recourse to a carpenter.
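
A toy Monte Carlo generator for the casino evening (my sketch): each sample path records the whole ride, not just the final outcome:

    import numpy as np

    rng = np.random.default_rng(8)
    n_paths, n_bets, bet = 10_000, 60, 50   # illustrative choices

    # Each $50 bet wins with probability 0.495 (a small house edge, my assumption).
    wins = rng.random((n_paths, n_bets)) < 0.495
    wealth = 1_000 + np.cumsum(np.where(wins, bet, -bet), axis=1)

    final, lows = wealth[:, -1], wealth.min(axis=1)
    print(f"median final wealth: ${np.median(final):.0f}")
    print(f"paths that dipped below $500: {(lows < 500).mean():.1%}")
    print(f"...yet ended above $1,000: {(final[lows < 500] > 1000).mean():.1%}")

Looking at the minimum along each path, rather than only the endpoint, is exactly the difference between a sample path and a mere outcome.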


pages: 119 words: 10,356

Topics in Market Microstructure by Ilija I. Zovko

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Brownian motion, continuous double auction, correlation coefficient, financial intermediation, Gini coefficient, market design, market friction, market microstructure, Murray Gell-Mann, p-value, quantitative trading / quantitative finance, random walk, stochastic process, stochastic volatility, transaction costs

Quantitative Finance, 2:346–353, 2002. W. S. Choi, S. B. Lee, and P. I. Yu. Estimating the permanent and transitory components of the bid/ask spread. In C.-F. Lee, editor, Advances in investment analysis and portfolio management. Volume 5. Elsevier, 1998. T. Chordia and B. Swaminathan. Trading volume and cross-autocorrelations in stock returns. Journal of Finance, LV(2), April 2000. P. K. Clark. Subordinated stochastic process model with finite variance for speculative prices. Econometrica, 41(1):135–155, 1973. K. J. Cohen, S. F. Maier, R. A. Schwartz, and D. K. Whitcomb. Transaction costs, order placement strategy, and existence of the bid-ask spread. Journal of Political Economy, 89(2):287–305, 1981. K. J. Cohen, R. M. Conroy, and S. F. Maier. Order flow and the quality of the market. In Y. Amihud, T. Ho, and R.

Fifty Challenging Problems in Probability With Solutions by Frederick Mosteller

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Isaac Newton, John von Neumann, prisoner's dilemma, RAND corporation, stochastic process

As is well known, no strategy can give him a higher probability of achieving his goal, and the probability is this high if and only if he makes sure either to lose x or win y eventually.

The Lesser Paradise*

The Lesser Paradise resembles the Golden Paradise with the important difference that before leaving the hall the gambler must pay an income tax of t · 100% (0 < t < 1) on any net positive income that he has won there. It is therefore no harder or easier for him to win y dollars with an initial fortune of x than it is for his brother in the Golden Paradise to win y/(1 − t) dollars. The greatest probability with which he can achieve his goal is therefore

(1 − t)x / ((1 − t)x + y)   (1)

The Paradise Lost

Here, the croupier collects the tax of …

*First published, 1965. Reprinted by Dover Publications, Inc. in 1976 under the title Inequalities for Stochastic Processes.
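
The fair-game fact behind these calculations, that starting from x the best probability of winning y before going broke is x/(x + y), checks out in simulation (my sketch, with unit fair bets; in a fair game the hitting probability is the same for any strategy that plays until ruin or goal):

    import numpy as np

    def p_reach_goal(x, y, n_trials=50_000, seed=0):
        """Fair unit bets: estimate P(reach x + y before 0), starting from x."""
        rng = np.random.default_rng(seed)
        wins = 0
        for _ in range(n_trials):
            w = x
            while 0 < w < x + y:
                w += 1 if rng.random() < 0.5 else -1
            wins += w == x + y
        return wins / n_trials

    x, y = 3, 7
    print(p_reach_goal(x, y), "vs theory:", x / (x + y))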


pages: 442 words: 39,064

Why Stock Markets Crash: Critical Events in Complex Financial Systems by Didier Sornette

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Asian financial crisis, asset allocation, Berlin Wall, Bretton Woods, Brownian motion, capital asset pricing model, capital controls, continuous double auction, currency peg, Deng Xiaoping, discrete time, diversified portfolio, Elliott wave, Erdős number, experimental economics, financial innovation, floating exchange rates, frictionless, frictionless market, full employment, global village, implied volatility, index fund, invisible hand, John von Neumann, joint-stock company, law of one price, Louis Bachelier, mandelbrot fractal, margin call, market bubble, market clearing, market design, market fundamentalism, mental accounting, moral hazard, Network effects, new economy, oil shock, open economy, pattern recognition, Paul Erdős, quantitative trading / quantitative finance, random walk, risk/return, Ronald Reagan, Schrödinger's Cat, short selling, Silicon Valley, South Sea Bubble, statistical model, stochastic process, Tacoma Narrows Bridge, technological singularity, The Coming Technological Singularity, The Wealth of Nations by Adam Smith, Tobin tax, total factor productivity, transaction costs, tulip mania, VA Linux, Y2K, yield curve

General proof that properly anticipated prices are random. Samuelson has proved a general theorem showing that the concept that prices are unpredictable can actually be deduced rigorously [357] from a model that hypothesizes that a stock’s present price p_t is set at the expected discounted value of its future dividends d_t, d_{t+1}, d_{t+2}, … (which are supposed to be random variables generated according to any general (but known) stochastic process):

p_t = d_t + δ_1 d_{t+1} + δ_1 δ_2 d_{t+2} + δ_1 δ_2 δ_3 d_{t+3} + ···   (3)

where the discount factors δ_i = 1 − r < 1, which can fluctuate from one time period to the next, account for the depreciation of a future price calculated at present due to the nonzero consumption price index r. We see that p_t = d_t + δ_1 p_{t+1}, and thus the expectation E p_{t+1} of p_{t+1} conditioned on the knowledge of the present price p_t is

E p_{t+1} = (p_t − d_t) / δ_1   (4)

This shows that, barring the drift due to the inflation and the dividend, the price increment does not have a systematic component or memory of the past and is thus random.
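
A quick numerical check of relation (4) (my sketch, with i.i.d. dividends and a constant discount factor δ, in which case p_t = d_t + δE[d]/(1 − δ)):

    import numpy as np

    rng = np.random.default_rng(1)
    delta, mean_d, T = 0.95, 1.0, 100_000   # illustrative parameters
    d = rng.exponential(mean_d, size=T)     # i.i.d. dividend stream

    # Expected discounted value of future dividends, per equation (3):
    p = d + delta * mean_d / (1 - delta)

    # Equation (4): the mean of p_{t+1} should equal the mean of (p_t - d_t)/delta.
    print(np.mean(p[1:]), "vs", np.mean((p[:-1] - d[:-1]) / delta))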

Inductive reasoning and bounded rationality (The El Farol Problem), American Economic Review (Papers and Proceedings) 84. 18. Arthur, W., Lane, D., and Durlauf, S., Editors (1997). The economy as an evolving complex system II (Addison-Wesley, Redwood City). 19. Arthur, W. B. (1987). Self-reinforcing mechanisms in economics, Center for Economic Policy Research 111, 1–20. 20. Arthur, W. B., Ermoliev, Y. M., and Kaniovsky, Y. M. (1984). Strong laws for a class of path-dependent stochastic processes with applications, in Proceedings of the International Conference on Stochastic Optimization, A. Shiryaev and R. Wets, editors (Springer-Verlag, New York), pp. 287–300. 21. Arthur, W. B., Holland, J. H., LeBaron, B., Palmer, R., and Taylor, P. (1997). Asset pricing under endogenous expectations in an artificial stock market, in The Economy as an Evolving Complex System II, W. Arthur, D. Lane, and S.

Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing (Writing Science) by Thierry Bardini

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Apple II, augmented reality, Bill Duvall, conceptual framework, Douglas Engelbart, Dynabook, experimental subject, Grace Hopper, hiring and firing, hypertext link, index card, information retrieval, invention of hypertext, Jaron Lanier, Jeff Rulifson, John von Neumann, knowledge worker, Menlo Park, Mother of all demos, new economy, Norbert Wiener, packet switching, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, RFC: Request For Comment, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, stochastic process, Ted Nelson, the medium is the message, theory of mind, Turing test, unbiased observer, Vannevar Bush, Whole Earth Catalog

In the conceptual world, both the transmission and the transformation of what Whorf called "culturally ordained forms and categories" is the process by which people learn. The crucial point in Bateson's synthesis lay in the characterization of all such processes as "stochastic": Both genetic change and the process called learning (including the somatic changes induced by the environment) are stochastic processes. In each case there is, I believe, a stream of events that is random in certain aspects and in each case there is a nonrandom selective process which causes certain of the random components to "survive" longer than others. Without the random, there can be no new thing. . . . We face, then, two great stochastic systems that are partly in interaction and partly isolated from each other. One system is within the individual and is called learning; the other is immanent in heredity and in populations and is called evolution.

In all three of these features, David stresses "software over hardware," or "the touch typist's memory of a particular arrangement of the keys" over this particular arrangement of the keys, and concludes "this, then, was a situation in which the precise details of timing in the developmental sequence had made it profitable in the short run to adapt machines to the habit of men (or to women, as was increasingly the case) rather than the other way around. And things have been this way ever since" (ibid., 336).6 Thus, it was by institutionalization as an incorporating practice that the QWERTY standard became established. The establishment of a commercial education network favoring the QWERTY was the decisive factor, the source of the "historical accident" that governed the stochastic process that secured forever the supremacy of the QWERTY. It is indeed because of such an "accident" that the six or seven years during which Remington enjoyed the early advantage of being the sole owner of the typewriter patent also saw its selling agents establish profitable and durable business associations with the commercial education business. These early business ties soon gave place to an organized and institutional network of associations that secured Remington's position in the typewriter business.


pages: 425 words: 122,223

Capital Ideas: The Improbable Origins of Modern Wall Street by Peter L. Bernstein

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, asset allocation, backtesting, Benoit Mandelbrot, Black-Scholes formula, Bonfire of the Vanities, Brownian motion, buy low sell high, capital asset pricing model, debt deflation, diversified portfolio, Eugene Fama: efficient market hypothesis, financial innovation, financial intermediation, fixed income, full employment, implied volatility, index arbitrage, index fund, interest rate swap, invisible hand, John von Neumann, Joseph Schumpeter, law of one price, linear programming, Louis Bachelier, mandelbrot fractal, martingale, means of production, new economy, New Journalism, profit maximization, Ralph Nader, RAND corporation, random walk, Richard Thaler, risk/return, Robert Shiller, Ronald Reagan, stochastic process, the market place, The Predators' Ball, the scientific method, The Wealth of Nations by Adam Smith, Thorstein Veblen, transaction costs, transfer pricing, zero-coupon bond

Paul Cootner, one of the leading finance scholars of the 1960s, once delivered this accolade: “So outstanding is his work that we can say that the study of speculative prices has its moment of glory at its moment of conception.”1 Bachelier laid the groundwork on which later mathematicians constructed a full-fledged theory of probability. He derived a formula that anticipated Einstein’s research into the behavior of particles subject to random shocks in space. And he developed the now universally used concept of stochastic processes, the analysis of random movements among statistical variables. Moreover, he made the first theoretical attempt to value such financial instruments as options and futures, which had active markets even in 1900. And he did all this in an effort to explain why prices in capital markets are impossible to predict! Bachelier’s opening paragraphs contain observations about “fluctuations on the Exchange” that could have been written today.

(LOR) Leland-Rubinstein Associates Leverage Leveraged buyouts Liquidity management market money Preference theory stock “Liquidity Preference as Behavior Toward Risk” (Tobin) Linear programming Loading charges: see Brokerage commissions London School of Economics (LSE) London Stock Exchange Macroeconomics Management Science Marginal utility concept “Market and Industry Factors in Stock Price Performance” (King) Market theories (general discussion). See also specific theories and types of securities competitive disaster avoidance invisible hand linear regression/econometric seasonal fluctuations stochastic process Mathematical economics Mathematical Theory of Non-Uniform Gases, The Maximum expected return concept McCormick Harvester Mean-Variance Analysis Mean-Variance Analysis in Portfolio Choice and Capital Markets (Markowitz) “Measuring the Investment Performance of Pension Funds,” report Mellon Bank Merck Merrill Lynch Minnesota Mining MIT MM Theory “Modern Portfolio Theory. How the New Investment Technology Evolved” Money Managers, The (“Adam Smith”) Money market funds Mortgages government-guaranteed prepaid rates on “‘Motionless’ Motion of Swift’s Flying Island, The” (Merton) Multiple manager risk analysis (MULMAN) Mutual funds individual investment in performance analysis of portfolio management and Value Line National Bureau of Economic Research National General Naval Research Logistics Quarterly New School for Social Research New York Stock Exchange volume of trading New York Times averages “Noise” (Black) Noise trading asset prices and inefficiency of October, 1987, crash OPEC countries Operations Research Optimal capital structure Optimal investment strategy: see Diversification; Portfolio(s), optimal “Optimization of a Quadratic Function Subject to Linear Constraints, The” (Markowitz) Optimization theory Options call contracts expected return on implicit out-of-the-money/in-the-money pricing formulas put valuation Options markets over-the-counter Pacific Stock Exchange Paul A.


pages: 523 words: 143,139

Algorithms to Live By: The Computer Science of Human Decisions by Brian Christian, Tom Griffiths

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, algorithmic trading, anthropic principle, asset allocation, autonomous vehicles, Berlin Wall, Bill Duvall, bitcoin, Community Supported Agriculture, complexity theory, constrained optimization, cosmological principle, cryptocurrency, Danny Hillis, delayed gratification, dematerialisation, diversification, double helix, Elon Musk, fault tolerance, Fellow of the Royal Society, Firefox, first-price auction, Flash crash, Frederick Winslow Taylor, George Akerlof, global supply chain, Google Chrome, Henri Poincaré, information retrieval, Internet Archive, Jeff Bezos, John Nash: game theory, John von Neumann, knapsack problem, Lao Tzu, linear programming, martingale, Nash equilibrium, natural language processing, NP-complete, P = NP, packet switching, prediction markets, race to the bottom, RAND corporation, RFC: Request For Comment, Robert X Cringely, sealed-bid auction, second-price auction, self-driving car, Silicon Valley, Skype, sorting algorithm, spectrum auction, Steve Jobs, stochastic process, Thomas Malthus, traveling salesman, Turing machine, urban planning, Vickrey auction, Walter Mischel, Y Combinator

Like the famous Heisenberg uncertainty principle of particle physics, which says that the more you know about a particle’s momentum the less you know about its position, the so-called bias-variance tradeoff expresses a deep and fundamental bound on how good a model can be—on what it’s possible to know and to predict. This notion is found in various places in the machine-learning literature. See, for instance, Geman, Bienenstock, and Doursat, “Neural Networks and the Bias/Variance Dilemma,” and Grenander, “On Empirical Spectral Analysis of Stochastic Processes.” in the Book of Kings: The bronze snake, known as Nehushtan, gets destroyed in 2 Kings 18:4. “pay good money to remove the tattoos”: Gilbert, Stumbling on Happiness. duels less than fifty years ago: If you’re not too fainthearted, you can watch video of a duel fought in 1967 at http://passerelle-production.u-bourgogne.fr/web/atip_insulte/Video/archive_duel_france.swf. as athletes overfit their tactics: For an interesting example of very deliberately overfitting fencing, see Harmenberg, Epee 2.0.
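
The bias-variance point in the first note above can be seen in a few lines (my illustration, not the authors'): an underfit line has high bias, an overfit high-degree polynomial has high variance, and both predict new data poorly:

    import numpy as np

    rng = np.random.default_rng(2)
    f = lambda x: np.sin(3 * x)                     # the "truth" (my choice)
    x_train = rng.uniform(-1, 1, 12)
    y_train = f(x_train) + rng.normal(0, 0.2, 12)   # noisy observations
    x_test = rng.uniform(-1, 1, 200)

    for degree in (1, 4, 9):
        coeffs = np.polyfit(x_train, y_train, degree)
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - f(x_test)) ** 2)
        print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")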

Nature 363 (1993): 315–319. Gould, Stephen Jay. “The Median Isn’t the Message.” Discover 6, no. 6 (1985): 40–42. Graham, Ronald L., Eugene L. Lawler, Jan Karel Lenstra, and Alexander H. G. Rinnooy Kan. “Optimization and Approximation in Deterministic Sequencing and Scheduling: A Survey.” Annals of Discrete Mathematics 5 (1979): 287–326. Grenander, Ulf. “On Empirical Spectral Analysis of Stochastic Processes.” Arkiv för Matematik 1, no. 6 (1952): 503–531. Gridgeman, T. “Geometric Probability and the Number π.” Scripta Mathematika 25, no. 3 (1960): 183–195. Griffiths, Thomas L., Charles Kemp, and Joshua B. Tenenbaum. “Bayesian Models of Cognition.” In The Cambridge Handbook of Computational Cognitive Modeling. Edited by Ron Sun. Cambridge, UK: Cambridge University Press, 2008. Griffiths, Thomas L., Falk Lieder, and Noah D.


pages: 247 words: 43,430

Think Complexity by Allen B. Downey

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Benoit Mandelbrot, cellular automata, Conway's Game of Life, Craig Reynolds: boids flock, discrete time, en.wikipedia.org, Frank Gehry, Gini coefficient, Guggenheim Bilbao, mandelbrot fractal, Occupy movement, Paul Erdős, sorting algorithm, stochastic process, strong AI, Thomas Kuhn: the structure of scientific revolutions, Turing complete, Turing machine, We are the 99%

, Stanley Milgram sorting, Analysis of Basic Python Operations, Analysis of Basic Python Operations source node, Dijkstra spaceships, Structures, Life Patterns spanning cluster, Percolation special creation, Falsifiability spectral density, Spectral Density spherical cow, The Axes of Scientific Models square, Fractals stable sort, Analysis of Basic Python Operations Stanford Large Network Dataset Collection, Zipf, Pareto, and Power Laws state, Cellular Automata, Stephen Wolfram, Sand Piles stochastic process, The Axes of Scientific Models stock market, SOC, Causation, and Prediction StopIteration, Iterators __str__, Representing Graphs, Representing Graphs strategy, Prisoner’s Dilemma string concatenation, Analysis of Basic Python Operations string methods, Analysis of Basic Python Operations Strogatz, Steven, Paradigm Shift?, Watts and Strogatz The Structure of Scientific Revolutions, Paradigm Shift?


pages: 651 words: 180,162

Antifragile: Things That Gain From Disorder by Nassim Nicholas Taleb

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Air France Flight 447, Andrei Shleifer, banking crisis, Benoit Mandelbrot, Berlin Wall, Black Swan, credit crunch, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, discrete time, double entry bookkeeping, Emanuel Derman, epigenetics, financial independence, Flash crash, Gary Taubes, Gini coefficient, Henri Poincaré, high net worth, Ignaz Semmelweis: hand washing, informal economy, invention of the wheel, invisible hand, Isaac Newton, James Hargreaves, Jane Jacobs, joint-stock company, joint-stock limited liability company, Joseph Schumpeter, knowledge economy, Lao Tzu, Long Term Capital Management, loss aversion, Louis Pasteur, mandelbrot fractal, meta analysis, meta-analysis, microbiome, moral hazard, mouse model, Norbert Wiener, pattern recognition, placebo effect, Ponzi scheme, principal–agent problem, purchasing power parity, quantitative trading / quantitative finance, Ralph Nader, random walk, Ray Kurzweil, rent control, Republic of Letters, Ronald Reagan, Rory Sutherland, Silicon Valley, six sigma, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, stochastic process, stochastic volatility, The Great Moderation, The Wealth of Nations by Adam Smith, Thomas Malthus, too big to fail, transaction costs, urban planning, Yogi Berra, Zipf's Law

Next we turn to a central distinction between the things that like stress and other things that don’t. 1 Cato was the statesman who, three books ago (Fooled by Randomness), expelled all philosophers from Rome. 2 This little bit of effort seems to activate the switch between two distinct mental systems, one intuitive and the other analytical, what psychologists call “system 1” and “system 2.” 3 There is nothing particularly “white” in white noise; it is simply random noise that follows a Normal Distribution. 4 The obvious has not been tested empirically: Can the occurrence of extreme events be predicted from past history? Alas, according to a simple test: no, sorry. 5 Set a simple filtering rule: all members of a species need to have a neck forty centimeters long in order to survive. After a few generations, the surviving population would have, on average, a neck longer than forty centimeters. (More technically, a stochastic process subjected to an absorbing barrier will have an observed mean higher than the barrier.) 6 The French have a long series of authors who owe part of their status to their criminal record—which includes the poet Ronsard, the writer Jean Genet, and many others. CHAPTER 3 The Cat and the Washing Machine Stress is knowledge (and knowledge is stress)—The organic and the mechanical—No translator needed, for now—Waking up the animal in us, after two hundred years of modernity The bold conjecture made here is that everything that has life in it is to some extent antifragile (but not the reverse).
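
Footnote 5's absorbing-barrier effect is easy to verify numerically; a minimal sketch of my own: simulate driftless paths, "absorb" any that touch the barrier, and compare the means:

    import numpy as np

    rng = np.random.default_rng(5)
    n_paths, n_steps, barrier = 50_000, 50, 0.0

    # Driftless random walks starting at 1: no tendency to grow...
    paths = 1.0 + np.cumsum(rng.normal(0.0, 0.3, size=(n_paths, n_steps)), axis=1)

    # ...but any path that ever touches the barrier is absorbed (it "dies").
    alive = paths.min(axis=1) > barrier

    print(f"survivors: {alive.mean():.1%}")
    print(f"mean endpoint, all paths:  {paths[:, -1].mean():+.3f}")
    print(f"mean endpoint, survivors:  {paths[alive, -1].mean():+.3f}  (> barrier)")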

My dream—the solution—is that we would have a National Entrepreneur Day, with the following message: Most of you will fail, disrespected, impoverished, but we are grateful for the risks you are taking and the sacrifices you are making for the sake of the economic growth of the planet and pulling others out of poverty. You are at the source of our antifragility. Our nation thanks you. 1 A technical comment on why the adaptability criterion is innocent of probability (the nontechnical reader should skip the rest of this note). The property in a stochastic process of not seeing at any time period t what would happen in time after t, that is, any period higher than t, hence reacting with a lag, an incompressible lag, is called nonanticipative strategy, a requirement of stochastic integration. The incompressibility of the lag is central and unavoidable. Organisms can only have nonanticipative strategies—hence nature can only be nonpredictive. This point is not trivial at all, and has even confused probabilists such as the Russian School represented by Stratonovich and the users of his method of integration, who fell into the common mental distortion of thinking that the future sends some signal detectable by us.


pages: 348 words: 39,850

Data Scientists at Work by Sebastian Gutierrez

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, algorithmic trading, bioinformatics, bitcoin, business intelligence, chief data officer, clean water, cloud computing, computer vision, continuous integration, correlation does not imply causation, crowdsourcing, data is the new oil, DevOps, domain-specific language, follow your passion, full text search, informal economy, information retrieval, Infrastructure as a Service, inventory management, iterative process, linked data, Mark Zuckerberg, microbiome, Moneyball by Michael Lewis explains big data, move fast and break things, natural language processing, Network effects, nuclear winter, optical character recognition, pattern recognition, Paul Graham, personalized medicine, Peter Thiel, pre–internet, quantitative hedge fund, quantitative trading / quantitative finance, recommendation engine, Renaissance Technologies, Richard Feynman, self-driving car, side project, Silicon Valley, Skype, software as a service, speech recognition, statistical model, Steve Jobs, stochastic process, technology bubble, text mining, the scientific method, web application

This kind of slicing process—called alternative splicing—is the reason why it looks like we only have like 20,000 genes in the human genome, but we have vast amounts more of all these different proteins. There’s incredible diversity in proteins. So lots of people—such as Yarden Katz at MIT—are developing algorithms to take this high-throughput data and understand what’s actually happening and what the generative stochastic processes are. If you take the naïve computer science view, every cell is basically a little computer, right? It has this chunk of memory, and DNA is the compressed obfuscated buggy binary that runs inside this complicated set of stochastic differential equations. If we’re going to figure out how this all works, then we have to start applying better computational techniques. So, yes, it’s very much the case that there are people tackling different biological systems with similar data and algorithm goals.

It’s nice because, as someone with computer science training, I can do complicated things in Python. It’s not like MATLAB, where you have to jump through a million different hoops. And I can drop down in C++ when I need it for speed. Mathematically, a lot of what I work on is Bayesian models using Markov chain Monte Carlo to try and do inference. I really like that universe because the world is so simple when you think about it probabilistically. You can think of a stochastic process that you can condition on your data and do the inference. That’s great! It means the set of math I have to know is actually shockingly small, especially because often the problems that I’m working on don’t have data from a billion neurons yet—we have data from 100. And so I’d much rather spend my time building complicated correct models and then, when the data gets larger, figure out how to simplify those systems; rather than start out with something simple and later rework the models when the data set size grows.
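A minimal sketch of the workflow described, assuming a toy model (unknown Gaussian mean with a wide normal prior) and a hand-rolled random-walk Metropolis sampler standing in for whatever the interviewee actually uses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 100 noisy observations (the "data from 100 neurons" scale).
data = rng.normal(2.0, 1.0, 100)

def log_posterior(mu):
    # N(0, 10^2) prior on mu, N(mu, 1) likelihood; constants dropped.
    log_prior = -0.5 * (mu / 10.0) ** 2
    log_lik = -0.5 * np.sum((data - mu) ** 2)
    return log_prior + log_lik

# Random-walk Metropolis: propose, then accept with prob min(1, ratio).
mu, samples = 0.0, []
for _ in range(20_000):
    proposal = mu + rng.normal(0.0, 0.5)
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    samples.append(mu)

print(np.mean(samples[5_000:]))  # posterior mean, close to the sample mean ~2.0
```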


pages: 357 words: 98,854

Epigenetics Revolution: How Modern Biology Is Rewriting Our Understanding of Genetics, Disease and Inheritance by Nessa Carey

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, British Empire, Build a better mousetrap, conceptual framework, discovery of penicillin, double helix, Drosophila, epigenetics, Fellow of the Royal Society, life extension, mouse model, phenotype, stem cell, stochastic process, Thomas Kuhn: the structure of scientific revolutions

But over decades all these mild abnormalities in gene expression, resulting from a slightly inappropriate set of chromatin modifications, may lead to a gradually increasing functional impairment. Clinically, we don’t recognise this until it passes some invisible threshold and the patient begins to show symptoms. The epigenetic variation that occurs in developmental programming is at heart a predominantly random process, normally referred to as ‘stochastic’. This stochastic process may account for a significant amount of the variability that develops between the MZ twins who opened this chapter. Random fluctuations in epigenetic modifications during early development lead to non-identical patterns of gene expression. These become epigenetically set and exaggerated over the years, until eventually the genetically identical twins become phenotypically different, sometimes in the most dramatic of ways.
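The drift being described can be caricatured numerically: start two "twins" with identical marks, let a small random fraction flip each year, and measure how far apart they end up. A toy sketch; the mark count, flip rate, and time span are illustrative assumptions, not biology:

```python
import random

random.seed(0)
N_MARKS, FLIP_RATE, YEARS = 10_000, 0.001, 60

# Two genetically identical twins start with identical epigenetic marks.
twin_a = [0] * N_MARKS
twin_b = list(twin_a)

def drift(marks):
    # Each year, a small random fraction of marks flips state.
    return [m ^ 1 if random.random() < FLIP_RATE else m for m in marks]

for year in range(YEARS):
    twin_a, twin_b = drift(twin_a), drift(twin_b)

# Fraction of marks on which the twins now differ (~10% here).
print(sum(a != b for a, b in zip(twin_a, twin_b)) / N_MARKS)
```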


pages: 411 words: 108,119

The Irrational Economist: Making Decisions in a Dangerous World by Erwann Michel-Kerjan, Paul Slovic

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Andrei Shleifer, availability heuristic, bank run, Black Swan, Cass Sunstein, clean water, cognitive dissonance, collateralized debt obligation, complexity theory, conceptual framework, corporate social responsibility, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-subsidies, Daniel Kahneman / Amos Tversky, endowment effect, experimental economics, financial innovation, Fractional reserve banking, George Akerlof, hindsight bias, incomplete markets, invisible hand, Isaac Newton, iterative process, Loma Prieta earthquake, London Interbank Offered Rate, market bubble, market clearing, moral hazard, mortgage debt, placebo effect, price discrimination, price stability, RAND corporation, Richard Thaler, Robert Shiller, Ronald Reagan, statistical model, stochastic process, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, ultimatum game, University of East Anglia, urban planning

The existing literature is based on a completely standard expected utility modelling, whereby the welfare of each future generation is evaluated by computing its expected utility based on a probability distribution for the GDP per capita that it will enjoy. A major difficulty, however, is that these probability distributions are ambiguous, in the sense that they are not based on scientific arguments, or on a database large enough to make them completely objective. Indeed, more than one stochastic process is compatible with existing methods for describing economic growth. The Ellsberg paradox tells us that most human beings are averse to ambiguity, which means that they tend to overestimate the probability of the worst-case scenario when computing their subjective expected utility. This suggests that agents systematically violate Savage’s “Sure Thing Principle” (Savage, 1954). More precisely, it seems that the way we evaluate uncertain prospects depends on how precise our information about the underlying probabilities is.
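The ambiguity-aversion point lends itself to a toy computation. A minimal sketch, assuming two candidate lognormal growth processes for GDP per capita, CRRA utility, and the maxmin criterion of Gilboa and Schmeidler as one standard way to model the ambiguity-averse agent; all of the numbers are illustrative, none come from the chapter:

```python
import numpy as np

def crra_utility(c, gamma=2.0):
    # Constant-relative-risk-aversion utility, a standard choice.
    return c ** (1 - gamma) / (1 - gamma)

# Two stochastic processes, both "compatible with existing methods":
# optimistic and pessimistic growth for future GDP per capita.
rng = np.random.default_rng(2)
optimistic = 30_000 * np.exp(rng.normal(0.02, 0.03, 100_000))
pessimistic = 30_000 * np.exp(rng.normal(0.00, 0.06, 100_000))

eu_opt = crra_utility(optimistic).mean()
eu_pes = crra_utility(pessimistic).mean()

# Ambiguity-neutral: average expected utility over a 50/50 prior on models.
print(0.5 * eu_opt + 0.5 * eu_pes)
# Ambiguity-averse (maxmin): keep only the worst case, in effect
# overweighting the worst-case scenario, as the Ellsberg paradox suggests.
print(min(eu_opt, eu_pes))
```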


pages: 354 words: 26,550

High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems by Irene Aldridge

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

algorithmic trading, asset allocation, asset-backed security, automated trading system, backtesting, Black Swan, Brownian motion, business process, capital asset pricing model, centralized clearinghouse, collapse of Lehman Brothers, collateralized debt obligation, collective bargaining, diversification, equity premium, fault tolerance, financial intermediation, fixed income, high net worth, implied volatility, index arbitrage, interest rate swap, inventory management, law of one price, Long Term Capital Management, Louis Bachelier, margin call, market friction, market microstructure, martingale, New Journalism, p-value, paper trading, performance metric, profit motive, purchasing power parity, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk tolerance, risk-adjusted returns, risk/return, Sharpe ratio, short selling, Small Order Execution System, statistical arbitrage, statistical model, stochastic process, stochastic volatility, systematic trading, trade route, transaction costs, value at risk, yield curve

In the Garman (1976) model, the market has one monopolistic market maker (dealer). The market maker is responsible for deciding on and then setting bid and ask prices, receiving all orders, and clearing trades. The market maker’s objective is to maximize profits while avoiding bankruptcy or failure. The latter arise whenever the market maker has no inventory or cash. Both buy and sell orders arrive as independent stochastic processes. The model solution for optimal bid and ask prices lies in the estimation of the rates at which a unit of cash (e.g., a dollar or a “clip” of 10 million in FX) “arrives” to the market maker when a customer comes in to buy securities (pays money to the dealer) and “departs” the market maker when a customer comes in to sell (the dealer pays the customer). Suppose the probability of an arrival, a customer order to buy a security at the market ask price pa is denoted λa .
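A minimal simulation of the arrival side of this model, reading the "independent stochastic processes" as Poisson order flows, which is how Garman models them; the rates, the quotes, the unit trade size, and the starting cash and inventory are illustrative assumptions, and the sketch only checks the failure condition rather than deriving the optimal bid and ask:

```python
import numpy as np

rng = np.random.default_rng(3)

lam_a, lam_b = 1.0, 1.0    # buy (at ask) and sell (at bid) arrival rates per unit time
ask, bid = 100.05, 99.95   # quotes set by the monopolistic market maker
cash, inventory = 1_000.0, 100
T, dt = 10_000, 1.0

for _ in range(T):
    buys = rng.poisson(lam_a * dt)    # customers buying at the ask: dealer sells stock
    sells = rng.poisson(lam_b * dt)   # customers selling at the bid: dealer buys stock
    cash += buys * ask - sells * bid
    inventory += sells - buys
    if cash <= 0 or inventory <= 0:   # failure: out of cash or out of inventory
        print("market maker fails")
        break
else:
    print(f"survived with cash={cash:.2f}, inventory={inventory}")
```

With equal arrival rates the inventory follows a driftless random walk, so given enough time one of the failure conditions is eventually hit; that pressure is what drives the model's choice of bid and ask prices.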


pages: 356 words: 105,533

Dark Pools: The Rise of the Machine Traders and the Rigging of the U.S. Stock Market by Scott Patterson

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

algorithmic trading, automated trading system, banking crisis, bash_history, Bernie Madoff, butterfly effect, buttonwood tree, cloud computing, collapse of Lehman Brothers, Donald Trump, Flash crash, Francisco Pizarro, Gordon Gekko, Hibernia Atlantic: Project Express, High speed trading, Joseph Schumpeter, latency arbitrage, Long Term Capital Management, Mark Zuckerberg, market design, market microstructure, pattern recognition, pets.com, Ponzi scheme, popular electronics, prediction markets, quantitative hedge fund, Ray Kurzweil, Renaissance Technologies, Sergey Aleynikov, Small Order Execution System, South China Sea, Spread Networks laid a new fibre optics cable between New York and Chicago, stealth mode startup, stochastic process, transaction costs, Watson beat the top human players on Jeopardy!

The following ad for Getco, for instance, appeared in January 2012: CHICAGO, IL: Work with inter-disciplinary teams of traders & technologists & use trading models to trade profitably on major electronic exchanges; use statistical & mathematical approaches & develop new models to leverage trading capabilities. Must have Master’s in Math, Statistics, Physical Science, Computer Science, or Engineering w/min GPA of 3.4/4.0. Must have proven graduate level coursework in 2 or more of the following: Stochastic Processes, Statistical Methods, Mathematical Finance, Applied Numerical Methods, Machine Learning. Then, in the summer of 2011, a new contender for the high-frequency crown had emerged. Virtu Financial, the computer trading outfit that counted former Island attorney and Nasdaq executive Chris Concannon as a partner, merged with EWT, a California speed-trading operation that operated on exchanges around the world.

Data Mining: Concepts and Techniques: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, distributed generation, finite state, information retrieval, iterative process, knowledge worker, linked data, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, random walk, recommendation engine, RFID, semantic web, sentiment analysis, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, web application

A straightforward adaptation of a clustering method for outlier detection can be very costly, and thus does not scale up well for large data sets. Clustering-based outlier detection methods are discussed in detail in Section 12.5. 12.3. Statistical Approaches As with statistical methods for clustering, statistical methods for outlier detection make assumptions about data normality. They assume that the normal objects in a data set are generated by a stochastic process (a generative model). Consequently, normal objects occur in regions of high probability for the stochastic model, and objects in the regions of low probability are outliers. The general idea behind statistical methods for outlier detection is to learn a generative model fitting the given data set, and then identify those objects in low-probability regions of the model as outliers. However, there are many different ways to learn generative models.
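A minimal parametric instance of this idea, assuming the normal objects follow a univariate Gaussian and using the conventional 3-sigma rule as the "low-probability region"; the data and the cutoff are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
# Normal objects plus two planted outliers.
data = np.concatenate([rng.normal(10.0, 2.0, 1000), [25.0, -4.0]])

# Learn the generative model: fit a Gaussian to the data.
mu, sigma = data.mean(), data.std()

# Objects in low-probability regions (beyond 3 sigma here) are flagged.
outliers = data[np.abs(data - mu) > 3 * sigma]
print(outliers)  # picks up 25.0 and -4.0
```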

The kernel density approximation of the probability density function is

f̂(x) = (1/(nh)) Σ_{i=1}^{n} K((x − x_i)/h)    (12.9)

where K() is a kernel and h is the bandwidth serving as a smoothing parameter. Once the probability density function of a data set is approximated through kernel density estimation, we can use the estimated density function to detect outliers. For an object, o, f̂(o) gives the estimated probability that the object is generated by the stochastic process. If f̂(o) is high, then the object is likely normal. Otherwise, o is likely an outlier. This step is often similar to the corresponding step in parametric methods. In summary, statistical methods for outlier detection learn models from data to distinguish normal data objects from outliers. An advantage of using statistical methods is that the outlier detection may be statistically justifiable.
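A minimal sketch of the nonparametric version, assuming scipy's gaussian_kde for the kernel density estimate and an illustrative bottom-1% density cutoff:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
data = np.concatenate([rng.normal(0.0, 1.0, 500), [8.0]])  # one planted outlier

# Approximate the density with a Gaussian kernel; scipy picks the bandwidth h.
kde = gaussian_kde(data)
density = kde(data)  # estimated f-hat at each object

# Flag objects whose estimated density falls in, say, the lowest 1 percent.
threshold = np.quantile(density, 0.01)
print(data[density <= threshold])  # includes 8.0
```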


pages: 696 words: 143,736

The Age of Spiritual Machines: When Computers Exceed Human Intelligence by Ray Kurzweil

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Buckminster Fuller, call centre, cellular automata, combinatorial explosion, complexity theory, computer age, computer vision, cosmological constant, cosmological principle, Danny Hillis, double helix, Douglas Hofstadter, first square of the chessboard / second half of the chessboard, fudge factor, George Gilder, Gödel, Escher, Bach, I think there is a world market for maybe five computers, information retrieval, invention of movable type, Isaac Newton, iterative process, Jacquard loom, John von Neumann, Lao Tzu, Law of Accelerating Returns, mandelbrot fractal, Marshall McLuhan, Menlo Park, natural language processing, Norbert Wiener, optical character recognition, pattern recognition, phenotype, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, speech recognition, Steven Pinker, Stewart Brand, stochastic process, technological singularity, Ted Kaczynski, telepresence, the medium is the message, traveling salesman, Turing machine, Turing test, Whole Earth Review, Y2K

Engines of Change: The American Industrial Revolution, 1790-1860. Washington, D.C.: Smithsonian Institution Press, 1986. Hoage, R. J. and Larry Goldman. Animal Intelligence: Insights into the Animal Mind. Washington, D.C.: Smithsonian Institution Press, 1986. Hodges, Andrew. Alan Turing: The Enigma. New York: Simon and Schuster, 1983. Hoel, Paul G., Sidney C. Port, and Charles J. Stone. Introduction to Stochastic Processes. Boston: Houghton-Mifflin, 1972. Hofstadter, Douglas R. Gödel, Escher, Bach: An Eternal Golden Braid. New York: Basic Books, 1979. _________. Metamagical Themas: Questing for the Essence of Mind and Pattern. New York: Basic Books, 1985. Hofstadter, Douglas R. and Daniel C. Dennett. The Mind’s I: Fantasies and Reflections on Self and Soul. New York: Basic Books, 1981. Hofstadter, Douglas R., Gray Clossman, and Marsha Meredith.


pages: 561 words: 120,899

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy by Sharon Bertsch McGrayne

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

bioinformatics, British Empire, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, double helix, Edmond Halley, Fellow of the Royal Society, full text search, Henri Poincaré, Isaac Newton, John Nash: game theory, John von Neumann, linear programming, meta analysis, meta-analysis, Nate Silver, p-value, placebo effect, prediction markets, RAND corporation, recommendation engine, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, Ronald Reagan, speech recognition, statistical model, stochastic process, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, uranium enrichment, Yom Kippur War

Venter, Gary G. (fall 1987) Credibility. CAS Forum 81–147. Chapter 7. From Tool to Theology Armitage P. (1994) Dennis Lindley: The first 70 years. In Aspects of Uncertainty: A Tribute to D. V. Lindley, eds., PR Freeman and AFM Smith. John Wiley and Sons. Banks, David L. (1996) A Conversation with I. J. Good. Statistical Science (11) 1–19. Dubins LE, Savage LJ. (1976) Inequalities for Stochastic Processes (How to Gamble If You Must). Dover. Box, George EP, et al. (2006) Improving Almost Anything. Wiley. Box GEP, Tiao GC. (1973) Bayesian Inference in Statistical Analysis. Addison-Wesley. Cramér, H. (1976). Half of a century of probability theory: Some personal recollections. Annals of Probability (4) 509–46. D’Agostini, Giulio. (2005) The Fermi’s Bayes theorem. Bulletin of the International Society of Bayesian Analysis (1) 1–4.


pages: 298 words: 43,745

Understanding Sponsored Search: Core Elements of Keyword Advertising by Jim Jansen

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

AltaVista, barriers to entry, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, correlation does not imply causation, en.wikipedia.org, first-price auction, information retrieval, inventory management, life extension, linear programming, megacity, Nash equilibrium, Network effects, PageRank, place-making, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social web, software as a service, stochastic process, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, yield management

Indianapolis, IN: Wiley. [2] Peterson, E. 2004. Web Analytics Demystified: A Marketer’s Guide to Understanding How Your Web Site Affects Your Business. New York: Celilo Group Media. [3] Pedrick, J. H. and Zufryden, F. S. 1991. “Evaluating the Impact of Advertising Media Plans: A Model of Consumer Purchase Dynamics Using Single Source Data.” Marketing Science, vol. 10(2), pp. 111–130. [4] Penniman, W. D. 1975. “A Stochastic Process Analysis of Online User Behavior.” In The Annual Meeting of the American Society for Information Science, Washington, DC, pp. 147–148. [5] Meister, D. and Sullivan, D. 1967. “Evaluation of User Reactions to a Prototype On-Line Information Retrieval System: Report to NASA by the Bunker-Ramo Corporation. Report Number NASA CR-918.” Bunker-Ramo Corporation, Oak Brook, IL. [6] Directors, A.

Principles of Protocol Design by Robin Sharp

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

accounting loophole / creative accounting, business process, discrete time, fault tolerance, finite state, Gödel, Escher, Bach, information retrieval, loose coupling, packet switching, RFC: Request For Comment, stochastic process, x509 certificate

For example, you might like to pursue all the references to the Alternating Bit Protocol in the literature, starting with the ones given in connection with Protocol 5. This will lead you into the area of other proof techniques for protocols, as well as illustrating how new mechanisms develop as time goes by. Finally, you might like to investigate quantitative properties of some protocols, such as their throughput and delay in the presence of varying loads of traffic. Generally speaking, this requires a knowledge of queueing theory and the theory of stochastic processes. This is not a subject which we pay more than passing attention to in this book. However, some protocols, especially multiplexing protocols, have been the subject of intensive investigation from this point of view. Good discussions of the general theory required are found in [73], while [11] relates the theory more explicitly to the analysis of network protocols.
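As a pointer to what that queueing-theory analysis looks like, the simplest case, the M/M/1 queue with Poisson arrivals at rate λ and exponential service at rate μ, gives throughput and delay in closed form. A minimal sketch with illustrative rates; this is the textbook M/M/1 result, not a claim about any particular protocol discussed here:

```python
# M/M/1 queue: the simplest stochastic-process model of a link or multiplexer.
lam, mu = 80.0, 100.0  # arrivals/sec and services/sec (illustrative)

rho = lam / mu                    # utilization; must be < 1 for stability
mean_in_system = rho / (1 - rho)  # average number of packets in the system
mean_delay = 1 / (mu - lam)       # average time in system; note Little's law:
                                  # mean_in_system == lam * mean_delay

print(f"utilization={rho:.2f}, avg packets={mean_in_system:.1f}, "
      f"avg delay={mean_delay * 1e3:.1f} ms")
```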


pages: 634 words: 185,116

From eternity to here: the quest for the ultimate theory of time by Sean M. Carroll

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, cellular automata, Claude Shannon: information theory, Columbine, cosmic microwave background, cosmological constant, cosmological principle, dark matter, dematerialisation, double helix, en.wikipedia.org, gravity well, Harlow Shapley and Heber Curtis, Henri Poincaré, Isaac Newton, John von Neumann, Lao Tzu, lone genius, New Journalism, Norbert Wiener, pets.com, Richard Feynman, Richard Stallman, Schrödinger's Cat, Slavoj Žižek, Stephen Hawking, stochastic process, the scientific method, wikimedia commons

That kind of wave function, concentrated entirely on a single possible observational outcome, is known as an “eigenstate.” Once the system is in that eigenstate, you can keep making the same kind of observation, and you’ll keep getting the same answer (unless something kicks the system out of the eigenstate into another superposition). We can’t say with certainty which eigenstate the system will fall into when an observation is made; it’s an inherently stochastic process, and the best we can do is assign a probability to different outcomes. We can apply this idea to the story of Miss Kitty. According to the Copenhagen interpretation, our choice to observe whether she stopped by the food bowl or the scratching post had a dramatic effect on her wave function, no matter how sneaky we were about it. When we didn’t look, she was in a superposition of the two possibilities, with equal amplitude; when she then moved on to the sofa or the table, we added up the contributions from each of the intermediate steps, and found there was interference.
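A toy sketch of the stochastic rule described here, assuming the two-outcome, equal-amplitude superposition of the Miss Kitty story: sample an outcome with probability given by the squared amplitude, collapse to that eigenstate, and observe that repeated measurement keeps returning the same answer:

```python
import random

random.seed(0)

# Equal-amplitude superposition over two outcomes (Born rule: p = |amplitude|^2).
state = {"food bowl": 1 / 2 ** 0.5, "scratching post": 1 / 2 ** 0.5}

def measure(state):
    outcomes = list(state)
    probs = [abs(a) ** 2 for a in state.values()]
    result = random.choices(outcomes, weights=probs)[0]
    # Collapse: the wave function becomes an eigenstate of the observable.
    return result, {result: 1.0}

result, state = measure(state)
print(result)
# Subsequent measurements of the collapsed state repeat the same answer.
print([measure(state)[0] for _ in range(5)])
```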


pages: 728 words: 182,850

Cooking for Geeks by Jeff Potter

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, A Pattern Language, carbon footprint, centre right, Community Supported Agriculture, crowdsourcing, double helix, en.wikipedia.org, European colonialism, fear of failure, food miles, hacker house, haute cuisine, helicopter parent, Internet Archive, iterative process, Parkinson's law, placebo effect, random walk, slashdot, stochastic process, the scientific method

It’s possible to break up the collagen chemically, too: lysosomal enzymes will attack the structure and "break the covalent bonds" in chem-speak, but this isn’t so useful to know in the kitchen.

Note: For fun, try marinating a chunk of meat in papaya, which contains an enzyme, papain, that acts as a meat tenderizer by hydrolyzing collagen.

One piece of information that is critical to understand in the kitchen, however, is that hydrolysis takes time. The structure has to literally untwist and break up, and due to the amount of energy needed to break the bonds and the stochastic processes involved, this reaction takes longer than simply denaturing the protein. Hydrolyzing collagen not only breaks down the rubbery texture of the denatured structure, but also converts a portion of it to gelatin. When the collagen hydrolyzes, it breaks into variously sized pieces, the smaller of which are able to dissolve into the surrounding liquid, creating gelatin. It’s this gelatin that gives dishes such as braised ox tail, slow-cooked short ribs, and duck confit their distinctive mouthfeel.


pages: 798 words: 240,182

The Transhumanist Reader by Max More, Natasha Vita-More

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Drosophila, en.wikipedia.org, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, Louis Pasteur, Menlo Park, meta analysis, meta-analysis, moral hazard, Network effects, Norbert Wiener, P = NP, pattern recognition, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Richard Feynman, Ronald Reagan, silicon-based life, Singularitarianism, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce

If we estimate about 10^2 bytes of information to encode these details (which may be low), we have 10^16 bytes, considerably more than the 10^9 bytes that you mentioned. One might ask: How do we get from 10^7 bytes that specify the brain in the genome to 10^16 bytes in the mature brain? This is not hard to understand, since we do this type of meaningful data expansion routinely in our self-organizing software paradigms. For example, a genetic algorithm can be efficiently coded, but in turn creates data far greater in size than itself using a stochastic process, which in turn self-organizes in response to a complex environment (the problem space). The result of this process is meaningful information far greater than the original program. We know that this is exactly how the creation of the brain works. The genome specifies initially semi-random interneuronal connection wiring patterns in specific regions of the brain (random within certain constraints and rules), and these patterns (along with the neurotransmitter-concentration levels) then undergo their own internal evolutionary process to self-organize to reflect the interactions of that person with their experiences and environment.
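A minimal illustration of this data-expansion argument, assuming a deliberately tiny genetic algorithm: the program below is a few hundred bytes of "genome," yet the population history it generates while self-organizing toward its environment (here just a target bit string) is orders of magnitude larger:

```python
import random

random.seed(0)
TARGET = [1] * 64  # the "complex environment" is just an all-ones target

def fitness(g):
    return sum(a == b for a, b in zip(g, TARGET))

# Start from random bit-string genomes.
pop = [[random.randint(0, 1) for _ in range(64)] for _ in range(100)]
history = []

for gen in range(200):
    history.extend(pop)  # the self-organized data dwarfs the program that made it
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:50]
    # Offspring: copy a survivor and flip a few bits (stochastic variation).
    pop = survivors + [
        [b ^ (random.random() < 0.02) for b in random.choice(survivors)]
        for _ in range(50)
    ]

print(max(map(fitness, pop)), "of 64")         # converges toward the target
print(len(history) * len(TARGET), "bits of generated structure")
```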


pages: 823 words: 220,581

Debunking Economics - Revised, Expanded and Integrated Edition: The Naked Emperor Dethroned? by Steve Keen

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

accounting loophole / creative accounting, banking crisis, banks create money, barriers to entry, Benoit Mandelbrot, Big bang: deregulation of the City of London, Black Swan, Bonfire of the Vanities, butterfly effect, capital asset pricing model, cellular automata, central bank independence, citizen journalism, clockwork universe, collective bargaining, complexity theory, correlation coefficient, credit crunch, David Ricardo: comparative advantage, debt deflation, diversification, double entry bookkeeping, en.wikipedia.org, Eugene Fama: efficient market hypothesis, experimental subject, Financial Instability Hypothesis, Fractional reserve banking, full employment, Henri Poincaré, housing crisis, Hyman Minsky, income inequality, invisible hand, iterative process, John von Neumann, laissez-faire capitalism, liquidity trap, Long Term Capital Management, mandelbrot fractal, margin call, market bubble, market clearing, market microstructure, means of production, minimum wage unemployment, open economy, place-making, Ponzi scheme, profit maximization, quantitative easing, RAND corporation, random walk, risk tolerance, risk/return, Robert Shiller, Ronald Coase, Schrödinger's Cat, scientific mainstream, seigniorage, six sigma, South Sea Bubble, stochastic process, The Great Moderation, The Wealth of Nations by Adam Smith, Thorstein Veblen, time value of money, total factor productivity, tulip mania, wage slave

The impact of this power inversion can be seen in the physicist Joe McCauley’s observations about the need to reform economics education: The real problem with my proposal for the future of economics departments is that current economics and finance students typically do not know enough mathematics to understand (a) what econophysicists are doing, or (b) to evaluate the neo-classical model (known in the trade as ‘The Citadel’) critically enough to see, as Alan Kirman put it, that ‘No amount of attention to the walls will prevent The Citadel from being empty.’ I therefore suggest that the economists revise their curriculum and require that the following topics be taught: calculus through the advanced level, ordinary differential equations (including advanced), partial differential equations (including Green functions), classical mechanics through modern nonlinear dynamics, statistical physics, stochastic processes (including solving Smoluchowski–Fokker–Planck equations), computer programming (C, Pascal, etc.) and, for complexity, cell biology. Time for such classes can be obtained in part by eliminating micro- and macro-economics classes from the curriculum. The students will then face a much harder curriculum, and those who survive will come out ahead. So might society as a whole. (McCauley 2006: 607–8) This amplifies a point that, as a critic of economics with a reasonable grounding in mathematics myself, has long set me apart from most other critics: neoclassical economics is not bad because it is mathematical per se, but because it is bad mathematics. 16 | DON’T SHOOT ME, I’M ONLY THE PIANO Why mathematics is not the problem Many critics of economics have laid the blame for its manifest failures at the feet of mathematics.

The Art of Computer Programming: Fundamental Algorithms by Donald E. Knuth

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

discrete time, distributed generation, fear of failure, Fermat's Last Theorem, Isaac Newton, Jacquard loom, John von Neumann, linear programming, linked data, Menlo Park, probability theory / Blaise Pascal / Pierre de Fermat, Richard Feynman, sorting algorithm, stochastic process, Turing machine

Suppose each arc e of G has been assigned a probability p(e), where the probabilities satisfy the conditions

0 ≤ p(e) ≤ 1;    Σ_{init(e)=Vj} p(e) = 1    for 1 ≤ j < n.

Consider a random path, which starts at V1 and subsequently chooses branch e of G with probability p(e), until Vn is reached; the choice of branch taken at each step is to be independent of all previous choices. For example, consider the graph of exercise 2.3.4.1-7, and assign suitable probabilities p(e1), p(e2), ..., p(e9) to arcs e1, e2, ..., e9; the path "Start-A-B-C-A-D-B-C-Stop" is then chosen with probability equal to the product of the probabilities of its successive arcs. Such random paths are called Markov chains, after the Russian mathematician Andrei A. Markov, who first made extensive studies of stochastic processes of this kind. The situation serves as a model for certain algorithms, although our requirement that each choice must be independent of the others is a very strong assumption. The purpose of this exercise is to analyze the computation time for algorithms of this kind. The analysis is facilitated by considering the n × n matrix A = (aij), where aij = Σ p(e), summed over all arcs e that go from Vi to Vj.
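A minimal sketch of this kind of analysis, assuming a small illustrative chain rather than the exercise's own graph: build the matrix A of transition probabilities, recover the expected number of visits to each transient vertex from the inverse of (I − Q), where Q is A restricted to the transient vertices, and cross-check by simulating random paths:

```python
import numpy as np

# Vertices: 0 = Start, 1 = A, 2 = B, 3 = Stop (illustrative, not Knuth's graph).
# A[i][j] = total probability of moving from vertex i to vertex j.
A = np.array([
    [0.0, 1.0, 0.0, 0.0],  # Start -> A always
    [0.0, 0.0, 0.7, 0.3],  # A -> B with prob 0.7, A -> Stop with prob 0.3
    [0.0, 0.5, 0.0, 0.5],  # B -> A with prob 0.5, B -> Stop with prob 0.5
    [0.0, 0.0, 0.0, 0.0],  # Stop is absorbing
])

# Expected visits to each transient vertex on a random path from Start:
# N = (I - Q)^(-1), the fundamental matrix of the absorbing chain.
Q = A[:3, :3]
N = np.linalg.inv(np.eye(3) - Q)
print(N[0])  # expected visits to Start, A, B: about [1, 1.54, 1.08]

# Cross-check by direct simulation of random paths.
rng = np.random.default_rng(6)
visits = np.zeros(3)
for _ in range(100_000):
    v = 0
    while v != 3:
        visits[v] += 1
        v = rng.choice(4, p=A[v])
print(visits / 100_000)
```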