
**Mathematical Finance: Theory, Modeling, Implementation** by Christian Fries

Black-Scholes formula, Brownian motion, continuous integration, discrete time, fixed income, implied volatility, interest rate derivative, martingale, quantitative trading / quantitative finance, random walk, short selling, Steve Jobs, stochastic process, stochastic volatility, volatility smile, Wiener process, zero-coupon bond

Conditional Expectation: Let the σ-algebra C be generated by the sets C1 = {ω1, ω2, ω3}, C2 = {ω4, ω5, ω6}, C3 = {ω7, . . . , ω10}. In this sense C may be interpreted as an information set and X|C as a filtered version of X. If it is only possible to make statements upon events in C, then one may only make statements about X which could also be made about X|C.

2.2. Stochastic Processes

Definition 17 (Stochastic Process): A family X = {Xt | 0 ≤ t < ∞} of random variables Xt : (Ω, F) → (S, S) is called a (time-continuous) stochastic process. If (S, S) = (Rd, B(Rd)), we say that X is a d-dimensional stochastic process. The family X may also be interpreted as a map

X : [0, ∞) × Ω → S, X(t, ω) := Xt(ω) for all (t, ω) ∈ [0, ∞) × Ω.

If the range (S, S) is not given explicitly we assume (S, S) = (Rd, B(Rd)).

This work is licensed under a Creative Commons License. http://creativecommons.org/licenses/by-nc-nd/2.5/deed.en
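The two readings of X in Definition 17, as a family of random variables Xt and as a single map X(t, ω), can be sketched numerically. The Brownian increments below are an illustrative choice, not part of the definition.

```python
import random

def brownian_path(n_steps, dt, rng):
    """One outcome omega -> the path t -> X(t, omega) on a time grid."""
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(0.0, dt ** 0.5)   # increment ~ N(0, dt)
        path.append(x)
    return path

rng = random.Random(0)
n_steps, dt = 250, 1.0 / 250
# A finite sample of Omega: each omega gives one path; fixing t instead
# gives a random variable X_t evaluated across outcomes.
paths = [brownian_path(n_steps, dt, rng) for _ in range(5)]
x_at_half = [p[125] for p in paths]   # the random variable X_{0.5} at 5 outcomes
print(len(paths), len(paths[0]))
```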

…

Random Variables:
• ∫Ω Z(ω) dP(ω) – Lebesgue integral. Integral of a random variable Z with respect to a measure P (cf. expectation).

Stochastic Processes:
• ∫Ω X(t1, ω) dP(ω) – Lebesgue integral. Integral of the random variable X(t1) with respect to a measure P.
• ∫t1..t2 X(t) dt – Lebesgue integral or Riemann integral. The (pathwise) integral of the stochastic process X with respect to t.
• ∫t1..t2 X(t) dW(t) – Itô integral. The (pathwise) integral of the stochastic process X with respect to a Brownian motion W.

Figure 2.10.: Integration of stochastic processes

The notion of a stochastic integral may be extended to more general integrands and/or more general integrators. For completeness we mention:

Comments welcome. ©2004, 2005, 2006 Christian Fries, Version 1.3.19 [build 20061210], 13th December 2006, http://www.christian-fries.de/finmath/
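A minimal sketch of the Itô integral as a left-point sum along one path, using the standard Itô-calculus identity ∫0..T W dW = (W_T² − T)/2 as a check (the identity is not stated in the excerpt):

```python
import random

def ito_integral_W_dW(T=1.0, n=100_000, seed=1):
    """Left-point approximation of the Ito integral of W with respect to W."""
    rng = random.Random(seed)
    dt = T / n
    w, total = 0.0, 0.0
    for _ in range(n):
        dw = rng.gauss(0.0, dt ** 0.5)
        total += w * dw          # integrand evaluated at the LEFT endpoint
        w += dw                  # (the non-anticipating choice that defines Ito)
    return total, w

approx, w_T = ito_integral_W_dW()
exact = 0.5 * (w_T ** 2 - 1.0)   # Ito's formula: int_0^1 W dW = (W_1^2 - 1)/2
print(abs(approx - exact))
```

Evaluating the integrand at the left endpoint is what makes this an Itô integral rather than, say, a Stratonovich integral.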

…

A family of σ-algebras {Ft | t ≥ 0}, where Fs ⊆ Ft ⊆ F for 0 ≤ s ≤ t, is called a filtration on (Ω, F).

Definition 20 (Generated Filtration): Let X denote a stochastic process on (Ω, F). We define FtX := σ(Xs; 0 ≤ s ≤ t) := the smallest σ-algebra with respect to which Xs is measurable for all s ∈ [0, t].

Definition 21 (Adapted Process): Let X denote a stochastic process on (Ω, F) and {Ft} a filtration on (Ω, F). The process X is called {Ft}-adapted if Xt is Ft-measurable for all t ≥ 0.

Interpretation: In Figure 2.4 we depict a filtration of four σ-algebras with increasing refinement (left to right). The black borders surround the generators of the corresponding σ-algebra. If a stochastic process maps a gray value to each elementary event (or path) ωi of Ω (left), then we have
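Adaptedness can be probed computationally: the value at time t of an adapted functional must be unchanged by any perturbation of the path after t. The functionals below are illustrative choices, not from the text.

```python
import random

def running_max(path, t):
    """Adapted functional: uses only path values up to index t."""
    return max(path[: t + 1])

def lookahead_max(path, t):
    """NOT adapted: peeks at the whole path, including the future."""
    return max(path)

def depends_only_on_past(functional, path, t):
    """Crude adaptedness test: changing the future must not change the value."""
    perturbed = path[: t + 1] + [x + 100.0 for x in path[t + 1:]]
    return functional(path, t) == functional(perturbed, t)

rng = random.Random(2)
path = [0.0]
for _ in range(50):
    path.append(path[-1] + rng.gauss(0.0, 1.0))

print(depends_only_on_past(running_max, path, 20))    # adapted
print(depends_only_on_past(lookahead_max, path, 20))  # anticipates the future
```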

pages: 416 words: 39,022

**Asset and Risk Management: Risk Oriented Finance** by Louis Esch, Robert Kieffer, Thierry Lopez

asset allocation, Brownian motion, business continuity plan, business process, capital asset pricing model, computer age, corporate governance, discrete time, diversified portfolio, fixed income, implied volatility, index fund, interest rate derivative, iterative process, P = NP, p-value, random walk, risk/return, shareholder value, statistical model, stochastic process, transaction costs, value at risk, Wiener process, yield curve, zero-coupon bond

This is a distribution symmetrical with respect to 0, which corresponds to a normal distribution for n = 2 and gives rise to a leptokurtic distribution (resp. negative kurtosis distribution) for n < 2 (n > 2).

Figure A2.15 Generalised error distribution

2.3 STOCHASTIC PROCESSES

2.3.1 General considerations

The term stochastic process is applied to a random variable that is a function of the time variable: {Xt : t ∈ T}. If the set T of times is discrete, the stochastic process is simply a sequence of random variables. However, in a number of financial applications such as Black and Scholes' model, it will be necessary to consider stochastic processes in continuous time. For each possible result ω ∈ Ω, the function Xt(ω) of the variable t is known as the path of the stochastic process. A stochastic process is said to have independent increments when, regardless of the times t1 < t2 < . . . < tn, the r.v.s Xt1, Xt2 − Xt1, Xt3 − Xt2, . . . are independent.
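The independent-increments property can be checked by simulation. A sketch for a Gaussian random walk; the times and step counts are arbitrary choices:

```python
import random

def walk_increments(n_paths, seed=3):
    """Simulate X_t1 and X_t2 - X_t1 for a Gaussian random walk."""
    rng = random.Random(seed)
    first, second = [], []
    for _ in range(n_paths):
        a = sum(rng.gauss(0, 1) for _ in range(10))   # X_t1
        b = sum(rng.gauss(0, 1) for _ in range(10))   # X_t2 - X_t1
        first.append(a)
        second.append(b)
    return first, second

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

a, b = walk_increments(20_000)
print(round(corr(a, b), 3))   # near 0: non-overlapping increments are uncorrelated
```

Zero correlation is only a necessary symptom of independence, but it is the easiest one to verify numerically.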

…

It is, however, possible to extend the definition to a concept of stochastic differential, through the theory of stochastic integral calculus.⁸ As the stochastic process zt is defined within the interval [a; b], the stochastic integral of zt is defined within [a; b] with respect to the standard Brownian motion wt by:

∫a..b zt dwt = lim (n→∞, δ→0) Σ k=0..n−1 z_tk (w_tk+1 − w_tk)

where a = t0 < t1 < . . . < tn = b and δ = max k=1,...,n (tk − tk−1).

⁷ The root function presents a vertical tangent at the origin.
⁸ The full development of this theory is outside the scope of this work.

Let us now consider a stochastic process Zt (for which we wish to define the stochastic differential) and a standard Brownian motion wt. If there is a stochastic process zt such that Zt = Z0 + ∫0..t zs dws, then it is said that Zt admits the stochastic differential dZt = zt dwt.
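The defining partition sum can be sampled directly. With the illustrative choice z_t = w_t, the integral Z_1 = ∫0..1 z_s dw_s has mean 0 and variance ∫0..1 E[z_s²] ds = 1/2; these follow from standard properties (the Itô isometry), which the excerpt does not state.

```python
import random

def stochastic_integral(n_steps, rng):
    """Z_1 = int_0^1 z_s dw_s with z_s = w_s, via the defining partition sums."""
    dt = 1.0 / n_steps
    w, z = 0.0, 0.0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, dt ** 0.5)
        z += w * dw          # z_{t_k} * (w_{t_{k+1}} - w_{t_k})
        w += dw
    return z

rng = random.Random(4)
samples = [stochastic_integral(200, rng) for _ in range(5000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))   # mean near 0, variance near 1/2
```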

…

8.1.2 The data in the example
8.2 Calculations
8.2.1 Treasury portfolio case
8.2.2 Bond portfolio case
8.3 The normality hypothesis

PART IV FROM RISK MANAGEMENT TO ASSET MANAGEMENT
Introduction

9 Portfolio Risk Management
9.1 General principles
9.2 Portfolio risk management method
9.2.1 Investment strategy
9.2.2 Risk framework

10 Optimising the Global Portfolio via VaR
10.1 Taking account of VaR in Sharpe's simple index method
10.1.1 The problem of minimisation
10.1.2 Adapting the critical line algorithm to VaR
10.1.3 Comparison of the two methods
10.2 Taking account of VaR in the EGP method
10.2.1 Maximising the risk premium
10.2.2 Adapting the EGP method algorithm to VaR
10.2.3 Comparison of the two methods
10.2.4 Conclusion
10.3 Optimising a global portfolio via VaR
10.3.1 Generalisation of the asset model
10.3.2 Construction of an optimal global portfolio
10.3.3 Method of optimisation of global portfolio

11 Institutional Management: APT Applied to Investment Funds
11.1 Absolute global risk
11.2 Relative global risk/tracking error
11.3 Relative fund risk vs. benchmark
11.4 Allocation of systematic risk

2.2 Theoretical distributions
2.2.1 Normal distribution and associated ones
2.2.2 Other theoretical distributions
2.3 Stochastic processes
2.3.1 General considerations
2.3.2 Particular stochastic processes
2.3.3 Stochastic differential equations

Appendix 3 Statistical Concepts
3.1 Inferential statistics
3.1.1 Sampling
3.1.2 Two problems of inferential statistics
3.2 Regressions
3.2.1 Simple regression
3.2.2 Multiple regression
3.2.3 Nonlinear regression

Appendix 4 Extreme Value Theory
4.1 Exact result
4.2 Asymptotic results
4.2.1 Extreme value theorem
4.2.2 Attraction domains
4.2.3 Generalisation

Appendix 5 Canonical Correlations
5.1 Geometric presentation of the method
5.2 Search for canonical characters

Appendix 6 Algebraic Presentation of Logistic Regression

Appendix 7 Time Series Models: ARCH-GARCH and EGARCH
7.1 ARCH-GARCH models
7.2 EGARCH models

Appendix 8 Numerical Methods for Solving Nonlinear Equations
8.1 General principles for iterative methods
8.1.1 Convergence
8.1.2 Order of convergence
8.1.3 Stop criteria
8.2 Principal methods
8.2.1 First order methods
8.2.2 Newton–Raphson method
8.2.3 Bisection method
8.3 Nonlinear equation systems
8.3.1 General theory of n-dimensional iteration
8.3.2 Principal methods

Bibliography
Index

Collaborators: Christian Berbé, Civil engineer from Université libre de Bruxelles and ABAF financial analyst.

pages: 320 words: 33,385

**Market Risk Analysis, Quantitative Methods in Finance** by Carol Alexander

asset allocation, backtesting, barriers to entry, Brownian motion, capital asset pricing model, constrained optimization, credit crunch, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, fixed income, implied volatility, interest rate swap, market friction, market microstructure, p-value, performance metric, quantitative trading / quantitative finance, random walk, risk tolerance, risk-adjusted returns, risk/return, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, Thomas Bayes, transaction costs, value at risk, volatility smile, Wiener process, yield curve, zero-sum game

Readers interested in estimating the parameters of a GARCH model when they come to Chapter II.4 will need to understand maximum likelihood estimation. Section I.3.7 shows how to model the evolution of financial asset prices and returns using a stochastic process in both discrete and continuous time. The translation between discrete and continuous time, and the relationship between the continuous time representation and the discrete time representation of a stochastic process, is very important indeed. The theory of finance requires an understanding of both discrete time and continuous time stochastic processes. Section I.3.8 summarizes and concludes. Some prior knowledge of basic calculus and elementary linear algebra is required to understand this chapter. Specifically, an understanding of Sections I.1.3 and I.2.4 is assumed.

…

But we do not know σ and so we need to estimate the variance using the maximum likelihood estimator σ̂² given by (I.3.132). Then, using σ̂ in place of σ we have

estse(X̄) = σ̂ / √n,   (I.3.135)

and

estse(σ̂²) = σ̂² √(2/n).   (I.3.136)

I.3.7 STOCHASTIC PROCESSES IN DISCRETE AND CONTINUOUS TIME

A stochastic process is a sequence of identically distributed random variables. For most of our purposes random variables are continuous, indeed they are often assumed to be normal, but the sequence may be over continuous or discrete time. That is, we consider continuous state processes in both continuous and discrete time.

• The study of discrete time stochastic processes is called time series analysis. In the time domain the simplest time series models are based on regression analysis, which is introduced in the next chapter. A simple example of a time series model is the first order autoregression, and this is defined below along with a basic test for stationarity.
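A hedged numerical companion, using the asymptotic formulas estse(X̄) = σ̂/√n and estse(σ̂²) = σ̂²√(2/n) that hold under normality; the sample below is synthetic.

```python
import random, math

rng = random.Random(5)
n = 400
sample = [rng.gauss(3.0, 2.0) for _ in range(n)]   # synthetic data, true sigma^2 = 4

xbar = sum(sample) / n
sigma2_hat = sum((x - xbar) ** 2 for x in sample) / n   # MLE divides by n, not n-1
sigma_hat = math.sqrt(sigma2_hat)

est_se_mean = sigma_hat / math.sqrt(n)        # estimated standard error of the mean
est_se_var = sigma2_hat * math.sqrt(2.0 / n)  # estimated standard error of sigma2_hat

print(round(xbar, 2), round(sigma2_hat, 2))
print(round(est_se_mean, 3), round(est_se_var, 3))
```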

…

I.3.7.3 Stochastic Models for Asset Prices and Returns

Time series of asset prices behave quite differently from time series of returns. In efficient markets a time series of prices or log prices will follow a random walk. More generally, even in the presence of market frictions and inefficiencies, prices and log prices of tradable assets are integrated stochastic processes. These are fundamentally different from the associated returns, which are generated by stationary stochastic processes. Figures I.3.28 and I.3.29 illustrate the fact that prices and returns are generated by very different types of stochastic process. Figure I.3.28 shows time series of daily prices (left-hand scale) and log prices (right-hand scale) of the Dow Jones Industrial Average (DJIA) from January 1998 to September 2001.

Figure I.3.28 Daily prices and log prices of DJIA index

⁵⁶ This is not the only possible discretization of a continuous increment.
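The contrast between integrated prices and stationary returns can be reproduced with synthetic data (the DJIA itself is not simulated here; the drift and volatility below are arbitrary):

```python
import random

rng = random.Random(6)
n = 2000
returns = [rng.gauss(0.0003, 0.01) for _ in range(n)]   # stationary return process

log_prices = [9.0]                                      # integrated: cumulated returns
for r in returns:
    log_prices.append(log_prices[-1] + r)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# The level series wanders, so its sample variance is large; differencing it
# back to returns recovers a series whose variance stays near sigma^2.
print(variance(log_prices) > variance(returns))
print(abs(variance(returns) - 0.01 ** 2) < 0.01 ** 2)
```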

pages: 313 words: 34,042

**Tools for Computational Finance** by Rüdiger Seydel

bioinformatics, Black-Scholes formula, Brownian motion, commoditize, continuous integration, discrete time, implied volatility, incomplete markets, interest rate swap, linear programming, London Interbank Offered Rate, mandelbrot fractal, martingale, random walk, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process, zero-coupon bond

The easiest way to consider stochastic movements is via an additive term,

dx/dt = a(x, t) + b(x, t)ξt.

Here we use the notations a: deterministic part, bξt: stochastic part; ξt denotes a generalized stochastic process. An example of a generalized stochastic process is white noise. For a brief definition of white noise we note that to each stochastic process a generalized version can be assigned [Ar74]. For generalized stochastic processes derivatives of any order can be defined. Suppose that Wt is the generalized version of a Wiener process; then Wt can be differentiated. White noise ξt is then defined as ξt = Ẇt = dWt/dt, or vice versa,

Wt = ∫0..t ξs ds.

That is, a Wiener process is obtained by smoothing the white noise. The smoother integral version dispenses with using generalized stochastic processes. Hence the integrated form of ẋ = a(x, t) + b(x, t)ξt is studied,

x(t) = x0 + ∫t0..t a(x(s), s) ds + ∫t0..t b(x(s), s)ξs ds,

and we replace ξs ds = dWs.
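The integrated form with ξs ds replaced by dWs is exactly what an Euler-Maruyama step discretizes. A sketch with an arbitrary mean-reverting choice of a and b:

```python
import random

def euler_maruyama(a, b, x0, T, n, rng):
    """Simulate dx = a(x,t) dt + b(x,t) dW by stepping with Gaussian dW increments."""
    dt = T / n
    x, t = x0, 0.0
    for _ in range(n):
        dw = rng.gauss(0.0, dt ** 0.5)     # dW ~ N(0, dt)
        x += a(x, t) * dt + b(x, t) * dw
        t += dt
    return x

# Illustrative coefficients: a(x,t) = -x (mean reversion), b(x,t) = 0.5
rng = random.Random(7)
endpoints = [euler_maruyama(lambda x, t: -x, lambda x, t: 0.5, 1.0, 1.0, 500, rng)
             for _ in range(4000)]
mean = sum(endpoints) / len(endpoints)
print(round(mean, 2))   # near exp(-1), the deterministic part after t = 1
```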

…

Here we consider the continuous-time situation. That is, t ∈ IR varies continuously in a time interval I, which typically represents 0 ≤ t ≤ T. A more complete notation for a stochastic process is {Xt, t ∈ I}, or (Xt)0≤t≤T. Let chance play for all t in the interval 0 ≤ t ≤ T; then the resulting function Xt is called a realization or path of the stochastic process. Special properties of stochastic processes have led to the following names:

Gaussian process: All finite-dimensional distributions (Xt1, . . . , Xtk) are Gaussian. Hence specifically Xt is distributed normally for all t.

Markov process: Only the present value of Xt is relevant for its future motion. That is, the past history is fully reflected in the present value.⁴

An example of a process that is both Gaussian and Markov is the Wiener process.

⁴ This assumption, together with the assumption of an immediate reaction of the market to arriving information, is called the hypothesis of the efficient market [Bo98].

Fig. 1.14.
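A simulation check of the Gaussian-process property for the Wiener process: the marginal W_t should show approximately zero skewness and zero excess kurtosis. The grid sizes below are arbitrary.

```python
import random

rng = random.Random(8)
n_paths, n_steps, t = 8000, 50, 0.5
dt = t / n_steps

w_t = []
for _ in range(n_paths):
    w = 0.0
    for _ in range(n_steps):
        w += rng.gauss(0.0, dt ** 0.5)
    w_t.append(w)

m = sum(w_t) / n_paths
s2 = sum((w - m) ** 2 for w in w_t) / n_paths
skew = sum((w - m) ** 3 for w in w_t) / n_paths / s2 ** 1.5
ex_kurt = sum((w - m) ** 4 for w in w_t) / n_paths / s2 ** 2 - 3.0
print(round(skew, 1), round(ex_kurt, 1))   # both near 0 for a Gaussian marginal
```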

…

In multi-period models and continuous models ∆ must be adapted dynamically. The general definition is

∆ = ∆(S, t) = ∂V(S, t)/∂S;

the expression (1.16) is a discretized version.

1.6 Stochastic Processes

Brownian motion originally meant the erratic motion of a particle (pollen) on the surface of a fluid, caused by tiny impulses of molecules. Wiener suggested a mathematical model for this motion, the Wiener process. But earlier Bachelier had applied Brownian motion to model the motion of stock prices, which instantly respond to the numerous arriving pieces of information, much as pollen reacts to the impacts of molecules. The illustration of the Dow in Figure 1.14 may serve as motivation. A stochastic process is a family of random variables Xt, which are defined for a set of parameters t (−→ Appendix B1). Here we consider the continuous-time situation.
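The general definition ∆ = ∂V/∂S can be checked against a discretized (finite-difference) version for the Black-Scholes call, whose analytic delta N(d1) is standard:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, tau):
    """Black-Scholes value of a European call with time to maturity tau."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * math.sqrt(tau))
    d2 = d1 - sigma * math.sqrt(tau)
    return S * norm_cdf(d1) - K * math.exp(-r * tau) * norm_cdf(d2)

S, K, r, sigma, tau = 100.0, 100.0, 0.05, 0.2, 1.0
h = 0.01   # central difference as a discretized version of dV/dS
delta_fd = (bs_call(S + h, K, r, sigma, tau) - bs_call(S - h, K, r, sigma, tau)) / (2 * h)

d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * math.sqrt(tau))
delta_exact = norm_cdf(d1)   # analytic dV/dS for the call
print(round(delta_fd, 4), round(delta_exact, 4))
```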

**Analysis of Financial Time Series** by Ruey S. Tsay

Asian financial crisis, asset allocation, Bayesian statistics, Black-Scholes formula, Brownian motion, business cycle, capital asset pricing model, compound rate of return, correlation coefficient, data acquisition, discrete time, frictionless, frictionless market, implied volatility, index arbitrage, Long Term Capital Management, market microstructure, martingale, p-value, pattern recognition, random walk, risk tolerance, short selling, statistical model, stochastic process, stochastic volatility, telemarketer, transaction costs, value at risk, volatility smile, Wiener process, yield curve

ISBN: 0-471-41544-8

CHAPTER 6 Continuous-Time Models and Their Applications

The price of a financial asset evolves over time and forms a stochastic process, which is a statistical term used to describe the evolution of a random variable over time. The observed prices are a realization of the underlying stochastic process. The theory of stochastic processes is the basis on which the observed prices are analyzed and statistical inference is made. There are two types of stochastic processes for modeling the price of an asset. The first type is called the discrete-time stochastic process, in which the price changes at discrete time points. All the processes discussed in the previous chapters belong to this category. For example, the daily closing price of IBM stock on the New York Stock Exchange forms a discrete-time stochastic process. Here the price changes only at the closing of a trading day.

…

For more description on options, see Hull (1997).

6.2 SOME CONTINUOUS-TIME STOCHASTIC PROCESSES

In mathematical statistics, a continuous-time continuous stochastic process is defined on a probability space (Ω, F, P), where Ω is a nonempty space, F is a σ-field consisting of subsets of Ω, and P is a probability measure; see Chapter 1 of Billingsley (1986). The process can be written as {x(η, t)}, where t denotes time and is continuous in [0, ∞). For a given t, x(η, t) is a real-valued continuous random variable (i.e., a mapping from Ω to the real line), and η is an element of Ω. For the price of an asset at time t, the range of x(η, t) is the set of non-negative real numbers. For a given η, {x(η, t)} is a time series with values depending on the time t. For simplicity, we write a continuous-time stochastic process as {xt} with the understanding that, for a given t, xt is a random variable.

…

As a result, we cannot use the usual integration in calculus to handle integrals involving a standard Brownian motion when we consider the value of an asset over time. Another approach must be sought. This is the purpose of discussing Ito's calculus in the next section.

6.2.2 Generalized Wiener Processes

The Wiener process is a special stochastic process with zero drift and variance proportional to the length of time interval. This means that the rate of change in expectation is zero and the rate of change in variance is 1. In practice, the mean and variance of a stochastic process can evolve over time in a more complicated manner. Hence, further generalization of stochastic process is needed. To this end, we consider the generalized Wiener process in which the expectation has a drift rate µ and the rate of variance change is σ². Denote such a process by xt and use the notation dy for a small change in the variable y.
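A sketch of the generalized Wiener process dx = µ dt + σ dW, verifying that the drift and variance accumulate as µt and σ²t; the parameter values below are arbitrary.

```python
import random

mu, sigma, T, n_steps, n_paths = 0.1, 0.3, 2.0, 100, 6000
dt = T / n_steps
rng = random.Random(9)

x_T = []
for _ in range(n_paths):
    x = 0.0
    for _ in range(n_steps):
        x += mu * dt + sigma * rng.gauss(0.0, dt ** 0.5)   # dx = mu dt + sigma dW
    x_T.append(x)

mean = sum(x_T) / n_paths
var = sum((x - mean) ** 2 for x in x_T) / n_paths
# Drift rate mu and variance rate sigma^2 accumulate linearly in time:
print(round(mean, 2))   # near mu * T  = 0.2
print(round(var, 2))    # near sigma^2 * T = 0.18
```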

pages: 447 words: 104,258

**Mathematics of the Financial Markets: Financial Instruments and Derivatives Modelling, Valuation and Risk Issues** by Alain Ruttiens

algorithmic trading, asset allocation, asset-backed security, backtesting, banking crisis, Black Swan, Black-Scholes formula, Brownian motion, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, discounted cash flows, discrete time, diversification, fixed income, implied volatility, interest rate derivative, interest rate swap, margin call, market microstructure, martingale, p-value, passive investing, quantitative trading / quantitative ﬁnance, random walk, risk/return, Satyajit Das, Sharpe ratio, short selling, statistical model, stochastic process, stochastic volatility, time value of money, transaction costs, value at risk, volatility smile, Wiener process, yield curve, zero-coupon bond

F: forward price, or future price (depends on the context)
FV: future value
-ibor: generic for LIBOR, EURIBOR, or any other inter-bank market rate
K: strike price of an option
κ: kurtosis
M: month or million, depending on context
MD: modified duration
MtM: "Marked to Market" (= valued at the observed current market price)
µ: drift of a stochastic process
N: total number of a series (integer number), or nominal (notional) amount (depends on the context)
(.): Gaussian (normal) density distribution function
N(.): Gaussian (normal) cumulative distribution function
P: put price
P{.}: probability of {.}
PV: present value
(.): Poisson density distribution function
r: generic symbol for a rate of return
rf: risk-free return
ρ(.): correlation of (.)
skew: skewness
S: spot price of an asset (equity, currency, etc.), as specified by the context
STD(.): standard deviation of (.)
σ: volatility of a stochastic process
t: current time, or time in general (depends on the context)
t0: initial time
T: maturity time
τ: tenor, that is, time interval between current time t and maturity T
V(.): variance of (.)
(.): stochastic process of (.)

…

Provided F(x) is continuously differentiable, we can determine the corresponding density function f(x) associated to the random variable X as f(x) = dF(x)/dx.

Stochastic Processes

A stochastic process can be defined as a collection of random variables defined on the same probability space (Ω, F, P) and "indexed" by a set of parameters T, that is, {Xt, t ∈ T}. Within the framework of our chapter, t is the time. For a given outcome or sample ω, Xt(ω) for t ∈ T is called a sample path, realization or trajectory of the process. The space containing all possible values of Xt is called the state space. Further in this chapter, we will only consider a one-dimensional state space, namely the set of real numbers, and the random variables Xt involved in stochastic processes {Xt, t ∈ T} will be denoted with a "∼" indicating their random nature over time t; these random variables will be such as a price, a rate or a return.

…

., NEFTCI in the further reading at the end of the chapter).

9 Other financial models: from ARMA to the GARCH family

The previous chapter dealt with stochastic processes, which consist of (returns) models involving a mixture of deterministic and stochastic components. By contrast, the models developed here present three major differences:

• These models are deterministic; since they aim to model a non-deterministic variable such as a return, the difference between the model output and the actual observed value is a probabilistic error term.
• By contrast with stochastic processes described by differential equations, these models are built in discrete time, in practice the periodicity of the modeled return (daily, for example).
• By contrast with usual Markovian stochastic processes, these models incorporate in the general case a limited number of previous return values, so that they are not Markovian.
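The contrast drawn above can be made concrete with a GARCH(1,1) recursion, one member of the family in the chapter title: the variance equation is a deterministic function of the previous return and variance, and randomness enters only through the innovation. The parameter values below are arbitrary.

```python
import random

def simulate_garch11(n, omega=1e-6, alpha=0.1, beta=0.8, seed=10):
    """GARCH(1,1): var_t = omega + alpha * r_{t-1}^2 + beta * var_{t-1}.
    The recursion itself is deterministic given the past."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = var ** 0.5 * rng.gauss(0.0, 1.0)   # the only stochastic ingredient
        returns.append(r)
        var = omega + alpha * r ** 2 + beta * var
    return returns

rets = simulate_garch11(20_000)
m = sum(rets) / len(rets)
sample_var = sum((r - m) ** 2 for r in rets) / len(rets)
uncond = 1e-6 / (1 - 0.1 - 0.8)
print(round(sample_var / uncond, 1))   # near 1: variance reverts to its unconditional level
```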

pages: 443 words: 51,804

**Handbook of Modeling High-Frequency Data in Finance** by Frederi G. Viens, Maria C. Mariani, Ionut Florescu

algorithmic trading, asset allocation, automated trading system, backtesting, Black-Scholes formula, Brownian motion, business process, buy and hold, continuous integration, corporate governance, discrete time, distributed generation, fixed income, Flash crash, housing crisis, implied volatility, incomplete markets, linear programming, mandelbrot fractal, market friction, market microstructure, martingale, Menlo Park, p-value, pattern recognition, performance metric, principal–agent problem, random walk, risk tolerance, risk/return, short selling, statistical model, stochastic process, stochastic volatility, transaction costs, value at risk, volatility smile, Wiener process

The iterative method we will use for this problem was developed by Chadam and Yin in Ref. 22 to study a similar partial integro-differential problem.

13.3.1 STATEMENT OF THE PROBLEM

As pointed out in Ref. 17, when modeling high-frequency data in applications, a Lévy-like stochastic process appears to be the best fit. When using these models, option prices are found by solving the resulting PIDE. For example, integro-differential equations appear in exponential Lévy models, where the market price of an asset is represented as the exponential of a Lévy stochastic process. These models have been discussed in several published works such as Refs 17 and 23.

In this section, we consider the following integro-differential model for a European call option:

∂C/∂t (S, t) + (σ²S²/2) ∂²C/∂S² (S, t) + rS ∂C/∂S (S, t) − rC(S, t) + ∫ ν(dy) [C(Se^y, t) − C(S, t) − S(e^y − 1) ∂C/∂S (S, t)] = 0,   (13.28)

where the market price of an asset is represented as the exponential of a Lévy stochastic process (see Chapter 12 of Ref. 17).
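The excerpt does not specify the Lévy measure ν. As an illustrative stand-in, a Merton-style jump-diffusion (Gaussian jumps at Poisson arrival times) is one exponential Lévy model that can be simulated directly, with the drift compensated so that E[S_T] = S0·e^{rT}:

```python
import random, math

def merton_endpoint(S0, r, sigma, lam, mu_j, sig_j, T, rng):
    """S_T = S_0 * exp(L_T) with L a jump-diffusion Levy process (Merton-style sketch)."""
    kappa = math.exp(mu_j + 0.5 * sig_j ** 2) - 1.0   # E[e^jump] - 1
    drift = (r - 0.5 * sigma ** 2 - lam * kappa) * T  # compensated drift
    diffusion = sigma * math.sqrt(T) * rng.gauss(0.0, 1.0)
    # Sample the Poisson number of jumps over [0, T] by CDF inversion
    n_jumps, p = 0, math.exp(-lam * T)
    u, cum = rng.random(), math.exp(-lam * T)
    while u > cum:
        n_jumps += 1
        p *= lam * T / n_jumps
        cum += p
    jumps = sum(rng.gauss(mu_j, sig_j) for _ in range(n_jumps))
    return S0 * math.exp(drift + diffusion + jumps)

rng = random.Random(11)
S0, r, T = 100.0, 0.02, 1.0
sample = [merton_endpoint(S0, r, 0.2, 0.5, -0.05, 0.1, T, rng) for _ in range(20_000)]
mean = sum(sample) / len(sample)
print(round(mean, 1))   # near S0 * exp(r*T) under the compensated drift
```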

…

Physica A 2003;318:279–292 [Proceedings of International Statistical Physics Conference, Kolkata].
19. Mantegna RN, Stanley HE. Stochastic process with ultra-slow convergence to a Gaussian: the truncated Levy flight. Phys Rev Lett 1994;73:2946–2949.
20. Peng CK, Mietus J, Hausdorff JM, Havlin S, Stanley HE, Goldberger AL. Long-range anticorrelations and non-Gaussian behavior of the heartbeat. Phys Rev Lett 1993;70:1343–1346.
21. Peng CK, Buldyrev SV, Havlin S, Simons M, Stanley HE, Goldberger AL. Mosaic organization of DNA nucleotides. Phys Rev E 1994;49:1685–1689.
22. Levy P. Calcul des probabilités. Paris: Gauthier-Villars; 1925.
23. Khintchine AYa, Levy P. Sur les lois stables. C R Acad Sci Paris 1936;202:374–376.
24. Koponen I. Analytic approach to the problem of convergence of truncated Levy flights towards the Gaussian stochastic process. Phys Rev E 1995;52:1197–1199.
25. Podobnik B, Ivanov PCh, Lee Y, Stanley HE.

…

Stable non-Gaussian random processes: stochastic models with infinite variance. New York: Chapman and Hall; 1994.
6. Levy P. Calcul des probabilités. Paris: Gauthier-Villars; 1925.
7. Khintchine AYa, Levy P. Sur les lois stables. C R Acad Sci Paris 1936;202:374.
8. Mantegna RN, Stanley HE. Stochastic process with ultra-slow convergence to a Gaussian: the truncated Levy flight. Phys Rev Lett 1994;73:2946–2949.
9. Koponen I. Analytic approach to the problem of convergence of truncated Levy flights towards the Gaussian stochastic process. Phys Rev E 1995;52:1197–1199.
10. Weron R. Levy-stable distributions revisited: tail index > 2 does not exclude the Levy-stable regime. Int J Mod Phys C 2001;12:209–223.

Chapter Thirteen: Solutions to Integro-Differential Parabolic Problem Arising on Financial Mathematics. MARIA C.

**Mathematical Finance: Core Theory, Problems and Statistical Algorithms** by Nikolai Dokuchaev

Black-Scholes formula, Brownian motion, buy and hold, buy low sell high, discrete time, fixed income, implied volatility, incomplete markets, martingale, random walk, short selling, stochastic process, stochastic volatility, transaction costs, volatility smile, Wiener process, zero-coupon bond

If yes, give an example; if no, prove it.

Problem 1.60 Let Q² be the set of all pairs (x, y) such that x and y are rational numbers. We consider a random straight line L in R² such that … with probability 1, and that the angle between L and the vector (1, 0) has the uniform distribution on [0, π). Find the probability that the set … is finite.

© 2007 Nikolai Dokuchaev

2 Basics of stochastic processes

In this chapter, some basic facts and definitions from the theory of stochastic (random) processes are given, including filtrations, martingales, Markov times, and Markov processes.

2.1 Definitions of stochastic processes

Sometimes it is necessary to consider random variables or vectors that depend on time.

Definition 2.1 A sequence of random variables ξt, t = 0, 1, 2,…, is said to be a discrete time stochastic (or random) process.

Definition 2.2 Let T > 0 be given. A mapping ξ: [0, T] × Ω → R is said to be a continuous time stochastic (random) process if ξ(t, ω) is a random variable for a.e.

…

It suffices to apply Theorem 4.42 with f ≡ 0, b ≡ 1; then yx,s(t) = w(t) − w(s) + x, and the corresponding operator follows. In Example 4.49, representation (4.16) is said to be the probabilistic representation of the solution. In particular, it follows that the kernel involved is the probability density function for N(x, T−s). Note that this function is also well known in the theory of parabolic equations: it is the so-called fundamental solution of the heat equation. The representation of functions of stochastic processes via solutions of parabolic partial differential equations (PDEs) helps to study stochastic processes: one can use numerical methods developed for PDEs (i.e., finite differences, fundamental solutions, etc.). On the other hand, the probabilistic representation of a solution of parabolic PDEs can also help to study PDEs. For instance, one can use Monte Carlo simulation for numerical solution of PDEs. Some theoretical results can also be proved more easily with the probabilistic representation (for example, the so-called maximum principle for parabolic equations follows from this representation: if φ ≥ 0 and Ψ ≥ 0 in (4.15), then V ≥ 0).
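A sketch of Monte Carlo for a parabolic PDE via the probabilistic representation: for the heat equation u_t + ½u_xx = 0 with terminal data φ, the solution is u(x, s) = E[φ(x + W_{T−s})]. The terminal condition φ(y) = y² is chosen here because the expectation has the closed form x² + (T − s).

```python
import random

def mc_heat_solution(phi, x, s, T, n_samples, seed=12):
    """Probabilistic representation u(x, s) = E[ phi(x + W_{T-s}) ]
    for u_t + (1/2) u_xx = 0 with terminal condition u(., T) = phi."""
    rng = random.Random(seed)
    tau = T - s
    return sum(phi(x + rng.gauss(0.0, tau ** 0.5)) for _ in range(n_samples)) / n_samples

phi = lambda y: y * y          # terminal condition with a known expectation
x, s, T = 1.0, 0.0, 2.0
mc = mc_heat_solution(phi, x, s, T, 100_000)
exact = x * x + (T - s)        # E[(x + W_tau)^2] = x^2 + tau
print(round(mc, 2), exact)     # Monte Carlo close to the closed form
```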

…

British Library Cataloguing in Publication Data: A catalogue record for this book is available from the British Library. Library of Congress Cataloging in Publication Data: A catalog record for this book has been requested.

ISBN 0-203-96472-1 Master e-book ISBN
ISBN10: 0-415-41447-4 (hbk)
ISBN10: 0-415-41448-2 (pbk)
ISBN10: 0-203-96472-1 (Print Edition) (ebk)
ISBN13: 978-0-415-41447-0 (hbk)
ISBN13: 978-0-415-41448-7 (pbk)
ISBN13: 978-0-203-96472-9 (Print Edition) (ebk)

© 2007 Nikolai Dokuchaev

Contents

Preface
1 Review of probability theory
2 Basics of stochastic processes
3 Discrete time market models
4 Basics of Ito calculus and stochastic analysis
5 Continuous time market models
6 American options and binomial trees
7 Implied and historical volatility
8 Review of statistical estimation
9 Estimation of models for stock prices
Legend of notations and abbreviations
Selected answers and key figures
Bibliography

Preface

Dedicated to Natalia, Lidia, and Mikhail

This book gives a systematic, self-sufficient, and yet short presentation of the mainstream topics of Mathematical Finance and the related parts of Stochastic Analysis and Statistical Finance that cover typical university programs.

**Risk Management in Trading** by Davis Edwards

asset allocation, asset-backed security, backtesting, Black-Scholes formula, Brownian motion, business cycle, computerized trading, correlation coefficient, Credit Default Swap, discrete time, diversified portfolio, fixed income, implied volatility, intangible asset, interest rate swap, iterative process, John Meriwether, London Whale, Long Term Capital Management, margin call, Myron Scholes, Nick Leeson, p-value, paper trading, pattern recognition, random walk, risk tolerance, risk/return, selection bias, shareholder value, Sharpe ratio, short selling, statistical arbitrage, statistical model, stochastic process, systematic trading, time value of money, transaction costs, value at risk, Wiener process, zero-coupon bond

RANDOM NUMBERS

Another use of variables is to represent a random quantity. Randomness, in finance, is typically described using notation from probability. Probability is the branch of mathematics that studies how likely or unlikely something is to occur. The probability that an event will occur is represented as a value between a 0 percent chance of occurrence (something will not occur) and a 100 percent chance of occurrence (something will definitely occur).

KEY CONCEPT: STOCHASTIC PROCESSES

Stochastic is a term that describes a type of random process that evolves over time. In a stochastic process, prices might be modeled as a series whose next value depends on the current value plus a random component. This is slightly different from a completely random process (like the series of numbers obtained by rolling a pair of dice).

…

between a 0 percent chance of occurrence (something will not occur) and a 100 percent chance of occurrence (something will definitely occur). In finance, the term stochastic is often used as a synonym for random. Stochastic describes a type of random sequence that evolves over time. In this type of sequence, the value of the next item depends on the value of the previous item plus or minus a random value. In finance, stochastic processes are particularly important because prices are often modeled as stochastic processes, and prices are a fundamental input into trading decisions. Common examples of random numbers are the results of throwing dice or flipping a coin. Each roll of a die or flip of a coin generates a realization of a defined process. The probability of the coin landing on either a head or a tail is 50 percent, and the probability of any single number on a regular six-sided die is 1/6 (assuming a fair die and a fair coin).

…

The time that has passed

[FIGURE 3.9 Dispersion in a Random Series: the cumulative result of repeated 50/50 chances of +1 or −1 fans out over time, with the probability mass spreading from 100.0% at the start across the range −10 to +10.]

For financial mathematics, the Wiener process is often generalized to include a constant drift term that pushes prices upward. The constant drift term is due to risk-free inflation (and is described later in the chapter in the "time value of money" discussion). Continuous time versions of this process are called the Generalized Wiener Process or the Itô Process. (See Equation 3.8, A Stochastic Process.) A stochastic process with discrete time steps can be described as:

ΔS_t / S_t = μΔt + σΔW_t,  or equivalently  ΔS_t = μS_t Δt + σS_t ΔW_t   (3.8)

where
ΔS_t — Change in Price. The change in price that will occur.
S_t — Price. The price of an asset at time t.
μ — Drift. The drift term that pushes prices upward. Commonly this is a constant, but it can be generalized to vary over time.
Δt — Change in Time. Typically, finance uses the convention that Δt = 1.0 is a one-year passage of time.
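The discrete-time process in Equation 3.8 can be simulated directly; a minimal sketch, where the drift, volatility, and time-step values are illustrative assumptions rather than values from the text:

```python
import random

def simulate_price_path(s0, mu, sigma, dt, n_steps, seed=0):
    """Simulate Equation 3.8: dS = mu*S*dt + sigma*S*dW, with dW ~ N(0, dt)."""
    rng = random.Random(seed)
    prices = [s0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, dt ** 0.5)                    # Brownian increment over dt
        ds = mu * prices[-1] * dt + sigma * prices[-1] * dw
        prices.append(prices[-1] + ds)
    return prices

# One year of daily steps with 5% drift and 20% volatility (hypothetical values)
path = simulate_price_path(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n_steps=252)
```

With the drift term set to zero, the path reduces to the pure random-walk dispersion shown in Figure 3.9.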

pages: 571 words: 105,054

**
Advances in Financial Machine Learning
** by
Marcos Lopez de Prado

algorithmic trading, Amazon Web Services, asset allocation, backtesting, bioinformatics, Brownian motion, business process, Claude Shannon: information theory, cloud computing, complexity theory, correlation coefficient, correlation does not imply causation, diversification, diversified portfolio, en.wikipedia.org, fixed income, Flash crash, G4S, implied volatility, information asymmetry, latency arbitrage, margin call, market fragmentation, market microstructure, martingale, NP-complete, P = NP, p-value, paper trading, pattern recognition, performance metric, profit maximization, quantitative trading / quantitative ﬁnance, RAND corporation, random walk, risk-adjusted returns, risk/return, selection bias, Sharpe ratio, short selling, Silicon Valley, smart cities, smart meter, statistical arbitrage, statistical model, stochastic process, survivorship bias, transaction costs, traveling salesman

After Hosking's paper, the literature on this subject has been surprisingly scarce, adding up to eight journal articles written by only nine authors: Hosking, Johansen, Nielsen, MacKinnon, Jensen, Jones, Popiel, Cavaliere, and Taylor. See the references for details. Most of those papers relate to technical matters, such as fast algorithms for the calculation of fractional differentiation in continuous stochastic processes (e.g., Jensen and Nielsen [2014]). Differentiating the stochastic process is a computationally expensive operation. In this chapter we will take a practical, alternative, and novel approach to recover stationarity: We will generalize the difference operator to non-integer steps. 5.4 The Method Consider the backshift operator, B, applied to a matrix of real-valued features {X_t}, where B^k X_t = X_{t−k} for any integer k ≥ 0.
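The generalization of the difference operator to non-integer steps rests on the binomial expansion of (1 − B)^d, whose weights can be generated iteratively from w_0 = 1, w_k = −w_{k−1}(d − k + 1)/k. A minimal sketch of that recursion:

```python
def frac_diff_weights(d, size):
    """Weights of the binomial expansion of (1 - B)**d, B the backshift operator.

    w_0 = 1, and w_k = -w_{k-1} * (d - k + 1) / k for k = 1, 2, ...
    """
    w = [1.0]
    for k in range(1, size):
        w.append(-w[-1] * (d - k + 1) / k)
    return w

# d = 1 recovers the ordinary first difference (weights 1, -1, 0, 0, ...);
# a fractional d between 0 and 1 gives slowly decaying weights.
whole = frac_diff_weights(1.0, 4)
frac = frac_diff_weights(0.5, 4)
```

For integer d the weights truncate to zero after d + 1 terms, while for fractional d they decay slowly, which is what lets the operator preserve long memory.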

…

While assessing the probability of backtest overfitting is a useful tool to discard superfluous investment strategies, it would be better to avoid the risk of overfitting, at least in the context of calibrating a trading rule. In theory this could be accomplished by deriving the optimal parameters for the trading rule directly from the stochastic process that generates the data, rather than engaging in historical simulations. This is the approach we take in this chapter. Using the entire historical sample, we will characterize the stochastic process that generates the observed stream of returns, and derive the optimal values for the trading rule's parameters without requiring a historical simulation. 13.3 The Problem Suppose an investment strategy S invests in i = 1, …, I opportunities or bets. At each opportunity i, S takes a position of m_i units of security X, where m_i ∈ (−∞, ∞).

…

Chapters 10 and 16 are dedicated to this station, with the understanding that it would be unreasonable for a book to reveal specific investment strategies. 1.3.1.4 Backtesters This station assesses the profitability of an investment strategy under various scenarios. One of the scenarios of interest is how the strategy would perform if history repeated itself. However, the historical path is merely one of the possible outcomes of a stochastic process, and not necessarily the most likely going forward. Alternative scenarios must be evaluated, consistent with the knowledge of the weaknesses and strengths of a proposed strategy. Team members are data scientists with a deep understanding of empirical and experimental techniques. A good backtester incorporates in his analysis meta-information regarding how the strategy came about. In particular, his analysis must evaluate the probability of backtest overfitting by taking into account the number of trials it took to distill the strategy.

pages: 153 words: 12,501

**
Mathematics for Economics and Finance
** by
Michael Harrison,
Patrick Waldron

Brownian motion, buy low sell high, capital asset pricing model, compound rate of return, discrete time, incomplete markets, law of one price, market clearing, Myron Scholes, Pareto efficiency, risk tolerance, riskless arbitrage, short selling, stochastic process

A random vector is just a vector of random variables. It can also be thought of as a vector-valued function on the sample space Ω. A stochastic process is a collection of random variables or random vectors indexed by time, e.g. {x̃_t : t ∈ T} or just {x̃_t} if the time interval is clear from the context. For the purposes of this part of the course, we will assume that the index set consists of just a finite number of times, i.e. that we are dealing with discrete time stochastic processes. Then a stochastic process whose elements are N-dimensional random vectors is equivalent to an N|T|-dimensional random vector. The (joint) c.d.f. of a random vector or stochastic process is the natural extension of the one-dimensional concept. Random variables can be discrete, continuous or mixed. The expectation (mean, average) of a discrete r.v., x̃, with possible values x_1, x_2, x_3, … is given by

E[x̃] ≡ Σ_{i=1}^{∞} x_i Pr(x̃ = x_i).
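The expectation formula above is straightforward to compute for any finite (or truncated) list of values and probabilities; a minimal sketch:

```python
def expectation(values, probs):
    """E[x] = sum_i x_i * Pr(x = x_i) for a discrete random variable."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(values, probs))

# A fair six-sided die: each outcome 1..6 has probability 1/6, so E[x] = 3.5
die_mean = expectation([1, 2, 3, 4, 5, 6], [1 / 6] * 6)
```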

…

This framework is sufficient to illustrate the similarities and differences between the most popular approaches. When we consider consumer choice under uncertainty, consumption plans will have to specify a fixed consumption vector for each possible state of nature or state of the world. This just means that each consumption plan is a random vector. Let us review the associated concepts from basic probability theory: probability space; random variables and vectors; and stochastic processes. Let Ω denote the set of all possible states of the world, called the sample space. A collection of states of the world, A ⊆ Ω, is called an event. Let A be a collection of events in Ω. The function P : A → [0, 1] is a probability function if

1. (a) Ω ∈ A
   (b) A ∈ A ⇒ Ω − A ∈ A
   (c) A_i ∈ A for i = 1, …, ∞ ⇒ ∪_{i=1}^{∞} A_i ∈ A
   (i.e. A is a sigma-algebra of events)

Revised: December 2, 1998

…

The prices of this, and the other elementary claims, must, by no arbitrage, equal the prices of the corresponding replicating portfolios. 5.5 The Expected Utility Paradigm 5.5.1 Further axioms The objects of choice with which we are concerned in a world with uncertainty could still be called consumption plans, but we will acknowledge the additional structure now described by terming them lotteries. If there are k physical commodities, a consumption plan must specify a k-dimensional vector, x ∈ ℝ^k, for each time and state of the world. We assume a finite number of times, denoted by the set T. The possible states of the world are denoted by the set Ω. So a consumption plan or lottery is just a collection of |T| k-dimensional random vectors, i.e. a stochastic process. Again to distinguish the certainty and uncertainty cases, we let L denote the collection of lotteries under consideration; X will now denote the set of possible values of the lotteries in L. Preferences are now described by a relation on L. We will continue to assume that preference relations are complete, reflexive, transitive, and continuous.

**
The Concepts and Practice of Mathematical Finance
** by
Mark S. Joshi

Black-Scholes formula, Brownian motion, correlation coefficient, Credit Default Swap, delta neutral, discrete time, Emanuel Derman, fixed income, implied volatility, incomplete markets, interest rate derivative, interest rate swap, London Interbank Offered Rate, martingale, millennium bug, quantitative trading / quantitative ﬁnance, short selling, stochastic process, stochastic volatility, the market place, time value of money, transaction costs, value at risk, volatility smile, yield curve, zero-coupon bond

We shall say that the family X of random variables X_t satisfies the stochastic differential equation

dX_t = μ(t, X_t) dt + a(t, X_t) dW_t,   (5.8)

if for any t, we have that

X_{t+h} − X_t − h μ(t, X_t) − a(t, X_t)(W_{t+h} − W_t)

is a random variable with mean and variance which are o(h). We shall call such a family of random variables an Itô process, or sometimes just a stochastic process. Note that if a is identically zero, we have that

X_{t+h} − X_t − h μ(t, X_t)   (5.9)

is of mean and variance o(h). We have thus essentially recovered the differential equation

dX_t/dt = μ(t, X_t).   (5.10)

The essential aspect of this definition is that if we know X_0 and that X_t satisfies the stochastic differential equation (5.8), then X_t is fully determined. In other terms, the stochastic differential equation has a unique solution. An important corollary of this is that μ and a, together with X_0, are the only quantities we need to know in order to define a stochastic process. Equally important is the issue of existence: it is not immediately obvious that a family X_t satisfying a given stochastic differential equation exists.
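The defining property (5.8) also suggests the simplest numerical scheme: step forward by h·μ(t, X_t) plus a(t, X_t) times a simulated Brownian increment. This is the Euler–Maruyama method; a minimal sketch with illustrative coefficient functions (not from the text):

```python
import random

def euler_maruyama(x0, mu, a, t_end, n_steps, seed=0):
    """Approximate a path of dX = mu(t, X) dt + a(t, X) dW by Euler-Maruyama."""
    rng = random.Random(seed)
    h = t_end / n_steps
    t, x = 0.0, x0
    path = [x]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, h ** 0.5)            # W_{t+h} - W_t ~ N(0, h)
        x = x + mu(t, x) * h + a(t, x) * dw      # the increment in (5.8)
        t += h
        path.append(x)
    return path

# Geometric Brownian motion: mu(t, x) = 0.05 x, a(t, x) = 0.2 x (hypothetical values)
gbm = euler_maruyama(1.0, lambda t, x: 0.05 * x, lambda t, x: 0.2 * x, 1.0, 252)
```

With a ≡ 0 the scheme collapses to the ordinary Euler method for (5.10), mirroring the remark in the text.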

…

Rather surprisingly, this leads to the Black-Scholes price. We therefore have a very powerful alternative method for pricing options. Justifying this procedure requires an excursion into some deep and powerful mathematics. 6.4 The concept of information Before we can proceed to a better understanding of option pricing, we need a better understanding of the nature of stochastic processes. In particular, we need to think a little more deeply about what a stochastic process is. We have talked about a continuous family of processes, X_t, such that X_t − X_s has a certain distribution. As long as we only look at a finite number of values of t and s this is conceptually fairly clear, but once we start looking at all values at once it is a lot less obvious what these statements mean. One way out is to take the view that each random variable X_t displays some aspect of a single more fundamental variable.

…

The argument we gave above still works; if a portfolio is of zero value and can be positive with positive probability tomorrow then to get the expectation to be zero, there must be a positive probability of negative value tomorrow. Hence, as before arbitrage is impossible. This is still not particularly useful however, as we know that a risky asset will in general grow faster than a riskless bond on average due to the risk aversion of market participants. To get round this problem, we ask what the rate of growth means for a stochastic process. The stochastic process is determined by a probability measure on the sample space which is the space of paths. However, the definition of an arbitrage barely mentions the probability measure. All it says is that it is impossible to set up a portfolio with zero value today which has a positive probability of being of positive value in the future, and a zero probability of being of negative value. The actual magnitude of the positive probability is not mentioned.

pages: 819 words: 181,185

**
Derivatives Markets
** by
David Goldenberg

Black-Scholes formula, Brownian motion, capital asset pricing model, commodity trading advisor, compound rate of return, conceptual framework, correlation coefficient, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, en.wikipedia.org, financial innovation, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, law of one price, locking in a profit, London Interbank Offered Rate, Louis Bachelier, margin call, market microstructure, martingale, Myron Scholes, Norbert Wiener, Paul Samuelson, price mechanism, random walk, reserve currency, risk/return, riskless arbitrage, Sharpe ratio, short selling, stochastic process, stochastic volatility, time value of money, transaction costs, volatility smile, Wiener process, yield curve, zero-coupon bond, zero-sum game

How do we take derivatives of smooth functions of stochastic processes, say F(X_t, t), where the process is the solution of a stochastic differential equation (the GBM SDE) dX_t = μX_t dt + σX_t dW_t with initial value X_0? We start with the observation that we can expect to end up with another stochastic process that is also the solution to another stochastic differential equation. This new stochastic differential equation for the total differential of F(X_t, t) will have a new set of drift and diffusion coefficients. The question is: what exactly are the drift and diffusion coefficients of dF(X_t, t)? This is one of the problems that K. Itô solved in his famous formula, called Itô's lemma. To understand Itô's lemma, keep in mind that there are two stochastic processes involved. The first is the underlying process (think of it as the stock).
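As a reference point, the answer to the question posed above can be stated compactly; this is the standard form of Itô's lemma, written here in the notation of the passage:

```latex
% Itô's lemma for F(X_t, t), where dX_t = \mu X_t\,dt + \sigma X_t\,dW_t:
dF(X_t,t) = \left( \frac{\partial F}{\partial t}
          + \mu X_t \frac{\partial F}{\partial x}
          + \tfrac{1}{2}\sigma^2 X_t^2 \frac{\partial^2 F}{\partial x^2} \right) dt
          + \sigma X_t \frac{\partial F}{\partial x}\, dW_t
```

The bracketed term is the new drift coefficient and σX_t ∂F/∂x is the new diffusion coefficient of dF(X_t, t).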

…

The second equation above says that E_r(S_1(ω)|S_0) = (1 + r′)S_0 > S_0 unless r′ = 0. Even under risk neutrality (which doesn't mean zero interest rates), the martingale requirement that E_r(S_1(ω)|S_0) = S_0 is clearly violated. Stock prices under risk neutrality are not martingales. However, they aren't very far from martingales. Definition of a Sub (Super) Martingale 1. A discrete-time stochastic process (X_n(ω))_{n=0,1,2,3,…} is called a sub-martingale if E(X_n) < ∞ and E(X_{n+1}(ω)|X_n) > X_n for all n = 0, 1, 2, 3, … 2. A discrete-time stochastic process (X_n(ω))_{n=0,1,2,3,…} is called a super-martingale if E(X_n) < ∞ and E(X_{n+1}(ω)|X_n) < X_n for all n = 0, 1, 2, 3, … We expect stock prices to be sub-martingales, not martingales, for two separate and different reasons: 1. All assets, risky or not, have to provide a reward for time and waiting. This reward is the risk-free rate. 2.

…

We will begin with the prototype of all continuous time models, and that is arithmetic Brownian motion (ABM). ABM is the most basic and important stochastic process in continuous time and continuous space, and it has many desirable properties including the strong Markov property, the martingale property, independent increments, normality, and continuous sample paths. Of course, here we want to focus on options pricing rather than the pure mathematical theory. The idea here is to partially prepare you for courses in mathematical finance. The details we have to leave out are usually covered in such courses. 16.1 ARITHMETIC BROWNIAN MOTION (ABM) ABM is a stochastic process {W_t(ω)}_{t≥0} defined on a sample space (Ω, ℑ_W, ℘_W). We won't go into all the details as to exactly what (Ω, ℑ_W, ℘_W) represents, but you can think of the probability measure ℘_W, which is called Wiener measure, as being defined in terms of the transition density function p(T, y; t, x) for τ = T − t. Norbert Wiener gave the first rigorous mathematical construction (existence proof) for ABM and, because of this, it is sometimes called the Wiener process.
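A Wiener path can be sampled directly from the properties listed above: W_0 = 0 and independent N(0, h) increments. A minimal sketch, with a Monte Carlo check of the defining property Var(W_T) = T (sample sizes are illustrative):

```python
import random

def wiener_path(t_end, n_steps, seed=0):
    """Sample path of the Wiener process: W_0 = 0, independent N(0, h) increments."""
    rng = random.Random(seed)
    h = t_end / n_steps
    w = [0.0]
    for _ in range(n_steps):
        w.append(w[-1] + rng.gauss(0.0, h ** 0.5))
    return w

# Monte Carlo check that Var(W_1) = 1, using the mean of squared terminal values
terminals = [wiener_path(1.0, 100, seed=s)[-1] for s in range(2000)]
var_estimate = sum(x * x for x in terminals) / len(terminals)
```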

pages: 345 words: 86,394

**
Frequently Asked Questions in Quantitative Finance
** by
Paul Wilmott

Albert Einstein, asset allocation, beat the dealer, Black-Scholes formula, Brownian motion, butterfly effect, buy and hold, capital asset pricing model, collateralized debt obligation, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, discrete time, diversified portfolio, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, fixed income, fudge factor, implied volatility, incomplete markets, interest rate derivative, interest rate swap, iterative process, lateral thinking, London Interbank Offered Rate, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, market bubble, martingale, Myron Scholes, Norbert Wiener, Paul Samuelson, quantitative trading / quantitative ﬁnance, random walk, regulatory arbitrage, risk/return, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, stochastic volatility, transaction costs, urban planning, value at risk, volatility arbitrage, volatility smile, Wiener process, yield curve, zero-coupon bond

Short Answer Brownian Motion is a stochastic process with stationary independent normally distributed increments and which also has continuous sample paths. It is the most common stochastic building block for random walks in finance. Example Pollen in water, smoke in a room, pollution in a river, are all examples of Brownian motion. And this is the common model for stock prices as well. Long Answer Brownian motion (BM) is named after the Scottish botanist who first described the random motions of pollen grains suspended in water. The mathematics of this process were formalized by Bachelier, in an option-pricing context, and by Einstein. The mathematics of BM is also that of heat conduction and diffusion. Mathematically, BM is a continuous, stationary, stochastic process with independent normally distributed increments.

…

Wilmott magazine, September.
Halton, J.H. 1960 On the efficiency of certain quasi-random sequences of points in evaluating multi-dimensional integrals. Num. Maths. 2, 84-90.
Hammersley, J.M. & Handscomb, D.C. 1964 Monte Carlo Methods. Methuen, London.
Harrison, J.M. & Kreps, D. 1979 Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory 20, 381-408.
Harrison, J.M. & Pliska, S.R. 1981 Martingales and stochastic integrals in the theory of continuous trading. Stochastic Processes and their Applications 11, 215-260.
Haselgrove, C.B. 1961 A method for numerical integration. Mathematics of Computation 15, 323-337.
Heath, D., Jarrow, R. & Morton, A. 1992 Bond pricing and the term structure of interest rates: a new methodology. Econometrica 60, 77-105.
Ho, T. & Lee, S. 1986 Term structure movements and pricing interest rate contingent claims. Journal of Finance 42, 1129-1142.
Itô, K. 1951 On stochastic differential equations.

…

Journal of Financial Economics 3 167-79 Haug, EG 2003 Know your weapon, Parts 1 and 2. Wilmott magazine, May and July Haug, EG 2006 The complete Guide to Option Pricing Formulas. McGraw-Hill Lewis, A 2000 Option Valuation under Stochastic Volatility. Finance Press What are the Forward and Backward Equations? Short Answer Forward and backward equations usually refer to the differential equations governing the transition probability density function for a stochastic process. They are diffusion equations and must therefore be solved in the appropriate direction in time, hence the names. Example An exchange rate is currently 1.88. What is the probability that it will be over 2 by this time next year? If you have a stochastic differential equation model for this exchange rate then this question can be answered using the equations for the transition probability density function.
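For the worked example in the passage (rate at 1.88, probability of exceeding 2 within a year), a minimal sketch: if we assume the exchange rate follows lognormal (geometric Brownian) dynamics, the transition density is lognormal and the probability has a closed form, so no PDE solve is needed. The drift and volatility values below are illustrative assumptions, not taken from the text:

```python
from math import erf, log, sqrt

def prob_above(s0, k, mu, sigma, t):
    """P(S_T > K) under GBM: ln S_T ~ N(ln S0 + (mu - sigma^2/2) t, sigma^2 t)."""
    d = (log(s0 / k) + (mu - 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    return 0.5 * (1.0 + erf(d / sqrt(2.0)))   # standard normal CDF evaluated at d

# Exchange rate at 1.88 today; probability of exceeding 2 in one year,
# assuming (hypothetically) zero drift and 10% annual volatility
p = prob_above(1.88, 2.0, mu=0.0, sigma=0.10, t=1.0)
```

The same question for a general SDE model, where no closed form exists, is exactly what the forward equation for the transition density answers.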

pages: 206 words: 70,924

**
The Rise of the Quants: Marschak, Sharpe, Black, Scholes and Merton
** by
Colin Read

"Robert Solow", Albert Einstein, Bayesian statistics, Black-Scholes formula, Bretton Woods, Brownian motion, business cycle, capital asset pricing model, collateralized debt obligation, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, David Ricardo: comparative advantage, discovery of penicillin, discrete time, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, floating exchange rates, full employment, Henri Poincaré, implied volatility, index fund, Isaac Newton, John Meriwether, John von Neumann, Joseph Schumpeter, Kenneth Arrow, Long Term Capital Management, Louis Bachelier, margin call, market clearing, martingale, means of production, moral hazard, Myron Scholes, Paul Samuelson, price stability, principal–agent problem, quantitative trading / quantitative ﬁnance, RAND corporation, random walk, risk tolerance, risk/return, Ronald Reagan, shareholder value, Sharpe ratio, short selling, stochastic process, Thales and the olive presses, Thales of Miletus, The Chicago School, the scientific method, too big to fail, transaction costs, tulip mania, Works Progress Administration, yield curve

Black saw the description and prediction of interest rates to be a multi-faceted and challenging problem. While he had demonstrated that an options price depends on the underlying stock price mean and volatility, and the risk-free interest rate, the overall market for interest rates is much more multi-dimensional. The interest rate yield curve, which graphs rates against maturities, depends on many markets and instruments, each of which is subject to stochastic processes. His interest and collaboration with Emanuel Derman and Bill Toy resulted in a model of interest rates that was first used profitably by Goldman Sachs through the 1980s, but eventually entered the public domain when they published their work in the Financial Analysts Journal in 1990.2 Their model provided reasonable estimates for both the prices and volatilities of treasury bonds, and is still used today.

…

Black-Scholes model – a model that can determine the price of a European call option based on the assumption that the underlying security follows a geometric Brownian motion with constant drift and volatility. Bond – a financial instrument that provides periodic (typically semi-annual) interest payments and the return of the paid-in capital upon maturity in exchange for a fixed price. Brownian motion – the simplest of the class of continuous-time stochastic processes that describes the random motion of a particle or a security that is buffeted by forces that are normally distributed in strength. Calculus of variations – a mathematical technique that can determine the optimal path of a variable, like savings or consumption, over time. Call – an option to purchase a specified security at a specified future time and price. Capital allocation line – a line drawn on the graph of all possible combinations of risky and risk-free assets that shows the best risk–reward horizon.

…

Keynesian model – a model developed by John Maynard Keynes that demonstrates savings may not necessarily be balanced with new investment and the gross domestic product may differ from that which would result in full employment. Kurtosis – a statistical measure of the distribution of observations about the expected mean as a deviation from that predicted by the normal distribution. Life cycle – the characterization of a process from its birth to death. Life Cycle Model – a model of household consumption behavior from the beginning of its earning capacity to the end of the household. Markov process – a stochastic process with the memorylessness property: given the present state, the future state is independent of past observations. Markowitz bullet – the upper boundary of the efficient frontier of various portfolios when graphed according to risk and return. Martingale – a model of a process for which past events cannot predict future outcomes. Mean – a mathematical technique that can be calculated based on a number of alternative weightings to produce an average for a set of numbers.

pages: 209 words: 13,138

**
Empirical Market Microstructure: The Institutions, Economics and Econometrics of Securities Trading
** by
Joel Hasbrouck

Alvin Roth, barriers to entry, business cycle, conceptual framework, correlation coefficient, discrete time, disintermediation, distributed generation, experimental economics, financial intermediation, index arbitrage, information asymmetry, interest rate swap, inventory management, market clearing, market design, market friction, market microstructure, martingale, price discovery process, price discrimination, quantitative trading / quantitative ﬁnance, random walk, Richard Thaler, second-price auction, selection bias, short selling, statistical model, stochastic process, stochastic volatility, transaction costs, two-sided market, ultimatum game, zero-sum game

Price discreteness, for example, reflects a tick size (minimum pricing increment) that is generally set in level units. For reasons that will be discussed shortly, the drift can be dropped in most microstructure analyses. When µ = 0, p_t cannot be forecast beyond its most recent value: E[p_{t+1} | p_t, p_{t−1}, …] = p_t. A process with this property is generally described as a martingale. One definition of a martingale is a discrete stochastic process {x_t} where E|x_t| < ∞ for all t, and E(x_{t+1} | x_t, x_{t−1}, …) = x_t (see Karlin and Taylor (1975) or Ross (1996)). Martingale behavior of asset prices is a classic result arising in many economic models with individual optimization, absence of arbitrage, or security market equilibrium (Cochrane (2005)). The result is generally contingent, however, on assumptions of frictionless trading opportunities, which are not appropriate in most microstructure applications.

…

Placing the price change first is simply an expositional simplification and carries no implications that this variable is first in any causal sense. The chapter treats the general case but uses a particular structural model for purposes of illustration. The structural model is a bivariate model of price changes and trade directions: y_t = [p_t q_t]′. 9.1 Modeling Vector Time Series The basic descriptive statistics of a vector stochastic process {y_t} are the process mean µ = E[y_t] and the vector autocovariances. The vector autocovariances are defined as the matrices

Γ_k = E(y_t − E[y_t])(y_{t−k} − E[y_t])′ for k = …, −2, −1, 0, +1, +2, …   (9.1)

In suppressing the dependence of µ and Γ_k on t, we have implicitly invoked an assumption of covariance stationarity. Note that although a univariate autocovariance has the property that γ_k = γ_{−k}, the corresponding property in the multivariate case is Γ_k = Γ′_{−k}.
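A minimal sketch of the sample analog of (9.1) for a bivariate series, using the divisor n (an assumption; some treatments use n − k):

```python
def autocovariance(y, k):
    """Sample Gamma_k = (1/n) sum_t (y_t - ybar)(y_{t-k} - ybar)' for a
    bivariate series y given as a list of 2-vectors, k >= 0."""
    n = len(y)
    mu = [sum(v[i] for v in y) / n for i in range(2)]
    g = [[0.0, 0.0], [0.0, 0.0]]
    for t in range(k, n):
        for i in range(2):
            for j in range(2):
                g[i][j] += (y[t][i] - mu[i]) * (y[t - k][j] - mu[j])
    return [[g[i][j] / n for j in range(2)] for i in range(2)]
```

Note that Γ_0 is symmetric, while for k ≠ 0 only the transpose relation Γ_k = Γ′_{−k} holds, matching the remark in the text.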

…

The buy limit price is denoted Lt . If at time t, pt ≥ Lt , then the agent has effectively submitted a marketable limit order, which achieves immediate execution. A limit order priced at Lt < pt will be executed during period t if pτ ≤ Lt for any time t < τ < t + 1. The situation is depicted in figure 15.2. A limit order priced at Lt executes if the stock price follows path B but not path A. This is a standard problem in stochastic processes, and many exact results are available. The diffusion-barrier notion of execution is at best a first approximation. In many markets, a buy limit order might be executed by a market (or marketable) sell order while the best ask is still well above the limit price. We will subsequently generalize the execution mechanism to allow this. For the moment, though, it might be noted that the present situation is not without precedent.
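The "standard problem in stochastic processes" the passage alludes to can be illustrated under the simplest assumption: driftless Brownian prices with volatility σ. By the reflection principle, the probability that the running minimum over the horizon reaches a buy limit price L ≤ p_0 is 2Φ((L − p_0)/(σ√h)). A sketch with hypothetical parameter values:

```python
from math import erf, sqrt

def fill_probability(p0, limit, sigma, horizon):
    """P(buy limit at L <= p0 executes within the horizon) for driftless
    Brownian prices, via the reflection principle:
    P(min_{0<=s<=h} p_s <= L) = 2 * Phi((L - p0) / (sigma * sqrt(h)))."""
    d = (limit - p0) / (sigma * sqrt(horizon))
    return 2.0 * 0.5 * (1.0 + erf(d / sqrt(2.0)))   # = 2 * Phi(d)

# Limit one unit below the current price, vol 2 per period (hypothetical values)
prob = fill_probability(p0=100.0, limit=99.0, sigma=2.0, horizon=1.0)
```

This is only the diffusion-barrier first approximation discussed in the text; it ignores fills by marketable sell orders arriving above the limit price.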

pages: 855 words: 178,507

**
The Information: A History, a Theory, a Flood
** by
James Gleick

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, bank run, bioinformatics, Brownian motion, butterfly effect, citation needed, Claude Shannon: information theory, clockwork universe, computer age, conceptual framework, crowdsourcing, death of newspapers, discovery of DNA, Donald Knuth, double helix, Douglas Hofstadter, en.wikipedia.org, Eratosthenes, Fellow of the Royal Society, Gödel, Escher, Bach, Henri Poincaré, Honoré de Balzac, index card, informal economy, information retrieval, invention of the printing press, invention of writing, Isaac Newton, Jacquard loom, Jaron Lanier, jimmy wales, Johannes Kepler, John von Neumann, Joseph-Marie Jacquard, lifelogging, Louis Daguerre, Marshall McLuhan, Menlo Park, microbiome, Milgram experiment, Network effects, New Journalism, Norbert Wiener, Norman Macrae, On the Economy of Machinery and Manufactures, PageRank, pattern recognition, phenotype, Pierre-Simon Laplace, pre–internet, Ralph Waldo Emerson, RAND corporation, reversible computing, Richard Feynman, Rubik’s Cube, Simon Singh, Socratic dialogue, Stephen Hawking, Steven Pinker, stochastic process, talking drums, the High Line, The Wisdom of Crowds, transcontinental railway, Turing machine, Turing test, women in the workforce

.♦ To illuminate the structure of the message Shannon turned to some methodology and language from the physics of stochastic processes, from Brownian motion to stellar dynamics. (He cited a landmark 1943 paper by the astrophysicist Subrahmanyan Chandrasekhar in Reviews of Modern Physics.♦) A stochastic process is neither deterministic (the next event can be calculated with certainty) nor random (the next event is totally free). It is governed by a set of probabilities. Each event has a probability that depends on the state of the system and perhaps also on its previous history. If for event we substitute symbol, then a natural written language like English or Chinese is a stochastic process. So is digitized speech; so is a television signal. Looking more deeply, Shannon examined statistical structure in terms of how much of a message influences the probability of the next symbol.

…

His colleagues thought this was a bit “addled”—that Shannon’s work was “more technology than mathematics,”♦ as Kolmogorov recalled it afterward. “It is true,” he said, “that Shannon left to his successors the rigorous ‘justification’ of his ideas in some difficult cases. However, his mathematical intuition was amazingly precise.” Kolmogorov was not as enthusiastic about cybernetics. Norbert Wiener felt a kinship with him—they had both done early work on stochastic processes and Brownian motion. On a visit to Moscow, Wiener said, “When I read the works of Academician Kolmogorov, I feel that these are my thoughts as well, this is what I wanted to say. And I know that Academician Kolmogorov has the same feeling when reading my works.”♦ But the feeling was evidently not shared. Kolmogorov steered his colleagues toward Shannon instead. “It is easy to understand that as a mathematical discipline cybernetics in Wiener’s understanding lacks unity,” he said, “and it is difficult to imagine productive work in training a specialist, say a postgraduate student, in cybernetics in this sense.”♦ He already had real results to back up his instincts: a useful generalized formulation of Shannon entropy, and an extension of his information measure to processes in both discrete and continuous time.

…

classification, 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 3.1, 3.2 Clausius, Rudolf, 9.1, 9.2, 9.3 Clauson-Thue, William, 5.1, 5.2, 5.3 Clement, Joseph, 4.1, 4.2 clocks, synchronization of, 1.1, 5.1, 5.2, 5.3, 5.4 cloud, information, 14.1, 14.2 clustering Clytemnestra code attempts to reduce cost of telegraphy, 5.1, 5.2 Babbage’s interest in cipher and compression systems for telegraphy, 5.1, 5.2, 5.3, 5.4, 5.5, 5.6 Enigma, 7.1, 7.2, 7.3 genetic, 10.1, 10.2, 10.3, 10.4, 10.5, 10.6, 10.7, 10.8, 10.9, 10.10 in Jacquard loom operations Morse, prl.1, 1.1, 1.2, 1.3, 1.4, 5.1, 5.2, 5.3, 5.4, 6.1, 11.1 as noise for printing telegraph Shannon’s interest in, prl.1, 6.1, 7.1 telegraphy before Morse code, 5.1, 5.2, 5.3, 5.4, 5.5, 5.6, 5.7 see also cryptography coding theory, 8.1, 8.2, 10.1, 12.1 cognitive science, 8.1, 8.2, 8.3, 8.4 Colebrooke, Henry collective consciousness, epl.1, epl.2, epl.3, epl.4, epl.5 Colossus computing machine Columbus, Christopher combinatorial analysis, 6.1, 10.1, 10.2 communication by algorithm with alien life-form, 12.1, 12.2, 12.3, 12.4, 12.5 Babbage’s mechanical notation for describing, 4.1, 4.2, 5.1 constrained channels of, 2.1, 2.2 disruptive effects of new technologies in, 15.1, 15.2 emergence of global consciousness, epl.1, epl.2, epl.3 evolution of electrical technologies for, 5.1, 5.2, 6.1, 6.2 fundamental problem of, prl.1, 7.1, 7.2, 8.1 human evolution and, prl.1, prl.2 implications of technological evolution of, 15.1, 15.2 information overload and, epl.1, epl.2 knowledge needs for, 12.1, 12.2, 12.3 in origins of governance Shannon’s diagram of, 7.1, 7.2, 7.3 as stochastic process symbolic logic to describe systems of system elements, 7.1, 7.2 in Twitter, epl.1, epl.2 see also talking drums; telegraphy; telephony; transmission of information compact disc, prl.1, 8.1, epl.1 complexity, 12.1, 12.2, 12.3, 12.4, 12.5, 12.6, 12.7, 12.8, 12.9 compression of information; see data compression “Computable Numbers, On” (Turing), 7.1, 7.2, 12.1 computation in 
Babylonian mathematics, 2.1, 2.2 computable and uncomputable numbers, 7.1, 7.2, 7.3, 7.4, 12.1, 12.2, 12.3 of differential equations, 4.1, 4.2 in evolution of complex structures human computers, 4.1, 4.2, 4.3 thermodynamics of, 13.1, 13.2, 13.3, 13.4 Turing machine for, 7.1, 7.2, 7.3, 7.4, 7.5, 7.6 see also calculators; computers computer(s) analog and digital, 8.1, 8.2 chess-playing, 8.1, 8.2 comparison to humans, 8.1, 8.2 cost of memory storage cost of work of, 13.1, 13.2 early mechanical, prl.1, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 8.1 growth of memory and processing speed of, 14.1, 14.2, 14.3, 14.4 inductive learning in perception of thinking by, 8.1, 8.2, 8.3, 8.4 public awareness of quantum-based, 13.1, 13.2, 13.3, 13.4 Shannon’s information theory in, prl.1, 6.1, 7.1, 7.2, 8.1, 8.2 significance of information theory in development of spread of memes through Turing’s conceptualization of, 8.1, 8.2, 8.3 universe as, 14.1, 14.2 see also calculators; computation; programming Conference on Cybernetics, 8.1, 8.2, 8.3, 8.4, 8.5, 8.6, 8.7, 8.8, 8.9, 8.10, 8.11, 8.12 Connolly, Sean J.

pages: 338 words: 106,936

**
The Physics of Wall Street: A Brief History of Predicting the Unpredictable
** by
James Owen Weatherall

Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Asian financial crisis, bank run, beat the dealer, Benoit Mandelbrot, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, business cycle, butterfly effect, buy and hold, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, collateralized debt obligation, collective bargaining, dark matter, Edward Lorenz: Chaos theory, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, John Nash: game theory, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, martingale, Myron Scholes, new economy, Paul Lévy, Paul Samuelson, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk-adjusted returns, Robert Gordon, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, The Chicago School, The Myth of the Rational Market, tulip mania, Vilfredo Pareto, volatility smile

Although discussing such debates is far from the scope of this book, I should note that the arguments offered here for how one should think of the status of mathematical models in finance are closely connected to more general discussions concerning the status of mathematical or physical theories quite generally. “. . . named after Scottish botanist Robert Brown . . .”: Brown’s observations were published as Brown (1828). “The mathematical treatment of Brownian motion . . .”: More generally, Brownian motion is an example of a random or “stochastic” process. For an overview of the mathematics of stochastic processes, see Karlin and Taylor (1975, 1981). “. . . it was his 1905 paper that caught Perrin’s eye”: Einstein published four papers in 1905. One of them was the one I refer to here (Einstein 1905b), but the other three were equally remarkable. In Einstein (1905a), he first suggests that light comes in discrete packets, now called quanta or photons; in Einstein (1905c), he introduces his special theory of relativity; and in Einstein (1905d), he proposes the famous equation E = mc².

…

The Code-Breakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet. New York: Scribner. Kaplan, Ian. 2002. “The Predictors by Thomas A. Bass: A Retrospective.” This is a comment on The Predictors by a former employee of the Prediction Company. Available at http://www.bearcave.com/bookrev/predictors2.html. Karlin, Samuel, and Howard M. Taylor. 1975. A First Course in Stochastic Processes. 2nd ed. San Diego, CA: Academic Press. ———. 1981. A Second Course in Stochastic Processes. San Diego, CA: Academic Press. Katzmann, Robert A. 2008. Daniel Patrick Moynihan: The Intellectual in Public Life. Washington, DC: Woodrow Wilson Center Press. Kelly, J., Jr. 1956. “A New Interpretation of Information Rate.” IRE Transactions on Information Theory 2 (3, September): 185–89. Kelly, Kevin. 1994a. “Cracking Wall Street.”

…

“Consumer Prices, the Consumer Price Index, and the Cost of Living.” Journal of Economic Perspectives 12 (1, Winter): 3–26. Bosworth, Barry P. 1997. “The Politics of Immaculate Conception.” The Brookings Review, June, 43–44. Bouchaud, Jean-Philippe, and Didier Sornette. 1994. “The Black-Scholes Option Pricing Problem in Mathematical Finance: Generalization and Extensions for a Large Class of Stochastic Processes.” Journal de Physique 4 (6): 863–81. Bower, Tom. 1984. Klaus Barbie, Butcher of Lyons. London: M. Joseph. Bowman, D. D., G. Ouillion, C. G. Sammis, A. Sornette, and D. Sornette. 1998. “An Observational Test of the Critical Earthquake Concept.” Journal of Geophysical Research 103: 24359–72. Broad, William J. 1992. “Defining the New Plowshares Those Old Swords Will Make.” The New York Times, February 5.

pages: 695 words: 194,693

**
Money Changes Everything: How Finance Made Civilization Possible
** by
William N. Goetzmann

Albert Einstein, Andrei Shleifer, asset allocation, asset-backed security, banking crisis, Benoit Mandelbrot, Black Swan, Black-Scholes formula, Bretton Woods, Brownian motion, business cycle, capital asset pricing model, Cass Sunstein, collective bargaining, colonial exploitation, compound rate of return, conceptual framework, corporate governance, Credit Default Swap, David Ricardo: comparative advantage, debt deflation, delayed gratification, Detroit bankruptcy, disintermediation, diversified portfolio, double entry bookkeeping, Edmond Halley, en.wikipedia.org, equity premium, financial independence, financial innovation, financial intermediation, fixed income, frictionless, frictionless market, full employment, high net worth, income inequality, index fund, invention of the steam engine, invention of writing, invisible hand, James Watt: steam engine, joint-stock company, joint-stock limited liability company, laissez-faire capitalism, Louis Bachelier, mandelbrot fractal, market bubble, means of production, money market fund, money: store of value / unit of account / medium of exchange, moral hazard, Myron Scholes, new economy, passive investing, Paul Lévy, Ponzi scheme, price stability, principal–agent problem, profit maximization, profit motive, quantitative trading / quantitative finance, random walk, Richard Thaler, Robert Shiller, shareholder value, short selling, South Sea Bubble, sovereign wealth fund, spice trade, stochastic process, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, time value of money, too big to fail, trade liberalization, trade route, transatlantic slave trade, tulip mania, wage slave

Mandelbrot was a student of Paul Lévy’s—the son of the man who gave Bachelier bad marks at his examination at the École Polytechnique in 1900. Lévy’s research focused on “stochastic processes”: mathematical models that describe the behavior of some variable through time. For example, we saw in Chapter 15 that Jules Regnault proposed and tested a stochastic process that varied randomly, which resulted in a rule about risk increasing with the square root of time. Likewise, Louis Bachelier more formally developed a random-walk stochastic process. Paul Lévy formalized these prior random walk models into a very general family of stochastic processes referred to as Lévy processes. Brownian motion was just one process in the family of Lévy processes—and perhaps the best behaved of them. Other stochastic processes have such things as discontinuous jumps and unusually large shocks (which might, for example, explain the crash of 1987, when the US stock market lost 22.6% of its value in a single day).

…

One of his major contributions to the literature on finance (published in 1966) was a proof that an efficient market implies that stock prices may not follow a random walk, but that they must be unpredictable. It was a nice refinement of Regnault’s hypothesis articulated almost precisely a century prior. Although Mandelbrot ultimately developed a fractal-based option-pricing model with two of his students that allowed for extreme events and a more general stochastic process, for various reasons Mandelbrot never saw it adopted in practice to any great extent. I suspect that this is because the solution, while potentially useful, is complicated and contradicts most other tools that quantitative financiers use. With Mandelbrot’s models, it is all or nothing. You have to take a leap beyond the world of Brownian motion and throw out old friends like Bernoulli’s law of large numbers.

…

Benoit Mandelbrot believed he had discovered a deep structure to the world in general and financial markets in particular. His insights, however, can be traced directly back to the special tradition of mathematical inquiry that has its roots in the Enlightenment. I think this is what most excited him about his work—thinking of it in historical context as a culmination of applications of probability to markets. Although not all quants are aware of it, when they use a stochastic process (like Brownian motion) to price a security or figure out a hedge, they are drawing from a very deep well of mathematical knowledge that would not have existed but for the emergence of financial markets in Europe. Yes, the models that modern quants have applied to markets can go wrong. Models are crude attempts to characterize a reality that is complex and continually evolving. Despite the crashes—or perhaps because of them—financial markets have continually challenged the best and brightest minds with puzzles that hold the promise of intellectual and pecuniary rewards.

**
High-Frequency Trading
** by
David Easley,
Marcos López de Prado,
Maureen O'Hara

algorithmic trading, asset allocation, backtesting, Brownian motion, capital asset pricing model, computer vision, continuous double auction, dark matter, discrete time, finite state, fixed income, Flash crash, High speed trading, index arbitrage, information asymmetry, interest rate swap, latency arbitrage, margin call, market design, market fragmentation, market fundamentalism, market microstructure, martingale, natural language processing, offshore financial centre, pattern recognition, price discovery process, price discrimination, price stability, quantitative trading / quantitative finance, random walk, Sharpe ratio, statistical arbitrage, statistical model, stochastic process, Tobin tax, transaction costs, two-sided market, yield curve

., 2012, “High Frequency Trading and Volatility”, SSRN Working Paper. Brunnermeier, M., and L. H. Pedersen, 2005, “Predatory Trading”, Journal of Finance 60(4), pp. 1825–63. Carlin, B., M. Sousa Lobo and S. Viswanathan, 2007, “Episodic Liquidity Crises: Cooperative and Predatory Trading”, Journal of Finance 62(5), pp. 2235–74. Clark, P. K., 1970, “A Subordinated Stochastic Process Model of Cotton Futures Prices”, PhD Dissertation, Harvard University. Clark, P. K., 1973, “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices”, Econometrica 41(1), pp. 135–55. Donefer, B. S., 2010, “Algos Gone Wild: Risk in the World of Automated Trading Strategies”, The Journal of Trading 5, pp. 31–4. Easley, D., N. Kiefer, M. O’Hara and J. Paperman, 1996, “Liquidity, Information, and Infrequently Traded Stocks”, Journal of Finance 51, pp. 1405–36.

…

John Wiley and Sons, Chichester. Linton, O., and M. O’Hara, 2012, “The Impact of Computer Trading on Liquidity, Price Efficiency/Discovery and Transactions Costs”, in Foresight: The Future of Computer Trading in Financial Markets. An International Perspective, Final Project Report. The Government Office for Science, London. Mandelbrot, B., 1973, “Comments on ‘A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices by Peter K. Clark’ ”, Econometrica 41(1), pp. 157–59. Mandelbrot, B., and M. Taylor, 1967, “On the Distribution of Stock Price Differences”, Operations Research 15(6), pp. 1057–62. NANEX, 2010, “Analysis of the ‘Flash Crash’ ”, June 18. URL: http://www.nanex.net/20100506/FlashCrashAnalysis_CompleteText.html. NANEX, 2011, “Strange Days June 8’th, 2011 – NatGas Algo”.

…

Algorithmic approaches to execution problems are fairly well studied, and often apply methods from the stochastic control literature (Bertsimas and Lo 1998; Bouchaud et al 2002; Cont and Kukanov 2013; Guéant et al 2012; Kharroubi and Pham 2010). The aforementioned papers seek to solve problems similar to ours, i.e., to execute a certain number of shares over some fixed period as cheaply as possible, but approach it from another direction. They typically start with an assumption that the underlying “true” stock price is generated by some known stochastic process. There is also a known impact function that specifies how arriving liquidity demand pushes market prices away from this true value. Given this information, as well as time and volume constraints, it is then possible to compute the optimal strategy explicitly. This can be done either in closed form or numerically (often using dynamic programming, the basis of reinforcement learning). There are also interesting game-theoretic variants of execution problems in the presence of an arbitrageur (Moallemi et al 2012), and examinations of the tension between exploration and exploitation (Park and van Roy 2012).
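The recipe described here — assume a price process and an impact function, then compute the optimal schedule by dynamic programming — can be sketched in a toy setting. This is a minimal sketch, not the specification used in any of the cited papers: it assumes a martingale "true" price (so expected drift contributes nothing to cost) and a linear temporary impact of `theta` per share traded, under which the known Bertsimas–Lo answer is an equal split across periods, and the DP recovers it.

```python
import numpy as np

def optimal_execution(shares, periods, theta=1e-3):
    """Backward-induction sketch: sell `shares` over `periods` steps.
    With a martingale price, the only controllable expected cost is the
    temporary impact theta*v per share, i.e. theta*v**2 for a trade of v."""
    # Terminal period: whatever remains must be sold in one trade.
    value = np.array([theta * x**2 for x in range(shares + 1)], dtype=float)
    policy = [np.arange(shares + 1)]  # at the last step, trade everything left
    for _ in range(periods - 1):
        new_value = np.empty(shares + 1)
        new_policy = np.empty(shares + 1, dtype=int)
        for x in range(shares + 1):
            # try every trade size v now; the remainder is handled optimally later
            costs = [theta * v**2 + value[x - v] for v in range(x + 1)]
            v_star = int(np.argmin(costs))
            new_policy[x] = v_star
            new_value[x] = costs[v_star]
        value, policy = new_value, [new_policy] + policy
    return value, policy

value, policy = optimal_execution(shares=100, periods=4)
print(policy[0][100], value[100])  # 25 2.5  (equal split across the 4 periods)
```

Note that `theta` scales the cost but cancels out of the policy: the equal split is optimal for any positive linear impact under these assumptions.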

**
Monte Carlo Simulation and Finance
** by
Don L. McLeish

Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, discrete time, distributed generation, finite state, frictionless, frictionless market, implied volatility, incomplete markets, invention of the printing press, martingale, p-value, random walk, Sharpe ratio, short selling, stochastic process, stochastic volatility, survivorship bias, the market place, transaction costs, value at risk, Wiener process, zero-coupon bond, zero-sum game

This process Z_s is, both in discrete and continuous time, a martingale. [Figure 2.6: A sample path of the Wiener process] Models in Continuous Time We begin with some oversimplified rules of stochastic calculus, which can be omitted by those with a background in Brownian motion and diffusion. First, we define a stochastic process W_t, called the standard Brownian motion or Wiener process, having the following properties: 1. For each h > 0, the increment W(t + h) − W(t) has a N(0, h) distribution and is independent of all preceding increments W(u) − W(v), t > u > v > 0. 2. W(0) = 0. The fact that such a process exists is by no means easy to see. It has been an important part of the literature in physics, probability, and finance at least since the papers of Bachelier and Einstein, about 100 years ago.
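A path like the one in Figure 2.6 can be generated directly from the two defining properties; a minimal sketch (the grid size and seed are arbitrary choices, not from the book):

```python
import numpy as np

def wiener_path(T=9.0, n=900, seed=0):
    """Build a Wiener-process path on [0, T] from its two defining
    properties: independent N(0, h) increments, and W(0) = 0."""
    rng = np.random.default_rng(seed)
    h = T / n
    dW = rng.normal(0.0, np.sqrt(h), size=n)    # property 1: W(t+h) - W(t) ~ N(0, h)
    W = np.concatenate(([0.0], np.cumsum(dW)))  # property 2: W(0) = 0
    t = np.linspace(0.0, T, n + 1)
    return t, W

t, W = wiener_path()
print(W[0], len(W))  # 0.0 901
```

The sample variance of the increments should be close to h = T/n, which is an easy sanity check on the construction.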

…

And when the drift term a(X_t, t) is linear in X_t, the solution of an ordinary differential equation will allow the calculation of the expected value of the process, and this is the first and most basic description of its behaviour. The appendix provides an elementary review of techniques for solving partial and ordinary differential equations. However, the information about a stochastic process obtained from a deterministic object such as an ordinary or partial differential equation is necessarily limited. For example, while we can sometimes obtain the marginal distribution of the process at time t, it is more difficult to obtain quantities such as the joint distribution of variables that depend on the path of the process, and these are important in valuing certain types of exotic options such as lookback and barrier options.

…

Solving deterministic differential equations can sometimes provide a solution to a specific problem, such as finding the arbitrage-free price of a derivative. In general, for more complex features of the derivative, such as the distribution of return, important for considerations such as the Value at Risk, we need to obtain a solution {X_t, 0 < t < T} to an equation of the above form, which is a stochastic process. Typically this can only be done by simulation. One of the simplest methods of simulating such a process is motivated through a crude interpretation of the above equation in terms of discrete time steps: a small increment X_{t+h} − X_t in the process is approximately normally distributed with mean given by a(X_t, t)h and variance given by σ²(X_t, t)h. We generate these increments sequentially, beginning with an assumed value for X_0, and then adding to obtain an approximation to the value of the process at discrete times t = 0, h, 2h, 3h, . . ..
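The sequential scheme described here (the Euler–Maruyama method) can be sketched as follows; the geometric-Brownian-motion coefficients in the example are illustrative choices, not taken from the text:

```python
import numpy as np

def euler_path(a, sigma, x0, T, n, seed=0):
    """Crude discrete-time scheme from the text: each increment
    X_{t+h} - X_t is drawn as N(a(X_t, t)*h, sigma(X_t, t)**2 * h)."""
    rng = np.random.default_rng(seed)
    h = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        t = k * h
        x[k + 1] = x[k] + a(x[k], t) * h + sigma(x[k], t) * np.sqrt(h) * rng.normal()
    return x

# Illustrative example: dX = 0.05 X dt + 0.20 X dW with X_0 = 100,
# i.e. geometric Brownian motion with 5% drift and 20% volatility.
path = euler_path(lambda x, t: 0.05 * x, lambda x, t: 0.20 * x,
                  x0=100.0, T=1.0, n=250)
print(path[0], len(path))  # 100.0 251
```

Smaller step sizes h reduce the discretization bias; repeating the simulation many times yields the approximate distribution of X_T needed for quantities such as Value at Risk.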

**
Mathematics for Finance: An Introduction to Financial Engineering
** by
Marek Capinski,
Tomasz Zastawniak

Black-Scholes formula, Brownian motion, capital asset pricing model, cellular automata, delta neutral, discounted cash flows, discrete time, diversified portfolio, fixed income, interest rate derivative, interest rate swap, locking in a profit, London Interbank Offered Rate, margin call, martingale, quantitative trading / quantitative finance, random walk, short selling, stochastic process, time value of money, transaction costs, value at risk, Wiener process, zero-coupon bond

This results in the following bond prices at time 1: 101.14531 in the up state and 100.9999 in the down state. (The latter is the same as for the par bond.) Expectation with respect to the risk-neutral probability gives the initial bond price 100.05489, so the floor is worth 0.05489. Bibliography Background Reading: Probability and Stochastic Processes Ash, R. B. (1970), Basic Probability Theory, John Wiley & Sons, New York. Brzeźniak, Z. and Zastawniak, T. (1999), Basic Stochastic Processes, Springer Undergraduate Mathematics Series, Springer-Verlag, London. Capiński, M. and Kopp, P. E. (1999), Measure, Integral and Probability, Springer Undergraduate Mathematics Series, Springer-Verlag, London. Capiński, M. and Zastawniak, T. (2001), Probability Through Problems, Springer-Verlag, New York. Chung, K. L. (1974), A Course in Probability Theory, Academic Press, New York.

…

Erdmann Oxford University L.C.G. Rogers University of Cambridge E. Süli Oxford University J.F. Toland University of Bath Other books in this series A First Course in Discrete Mathematics I. Anderson Analytic Methods for Partial Differential Equations G. Evans, J. Blackledge, P. Yardley Applied Geometry for Computer Graphics and CAD D. Marsh Basic Linear Algebra, Second Edition T.S. Blyth and E.F. Robertson Basic Stochastic Processes Z. Brzeźniak and T. Zastawniak Elementary Differential Geometry A. Pressley Elementary Number Theory G.A. Jones and J.M. Jones Elements of Abstract Analysis M. Ó Searcóid Elements of Logic via Numbers and Sets D.L. Johnson Essential Mathematical Biology N.F. Britton Fields, Flows and Waves: An Introduction to Continuum Models D.F. Parker Further Linear Algebra T.S. Blyth and E.F. Robertson Geometry R.

pages: 407 words: 104,622

**
The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution
** by
Gregory Zuckerman

affirmative action, Affordable Care Act / Obamacare, Albert Einstein, Andrew Wiles, automated trading system, backtesting, Bayesian statistics, beat the dealer, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, blockchain, Brownian motion, butter production in bangladesh, buy and hold, buy low sell high, Claude Shannon: information theory, computer age, computerized trading, Credit Default Swap, Daniel Kahneman / Amos Tversky, diversified portfolio, Donald Trump, Edward Thorp, Elon Musk, Emanuel Derman, endowment effect, Flash crash, George Gilder, Gordon Gekko, illegal immigration, index card, index fund, Isaac Newton, John Meriwether, John Nash: game theory, John von Neumann, Loma Prieta earthquake, Long Term Capital Management, loss aversion, Louis Bachelier, mandelbrot fractal, margin call, Mark Zuckerberg, More Guns, Less Crime, Myron Scholes, Naomi Klein, natural language processing, obamacare, p-value, pattern recognition, Peter Thiel, Ponzi scheme, prediction markets, quantitative hedge fund, quantitative trading / quantitative finance, random walk, Renaissance Technologies, Richard Thaler, Robert Mercer, Ronald Reagan, self-driving car, Sharpe ratio, Silicon Valley, sovereign wealth fund, speech recognition, statistical arbitrage, statistical model, Steve Jobs, stochastic process, the scientific method, Thomas Bayes, transaction costs, Turing machine

Members of Axcom’s team viewed investing through a math prism and understood financial markets to be complicated and evolving, with behavior that is difficult to predict, at least over long stretches—just like a stochastic process. It’s easy to see why they saw similarities between stochastic processes and investing. For one thing, Simons, Ax, and Straus didn’t believe the market was truly a “random walk,” or entirely unpredictable, as some academics and others argued. Though it clearly had elements of randomness, much like the weather, mathematicians like Simons and Ax would argue that a probability distribution could capture futures prices as well as any other stochastic process. That’s why Ax thought employing such a mathematical representation could be helpful to their trading models. Perhaps by hiring Carmona, they could develop a model that would produce a range of likely outcomes for their investments, helping to improve their performance.

pages: 571 words: 124,448

**
Building Habitats on the Moon: Engineering Approaches to Lunar Settlements
** by
Haym Benaroya

3D printing, biofilm, Black Swan, Brownian motion, Buckminster Fuller, carbon-based life, centre right, clean water, Colonization of Mars, Computer Numeric Control, conceptual framework, data acquisition, Elon Musk, fault tolerance, gravity well, inventory management, Johannes Kepler, low earth orbit, orbital mechanics / astrodynamics, performance metric, RAND corporation, risk tolerance, Ronald Reagan, stochastic process, telepresence, telerobotics, the scientific method, urban planning, X Prize, zero-sum game

Each is a random variable, signifying that there are uncertainties about both, and here they are modeled using normal probability density functions. When the stress exceeds the strength, the system has failed. A measure of the reliability (probability of failure) is given by the overlapped area, shown hatched. The random variable is a static property – the shape of the density function does not change with time. Where the density function is time-dependent, the variable is called a random, or stochastic, process. Before examining some commonly used densities, we define an averaging procedure known as the mathematical expectation for probabilistic variables. 10.3 Mathematical Expectation The single most important descriptor of a random variable is its mean or expected value. This defines the most likely value of a variable. However, random variables may have the same mean, but their spread of possible values, or their variance, can be considerably different.
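The overlapped-area picture can be made concrete: for independent normal stress and strength, the difference D = strength − stress is again normal, so the failure probability P(stress > strength) = Φ(−μ_D/σ_D) has a closed form. A sketch with illustrative numbers (my choices, not from the book), checked against Monte Carlo:

```python
import math
import numpy as np

def failure_probability(mu_stress, sd_stress, mu_strength, sd_strength):
    """P(stress > strength) for independent normal stress and strength.
    D = strength - stress is normal; failure is the event D < 0."""
    mu_d = mu_strength - mu_stress
    sd_d = math.hypot(sd_stress, sd_strength)
    # Phi(-mu_d / sd_d) written via the complementary error function
    return 0.5 * math.erfc(mu_d / (sd_d * math.sqrt(2.0)))

# Illustrative: stress ~ N(300, 30^2), strength ~ N(400, 40^2) (units arbitrary)
p_exact = failure_probability(300, 30, 400, 40)

rng = np.random.default_rng(1)
stress = rng.normal(300, 30, 1_000_000)
strength = rng.normal(400, 40, 1_000_000)
p_mc = np.mean(stress > strength)
print(round(p_exact, 4))  # 0.0228
```

Here μ_D = 100 and σ_D = 50, so failure corresponds to a 2-standard-deviation shortfall, and the Monte Carlo estimate agrees with Φ(−2) ≈ 0.0228 to sampling error.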

…

As we know, and as we will discuss in more detail subsequently, the complete structure can behave in ways that are unpredictable, if based only on the behavior of its components. Our reliability estimates are guesses about the future, not extrapolations from past data. More on this later. Now that we have an understanding of the autocorrelation, we proceed to study its Fourier transform, the spectral density. 10.6 Power Spectrum A measure of the ‘energy’ of the stochastic process X(t) is given by its power spectrum, or spectral density, S_XX(ω), which is the Fourier transform of its autocorrelation function: S_XX(ω) = ∫ R_XX(τ) e^(−iωτ) dτ, and thus: R_XX(τ) = (1/2π) ∫ S_XX(ω) e^(iωτ) dω. (10.14) These equations are known as the Wiener-Khintchine formulas. Since R_XX(−τ) = R_XX(τ), S_XX(ω) is not a complex function but a real and even function. For τ = 0: R_XX(0) = (1/2π) ∫ S_XX(ω) dω, where S_XX(ω) ≥ 0 since, as a measure of energy, it must be positive semi-definite.
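The Wiener-Khintchine relation has an exact discrete analogue that is easy to verify numerically: for a finite real sample, the DFT of the circular autocorrelation equals the periodogram |X(ω)|²/n. A quick check (the signal and length are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=256)
n = len(x)

# Power spectrum estimate: the periodogram |X(omega)|^2 / n
X = np.fft.fft(x)
periodogram = np.abs(X) ** 2 / n

# Circular autocorrelation R[k] = (1/n) * sum_t x[t] * x[(t+k) mod n]
R = np.array([np.dot(x, np.roll(x, -k)) for k in range(n)]) / n

# Discrete Wiener-Khintchine: DFT of R equals the periodogram.
# R is symmetric (R[k] = R[n-k]), so its DFT is real up to roundoff.
spectrum_from_R = np.fft.fft(R).real
print(np.max(np.abs(periodogram - spectrum_from_R)))  # float-roundoff small
```

This mirrors the continuous-time statement: the autocorrelation and the power spectrum form a Fourier-transform pair, and the spectrum is real and nonnegative because the autocorrelation is even and positive semi-definite.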

…

Happel (1993): Indigenous materials for lunar construction. Applied Mechanics Reviews, 46(6), pp. 313–325. Footnotes 1 From the Greek στόχος we also have the stochastic process. 2 An axiom is a rule that is assumed to be true, and upon which further rules and facts are deduced. For engineering, the deduced facts must conform to reality. An excellent book on the basics of probabilistic modeling is Probability, Random Variables, and Stochastic Processes, A. Papoulis, McGraw-Hill, 1965. © Springer International Publishing AG 2018 Haym Benaroya, Building Habitats on the Moon, Springer Praxis Books, https://doi.org/10.1007/978-3-319-68244-0_11 11. Reliability and damage Haym Benaroya1 (1) Professor of Mechanical & Aerospace Engineering, Rutgers University, New Brunswick, New Jersey, USA “We need to make sure it survives for a while.”

**
Commodity Trading Advisors: Risk, Performance Analysis, and Selection
** by
Greg N. Gregoriou,
Vassilios Karavas,
François-Serge Lhabitant,
Fabrice Douglas Rouah

Asian financial crisis, asset allocation, backtesting, buy and hold, capital asset pricing model, collateralized debt obligation, commodity trading advisor, compound rate of return, constrained optimization, corporate governance, correlation coefficient, Credit Default Swap, credit default swaps / collateralized debt obligations, discrete time, distributed generation, diversification, diversified portfolio, dividend-yielding stocks, fixed income, high net worth, implied volatility, index arbitrage, index fund, interest rate swap, iterative process, linear programming, London Interbank Offered Rate, Long Term Capital Management, market fundamentalism, merger arbitrage, Mexican peso crisis / tequila crisis, p-value, Pareto efficiency, Ponzi scheme, quantitative trading / quantitative finance, random walk, risk-adjusted returns, risk/return, selection bias, Sharpe ratio, short selling, stochastic process, survivorship bias, systematic trading, technology bubble, transaction costs, value at risk, zero-sum game

Faff and Hallahan (2001) argue that survivorship bias is more likely to cause performance reversals than performance persistence. The data used show considerable kurtosis (see Table 3.1). However, this kurtosis may be caused by heteroskedasticity (returns of some funds are more variable than others). REGRESSION TEST OF PERFORMANCE PERSISTENCE To measure performance persistence, a model of the stochastic process that generates returns is required. The process considered is: r_it = α_i + β_i r̄_t + ε_it, ε_it ~ N(0, σ_i²), i = 1, . . . , n and t = 1, . . . , T (3.1) where r_it = return of fund (or CTA) i in month t, r̄_t = average fund returns in month t, and the slope parameter β_i captures differences in leverage. The model allows each fund to have a different variance, which is consistent with past research. We also considered models that assumed that β_i is zero, with either fixed effects (dummy variables) for time or random effects instead.

…

This demonstrates that most of the nonnormality shown in Table 3.1 is due to heteroskedasticity. MONTE CARLO STUDY In their method, EGR ranked funds by their mean return or modified Sharpe ratio in a first period, and then determined whether the funds that ranked high in the first period also ranked high in the second period. We use Monte Carlo simulation to determine the power and size of hypothesis tests with EGR’s method when data follow the stochastic process given in equation 3.1. Data were generated by specifying values of α, β, and σ. The simulation used 1,000 replications and 120 simulated funds. The mean return over all funds, r̄_t, is derived from the values of α and β as: r̄_t = (Σ α_i/n + Σ ε_it/n) / (1 − Σ β_i/n), where all sums are from i = 1 to n. A constant value of α simulates no performance persistence. For the data sets generated with persistence present, α was generated randomly based on the mean and variance of β’s in each of the three data sets.
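The data-generating step can be sketched as follows; the parameter ranges are illustrative guesses, not EGR's calibration. Averaging equation 3.1 across funds and solving for r̄_t gives the fixed-point formula for the mean return, which the simulated panel then reproduces exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 120, 120  # 120 simulated funds, 120 months

# Equation (3.1) under the null of no persistence: a common alpha,
# fund-specific beta (leverage) and sigma (heteroskedasticity).
alpha = 0.005
beta = rng.uniform(0.2, 0.8, n)     # keep mean(beta) away from 1
sigma = rng.uniform(0.01, 0.05, n)  # funds differ in return variance
eps = rng.normal(size=(n, T)) * sigma[:, None]

# Averaging (3.1) over i:  r_bar_t = alpha_bar + beta_bar * r_bar_t + eps_bar_t,
# so  r_bar_t = (sum alpha_i/n + sum eps_it/n) / (1 - sum beta_i/n).
r_bar = (alpha + eps.mean(axis=0)) / (1.0 - beta.mean())

# Fund-level returns generated from the model
r = alpha + beta[:, None] * r_bar[None, :] + eps

# Consistency check: the cross-sectional average of the simulated panel
# reproduces r_bar, confirming the fixed-point formula.
print(np.allclose(r.mean(axis=0), r_bar))  # True
```

Ranking these funds by first-period mean return and re-ranking in the second period, as EGR do, then measures how often such a test falsely detects persistence under this null.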

…

Chicago Mercantile Exchange. (1999) “Question and Answer Report: Managed Futures Accounts.” Report No. M584/10M/1299. www.cve.com. Christoffersen, P. (2003) Elements of Financial Risk Management. San Diego, CA: Academic Press. Chung, S. Y. (1999) “Portfolio Risk Measurement: A Review of Value at Risk.” Journal of Alternative Investments, Vol. 2, No. 1, pp. 34–42. Clark, P. K. (1973) “A Subordinated Stochastic Process Model with Finite Variance for Speculative Prices.” Econometrica, Vol. 41, No. 1, pp. 135–155. Clayton, U. (2003) A Guide to the Law of Securitisation in Australia. Sydney, Australia: Clayton Company. Cooley, P. L., R. L. Roenfeldt, and N. K. Modani. (1977) “Interdependence of Market Risk Measures.” Journal of Business, Vol. 50, No. 3, pp. 356–363. Cootner, P. (1967) “Speculation and Hedging.”

**
How I Became a Quant: Insights From 25 of Wall Street's Elite
** by
Richard R. Lindsey,
Barry Schachter

Albert Einstein, algorithmic trading, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, asset allocation, asset-backed security, backtesting, bank run, banking crisis, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, business cycle, business process, butter production in bangladesh, buy and hold, buy low sell high, capital asset pricing model, centre right, collateralized debt obligation, commoditize, computerized markets, corporate governance, correlation coefficient, creative destruction, Credit Default Swap, credit default swaps / collateralized debt obligations, currency manipulation / currency intervention, discounted cash flows, disintermediation, diversification, Donald Knuth, Edward Thorp, Emanuel Derman, en.wikipedia.org, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, full employment, George Akerlof, Gordon Gekko, hiring and firing, implied volatility, index fund, interest rate derivative, interest rate swap, John von Neumann, linear programming, Loma Prieta earthquake, Long Term Capital Management, margin call, market friction, market microstructure, martingale, merger arbitrage, Myron Scholes, Nick Leeson, P = NP, pattern recognition, Paul Samuelson, pensions crisis, performance metric, prediction markets, profit maximization, purchasing power parity, quantitative trading / quantitative finance, QWERTY keyboard, RAND corporation, random walk, Ray Kurzweil, Richard Feynman, Richard Stallman, risk-adjusted returns, risk/return, shareholder value, Sharpe ratio, short selling, Silicon Valley, six sigma, sorting algorithm, statistical arbitrage, statistical model, stem cell, Steven Levy, stochastic process, systematic trading, technology bubble, The Great Moderation, the scientific method, too big to fail, trade route, transaction costs, transfer pricing, value at risk, volatility smile, Wiener process, yield curve, young professional

Like many mathematicians and physicists, I found the mathematics of the Black-Scholes options pricing formula incredibly interesting. For starters, after years of specializing in pure mathematics, I was starting from scratch in a totally new area. It allowed me to start to learn basic mathematics instead of delving deeper and deeper into advanced subjects. I literally had to start from scratch and learn probability theory and then the basics of stochastic processes, things I knew nothing at all about. Not to mention I knew nothing about financial markets, derivatives, or anything at all to do with finance. It was exciting to learn so much from scratch. In the midst of reading about Black-Scholes, I was also deeply involved with writing the book with Victor Ginzburg from the University of Chicago.

…

Richard Grinold, who was my prethesis advisor, gave me a copy of the HJM paper a couple of weeks before the seminar and told me to dig into it. This represents some of the best academic advice I have ever received since I am not sure that I would have immediately realized the model’s importance and potential for further work by myself. The rest, in some sense, is history. I really enjoyed the paper because I was struggling to understand some of the rather abstract questions in stochastic process theory that it dealt with, and I quickly decided to work on the HJM model for my dissertation. Broadly speaking, the HJM paradigm still represents the state of the art in interest rate derivatives pricing, so having been working with it from the very beginning is definitely high on my list of success factors later in life. In my five years at Berkeley, I met a few other people of critical importance to my career path, and life in general.

…

At Columbia College, I decided to enroll in its three-two program, which meant that I spent three years studying the contemporary civilization and humanities core curriculum, as well as the hard sciences, and then two years at the Columbia School of Engineering. There, I found a home in operations research, which allowed me to study computer science and applied mathematics, including differential equations, stochastic processes, statistical quality control, and mathematical programming. While studying for my master’s in operations research at Columbia, I had the opportunity to work at the Rand Institute, where math and computer science were applied to real-world problems. There I was involved in developing a large-scale simulation model designed to optimize response times for the New York City Fire Department. My interest in applied math led me to Carnegie-Mellon’s Graduate School of Industrial Administration, which had a strong operations research faculty.

**
Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing (Writing Science)
** by
Thierry Bardini

Apple II, augmented reality, Bill Duvall, conceptual framework, Donald Davies, Douglas Engelbart, Douglas Engelbart, Dynabook, experimental subject, Grace Hopper, hiring and firing, hypertext link, index card, information retrieval, invention of hypertext, Jaron Lanier, Jeff Rulifson, John von Neumann, knowledge worker, Leonard Kleinrock, Menlo Park, Mother of all demos, new economy, Norbert Wiener, Norman Mailer, packet switching, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, RFC: Request For Comment, Sapir-Whorf hypothesis, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, stochastic process, Ted Nelson, the medium is the message, theory of mind, Turing test, unbiased observer, Vannevar Bush, Whole Earth Catalog

In the conceptual world, both the transmission and the transformation of what Whorf called "culturally ordained forms and categories" is the process by which people learn. The crucial point in Bateson's synthesis lay in the characterization of all such processes as "stochastic": Both genetic change and the process called learning (including the somatic changes induced by the environment) are stochastic processes. In each case there is, I believe, a stream of events that is random in certain aspects and in each case there is a nonrandom selective process which causes certain of the random components to "survive" longer than others. Without the random, there can be no new thing. . . . We face, then, two great stochastic systems that are partly in interaction and partly isolated from each other. One system is within the individual and is called learning; the other is immanent in heredity and in populations and is called evolution.

…

In all three of these features, David stresses "software over hardware," or "the touch typist's memory of a particular arrangement of the keys" over this particular arrangement of the keys, and concludes "this, then, was a situation in which the precise details of timing in the developmental sequence had made it profitable in the short run to adapt machines to the habit of men (or to women, as was increasingly the case) rather than the other way around. And things have been this way ever since" (ibid., 336).6 Thus, it was by institutionalization as an incorporating practice that the QWERTY standard became established. The establishment of a commercial education network favoring the QWERTY was the decisive factor, the source of the "historical accident" that governed the stochastic process that secured forever the supremacy of the QWERTY. It is indeed because of such an "accident" that the six or seven years during which Remington enjoyed the early advantage of being the sole owner of the typewriter patent also saw its selling agents establish profitable and durable business associations with the commercial education business. These early business ties soon gave place to an organized and institutional network of associations that secured Remington's position in the typewriter business.

…

See also Hypertext Atari, 103 Atlas computer, 126, 252n6 Augmentation Research Center (ARC), 145-47, 157; staff, 121-22; spatial organization of laboratory, 122-23; Framework Activity (FRAMAC), 194-95, 211, 259n16; Personal and Organizational Development Activity (PODAC), 194-201, 259nn16-18; Line Activity (LINAC), 194, 211, 259n16; and est, 201-8, 260nn19-20; as "breakthrough lab," 211-13; Engelbart's eulogy, 214; as NIC, see under ARPA: Network Information Center Augmented knowledge workshop, 116, 219 Automation, 18-19, 240n5 Automobile, 18, 31 Baby boomers, 125 Bandwidth, see under Information Baran, Paul, 184, 257n4 Bass, Walter, 198ff; and est, 202, 204, 260n19 Batch processing, 4 Bates, Roger, 109, 120, 123, 156 Bateson, Gregory, 17, 26, 52, 56, 102, 135, 228-29, 236n13, 240n3, 242n18; on coevolution, 56, 242-43n24; on stochastic process, 56, 242n24 Baudot, Maurice-Emile, 65, 67f, 79 BBN (Bolt, Beranek and Newman), 30, 124, 191, 247n1, 258n7 BCC (Berkeley Computer Corporation), 155f, 256n9 Beam pen, 89. See also Light pen "Behavior, Purpose and Teleology," 25 Bell Laboratories, 247n5 Benedict, Henry H., 78 Bergson, Henri-Louis, 48 Berkeley Computer Corporation (BCC), 155f, 256n9 Berman, Melvyn, 109-10 Bewley, William, 177 Bigelow, Julian, 25 Bliss, James C., 61-62, 222-23, 244n1 Boaz, Franz, 240n3.

pages: 105 words: 18,832

**
The Collapse of Western Civilization: A View From the Future
** by
Naomi Oreskes,
Erik M. Conway

anti-communist, correlation does not imply causation, creative destruction, en.wikipedia.org, energy transition, Intergovernmental Panel on Climate Change (IPCC), invisible hand, laissez-faire capitalism, market fundamentalism, mass immigration, means of production, oil shale / tar sands, Pierre-Simon Laplace, road to serfdom, Ronald Reagan, stochastic process, the built environment, the market place

The Frenzy of Fossil Fuels: This was consistent with the expectation—based on physical theory—that warmer sea surface temperatures in regions of cyclogenesis could, and likely would, drive either more hurricanes or more intense ones. However, they backed away from this conclusion under pressure from their scientific colleagues. Much of the argument surrounded the concept of statistical significance. Given what we now know about the dominance of nonlinear systems and the distribution of stochastic processes, the then-dominant notion of a 95 percent confidence limit is hard to fathom. Yet overwhelming evidence suggests that twentieth-century scientists believed that a claim could be accepted only if, by the standards of Fisherian statistics, the possibility that an observed event could have happened by chance was less than 1 in 20. Many phenomena whose causal mechanisms were physically, chemically, or biologically linked to warmer temperatures were dismissed as “unproven” because they did not adhere to this standard of demonstration.

pages: 306 words: 82,765

**
Skin in the Game: Hidden Asymmetries in Daily Life
** by
Nassim Nicholas Taleb

availability heuristic, Benoit Mandelbrot, Bernie Madoff, Black Swan, Brownian motion, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, cellular automata, Claude Shannon: information theory, cognitive dissonance, complexity theory, David Graeber, disintermediation, Donald Trump, Edward Thorp, equity premium, financial independence, information asymmetry, invisible hand, knowledge economy, loss aversion, mandelbrot fractal, mental accounting, microbiome, moral hazard, Murray Gell-Mann, offshore financial centre, p-value, Paul Samuelson, Ponzi scheme, price mechanism, principal–agent problem, Ralph Nader, random walk, rent-seeking, Richard Feynman, Richard Thaler, Ronald Coase, Ronald Reagan, Rory Sutherland, Silicon Valley, Steven Pinker, stochastic process, survivorship bias, The Nature of the Firm, transaction costs, urban planning, Yogi Berra

Adaptation of Theorem 1 to Brownian Motion The implications of the simplified discussion do not change whether one uses richer models, such as a full stochastic process subjected to an absorbing barrier. And of course in a natural setting the eradication of all previous life can happen (i.e., Xt can take an extreme negative value), not just a stopping condition. The Peters and Gell-Mann argument also cancels the so-called equity premium puzzle if you add fat tails (hence outcomes vastly more severe, pushing some level equivalent to ruin) and the absence of fungibility between temporal and ensemble averages. There is no puzzle. The problem is invariant in real life if one uses a Brownian-motion-style stochastic process subjected to an absorbing barrier. In place of the simplified representation we would have, for a process subject to an absorbing barrier L from below, in the arithmetic version: or, for a geometric process: where Z is a random variable.
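The absorbing-barrier setup described in the excerpt is easy to simulate. The sketch below is only an illustration, not the book's own model: geometric Brownian motion paths absorbed at a lower barrier, with arbitrary parameter values; the function name `simulate_gbm_with_barrier` and all defaults are assumptions made here.

```python
import numpy as np

def simulate_gbm_with_barrier(n_paths=10_000, n_steps=252, x0=1.0,
                              mu=0.05, sigma=0.4, barrier=0.5, seed=42):
    """Simulate geometric Brownian motion paths; a path that touches the
    lower barrier is absorbed (ruined) and stays there forever."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    x = np.full(n_paths, x0)
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        z = rng.standard_normal(n_paths)
        step = np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
        x = np.where(ruined, x, x * step)   # absorbed paths no longer move
        ruined |= x <= barrier
        x = np.where(ruined, barrier, x)    # clamp ruined paths at the barrier
    return x, ruined

x_final, ruined = simulate_gbm_with_barrier()
print(f"fraction of paths ruined: {ruined.mean():.3f}")
print(f"mean terminal value of survivors: {x_final[~ruined].mean():.3f}")
```

The point of the exercise is the one the passage makes: with an absorbing barrier the ensemble average over all paths and the experience of any single path through time are not interchangeable, because ruined paths never come back.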

**
Learn Algorithmic Trading
** by
Sebastien Donadio

active measures, algorithmic trading, automated trading system, backtesting, Bayesian statistics, buy and hold, buy low sell high, cryptocurrency, DevOps, en.wikipedia.org, fixed income, Flash crash, Guido van Rossum, latency arbitrage, locking in a profit, market fundamentalism, market microstructure, martingale, natural language processing, p-value, paper trading, performance metric, prediction markets, quantitative trading / quantitative ﬁnance, random walk, risk tolerance, risk-adjusted returns, Sharpe ratio, short selling, sorting algorithm, statistical arbitrage, statistical model, stochastic process, survivorship bias, transaction costs, type inference, WebSocket, zero-sum game

He has built and deployed extremely low latency, high throughput automated trading systems for trading exchanges around the world, across multiple asset classes. He specializes in statistical arbitrage market-making and pairs trading strategies for the most liquid global futures contracts. He works as a Senior Quantitative Developer at a trading firm in Chicago. He holds a Master's in Computer Science from the University of Southern California. His areas of interest include Computer Architecture, FinTech, Probability Theory and Stochastic Processes, Statistical Learning and Inference Methods, and Natural Language Processing. About the reviewers: Nataraj Dasgupta is the VP of Advanced Analytics at RxDataScience Inc. He has been in the IT industry for more than 19 years and has worked in the technical and analytics divisions of Philip Morris, IBM, UBS Investment Bank, and Purdue Pharma. He led the Data Science team at Purdue, where he developed the company's award-winning Big Data and Machine Learning platform.

…

In the next chapter, we will review and implement some simple regression and classification methods and understand the advantages of applying supervised statistical learning methods to trading. Predicting the Markets with Basic Machine Learning In the last chapter, we learned how to design trading strategies, create trading signals, and implement advanced concepts, such as seasonality in trading instruments. Understanding those concepts in greater detail is a vast field comprising stochastic processes, random walks, martingales, and time series analysis, which we leave to you to explore at your own pace. So what's next? Let's look at an even more advanced method of prediction and forecasting: statistical inference and prediction. This is known as machine learning, the fundamentals of which were developed in the 1800s and early 1900s and have been worked on ever since. Recently, there has been a resurgence in interest in machine learning algorithms and applications owing to the availability of extremely cost-effective processing power and the easy availability of large datasets.

pages: 111 words: 1

**
Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets
** by
Nassim Nicholas Taleb

Antoine Gombaud: Chevalier de Méré, availability heuristic, backtesting, Benoit Mandelbrot, Black Swan, commoditize, complexity theory, corporate governance, corporate raider, currency peg, Daniel Kahneman / Amos Tversky, discounted cash flows, diversified portfolio, endowment effect, equity premium, fixed income, global village, hedonic treadmill, hindsight bias, Kenneth Arrow, Long Term Capital Management, loss aversion, mandelbrot fractal, mental accounting, meta analysis, meta-analysis, Myron Scholes, Paul Samuelson, quantitative trading / quantitative ﬁnance, QWERTY keyboard, random walk, Richard Feynman, road to serfdom, Robert Shiller, Robert Shiller, selection bias, shareholder value, Sharpe ratio, Steven Pinker, stochastic process, survivorship bias, too big to fail, Turing test, Yogi Berra

The Tools The notion of alternative histories discussed in the last chapter can be extended considerably and subjected to all manner of technical refinement. This brings us to the tools used in my profession to toy with uncertainty. I will outline them next. Monte Carlo methods, in brief, consist of creating artificial history using the following concepts. First, consider the sample path. The invisible histories have a scientific name, alternative sample paths, a name borrowed from the field of mathematics of probability called stochastic processes. The notion of path, as opposed to outcome, indicates that it is not a mere MBA-style scenario analysis, but the examination of a sequence of scenarios along the course of time. We are not just concerned with where a bird can end up tomorrow night, but rather with all the various places it can possibly visit during the time interval. We are not concerned with what the investor’s worth would be in, say, a year, but rather of the heart-wrenching rides he may experience during that period.

…

Starting at $100, in one scenario it can end up at $20 having seen a high of $220; in another it can end up at $145 having seen a low of $10. Another example is the evolution of your wealth during an evening at a casino. You start with $1,000 in your pocket, and measure it every fifteen minutes. In one sample path you have $2,200 at midnight; in another you barely have $20 left for a cab fare. Stochastic processes refer to the dynamics of events unfolding with the course of time. Stochastic is a fancy Greek name for random. This branch of probability concerns itself with the study of the evolution of successive random events—one could call it the mathematics of history. The key about a process is that it has time in it. What is a Monte Carlo generator? Imagine that you can replicate a perfect roulette wheel in your attic without having recourse to a carpenter.
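The distinction between an outcome and a sample path can be made concrete in a few lines. This is a toy symmetric random walk standing in for the casino evening, not the book's actual Monte Carlo generator; the function name, bet size, and sampling interval are all illustrative choices.

```python
import random

def sample_path(start=1000.0, steps=32, bet=100.0, seed=None):
    """Return the whole trajectory of wealth (one 'sample path'),
    not just its endpoint: a fair coin flip every fifteen minutes."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(path[-1] + rng.choice((-bet, bet)))
    return path

for seed in (1, 2, 3):
    p = sample_path(seed=seed)
    print(f"end: {p[-1]:7.0f}   high: {max(p):7.0f}   low: {min(p):7.0f}")
```

Two paths can share the same endpoint yet show wildly different highs and lows along the way, which is exactly why the path, rather than the terminal outcome, is the object of interest.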

**
Fifty Challenging Problems in Probability With Solutions
** by
Frederick Mosteller

Isaac Newton, John von Neumann, prisoner's dilemma, RAND corporation, stochastic process

As is well known, no strategy can give him a higher probability of achieving his goal, and the probability is this high if and only if he makes sure either to lose x or win y eventually. The Lesser Paradise: The Lesser Paradise resembles the Golden Paradise with the important difference that before leaving the hall the gambler must pay an income tax of t · 100% (0 < t < 1) on any net positive income that he has won there. It is therefore no harder or easier for him to win y dollars with an initial fortune of x than it is for his brother in the Golden Paradise to win y/(1 − t) dollars. The greatest probability with which he can achieve his goal is therefore (1 − t)x / ((1 − t)x + y). (1) The Paradise Lost: Here, the croupier collects the tax of ! [*First published, 1965. Reprinted by Dover Publications, Inc., in 1976 under the title Inequalities for Stochastic Processes.]
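The equivalence in the excerpt can be checked directly. Assuming the Golden Paradise bound quoted just before the excerpt is x/(x + y) (the standard gambler's-goal result the passage relies on), formula (1) is simply that bound applied to the inflated target y/(1 − t); the function names below are mine.

```python
def golden_paradise_prob(x, y):
    """Greatest probability of winning y dollars starting with fortune x
    (assumed form of the bound quoted before the excerpt): x / (x + y)."""
    return x / (x + y)

def lesser_paradise_prob(x, y, t):
    """Formula (1): a tax of t*100% on net winnings is equivalent to
    chasing the inflated target y/(1-t) in the Golden Paradise."""
    return (1 - t) * x / ((1 - t) * x + y)

x, y, t = 10.0, 25.0, 0.3
assert abs(lesser_paradise_prob(x, y, t)
           - golden_paradise_prob(x, y / (1 - t))) < 1e-12
print(f"P(win {y} after tax, starting with {x}): "
      f"{lesser_paradise_prob(x, y, t):.4f}")
```

Algebraically, x/(x + y/(1 − t)) = (1 − t)x/((1 − t)x + y), which is exactly formula (1).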

pages: 518 words: 107,836

**
How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy)
** by
Benjamin Peters

Albert Einstein, American ideology, Andrei Shleifer, Benoit Mandelbrot, bitcoin, Brownian motion, Claude Shannon: information theory, cloud computing, cognitive dissonance, computer age, conceptual framework, continuation of politics by other means, crony capitalism, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, Dissolution of the Soviet Union, Donald Davies, double helix, Drosophila, Francis Fukuyama: the end of history, From Mathematics to the Technologies of Life and Death, hive mind, index card, informal economy, information asymmetry, invisible hand, Jacquard loom, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, linear programming, mandelbrot fractal, Marshall McLuhan, means of production, Menlo Park, Mikhail Gorbachev, mutually assured destruction, Network effects, Norbert Wiener, packet switching, Pareto efficiency, pattern recognition, Paul Erdős, Peter Thiel, Philip Mirowski, RAND corporation, rent-seeking, road to serfdom, Ronald Coase, scientific mainstream, Steve Jobs, Stewart Brand, stochastic process, technoutopianism, The Structural Transformation of the Public Sphere, transaction costs, Turing machine

During World War II, Wiener researched ways to integrate human gunner and analog computer agency in antiaircraft artillery fire-control systems, vaulting his wartime research on the feedback processes among humans and machines into a general science of communication and control, with the gun and gunner ensemble (the man and the antiaircraft gun cockpit) as the original image of the cyborg.5 To designate this new science of control and feedback mechanisms, Wiener coined the neologism cybernetics from the Greek word for steersman, which is a predecessor to the English term governor (there is a common consonant-vowel structure between cybern- and govern—k/g + vowel + b/v + ern). Wiener’s popular masterworks ranged further still, commingling complex mathematical analysis (especially noise and stochastic processes), exposition on the promise and threat associated with automated information technology, and various speculations of social, political, and religious natures.6 For Wiener, cybernetics was a working out of the implications of “the theory of messages” and the ways that information systems organized life, the world, and the cosmos. He found parallel structures in the communication and control systems operating in animal neural pathways, electromechanical circuits, and information flows in larger social systems.7 The fact that his work speaks in general mathematical terms also sped his work’s reception and eventual embrace by a wide range of readers, including Soviet philosopher-critics, as examined later.

…

Because the coauthors were sensitive to how language, especially foreign terms, packs in questions of international competition, the coauthors attempted to keep their language as technical and abstract as possible, reminding the reader that the cybernetic mind-machine analogy was central to the emerging science but should be understood only “from a functional point of view,” not a philosophical one.76 The technical and abstract mathematical language of Wiener’s cybernetics thus served as a political defense against Soviet philosopher-critics and as ballast for generalizing the coauthors’ ambitions for scientists in other fields. They employed a full toolbox of cybernetic terminology, including signal words such as homeostasis, feedback, entropy, reflex, and the binary digit. They also repeated Wiener and Shannon’s emphases on probabilistic, stochastic processes as the preferred mathematical medium for scripting behavioral patterns onto abstract logical systems, including a whole section that elaborated on the mind-machine analogy with special emphasis on the central processor as capable of memory, responsiveness, and learning.77 Wiener’s call for cyberneticists with “Leibnizian catholicity” of scientific interests was tempered into its negative form—a warning against disciplinary isolationism.78 On the last page of the article, the coauthors smoothed over the adoption of Wiener, an American, as foreign founder of Soviet cybernetics by summarizing and stylizing Wiener’s “sharp critique of capitalist society,” his pseudo-Marxist prediction of a “new industrial revolution” that would arise out of the “chaotic conditions of the capitalist market,” and his widely publicized postwar fear of “the replacement of common workers with mechanical robots.”79 A word play in Russian animates this last phrase: the Russian word for worker, or rabotnik, differs only by a vowel transformation from robot, the nearly universal term coined in 1927 by the playwright Karel Capek from the Czech word for “forced labor.”80 The first industrial revolution replaced the hand with the machine, or the rabotnik with the robot, and Wiener’s science, the coauthors dreamed, would help usher in a “second industrial revolution” in which the labor of the human mind could be carried out by intelligent machines, thus freeing, as Marx had intimated a century earlier, the mind to higher pursuits.

pages: 396 words: 112,748

**
Chaos: Making a New Science
** by
James Gleick

Benoit Mandelbrot, business cycle, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Murray Gell-Mann, Norbert Wiener, pattern recognition, Richard Feynman, Stephen Hawking, stochastic process, trade route

It seems to have been the issue on which many different fields of science were stuck—they were stuck on this aspect of the nonlinear behavior of systems. Now, nobody would have thought that the right background for this problem was to know particle physics, to know something about quantum field theory, and to know that in quantum field theory you have these structures known as the renormalization group. Nobody knew that you would need to understand the general theory of stochastic processes, and also fractal structures. “Mitchell had the right background. He did the right thing at the right time, and he did it very well. Nothing partial. He cleaned out the whole problem.” Feigenbaum brought to Los Alamos a conviction that his science had failed to understand hard problems—nonlinear problems. Although he had produced almost nothing as a physicist, he had accumulated an unusual intellectual background.

…

Astute readers, though, could tell that I preferred Joe Ford’s more freewheeling “cornucopia” style of definition—“Dynamics freed at last from the shackles of order and predictability…”—and still do. But everything evolves in the direction of specialization, and strictly speaking, “chaos” is now a very particular thing. When Yaneer Bar-Yam wrote a kilopage textbook, Dynamics of Complex Systems, in 2003, he took care of chaos proper in the first section of the first chapter. (“The first chapter, I have to admit, is 300 pages, okay?” he says.) Then came Stochastic Processes, Modeling Simulation, Cellular Automata, Computation Theory and Information Theory, Scaling, Renormalization, and Fractals, Neural Networks, Attractor Networks, Homogenous Systems, Inhomogenous Systems, and so on. Bar-Yam, the son of a high-energy physicist, had studied condensed matter physics and become an engineering professor at Boston University, but he left in 1997 to found the New England Complex Systems Institute.

pages: 119 words: 10,356

**
Topics in Market Microstructure
** by
Ilija I. Zovko

Brownian motion, computerized trading, continuous double auction, correlation coefficient, financial intermediation, Gini coefficient, information asymmetry, market design, market friction, market microstructure, Murray Gell-Mann, p-value, quantitative trading / quantitative ﬁnance, random walk, stochastic process, stochastic volatility, transaction costs

Quantitative Finance, 2:346–353, 2002. W. S. Choi, S. B. Lee, and P. I. Yu. Estimating the permanent and transitory components of the bid/ask spread. In C.-F. Lee, editor, Advances in Investment Analysis and Portfolio Management, Volume 5. Elsevier, 1998. T. Chordia and B. Swaminathan. Trading volume and cross-autocorrelations in stock returns. Journal of Finance, LV(2), April 2000. P. K. Clark. A subordinated stochastic process model with finite variance for speculative prices. Econometrica, 41(1):135–155, 1973. K. J. Cohen, S. F. Maier, R. A. Schwartz, and D. K. Whitcomb. Transaction costs, order placement strategy, and existence of the bid-ask spread. Journal of Political Economy, 89(2):287–305, 1981. K. J. Cohen, R. M. Conroy, and S. F. Maier. Order flow and the quality of the market. In Y. Amihud, T. Ho, and R.

pages: 425 words: 122,223

**
Capital Ideas: The Improbable Origins of Modern Wall Street
** by
Peter L. Bernstein

"Robert Solow", Albert Einstein, asset allocation, backtesting, Benoit Mandelbrot, Black-Scholes formula, Bonfire of the Vanities, Brownian motion, business cycle, buy and hold, buy low sell high, capital asset pricing model, corporate raider, debt deflation, diversified portfolio, Eugene Fama: efficient market hypothesis, financial innovation, financial intermediation, fixed income, full employment, implied volatility, index arbitrage, index fund, interest rate swap, invisible hand, John von Neumann, Joseph Schumpeter, Kenneth Arrow, law of one price, linear programming, Louis Bachelier, mandelbrot fractal, martingale, means of production, money market fund, Myron Scholes, new economy, New Journalism, Paul Samuelson, profit maximization, Ralph Nader, RAND corporation, random walk, Richard Thaler, risk/return, Robert Shiller, Robert Shiller, Ronald Reagan, stochastic process, Thales and the olive presses, the market place, The Predators' Ball, the scientific method, The Wealth of Nations by Adam Smith, Thorstein Veblen, transaction costs, transfer pricing, zero-coupon bond, zero-sum game

Paul Cootner, one of the leading finance scholars of the 1960s, once delivered this accolade: “So outstanding is his work that we can say that the study of speculative prices has its moment of glory at its moment of conception.”1 Bachelier laid the groundwork on which later mathematicians constructed a full-fledged theory of probability. He derived a formula that anticipated Einstein’s research into the behavior of particles subject to random shocks in space. And he developed the now universally used concept of stochastic processes, the analysis of random movements among statistical variables. Moreover, he made the first theoretical attempt to value such financial instruments as options and futures, which had active markets even in 1900. And he did all this in an effort to explain why prices in capital markets are impossible to predict! Bachelier’s opening paragraphs contain observations about “fluctuations on the Exchange” that could have been written today.

…

(LOR) Leland-Rubinstein Associates Leverage Leveraged buyouts Liquidity management market money Preference theory stock “Liquidity Preference as Behavior Toward Risk” (Tobin) Linear programming Loading charges: see Brokerage commissions London School of Economics (LSE) London Stock Exchange Macroeconomics Management Science Marginal utility concept “Market and Industry Factors in Stock Price Performance” (King) Market theories (general discussion). See also specific theories and types of securities competitive disaster avoidance invisible hand linear regression/econometric seasonal fluctuations stochastic process Mathematical economics Mathematical Theory of Non-Uniform Gases, The Maximum expected return concept McCormick Harvester Mean-Variance Analysis Mean-Variance Analysis in Portfolio Choice and Capital Markets (Markowitz) “Measuring the Investment Performance of Pension Funds,” report Mellon Bank Merck Merrill Lynch Minnesota Mining MIT MM Theory “Modern Portfolio Theory. How the New Investment Technology Evolved” Money Managers, The (“Adam Smith”) Money market funds Mortgages government-guaranteed prepaid rates on “‘Motionless’ Motion of Swift’s Flying Island, The” (Merton) Multiple manager risk analysis (MULMAN) Mutual funds individual investment in performance analysis of portfolio management and Value Line National Bureau of Economic Research National General Naval Research Logistics Quarterly New School for Social Research New York Stock Exchange volume of trading New York Times averages “Noise” (Black) Noise trading asset prices and inefficiency of October, 1987, crash OPEC countries Operations Research Optimal capital structure Optimal investment strategy: see Diversification; Portfolio(s), optimal “Optimization of a Quadratic Function Subject to Linear Constraints, The” (Markowitz) Optimization theory Options call contracts expected return on implicit out-of-the-money/in-the-money pricing formulas put valuation Options markets over-the-counter Pacific Stock Exchange Paul A.

pages: 247 words: 43,430

**
Think Complexity
** by
Allen B. Downey

Benoit Mandelbrot, cellular automata, Conway's Game of Life, Craig Reynolds: boids flock, discrete time, en.wikipedia.org, Frank Gehry, Gini coefficient, Guggenheim Bilbao, Laplace demon, mandelbrot fractal, Occupy movement, Paul Erdős, peer-to-peer, Pierre-Simon Laplace, sorting algorithm, stochastic process, strong AI, Thomas Kuhn: the structure of scientific revolutions, Turing complete, Turing machine, Vilfredo Pareto, We are the 99%

, Stanley Milgram sorting, Analysis of Basic Python Operations, Analysis of Basic Python Operations source node, Dijkstra spaceships, Structures, Life Patterns spanning cluster, Percolation special creation, Falsifiability spectral density, Spectral Density spherical cow, The Axes of Scientific Models square, Fractals stable sort, Analysis of Basic Python Operations Stanford Large Network Dataset Collection, Zipf, Pareto, and Power Laws state, Cellular Automata, Stephen Wolfram, Sand Piles stochastic process, The Axes of Scientific Models stock market, SOC, Causation, and Prediction StopIteration, Iterators __str__, Representing Graphs, Representing Graphs strategy, Prisoner’s Dilemma string concatenation, Analysis of Basic Python Operations string methods, Analysis of Basic Python Operations Strogatz, Steven, Paradigm Shift?, Watts and Strogatz The Structure of Scientific Revolutions, Paradigm Shift?

pages: 523 words: 143,139

**
Algorithms to Live By: The Computer Science of Human Decisions
** by
Brian Christian,
Tom Griffiths

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, algorithmic trading, anthropic principle, asset allocation, autonomous vehicles, Bayesian statistics, Berlin Wall, Bill Duvall, bitcoin, Community Supported Agriculture, complexity theory, constrained optimization, cosmological principle, cryptocurrency, Danny Hillis, David Heinemeier Hansson, delayed gratification, dematerialisation, diversification, Donald Knuth, double helix, Elon Musk, fault tolerance, Fellow of the Royal Society, Firefox, first-price auction, Flash crash, Frederick Winslow Taylor, George Akerlof, global supply chain, Google Chrome, Henri Poincaré, information retrieval, Internet Archive, Jeff Bezos, Johannes Kepler, John Nash: game theory, John von Neumann, Kickstarter, knapsack problem, Lao Tzu, Leonard Kleinrock, linear programming, martingale, Nash equilibrium, natural language processing, NP-complete, P = NP, packet switching, Pierre-Simon Laplace, prediction markets, race to the bottom, RAND corporation, RFC: Request For Comment, Robert X Cringely, Sam Altman, sealed-bid auction, second-price auction, self-driving car, Silicon Valley, Skype, sorting algorithm, spectrum auction, Stanford marshmallow experiment, Steve Jobs, stochastic process, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, urban planning, Vickrey auction, Vilfredo Pareto, Walter Mischel, Y Combinator, zero-sum game

Like the famous Heisenberg uncertainty principle of particle physics, which says that the more you know about a particle’s momentum the less you know about its position, the so-called bias-variance tradeoff expresses a deep and fundamental bound on how good a model can be—on what it’s possible to know and to predict. This notion is found in various places in the machine-learning literature. See, for instance, Geman, Bienenstock, and Doursat, “Neural Networks and the Bias/Variance Dilemma,” and Grenander, “On Empirical Spectral Analysis of Stochastic Processes.” in the Book of Kings: The bronze snake, known as Nehushtan, gets destroyed in 2 Kings 18:4. “pay good money to remove the tattoos”: Gilbert, Stumbling on Happiness. duels less than fifty years ago: If you’re not too fainthearted, you can watch video of a duel fought in 1967 at http://passerelle-production.u-bourgogne.fr/web/atip_insulte/Video/archive_duel_france.swf. as athletes overfit their tactics: For an interesting example of very deliberately overfitting fencing, see Harmenberg, Epee 2.0.

…

Nature 363 (1993): 315–319. Gould, Stephen Jay. “The Median Isn’t the Message.” Discover 6, no. 6 (1985): 40–42. Graham, Ronald L., Eugene L. Lawler, Jan Karel Lenstra, and Alexander H. G. Rinnooy Kan. “Optimization and Approximation in Deterministic Sequencing and Scheduling: A Survey.” Annals of Discrete Mathematics 5 (1979): 287–326. Grenander, Ulf. “On Empirical Spectral Analysis of Stochastic Processes.” Arkiv för Matematik 1, no. 6 (1952): 503–531. Gridgeman, T. “Geometric Probability and the Number π.” Scripta Mathematika 25, no. 3 (1960): 183–195. Griffiths, Thomas L., Charles Kemp, and Joshua B. Tenenbaum. “Bayesian Models of Cognition.” In The Cambridge Handbook of Computational Cognitive Modeling. Edited by Ron Sun. Cambridge, UK: Cambridge University Press, 2008. Griffiths, Thomas L., Falk Lieder, and Noah D.

pages: 665 words: 146,542

**
Money: 5,000 Years of Debt and Power
** by
Michel Aglietta

bank run, banking crisis, Basel III, Berlin Wall, bitcoin, blockchain, Bretton Woods, British Empire, business cycle, capital asset pricing model, capital controls, cashless society, central bank independence, collapse of Lehman Brothers, collective bargaining, corporate governance, David Graeber, debt deflation, dematerialisation, Deng Xiaoping, double entry bookkeeping, energy transition, eurozone crisis, Fall of the Berlin Wall, falling living standards, financial deregulation, financial innovation, Financial Instability Hypothesis, financial intermediation, floating exchange rates, forward guidance, Francis Fukuyama: the end of history, full employment, German hyperinflation, income inequality, inflation targeting, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), invention of writing, invisible hand, joint-stock company, Kenneth Arrow, Kickstarter, liquidity trap, margin call, means of production, money market fund, moral hazard, Nash equilibrium, Network effects, Northern Rock, oil shock, planetary scale, plutocrats, Plutocrats, price stability, purchasing power parity, quantitative easing, race to the bottom, reserve currency, secular stagnation, seigniorage, shareholder value, special drawing rights, special economic zone, stochastic process, the payments system, the scientific method, too big to fail, trade route, transaction costs, transcontinental railway, Washington Consensus

This mimetic model’s strength is that it reveals the emergence, from amid this general confusion, of a polarisation around one single object of desire recognised by all (see Box 1.1).21 Box 1.1 Theorem of mimetic convergence In a population of N agents (i = 1, …, N), on date t each agent has a belief u_i(t) regarding the debt that represents absolute liquidity. Agent i chooses his belief at t+1 by copying an agent j drawn at random, with probability p_ij for j = 1, …, N. So we have Pr{u_i(t+1) = u_j(t)} = p_ij, with Σ_j p_ij = 1 for each i. The mimetic interdependency is formalised as a Markovian stochastic process defined by the matrix P = (p_ij), such that the dynamic process is written U(t+1) = PU(t). The theorem shows that if the graph associated with P is strongly connected (matrix P does not break down into independent sub-matrices) and aperiodic (the process of revising beliefs is not cyclical), then the mimetic contagion converges towards unanimity around a belief, which can be any of the initial beliefs.
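The convergence claimed in Box 1.1 is easy to check by simulation. The sketch below is a minimal illustration, not the book's model: it assumes uniform copying probabilities p_ij = 1/N (a strongly connected, aperiodic case) and synchronous belief revision.

```python
import random

def mimetic_convergence(n_agents=20, n_beliefs=5, max_steps=10_000, seed=1):
    """Synchronous belief revision: every agent copies the period-t belief
    of an agent drawn uniformly at random (p_ij = 1/N). Returns the step
    at which unanimity is reached and the winning belief."""
    rng = random.Random(seed)
    beliefs = [rng.randrange(n_beliefs) for _ in range(n_agents)]
    for step in range(max_steps):
        if len(set(beliefs)) == 1:         # unanimity reached
            return step, beliefs[0]
        snapshot = beliefs[:]              # u(t): beliefs before revision
        beliefs = [snapshot[rng.randrange(n_agents)] for _ in range(n_agents)]
    return max_steps, None

steps, winner = mimetic_convergence()
print(steps, winner)
```

As the theorem states, the population ends unanimous on one of the initial beliefs; which one wins depends on the random draws.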

…

These rules are ways of describing a discretionary monetary policy, which is constrained by situations of uncertainty. These are systems of constrained discretion, in which the rule is used as a safeguard. Box 6.1 Interest rate rules 1) The Wicksellian norm, which the Riksbank used to break out of deflation in the 1930s, set a target for the price level and not the inflation rate. It is associated with the rate rule i_t = ī_t + φp_t, in which p_t is the log of the price index that is to be stabilised. ī_t follows a stochastic process that is independent of price movements but is correlated with the exogenous fluctuations in the natural rate r_t. The relationship defining the equilibrium nominal rate is i_t = r_t + E_t p_{t+1} − p_t. Eliminating i_t we get: E_t p_{t+1} = ī_t − r_t + (1 + φ)p_t. If we separate out the processes followed by r_t and ī_t, then p_t has a single non-explosive solution: p_t = Σ_{k≥0} (1 + φ)^−(k+1) E_t(r_{t+k} − ī_{t+k}). It follows that prices fluctuate around a long-term level: p* = (r̄ − ī̄)/φ. The long-term value of the general level of prices is independent of the demand for money.
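To make the algebra concrete: eliminating i_t from the rate rule i_t = ī_t + φp_t and the equilibrium relation i_t = r_t + E_t p_{t+1} − p_t gives E_t p_{t+1} = ī_t − r_t + (1 + φ)p_t, and under the simplifying (illustrative) assumption of constant processes r_t = r̄ and ī_t = ī̄, the non-explosive forward solution settles at p* = (r̄ − ī̄)/φ. A quick numerical check:

```python
def price_level_forward_sum(r_bar, i_bar, phi, horizon=200):
    """Truncated forward solution p = sum_{k>=0} (1+phi)^-(k+1) * (r - i),
    assuming constant processes r_t = r_bar and i_bar_t = i_bar."""
    return sum((r_bar - i_bar) / (1 + phi) ** (k + 1) for k in range(horizon))

phi, r_bar, i_bar = 0.5, 0.04, 0.01          # illustrative parameter values
p_star = price_level_forward_sum(r_bar, i_bar, phi)
print(round(p_star, 4))                      # geometric sum: (r_bar - i_bar)/phi
```

The truncated sum matches the closed form (r̄ − ī̄)/φ and is a fixed point of the recursion E p_{t+1} = ī̄ − r̄ + (1 + φ)p, i.e. the long-run price level the Wicksellian norm stabilises around.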

pages: 442 words: 39,064

**
Why Stock Markets Crash: Critical Events in Complex Financial Systems
** by
Didier Sornette

Asian financial crisis, asset allocation, Berlin Wall, Bretton Woods, Brownian motion, business cycle, buy and hold, capital asset pricing model, capital controls, continuous double auction, currency peg, Deng Xiaoping, discrete time, diversified portfolio, Elliott wave, Erdős number, experimental economics, financial innovation, floating exchange rates, frictionless, frictionless market, full employment, global village, implied volatility, index fund, information asymmetry, intangible asset, invisible hand, John von Neumann, joint-stock company, law of one price, Louis Bachelier, mandelbrot fractal, margin call, market bubble, market clearing, market design, market fundamentalism, mental accounting, moral hazard, Network effects, new economy, oil shock, open economy, pattern recognition, Paul Erdős, Paul Samuelson, quantitative trading / quantitative ﬁnance, random walk, risk/return, Ronald Reagan, Schrödinger's Cat, selection bias, short selling, Silicon Valley, South Sea Bubble, statistical model, stochastic process, stocks for the long run, Tacoma Narrows Bridge, technological singularity, The Coming Technological Singularity, The Wealth of Nations by Adam Smith, Tobin tax, total factor productivity, transaction costs, tulip mania, VA Linux, Y2K, yield curve

General proof that properly anticipated prices are random. Samuelson has proved a general theorem showing that the concept that prices are unpredictable can actually be deduced rigorously [357] from a model that hypothesizes that a stock’s present price p_t is set at the expected discounted value of its future dividends d_t, d_{t+1}, d_{t+2}, … (which are supposed to be random variables generated according to any general (but known) stochastic process): p_t = d_t + δ_1 d_{t+1} + δ_1 δ_2 d_{t+2} + δ_1 δ_2 δ_3 d_{t+3} + · · · (3) where the discount factors δ_i = 1 − r < 1, which can fluctuate from one time period to the next, account for the depreciation of a future price calculated at present due to the nonzero consumption price index r. We see that p_t = d_t + δ_1 p_{t+1}, and thus the expectation E p_{t+1} of p_{t+1} conditioned on the knowledge of the present price p_t is E p_{t+1} = (p_t − d_t)/δ_1 (4) This shows that, barring the drift due to the inflation and the dividend, the price increment does not have a systematic component or memory of the past and is thus random.
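Relation (4) says that, once dividends and discounting are accounted for, the price has no systematic component. Under the simplifying assumptions of a constant discount factor δ and i.i.d. dividends (a special case of Samuelson's general setting, which allows any known stochastic process), the valuation collapses to p_t = d_t + δ·E[d]/(1 − δ), and the conditional-expectation relation can be verified by Monte Carlo:

```python
import random

def check_martingale(delta=0.95, n_paths=100_000, seed=7):
    """Verify E[p_{t+1}] = (p_t - d_t)/delta by Monte Carlo, assuming a
    constant discount factor delta and i.i.d. uniform(0.5, 1.5) dividends
    (mean 1.0) -- illustrative special cases of the general model."""
    rng = random.Random(seed)
    d_mean = 1.0
    tail = sum(delta ** k * d_mean for k in range(1, 2_000))  # sum_{k>=1} delta^k E[d]
    d_t = rng.uniform(0.5, 1.5)        # today's dividend draw
    p_t = d_t + tail                   # present price: expected discounted dividends
    # Monte Carlo estimate of E[p_{t+1}] = E[d_{t+1}] + tail
    est = sum(rng.uniform(0.5, 1.5) + tail for _ in range(n_paths)) / n_paths
    return est, (p_t - d_t) / delta

est, target = check_martingale()
print(round(est, 2), round(target, 2))
```

The simulated expectation of tomorrow's price matches (p_t − d_t)/δ, so knowing today's price tells you nothing beyond the dividend-and-discount drift.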

…

Inductive reasoning and bounded rationality (The El Farol Problem), American Economic Review (Papers and Proceedings) 84. 18. Arthur, W., Lane, D., and Durlauf, S., Editors (1997). The economy as an evolving complex system II (Addison-Wesley, Redwood City). 19. Arthur, W. B. (1987). Self-reinforcing mechanisms in economics, Center for Economic Policy Research 111, 1–20. 20. Arthur, W. B., Ermoliev, Y. M., and Kaniovsky, Y. M. (1984). Strong laws for a class of path-dependent stochastic processes with applications, in Proceedings of the International Conference on Stochastic Optimization, A. Shiryaev and R. Wets, editors (Springer-Verlag, New York), pp. 287–300. 21. Arthur, W. B., Holland, J. H., LeBaron, B., Palmer, R., and Taylor, P. (1997). Asset pricing under endogenous expectations in an artiﬁcial stock market, in The Economy as an Evolving Complex System II, W. Arthur, D. Lane, and S.

pages: 651 words: 180,162

**
Antifragile: Things That Gain From Disorder
** by
Nassim Nicholas Taleb

Air France Flight 447, Andrei Shleifer, banking crisis, Benoit Mandelbrot, Berlin Wall, Black Swan, business cycle, Chuck Templeton: OpenTable:, commoditize, creative destruction, credit crunch, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, discrete time, double entry bookkeeping, Emanuel Derman, epigenetics, financial independence, Flash crash, Gary Taubes, George Santayana, Gini coefficient, Henri Poincaré, high net worth, hygiene hypothesis, Ignaz Semmelweis: hand washing, informal economy, invention of the wheel, invisible hand, Isaac Newton, James Hargreaves, Jane Jacobs, joint-stock company, joint-stock limited liability company, Joseph Schumpeter, Kenneth Arrow, knowledge economy, Lao Tzu, Long Term Capital Management, loss aversion, Louis Pasteur, mandelbrot fractal, Marc Andreessen, meta analysis, meta-analysis, microbiome, money market fund, moral hazard, mouse model, Myron Scholes, Norbert Wiener, pattern recognition, Paul Samuelson, placebo effect, Ponzi scheme, principal–agent problem, purchasing power parity, quantitative trading / quantitative ﬁnance, Ralph Nader, random walk, Ray Kurzweil, rent control, Republic of Letters, Ronald Reagan, Rory Sutherland, selection bias, Silicon Valley, six sigma, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, stochastic process, stochastic volatility, Thales and the olive presses, Thales of Miletus, The Great Moderation, the new new thing, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Malthus, too big to fail, transaction costs, urban planning, Vilfredo Pareto, Yogi Berra, Zipf's Law

Next we turn to a central distinction between the things that like stress and other things that don’t. 1 Cato was the statesman who, three books ago (Fooled by Randomness), expelled all philosophers from Rome. 2 This little bit of effort seems to activate the switch between two distinct mental systems, one intuitive and the other analytical, what psychologists call “system 1” and “system 2.” 3 There is nothing particularly “white” in white noise; it is simply random noise that follows a Normal Distribution. 4 The obvious has not been tested empirically: Can the occurrence of extreme events be predicted from past history? Alas, according to a simple test: no, sorry. 5 Set a simple filtering rule: all members of a species need to have a neck forty centimeters long in order to survive. After a few generations, the surviving population would have, on average, a neck longer than forty centimeters. (More technically, a stochastic process subjected to an absorbing barrier will have an observed mean higher than the barrier.) 6 The French have a long series of authors who owe part of their status to their criminal record—which includes the poet Ronsard, the writer Jean Genet, and many others. CHAPTER 3 The Cat and the Washing Machine Stress is knowledge (and knowledge is stress)—The organic and the mechanical—No translator needed, for now—Waking up the animal in us, after two hundred years of modernity The bold conjecture made here is that everything that has life in it is to some extent antifragile (but not the reverse).
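The claim in footnote 5, that a population filtered by an absorbing barrier shows an observed mean above the barrier, can be checked directly. The neck-length numbers below are illustrative choices, not from the text:

```python
import random, statistics

def surviving_mean(barrier=40.0, mu=40.0, sigma=5.0, n=200_000, seed=3):
    """Neck lengths drawn from Normal(mu, sigma); only individuals at or
    above the barrier survive the filter. All numbers are illustrative."""
    rng = random.Random(seed)
    survivors = [x for x in (rng.gauss(mu, sigma) for _ in range(n)) if x >= barrier]
    return statistics.fmean(survivors)

m = surviving_mean()
print(round(m, 2))
```

With the barrier at the population mean, the survivor mean sits roughly σ·√(2/π) ≈ 4 cm above the barrier: the filter alone creates the apparent shift, with no change in the underlying process.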

…

My dream—the solution—is that we would have a National Entrepreneur Day, with the following message: Most of you will fail, disrespected, impoverished, but we are grateful for the risks you are taking and the sacrifices you are making for the sake of the economic growth of the planet and pulling others out of poverty. You are at the source of our antifragility. Our nation thanks you. 1 A technical comment on why the adaptability criterion is innocent of probability (the nontechnical reader should skip the rest of this note). The property in a stochastic process of not seeing at any time period t what would happen in time after t, that is, any period higher than t, hence reacting with a lag, an incompressible lag, is called nonanticipative strategy, a requirement of stochastic integration. The incompressibility of the lag is central and unavoidable. Organisms can only have nonanticipative strategies—hence nature can only be nonpredictive. This point is not trivial at all, and has even confused probabilists such as the Russian School represented by Stratonovich and the users of his method of integration, who fell into the common mental distortion of thinking that the future sends some signal detectable by us.
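The nonanticipative requirement has a visible numerical signature. For the integral of W against its own increments, the left-point (nonanticipative, Itô) Riemann sum converges to (W_T² − T)/2, while a midpoint rule in the style of Stratonovich, which "peeks" half an increment ahead, gives W_T²/2. A single-path sketch:

```python
import random

def ito_vs_stratonovich(n_steps=100_000, T=1.0, seed=11):
    """Accumulate sum W*dW along one Brownian path two ways:
    left-point (nonanticipative) vs midpoint (uses the future increment)."""
    rng = random.Random(seed)
    dt = T / n_steps
    w = ito = strat = 0.0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, dt ** 0.5)
        ito += w * dw                   # only information available at time t
        strat += (w + 0.5 * dw) * dw    # midpoint: requires the coming increment
        w += dw
    return ito, strat, w

ito, strat, w_T = ito_vs_stratonovich()
print(ito, strat)
```

The two sums differ by roughly T/2: whether or not the integrand is allowed to see ahead by even half an increment changes the limit, which is the lag Taleb calls incompressible.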

pages: 607 words: 185,487

**
Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed
** by
James C. Scott

agricultural Revolution, business cycle, clean water, colonial rule, commoditize, deskilling, facts on the ground, germ theory of disease, informal economy, invention of writing, invisible hand, Jane Jacobs, Kenneth Arrow, land reform, land tenure, Louis Pasteur, new economy, New Urbanism, Potemkin village, price mechanism, profit maximization, road to serfdom, Silicon Valley, stochastic process, the built environment, The Death and Life of Great American Cities, the scientific method, Thorstein Veblen, urban decay, urban planning, urban renewal, working poor

Kollontay's point of departure, like Luxemburg's, is an assumption about what kinds of tasks the making of revolutions and the creating of new forms of production are. For both of them, such tasks are voyages in uncharted waters. There may be some rules of thumb, but there can be no blueprints or battle plans drawn up in advance; the numerous unknowns in the equation make a one-step solution inconceivable. In more technical language, such goals can be approached only by a stochastic process of successive approximations, trial and error, experiment, and learning through experience. The kind of knowledge required in such endeavors is not deductive knowledge from first principles but rather what Greeks of the classical period called metis, a concept to which we shall return. Usually translated, inadequately, as "cunning," metis is better understood as the kind of knowledge that can be acquired only by long practice at similar but rarely identical tasks, which requires constant adaptation to changing circumstances.

…

Metis is not merely the specification of local values (such as the local mean temperature and rainfall) made in order to successfully apply a generic formula to a local case. Taking language as a parallel, I believe that the rule of thumb is akin to formal grammar, whereas metis is more like actual speech. Metis is no more derivative of general rules than speech is derivative of grammar. Speech develops from the cradle by imitation, use, trial and error. Learning a mother tongue is a stochastic process: a process of successive, self-correcting approximations. We do not begin by learning the alphabet, individual words, parts of speech, and rules of grammar and then trying to use them all in order to produce a grammatically correct sentence. Moreover, as Oakeshott indicates, a knowledge of the rules of speech by themselves is compatible with a complete inability to speak intelligible sentences.

**
Longevity: To the Limits and Beyond (Research and Perspectives in Longevity)
** by
Jean-Marie Robine,
James W. Vaupel,
Bernard Jeune,
Michel Allard

computer age, conceptual framework, demographic transition, Drosophila, epigenetics, life extension, longitudinal study, phenotype, stem cell, stochastic process

Finch. Summary: In this essay, I inquire about little-explored sources of non-genetic factors in individual life spans that are displayed between individuals with identical genotypes in controlled laboratory environments. The numbers of oocytes found in the ovaries of inbred mice, for example, show a > 5-fold range between individuals. Smaller, but still extensive, variations are also indicated for hippocampal neurons. These variations in cell number can be attributed to stochastic processes during organogenesis, i.e. "developmental noise in cell fate determination." They may be of general importance to functional changes during aging, as argued for reproductive senescence in females, which is strongly linked to the time of oocyte depletion. More generally, I hypothesize that variations in cell numbers during development result in individual differences in the reserve cell numbers which, in turn, set critical thresholds for dysfunctions and sources of morbidity during aging.

pages: 321

**
Finding Alphas: A Quantitative Approach to Building Trading Strategies
** by
Igor Tulchinsky

algorithmic trading, asset allocation, automated trading system, backtesting, barriers to entry, business cycle, buy and hold, capital asset pricing model, constrained optimization, corporate governance, correlation coefficient, credit crunch, Credit Default Swap, discounted cash flows, discrete time, diversification, diversified portfolio, Eugene Fama: efficient market hypothesis, financial intermediation, Flash crash, implied volatility, index arbitrage, index fund, intangible asset, iterative process, Long Term Capital Management, loss aversion, market design, market microstructure, merger arbitrage, natural language processing, passive investing, pattern recognition, performance metric, popular capitalism, prediction markets, price discovery process, profit motive, quantitative trading / quantitative ﬁnance, random walk, Renaissance Technologies, risk tolerance, risk-adjusted returns, risk/return, selection bias, sentiment analysis, shareholder value, Sharpe ratio, short selling, Silicon Valley, speech recognition, statistical arbitrage, statistical model, stochastic process, survivorship bias, systematic trading, text mining, transaction costs, Vanguard fund, yield curve

The disadvantage is that each of these approaches presumes some specific data model. Trend analysis is an example of applications of statistical models in alpha research. In particular, a hidden Markov model is frequently utilized for that purpose, based on the belief that price movements of the stock market are not totally random. In a statistics framework, the hidden Markov model is a composition of two or more stochastic processes: a hidden Markov chain, which accounts for the temporal variability, and an observable process, which accounts for the spectral variability. In this approach, the pattern of the stock market behavior is determined based on these probability values at a particular time. The goal is to figure out the hidden state sequence given the observation sequence, extract the long-term probability distribution, and identify the current trend relative to that distribution.
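A minimal sketch of the two-layer structure described above: a hidden Markov chain over market regimes plus a Gaussian observable process, filtered with the standard forward recursion. All state names, transition probabilities, and emission parameters below are hypothetical illustrations, not values from the text:

```python
import math

# Hypothetical two-regime model; every number here is an illustrative assumption.
STATES = ("up_trend", "down_trend")
TRANS = {"up_trend": {"up_trend": 0.95, "down_trend": 0.05},
         "down_trend": {"up_trend": 0.10, "down_trend": 0.90}}
EMIT = {"up_trend": (0.001, 0.01),      # (mean, std) of the observed daily return
        "down_trend": (-0.002, 0.02)}

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def forward_filter(returns, prior=(0.5, 0.5)):
    """P(hidden regime | returns so far): the forward recursion, alternating
    a predict step (hidden chain) and an update step (emission densities)."""
    belief = dict(zip(STATES, prior))
    for r in returns:
        pred = {s: sum(belief[p] * TRANS[p][s] for p in STATES) for s in STATES}
        post = {s: pred[s] * normal_pdf(r, *EMIT[s]) for s in STATES}
        z = sum(post.values())
        belief = {s: post[s] / z for s in STATES}
    return belief

belief = forward_filter([0.002, 0.001, 0.003, -0.03, -0.025, -0.04])
print(belief)
```

After the run of large negative returns, the filtered probability mass shifts to the down-trend regime: this filtered distribution is what "identifying the current trend relative to the long-term distribution" operates on.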

pages: 357 words: 98,854

**
Epigenetics Revolution: How Modern Biology Is Rewriting Our Understanding of Genetics, Disease and Inheritance
** by
Nessa Carey

Albert Einstein, British Empire, Build a better mousetrap, conceptual framework, discovery of penicillin, double helix, Drosophila, epigenetics, Fellow of the Royal Society, life extension, mouse model, phenotype, selective serotonin reuptake inhibitor (SSRI), stem cell, stochastic process, Thomas Kuhn: the structure of scientific revolutions, twin studies

But over decades all these mild abnormalities in gene expression, resulting from a slightly inappropriate set of chromatin modifications, may lead to a gradually increasing functional impairment. Clinically, we don’t recognise this until it passes some invisible threshold and the patient begins to show symptoms. The epigenetic variation that occurs in developmental programming is at heart a predominantly random process, normally referred to as ‘stochastic’. This stochastic process may account for a significant amount of the variability that develops between the MZ twins who opened this chapter. Random fluctuations in epigenetic modifications during early development lead to non-identical patterns of gene expression. These become epigenetically set and exaggerated over the years, until eventually the genetically identical twins become phenotypically different, sometimes in the most dramatic of ways.

pages: 356 words: 105,533

**
Dark Pools: The Rise of the Machine Traders and the Rigging of the U.S. Stock Market
** by
Scott Patterson

algorithmic trading, automated trading system, banking crisis, bash_history, Bernie Madoff, butterfly effect, buttonwood tree, buy and hold, Chuck Templeton: OpenTable:, cloud computing, collapse of Lehman Brothers, computerized trading, creative destruction, Donald Trump, fixed income, Flash crash, Francisco Pizarro, Gordon Gekko, Hibernia Atlantic: Project Express, High speed trading, Joseph Schumpeter, latency arbitrage, Long Term Capital Management, Mark Zuckerberg, market design, market microstructure, pattern recognition, pets.com, Ponzi scheme, popular electronics, prediction markets, quantitative hedge fund, Ray Kurzweil, Renaissance Technologies, Sergey Aleynikov, Small Order Execution System, South China Sea, Spread Networks laid a new fibre optics cable between New York and Chicago, stealth mode startup, stochastic process, transaction costs, Watson beat the top human players on Jeopardy!, zero-sum game

The following ad for Getco, for instance, appeared in January 2012: CHICAGO, IL: Work with inter-disciplinary teams of traders & technologists & use trading models to trade profitably on major electronic exchanges; use statistical & mathematical approaches & develop new models to leverage trading capabilities. Must have Master’s in Math, Statistics, Physical Science, Computer Science, or Engineering w/min GPA of 3.4/4.0. Must have proven graduate level coursework in 2 or more of the following: Stochastic Processes, Statistical Methods, Mathematical Finance, Applied Numerical Methods, Machine Learning. Then, in the summer of 2011, a new contender for the high-frequency crown had emerged. Virtu Financial, the computer trading outfit that counted former Island attorney and Nasdaq executive Chris Concannon as a partner, merged with EWT, a California speed-trading operation that operated on exchanges around the world.

pages: 411 words: 108,119

**
The Irrational Economist: Making Decisions in a Dangerous World
** by
Erwann Michel-Kerjan,
Paul Slovic

"Robert Solow", Andrei Shleifer, availability heuristic, bank run, Black Swan, business cycle, Cass Sunstein, clean water, cognitive dissonance, collateralized debt obligation, complexity theory, conceptual framework, corporate social responsibility, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-subsidies, Daniel Kahneman / Amos Tversky, endowment effect, experimental economics, financial innovation, Fractional reserve banking, George Akerlof, hindsight bias, incomplete markets, information asymmetry, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Isaac Newton, iterative process, Kenneth Arrow, Loma Prieta earthquake, London Interbank Offered Rate, market bubble, market clearing, money market fund, moral hazard, mortgage debt, Pareto efficiency, Paul Samuelson, placebo effect, price discrimination, price stability, RAND corporation, Richard Thaler, Robert Shiller, Robert Shiller, Ronald Reagan, source of truth, statistical model, stochastic process, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, ultimatum game, University of East Anglia, urban planning, Vilfredo Pareto

The existing literature is based on a completely standard expected utility modelling, whereby the welfare of each future generation is evaluated by computing its expected utility based on a probability distribution for the GDP per capita that it will enjoy. A major difficulty, however, is that these probability distributions are ambiguous, in the sense that they are not based on scientific arguments, or on a database large enough to make them completely objective. Indeed, more than one stochastic process is compatible with existing methods for describing economic growth. The Ellsberg paradox tells us that most human beings are averse to ambiguity, which means that they tend to overestimate the probability of the worst-case scenario when computing their subjective expected utility. This suggests that agents systematically violate Savage’s “Sure Thing Principle” (Savage, 1954). More precisely, it seems that the way we evaluate uncertain prospects depends on how precise our information about the underlying probabilities is.

pages: 354 words: 26,550

**
High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems
** by
Irene Aldridge

algorithmic trading, asset allocation, asset-backed security, automated trading system, backtesting, Black Swan, Brownian motion, business cycle, business process, buy and hold, capital asset pricing model, centralized clearinghouse, collapse of Lehman Brothers, collateralized debt obligation, collective bargaining, computerized trading, diversification, equity premium, fault tolerance, financial intermediation, fixed income, high net worth, implied volatility, index arbitrage, information asymmetry, interest rate swap, inventory management, law of one price, Long Term Capital Management, Louis Bachelier, margin call, market friction, market microstructure, martingale, Myron Scholes, New Journalism, p-value, paper trading, performance metric, profit motive, purchasing power parity, quantitative trading / quantitative ﬁnance, random walk, Renaissance Technologies, risk tolerance, risk-adjusted returns, risk/return, Sharpe ratio, short selling, Small Order Execution System, statistical arbitrage, statistical model, stochastic process, stochastic volatility, systematic trading, trade route, transaction costs, value at risk, yield curve, zero-sum game

In the Garman (1976) model, the market has one monopolistic market maker (dealer). The market maker is responsible for deciding on and then setting bid and ask prices, receiving all orders, and clearing trades. The market maker’s objective is to maximize profits while avoiding bankruptcy or failure. The latter arises whenever the market maker runs out of inventory or cash. Both buy and sell orders arrive as independent stochastic processes. The model solution for optimal bid and ask prices lies in the estimation of the rates at which a unit of cash (e.g., a dollar or a “clip” of 10 million in FX) “arrives” to the market maker when a customer comes in to buy securities (pays money to the dealer) and “departs” the market maker when a customer comes in to sell (the dealer pays the customer). Suppose the arrival rate of customer orders to buy a security at the market ask price p_a is denoted λ_a.
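Garman's setup can be illustrated by simulation: buy and sell orders arrive as independent Poisson processes, and the dealer fails when inventory or cash is exhausted. With symmetric order flow the inventory follows a random walk, so ruin eventually becomes near-certain even though each round trip earns the spread, which is the failure condition the model is built around. All parameter values below are hypothetical:

```python
import random

def dealer_ruin_freq(lam_buy=1.0, lam_sell=1.0, ask=10.2, bid=9.8,
                     cash0=100.0, inv0=5, horizon=5_000.0,
                     n_runs=400, seed=5):
    """Monte Carlo failure frequency for a Garman-style monopolist dealer.
    Buy and sell orders arrive as independent Poisson processes; the dealer
    fails when inventory or cash is exhausted. All values are hypothetical."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_runs):
        cash, inv, t = cash0, inv0, 0.0
        while t < horizon:
            # next event time in the merged Poisson stream of both order flows
            t += rng.expovariate(lam_buy + lam_sell)
            if rng.random() < lam_buy / (lam_buy + lam_sell):
                inv -= 1; cash += ask    # customer buys at the ask
            else:
                inv += 1; cash -= bid    # customer sells at the bid
            if inv < 0 or cash < 0:      # ruin: out of stock or out of cash
                failures += 1
                break
    return failures / n_runs

p = dealer_ruin_freq()
print(p)
```

Over a long enough horizon the failure frequency approaches one, which is why the dealer's bid/ask choice must manage the order-arrival rates, not just the per-trade profit.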

pages: 370 words: 107,983

**
Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All
** by
Robert Elliott Smith

Ada Lovelace, affirmative action, AI winter, Alfred Russel Wallace, Amazon Mechanical Turk, animal electricity, autonomous vehicles, Black Swan, British Empire, cellular automata, citizen journalism, Claude Shannon: information theory, combinatorial explosion, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, desegregation, discovery of DNA, Douglas Hofstadter, Elon Musk, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, low skilled workers, Mark Zuckerberg, mass immigration, meta analysis, meta-analysis, mutually assured destruction, natural language processing, new economy, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, women in the workforce

v=GGgiGtJk7MA 2The Pew Research Center, 2017, Public Trust in Government: 1958–2017, www.people-press.org/2017/12/14/public-trust-in-government-1958-2017/ 3Gallup, 2018, Confidence in Institutions, https://news.gallup.com/poll/1597/confidence-institutions.aspx 4This in fact led to a protracted conversation on the difference between UK, European and American methods of presenting odds, which led to a wasted afternoon of my graduate studies, a sleepless night working all the relationships out, an inferior mid-term exam score in my Stochastic Processes course and hard work to get an A in the end. So I have omitted this for the reader’s benefit. 5Colin E. Beech, 2008, The Grail and the Golem: The Sociology of Aleatory Artifacts. PhD dissertation. Rensselaer Polytechnic Institute, Troy, NY. Advisor(s) Sal Restivo. AAI3342844. https://dl.acm.org/citation.cfm?id=1627267 6Prakash Gorroochurn, 2012, Some Laws and Problems of Classical Probability and How Cardano Anticipated Them.

**
Data Mining: Concepts and Techniques: Concepts and Techniques
** by
Jiawei Han,
Micheline Kamber,
Jian Pei

bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, distributed generation, finite state, information retrieval, iterative process, knowledge worker, linked data, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, random walk, recommendation engine, RFID, semantic web, sentiment analysis, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, Thomas Bayes, web application

A straightforward adaptation of a clustering method for outlier detection can be very costly, and thus does not scale up well for large data sets. Clustering-based outlier detection methods are discussed in detail in Section 12.5. 12.3. Statistical Approaches As with statistical methods for clustering, statistical methods for outlier detection make assumptions about data normality. They assume that the normal objects in a data set are generated by a stochastic process (a generative model). Consequently, normal objects occur in regions of high probability for the stochastic model, and objects in the regions of low probability are outliers. The general idea behind statistical methods for outlier detection is to learn a generative model fitting the given data set, and then identify those objects in low-probability regions of the model as outliers. However, there are many different ways to learn generative models.
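The generative idea in this excerpt can be sketched in its simplest parametric form: fit a univariate Gaussian to the data, then flag objects that fall in its low-probability tails. This is a minimal illustration, not code from the book, and the data values and cutoff are invented:

```python
import math

def gaussian_outliers(data, threshold=3.0):
    """Fit a univariate Gaussian (the generative model) to the data and
    flag objects more than `threshold` standard deviations from the mean,
    i.e. objects in the model's low-probability regions."""
    n = len(data)
    mu = sum(data) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in data) / n)
    return [x for x in data if abs(x - mu) > threshold * sigma]

values = [24.0, 28.9, 28.9, 29.0, 29.1, 29.1, 29.2, 29.2, 29.3, 29.4]
print(gaussian_outliers(values, threshold=2.0))  # → [24.0]
```

The same idea generalizes to mixtures of Gaussians or multivariate models; only the density estimate changes, not the "low probability ⇒ outlier" rule.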

…

The kernel density approximation of the probability density function is f̂(x) = (1/(nh)) Σ_{i=1}^{n} K((x − x_i)/h) (12.9), where K() is a kernel and h is the bandwidth serving as a smoothing parameter. Once the probability density function of a data set is approximated through kernel density estimation, we can use the estimated density function to detect outliers. For an object, o, f̂(o) gives the estimated probability that the object is generated by the stochastic process. If f̂(o) is high, then the object is likely normal. Otherwise, o is likely an outlier. This step is often similar to the corresponding step in parametric methods. In summary, statistical methods for outlier detection learn models from data to distinguish normal data objects from outliers. An advantage of using statistical methods is that the outlier detection may be statistically justifiable.
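The nonparametric variant described here can be sketched directly from the estimator: a Gaussian-kernel density estimate, with low-density points flagged as outliers. The sample values, bandwidth, and density cutoff below are made up for illustration:

```python
import math

def kde_density(x, sample, h):
    """Kernel density estimate f_hat(x) = (1/(n*h)) * sum K((x - x_i)/h),
    using the standard Gaussian kernel K."""
    k = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    return sum(k((x - xi) / h) for xi in sample) / (len(sample) * h)

def kde_outliers(sample, h, cutoff):
    """Flag objects whose estimated density falls below the cutoff."""
    return [x for x in sample if kde_density(x, sample, h) < cutoff]

data = [1.0, 1.2, 1.3, 1.5, 1.7, 2.0, 2.1, 9.0]
print(kde_outliers(data, h=0.5, cutoff=0.15))  # → [9.0]
```

In practice the bandwidth h is the sensitive choice: too small and every gap looks like a low-density region, too large and genuine outliers are smoothed into the bulk.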

pages: 298 words: 43,745

**
Understanding Sponsored Search: Core Elements of Keyword Advertising
** by
Jim Jansen

AltaVista, barriers to entry, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, correlation does not imply causation, en.wikipedia.org, first-price auction, information asymmetry, information retrieval, intangible asset, inventory management, life extension, linear programming, longitudinal study, megacity, Nash equilibrium, Network effects, PageRank, place-making, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social web, software as a service, stochastic process, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, Vilfredo Pareto, yield management

Indianapolis, IN: Wiley. [2] Peterson, E. 2004. Web Analytics Demystified: A Marketer’s Guide to Understanding How Your Web Site Affects Your Business. New York: Celilo Group Media. [3] Pedrick, J. H. and Zufryden, F. S. 1991. “Evaluating the Impact of Advertising Media Plans: A Model of Consumer Purchase Dynamics Using Single Source Data.” Marketing Science, vol. 10(2), pp. 111–130. [4] Penniman, W. D. 1975. “A Stochastic Process Analysis of Online User Behavior.” In The Annual Meeting of the American Society for Information Science, Washington, DC, pp. 147–148. [5] Meister, D. and Sullivan, D. 1967. “Evaluation of User Reactions to a Prototype On-Line Information Retrieval System: Report to NASA by the Bunker-Ramo Corporation. Report Number NASA CR-918.” Bunker-Ramo Corporation, Oak Brook, IL. [6] Directors, A.

pages: 523 words: 112,185

**
Doing Data Science: Straight Talk From the Frontline
** by
Cathy O'Neil,
Rachel Schutt

Amazon Mechanical Turk, augmented reality, Augustin-Louis Cauchy, barriers to entry, Bayesian statistics, bioinformatics, computer vision, correlation does not imply causation, crowdsourcing, distributed generation, Edward Snowden, Emanuel Derman, fault tolerance, Filter Bubble, finite state, Firefox, game design, Google Glasses, index card, information retrieval, iterative process, John Harrison: Longitude, Khan Academy, Kickstarter, Mars Rover, Nate Silver, natural language processing, Netflix Prize, p-value, pattern recognition, performance metric, personalized medicine, pull request, recommendation engine, rent-seeking, selection bias, Silicon Valley, speech recognition, statistical model, stochastic process, text mining, the scientific method, The Wisdom of Crowds, Watson beat the top human players on Jeopardy!, X Prize

The degrees themselves aren’t giving us a real understanding of how interconnected a given node is, though, so in the next iteration, add the degrees of all the neighbors of a given node, again scaled. Keep iterating on this, adding degrees of neighbors one further step out each time. In the limit as this iterative process goes on forever, we’ll get the eigenvalue centrality vector. A First Example of Random Graphs: The Erdos-Renyi Model Let’s work out a simple example where a network can be viewed as a single realization of an underlying stochastic process. Namely, where the existence of a given edge follows a probability distribution, and all the edges are considered independently. Say we start with n nodes. Then there are n(n − 1)/2 pairs of nodes, or dyads, which can either be connected by an (undirected) edge or not. Then there are 2^(n(n − 1)/2) possible observed networks. The simplest underlying distribution one can place on the individual edges is called the Erdos-Renyi model, which assumes that for every pair of nodes (i, j), an edge exists between the two nodes with probability p.
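Sampling one realization of this model is a one-liner over the dyads; a minimal sketch (function name and parameters chosen here for illustration):

```python
import random

def erdos_renyi(n, p, seed=None):
    """Sample one realization of the Erdos-Renyi G(n, p) model: each of the
    n(n-1)/2 dyads gets an (undirected) edge independently with probability p."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]

edges = erdos_renyi(100, 0.05, seed=42)
# Expected edge count is p * n(n-1)/2 = 0.05 * 4950 = 247.5
print(len(edges))
```

Each run is exactly "a single realization of an underlying stochastic process": rerunning with a different seed draws another network from the same distribution.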

pages: 561 words: 120,899

**
The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy
** by
Sharon Bertsch McGrayne

Bayesian statistics, bioinformatics, British Empire, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, double helix, Edmond Halley, Fellow of the Royal Society, full text search, Henri Poincaré, Isaac Newton, Johannes Kepler, John Markoff, John Nash: game theory, John von Neumann, linear programming, longitudinal study, meta analysis, meta-analysis, Nate Silver, p-value, Pierre-Simon Laplace, placebo effect, prediction markets, RAND corporation, recommendation engine, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, Robert Mercer, Ronald Reagan, speech recognition, statistical model, stochastic process, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, uranium enrichment, Yom Kippur War

Venter, Gary G. (fall 1987) Credibility. CAS Forum 81–147. Chapter 7. From Tool to Theology Armitage P. (1994) Dennis Lindley: The first 70 years. In Aspects of Uncertainty: A Tribute to D. V. Lindley, eds., PR Freeman and AFM Smith. John Wiley and Sons. Banks, David L. (1996) A Conversation with I. J. Good. Statistical Science (11) 1–19. Dubins LE, Savage LJ. (1976) Inequalities for Stochastic Processes (How to Gamble If You Must). Dover. Box, George EP, et al. (2006) Improving Almost Anything. Wiley. Box GEP, Tiao GC. (1973) Bayesian Inference in Statistical Analysis. Addison-Wesley. Cramér, H. (1976). Half of a century of probability theory: Some personal recollections. Annals of Probability (4) 509–46. D’Agostini, Giulio. (2005) The Fermi’s Bayes theorem. Bulletin of the International Society of Bayesian Analysis (1) 1–4.

**
The Trade Lifecycle: Behind the Scenes of the Trading Process (The Wiley Finance Series)
** by
Robert P. Baker

asset-backed security, bank run, banking crisis, Basel III, Black-Scholes formula, Brownian motion, business continuity plan, business process, collapse of Lehman Brothers, corporate governance, credit crunch, Credit Default Swap, diversification, fixed income, hiring and firing, implied volatility, interest rate derivative, interest rate swap, locking in a profit, London Interbank Offered Rate, margin call, market clearing, millennium bug, place-making, prediction markets, short selling, statistical model, stochastic process, the market place, the payments system, time value of money, too big to fail, transaction costs, value at risk, Wiener process, yield curve, zero-coupon bond

The market risk calculation is in theory attempting to replicate every possible combination of market data. Some simplifications have to be made because each piece of market data is technically a random variable and its connection (or correlation) to other market data is very hard, if not impossible, to determine. 3. Decide the calculation methodology There are two basic approaches – stochastic or historical. Stochastic processes If a piece of market data is assumed to be a normal distribution then we can ascribe different probabilities to different values. For example: 1% probability of 140, 5% probability of 155, 50% probability of 182 and so on. This removes the need for a large amount of data but ignores correlation between different market data. Historical data We go back over a certain period of market data, apply every day-on-day change in all market data to the set of trades under examination and for each day we get a different total value.
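The historical approach described here can be sketched in a few lines: replay each day-on-day change in the market data against today's levels and revalue the trades each time. The helper names, toy portfolio, and market history below are invented for illustration, not taken from the book:

```python
def historical_pnl(portfolio_value_fn, market_history):
    """Apply every day-on-day relative change in the market data to today's
    levels and revalue the portfolio, giving one hypothetical P&L per day.
    `market_history` is a list of dicts of market-data levels, oldest first."""
    today = market_history[-1]
    base = portfolio_value_fn(today)
    pnls = []
    for prev, nxt in zip(market_history, market_history[1:]):
        shocked = {k: today[k] * (nxt[k] / prev[k]) for k in today}
        pnls.append(portfolio_value_fn(shocked) - base)
    return pnls

# Toy portfolio: 10 units of a single index.
history = [{"index": v} for v in [100.0, 102.0, 99.0, 101.0]]
value = lambda mkt: 10 * mkt["index"]
print(historical_pnl(value, history))
```

Sorting the resulting P&Ls and reading off a tail quantile gives the usual historical-simulation risk number; because each day's changes are replayed jointly, the correlations between market data come along for free, which is exactly the advantage over the stochastic approach noted above.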

pages: 611 words: 130,419

**
Narrative Economics: How Stories Go Viral and Drive Major Economic Events
** by
Robert J. Shiller

agricultural Revolution, Albert Einstein, algorithmic trading, Andrei Shleifer, autonomous vehicles, bank run, banking crisis, basic income, bitcoin, blockchain, business cycle, butterfly effect, buy and hold, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, central bank independence, collective bargaining, computerized trading, corporate raider, correlation does not imply causation, cryptocurrency, Daniel Kahneman / Amos Tversky, debt deflation, disintermediation, Donald Trump, Edmond Halley, Elon Musk, en.wikipedia.org, Ethereum, ethereum blockchain, full employment, George Akerlof, germ theory of disease, German hyperinflation, Gunnar Myrdal, Gödel, Escher, Bach, Hacker Ethic, implied volatility, income inequality, inflation targeting, invention of radio, invention of the telegraph, Jean Tirole, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, litecoin, market bubble, money market fund, moral hazard, Northern Rock, nudge unit, Own Your Own Home, Paul Samuelson, Philip Mirowski, plutocrats, Plutocrats, Ponzi scheme, publish or perish, random walk, Richard Thaler, Robert Shiller, Ronald Reagan, Rubik’s Cube, Satoshi Nakamoto, secular stagnation, shareholder value, Silicon Valley, speech recognition, Steve Jobs, Steven Pinker, stochastic process, stocks for the long run, superstar cities, The Rise and Fall of American Growth, The Wealth of Nations by Adam Smith, theory of mind, Thorstein Veblen, traveling salesman, trickle-down economics, tulip mania, universal basic income, Watson beat the top human players on Jeopardy!, We are the 99%, yellow journalism, yield curve, Yom Kippur War

., 2007. 7. Long et al., 2008. 8. JSTOR catalogs over nine million scholarly articles and books in all fields, and 7% of these are in business or economics, but 25% of the articles with “ARIMA,” “ARMA,” or “autoregressive” are in business or economics. 9. Moving average models are sometimes justified by reference to the Wold decomposition theorem (1954), which shows that any covariance stationary stochastic process can be modeled as a moving average of noise terms plus a deterministic component. But there is no justification for assuming that simple variants of ARIMA models are so general. We may be better able to do economic forecasting in some cases if we represent these error terms or driving variables as the result of co-epidemics of narratives about which we have some information. 10. See Nsoesie et al., 2013. 11.

pages: 1,535 words: 337,071

**
Networks, Crowds, and Markets: Reasoning About a Highly Connected World
** by
David Easley,
Jon Kleinberg

Albert Einstein, AltaVista, clean water, conceptual framework, Daniel Kahneman / Amos Tversky, Douglas Hofstadter, Erdős number, experimental subject, first-price auction, fudge factor, George Akerlof, Gerard Salton, Gödel, Escher, Bach, incomplete markets, information asymmetry, information retrieval, John Nash: game theory, Kenneth Arrow, longitudinal study, market clearing, market microstructure, moral hazard, Nash equilibrium, Network effects, Pareto efficiency, Paul Erdős, planetary scale, prediction markets, price anchoring, price mechanism, prisoner's dilemma, random walk, recommendation engine, Richard Thaler, Ronald Coase, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, Simon Singh, slashdot, social web, Steve Jobs, stochastic process, Ted Nelson, The Market for Lemons, The Wisdom of Crowds, trade route, transaction costs, ultimatum game, Vannevar Bush, Vickrey auction, Vilfredo Pareto, Yogi Berra, zero-sum game

How vicious are cycles of intransitive choice? Theory and Decision, 24:119–145, 1988. [41] Albert-László Barabási and Réka Albert. Emergence of scaling in random networks. Science, 286:509–512, 1999. [42] Albert-László Barabási and Zoltan Oltvai. Network biology: Understanding the cell’s functional organization. Nature Reviews Genetics, 5:101–113, 2004. [43] A. D. Barbour and D. Mollison. Epidemics and random graphs. In Stochastic Processes in Epidemic Theory, volume 86 of Lecture Notes in Biomathematics, pages 86–89. Springer, 1990. [44] John A. Barnes. Social Networks. Number 26 in Modules in Anthropology. Addison Wesley, 1972. [45] Chris Barrett and E. Mutambatsere. Agricultural markets in developing countries. In Lawrence E. Blume and Steven N. Durlauf, editors, The New Palgrave Dictionary of Economics. Oxford University Press, second edition, 2008

…

Mathematics of Operations Research, 28(2):294–308, 2003. [239] Peter D. Killworth and H. Russell Bernard. Reverse small world experiment. Social Networks, 1:159–192, 1978. [240] Peter D. Killworth, Eugene C. Johnsen, H. Russell Bernard, Gene Ann Shelley, and Christopher McCarty. Estimating the size of personal networks. Social Networks, 12(4):289–312, December 1990. [241] John F. C. Kingman. The coalescent. Stochastic Processes and their Applications, 13:235–248, 1982. [242] Aniket Kittur and Robert E. Kraut. Harnessing the wisdom of crowds in Wikipedia: Quality through coordination. In Proc. CSCW’08: ACM Conference on Computer-Supported Cooperative Work, 2008. [243] Jon Kleinberg. Authoritative sources in a hyperlinked environment. Journal of the ACM, 46(5):604–632, 1999. A preliminary version appears in the Proceedings of the 9th ACM-SIAM Symposium on Discrete Algorithms, Jan. 1998

pages: 696 words: 143,736

**
The Age of Spiritual Machines: When Computers Exceed Human Intelligence
** by
Ray Kurzweil

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Buckminster Fuller, call centre, cellular automata, combinatorial explosion, complexity theory, computer age, computer vision, cosmological constant, cosmological principle, Danny Hillis, double helix, Douglas Hofstadter, Everything should be made as simple as possible, first square of the chessboard / second half of the chessboard, fudge factor, George Gilder, Gödel, Escher, Bach, I think there is a world market for maybe five computers, information retrieval, invention of movable type, Isaac Newton, iterative process, Jacquard loom, John Markoff, John von Neumann, Lao Tzu, Law of Accelerating Returns, mandelbrot fractal, Marshall McLuhan, Menlo Park, natural language processing, Norbert Wiener, optical character recognition, ought to be enough for anybody, pattern recognition, phenotype, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, Robert Metcalfe, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, social intelligence, speech recognition, Steven Pinker, Stewart Brand, stochastic process, technological singularity, Ted Kaczynski, telepresence, the medium is the message, There's no reason for any individual to have a computer in his home - Ken Olsen, traveling salesman, Turing machine, Turing test, Whole Earth Review, Y2K

Engines of Change: The American Industrial Revolution, 1790-1860. Washington, D.C.: Smithsonian Institution Press, 1986. Hoage, R. J. and Larry Goldman. Animal Intelligence: Insights into the Animal Mind. Washington, D.C.: Smithsonian Institution Press, 1986. Hodges, Andrew. Alan Turing: The Enigma. New York: Simon and Schuster, 1983. Hoel, Paul G., Sidney C. Port, and Charles J. Stone. Introduction to Stochastic Processes. Boston: Houghton-Mifflin, 1972. Hofstadter, Douglas R. Gödel, Escher, Bach: An Eternal Golden Braid. New York: Basic Books, 1979. _________. Metamagical Themas: Questing for the Essence of Mind and Pattern. New York: Basic Books, 1985. Hofstadter, Douglas R. and Daniel C. Dennett. The Mind’s I: Fantasies and Reflections on Self and Soul. New York: Basic Books, 1981. Hofstadter, Douglas R., Gray Clossman, and Marsha Meredith.

**
Principles of Protocol Design
** by
Robin Sharp

accounting loophole / creative accounting, business process, discrete time, fault tolerance, finite state, Gödel, Escher, Bach, information retrieval, loose coupling, MITM: man-in-the-middle, packet switching, RFC: Request For Comment, stochastic process

For example, you might like to pursue all the references to the Alternating Bit Protocol in the literature, starting with the ones given in connection with Protocol 5. This will lead you into the area of other proof techniques for protocols, as well as illustrating how new mechanisms develop as time goes by. Finally, you might like to investigate quantitative properties of some protocols, such as their throughput and delay in the presence of varying loads of traffic. Generally speaking, this requires a knowledge of queueing theory and the theory of stochastic processes. This is not a subject which we pay more than passing attention to in this book. However, some protocols, especially multiplexing protocols, have been the subject of intensive investigation from this point of view. Good discussions of the general theory required are found in [73], while [11] relates the theory more explicitly to the analysis of network protocols. 118 4 Basic Protocol Mechanisms Exercises 4.1.

pages: 1,331 words: 163,200

**
Hands-On Machine Learning With Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems
** by
Aurélien Géron

Amazon Mechanical Turk, Anton Chekhov, combinatorial explosion, computer vision, constrained optimization, correlation coefficient, crowdsourcing, don't repeat yourself, Elon Musk, en.wikipedia.org, friendly AI, ImageNet competition, information retrieval, iterative process, John von Neumann, Kickstarter, natural language processing, Netflix Prize, NP-complete, optical character recognition, P = NP, p-value, pattern recognition, pull request, recommendation engine, self-driving car, sentiment analysis, SpamAssassin, speech recognition, stochastic process

Whereas PG algorithms directly try to optimize the policy to increase rewards, the algorithms we will look at now are less direct: the agent learns to estimate the expected sum of discounted future rewards for each state, or the expected sum of discounted future rewards for each action in each state, then uses this knowledge to decide how to act. To understand these algorithms, we must first introduce Markov decision processes (MDP). Markov Decision Processes In the early 20th century, the mathematician Andrey Markov studied stochastic processes with no memory, called Markov chains. Such a process has a fixed number of states, and it randomly evolves from one state to another at each step. The probability for it to evolve from a state s to a state s′ is fixed, and it depends only on the pair (s,s′), not on past states (the system has no memory). Figure 16-7 shows an example of a Markov chain with four states. Suppose that the process starts in state s0, and there is a 70% chance that it will remain in that state at the next step.
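The memoryless evolution described here is easy to simulate: the next state is drawn from a fixed distribution that depends only on the current state. The sketch below uses the 70% self-transition for s0 mentioned in the excerpt; the other states and probabilities are made up, since the full figure is not reproduced here:

```python
import random

def simulate_markov(transitions, start, steps, seed=None):
    """Walk a finite Markov chain: the next state depends only on the
    current one, via that state's fixed row of transition probabilities."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        r, cum = rng.random(), 0.0
        for nxt, p in transitions[state]:
            cum += p
            if r < cum:
                state = nxt
                break
        else:
            state = transitions[state][-1][0]  # guard against float rounding
        path.append(state)
    return path

# Four states; from s0 there is a 70% chance of staying put, as in the text.
T = {
    "s0": [("s0", 0.7), ("s1", 0.2), ("s3", 0.1)],
    "s1": [("s0", 0.1), ("s2", 0.9)],
    "s2": [("s1", 0.5), ("s3", 0.5)],
    "s3": [("s3", 1.0)],  # absorbing state
}
print(simulate_markov(T, "s0", 20, seed=1))
```

An MDP extends this picture by letting an agent choose an action in each state, with both the transition probabilities and the rewards depending on that choice.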

pages: 634 words: 185,116

**
From eternity to here: the quest for the ultimate theory of time
** by
Sean M. Carroll

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, cellular automata, Claude Shannon: information theory, Columbine, cosmic microwave background, cosmological constant, cosmological principle, dark matter, dematerialisation, double helix, en.wikipedia.org, gravity well, Harlow Shapley and Heber Curtis, Henri Poincaré, Isaac Newton, Johannes Kepler, John von Neumann, Lao Tzu, Laplace demon, lone genius, low earth orbit, New Journalism, Norbert Wiener, pets.com, Pierre-Simon Laplace, Richard Feynman, Richard Stallman, Schrödinger's Cat, Slavoj Žižek, Stephen Hawking, stochastic process, the scientific method, wikimedia commons

That kind of wave function, concentrated entirely on a single possible observational outcome, is known as an “eigenstate.” Once the system is in that eigenstate, you can keep making the same kind of observation, and you’ll keep getting the same answer (unless something kicks the system out of the eigenstate into another superposition). We can’t say with certainty which eigenstate the system will fall into when an observation is made; it’s an inherently stochastic process, and the best we can do is assign a probability to different outcomes. We can apply this idea to the story of Miss Kitty. According to the Copenhagen interpretation, our choice to observe whether she stopped by the food bowl or the scratching post had a dramatic effect on her wave function, no matter how sneaky we were about it. When we didn’t look, she was in a superposition of the two possibilities, with equal amplitude; when she then moved on to the sofa or the table, we added up the contributions from each of the intermediate steps, and found there was interference.

pages: 728 words: 182,850

**
Cooking for Geeks
** by
Jeff Potter

3D printing, A Pattern Language, carbon footprint, centre right, Community Supported Agriculture, Computer Numeric Control, crowdsourcing, Donald Knuth, double helix, en.wikipedia.org, European colonialism, fear of failure, food miles, functional fixedness, hacker house, haute cuisine, helicopter parent, Internet Archive, iterative process, Kickstarter, Parkinson's law, placebo effect, random walk, Rubik’s Cube, slashdot, stochastic process, the scientific method

It’s possible to break up the collagen chemically, too: lysosomal enzymes will attack the structure and "break the covalent bonds" in chem-speak, but this isn’t so useful to know in the kitchen. Note For fun, try marinating a chunk of meat in papaya, which contains an enzyme, papain, that acts as a meat tenderizer by hydrolyzing collagen. One piece of information that is critical to understand in the kitchen, however, is that hydrolysis takes time. The structure has to literally untwist and break up, and due to the amount of energy needed to break the bonds and the stochastic processes involved, this reaction takes longer than simply denaturing the protein. Hydrolyzing collagen not only breaks down the rubbery texture of the denatured structure, but also converts a portion of it to gelatin. When the collagen hydrolyzes, it breaks into variously sized pieces, the smaller of which are able to dissolve into the surrounding liquid, creating gelatin. It’s this gelatin that gives dishes such as braised ox tail, slow-cooked short ribs, and duck confit their distinctive mouthfeel.

pages: 733 words: 179,391

**
Adaptive Markets: Financial Evolution at the Speed of Thought
** by
Andrew W. Lo

"Robert Solow", Albert Einstein, Alfred Russel Wallace, algorithmic trading, Andrei Shleifer, Arthur Eddington, Asian financial crisis, asset allocation, asset-backed security, backtesting, bank run, barriers to entry, Berlin Wall, Bernie Madoff, bitcoin, Bonfire of the Vanities, bonus culture, break the buck, Brownian motion, business cycle, business process, butterfly effect, buy and hold, capital asset pricing model, Captain Sullenberger Hudson, Carmen Reinhart, collapse of Lehman Brothers, collateralized debt obligation, commoditize, computerized trading, corporate governance, creative destruction, Credit Default Swap, credit default swaps / collateralized debt obligations, cryptocurrency, Daniel Kahneman / Amos Tversky, delayed gratification, Diane Coyle, diversification, diversified portfolio, double helix, easy for humans, difficult for computers, Ernest Rutherford, Eugene Fama: efficient market hypothesis, experimental economics, experimental subject, Fall of the Berlin Wall, financial deregulation, financial innovation, financial intermediation, fixed income, Flash crash, Fractional reserve banking, framing effect, Gordon Gekko, greed is good, Hans Rosling, Henri Poincaré, high net worth, housing crisis, incomplete markets, index fund, interest rate derivative, invention of the telegraph, Isaac Newton, James Watt: steam engine, job satisfaction, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Meriwether, Joseph Schumpeter, Kenneth Rogoff, London Interbank Offered Rate, Long Term Capital Management, longitudinal study, loss aversion, Louis Pasteur, mandelbrot fractal, margin call, Mark Zuckerberg, market fundamentalism, martingale, merger arbitrage, meta analysis, meta-analysis, Milgram experiment, money market fund, moral hazard, Myron Scholes, Nick Leeson, old-boy network, out of africa, p-value, paper trading, passive investing, Paul Lévy, Paul Samuelson, Ponzi scheme, predatory finance, prediction markets, price discovery 
process, profit maximization, profit motive, quantitative hedge fund, quantitative trading / quantitative finance, RAND corporation, random walk, randomized controlled trial, Renaissance Technologies, Richard Feynman, Richard Feynman: Challenger O-ring, risk tolerance, Robert Shiller, Sam Peltzman, Shai Danziger, short selling, sovereign wealth fund, Stanford marshmallow experiment, Stanford prison experiment, statistical arbitrage, Steven Pinker, stochastic process, stocks for the long run, survivorship bias, Thales and the olive presses, The Great Moderation, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Malthus, Thorstein Veblen, Tobin tax, too big to fail, transaction costs, Triangle Shirtwaist Factory, ultimatum game, Upton Sinclair, US Airways Flight 1549, Walter Mischel, Watson beat the top human players on Jeopardy!, WikiLeaks, Yogi Berra, zero-sum game

In a 1973 article on the mathematical underpinnings of financial speculation, Samuelson included a wonderful tribute to Bachelier: Notes to Chapter 1 • 423 Since illustrious French geometers almost never die, it is possible that Bachelier still survives in Paris supplementing his professorial retirement pension by judicious arbitrage in puts and calls. But my widespread lecturing on him over the last 20 years has not elicited any information on the subject. How much Poincaré, to whom he dedicates the thesis, contributed to it, I have no knowledge. Finally, as Bachelier’s cited works suggest, he seems to have had something of a one-track mind. But what a track! The rather supercilious references to him, as an unrigorous pioneer in stochastic processes and stimulator of work in that area by more rigorous mathematicians such as Kolmogorov, hardly does Bachelier justice. His methods can hold their own in rigor with the best scientific work of his time, and his fertility was outstanding. Einstein is properly revered for his basic, and independent, discovery of the theory of Brownian motion 5 years after Bachelier. But years ago when I compared the two texts, I formed the judgment (which I have not checked back on) that Bachelier’s methods dominated Einstein’s in every element of the vector.

pages: 823 words: 220,581

**
Debunking Economics - Revised, Expanded and Integrated Edition: The Naked Emperor Dethroned?
** by
Steve Keen

"Robert Solow", accounting loophole / creative accounting, banking crisis, banks create money, barriers to entry, Benoit Mandelbrot, Big bang: deregulation of the City of London, Black Swan, Bonfire of the Vanities, business cycle, butterfly effect, capital asset pricing model, cellular automata, central bank independence, citizen journalism, clockwork universe, collective bargaining, complexity theory, correlation coefficient, creative destruction, credit crunch, David Ricardo: comparative advantage, debt deflation, diversification, double entry bookkeeping, en.wikipedia.org, Eugene Fama: efficient market hypothesis, experimental subject, Financial Instability Hypothesis, fixed income, Fractional reserve banking, full employment, Henri Poincaré, housing crisis, Hyman Minsky, income inequality, information asymmetry, invisible hand, iterative process, John von Neumann, Kickstarter, laissez-faire capitalism, liquidity trap, Long Term Capital Management, mandelbrot fractal, margin call, market bubble, market clearing, market microstructure, means of production, minimum wage unemployment, money market fund, open economy, Pareto efficiency, Paul Samuelson, place-making, Ponzi scheme, profit maximization, quantitative easing, RAND corporation, random walk, risk tolerance, risk/return, Robert Shiller, Ronald Coase, Schrödinger's Cat, scientific mainstream, seigniorage, six sigma, South Sea Bubble, stochastic process, The Great Moderation, The Wealth of Nations by Adam Smith, Thorstein Veblen, time value of money, total factor productivity, tulip mania, wage slave, zero-sum game

The impact of this power inversion can be seen in the physicist Joe McCauley’s observations about the need to reform economics education: The real problem with my proposal for the future of economics departments is that current economics and finance students typically do not know enough mathematics to understand (a) what econophysicists are doing, or (b) to evaluate the neo-classical model (known in the trade as ‘The Citadel’) critically enough to see, as Alan Kirman put it, that ‘No amount of attention to the walls will prevent The Citadel from being empty.’ I therefore suggest that the economists revise their curriculum and require that the following topics be taught: calculus through the advanced level, ordinary differential equations (including advanced), partial differential equations (including Green functions), classical mechanics through modern nonlinear dynamics, statistical physics, stochastic processes (including solving Smoluchowski–Fokker–Planck equations), computer programming (C, Pascal, etc.) and, for complexity, cell biology. Time for such classes can be obtained in part by eliminating micro- and macro-economics classes from the curriculum. The students will then face a much harder curriculum, and those who survive will come out ahead. So might society as a whole. (McCauley 2006: 607–8) This amplifies a point that, as a critic of economics with a reasonable grounding in mathematics myself, has long set me apart from most other critics: neoclassical economics is not bad because it is mathematical per se, but because it is bad mathematics. 16 | DON’T SHOOT ME, I’M ONLY THE PIANO Why mathematics is not the problem Many critics of economics have laid the blame for its manifest failures at the feet of mathematics.

pages: 798 words: 240,182

**
The Transhumanist Reader
** by
Max More,
Natasha Vita-More

23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Douglas Engelbart, Drosophila, en.wikipedia.org, endogenous growth, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, Louis Pasteur, Menlo Park, meta analysis, meta-analysis, moral hazard, Network effects, Norbert Wiener, pattern recognition, Pepto Bismol, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Ronald Reagan, scientific worldview, silicon-based life, Singularitarianism, social intelligence, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce, zero-sum game

If we estimate about 10^2 bytes of information to encode these details (which may be low), we have 10^16 bytes, considerably more than the 10^9 bytes that you mentioned. One might ask: How do we get from the 10^7 bytes that specify the brain in the genome to 10^16 bytes in the mature brain? This is not hard to understand, since we do this type of meaningful data expansion routinely in our self-organizing software paradigms. For example, a genetic algorithm can be efficiently coded, but in turn creates data far greater in size than itself using a stochastic process, which in turn self-organizes in response to a complex environment (the problem space). The result of this process is meaningful information far greater than the original program. We know that this is exactly how the creation of the brain works. The genome specifies initially semi-random interneuronal connection wiring patterns in specific regions of the brain (random within certain constraints and rules), and these patterns (along with the neurotransmitter-concentration levels) then undergo their own internal evolutionary process to self-organize to reflect the interactions of that person with their experiences and environment.
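The data-expansion idea above can be illustrated with a minimal sketch (not from the book; the parameters, function name, and "local wiring radius" constraint are all invented for illustration): a tiny "genome" — an RNG seed plus a couple of constraint parameters — is stochastically expanded into a wiring pattern far larger than the genome itself.

```python
import random

def grow_connections(seed, n_neurons=200, n_edges=2000, radius=25):
    """Toy illustration of genome-to-brain data expansion: a few bytes
    (an RNG seed plus constraint parameters) are expanded by a stochastic
    process into a wiring pattern far larger than the 'genome' itself.
    The constraint here -- connections stay within a local radius -- is a
    made-up stand-in for the 'random within certain constraints' idea."""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < n_edges:
        a = rng.randrange(n_neurons)
        # Semi-random wiring: target lies within a local radius of the source.
        b = (a + rng.randrange(1, radius)) % n_neurons
        edges.add((a, b))
    return edges

wiring = grow_connections(seed=42)
# A seed of a few bytes has been expanded into thousands of edges.
print(len(wiring))  # 2000
```

The second stage Kurzweil describes — the pattern then self-organizing against experience — would correspond to iteratively rewiring these edges in response to an environment, which the sketch omits.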

pages: 920 words: 233,102

**
Unelected Power: The Quest for Legitimacy in Central Banking and the Regulatory State
** by
Paul Tucker

Andrei Shleifer, bank run, banking crisis, barriers to entry, Basel III, battle of ideas, Ben Bernanke: helicopter money, Berlin Wall, Bretton Woods, business cycle, capital controls, Carmen Reinhart, Cass Sunstein, central bank independence, centre right, conceptual framework, corporate governance, diversified portfolio, Fall of the Berlin Wall, financial innovation, financial intermediation, financial repression, first-past-the-post, floating exchange rates, forensic accounting, forward guidance, Fractional reserve banking, Francis Fukuyama: the end of history, full employment, George Akerlof, incomplete markets, inflation targeting, information asymmetry, invisible hand, iterative process, Jean Tirole, Joseph Schumpeter, Kenneth Arrow, Kenneth Rogoff, liberal capitalism, light touch regulation, Long Term Capital Management, means of production, money market fund, Mont Pelerin Society, moral hazard, Northern Rock, Pareto efficiency, Paul Samuelson, price mechanism, price stability, principal–agent problem, profit maximization, quantitative easing, regulatory arbitrage, reserve currency, risk tolerance, risk-adjusted returns, road to serfdom, Robert Bork, Ronald Coase, seigniorage, short selling, Social Responsibility of Business Is to Increase Its Profits, stochastic process, The Chicago School, The Great Moderation, The Market for Lemons, the payments system, too big to fail, transaction costs, Vilfredo Pareto, Washington Consensus, yield curve, zero-coupon bond, zero-sum game

FRAMING A STANDARD FOR SYSTEM RESILIENCE: POLITICS, TRADE-OFFS, AND PUBLIC DEBATE

If the public policy purpose of a central banking stability mandate should be continuity of services from the system as a whole, thus avoiding the worst costs of “bust,” the core of the regime must be a monitorable standard of resilience. That much is entailed by the first Design Precept, cast as a revived “nondelegation doctrine” in part III (chapter 14). The big questions are what it means in principle and in practice. Roughly speaking, policy makers need to determine the severity of shock that the system should be able to withstand. In principle, that would be driven by three things:

- A view of the underlying (stochastic) process generating the first-round losses from end borrowers that hit the system
- A picture (or model) of the structure of the financial system through which those losses and other shocks are transmitted around the system
- A tolerance for systemic crisis

The first and second are properly objects of scientific inquiry by technocrats and researchers. The third is different. Whereas the central belief of monetary economics relevant to the design of policy institutions is that there is no long-run trade-off to speak of between economic activity and inflation, we do not yet know enough to judge whether prosperity would be damaged by totally eliminating the risk-taking structures that can threaten periodic bouts of instability.6 As I recollect former UK Treasury secretary George Osborne putting it, no one wants the stability of the graveyard.
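Tucker's three ingredients map naturally onto a Monte Carlo stress exercise. The sketch below is a toy, not anything from the book: the five-bank system, uniform capital, exponential loss draws, and the 50% contagion haircut are all invented assumptions, chosen only to show how a stochastic loss process, a transmission structure, and a crisis threshold combine into a monitorable number.

```python
import random

def failure_fraction(loss_draw, capital, exposure, recovery_hit=0.5):
    """Transmission structure (ingredient 2): when a bank's losses exhaust
    its capital it fails, and half of each counterparty's claim on it
    (a made-up haircut) becomes a further loss, possibly cascading."""
    n = len(capital)
    losses = list(loss_draw)
    failed = [False] * n
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if not failed[i] and losses[i] >= capital[i]:
                failed[i] = True
                changed = True
                for j in range(n):
                    losses[j] += recovery_hit * exposure[j][i]
    return sum(failed) / n

def crisis_probability(severity, trials=2000, seed=1):
    """Loss-generating process (ingredient 1): exponential first-round
    losses with the given mean severity. A 'crisis' is more than half the
    toy five-bank system failing; comparing this estimate against a
    tolerance (ingredient 3) yields a resilience standard."""
    rng = random.Random(seed)
    n = 5
    capital = [1.0] * n
    # exposure[j][i]: bank j's claim on bank i (uniform, off-diagonal).
    exposure = [[0.4 if i != j else 0.0 for i in range(n)] for j in range(n)]
    crises = 0
    for _ in range(trials):
        draw = [rng.expovariate(1.0 / severity) for _ in range(n)]
        if failure_fraction(draw, capital, exposure) > 0.5:
            crises += 1
    return crises / trials
```

As the text says, the first two ingredients are empirical questions; the tolerance compared against `crisis_probability` is the political one.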

pages: 764 words: 261,694

**
The Elements of Statistical Learning (Springer Series in Statistics)
** by
Trevor Hastie,
Robert Tibshirani,
Jerome Friedman

Bayesian statistics, bioinformatics, computer age, conceptual framework, correlation coefficient, G4S, greed is good, linear programming, p-value, pattern recognition, random walk, selection bias, speech recognition, statistical model, stochastic process, The Wisdom of Crowds

Bayesian Data Analysis, CRC Press, Boca Raton, FL.

Geman, S. and Geman, D. (1984). Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images, IEEE Transactions on Pattern Analysis and Machine Intelligence 6: 721–741.

Genkin, A., Lewis, D. and Madigan, D. (2007). Large-scale Bayesian logistic regression for text categorization, Technometrics 49(3): 291–304.

Genovese, C. and Wasserman, L. (2004). A stochastic process approach to false discovery rates, Annals of Statistics 32(3): 1035–1061.

Gersho, A. and Gray, R. (1992). Vector Quantization and Signal Compression, Kluwer Academic Publishers, Boston, MA.

Girosi, F., Jones, M. and Poggio, T. (1995). Regularization theory and neural network architectures, Neural Computation 7: 219–269.

Golub, G. and Van Loan, C. (1983). Matrix Computations, Johns Hopkins University Press, Baltimore.

**
The Art of Computer Programming: Fundamental Algorithms
** by
Donald E. Knuth

discrete time, distributed generation, Donald Knuth, fear of failure, Fermat's Last Theorem, G4S, Gerard Salton, Isaac Newton, Jacquard loom, Johannes Kepler, John von Neumann, linear programming, linked data, Menlo Park, probability theory / Blaise Pascal / Pierre de Fermat, sorting algorithm, stochastic process, Turing machine

Suppose each arc e of G has been assigned a probability p(e), where the probabilities satisfy the conditions

0 ≤ p(e) ≤ 1;    Σ p(e) = 1 for 1 ≤ j < n, the sum taken over all arcs e with init(e) = V_j.

Consider a random path, which starts at V_1 and subsequently chooses branch e of G with probability p(e), until V_n is reached; the choice of branch taken at each step is to be independent of all previous choices. For example, consider the graph of exercise 2.3.4.1-7, with a probability assigned to each of the arcs e_1, e_2, ..., e_9; a path such as "Start-A-B-C-A-D-B-C-Stop" is then chosen with probability equal to the product of the probabilities of the branches it takes. Such random paths are called Markov chains, after the Russian mathematician Andrei A. Markov, who first made extensive studies of stochastic processes of this kind. The situation serves as a model for certain algorithms, although our requirement that each choice must be independent of the others is a very strong assumption. The purpose of this exercise is to analyze the computation time for algorithms of this kind. The analysis is facilitated by considering the n × n matrix A = (a_ij), where a_ij = Σ p(e), summed over all arcs e that go from V_i to V_j.
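The random-path model can be sketched directly. The graph below is a small hypothetical example (not the graph of exercise 2.3.4.1-7): vertex 0 plays the role of Start (V_1) and vertex 3 the role of Stop (V_n), and A[i][j] sums p(e) over all arcs from V_i to V_j, so every row except the absorbing Stop row sums to 1, matching the condition above.

```python
import random

# Hypothetical transition matrix A = (a_ij): a_ij sums p(e) over arcs
# from V_i to V_j. Vertex 0 = Start, vertex 3 = Stop (absorbing).
A = [
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.5, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0],
]

def random_path(A, start=0, stop=3, seed=7):
    """Choose each branch independently with probability p(e), as in the
    Markov-chain model, and return the sequence of vertices visited."""
    rng = random.Random(seed)
    path, v = [start], start
    while v != stop:
        # Sample the next vertex from row v of A by inverting the CDF.
        r, acc = rng.random(), 0.0
        for j, p in enumerate(A[v]):
            acc += p
            if r < acc:
                v = j
                break
        path.append(v)
    return path
```

The length of such a path, minus one, is the number of branch choices — the quantity whose expectation Knuth's analysis extracts from the matrix A.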

pages: 1,799 words: 532,462

**
The Codebreakers: The Comprehensive History of Secret Communication From Ancient Times to the Internet
** by
David Kahn

anti-communist, British Empire, Claude Shannon: information theory, computer age, cuban missile crisis, Fellow of the Royal Society, Honoré de Balzac, index card, interchangeable parts, invention of the telegraph, Isaac Newton, Johannes Kepler, John von Neumann, Louis Daguerre, Maui Hawaii, Norbert Wiener, out of africa, pattern recognition, place-making, popular electronics, positional goods, Republic of Letters, Searching for Interstellar Communications, stochastic process, the scientific method, trade route, Turing machine, union organizing, yellow journalism, zero-sum game

N.S.A. leads even such firms as I.B.M. and Remington Rand in important areas of computer development, such as time-sharing, and industry has adopted many N.S.A.-designed features. The second section, STED (for “Standard Technical Equipment Development”) conducts basic cryptographic research. It looks for new principles of encipherment. It ascertains whether new developments in technology, such as the transistor and the tunnel diode, have cryptographic applications. Using such esoteric tools as Galois field theory, stochastic processes, and group, matrix, and number theory, it will construct a mathematical model of a proposed cipher machine and will simulate its operation on a computer, thus producing the cipher without having to build the hardware. Rotor principles have often been tested for cryptographic strength in this way. It devises audio scramblers, from the ultra-secure types for high officials to the walkie-talkies of platoon commanders, as well as video scramblers for reconnaissance television and for facsimile.