In Defense of a Quant (Part II): The Ups and Downs of the Normal Distribution
In my previous column, I mentioned that many attacks on the alleged deficiency of mathematical finance center on the quants' supposed misuse of the "simplistic" normal distribution. Nassim Taleb never fails to contrast the "Brownian" paradigm, which he disdains as primitive, with the supposedly more sophisticated "fractal" point of view, which he attributes to Benoît Mandelbrot. I quote just two passages from The Black Swan: "I find it ludicrous to present the uncertainty principle as having anything to do with uncertainty. Why? First, this uncertainty is Gaussian." And another: "So selecting the Gaussian while invoking some general law appears to be convenient. The Gaussian is used as a default distribution for that very reason."
The set of zeroes of a standard Brownian motion is a fractal with a fractional, or Hausdorff-Besicovitch, dimension d = 1/2 (a point has dimension 0 and an interval has dimension 1). It is uncountable, i.e., in the set-theoretic sense it has as many points as the real line, yet it is nowhere dense; it is also perfect, i.e., it coincides with the set of its limit points. The reader with a mathematical bent is encouraged to visualize this object. For the reader without one, it suffices to imagine Paris Hilton's (or the Emperor's) new clothes, which have not merely an infinite but an uncountable number of threads per unit area and yet do not cover a single square inch of her bodice.
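One can glimpse the dimension d = 1/2 numerically. The sketch below is illustrative, not rigorous: it simulates a long Gaussian random walk as a discrete stand-in for Brownian motion on [0, 1] and estimates the box-counting dimension of its set of zero crossings; all sizes and scales are my own arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A long random walk as a discrete stand-in for Brownian motion on [0, 1].
n = 2**20
walk = np.cumsum(rng.standard_normal(n))

# Approximate the zero set by the time points where the walk changes sign.
zero_idx = np.flatnonzero(np.sign(walk[:-1]) != np.sign(walk[1:])) / n

# Box counting: split [0, 1] into m boxes and count boxes containing a zero.
scales = 2 ** np.arange(4, 14)
counts = [len(np.unique((zero_idx * m).astype(int))) for m in scales]

# The slope of log(count) versus log(scale) estimates the fractal dimension.
slope = np.polyfit(np.log(scales), np.log(counts), 1)[0]
print(f"estimated box-counting dimension: {slope:.2f}")
```

The finite resolution of the walk biases the estimate somewhat, but the slope comes out close to 1/2 rather than 0 or 1, as the theorem predicts.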
Another theorem about normal distributions states that the partial sums of normally distributed, independent identically distributed (i.i.d.) random variables εt converge to a random walk:

$$\frac{1}{\sqrt{T}}\sum_{t=1}^{[\tau\cdot T]}\varepsilon_t \;\longrightarrow\; W(\tau), \qquad \tau\in[0,1].$$
Here [τ∙T] is the integer part of the real number τ∙T. This statement implies that every one-to-one mapping of the interval between 0 and 1 onto another interval produces a new random walk.1 That is to say, instead of the very limited (affine) class of transformations that preserve self-similarity for your average Mandelbrot fractal, Brownian motion displays self-similarity under all monotonic continuous mappings. The above theorem is of central significance for modern econometrics; for an explanation I refer the reader to the papers of the recently deceased Nobel laureate Clive Granger (Granger and Newbold, 1974).
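This scaling property can be checked in a few lines. The illustrative sketch below (path counts and discretizations are my own choices) verifies that the partial sum of the first [τ∙T] i.i.d. shocks, divided by the square root of T, has variance close to τ no matter how fine the discretization:

```python
import numpy as np

rng = np.random.default_rng(1)
tau = 0.25  # a fixed time in [0, 1]

# Donsker-type scaling: the sum of the first [tau*T] i.i.d. shocks, divided
# by sqrt(T), behaves like Brownian motion W(tau), whose variance is tau,
# regardless of the discretization T.
for T in (100, 10_000):
    k = int(tau * T)                       # integer part [tau*T]
    eps = rng.standard_normal((5_000, k))  # 5,000 independent paths
    s = eps.sum(axis=1) / np.sqrt(T)       # scaled partial sums
    print(f"T={T:>6}: sample Var = {s.var():.3f} (theory: tau = {tau})")
```

Both discretizations give a sample variance near 0.25, which is the point: the limit does not care about the grid.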
For me, probability and complex analysis are fascinating subjects because objects initially defined in a very general way turn out to have very special properties. One of the cornerstones of probability theory is the Central Limit Theorem (CLT). In layman's terms, it states that the sum of many independent (or even weakly dependent) random variables converges to the normal distribution. Sir Francis Galton, a Victorian polymath, described the CLT in the following exalted terms: "I know of scarcely anything so apt to impress the imagination as the wonderful form of cosmic order expressed by the 'Law of Frequency of Error'" (Galton, 1889).
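A toy illustration of the CLT, with all parameters being arbitrary choices of mine: each exponential variable is heavily skewed (skewness 2), yet the standardized sum of a thousand of them is nearly symmetric.

```python
import numpy as np

rng = np.random.default_rng(2)

# CLT sketch: each exponential term has skewness 2, but the standardized
# sum of n of them is approximately normal, hence nearly skewless.
n, n_samples = 1_000, 20_000
x = rng.exponential(size=(n_samples, n))   # mean 1, variance 1 per term
z = (x.sum(axis=1) - n) / np.sqrt(n)       # standardized sums

skew = ((z - z.mean()) ** 3).mean() / z.std() ** 3
print(f"skewness of one term: 2.0; of the standardized sum: {skew:.3f}")
```

The residual skewness decays like 2/√n, so with n = 1,000 it is already down to a few hundredths.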
The CLT inspired the Nobel-endowed creators of the Capital Asset Pricing Model (CAPM). They imagined the returns of an efficient stock market as being driven by a large number of firm-specific economic factors, which should sum to something resembling a normal distribution.
Our skeptics might point out that there are infinitely many more sums of random variables that converge to something other than the normal. For such skeptics, mathematicians proved a remarkable theorem when they discovered that any random process from a rather wide class (let us call this class Ç) can be represented as a stochastic integral of a predictable process (practically, a system of weights) over a Wiener process (Jarrow et al., 2006). This may not seem very exciting, but all money is additive and, hence, any contingent claim is a linear functional on the space of an asset's paths. Furthermore, the pricing formula for an arbitrary asset process from Ç can be represented by a Black-Scholes-type equation. So any derivative on any asset whose valuation evolves according to a fairly general random process can be expressed as an integral of a certain form. Peter Carr, of the Courant Institute and Morgan Stanley, outlined in 2003 a small class of such problems that admit an explicit analytic solution (Carr and Wu, 2003).
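As a concrete instance of a Black-Scholes-type valuation, here is the textbook closed-form price of a European call under the plain lognormal diffusion. This is the classical formula, not the Carr-Wu log-stable model; the quoted parameters are purely illustrative.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF expressed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, rate, vol, maturity):
    """Black-Scholes price of a European call under lognormal diffusion."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) \
         / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    return spot * norm_cdf(d1) - strike * exp(-rate * maturity) * norm_cdf(d2)

# At-the-money one-year call, 20% volatility, 5% risk-free rate:
print(f"{bs_call(100, 100, 0.05, 0.2, 1.0):.2f}")  # 10.45
```

For more general processes in Ç the closed form disappears, but the valuation remains an integral of the same general shape.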
If these strictures were not enough, two remarkable papers, by Harrison and Kreps (1979) and Harrison and Pliska (1981), delivered a set of theorems demonstrating the equivalence of Ç and the class of arbitrage-free or self-financing asset processes. While many quantitative financiers would gladly dispose of Brownian motion, the absence of arbitrage, or a free lunch, is a cornerstone principle few could do without. It is very practical, too: even the combination of the star Wall Street trader Meriwether and the star Harvard academic Merton at the rudder of Long-Term Capital Management could not game securities' mispricing for very long.
In view of these discoveries, a researcher wishing to reject Brownian diffusions as asset processes must first invent an alternative mechanism that rules out arbitrage. This is not impossible, but it requires a very radical conceptual revision of our current understanding of financial economics.
The second bell for the hopes that some screwball statistical process can rescue financial mathematics rang in 2001, when Hélyette Geman, Marc Yor, and Dilip Madan proved what I have nicknamed the "Enigmatic Theorem" (Geman et al., 2001). Namely, any economically feasible price process must consist of pure jumps, without any drift component. Despite its apparent weirdness, that is exactly what we observe on exchanges, where quotes evolve through discrete changes in the trading book. There is no reason to assume, after Bachelier, Black, Scholes, and Merton, that stock or commodity prices drift in some Platonic space of ideas between executions of orders.
In another remarkable observation, Geman, Yor, and Madan noticed that a pure-jump asset process of the no-arbitrage class can be transformed into a pure diffusion by a time-changing (Feller) transformation. The circle has finally closed.
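This picture can be caricatured in simulation. In the sketch below (parameters are purely illustrative, not calibrated to any market), a Brownian motion is run on a random gamma "business clock," a common stand-in for trading activity: read in calendar time, the increments are fat-tailed and jumpy; read on the business clock, they are Gaussian again.

```python
import numpy as np

rng = np.random.default_rng(3)

# Brownian motion on a gamma "business time" clock, in the spirit of
# Geman-Madan-Yor. Parameters below are illustrative only.
n, dt, nu = 100_000, 0.1, 0.1
g = rng.gamma(shape=dt / nu, scale=nu, size=n)  # business-time increments
x = rng.standard_normal(n) * np.sqrt(g)         # price-change increments

def kurtosis(v):
    v = (v - v.mean()) / v.std()
    return (v ** 4).mean()

# Calendar-time returns are fat-tailed (kurtosis well above the normal 3);
# the same returns clocked in business time are Gaussian (kurtosis ~ 3).
print(f"calendar-time kurtosis: {kurtosis(x / np.sqrt(dt)):.1f}")  # ~ 6
print(f"business-time kurtosis: {kurtosis(x / np.sqrt(g)):.1f}")   # ~ 3
```

The randomness of the clock alone manufactures the fat tails; deflating by the clock restores the plain diffusion, which is the closing of the circle.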
My narrative does not suggest that everything is all right with quantitative finance. It is meant to show how deliberate its strictures are and how radical a departure from its present conceptual structure would be required to cure its real or perceived ills.
Taleb, N. N. 2010. The Black Swan: The Impact of the Highly Improbable. New York: Random House.
Granger, C. and P. Newbold. 1974. "Spurious Regressions in Econometrics," Journal of Econometrics, 2, 111–120.
------------ 1977. Forecasting Economic Time Series, Academic Press.
Jarrow, R., Protter, P., and Shimbo, K. 2006. "Asset Price Bubbles in a Complete Market," Advances in Mathematical Finance, In Honor of Dilip B. Madan: 105–130.
Galton, F. 1889. Natural Inheritance. London: Macmillan.
Carr, P. and Wu, L. 2003. "The Finite Moment Logstable Process And Option Pricing," Journal of Finance, 58(2), 753–778.
Harrison, J. M. and Kreps, D. M. 1979. "Martingales and Arbitrage in Multiperiod Securities Markets," Journal of Economic Theory, 20, 381–408.
Harrison, J. M. and S. R. Pliska. 1981. "Martingales and Stochastic Integrals in the Theory of Continuous Trading," Stochastic Processes and Their Applications, 11, 215–260.
Geman, H., D. Madan and M. Yor. 2001. "Asset Prices are Brownian Motion: Only in Business Time," in M. Avellaneda (ed.), Quantitative Analysis in Financial Markets: Collected Papers of the New York University Mathematical Finance Seminar, World Scientific, Singapore. Vol. 2, 103–146.
–Peter B. Lerner, MBA, PhD is a semiretired financial researcher currently teaching one business class in Manhattan and residing in Ithaca, NY. The second, more technical, installment of this article will appear if the author survives his planned trip to Moscow in mid-September.
1. I do not define the random walk here; I assume that almost everybody has read, or at least heard of, B. Malkiel's A Random Walk Down Wall Street.