Abstract (4-page PDF)
Common intuitions hold that adding thin-tailed variables with finite variance has a linear, sublinear, or asymptotically linear effect on the total combination, owing to the additivity of variances, leading to convergence of averages. However, this does not take into account even the most minute model error or imprecision in the measurement of probability. We show how adding random variables from any distribution makes the total error (from the initial measurement of probability) diverge; it grows in a convex manner. There is a point at which adding a single variable doubles the total error. We show the effect in probability space (via copulas) and in payoff space (via sums of random variables). Higher-dimensional systems, if unconstrained, eventually become totally unpredictable in the presence of the slightest error in measurement, regardless of the probability distribution of the individual components. The results presented are distribution-free and hold for any continuous probability distribution with support in R. Finally, we offer a framework to gauge the tradeoff between added dimension and error (that is, which reduction in the error at the level of the probability is necessary for an added dimension).
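A toy sketch of the flavor of the abstract's claim, not the paper's actual derivation: suppose n independent components each exceed some threshold with true probability p, so the joint exceedance probability is p**n. A small relative error delta in the measurement of p multiplies the joint probability by (1 + delta)**n, so the relative error in the joint probability grows convexly (in fact exponentially) with dimension. All names and parameter values here are illustrative assumptions.

```python
# Toy illustration (assumed setup, not Taleb's derivation):
# compounding of a small probability-measurement error with dimension.

p = 0.05        # true per-component tail probability (illustrative)
delta = 0.01    # 1% relative error in measuring p

for n in (1, 10, 50, 100, 200):
    true_joint = p ** n                      # true joint exceedance probability
    measured_joint = (p * (1 + delta)) ** n  # joint probability under mismeasured p
    rel_error = measured_joint / true_joint - 1  # equals (1 + delta)**n - 1
    print(f"n={n:4d}  relative error in joint probability: {rel_error:.3f}")
```

Even a 1% per-component error produces a relative error in the joint probability above 170% at n = 100, and the growth is convex: each added dimension inflates the error by a further factor of (1 + delta).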
Monday, July 13, 2015
Nassim Taleb "Error, Dimensionality, and Predictability"
This is a personal bookmark of his draft copy on modeling errors and how they propagate; I hope to come back to it when it is published later this year.