Tuesday, October 13, 2015

Credit Suisse's Mauboussin: "Sharpening Your Forecasting Skills"

We've looked at this a few times, links below, but here Mauboussin cuts to the chase.
From Credit Suisse, September 28, 2015:

Foresight Is a Measurable Skill That You Can Cultivate
Philip Tetlock’s study of hundreds of experts making thousands of predictions over two decades found that the average prediction was “little better than guessing.” That’s the bad news.

Tetlock, along with his colleagues, participated in a forecasting tournament sponsored by the U.S. intelligence community. That work identified “superforecasters,” people who consistently make superior predictions. That’s the good news.

The key to superforecasters is how they think. They are actively open-minded, intellectually humble, numerate, thoughtful updaters, and hard working.

Superforecasters achieve better results when they are part of a team. But since there are pros and cons to working in teams, training is essential.

Instruction in methods to reduce bias in forecasts improves outcomes.

There must be a close link between training and implementation.

The best leaders recognize that proper, even bold, action requires good thinking.
Introduction: The Bad News and the Good News

What if you had the opportunity to learn how to improve the quality of your forecasts, measured as the distance between forecasts and outcomes, by 60 percent? Interested? Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner is a book that shows how a small number of “superforecasters” achieved that level of skill. If you are in the forecasting business—which is likely if you’re reading this—you should take a moment to buy it now. You’ll find that it’s a rare book that is both grounded in science and highly practical.
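The "distance between forecasts and outcomes" that Tetlock's tournaments scored is, in the standard setup, the Brier score: the mean squared difference between a probability forecast and what actually happened. A minimal sketch (the function name is illustrative, not from the report):

```python
def brier_score(forecasts, outcomes):
    """Mean squared distance between probabilistic forecasts (0..1)
    and binary outcomes (0 or 1). Lower is better; 0 is perfect,
    0.25 is what a permanent 50/50 hedger scores."""
    if len(forecasts) != len(outcomes):
        raise ValueError("forecasts and outcomes must be the same length")
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident, mostly-correct forecaster scores near zero:
confident = brier_score([0.9, 0.8, 0.1], [1, 1, 0])   # 0.02
# A 60% improvement, as claimed in the excerpt, would cut a
# baseline score to 40% of its original value:
hedger = brier_score([0.5, 0.5, 0.5], [1, 1, 0])       # 0.25
improved = hedger * 0.4                                 # 0.10
```

Note the asymmetry this metric creates: a forecaster who says 90% and is wrong takes a large penalty (0.81), which is why calibration, and not just boldness, is what the superforecasters were selected on.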

Phil Tetlock is a professor of psychology and political science at the University of Pennsylvania who has spent decades studying the predictions of experts. Specifically, he enticed 284 experts to make more than 27,000 predictions on political, social, and economic outcomes over a 21-year span ended in 2004. The period included six presidential elections and three wars. These forecasters had crack credentials, including more than a dozen years of relevant work experience and lots of advanced degrees—nearly all had postgraduate training and half had PhDs.

Tetlock then did something very unusual. He kept track of their predictions. The results, summarized in his book Expert Political Judgment, were not encouraging. The predictions of the average expert were “little better than guessing,” which is a polite way to say that “they were roughly as accurate as a dart-throwing chimpanzee.” When confronted with the evidence of their futility, the experts did what the rest of us do: they put up their psychological defense shields. They noted that they almost called it right, or that their prediction carried so much weight that it affected the outcome, or that they were correct about the prediction but simply off on timing. Overall, Tetlock’s results provide lethal ammunition for those who debunk the value of experts.

Below the headline of expert ineffectiveness were some more subtle findings. One was an inverse correlation between fame and accuracy. While famous experts had among the worst records of prediction, they demonstrated “skill at telling a compelling story.” To gain fame it helps to tell “tight, simple, clear stories that grab and hold audiences.” These pundits are often wrong but never in doubt.... MUCH MORE
HT: The Guru Investor

Previously on Shoulda seen it coming:

"Superforecasting: The Art and Science of Prediction"

Edge Magazine's Master Class In Forecasting With Philip Tetlock

Dec. 2012 
"How To Win At Forecasting" (Philip Tetlock and the Intelligence Advanced Research Projects Agency)

We've linked to Edge a few times. The Observer called it "The World's Smartest Website" but sometimes they're a bit too precious for my taste. This isn't one of those times.
"IARPA: It's like DARPA but for spies!" 
"IARPA's mission [is] to invest in high-risk/high-payoff research programs that have the potential to provide the United States with an overwhelming intelligence advantage over future adversaries."
– FBI National Press Release, 2009
Sept. 2013
Daniel Kahneman's Favorite Paper: "On the Psychology of Prediction"

June 2014 
Elite Forecasters and The Best Way to Predict the Future 

Sept. 2014 
"U.S. Intelligence Community Explores More Rigorous Ways to Forecast Events"

"Pseudo-Mathematics and Financial Charlatanism...."
“What should one do: predict specifics, or forecast broad trends that necessarily miss specifics?”
"Thinking Clearly About Forecasting"
How to Predict a Nation's GDP per Capita at r=.97 Using "Economic Freedom and average citizenry IQ -- plus slight tweaks from trading block membership and oil"

And many more.