The Statistician Who Ate Humble Pie
Tuesday, September 15, 2009

Nassim Nicholas Taleb, author of The Black Swan: The Impact of the Highly Improbable, introduces an engaging lesson in business forecasting from Dance with Chance: Making Luck Work for You, by Spyros Makridakis, Robin Hogarth, and Anil Gaba.
Just about every human decision about the future is tainted by a gap — the difference between what we think we know and what we actually know. The more expert we are, the wider the gap is likely to be. The story below, an excerpt from the book Dance with Chance, is a classic example of an expert-busting enterprise. …
— Nassim Nicholas Taleb
Excerpted from chapter 9 of Dance with Chance: Making Luck Work for You
As an expert in statistics, working in a business school during the 1970s, one of the authors…couldn’t fail to notice that executives were deeply preoccupied with forecasting. … It bugged the professor greatly that practitioners were making these predictions without recourse to the latest, most theoretically sophisticated methods developed by statisticians like himself. …
Every decent statistician knows the value of a good example, so the professor and his research assistant collected many sets of time-series data from a wide range of economic and business sources. In fact they hunted down 111 different time series, which they analyzed and used to make forecasts … [Each] series was split into two parts: … The researchers pretended that the later part hadn’t happened yet and proceeded to fit various statistical techniques, both simple and statistically sophisticated, to the earlier data. …
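To see the evaluation scheme in miniature, here is a short Python sketch of the same idea: split a series into a fitting part and a pretend-unseen holdout, forecast the holdout, and score the result. The data, the naive "last value" method, and the MAPE error measure are all illustrative stand-ins; the excerpt does not specify these details.

    def split_series(series, n_holdout):
        """Split a series into a fitting part and a pretend-unseen holdout."""
        return series[:-n_holdout], series[-n_holdout:]

    def naive_forecast(history, horizon):
        """Forecast every future period as the last observed value."""
        return [history[-1]] * horizon

    def mape(actual, forecast):
        """Mean absolute percentage error, one common accuracy measure."""
        return 100 * sum(abs(a - f) / abs(a)
                         for a, f in zip(actual, forecast)) / len(actual)

    # Invented monthly figures, purely for illustration.
    series = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]
    history, holdout = split_series(series, n_holdout=3)
    forecast = naive_forecast(history, horizon=len(holdout))
    print(f"MAPE on the held-out data: {mape(holdout, forecast):.1f}%")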
Horror of horrors, the … simple … techniques turned out to be more accurate than the statisticians’ clever, statistically sophisticated methods. To be honest, neither [was] particularly great …
One of the simplest methods, known as “single exponential smoothing,” in fact appeared to be one of the most accurate. Indeed, for 61.8% of the time it was more accurate than the so-called Box-Jenkins technique, which represented the pinnacle of theoretically based statistical forecasting technology back in the 1970s. …
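Single exponential smoothing amounts to a single recurrence: the smoothed level is updated as level = alpha * x + (1 - alpha) * level for each new observation x, and every future period is forecast as the final level. A minimal sketch follows; the smoothing constant alpha = 0.3 and the data are arbitrary illustrative choices, and in practice alpha would be tuned on the fitting data.

    def single_exponential_smoothing(series, alpha):
        """Return the flat forecast produced by single exponential smoothing:
        level_t = alpha * x_t + (1 - alpha) * level_{t-1}."""
        level = series[0]              # initialise with the first observation
        for x in series[1:]:
            level = alpha * x + (1 - alpha) * level
        return level

    history = [112, 118, 132, 129, 121, 135, 148, 148, 136]
    forecast = single_exponential_smoothing(history, alpha=0.3)  # illustrative alpha
    print(f"Forecast for every future period: {forecast:.1f}")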
… If there’s one thing that makes up for an academic proving himself wrong, it’s the opportunity to show that other eminent authorities are wrong too. So the professor submitted a paper on his surprising and important findings to a prestigious, learned journal … The paper was rejected on the grounds that the results didn’t square with statistical theory! … [Another] journal did decide to publish the paper, but they insisted on including comments from the leading statisticians of the day. … Among the many criticisms was a suggestion that the poor performance of the sophisticated methods was due to the inability of the author to apply them properly.
…[The] valiant statistician and his faithful assistant set out to prove their critics wrong. This time around they collected and made forecasts for even more sets of data (1,001 in total, …), from the worlds of business, economics and finance. As before, the series were separated into two parts: the first used to develop forecasting models and make predictions; and the second used to measure the accuracy of the various methods. … Instead of doing all the work himself, the author asked the most renowned experts in their fields … to forecast the 1,001 series. All in all, fourteen experts participated and compared the accuracy of seventeen methods.
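The competition design is essentially a tournament: every method forecasts the holdout of every series, and accuracy is averaged across series to rank the methods. A self-contained sketch of that loop follows; the two toy methods and the single invented series stand in for the seventeen methods and 1,001 series actually used.

    def mape(actual, forecast):
        """Mean absolute percentage error over a holdout."""
        return 100 * sum(abs(a - f) / abs(a)
                         for a, f in zip(actual, forecast)) / len(actual)

    # Toy stand-ins for the competition's methods: each maps
    # (history, horizon) to a list of forecasts.
    methods = {
        "naive": lambda h, n: [h[-1]] * n,
        "3-period moving average": lambda h, n: [sum(h[-3:]) / 3] * n,
    }

    # One invented series stands in for the study's 1,001.
    collection = [[112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]]

    scores = {}
    for name, method in methods.items():
        errors = []
        for s in collection:
            history, holdout = s[:-3], s[-3:]
            errors.append(mape(holdout, method(history, len(holdout))))
        scores[name] = sum(errors) / len(errors)  # average across series

    for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
        print(f"{name}: average MAPE {score:.1f}%")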
… The findings were exactly the same as in his previous research. Simpler methods were at least as accurate as their complex and statistically sophisticated cousins. The only difference was that this time there were no experts left to criticize the results, as most of the world’s leading authorities had taken part.
… the basic conclusion — supported by many other academic studies over the past three decades — remains steadfast. That is, when forecasting, always use the KISS principle: Keep It Simple, Statistician.
— Spyros Makridakis, Robin Hogarth, and Anil Gaba