Book review: Superforecasting: The Art and Science of Prediction by Philip E. Tetlock and Dan Gardner
I listened to the audiobook while driving interstate.
Phil Tetlock is a professor of psychology and political science at the University of Pennsylvania who has spent decades studying the predictions of experts.
He ran a study of 284 experts who made roughly 27,000 predictions on political, social, and economic outcomes over a 21-year span.
The forecasters had good credentials: relevant work experience and advanced degrees.
Tetlock's first book is “Expert Political Judgment”.
He concluded that experts did “little better than guessing” (the bad news).
A government agency, the Intelligence Advanced Research Projects Activity (IARPA), then set up a forecasting tournament.
Tetlock entered the tournament with the Good Judgment Project (GJP). Of the 2,800 GJP volunteers in the first year, the top 2 percent were called “superforecasters.”
Foresight is a real and measurable skill, and it can be learned and cultivated (the good news).
The secret of the GJP's success is that it was carried out with scientific rigor.
About 70 percent of superforecasters remain accurate from one year to the next rather than regressing to the mean; in fact, they got better.
Experts
Traditional experts rarely measure their accuracy or keep score, yet feedback is what makes any system more accurate, like a closed-loop feedback control system.
Pundits avoid scoring their accuracy because it doesn’t help their careers.
The more famous the experts were, the less accurate they were.
They offer “tight, simple, clear stories that grab and hold audiences.” Famous people are better at selling their opinions than at making predictions.
The experts excelled at storytelling and persuasion, not forecasting.
Intuition
“Illusions of knowledge” and the fallacy of intuition.
In Daniel Kahneman's “Thinking, Fast and Slow”, intuition can lead to incorrect conclusions.
Compare Blink by Malcolm Gladwell: fast thinking (intuition) can be trained over time.
An overreliance on intuition leads to poor decisions: “we move too fast from confusion and uncertainty to a clear and confident conclusion without spending any time in between.” This is Kahneman's fast thinking, jumping to conclusions; it's why we want to know how the movie ends.
Tetlock borrows from a famous essay on thinking styles by the philosopher Isaiah Berlin: foxes know a little about a lot of things, while hedgehogs know one big thing.
Make prediction a science: measure it, study the results, and apply techniques that demonstrably work. Understand the system.
Score your accuracy.
Glenn Brier, a meteorologist, developed the Brier score. It ranges from 0 (perfectly correct) to 2 (perfectly incorrect); always guessing 50/50 scores exactly 0.5. The penalty curve is non-linear: the Brier score is a quadratic rule, so confident misses cost far more than tentative ones.
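The scoring rule above can be sketched in a few lines of Python. This is the two-outcome formulation Brier originally published, which gives the 0-to-2 range the book uses; the function name and interface are my own:

```python
def brier_score(forecasts, outcomes):
    """Two-outcome Brier score (range 0 to 2, lower is better).

    forecasts: probabilities assigned to the event occurring (0.0 to 1.0)
    outcomes:  1 if the event happened, 0 if it didn't
    """
    total = 0.0
    for p, o in zip(forecasts, outcomes):
        # Squared error over both outcomes: the event and its complement.
        # This simplifies to 2 * (p - o)**2, which is why the range is 0 to 2.
        total += (p - o) ** 2 + ((1 - p) - (1 - o)) ** 2
    return total / len(forecasts)

print(brier_score([1.0, 0.0], [1, 0]))  # perfectly confident and correct: 0.0
print(brier_score([0.5, 0.5], [1, 0]))  # always guessing 50/50: 0.5
print(brier_score([0.0], [1]))          # perfectly confident and wrong: 2.0
```

Because the error term is squared, saying 90% and being wrong hurts far more than saying 60% and being wrong, which is exactly the non-linearity described above.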
Forecasting is possible, even though most people are bad at it.
Freakonomics episodes on prediction:
http://freakonomics.com/podcast/how-to-be-less-terrible-at-predicting-the-future-a-new-freakonomics-radio-podcast/
http://freakonomics.com/podcast/new-freakonomics-radio-podcast-the-folly-of-prediction/
http://freakonomics.com/2013/10/30/more-predictions-from-bad-to-worse/
Note: if harder predictions are rewarded more than easy ones (a reward for risk-taking), scoring can drive future predictions to improve, much as the Olympics awards more points for more difficult moves.
A model superforecaster has 4 traits:
Philosophy – comfortable living with uncertainty and thinking in percentages of likelihood, has a healthy sense of humility, never believes in “fate”
Thinking style – good problem-solving and logic skills, intelligence (but not super-intelligence), a high Rationality Quotient (RQ), open-mindedness and openness to experience, constantly improving and embracing feedback, good with numbers but doesn’t overcomplicate things
Methods – pragmatic thinking, like the fox: a jack of all trades, not committed to any one idea (agnostic), higher resolution