Philip Tetlock has done more than any other academic to help us understand the process of forecasting and making predictions. He has shown us why experts don’t do well, and, with his latest work, has found the secret sauce of ‘Superforecasting’.
Philip Tetlock was born in 1954 and grew up in Toronto. He studied psychology, gaining his BA and MA at the University of British Columbia, before moving to the US to research decision-making for his PhD at Yale.
His career has been entirely academic, with posts at the University of California, Berkeley (Assistant Professor, 1979-1995), Ohio State University (Chair of Psychology and Political Science, 1996-2001), and a return to UC Berkeley (Chair at the Haas School of Business, 2002-2011). Currently, he is Annenberg University Professor at the University of Pennsylvania, where he is jointly appointed between Psychology, Political Science, and the Wharton School.
Tetlock’s early books are highly academic, but he started to come to prominence with the publication, in 2005, of ‘Expert Political Judgment: How Good Is It? How Can We Know?’ This book has become highly influential by documenting the results of Tetlock’s research into the forecasting and decision-making of experts. The bottom line is that the more prominent the expert, the poorer their ability to forecast accurately.
Tetlock’s most recent book, 2015’s ‘Superforecasting: The Art and Science of Prediction’, is one of those few magic books that can change your view of the world, make you smarter, make you feel wiser, and inspire you, all at the same time. It is co-written with journalist Dan Gardner (whose earlier books cover Tetlock’s work [Future Babble] and that of Daniel Kahneman [Risk]), and so is also highly readable.
The Tetlock Two-step
In ‘Expert Political Judgment’, Tetlock is a pessimist. He finds substantial evidence to warn us not to accept the predictions of pundits and experts. They are rarely more accurate than a chimp with a dartboard (okay, he actually compares them to random guessing).
Ten years later, in ‘Superforecasting’, Tetlock is an optimist. He still rejects the predictions of experts, but he has found light at the end of the predictions tunnel. The people he calls ‘Superforecasters’ are good at prediction; far better than experts, far better than chance, and highly consistent too.
If you want to understand how to make accurate predictions and reliable decisions, you need to understand Tetlock’s work.
Hedgehogs and Foxes: The Failure of Experts
In a long series of thorough tests of forecasting ability, Tetlock discovered a startling truth. Experts rarely perform better than chance. Simple computer algorithms that extrapolated the status quo often outperformed them. The best human predictors were those with less narrow expertise and a broader base of knowledge. In particular, the higher the public profile of the expert, the poorer their performance as a forecaster.
This led Tetlock to borrow a metaphor from philosopher Isaiah Berlin: the fox knows many things, but the hedgehog knows one big thing. The experts are hedgehogs: they know one thing very well, but are often outsmarted by the generalists who recognise the limitations of their knowledge and therefore take a more nuanced view. This is often because experts create for themselves a big theory that they are then seduced into thinking will explain everything. Foxes don’t have a grand theory. So they synthesise many different points of view, and therefore see the strengths and weaknesses of each one better than the hedgehogs do.
One result of Tetlock’s work was that the US Government’s Intelligence Advanced Research Projects Activity (IARPA), a research organisation within the ‘Intelligence Community’, set up a forecasting tournament. Eventually, Tetlock moved from helping to design and manage the tournament to participating in it.
Superforecasting: The Triumph of Collective Reflection
Tetlock, along with his wife (University of Pennsylvania Psychology and Marketing Professor, Barbara Mellers), created and co-led the Good Judgment Project, a collaborative team that won the IARPA tournament consistently.
The book, Superforecasting, documents what Tetlock learned about how to forecast well. He identified ‘Superforecasters’ as people who can consistently make better predictions than other pundits. Superforecasters think in a different way. They are more thoughtful, reflective, open-minded and intellectually humble. But despite their humility, they tend to be widely read, hard-working, and highly numerate.
In a Tweet (recent at the time of writing: https://twitter.com/PTetlock/status/738667852568350720, 3 June 2016), Tetlock said of Trump University’s ‘Talk Like a winner’ guidelines:
Guidelines for “talking like a winner” are roughly the direct opposite of those for thinking like a superforecaster
The other characteristics that enable superforecasting, which you can implement in your own organisation’s decision-making, are:
- Screen forecasters for high levels of open-mindedness, rationality and fluid intelligence (reasoning skills), and low levels of superstitious thinking (Tetlock has developed a ‘Rationality Quotient’ or RQ). Also choose people with a ‘Growth Mindset’ and ‘Grit’.
- Collect forecasters together to work as a team
- Aim to maximise diversity of experiences, backgrounds, and perspectives
- Train them in how to work as a team effectively
- Good questions get good answers, so focus early effort on framing the question well to reduce bias and increase precision
- Understand biases and how to counter them
- Embrace and acknowledge uncertainty
- Take a subtle approach and use high levels of precision in estimating probabilities of events
- Adopt multiple models, and compare the predictions each one offers to gain deeper insights
- Start to identify the best performers, and allocate higher weight to their estimates
- Reflect on outcomes and draw lessons to help revise your processes and update your forecasts
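The last three points above lend themselves to a simple sketch: score each forecaster’s resolved predictions with a Brier score (the accuracy measure Tetlock used in his tournaments), then tilt a combined forecast toward the best performers. The forecasters, data, and inverse-score weighting scheme below are illustrative assumptions, not Tetlock’s actual method:

```python
# Illustrative sketch only: names, data, and the weighting rule are
# invented for this example; only the Brier score itself is Tetlock's measure.

def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and outcomes (0 or 1).
    0.0 is perfect; a constant 50% guess scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

def weighted_aggregate(new_forecasts, past_scores):
    """Combine forecasters' probabilities for a new question, giving more
    weight to those with lower (better) past Brier scores."""
    weights = [1.0 / (s + 1e-6) for s in past_scores]  # inverse-score weighting
    return sum(w * p for w, p in zip(weights, new_forecasts)) / sum(weights)

# Three forecasters' track records on five resolved yes/no questions
history = {
    "ann":   ([0.9, 0.8, 0.2, 0.7, 0.1], [1, 1, 0, 1, 0]),
    "ben":   ([0.6, 0.5, 0.5, 0.6, 0.4], [1, 1, 0, 1, 0]),
    "carol": ([0.3, 0.2, 0.9, 0.4, 0.8], [1, 1, 0, 1, 0]),
}
scores = {name: brier_score(f, o) for name, (f, o) in history.items()}

# Ann's sharp, accurate forecasts earn the best score, so the combined
# estimate for a new question sits closest to her 0.85
combined = weighted_aggregate([0.85, 0.60, 0.30], list(scores.values()))
```

The design choice here (inverse-score weights) is just one plausible way to “allocate higher weight” to proven performers; the Good Judgment Project’s published aggregation algorithms were more sophisticated, including extremising the pooled probability.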