Strategy and decisions in a time of uncertainty
Imagine the intelligence arm of a national government, with all its reach, secrecy and resources, throwing a competition open to Joe and Jane Public, people with no domain-specific knowledge, and asking them to forecast geopolitical and economic trends. Imagine individuals and teams competing to see who could make the most accurate predictions on the kind of challenging questions the intelligence community deals with every day, as it tracks trends with the potential to trigger events that could affect national security. A forecasting competition? Competing with an organisation whose day job is intelligence? Strange idea, isn’t it?
Actually, you don’t have to imagine it at all, because it happened. The government in question was the US, where an agency within the intelligence community was seeking to improve its forecasting ability by understanding the characteristics that make for good forecasting. It set up a tournament in which five teams led by researchers would compete, each answering the same questions in the same time frame and using the same publicly available data to research the field. The questions included things like: “On 15 September 2014, will the Arctic sea ice extent be less than that of 15 September 2013?” or: “Will the London Gold Market Fixing price of gold (USD per ounce) exceed $1850 on 30 September 2011?” Interesting, but hardly mainstream.
But what is mainstream are the findings about what makes a good forecaster, and this and much more is described in the book Superforecasting by Tetlock and Gardner. We are all forecasters. Every business is in the forecasting business: it needs to sense and make sense of trends in its environment, to set its strategy for the events to come and perhaps to shape those events. We all need to look into the future, about which there is no hard data and which is intrinsically uncertain, and place our personal and business bets on what we think will happen. So understanding the characteristics of good forecasters, and purposefully building those capabilities within organisations, seems like a no-brainer investment. After all, strategic decision taking is one of the highest-leverage activities in any organisation, so an improvement in forecasting is likely to improve the performance of the organisation as a whole.
What’s encouraging about Tetlock and Gardner’s book is that the Superforecasters of their title are both born and made. Yes, there are some aptitudes and ways of thinking which give people a great start at being forecasters (forecasters born), and it is possible to improve forecasting ability (forecasters made) by applying a set of techniques, by practice and using feedback to learn, and by working in a team.
Being a good forecaster requires a range of abilities. One, and a critical one, is being mindful of the two ways of thinking we all use, as described by Kahneman in Thinking, Fast and Slow. He characterised System 1 thinking as fast, automatic, frequent, emotional, stereotypic and subconscious: the crucial human ability to generalise and to rely on past experience and intuition. He defined System 2 thinking as slow, effortful, infrequent, logical, calculating and conscious. Unsurprisingly, forecasting is a System 2 skill. It requires the forecaster to stay mindful and evidence-based at all times, including carrying out a thorough and unflinching review of both accurate and inaccurate forecasts in order to learn and improve. Handling varied and dissonant viewpoints also matters: the person who holds the opposite view to yours is likely drawing on at least some different data and different assumptions, so understanding those differences will probably moderate the views of both. The book describes many other practices too.
So how much difference could this make? Remember that this was a research-based tournament, so a lot of attention was paid to measuring and interpreting the results. At the end of the first year, one of the scientific teams had beaten the official control group by 60%, and by 78% in year two. It even outperformed the professional intelligence analysts working in their own domains, with access to classified data the scientific teams didn’t have. Now wouldn’t you want some of that in your organisation?
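For readers curious how forecast accuracy can be measured at all, the standard tool in tournaments like this one is the Brier score: the mean squared difference between the probabilities a forecaster gave and what actually happened, where lower is better and zero is perfect. The example numbers below are illustrative, not drawn from the tournament itself; this is a minimal sketch of the scoring idea.

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between probability forecasts (0..1)
    and actual outcomes (1 = event happened, 0 = it didn't).
    Lower is better; 0.0 is a perfect score."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident, well-calibrated forecaster (illustrative numbers):
sharp = brier_score([0.9, 0.1, 0.8], [1, 0, 1])   # 0.02

# Someone who hedges every question at 50/50:
hedger = brier_score([0.5, 0.5, 0.5], [1, 0, 1])  # 0.25

print(sharp, hedger)
```

The hedger’s score shows why this metric rewards genuine skill: answering “maybe” to everything caps you at a mediocre 0.25, so the only way to score well is to be both confident and right.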
There is a caveat, though. The questions in the forecasting tournament looked between one month and a year ahead: long enough for situations to change quite significantly. But when the time frames are longer still, say five or ten years, as when a government plans national defence or an organisation builds capability for a long-term strategic move, situations can change beyond all recognition. I leave you with a note passed from Donald Rumsfeld to George W. Bush. (No, it’s not that quote, the famous “known unknowns” one.) This note says, in effect, that we are still pretty poor at long-range forecasting, so building responses for a range of scenarios is probably wise.
Here’s the note. Nonetheless, a systematic approach to forecasting, preparing and evaluating scenarios, has been shown to help in the short term and is likely to help in the longer term too, even when it produces answers which are inconclusive or wrong. As Louis Pasteur observed, “Chance favours the prepared mind,” and forecasting is a key way to build that preparedness.