Everything is Obvious 6/10
6/ The Dream of Prediction
Our eagerness to make predictions about the future is matched only by our reluctance to be held accountable for the predictions we make.
The quality of experts’ predictions is roughly the same as that of non-experts, and worse than what a simple statistical model would produce. Roughly 80% of predictions turn out wrong [Steven Schnaars], whether made by experts or not. Meanwhile, nobody predicted Google or Facebook as would-be successes until their success was … obvious.
To state the obvious, there are predictions we can make reliably and ones we can’t. The former have a lot to do with simple systems: ones whose models can capture all or most of the variation in what we observe [think pendulums and orbits of satellites]; physics textbook exercises fall into this category. Even the most complex scientific models (rocket science) describe a collection of simple processes.
Enter complex systems, where complexity arises not from the number of components but from their nonlinear interactions with each other (e.g., the bullwhip effect). Models of complex systems tend to stay quite simple, because incremental refinements make little difference to the output: too many variables remain unaccounted for. This is most pronounced in the dynamics of social systems, which are always complex.
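The bullwhip effect mentioned above can be sketched as a toy simulation (a deliberately crude illustration, not a faithful supply-chain model; tier count and smoothing factor are arbitrary choices): each upstream tier sees only the orders of the tier below, forecasts them naively, and over-orders when demand rises, so mild retail noise amplifies into large upstream swings.

```python
import random

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def bullwhip(retail_demand, tiers=4, smoothing=0.5):
    """Toy model: each tier forecasts the orders it receives by
    exponential smoothing and chases the trend, amplifying variance."""
    series = [retail_demand]
    for _ in range(tiers):
        downstream = series[-1]
        forecast = downstream[0]
        orders = []
        for d in downstream:
            forecast += smoothing * (d - forecast)     # smooth the signal
            orders.append(max(0.0, 2 * d - forecast))  # over-order on upticks
        series.append(orders)
    return series

random.seed(1)
retail = [10 + random.gauss(0, 1) for _ in range(500)]  # mild noise at retail
levels = bullwhip(retail)
print([round(variance(level), 1) for level in levels])  # variance grows upstream
```

A small nonlinearity in each tier’s local rule, not the number of tiers, is what produces the system-wide distortion.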
Simple systems allow us to predict actual events, while complex systems allow us to predict, at best, the probability of such events. [MK: and don’t get me started on the difference between probability and uncertainty.] Marketers should be acutely aware of this when assuming that more data about a user means a better prediction of that user’s specific behaviour. The human brain, however, doesn’t like probabilities; it likes certainty, down to the level of whether one should bring an umbrella to work tomorrow or not. [MK: always carry a portable umbrella, it can be useful in unexpected ways!]
Most major events are one-off in the sense that they occur in unique conditions that can’t be fully repeated. Thus, a 90% chance of something doesn’t mean one gets 10 attempts, 9 of which will succeed. Once an event with a 60% probability has materialized, within a few years people will forget that the alternative, 40%-probable future ever existed. The human mind wants to know what will actually happen, and that is something no one can reliably tell without looking back.
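A minimal sketch of why a one-off outcome tells us almost nothing about the odds behind it (P = 0.6 is a made-up probability, not from the book):

```python
import random

random.seed(42)
P = 0.6  # hypothetical probability of some one-off event

# History runs once: the event either happens or it doesn't.
happened = random.random() < P
print(happened)  # a single True/False says almost nothing about P

# Only re-running the same conditions many times would recover P ...
trials = [random.random() < P for _ in range(100_000)]
print(round(sum(trials) / len(trials), 2))  # ≈ 0.6
# ... but unique historical conditions never grant us those re-runs.
```

The second print converges to P only because the simulation can repeat the draw; history cannot.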
There’s a difference between being uncertain about the future (which may be cured by obtaining more information) and the future itself being uncertain (the information is unknowable). The latter is pretty much the world we live in, where at best we can hope to know the probabilities.
If this is not complex enough, let’s remember Donald Rumsfeld’s “unknown unknowns”: we may not know which outcomes we should be making predictions about in the first place. And here comes the dangerous fallacy: in hindsight we know what has actually happened. So it’s not just that we know how our predictions turned out (they might’ve been incorrect), but also which predictions we should’ve made (the ones that would have been correct). Making the prediction right is just as important as making the right prediction.
What is relevant can’t be known at the time, only later, and thus it can’t be planned and optimized for. Since any outcome is just a milestone on the way to further outcomes, the degree of relevance of an action or an event varies with time.
Predictions only make practical sense if they’re actionable. Being right or wrong in a prediction about an unimportant thing is unimportant. This is precisely why KPIs don’t work as a tool for high-level executives with freedom of action (and can backfire on the rest of the firm).
One can’t talk about predictions without mentioning Nassim Taleb’s black swans. Natural events have heavy-tailed distributions: small events go unnoticed while large ones are catastrophic. Social black swans, however, are not events per se; they are the subsequent transformations of society triggered by an event (or by a series of events conveniently wrapped into a single one to simplify the narrative, inevitably losing important details and complexity).
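The heavy-tail point can be shown with a quick comparison (the Pareto shape parameter 1.1 is an arbitrary choice of a heavy-tailed distribution): in a light-tailed sample the largest observation is a negligible share of the total, while in a heavy-tailed one a single draw can dominate everything else combined.

```python
import random

random.seed(0)
N = 100_000

# Light tail: absolute Gaussian draws; the maximum is a tiny share of the sum.
light = [abs(random.gauss(0, 1)) for _ in range(N)]
print(round(max(light) / sum(light), 5))

# Heavy tail: Pareto draws; one extreme observation can dominate the total.
heavy = [random.paretovariate(1.1) for _ in range(N)]
print(round(max(heavy) / sum(heavy), 5))
```

In the light-tailed world, averages describe the system; in the heavy-tailed one, the single largest event does.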
But what is the event? The “invention” of the internet is not an event; it’s a 40+ year journey of discovering data transmission and presentation technologies, new hardware, business models and the blockchain. The event itself may be trivial; what’s worth exploring is the weak links it exposes and destroys, leading to transformations on a catastrophic scale.
So black swans can’t be predicted: the process would require us to predict a certain outcome and then predict the future beyond that outcome to establish the outcome’s importance. That is not a prediction but a prophecy: foretelling not just the outcome, but also its future meaning.
Deriving meaning from past events is useless for making prophecies, because our view of the past (i.e., the events and their meaning) is a collective storytelling effort by many actors, professional and amateur, to make sense of what happened. This complicates things: a prophecy would also have to predict the dominant narrative those actors will eventually construct [MK: or as I call them – people with opinions and lots of free time].
Predictions made in front of friends and family may turn out embarrassing, but predictions made by policymakers or CEOs may turn out devastating. So when the stakes are high, common sense has to be thrown out of the window.