Counterfactualize this!
"Counterfactualizing" is thinking about that which did not happen. As it turns out, humanity is pretty bad at it. We've become only too deeply aware of this human flaw in the face of the current COVID-19 crisis.
Could this crisis have been foreseen? Sure - and some have. Could we have been better prepared for it? Most certainly ... but our minds have a way of shying away from too much of what's called "downward counterfactual" thinking.
A few years back, Swiss Re Institute's Head of Cat Perils Martin Bertogg arranged for Dr. Gordon Woo to come share his insights at Swiss Re Headquarters in Zurich. The title of the presentation was "Counterfactual Disaster Risk Analysis" ... as that didn't mean a thing to me, I signed up, of course, and it proved to be time exceedingly well spent. Dr. Woo, Catastrophist (a catastrophist is, essentially, someone who imagines worst-case scenarios) at RMS, proceeded to share a wealth of slides, explaining upward and downward counterfactual thinking - and shared many examples that illustrated beautifully just how inept we are when it comes to learning from that which did not happen.
Upward/downward counterfactual thinking
Imagine you're at the casino, playing the one-armed bandit, and you keep missing the big win by just a little. In that case, most people will continue to try. Even though the event (the big win) didn't happen, people keep playing because they believe it will happen - the scenario of a good outcome is clearly in their mind. Humanity's big flaw comes with downward counterfactual thinking. When a big earthquake/volcano eruption/car crash/tsunami disaster/terror attack/etc. etc. does NOT happen, then we all basically utter a collective "whew" and get on with living life as we have been as quickly as possible. Let's just move on, it didn't happen, all's fine, thank God!
Upward counterfactual thinking is "it could have been better"; downward counterfactual thinking is "it could have been worse" ... so what do we do with that insight? Dr. Woo mentioned Nassim Taleb's Black Swan Theory ... but there are lots of events, as the catastrophist showed, that prove to have been far from Black Swan events. All you have to do is look, without blinders, on a consistent basis. Look at what did happen, by all means, but also explore what did not happen. If something didn't happen, or wasn't that bad, thanking one's good fortune isn't the smart thing to do. The smart thing is to look at that situation and imagine how it might have been worse (and then learn from that, of course).
9/11 didn't have to happen
He shared a number of examples: of minor earthquakes, of volcanic eruptions that didn't break the surface, of solar storms that missed us, of nuclear disasters where the winds helped - even of the infamous 9/11 terrorist attack. With that one, everyone said nothing like it could have been foreseen ... Dr. Woo showed, however, that a plot had been foiled in 1999 - a plot where a terrorist had planned to hijack an airplane and fly it into the Eiffel Tower. That event luckily didn't happen ... but did anyone then think and explore "how it could have been worse"? Did anyone learn from what did not happen in a way that might have foiled the 9/11 attack? No - it was seen as a Black Swan event, unimaginable ... until it happened.
I remember Dr. Woo suggesting that his insights into counterfactual thinking should form part of the training for anyone entering the insurance industry. I think he's right and I think he goes nowhere near far enough. Every company, every organization, every government must have such scenario thinkers - people tasked with observing everything that does and does not happen, who then explore how it could have been worse and, from there, make recommendations, adjust pricing, highlight dangers and pinpoint possible ways of mitigation.
We need more "scenario thinkers"
These don't need to be people with all the answers. Their job is to ask the questions and come up with worst-case stories. The dam that cracked but didn't break; the ships that collided but didn't spill the oil; the volcano that went back to sleep ... Scenario thinkers would consistently go all the way, asking: "What would have happened if ..."; "What else could have happened?"; "What would have been worse?"; "What would have been the worst possible outcome?" From those scenarios, experts across the world could then evaluate and re-evaluate their models and offerings and pricing - and could, with these insights, help clients/organizations/governments become more resilient.
I think that the worst thing we can do is look away. Let's get personal for a moment: Imagine your child was almost run over by a car on the way home from school. You're angry and you're incredibly relieved that nothing bad happened ... but imagine the worst case. What could have happened? To your child. To someone else's child? What could prevent that worst-case scenario? Speed bumps? A crosswalk? A traffic light? Action, unfortunately, is most often only taken after something terrible actually has happened - then those speed bumps will be in place in no time ...
In that one hour Dr. Woo had shared a great deal and, as a writer, I remember feeling very much at home in his world of scenarios ... after all, that's what a writer does - he asks "What if" ... and then explores those scenarios. All of this is about learning from everywhere we can - and that means exploring everything (without blinders) that did and did not happen, and everything that could have happened. By doing this, we wouldn't just avoid being surprised by events such as 9/11 or a global pandemic - we might have been ready with counter-measures to stop them in their tracks.
PS: Imagine this - a world where governments, organizations and companies all take action, investing and building for a safer future. Imagine working through worst-case scenarios and building the right global infrastructure to safeguard against earthquakes, dam breaks, storms ... now imagine the economic boom this would create ... here's hoping that we start, now, by no longer looking away from the things that did not happen!