Performing a Project Premortem
Gary Klein – Harvard Business Review
Projects of every kind go horribly wrong in a bewildering variety of ways. Despite that, the instinct to assume that the act of planning is enough to ensure that assumptions hold and that delivery follows a smooth path can be overwhelming. But there is great power in assuming the opposite: if we examine early on some of the ways in which the project might have gone horribly wrong, in time to think about how the risks might be mitigated, the chances of successful delivery are enhanced.
That’s the idea behind a premortem – not waiting until the project is dead before performing the autopsy. At root it’s a simple risk management technique; its power lies in making it easier for people to liberate themselves from optimism bias.
This is not a new article, but it is the definitive account of the method, and having tracked it down to share with a colleague, I thought it worth sharing here too. I was first introduced to it by Naomi Stanford, who has an interesting blog post contrasting premortems with devil’s advocacy – the former is a distinctly (if curiously) more positive approach.
Provocation: Redesigning Artificial Intelligence – From Australia Out
Ellen not only always has interesting things to say, she is also unusually effective in finding interesting ways of saying them. This latest piece defies categorisation. It is an essay about AI. It is a reflection on extreme utilitarianism. It is a call to action on the hidden costs of social harmony. It is about edge cases where the edges are sharp and cause harm to those whose lives place them there. It is a call to bring the messiness of cybernetics and systems to the delusional clarity of dehumanised AI. It is a discussion of issues not discussed. It is a challenge to do better.
We are more aware of the threads that bind us together. We have had a glimpse of the fragility of the foundations on which our lives of easy comfort are built. When the exchange for that comfort is the discomfort of others. And so in this space is room to imagine some place else.
And as well as all those things, it is an audio-visual experience, with a soundscape which drifts beyond music and imagery which is not quite illustration. The tone is neither soothing nor haranguing. But in its matter of factness there is great power.
A Failure, But Not Of Prediction
Scott Alexander – Slate Star Codex
At one level, this is about why we didn’t see COVID-19 coming. At another, it is using that as a case study of a whole class of decisions which depend on making judgements about uncertain futures – which is to say most of the ones that matter. The problem is not a shortfall in prediction skills, which is just as well because prediction is a tricky game. It is instead presented as a shortfall in probabilistic reasoning skills, which in turn relates to the classic risk management scales of likelihood and impact. Low likelihood, high impact events matter a great deal – which is why the insurance industry exists. If there is a 10% chance of an imminent global pandemic, it is well worth investing in mitigation, even if it turns out that the pandemic fizzles out – which is why it made sense to stockpile large quantities of flu vaccine in 2009 which turned out not to be needed.
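The underlying arithmetic can be made concrete with a toy expected-value comparison. The sketch below uses illustrative numbers chosen for this example (they are assumptions, not figures from the article): a 10% chance of a pandemic, and losses in arbitrary units. The point it demonstrates is the one made above – mitigation can be the better bet even though, most of the time, the stockpile turns out not to be needed.

```python
# Toy expected-value comparison; all numbers are illustrative assumptions.

def expected_cost(p_event: float, cost_if_event: float,
                  cost_if_not: float) -> float:
    """Expected cost = P(event) * cost(event) + P(no event) * cost(no event)."""
    return p_event * cost_if_event + (1 - p_event) * cost_if_not

p = 0.10                  # assumed 10% chance of an imminent pandemic
stockpile_cost = 1.0      # cost of mitigation (e.g. a vaccine stockpile)
unmitigated_loss = 100.0  # loss if the pandemic arrives unprepared
mitigated_loss = 20.0     # loss if it arrives but mitigation is in place

# Without mitigation: lose everything if the pandemic happens, nothing otherwise.
do_nothing = expected_cost(p, unmitigated_loss, 0.0)
# With mitigation: pay the stockpile cost either way, plus the reduced loss.
mitigate = expected_cost(p, mitigated_loss + stockpile_cost, stockpile_cost)

print(f"expected cost, no mitigation: {do_nothing:.1f}")  # 10.0
print(f"expected cost, mitigation:    {mitigate:.1f}")    # 3.0
```

On these (assumed) numbers, mitigating costs 3.0 in expectation against 10.0 for doing nothing – even though nine times out of ten the stockpile is “wasted”.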
But, slightly less explicitly in the article, there is another step which is essential before any of this becomes useful, as opposed to merely interesting. Probabilistic reasoning can be a good pointer to action, but it has succeeded only if appropriate action is in fact taken. So perhaps those showing greatest wisdom back in January and February were neither those who dismissed what was happening in Wuhan as far away and unimportant, nor those who jumped immediately to proclaiming imminent global catastrophe – but those who saw in an apparently moderate risk an immediate need to take precautionary action.
There is, of course, a political dimension to this as well. Back in 2009, the then French Health Minister was heavily criticised for the money spent – money apparently wasted – on one of those vaccine stockpiles. She is, quite rightly, unapologetic, but it’s another reason why understanding the concept of risk and its mitigation is important. As Alexander observes,
Uncertainty about the world doesn’t imply uncertainty about the best course of action!