In an earlier article, Maintaining an effective cyber posture during times of rapid and widespread change, we discussed the various types of cyber risk based on their predictability and whether one could deal with them proactively or reactively. The final type of risk was the “Black Swan”: the risk whose likelihood is so infinitesimal that it might never occur to us, yet which – if it manifests – will have disastrous consequences.
First, let us concede that some disasters are genuinely unpredictable. British Airways Flight 009 is an example: a BA Boeing 747-200 flying over Indonesia in June 1982 unexpectedly flew into a cloud of volcanic ash and, over the course of a few minutes, lost all four engines. Good airmanship and a little luck saw the aircraft land safely in Jakarta, and it is reasonable to accept that, with the technology of the day, the ash cloud could not have been foreseen.
There is a gulf, however, between the unforeseeable and the unforeseen, and a common tendency to treat the latter as the former – yet “unforeseeable” and “highly unlikely” are vastly different concepts.
Organisations with effective risk teams have regular discussions about Black Swan risks. Generally speaking, the organisation’s risk regime involves maintaining registers of identified risks, in which the inherent risks are listed and controls are applied to bring the residual risks (the risks that remain once the controls are in place) within the organisation’s risk appetite. On top of this, however, further discussions will take place on a regular – six- or 12-monthly – basis to try to tease out the risks that had not previously been considered.
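To make the mechanics concrete, here is a minimal sketch of such a register in Python. The field names, the five-point scoring scale, the way controls are modelled and the appetite threshold are all assumptions made for the example; real risk regimes vary widely.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One line of an illustrative risk register (scales are assumed)."""
    name: str
    inherent_likelihood: int  # 1 (rare) .. 5 (almost certain), before controls
    inherent_impact: int      # 1 (negligible) .. 5 (catastrophic)
    control_effect: int       # likelihood points removed by the controls

    @property
    def inherent_score(self) -> int:
        return self.inherent_likelihood * self.inherent_impact

    @property
    def residual_score(self) -> int:
        # Residual risk: what remains once the controls are in place.
        residual_likelihood = max(1, self.inherent_likelihood - self.control_effect)
        return residual_likelihood * self.inherent_impact


RISK_APPETITE = 6  # maximum residual score tolerated (an assumed figure)

register = [
    RiskEntry("Ransomware on file servers", 4, 5, 3),
    RiskEntry("Loss of primary data centre", 2, 5, 0),
]

for risk in register:
    status = ("within appetite" if risk.residual_score <= RISK_APPETITE
              else "NEEDS MORE CONTROLS")
    print(f"{risk.name}: inherent {risk.inherent_score}, "
          f"residual {risk.residual_score} -> {status}")
```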
Information sharing – particularly the ability to see the experiences of others – is key to such Black Swan discussions, because the tendency is to try to imagine events that have not occurred in one’s own organisation before, but which might happen one day. There is a strong likelihood, though, that even if a given event has not happened in one’s own organisation, it may well have happened in someone else’s. Let us assume that we have chosen to consider a Black Swan event as one that is likely to affect our organisation once in 1,000 years. If we can look at 10,000 organisations’ incident data, we should see an average of ten a year (10,000 × 1/1,000) – they will stand out as surprisingly common, not vanishingly rare.
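The arithmetic is worth spelling out, and a short simulation – using the article’s hypothetical figures of a once-in-1,000-years event and a pool of 10,000 organisations – shows how routinely such an event appears once the sample is large enough:

```python
import random

# The article's hypothetical figures: an event expected once in 1,000
# years for any one organisation, seen across 10,000 organisations.
ANNUAL_PROBABILITY = 1 / 1_000
ORGANISATIONS = 10_000

# Expected events per year across the whole pool:
expected = ANNUAL_PROBABILITY * ORGANISATIONS
print(f"Expected events per year across the pool: {expected:.0f}")  # -> 10

# Simulate 20 years: each organisation independently suffers the event
# with the annual probability above.
random.seed(1)
for year in range(1, 21):
    events = sum(random.random() < ANNUAL_PROBABILITY
                 for _ in range(ORGANISATIONS))
    print(f"Year {year:2d}: {events} organisations hit")
```

Run it and the “once in a millennium” risk turns up roughly ten times every single year – surprisingly common, exactly as the paragraph above argues.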
It must also be remembered that internal communication is critical. We can all recall instances where something apparently unexpected has happened and it has transpired that the issue was well known in one part or another of the organisation – often to the “shop floor” staff, who have devised a workaround and not thought it a big deal.
Another essential activity is to “think the unthinkable” – which is better termed “think the unpalatable”. Organisations such as NASA have to do this constantly, as human lives are at stake in much of what they do, but why do others not go to the same lengths? Hospitals are an obvious case, as more and more of their equipment is network-connected. The same is true of large-scale engineering operations, where a security compromise of heavy equipment could cause injury or death.
But even factors that do not affect human safety can be unpalatable to us. We consider the risk of single points of failure and introduce secondary systems that take over should the primary fail, but it is common not to go one step further and ask what would happen if all the systems failed – and, more importantly, how we would respond. Even if we come right up to date and consider the impact of COVID-19, where the scale was unforeseen, many of the impacts were things that already occur in incident simulations – cyber or otherwise – such as large numbers of staff becoming ill, offices becoming unusable or public transport being unavailable.
Very few bad things that happen to us have not previously happened to someone else – potentially even (in the case of the 1918 ‘flu pandemic, whose outcome bore more than a passing resemblance to the pandemic of 2020) to our fairly recent ancestors. While we cannot possibly plan for every eventuality of every type of event, if we open our minds, our history books and our communication channels we will usually find that some kind of response – or at least a partial response – can be documented for the vast majority.
A Black Swan event is, as we have noted, one that seldom happens but has a devastating impact when it does. But all events have causes, and very few causes are unique.