Not an ornithological post but a few words about expecting the unexpected – planning for the unknown. Before somebody cries out that if you could expect the unexpected it wouldn’t have been unexpected, I mean: don’t always be so confident about probability predictions.
In Elizabethan England, swans were white – big white birds. A black one? No way! Unheard of; not even thought of – until they were found in Australia. Then they caused a stir – black swans – totally unexpected. But, of course, naturalists came to explain how they evolved and why they shouldn’t have come as a surprise. Predicting the past is a lot easier than predicting the future…
Statistics describes the past (or present) and can give a prediction of what may happen in the future, but only if the future doesn’t bring anything new to the party. I’ve said this in a previous post but I think it’s worth repeating: statistics describes the past and probability tries to predict the future. [I know that some statisticians may argue with this and say statistics is the over-arching discipline but I’m trying to use common language and understanding to clarify a point.] The past doesn’t always give a reliable indication of the future.
If a process is stable, and we understand natural variation, we can be reasonably confident how it will perform in the future; this is where control charts (developed by Walter Shewhart and championed by Deming) are useful, because the unexpected (or unplanned) will occur and control charts help to quickly identify it (aka special cause variation). But that’s not the whole story – oh that it were that simple! Special cause variation usually arises from something we can predict will possibly happen at some time in the future (e.g. a machine part failure) – black swans are things we would never have even conceived, events where we believe we have addressed all possibilities. Take the 2010 Macondo well failure (aka the Deepwater Horizon incident in the US Gulf): I doubt any of the engineers involved in the planning, design or execution of the well thought it possible the cement job would fail and that the back-up systems would also fail – but they did. I doubt any regulator would have considered it, either; it was unheard of. Yet it happened and, in retrospect, it’s not difficult to see where mistakes were made. But hindsight gives us 20/20 vision. As above, naturalists are able to explain the existence of black swans – once they knew they existed.
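The control-chart idea above can be sketched in a few lines: estimate the process centre and 3-sigma limits from a stable baseline, then flag any later point outside those limits as candidate special cause variation. This is only an illustrative sketch – the function names and data are made up for this post, and a textbook individuals chart would usually estimate sigma from the moving range rather than the plain standard deviation used here for brevity.

```python
# Illustrative Shewhart-style control chart sketch (names and data are
# hypothetical, not from the post). Sigma is estimated with the plain
# sample standard deviation for brevity.

def control_limits(baseline):
    """Return (lower, upper) 3-sigma limits from a stable baseline sample."""
    n = len(baseline)
    mean = sum(baseline) / n
    var = sum((x - mean) ** 2 for x in baseline) / (n - 1)
    sigma = var ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def special_causes(data, lcl, ucl):
    """Indices of points outside the control limits (candidate special causes)."""
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0]  # stable history
lcl, ucl = control_limits(baseline)
new_data = [10.0, 9.9, 12.5, 10.1]  # 12.5 is an unplanned excursion
print(special_causes(new_data, lcl, ucl))  # flags index 2
```

The point, as in the post, is that a chart like this only catches the kinds of excursion the limits were built to detect; a true black swan sits outside the model entirely.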
Professor David Spiegelhalter (Winton Professor of the Public Understanding of Risk at Cambridge University) has said there’s no such thing as probability. He cites an example: toss a coin, take a peek at whether it’s heads or tails and ask somebody what the probability is of it being “heads”; to them it’s 0.5 (or 50% or 50:50, etc) but to you it’s either 0 or 1, because you know the answer. The coin has no “probability” as it’s either one or the other – “probability” rests in the mind of the individual and depends on the extent of his/her knowledge. I don’t think that’s a reason to cast aside probability theory and calculations (nor do I believe that was David Spiegelhalter’s intent) – rather, it’s a reminder that, no matter how meticulous the study and calculation, it’s never (or rarely) going to be the final word.
Expect the unexpected or, perhaps more realistically, expect the unplanned.
For further reading on this, take a look at “The Black Swan” by Nassim Taleb. Worth picking up to read some new ideas but I don’t think you’ll miss much if you don’t finish it; it starts off well but, in my view, the latter part of the book spends too much time dismissing those who don’t agree rather than reinforcing the arguments.