Nancy Leveson is Professor of Aeronautics and Astronautics at MIT and one of the world's leading researchers on system safety. I'm using some of the techniques she has developed for analyzing complex systems for safety. Her papers are often interesting, but the title of her latest paper blew my mind when I read it:
An Engineering Perspective on Avoiding Inadvertent Nuclear War.
I was born in 1969 and grew up during the Cold War. One of the dangers we feared was that a mistake would happen: a bomb would detonate over Russia, Europe, or the US, and uncontrolled retaliation would end the world. The movie Dr. Strangelove immortalized this scenario.
[youtube https://www.youtube.com/watch?v=98NaJ8ss4sY&w=560&h=315]
Take a deep breath if you watched the trailer above before reading on.
Leveson is not fearful. She wrote the paper for a workshop on systems and strategic stability, and she looks back at why this horror scenario didn't occur:
“The most successful complex systems in the past were simple and used rigorous, straightforward processes. Prevention of accidental detonation of nuclear bombs, for example, used a brilliant approach involving three positive measures […] and reliance on simple mechanical systems that could provide ultra-high assurance. Although there were a few incidents over a long period […] inadvertent detonation did not occur in those cases.”
The question she raises in her paper is whether we are still safe. Well, things are changing:
“The more recently introduced software-intensive systems have been much less reliable. […] More recently, our ability to provide highly trustworthy systems has been compromised by gratuitous complexity in their design and inadequate development and maintenance processes. For example, arguments are commonly made for using development approaches like X-treme Programming and Agile that eschew the specification of requirements before design begins.”
Yes, Leveson is a critic of the popular, modern development paradigms most of us have learned to love. Have we mistakenly stopped worrying?
I met her at a conference on STAMP/STPA in Iceland in 2017, and during a conversation I was fortunate to have with her in the lobby of the University of Iceland, she made her skepticism towards Agile very clear. But Agile is not the only problem:
“Providing high security is even more problematic. Again, only the most basic security techniques, such as providing an air gap to isolate critical systems, have been highly successful. The number of intrusions in today’s systems is appalling and unacceptable. Clearly what we are doing is not working.”
Leveson calls for a paradigm shift and outlines what it could look like. In the paper she discusses systems theory and how approaches like those she describes in her 2011 book Engineering a Safer World can be useful.
The article can be downloaded as a PDF from the MIT website of the Partnership for Systems Approaches to Safety and Security (PSASS).
I highly recommend that anyone interested in the safety of software systems read it and reflect on what Dr. Leveson has to say.
Dr. Leveson at the University of Iceland with me and an Icelandic researcher on volcanic safety.