[SystemSafety] The bomb again

John Downer johndowner2 at gmail.com
Mon Oct 7 19:54:29 CEST 2013


Hi Nancy,
Apologies for the delayed rejoinder on this. I took a time-out to change continents and I just got round to checking the list today.

To take your points in turn:

> Whether atomic weapons are a deterrent is a legitimate and important question to consider. But it is a different question from what is the risk of unintended detonation. The latter can be used in the political science discussion of deterrence, but I don't think deterrence figures in the basic analysis of the risk of accidental detonation in terms of the engineering techniques being used. That is all I am saying. 


I think if you look back at my email you'll find that this was essentially the point I was trying to make as well. My argument was poorly phrased (I was in an airport) but we're in agreement. I was trying to say that I understand this list isn't a political science discussion list, but it should be possible to have probabilistic discussions of the bomb that don't stray into that territory.

> There are engineering protections against a disgruntled military technician setting it off on purpose -- this is not something that engineers ignore. Again, we need to consider the actual engineered design to determine whether this is possible (outside of a Hollywood movie) or whether the protection is adequate. Otherwise, we are engaged in uninformed speculation. 

I get that there are protections. The 'disgruntled technician' was simply a way to make the point that discussions of safety need to be inclusive in how they define "the actual design of the protection." (Perhaps I was wrong, but as I had understood it, you were speaking of the physical protections themselves, rather than the social systems in which they were embedded.)

I should add, though, that nuclear weapons are fundamentally designed to explode, and the problem of how to make sure they explode only when the 'correct' people detonate them for the 'correct' reasons is never going to be completely solvable, if only because "correct" in these instances is always a contingent term. (And if we were sure that nobody could explode bombs unless they were supposed to, then we wouldn't be worried about losing them!)
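
(As an aside: the "unique signal" discriminator Nancy describes further down the thread is a nice illustration of what engineering can and can't solve here. A toy calculation -- with a signal length I've simply invented, since the real figure isn't something I know -- shows why random noise is essentially no threat, while saying nothing at all about an insider who knows the signal:)

    # Toy sketch only: SIGNAL_BITS is an invented figure, not the real design.
    SIGNAL_BITS = 48
    p_match = 0.5 ** SIGNAL_BITS          # one random burst matching, assuming independent bits
    attempts = 60 * 60 * 24 * 365 * 100   # one spurious candidate signal per second, for a century
    p_ever = 1 - (1 - p_match) ** attempts
    print(f"P(single random match)  ~= {p_match:.2e}")   # ~3.6e-15
    print(f"P(any match, 100 years) ~= {p_ever:.2e}")    # ~1.1e-05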

As for "Hollywood movie" scenarios -- I think this gets to the heart of my point. I totally agree that in most engineering contexts an argument that hinged on such scenarios would be worthy of all the derision in the world. Things are different in the nuclear context, though. In this sphere failures have such catastrophic potential that they can't ever (or close to ever) be allowed. For the technology to be politically viable, in other words, the probability of failure has to be so low that it surpasses the probably of Hollywood-esque moments. Because history is full of scenes from cheap movies. Incredible events happen all the time. There's nothing we can do about this except ask ourselves if we want to put ourselves in a position where the wrong such moment could have unacceptable consequences.

> I would be interested in a discussion here about the adequacy of the engineering techniques being employed. They seem pretty strong to me, but I may be wrong. Surprisingly, they are pretty well known. Safety is not predicated on secrecy of the safety engineering approach. But pulling probabilistic numbers out of the ether is not useful. To be useful, such numbers must be based on the detailed design of the system. I suppose they could be based on historical evidence, but I find those much less compelling because we have limited experience. 

[re: The openness of the designs.] The bomb designs might be fairly well-known, but (as this list repeatedly demonstrates with respect to aviation tech) making judgements about systems involves a level of technical intimacy (with tests, theoretical ambiguities, etc.) that published designs alone don't provide. After all, there have been any number of failures in practice that were "impossible" on paper.

[re: The probabilistic numbers.] If I was pulling numbers out of my... sorry, the aether, then you'd have a point, but I'm not. I'm simply arguing that the non-probabilistic risk numbers (I'm not sure what to call them) are not to be trusted either. And, further, that there are logical ways to make technology policy where hazards are known to be enormous and the likelihood of those hazards is known to be highly uncertain.

[re: The historical evidence.] I agree, of course, that designs have changed over time, and that this limits the 'relevance' of past failures in some ways. At the same time, though, we always say this. Every time there is a nuclear accident or near-miss we say that the designs have changed and so it doesn't count. Perhaps the historical evidence to which we should be paying attention is the evidence that these claims have repeatedly been wrong.
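
If we did want to take the record seriously, even a crude smoothed-frequency estimate would be more honest than ruling each near-miss inadmissible after the fact. Purely as a sketch, with placeholder counts rather than a real accident census:

    near_misses = 3          # placeholder count of Goldsboro-class events
    observation_years = 60   # placeholder observation window
    # Laplace's rule of succession: a smoothed estimate that doesn't
    # reset to zero every time the design changes.
    p_per_year = (near_misses + 1) / (observation_years + 2)
    print(f"P(serious near-miss per year) ~= {p_per_year:.3f}")   # ~0.065

The number itself means nothing; what matters is that the estimate moves with the evidence instead of being discarded after every redesign.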

Thanks for the robust dialogue.

J.



On Oct 2, 2013, at 10:07 PM, Nancy Leveson <leveson.nancy8 at gmail.com> wrote:

> John,
> 
> Whether atomic weapons are a deterrent is a legitimate and important question to consider. But it is a different question from what is the risk of unintended detonation. The latter can be used in the political science discussion of deterrence, but I don't think deterrence figures in the basic analysis of the risk of accidental detonation in terms of the engineering techniques being used. That is all I am saying. 
> 
> There are engineering protections against a disgruntled military technician setting it off on purpose -- this is not something that engineers ignore. Again, we need to consider the actual engineered design to determine whether this is possible (outside of a Hollywood movie) or whether the protection is adequate. Otherwise, we are engaged in uninformed speculation. 
> 
> I would be interested in a discussion here about the adequacy of the engineering techniques being employed. They seem pretty strong to me, but I may be wrong. Surprisingly, they are pretty well known. Safety is not predicated on secrecy of the safety engineering approach. But pulling probabilistic numbers out of the ether is not useful. To be useful, such numbers must be based on the detailed design of the system. I suppose they could be based on historical evidence, but I find those much less compelling because we have limited experience. 
> 
> Nancy
> 
> 
> On Wed, Oct 2, 2013 at 4:41 PM, John Downer <johndowner2 at gmail.com> wrote:
> I don't have access to Nancy's book just now (I'm in an airport), although I will certainly take a look. 
> 
> In the broader sense though, it seems to me that if we restrict discussions of the atomic bomb to engineering specificities then we essentially give up the right to speak. I doubt that anybody on this list has access to the level of engineering detail about the current generation of atomic weapons to speak meaningfully about the intricacies of their designs (and I expect that anyone who did have access wouldn't be allowed to comment, except to give the kinds of reassurances that bomb authorities have given since the 1950s). 
> 
> More importantly, I think we miss something very significant if we only speak about design specificities. We live in a probabilistic world. If I remember correctly, the bomb described in the Guardian article had four redundant safety mechanisms, three of which failed and one of which almost failed. Any engineering analysis, I imagine, would have found that it was 'impossible' for it to detonate accidentally. But 'impossible' in this context always means something like "it would take an absolutely incredible confluence of failures for this thing to fail," which brings us back to probabilism. (Which is to say, I agree with Andrew's earlier observations about confidence.)
> 
> Also, to focus exclusively on design, I think, is to forget that these are socio-technical systems, the safety of which is necessarily subject to (notoriously capricious and unquantifiable) human actions at all sorts of levels. It would be a shame to build the 'perfectly safe' bomb, only to have some disgruntled military technician set it off on purpose. It is important that we consider our technologies in this broader light. 
> 
> I should point out that formerly top-secret DoD-sponsored studies have come to identical conclusions, both about the need to understand the bomb's risks probabilistically and about the way human concerns undermine any technical reassurances. (See, e.g., Schlosser 2013: 190-5).
> 
> To say that all non-design-based discussions of the bomb are simply expressions of political views is misleading. It is certainly not impossible to argue that there is a credible risk of an accident with the bomb, but that this risk is preferable to the risk of being nuked because we were unable to deter. People I respect believe this. I happen not to, but I agree that this list might not be the place for such discussions. I apologize if it seemed like I was pushing a 'cause'.
> 
> 
> 
> 
> On Oct 2, 2013, at 1:41 PM, Nancy Leveson <leveson.nancy8 at gmail.com> wrote:
> 
>> This discussion would be a lot more useful if, as engineers, we commented on the actual design of the protection against accidental detonation of atomic bombs and whether that design is or is not flawed. I tried to bring it up earlier -- it is described in my Safeware book, pages 428-431. As far as I can determine, there is no way that a crash of an aircraft can lead to the detonation of a nuclear bomb. In the two crashes we know about, there was no detonation. Note that the detonation mechanism is kept in an inoperable state and there must be multiple indications of intent to detonate as well as the random generation of a unique signal (which has purposely been defined to be of such information complexity that it will not be randomly generated in any credible environment). 
>> 
>> I certainly can be wrong and welcome *engineering* arguments about whether the protection scheme used is adequate, but not probabilistic statements that are not founded on the specific design of the device or are based on political views that have little to do with engineering. 
>> 
>> Nancy
>> 
>> 
>> On Wed, Oct 2, 2013 at 9:43 AM, Matthew Squair <mattsquair at gmail.com> wrote:
>> John, 
>> 
>> The current US requirement for nuclear weapons safety during a crash is a probability of one in a million of a premature nuclear detonation. I guess that doesn't really qualify as 'practically nonexistent'. 
>> 
>> That being said, the nuclear weapons safety community has spent an awful lot of time and money thinking about safety in the wake of such accidents as Goldsboro, see their 3I principles for example, and I believe there are broader architectural lessons that can be learned and transferred to other domains. 
>> 
>> See the references in my post for further details. 
>> 
>> http://criticaluncertainties.com/2010/03/21/lessons-from-nuclear-weapons-safety/
>> 
>> Regards, 
>> 
>> 
>> On Wednesday, 2 October 2013, John Downer wrote:
>> Further to earlier discussions on the safety of the bomb (and courtesy of my former colleague Anne Harrington):
>> 
>> From the Guardian: "US nearly detonated atomic bomb over North Carolina – secret document"
>> 
>> "A secret document, published in declassified form for the first time by the Guardian today, reveals that the US Air Force came dramatically close to detonating an atom bomb over North Carolina that would have been 260 times more powerful than the device that devastated Hiroshima.
>> 
>> The document, obtained by the investigative journalist Eric Schlosser under the Freedom of Information Act, gives the first conclusive evidence that the US was narrowly spared a disaster of monumental proportions when two Mark 39 hydrogen bombs were accidentally dropped over Goldsboro, North Carolina on 23 January 1961. The bombs fell to earth after a B-52 bomber broke up in mid-air, and one of the devices behaved precisely as a nuclear weapon was designed to behave in warfare: its parachute opened, its trigger mechanisms engaged, and only one low-voltage switch prevented untold carnage."
>> 
>> http://www.theguardian.com/world/2013/sep/20/usaf-atomic-bomb-north-carolina-1961
>> 
>> 
>> For context, here's the official government assessment from 1960: "Stay Safe, Stay Strong: The Facts about Nuclear Weapons": http://archive.org/details/StaySafe1960
>> 
>> My favorite bit is at minute 20:00:
>> 
>> So how safe is a nuclear bomber coming in for a crash landing?
>> "...the possibility of an accidental nuclear explosion is so small as to be practically nonexistent...you and your family may live in peace, free from the fear of nuclear accidents"
>> 
>> 
>> 
>> 
>> ---------
>> Dr. John Downer
>> SPAIS; University of Bristol. 
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> -- 
>> Matthew Squair
>> MIEAust CPEng
>> 
>> Mob: +61 488770655
>> Email: MattSquair at gmail.com
>> Website: www.criticaluncertainties.com
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> -- 
>> Prof. Nancy Leveson
>> Aeronautics and Astronautics and Engineering Systems
>> MIT, Room 33-334
>> 77 Massachusetts Ave.
>> Cambridge, MA 02142
>> 
>> Telephone: 617-258-0505
>> Email: leveson at mit.edu
>> URL: http://sunnyday.mit.edu
> 
> 
> 
> 
> -- 
> Prof. Nancy Leveson
> Aeronautics and Astronautics and Engineering Systems
> MIT, Room 33-334
> 77 Massachusetts Ave.
> Cambridge, MA 02142
> 
> Telephone: 617-258-0505
> Email: leveson at mit.edu
> URL: http://sunnyday.mit.edu
