[SystemSafety] USAF Nuclear Accidents prior to 1967

Nancy Leveson leveson.nancy8 at gmail.com
Sat Sep 21 22:26:24 CEST 2013


One other thought I forgot. The fact that there was one near miss (and note
that it was a miss) with nuclear weapons safety in the past 60+ years is an
astounding achievement. Instead of using this to stir up fear in people, it
should be studied to understand why it has been so successful. I describe the way
accidental detonation is prevented on pages 428-431 in my book *Safeware*.
Some of the reasons: a very simple design, no use of computers, and keeping
computers out over the years (despite repeated arguments that they should be
introduced). I'm not arguing against the use of computers in general, but
introducing computers to do the very same thing that the simple
electromechanical detonation prevention system does would introduce
complexity that could no longer be validated or controlled. As Brooks
noted, there is a difference between essential complexity and accidental
complexity, the latter being introduced by the design rather than inherent
in the problem being solved.
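
To make the contrast concrete, here is a minimal sketch of the essential
function such an interlock chain performs (illustrative only, written in
Python for brevity; it is of course not the real arming logic). A pure
conjunction of barriers is small enough to validate exhaustively:

    # Toy model of a sequential interlock chain -- purely illustrative,
    # not the actual weapon design.
    from itertools import product

    NUM_INTERLOCKS = 6

    def armed(interlocks):
        # The weapon can arm only if every barrier in the chain has tripped.
        return all(interlocks)

    # Exhaustive validation is feasible precisely because the design is
    # simple: all 2**6 = 64 possible states can be enumerated and checked.
    arming_states = [s for s in product([False, True], repeat=NUM_INTERLOCKS)
                     if armed(s)]
    assert arming_states == [(True,) * NUM_INTERLOCKS]  # exactly one of 64

Exhaustive enumeration of this kind is exactly what stops being possible
once a software implementation adds its own states and timing.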

Nancy


On Sat, Sep 21, 2013 at 2:10 PM, Nancy Leveson <leveson.nancy8 at gmail.com> wrote:

> I'm not really sure why people are using an incident that happened 52
> years ago when engineering was very different in order to make points about
> engineered systems today. It's the same problem I have with people
> continuing to talk about my paper on the Therac-25 accidents and ignore the
> hundreds of radiation therapy accidents that have occurred in the
> intervening 35 years. The engineering techniques (both hardware and
> software) have changed dramatically in the past 60 years.
>
> But the NAT/HRO controversy continues. I wrote a paper about this. You can
> find it at http://sunnyday.mit.edu/STAMP-publications.html (and then
> search for "Moving Beyond Normal Accidents and High Reliability
> Organizations").  The argument in our paper is essentially that both
> theories are incorrect and result from an oversimplification of engineering
> (the proponents are all sociologists and, in their papers, seem unfamiliar
> with the engineering literature and with basic engineering concepts).
> Perrow, for example, has a narrow view of engineering design for safety as
> involving only redundancy, and the HRO community does not even bother to
> consider engineering design.
>
> Nancy
>
>
>
> On Sat, Sep 21, 2013 at 1:36 PM, Peter Bernard Ladkin <
> ladkin at rvs.uni-bielefeld.de> wrote:
>
>> The Guardian today has an article on an accident to a US B-52 bomber in
>> North Carolina in 1961. The aircraft, suffering a mid-air break-up,
>> released two nuclear weapons, which were armed. One of the bombs was,
>> according to a book by Ralph Lapp, "equipped with six interlocking safety
>> mechanisms, all of which had to be triggered in sequence to explode the
>> bomb. ...Air Force experts....found that five of the six interlocks had
>> been set off by the fall! Only a single switch prevented the 24 megaton
>> bomb from detonating..."
>>
>> This quote is contained in a short memo by Parker F Jones, an analyst at
>> Sandia Labs, written in October 1969. He deprecates Lapp's general account
>> but says that on this point he is correct; emphasises the vulnerability
>> embodied by the switch, its type and function (it does not appear to have
>> been adequately assessed for reliability in an accident scenario) and
>> concludes that this type of bomb "did not provide adequate safety for the
>> airborne alert role in the B-52", and footnotes that the "same conclusion
>> should be drawn about present-day SAC bombs".
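>>
>> As a back-of-the-envelope sketch of why that matters (all numbers below
>> are assumed for illustration; none come from the Jones memo): once the
>> other barriers have tripped, the chance of inadvertent detonation is
>> linear in the failure probability of the one remaining switch.
>>
>>     # Toy Python calculation; every probability is an assumed value.
>>     p_trip = 0.5             # assumed chance one interlock trips in a crash
>>     p_switch_fails = 1e-3    # assumed chance the final switch fails closed
>>
>>     # With five of the six barriers gone (independence assumed), safety
>>     # hangs on a single component:
>>     p_detonation = (p_trip ** 5) * p_switch_fails
>>     print(p_detonation)      # 3.125e-05 per crash, linear in p_switch_fails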
>>
>> This is all contained in an article in The Guardian at
>> http://www.theguardian.com/world/2013/sep/20/usaf-atomic-bomb-north-carolina-1961
>> Jones's memo is presented at
>> http://www.theguardian.com/world/interactive/2013/sep/20/goldsboro-revisited-declassified-document
>>
>> This is due to Eric Schlosser, who is about to publish a book called
>> Command and Control. Schlosser has visited facilities, and so on, and gave
>> an interview to The Guardian at
>> http://www.theguardian.com/books/2013/sep/21/eric-schlosser-books-interview
>> Apparently, he made an FOIA request for all the incidents in the 10 years
>> to 1967, and received 245 pages of them.
>>
>> Scott Sagan made similar inquiries in his 1993 book The Limits of Safety,
>> for which he is justly famous. I didn't find the incident in Scott's book,
>> so I asked him if he knew about it. Scott's thesis in that book was testing
>> Charles Perrow's Normal Accidents theory against the
>> high-reliability-organisation theory of La Porte and colleagues.
>>
>> The NA hypothesis is that tightly-coupled, interactively-complex systems
>> are unavoidably vulnerable to accidents which occur while everything is
>> operating "as designed". The HRO theory says that complex organisations
>> which have proven highly reliable share certain characteristics. One example
>> of such an organisation is USN peacetime carrier
>> operations (launching and retrieval of aircraft); another is Pacific Gas
>> and Electric's nuclear power plant operations (which was a bit of a
>> surprise to those of us who lived through part of the Diablo Canyon
>> controversy).
>>
>> USAF has obviously not had an accident in which a nuclear weapon has been
>> accidentally detonated. The question therefore was whether USAF SAC exhibited
>> the characteristics of a La Porte HRO. Sagan argued that such accidents had
>> been avoided through happenstance, and that the history rather supported
>> the NA theory. It seems from the advance commentary that Schlosser's book
>> will make a similar case.
>>
>> PBL
>>
>> --
>> Prof. Peter Bernard Ladkin, Faculty of Technology, University of
>> Bielefeld, 33594 Bielefeld, Germany
>> Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de
>>
>>
>>
>>
>> _______________________________________________
>> The System Safety Mailing List
>> systemsafety at TechFak.Uni-Bielefeld.DE
>>
>
>
>
> --
> Prof. Nancy Leveson
> Aeronautics and Astronautics and Engineering Systems
> MIT, Room 33-334
> 77 Massachusetts Ave.
> Cambridge, MA 02142
>
> Telephone: 617-258-0505
> Email: leveson at mit.edu
> URL: http://sunnyday.mit.edu
>



-- 
Prof. Nancy Leveson
Aeronautics and Astronautics and Engineering Systems
MIT, Room 33-334
77 Massachusetts Ave.
Cambridge, MA 02142

Telephone: 617-258-0505
Email: leveson at mit.edu
URL: http://sunnyday.mit.edu