[SystemSafety] HROs and NAT (was USAF Nuclear Accidents prior to 1967)

Matthew Squair mattsquair at gmail.com
Tue Sep 24 00:10:54 CEST 2013


Slightly tangentially, there's actually a theoretical underpinning for
Perrow's NAT provided by the theory of Highly Optimised Tolerance (HOT).
HOT was developed to explain the 'robust yet fragile' nature of complex
systems that are designed to operate in uncertain environments and are
highly coupled with those environments.

Given that you're trying to optimise performance under global resource
constraints (like cost) while trading off against risk, you end up with
systems that are 'safe most of the time' but in which, every so often, a
catastrophic event occurs due to an unanticipated combination of failures
or environmental factors: robustness and fragility are inherently
intertwined in such systems. The theory's proponents also predict that,
as a result, you'll see power law failure distributions in HOT systems.

I like HOT as a theory because it makes testable empirical predictions
(power law behaviour), and because you can develop system simulations from
the math and compare them to the real world.
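
If anyone wants to play with the idea, below is a minimal sketch of a
probability-loss-resource (PLR) style calculation of the kind Carlson and
Doyle use to illustrate HOT. To be clear, the event probabilities, the
loss function l(r) = r**-beta and all the parameter values are assumptions
chosen purely to show the mechanism, not a model of any real system.

# Minimal sketch of a HOT probability-loss-resource (PLR) toy model
# (after Carlson & Doyle). All distributions and parameters here are
# illustrative assumptions, not calibrated to any real system.
import numpy as np

rng = np.random.default_rng(0)

n_events = 10_000          # distinct event categories
beta = 1.0                 # loss-vs-resource exponent in l(r) = r**-beta
total_resource = 1.0       # global resource budget (e.g. cost)

# Event probabilities: many common events, a few rare ones.
p = rng.exponential(scale=1.0, size=n_events)
p /= p.sum()

# Allocation minimising expected loss sum(p_i * r_i**-beta) subject to
# sum(r_i) = total_resource; the Lagrange conditions give
# r_i proportional to p_i**(1/(beta + 1)).
r = p ** (1.0 / (beta + 1.0))
r *= total_resource / r.sum()

# Loss if event i occurs: rare events get few resources, so their
# losses are large -- 'robust yet fragile' in one line.
loss = r ** (-beta)

# The optimisation induces a power-law probability-loss relation,
# p_i ~ loss_i**(-(beta + 1)/beta). Check the exponent by regression.
slope, _ = np.polyfit(np.log(loss), np.log(p), 1)
print(f"fitted exponent {slope:.2f} (theory: {-(beta + 1.0) / beta:.2f})")

With beta = 1 the fitted slope should come out at the theoretical -2: the
heavy tail of large losses falls directly out of doing the resource
optimisation 'correctly', which is the robust-yet-fragile point.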

Of course it also means that high-consequence accidents will inevitably
occur much more often than we'd like to think, and usually for
unanticipated reasons, which brings us back to Perrow's original thesis.



On Tuesday, 24 September 2013, John Downer wrote:

>
> On Sep 23, 2013, at 5:50 AM, Andrew Rae <andrew.rae at york.ac.uk> wrote:
>
> The strong interpretation is that there is empirical support for naming a
> particular set of characteristics as the most important, and that this
> support comes from identifying particular organisations as safety
> over-achievers. You can't support this strong interpretation via the weak
> interpretation. The weak interpretation is a _fallback_ position that
> requires abandoning the strong interpretation. What's left is not HRO.  It
> is exactly the same space that Normal Accidents, Disaster Incubation
> Theory, HROs, Vulnerable System Syndrome, and (tangentially) STAMP, have
> been trying to fill. We know that organisation structure and attitude
> matters, but we don't have a successful model for how it matters. (I'm
> deliberately avoiding a definition of "successful" here. Choose one from
> reliable/repeatable, makes accurate predictions, is practically useful for
> safety management).   I put STAMP tangentially into that list because it is
> oriented more towards "practically useful" than "has explanatory power".
> Each model deserves to be evaluated against its own claims.
>
>
> I'm not sure I buy this completely. Nancy already pointed to STAMP's
> distinctiveness. NAT focuses on accidents rather than the lack of them, as
> did Turner. Reason is a social psychologist first and foremost. HR(O/T) is
> its own thing.
>
> Show me an organizational sociologist who interrogates seemingly
> successful socio-technical systems and I'll show you an HRO person.
>
> Whether or not they accurately identify 'successful' systems or the
> organizational principles that contribute to those successes comes down to
> them as scholars, I suppose. There are plenty of differences on these
> issues within the HRT literature. It's true that most of the people we
> would immediately identify as HRO scholars broadly agree on some things
> (the value of redundancy, for instance), but I don't think that one *has* to agree with those things in order to call oneself an HRO scholar.
>
>
>

-- 
*Matthew Squair*
MIEAust CPEng

Mob: +61 488770655
Email: MattSquair at gmail.com
Website: www.criticaluncertainties.com <http://criticaluncertainties.com/>

