[SystemSafety] Safety Culture

Fredrik Asplund fasplund at kth.se
Sun Dec 10 20:41:24 CET 2017


Sorry, ... that should be "good about the concept"?
Sincerely,
/ Fredrik

-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Fredrik Asplund
Sent: 10 December 2017 19:34
To: Peter Bernard Ladkin; systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Safety Culture

>Nuclear-power people are justifiably proud of what they call their pervasive "safety culture".
>I have been discussing with some experts how similar processes could be introduced around cybersecurity.

So, leaving aside the idea that safety culture is all you need: what is it that you consider good? (You write "justifiably proud", and you are discussing how similar processes could be introduced - presumably because you find something valuable here.)

Sincerely,
/ Fredrik

-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Peter Bernard Ladkin
Sent: 10 December 2017 13:42
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Safety Culture

On 2017-12-10 12:46 , paul_e.bennett at topmail.co.uk wrote:
> On 10/12/2017 at 10:16 AM, "Peter Bernard Ladkin" <ladkin at causalis.com> wrote:
>>
>> So, a couple of short questions.

With your answers.

>>
>> 1. How do you tell if a company has a "safety culture"?

Your answer: nobody dies on the job.

>> 2. How do you tell that this "safety culture" is causally effective 
>> in the company fulfilling attribute X?

I don't read an answer.

>> 3. What are those attributes X of companies that correlate with 
>> safety?

You suggest:

a. company-wide training in behavioural safety
b. no-blame reporting culture for safety incidents
c. general reporting on "stress" and help offered
d. specific safety training for specific tasks

>> 4. How do you know that your answer to 3 is right?

I don't read an answer.

I might mention that Weick, LaPorte and Co have a similar answer to question 1 and a very different set of answers to question 3. They also have an answer to question 2, which is that critical tasks are performed reliably.

And their answer to question 3 is repudiated by the "normal accident" theorists, who say that if you have a complex-interactive, tightly-coupled system then you can expect accidents, no matter what you have in the way of a company culture.

BTW, I agree with Andy's observation that this response is elfin-safety oriented (what North Americans know as OSHA). There are some industries in which elfin-safety aligns with overall safety. Nuclear power plants and airlines are two of them because, in those industries, generally speaking, if you protect your employees you protect all stakeholders.

What I (hope I) am getting at with my questions is that the "culture" stuff is vague. If there is a hazard identified in system operation, then I can redesign to avoid it or I can mitigate it, and then that hazard arises less and/or its severity is reduced. That is a concrete engineering process which has seen, for example, the mechanical dependability of commercial aircraft increase by leaps and bounds over the last half-century.
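To put invented numbers on that (these are illustrative, not from any real assessment): if risk is taken as frequency times severity, a hazard occurring at 10^-4 per operating hour with a severity cost of 100 units represents a risk of 10^-2 units per hour; a mitigation that halves the frequency and halves the severity cost brings that down to 2.5 x 10^-3 units per hour. Both factors are separately measurable and separately improvable.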

Whereas the same kind of nebulous maybe-causality exists between a notional company "safety culture" and appropriately-safe systems/system operation as exists between software development processes and the dependability of the software produced.

There are people (mostly management theorists, but also some engineers, as here) who wish to claim that "safety culture" is the be-all and end-all of safety (which I consider an unacceptable exaggeration). But I don't know how to go into a company and tell, in a yes/no fashion, whether there is a "safety culture" and, most importantly, I don't know what I can justifiably conclude about their operational/product/elfin safety if there is (other than that it is likely better than it would be if there weren't). Similarly, I can go into a company and look at their software development process, and yet draw very few conclusions on that basis about the objective reliability of the software they produce (other than that it is probably better than it would be otherwise).

Nuclear-power people are justifiably proud of what they call their pervasive "safety culture". I have been discussing with some experts how similar processes could be introduced around cybersecurity. One unsurprising suggestion is that it must have representation and responsibility at board level (as I mentioned earlier; in Anglo-Saxon businesses, of course). But what does this do?

Consider. With safety, you want to avoid accidents. There aren't many of those, so one obvious thing a board member can do is read and review incident reports and their analyses and keep track of how the company is learning lessons from those. (It is getting easier by the year to do that in commercial aviation, for example.)

But with cybersecurity, if you are a moderate-size company then you likely have tens to thousands of incidents a day, and it rapidly becomes physically impossible, board member or no, for any one person to keep complete track. Things have to be summarised for that board member, sometimes drastically, and summaries by their nature leave stuff out; what you leave out today may do you in tomorrow. How to summarise? Nobody has much idea, and, even if they did, there is little objective evidence to say that it is right. Everyone on the board might think cybersec is critical, but that seems of little use unless someone can say what information they need and what they are to do with it, and give it to them in a cognitively comprehensible manner. So "board-level responsibility" is more a mantra than a concrete guide to instilling a "cybersecurity culture" (whatever that may be).
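To make the summarisation problem concrete, here is a minimal, purely illustrative sketch in Python. The record format, the categories and the severity weights are all invented for the example, not anyone's actual incident-handling practice. It collapses a day's incident records into the kind of top-N weighted list a board member might actually read; everything outside the top N simply vanishes from view.

# Toy "board summary" of security incidents -- illustrative only.
# The (category, severity) record format, the categories and the
# severity weights below are invented assumptions.
from collections import Counter

SEVERITY_WEIGHT = {"low": 1, "medium": 5, "high": 25}   # invented scale

def board_summary(incidents, top_n=5):
    # Collapse a day's incidents into a top-N list by weighted count.
    # Everything below the cut-off is silently discarded -- which is
    # exactly the "what you leave out may do you in" problem.
    weighted = Counter()
    for category, severity in incidents:
        weighted[category] += SEVERITY_WEIGHT.get(severity, 1)
    return weighted.most_common(top_n)

# Three records standing in for the day's thousands:
day = [("phishing", "medium"), ("port-scan", "low"), ("malware", "high")]
for category, score in board_summary(day):
    print(category, score)

Note that every design decision in even this toy - the weights, the categories, the cut-off - is precisely the unanswered "how to summarise?" question.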

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany MoreInCommon Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de




