[SystemSafety] Fwd: Re: Safety Culture

Fredrik Asplund fasplund at kth.se
Tue Dec 12 11:23:27 CET 2017


Ok. So we disagree on that then (although I agree a Counterfactual Test is *useful* in many cases).

I am sorry, but I still don't understand what your position is. Yes, "Safety Culture" is unacceptably vague. I think one of the papers I mentioned made this point many years ago too, and tried in its own way to define the concept more clearly.

So, does that mean that more effort is needed to get acceptable definitions and good guidance in place in more domains, perhaps even generalizing across them where possible? Or do you think we shouldn't pursue it? Is it futile in all respects? (To be more specific, I am thinking more about engineering than about operation.)

Sincerely,
/ Fredrik

-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Peter Bernard Ladkin
Sent: den 12 december 2017 05:42
To: The System Safety List
Subject: [SystemSafety] Fwd: Re: Safety Culture


On 2017-12-11 11:43 , Fredrik Asplund wrote:
> A) I fully agree that an experiment is the only way to conclusively argue for causality, if you are trying to get at new knowledge.

I don't know who you are agreeing with; certainly not me. If the above were the case, nobody would ever be able to determine causality in an accident. (That is why the Counterfactual Test turns out to be so useful: it works in individual cases and does not need experiment.)

The problem with assessing "safety culture" is that it is vague. Take my observation that two industries with admired "safety culture" have dedicated UN agencies. Is it necessary for a successful safety culture to have a UN agency dedicated to your industry? How would we ever test that? Besides, UN agencies are by no means all as apparently effective as ICAO and IAEA, so wouldn't it depend upon the quality of the agency, whatever that might be?

You have to go more deeply into the modes of operation. "IAEA does this", "ICAO does that", and "this" and "that" are seen to be effective. So, say you conclude that "this" is an effective measure for enhancing overall safety in the industry in question. You try to introduce "this" into some other industry, and it flounders because operatives in that industry behave differently. Do we conclude the new industry doesn't have a "safety culture"? Or do we conclude that the culture is simply different and so "this" is ineffective?

There turns out to be very little guidance; effectiveness lies all in the detail.

Take the suggestion I considered, in which for "X" to be pervasive in a company, it is necessary that "X" have representation and responsibility at Board level. You'll find this a common mantra - in Anglo-Saxon-structured companies, as I pointed out. But actually what is expected is much more detailed. It is expected that accurate information about X within the company will be effectively transmitted to Board Member M, whose responsibility it is; that if Board Member M notes that some procedures or phenomena need improving, he or she will take action to get those improvements implemented; and that the action to implement the improvements will not flounder. There are a lot of human-behavioural and organisational assumptions in there which must be fulfilled in order for these measures to be effective. Sometimes they are present (for example, when "X" is "finance"); sometimes they are not. The point here is that "X has representation and responsibility on the Board" is just a vague summary for a whole lot of detailed internal company procedure which needs to be in place and effective.

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de