[SystemSafety] Safety Culture redux

Chris Hills safetyyork at phaedsys.com
Fri Feb 23 10:19:12 CET 2018


The point is, as PBL says, "that the meme associated with 'error' contains a deprecatory social value-judgement." It is a culture we need to change: people's mind-set, largely amongst the average "programmer" rather than the safety engineers or critical-systems developers. We need to replace "bug" with "error" at every opportunity if we are to make this cultural change before some of us are killed by a software bug in our retirement homes.

This change will probably take a decade, and I doubt we will see any material change for a year or two, so the sooner we start the better.


-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Peter Bernard Ladkin
Sent: Friday, February 23, 2018 5:00 AM
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Safety Culture redux

It is a little odd to see Les arguing for the relative pointlessness of words and dictionaries while suggesting at the same time that code review is a most effective engineering procedure.

Code, in the sense in which we speak of it in "code review", is a series of assertions in a formal language. A sort of non-fiction book (of instructions or declarations, whichever is your style).
When we review that book, we interpret its statements according to what we think is their meaning.
Dictionaries are devices which say what individual words mean. The only reason code review can be successful at all is because of that binding of word and phrase to meaning.

Actually, fixing the meanings of individual words and phrases in this formal language, binding them to short, clear meanings in an exceptionless way, turns out to be one of the most effective methods in the engineering of reliable programs. This was shown originally by Algol 60 and Pascal, as well as by the language REFINE, now sadly defunct, in which I implemented my thesis work, an algebraic structure for real-calendrical-time period calculations, and more recently by the decades of experience with SPARK. Conversely, not fixing them is known to be a source of considerable vulnerability: witness, at the beginning of the Internet era and the establishment of US CERT in the 1990s, the 80%-90% of security vulnerabilities which could have been ruled out simply by using technology that had already existed for thirty years, namely making your data types behave according to the way you thought about them (aka strong typing).
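
To make the strong-typing point concrete, here is a minimal sketch of my own (not from PBL's post, and in Rust rather than Pascal or SPARK); the types Metres, Seconds and MetresPerSecond are invented purely for illustration. The idea is the one described above: bind each name to one exceptionless meaning, and a whole class of errors becomes impossible to express.

    // Each quantity gets its own type, so the compiler enforces the
    // meaning we attached to the word: a length cannot be supplied
    // where a duration is expected.
    #[derive(Debug)]
    struct Metres(f64);
    #[derive(Debug)]
    struct Seconds(f64);
    #[derive(Debug)]
    struct MetresPerSecond(f64);

    fn speed(distance: Metres, time: Seconds) -> MetresPerSecond {
        MetresPerSecond(distance.0 / time.0)
    }

    fn main() {
        let d = Metres(120.0);
        let t = Seconds(8.0);
        println!("{:?}", speed(d, t));    // compiles: the meanings line up
        // speed(t, d) would be rejected at compile time: a Seconds
        // is not a Metres, whatever the raw numbers happen to be.
    }

With untyped (or weakly typed) values, the swapped-argument call would run and silently produce a wrong answer; with the meanings fixed in the types, it cannot even be written.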

One may speak of words and dictionaries, but it is probably more efficacious to speak of concepts and how they hang together.

Solving a problem, ameliorating an issue, inevitably involves conceptualising it in such a way that a solution can be seen to be one. And if it can be seen to be one, but doesn't turn out to be one, it likely means that you are missing part of the issue, that your conceptualisation turned out to be inadequate. If you don't like the word "conceptualise" here, please replace it by the word "understand", and I think you will see that this is almost a banal statement. So, whatever you might prefer to call it, conceptual analysis, otherwise known as "understanding the problem", is a necessary part of solving many problems. And the best tool for conceptual analysis is generally a set of clean and clear concepts, rather than obscure and exception-laden concepts (do I need to argue this?).

Anyway, the original issue raised by Chris is more about memes than just about words. Chris pointed out that the meme associated with "error" contains a deprecatory social value-judgement.
Software people say all software contains bugs. And nothing follows from that, for most people. If software people said instead that all software contains errors, then there is a plethora of regulations and even laws saying who is responsible for damage arising from design errors in commercial products, and it is at least possible that someone might start trying to apply them.

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de
