[SystemSafety] Safety Culture redux

Les Chambers les at chambers.com.au
Sun Feb 25 02:00:04 CET 2018


Chris
A quick thought on responsibility, for culture that is. I draw an analogy with the military.
They take the word 'command' very seriously. A commander sends soldiers into battle where
they can be killed. Commanders are typically surrounded by staff officers who give advice but
have no direct responsibility.
If you command software development, you are directly in charge of the people who write code.
You are therefore responsible for what they think is important, and hence for the quality of
their work. The trainers and the academics are staff officer material.
If you want to change the culture in an organisation, you need to get the commanders in a room
and convince them.

Les

> Chris,
> To make you more aware, it hasn't been just "small like-minded groups". I
> teach a number of software engineering classes for Construx
> (www.construx.com). I usually teach about 30-40 classes each year, and
> classes average about 20 attendees. So that's essentially a minimum of
> around 600 people per year. While I don't publish a monthly blog, I do
> publish the occasional magazine article. Again, I avoid using the term
> "bug" wherever possible and only use "defect". See, for example,
> "Insanity, Hiring, and the Software Industry" in the November 2015 issue
> of IEEE Computer.
> 
> -- steve
> 
> -----Original Message-----
> From: Chris Hills <safetyyork at phaedsys.com>
> Organization: Phaedrus Systems
> Reply-To: "safetyyork at phaedsys.com" <safetyyork at phaedsys.com>
> Date: Friday, February 23, 2018 at 3:02 AM
> To: Steve Tockey <Steve.Tockey at construx.com>
> Cc: 'Peter Bernard Ladkin' <ladkin at causalis.com>,
> "systemsafety at lists.techfak.uni-bielefeld.de"
> <systemsafety at lists.techfak.uni-bielefeld.de>
> Subject: RE: [SystemSafety] Safety Culture redux
> 
> Hi Steve
> 
> Individuals in isolation quietly mentioning it to small like-minded groups
> won't help. We need movement in the mainstream, with some momentum.
> 
> I have managed to get (so far) about a dozen well-known people in the
> industry who write blogs, newsletters and magazine articles to commit to
> raising "error not bug" every month or so through 2018. If more of you
> join in and we get enough people publishing (blogs, newsletters, articles,
> papers, conference presentations, etc.) we might actually start to change
> the culture. After that we can go for World Peace (and after that we can
> attempt the reform of US gun laws :-) ).
> 
> Chris
> 
> -----Original Message-----
> From: Steve Tockey [mailto:Steve.Tockey at construx.com]
> 
> For what it's worth, I have been using "defect" instead of "bug" for at
> least 20 years. I'm not sure how much of an impact it really had, but
> developers I talk to do seem to understand the problem with using the term
> "bug".
> 
> On the other hand, a consultant buddy of mine likes to call them
> "Developer malpractice".
> 
> Cheers,
> 
> -- steve
> 
> Sent from my iPad
> 
> > On Feb 23, 2018, at 1:19 AM, Chris Hills <safetyyork at phaedsys.com> wrote:
> > 
> > The point is, as PBL says, "that the meme associated with "error" contains
> > a deprecatory social value-judgement." It is a culture we need to change:
> > people's mind-set, largely amongst the average "programmer" rather than
> > the safety engineers or critical systems developers. We need to replace
> > "bug" with "error" at every opportunity if we are to make this cultural
> > change before some of us are killed by a software bug in our retirement
> > homes.
> > 
> > This change will probably take a decade, and I doubt we will see any
> > material changes for a year or two, so the sooner we start the better.
> > 
> > -----Original Message-----
> > From: systemsafety
> > [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of
> > Peter Bernard Ladkin
> > Sent: Friday, February 23, 2018 5:00 AM
> > To: systemsafety at lists.techfak.uni-bielefeld.de
> > Subject: Re: [SystemSafety] Safety Culture redux
> > 
> > It is a little odd to see Les arguing for the relative pointlessness of
> > words and dictionaries while suggesting at the same time that code review
> > is a most effective engineering procedure.
> > 
> > Code, in the sense in which we speak of it in "code review", is a series
> > of assertions in a formal language. A sort of non-fiction book (of
> > instructions or declarations, whichever is your style). When we review
> > that book, we interpret its statements according to what we think is
> > their meaning. Dictionaries are devices which say what individual words
> > mean. The only reason code review can be successful at all is because of
> > that binding of word and phrase to meaning.
> > 
> > Actually, fixing the meanings of individual words and phrases in this
> > formal language, binding words and phrases to short, clear meanings in an
> > exceptionless way, turns out to be one of the most effective methods in
> > the engineering of reliable programs, as shown originally by Algol 60 and
> > Pascal, as well as by the language REFINE, now sadly defunct, in which I
> > implemented my thesis work, an algebraic structure for implementing
> > real-calendrical-time period calculations, and more recently by the
> > decades of experience with SPARK. And, conversely, not fixing them is
> > known to be a source of considerable vulnerability: witness, at the
> > beginning of the Internet era and the establishment of US CERT in the
> > 1990s, the 80%-90% of security vulnerabilities which could have been
> > simply ruled out by using technology that had already existed for thirty
> > years, namely making your data types behave according to the way you
> > thought about them (aka strong typing).
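
[Aside: a minimal sketch of the strong-typing point above, in Rust rather than
the Pascal/SPARK family PBL cites. The names Metres, Feet and within_ceiling
are invented for illustration; the point is only that giving distinct concepts
distinct types turns a unit confusion into a compile-time error instead of a
latent defect.]

    // Distinct types for distinct concepts: mixing them up no longer
    // type-checks, so the confusion cannot survive into shipped code.
    struct Metres(f64);
    struct Feet(f64);

    // Accepts only Metres; passing a Feet value is rejected by the compiler.
    fn within_ceiling(ceiling: &Metres, current: &Metres) -> bool {
        current.0 <= ceiling.0
    }

    fn main() {
        let ceiling = Metres(10_000.0);
        let reported = Feet(26_000.0);
        // within_ceiling(&ceiling, &reported); // compile error:
        //                                      // expected `&Metres`, found `&Feet`
        let converted = Metres(reported.0 * 0.3048); // the conversion is explicit and visible
        println!("within ceiling: {}", within_ceiling(&ceiling, &converted));
    }
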
> > 
> > One may speak of words and dictionaries, but it is probably more
> > efficacious to speak of concepts and how they hang together.
> > 
> > Solving a problem, ameliorating an issue, inevitably involves
> > conceptualising it in such a way that a solution can be seen to be one.
> > And if it can be seen to be one, but doesn't turn out to be one, it
> > likely means that you are missing part of the issue, that your
> > conceptualisation turned out to be inadequate. If you don't like the word
> > "conceptualise" here, please replace it by the word "understand", and I
> > think you will see that this is almost a banal statement. So, whatever
> > you might prefer to call it, conceptual analysis, otherwise known as
> > "understanding the problem", is a necessary part of solving many
> > problems. And the best tool for conceptual analysis is generally a set of
> > clean and clear concepts, rather than obscure and exception-laden
> > concepts (do I need to argue this?).
> > 
> > Anyway, the original issue raised by Chris is more about memes rather
> > than just about words. Chris pointed out that the meme associated with
> > "error" contains a deprecatory social value-judgement. Software people
> > say all software contains bugs. And nothing follows from that, for most
> > people. If software people said instead that all software contains
> > errors, then there is a plethora of regulations and even laws saying who
> > is responsible for damage arising from design errors in commercial
> > products, and it is at least possible that someone might start trying to
> > apply them.
> > 
> > PBL
> > 
> > Prof. Peter Bernard Ladkin, Bielefeld, Germany
> > MoreInCommon
> > Je suis Charlie
> > Tel+msg +49 (0)521 880 7319  www.rvs-bi.de
> > 



--
Les Chambers
les at chambers.com.au
+61 (0)412 648 992



