[SystemSafety] Modelling and coding guidelines: "Unambiguous Graphical Representation"

Les Chambers les at chambers.com.au
Wed Mar 16 00:58:12 CET 2016


Peter
The issue of making a safety case on a "proven in use" argument resonates
with me because I was once intimately involved in such a scenario.
BTW all my following comments relate to systems that integrate software. On
this particular project we had the advantage of time. At the point of making
the safety case we had roughly 200 control computers operating in the field
for a year with no failures or safety incidents. The accumulated operating
hours allowed us to use certain equations that looked fantastic on paper and
were probably the most compelling element of the case. In my engineering
black-and-white heart though, I know this was complete BS. I rationalised my
willing participation in this case with the knowledge that my client had
bent over backwards, delivering more than the contract required, to make the
system safe. At best the mathematics supported the qualitative idea that we
were good guys, we'd done a good job and so far there had been no failures
so, "... can we please have our last payment". And we richly deserved to be
paid because we HAD done a good job.
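
To give the flavour of such arithmetic (a standard textbook example, not
our exact calculation, and it leans on precisely the assumptions I
distrust: a constant failure rate, independent units and a demand profile
that stays representative): with zero failures in T accumulated unit-hours,
the one-sided upper confidence bound on the failure rate lambda at
confidence level 1 - alpha is

    lambda <= -ln(alpha) / T    (roughly 3/T at 95% confidence).

Roughly 200 units running for a year gives T of about 200 x 8760 = 1.75
million unit-hours, hence lambda <= about 1.7e-6 per hour. It looks
splendid on paper and says nothing at all about the next code or
configuration change.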

But your disclosure that:
'The IEC is about to publish a technical specification on the criteria to be
fulfilled for a component to be considered "proven in use".'
... is of grave concern to me if it is to apply to software and electronic
systems, for the following reasons:
1. The concept of "same" does not apply to a software product. No two
software products are the "same", ever. This was forcefully brought home to
me when, on the same project, I did a lot of work cleaning up the
configuration management process. All software is forever engaged in a
constant round of upgrades, and not only to the code: configuration
parameters are constantly changing regardless of what class of software you
are working with. So the software used here today will NEVER be the same as
the software you might deploy somewhere else tomorrow.
2. The concept of "identical environment" does not apply to software. As
with item 1, there is no such thing. In the software context, there are no
humanly verifiable criteria you could apply to establish that one
environment is identical to another.

What is the IEC thinking! If what they are doing involves software, they are
formalising and promulgating utter BS. They seem to be confusing software
development with cooking, where everything will still pan out fine if you
add a smidgen too little salt or a fraction too much spice. As we all know,
in software you need only move one ";" to the wrong place to end up with a
failed system (unfortunately this is demonstrated to me several times a day,
every day of the week).
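
To make the point concrete, a made-up C fragment (the names are invented
for illustration, not from any real system):

    #include <stdio.h>

    #define LIMIT 100

    static void open_relief_valve(void) { puts("relief valve opened"); }

    int main(void)
    {
        int pressure = 20;          /* well below the limit               */
        if (pressure > LIMIT);      /* stray ';' terminates the if here   */
            open_relief_valve();    /* so this call runs unconditionally  */
        return 0;
    }

One misplaced semicolon and the guard condition is gone: the "protected"
action now happens on every run, and it compiles all the same.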

With this kind of Stone Age thinking they should go back to standardising
nuts and bolts.

Les

-----Original Message-----
From: systemsafety
[mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of
Peter Bernard Ladkin
Sent: Monday, March 14, 2016 7:35 PM
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Modelling and coding guidelines: "Unambiguous
Graphical Representation"

On 2016-03-14 09:32, Coq, Thierry wrote:
> The argument about trusting proven in use components has been 
> completely disproved by the Ariane
> 501 flight and its consequences.

It hasn't.

The IEC is about to publish a technical specification on the criteria to be
fulfilled for a component to be considered "proven in use".

The Ariane 501 event was a case in which a component that had been reliable
in previous use was reused, without anyone apparently determining that the
inputs from Ariane 5 to the digital component were different from those it
had already successfully handled. There was no valid inference from Ariane 4
success to Ariane 5 success for this component (actually, for more than
one). As Ariane Flight 501 unfortunately demonstrated.

Ariane 501 is a good example of why the conditions on reuse must be applied
rigorously. I used it in my SSS2016 talk on statistical evaluation of
critical software.

> A proven-in-use component in one environment may be replete with 
> defects that may emerge in another environment.

That is why the environment for the proposed future use must be the "same"
in certain specific ways.

> It also has disproved most ways of thinking probabilities of failure 
> for software-dependent systems.

It hasn't vitiated any of the probabilistic material at all. Nobody's had to
retract a statistical paper because of it.

People working in the field have been constantly emphasising the need for
the "new" environment to be identical in pertinent ways to the environment
in which the component has proven its reliability in use. Weaken those
conditions at your peril.

The engineering question is the matter of judging when the pertinent
conditions have been fulfilled.

PBL

Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld,
33594 Bielefeld, Germany Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de






