[SystemSafety] Safety Cases

Les Chambers les at chambers.com.au
Mon Feb 10 02:33:04 CET 2014


Drew

Seems like a very fine distinction to me. Any safety program worth its salt
has a hazard analysis which generates safety requirements. Those safety
requirements are then fed into the design exercise. In my experience they
often have a major impact on hardware/software architectures. In any event,
at some point an analysis is done to check that the requirements have been
satisfied. The overall V&V program should pick up safety requirements along
with non-safety-related functional and non-functional requirements. There
should be a heap of verification and validation documentation to support
that, produced as part of the normal program. Once you have "done" all that,
you can "claim" the system is safe, summarising your claim in a safety case
- which, if you have your wits about you, you should be building as the
project progresses. In fact, if customers want to stay on top of this issue
they should require a progressive release of the safety case, for example:
on completion of requirements analysis, on completion of architectural
design, and again at project completion.
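The chain Les describes - hazards generate safety requirements, which the
V&V program must close out with evidence before a safety case can "claim"
anything - is essentially a traceability structure. A minimal sketch in
Python (all class names, IDs, and example data here are hypothetical, just
to illustrate the idea of a progressive claim check at a milestone):

```python
# Hypothetical sketch of hazard -> safety requirement -> evidence traceability.
# A requirement is only "claimable" once V&V evidence has been attached to it.

from dataclasses import dataclass, field

@dataclass
class SafetyRequirement:
    req_id: str
    text: str
    derived_from_hazard: str                       # back-link to hazard analysis
    evidence: list = field(default_factory=list)   # V&V artefacts closing it out

    def is_closed(self) -> bool:
        # "Done" is not "claimed": the claim needs supporting evidence.
        return bool(self.evidence)

def open_requirements(requirements):
    """Requirements with no supporting evidence: these block the safety claim."""
    return [r for r in requirements if not r.is_closed()]

# Example milestone check (invented data):
reqs = [
    SafetyRequirement("SR-001", "Brake command latency < 100 ms", "HAZ-12",
                      evidence=["test report TR-7"]),
    SafetyRequirement("SR-002", "Dual-channel sensor voting", "HAZ-03"),
]
blocking = open_requirements(reqs)
# SR-002 has no evidence yet, so a progressive safety-case release at this
# milestone could not claim it is satisfied.
```

A progressive release of the safety case then amounts to running this kind
of check at each agreed milestone and publishing the open items alongside
the claims made so far.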

Les

 

From: systemsafety-bounces at lists.techfak.uni-bielefeld.de
[mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of
Andrew Rae
Sent: Monday, February 10, 2014 10:16 AM
To: Tracy White
Cc: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Safety Cases

 

Tracy,
The important point is that "done" and "claimed" are different things, not
synonyms as you imply. Activities that are very good at the "done" are not
necessarily very useful for the "claimed" and vice versa. 

In particular, a lot of activities that go into making a safe design are
only indirectly evidence that the design is safe. In essence, they are
evidence that you've tried hard, not that you've achieved anything. This is why
we distinguish between "process" and "product" evidence. One of the
advantages of explicit safety cases is they force you to consider exactly
what your evidence shows or doesn't show. 

Contrariwise, some activities which are used a lot to generate evidence are
only indirectly helpful at making a design safer. A lot of quantitative
analysis goes into this basket. Only if it reveals issues that are addressed
through changes to design or operation can quantitative analysis actually
directly improve safety. Otherwise it is evidence without improvement. 

Drew




My system safety podcast: http://disastercast.co.uk
My phone number: +44 (0) 7783 446 814
University of York disclaimer:
http://www.york.ac.uk/docs/disclaimer/email.htm

 

On 9 February 2014 22:26, Tracy White <tracyinoz at mac.com> wrote:

 

[Andrew Rae Stated]

 

(Note: not all safety activities are about evidence. Most of them are about
getting the design right so that there _aren't_ safety problems that need to
be revealed).

 

I completely agree that ‘getting the design right’ is an important element
of any assurance argument but I disagree that it can be done (claimed)
without providing ‘evidence’. If you think you can claim that you got the
‘design right’, then you must have done something to achieve that and for
those efforts there will be evidence.

 

Regards, Tracy 

 


On 7 Feb 2014, at 23:24, Andrew Rae <andrew.rae at york.ac.uk> wrote:

If I can slightly reframe from Martyn's points, the real problem is asking
these questions in the negative. If the system _didn't_ have the properties
it needs, what activities or tests would be adequate to reveal the problems?


Whenever there is a focus on providing evidence that something is true, this
is antithetical to a proper search for evidence that contradicts. 
As Martyn points out, most evidence is not fully adequate to show that
properties are true. The best we can do is to select evidence that would
have a good chance of revealing that the properties were not true.

(Note: not all safety activities are about evidence. Most of them are about
getting the design right so that there _aren't_ safety problems that need to
be revealed). 


Simple question for the list (not directly related to safety cases):


How often have you seen a safety analysis that was:

    a) Conducted for a completed or near completed design

    b) Revealed that the design was insufficiently safe

    c) Resulted in the design being corrected in a way that addressed the
revealed problem(s)

Supplementary question: 

   What was the activity?

[Not so hidden motive for asking, just so the question doesn't look like a
trap - I've seen a lot of QRA type analysis that meets (a), but the only
times I've seen (b) and (c) follow on are when the analysis is reviewed, not
when the analysis is conducted]

Drew

 

 


1    What properties does the system need to have in order for it to be
adequately dependable for its intended use? (and how do you know that these
properties will be adequate?) 
2    What evidence would be adequate to show that it had these properties?
3    Is it practical to acquire that evidence and, if not, what is the
strongest related property for which it would be practical to provide strong
evidence that the property was true?
4    What are we going to do about the gap between 1 and 3?




My system safety podcast: http://disastercast.co.uk
My phone number: +44 (0) 7783 446 814
University of York disclaimer:
http://www.york.ac.uk/docs/disclaimer/email.htm

 

On 7 February 2014 12:05, RICQUE Bertrand (SAGEM DEFENSE SECURITE)
<bertrand.ricque at sagem.com> wrote:

It seems to me that, at the end of the reasoning, standard xyz (e.g. IEC
61508) requires some work to be done and made available in documents
(whatever their name). Standard xyz contains (strong) requirements on 1 and
(weaker) requirements on 2, but at least requirements on the means and
methods to achieve 1.

 

It looks circular.

 

In the understanding of stakeholders, being compliant with standard xyz
means not doing a lot of engineering work that is, unfortunately, explicit
or implicit in standard xyz. But most often they have never even read it.
This also partly explains the observed gap in the industry.

 

Bertrand Ricque

Program Manager

Optronics and Defence Division

Sights Program

Mob : +33 6 87 47 84 64

Tel : +33 1 59 11 96 82

Bertrand.ricque at sagem.com

 

 

 

From: systemsafety-bounces at lists.techfak.uni-bielefeld.de
[mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of
Martyn Thomas
Sent: Friday, February 07, 2014 12:16 PM
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: [SystemSafety] Safety Cases

 

In the National Academies / CSTB Report  Software for Dependable Systems:
Sufficient Evidence?
(http://sites.nationalacademies.org/cstb/CompletedProjects/CSTB_042247) we
said that every claim about the properties of a software-based system that
made it dependable in its intended application should be stated
unambiguously, and that every such claim should be shown to be true through
scientifically valid evidence that was made available for expert review. 

It seems to me that this was a reasonable position, but I recognise that it
is a position that cannot be adopted by anyone whose livelihood depends on
making claims for which they have insufficient evidence (or for which no
scientifically valid evidence could be provided). Unfortunately, much of the
safety-related systems industry is in this position (and the same is true,
mutatis mutandis, for security).

It seems to me that some important questions about dependability are these:

1    What properties does the system need to have in order for it to be
adequately dependable for its intended use? (and how do you know that these
properties will be adequate?) 
2    What evidence would be adequate to show that it had these properties?
3    Is it practical to acquire that evidence and, if not, what is the
strongest related property for which it would be practical to provide strong
evidence that the property was true?
4    What are we going to do about the gap between 1 and 3?

The usual answer to 4 is "rely on having followed best practice, as
described in Standard XYZ". That's an understandable position to take, for
practical reasons, but I suggest that professional integrity requires that
the (customer, regulator or other stakeholder) should be shown the chain of
reasoning 1-4 (and the evidence for all the required properties for which
strong evidence can be provided) and asked to acknowledge that this is good
enough for their purposes. 

I don't care what you choose to call the document in which this information
is given, so long as you don't cause confusion by overloading some name that
the industry is using for something else.

I might refer to the answers to question 1 as a "goal", if I were trying to
be provocative.

Martyn




#
" This e-mail and any attached documents may contain confidential or
proprietary information and may be subject to export control laws and
regulations. If you are not the intended recipient, you are notified that
any dissemination, copying of this e-mail and any attachments thereto or use
of their contents by any means whatsoever is strictly prohibited.
Unauthorized export or re-export is prohibited. If you have received this
e-mail in error, please advise the sender immediately and delete this e-mail
and all attached documents from your computer system."
#


_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE

 


 
