[SystemSafety] Static Analysis

Matthew Squair mattsquair at gmail.com
Mon Mar 3 11:57:03 CET 2014


Well,

I'm thinking that the empirical approach is a necessary part of any
endeavor with a modicum of innovation in it. Sometimes you really do need
to suck it and see, to paraphrase Clarke's second law.

As an example, that's what precursor missions are for: try the technology on a cheap 'throwaway' mission first. Sojourner for Discovery, Pioneer for Voyager. And when you don't, as in the case of Hubble, the resultant cost and schedule overruns speak volumes.

Though I'd be reluctant to apply that routinely to the traveling public en masse, of course, to pick up on Peter's point.


Matthew Squair

MIEAust, CPEng
Mob: +61 488770655
Email: Mattsquair at gmail.com
Web: http://criticaluncertainties.com

On 3 Mar 2014, at 8:51 pm, Michael Jackson <jacksonma at acm.org> wrote:

Peter:

I think Patrick Graydon's point is that in any system involving the physical world (including human behaviour) there are inescapable concerns that lie beyond the reach of mathematical and logical reasoning and demand tests and experiments for their investigation. For these concerns testing can show the presence of error but not its absence: infinite testing is not an option. Accepting this point, we must at some stage decide that no more testing is practicable, and that the system is now to be put into operation.

It is uncomfortable to characterise this decision as 'try-it-and-see', but it is correct in principle. Of course, for a critical system we are obliged to analyse the design and implementation very thoroughly, and to test very long and very hard. Then the system is put into operation with a circumspect realisation that there may be, and indeed probably are, some residual safety risks that have been detected neither by analysis nor by testing. 'Putting the system into operation' therefore becomes itself a careful and gradual process embodying a strong ingredient of further testing.

The phrase 'try-it-and-see' sounds like a sneer; but perhaps it is a valuable reminder that mathematical certainty of safety is simply not achievable.

-- Michael




At 07:46 03/03/2014, you wrote:

On 3 Mar 2014, at 08:02, Patrick Graydon <patrick.graydon at mdh.se> wrote:

> Hmm. While my (possibly ill-informed) opinion is that the non-safety
> world over-uses a try-it-and-see approach, I wonder if we can
> categorically say that try-it-and-see is /never/ appropriate in safety.


Most obviously, you are constrained by the regulatory environment. If the kit is for rail in Germany, then it must be approved for use by the regulator. It is usually replacing some kit or other, so it must be demonstrated and documented to be at least as safe as that which it is replacing. It's the law. You don't get to "try it and see".


Similarly, development according to IEC 61508 and its "derivatives" (which often aren't really) requires that you demonstrate that the requirements of the standard have been met. In some jurisdictions (not all European countries, but some), you can be criminally liable if your kit breaks and you hurt someone, and you didn't develop according to IEC 61508 provisions. Indeed, there is a European Directive from 2008 about products which might cause harm. It requires that a risk assessment be performed to determine whether the risk is acceptable or unacceptable. The directive issued in 2008, but it usually takes a year or two for a directive to make it into national laws (Germany's came in 2011). There, you don't get to 'try it and see' either.


Now, exactly how far people conform to all this is, as usual, a matter for
social negotiation. But if you want to 'try it and see' for safety-critical
kit of almost any description, then that had better be tinkering inside an
already-acceptable risk situation or you risk prosecution if something goes
wrong, modulo the enforcement situation. In Britain, you also have ALARP to
worry about.


Broadly speaking, Les's observation that no, you can't do that with safety-critical kit is thus ensconced in European practice and law. How far that situation actually governs what people do is another matter. Like the treaty (then law) which says you can only run an annual budget deficit of 3%, broken within three years by France, then Germany...


PBL


Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited

_______________________________________________

The System Safety Mailing List

systemsafety at TechFak.Uni-Bielefeld.DE



