[SystemSafety] Collected stopgap measures

Peter Bernard Ladkin ladkin at causalis.com
Fri Nov 16 09:41:33 CET 2018


I have just come back from a meeting of the 61508 MTs in Grenoble and this feels to me like a
parallel universe.

On 2018-11-16 02:42 , Paul Sherwood wrote:
>>> -----Original Message-----
>>> > For the software-only properties, it's obvious that we DO NOT need documented requirements, or
>>> > documented design. Software is often (almost always, these days, in agileworld?) successfully
>>> > evolved and consumed without either of these.
> 
> ... but I still stand by this statement.

IEC 61508 and (as far as I am aware) ISO 26262 require there to be a software safety requirements
specification. In IEC 61508 there is a whole subclause, 7.2, specifying it, running to 3.5 pages.

So are we talking about so-called "safety" applications which, through some magic, do not have to
conform to applicable safety standards? Or are we talking about cowboy developers who claim to be
producing software for "safety" applications but in fact aren't?

The people who commission, install and operate safety-related systems in any sector except medical,
automotive and aerospace do not, as far as I am aware, commission software from companies which are
not able to produce conformance documentation.

So where are all these software engineers producing software for safety applications who don't
produce documentation?

I can all but guarantee they are not producing software for market-leading safety-related systems
developers and integrators, because all of those of which I am aware require adherence to the
applicable safety standards; otherwise one accident and the lawyers will force them to close up shop
(and in the UK the Board would have to work hard to stay out of jail).

> AFAIK there were never any a-priori requirements or architecture for:
> 
> - linux kernel
> - openssh
> - gcc
> - llvm
> - python
> 
> ... or most of the software that Google runs internally (i'm sure others can provide many additional
> examples).
> 
> The fact that such software exists and is widely relied upon and trusted is enough to justify the
> statement.

No, it is evidently not. It is enough to justify the statement that relatively reliable software has
been developed for some applications without documented requirements or documented design. It does
not follow that all software for all applications can be developed without them. The fact that I
don't need a map to travel around Bielefeld doesn't mean I don't need maps for other places.

> I can't see how anyone could claim to have engineered a system for safety or security without
> stating what losses/hazards/threats that aim to address (requirements) and how the solution is
> supposed to be achieved (architecture). But these are system properties etc etc.

I can't parse the first sentence, but you are right: a risk analysis is required, safety
requirements based on that risk analysis must be formulated, and the software design must be
accompanied by documentation showing how the software safety requirements are met. None of these are
"system properties etc etc". They are documentation.

> And yet I keep on encountering supposedly expert safety folks who are happy to claim things like
> "with this 'safe' hypervisor you can run untrusted code in an internet-facing guest alongside safety
> critical functions."

It is not at all clear what you mean here by "expert safety folks". Lots of people want to talk
about E/E/PE system safety, but that doesn't mean they are expert. I have found that a handy rule of
thumb is to ask a question involving "E/E/PE" and see whether they know what it means.

I once saw an advertisement for a conference on safety of software, with some moderately well-known
computer-science theoreticians on the program committee - and not a single person recognised in the
"safety community". So I mailed one of these distinguished people to ask how she could help organise
a conference on safety without a safety expert. What would they be discussing? She evaded the
question, referring me to the committee chairman. I imagine it was another bunch of people wanting
to talk to each other about the reliability of such systems - the usual confusion. (Not that it is
at all bad to talk about reliability!)

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de






