[SystemSafety] A Fire Code for Software?

Peter Bernard Ladkin ladkin at causalis.com
Tue Mar 20 12:11:00 CET 2018


Speaking of criminal (under HSWA 1974) rather than civil liability,

On 2018-03-19 10:39 , Martyn Thomas wrote:
> .... if an OEM sells you a system that
> has been built using COTS components (Linux, say) and doesn't tell you
> that buried in your alarm is a Telnet service with a default password
> (perhaps because the OEM doesn't even know), and then your system is
> hacked with bad consequences ...
> Then I'd prefer to be the expert witness for the prosecution, not for
> the defence.
Well, even that depends.

A decade and a half ago, there was a bug in ssh that allowed an attacker to obtain "root"
privilege from a running process, so such occurrences are not unknown. The recent Spectre and
Meltdown vulnerabilities are other examples, but we haven't seen vulnerable chips being removed from
critical plant by the lorryload.

Suppose an exploit of one of those resulted in harm, and the vendor was prosecuted. An ideal
explanation of the defendant's behaviour would surely centre on the reasonableness of the care
taken by the vendor after discovery of the vulnerability. That seems to be an obligation under
Section 6 Paragraph (2), which you quote.

At the moment, vulnerabilities are "eliminated or mitigated" by vendors in an atmosphere of
confidentiality after the date of discovery, and then announced. So the vendor says to the plant
operator: "you have a piece of kit of ours with model number XXXYYY. It is vulnerable to exploit; do
<THIS>". They have thereby presumably discharged their obligation under Section 6 Paragraph (2). The
reason we don't see chips vulnerable to Spectre/Meltdown being removed by the lorryload is that
<THIS> consists of throttling the speculative-execution behaviour and, whether operators have done
it or not, this won't necessarily have affected the ability of the kit to do its daily job. If the
kit is exploited after the vendor's intervention, surely any responsibility would lie with the
operator's reaction to the intervention, rather than with the intervention itself.

What if, in the specific case under consideration, the vendor regularly sent all its customers a
list of newly discovered Linux vulnerabilities with the cover letter "our records show you are using
a software version which might incorporate <newly discovered vulnerability>. Check whether this
function is present, and disable it/pay us to disable it for you"?
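That notification step amounts to matching the vendor's customer records against a newly published vulnerability list. A minimal sketch, assuming entirely hypothetical customer records, model numbers, and vulnerability entries (the CVE identifiers and data here are illustrative, not real advisories):

```python
# Hypothetical sketch: a vendor matching its customer records against a
# newly published vulnerability list to generate notices of the kind
# described above. All names, versions and CVE ids are illustrative.

# Vendor's records: which customer runs which kit and software version.
customer_records = {
    "plant-operator-a": {"model": "XXXYYY", "linux_version": "4.9.1"},
    "plant-operator-b": {"model": "ZZZQQQ", "linux_version": "4.14.3"},
}

# Newly disclosed vulnerabilities, each listing its affected versions.
new_vulnerabilities = [
    {"id": "CVE-2018-0001", "affected_versions": {"4.9.1", "4.9.2"}},
    {"id": "CVE-2018-0002", "affected_versions": {"3.16.0"}},
]

def notices_for(records, vulnerabilities):
    """Return one notice per (customer, vulnerability) match."""
    notices = []
    for customer, record in records.items():
        for vuln in vulnerabilities:
            if record["linux_version"] in vuln["affected_versions"]:
                notices.append(
                    f"{customer}: kit {record['model']} may incorporate "
                    f"{vuln['id']}; check whether the function is present "
                    f"and disable it."
                )
    return notices

for notice in notices_for(customer_records, new_vulnerabilities):
    print(notice)
```

Whether routinely running something like this discharges the Section 6 obligation is, of course, exactly the legal question.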

So the existing regimes, including the new national implementations of EU 2016/1148 (the NIS
Directive), as they are currently followed, could be argued to discharge a vendor's obligation under
HSWA Section 6 Paragraph (2).

You set the example using COTS Linux, and it is a rather blatant example of what we might informally
consider to be negligence .... on somebody's behalf, but not necessarily the vendor's. If the
vendor indeed took a COTS Linux, then it was obtained from some known distribution. If the
application is safety-critical, then either the vendor was large enough to re-engineer the Linux by
itself, or the vendor bought in a version of Linux from one of the critical-system-Linux vendors. In
the latter case, if the vendor performed appropriate due diligence on the Linux code it was being
sold, then that is (or would be argued to be) SFAIRP (so far as is reasonably practicable)
assessment behaviour. The Linux vendor would have supplied a system with a vulnerability which it
had contracted to assure was not present (if the contract was written appropriately, which I am
assuming). So the vendor is off the hook, we might suppose. If the vendor had re-engineered the
Linux code itself, then yes, they are on the hook. That is because the original assessment would not
have been SFAIRP: among the methods of elimination/mitigation to be applied SFAIRP, surely one is
checking your functional components against known-vulnerability databases.
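That check is mechanical in principle: compare the product's component list against a vulnerability database keyed by component and version. A minimal sketch, assuming a local snapshot of such a database (the CVE identifiers are real historical examples, but the data layout and matching scheme are my own illustration; a real assessment would draw on NVD or a distribution's security tracker, and would match version ranges, not exact versions):

```python
# Hypothetical sketch of the assessment step described above: checking a
# product's software bill of materials against a known-vulnerability
# database before (and after) release.

# The product's component list: component name -> version.
components = {
    "openssh": "4.3",
    "busybox": "1.27.2",
}

# A local snapshot of a known-vulnerability database. In practice this
# would be fed from NVD or a vendor's security tracker and would use
# version ranges; exact-version keys keep the sketch simple.
known_vulnerabilities = {
    ("openssh", "4.3"): ["CVE-2006-5051"],
    ("telnetd", "0.17"): ["CVE-2011-4862"],
}

def assess(bom, database):
    """Return the known vulnerabilities matching each component/version."""
    findings = {}
    for name, version in bom.items():
        cves = database.get((name, version))
        if cves:
            findings[name] = cves
    return findings

print(assess(components, known_vulnerabilities))
# -> {'openssh': ['CVE-2006-5051']}
```

A vendor that re-engineered its own Linux and skipped even this cheap a check would, on the argument above, struggle to claim its assessment was SFAIRP.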

So I don't really see the teeth in the HSWA yet for your example. You'd need something like strict
liability, but issues similar to these would likely suggest that strict liability is
inappropriately crude here. Strict liability has worked in commercial aviation for upwards of ninety
years now, but that is civil, not criminal, liability.

Should we be having this discussion in a cyberpolicy journal?

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de






