[SystemSafety] A Fire Code for Software?

Martyn Thomas martyn at thomas-associates.co.uk
Tue Mar 20 14:43:41 CET 2018



On 20/03/2018 11:11, Peter Bernard Ladkin wrote:
> Speaking of criminal (under HSWA 1974) rather than civil liability,
>
> On 2018-03-19 10:39 , Martyn Thomas wrote:
>> .... if an OEM sells you a system that
>> has been built using COTS components (Linux, say) and doesn't tell you
>> that buried in your alarm is a Telnet service with a default password
>> (perhaps because the OEM doesn't even know), and then your system is
>> hacked with bad consequences ...
>> Then I'd prefer to be the expert witness for the prosecution, not for
>> the defence.
> Well, that depends also.
>
> <snip>
>
> At the moment, vulnerabilities are "eliminated or mitigated" by vendors in an atmosphere of
> confidentiality after date of discovery, and then announced. So the vendor says to the plant
> operator "you have a piece of kit of ours with model number XXXYYY. It is vulnerable to exploit; do
> <THIS>". They have presumably discharged their obligation under Section 6 Paragraph (2). The reason
> we don't see chips vulnerable to Spectre/Meltdown being removed by the lorryload is that <THIS>
> consists in throttling the speculative-execution behaviour and, whether operators have done it or
> not, this won't necessarily have affected the ability of the kit to do its daily job. If the kit is
> exploited after the vendor's intervention, surely any responsibility would rather lie with the
> operator's reaction to the intervention, not with the intervention itself.
>
> What if, in the specific case under consideration, the vendor regularly sent to all its customers a
> list of newly-discovered Linux vulnerabilities with the cover letter "our records show you are using
> a software version which might incorporate <newly discovered vulnerability>. Check if this function
> is present, and disable it/pay us money to disable it for you".
>
> So the existing regimes, including the new national implementations of EU 2016/1148, as they are
> currently followed, could be argued to discharge a vendor's obligation under HSWA Section 6
> Paragraph (2).
That's a defence to be tested in court if/when we see regular and
detailed disclosures of vulnerabilities by vendors to their customers.
If the vendor does this effectively, and as soon as reasonably
practicable, the vendor's duty may have been transferred to their customer.


>
> You set the example using COTS Linux, and it is a rather blatant example of what we might consider
> informally to be negligence .... on somebody's behalf, but not necessarily the vendor's. If the
> vendor indeed took a COTS Linux, then it was obtained from some known distribution. If the
> application is safety-critical, then either the vendor was large enough to re-engineer the Linux by
> itself, or the vendor bought in a version of Linux from one of the critical-system-Linux-vendors. If
> the latter, then if appropriate due diligence was performed by the vendor on the Linux code it was
> being sold, then that is (or would be argued to be) SFAIRP assessment behaviour. The Linux-vendor
> would have supplied a system with a vulnerability which it had contracted to assure was not present
> (if the contract was written appropriately, which I am assuming). So the vendor is off the hook, we
> might suppose. If the vendor had reengineered the Linux code itself, then yes, they are on the hook.
> That is because the original assessment would not have been SFAIRP - they incorporated a known
> vulnerability and amongst those methods of elimination/mitigation SFAIRP should surely be to check
> your functional components against known-vulnerability databases.

Suppose the Linux vulnerability was in the radio, which happens to share
a communications channel with safety-critical components.

Yes, typically the duty holder will be the system integrator - or
subsystem integrator, or ...

As lawsuits cascade down the supply chain (which we should call a
"supply tree" to be more accurate), the liability may end up with the
supplier who has the worst lawyers. It could be very costly, even if you
win.
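
The "check your functional components against known-vulnerability
databases" step that Peter mentions can at least be sketched
mechanically. The sketch below is illustrative only, not anything either
of us has proposed in detail: the component names, versions, and the
KNOWN_VULNS table are invented stand-ins, and a real audit would query a
live feed such as the NVD rather than a hard-coded dictionary.

```python
# Hypothetical known-vulnerability data: component -> affected versions.
# In practice this would come from a vulnerability feed (e.g. the NVD),
# keyed by a proper identifier (CPE, purl), not bare names.
KNOWN_VULNS = {
    "busybox-telnetd": {"1.30.0", "1.30.1"},  # e.g. a default-credential issue
    "linux-kernel":    {"4.9.0"},             # e.g. a speculative-execution flaw
}

def audit(bill_of_materials):
    """Return the components in the BOM whose versions are known-vulnerable."""
    findings = []
    for name, version in bill_of_materials:
        if version in KNOWN_VULNS.get(name, set()):
            findings.append((name, version))
    return findings

# Usage: a vendor's record of what shipped in a given product build.
bom = [("linux-kernel", "4.9.0"), ("busybox-telnetd", "1.31.0")]
print(audit(bom))
```

The point of the sketch is that the check itself is cheap once the
vendor actually keeps a bill of materials per shipped build; the hard
part, as discussed above, is who in the supply tree holds the duty to
run it and to notify downstream.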


>
> So I don't really see the teeth in the HSWA yet for your example. You'd need something like strict
> liability, but then similar kinds of issues as these would likely suggest that this is
> inappropriately crude. Strict liability has worked with commercial aviation for upwards of ninety
> years now, but that is civil liability, not criminal.
>
> Should we be having this discussion in a cyberpolicy journal?

Yes.

HMG has released a suggested Code of Practice for the cybersecurity of IoT:
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/686089/Secure_by_Design_Report_.pdf

Voluntary, of course. But all, presumably, considered to be "reasonably
practicable".

Martyn


