[SystemSafety] Safety and effective or not cybersecurity countermeasures

Peter Bernard Ladkin ladkin at causalis.com
Thu Jun 6 11:28:43 CEST 2019


Bruce,

thanks for your detailed comments. Some observations:-

On 2019-06-06 09:51, Bruce Hunter wrote:
> 
> On 27/05/2019 at 09:15, Peter Bernard Ladkin wrote:
> 
> [begin quote] [IEC TR 63069 Ed 1 Section 5 Guiding Principle 1: protection of safety implementations]
> .... Evaluations of safety functions
> should be based on the assumption of effective (security) countermeasures.
> 
> [end quote]
> 
> ....there is a lot wrong with *assuming effective cybersecurity
> countermeasures are in place* while evaluating safety functions.
> 
> Wording is often a compromise with consensus standards and I agree that the second sentence in IEC
> TR 63069 may have been phrased better to convey the intended meaning.

For some of your colleagues in WG20, the intended meaning is exactly what is written. They believe
that safety evaluations and measures and cybersecurity evaluations and measures have - and should
continue to have - nothing to do with each other. (They advocate this, despite the clear indications
in the IEC TR 63069 explanation of threat-risk assessment <security> that they are inevitably
intertwined.)

I suspect this comes at least in part from the existing process-industry culture. The safety people
are engineers, know FTAs and HAZOP and so on. The security people wear blue work clothes, have belt
radios and large flashlights and Alsatian dogs and check fences and locks. Now both groups have to
reconcile themselves with people talking about buffer-overflow vulnerabilities and PKI, and neither
group really sees its way clear to accommodating that so easily.

> While it would be good to integrate safety and security risk management, it is not practical to do
> this, leading to the guidance in section 6.1 and Figure 4, Safety and security interaction.

IEC 61508 is concerned with E/E/PE systems. I think it is perfectly possible to deal with technical
concerns about unintentional malfunction and intentional malfunction of E/E/PE systems the same way,
namely with concern about the possibility of malfunction and how it can be ameliorated. Qualitative
FTs and Attack Trees are essentially the same technique; Attack Patterns are Hoare Logic for
malfeasants rather than for dependability.
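
The structural identity of qualitative fault trees and attack trees can be sketched concretely: the same AND/OR tree, with the same cut-set computation, reads either way depending only on whether the leaves name component failures or attacker actions. This is a minimal illustration; the tree shapes and event names below are invented, not drawn from any standard.

```python
# Sketch: one AND/OR tree structure serves as a qualitative fault tree
# (leaves = component failures) or an attack tree (leaves = attacker
# actions). All event names are hypothetical.

from itertools import product

def cut_sets(node):
    """Return the minimal cut sets of an AND/OR tree.

    A node is either a leaf (a string naming a basic event) or a
    tuple (gate, children) with gate in {"AND", "OR"}.
    """
    if isinstance(node, str):
        return [{node}]
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":
        # any one child's cut set suffices for the parent event
        combined = [s for cs in child_sets for s in cs]
    else:
        # AND: take one cut set from each child and union them
        combined = [set().union(*combo) for combo in product(*child_sets)]
    # discard non-minimal sets (those with a strict subset also present)
    return [s for s in combined if not any(t < s for t in combined)]

# Read as a fault tree: top event "loss of interlocking function".
fault_tree = ("OR", [
    ("AND", ["primary channel fails", "backup channel fails"]),
    "common power supply fails",
])

# Read as an attack tree: top event "attacker disables interlocking".
attack_tree = ("OR", [
    ("AND", ["compromise primary channel", "compromise backup channel"]),
    "cut common power supply",
])

print(cut_sets(fault_tree))
print(cut_sets(attack_tree))
```

The analysis machinery is indifferent to whether the leaves are stochastic failures or deliberate acts; only the leaf semantics (and hence the quantification) differ.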

If there is a conflict between safety evaluation and cybersecurity evaluation, it lies rather in the
different notions of risk used in the two subfields of dependability. If one takes a risk-based
approach to safety (and cybersecurity), as IEC 61508 professes to do, one should/must address the
difference in the nature of risk.
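
The difference in the nature of risk can be made concrete with a toy calculation (all numbers invented): safety risk aggregates stochastic scenarios as an expectation, whereas for a deliberate adversary there is no meaningful occurrence rate and a worst-case view is the more natural fit.

```python
# Hypothetical illustration of the two risk notions. Scenario names,
# probabilities (per hour of operation) and harm values are invented.

scenarios = {
    "sensor drift":     {"probability": 1e-3, "harm": 10},
    "valve stuck open": {"probability": 1e-5, "harm": 1000},
    "spoofed setpoint": {"probability": None, "harm": 1000},  # adversarial: no rate
}

# Safety view: expected harm over the stochastic scenarios.
safety_risk = sum(s["probability"] * s["harm"]
                  for s in scenarios.values()
                  if s["probability"] is not None)

# Security view: worst case over what an adversary could choose to do.
security_risk = max(s["harm"] for s in scenarios.values()
                    if s["probability"] is None)

print(safety_risk)    # expectation
print(security_risk)  # maximum
```

A standard that professes a single risk-based approach has to say which of these aggregations applies where, or how they are to be combined.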

> On 05/06/2019 at 09:15, David Mentré wrote:
> Taking as example a software based railway interlocking control device with some networking
> function. .....
> 
> I agree with David’s position and this is a good example to use in explaining my view of WG20’s
> intent of the Guiding Principles and the WG20 committee. 

No, sorry, it is not a good example. It is a rotten example. IEC 62443 and IEC TR 63069 are
explicitly for IACS, not for rail. You can't put fences around rail kit. Your interlocking subsystem
is one zone; your dispatching/train control system another zone; i.e., zone = system and is thereby
a redundant concept. Key systems such as control and interlocking are geographically distributed and
largely cannot be shielded from physical interference, as you can a chemical reactor and its control
system.

> Railway Interlock systems would have a SIL rating of at least 3 (failure rate of < 10^-4). 

Rail interlocking systems in dense European rail networks have to be essentially technically
perfect. The situation is different in a continent such as Australia, with lines such as the Great
Northern, blocks a few hundred miles long and trains a few times a week, as my colleagues at
Queensland Rail explained in detail to me a decade and a half ago.

You raise a number of issues with the application of 63069 and 62443, which to my mind are worth
looking at in more detail, but we surely need a process-industry example to do so.

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de




