[SystemSafety] 1. Software Update Workshop Proceedings, 2. Siemens Position Paper on Cybersecurity

Peter Bernard Ladkin ladkin at causalis.com
Tue Sep 19 09:15:59 CEST 2017


1. The US National Academy of Sciences Forum on Cyber Resilience holds a series of
workshops. One, in February 2017, was on software updating. It raised the usual issues, and a few more
besides. In particular, for critical systems, it addressed the tension between validation processes
and the need for rapid patching of newly discovered security weaknesses. How well it was
addressed I leave to the reader. As with all NAS publications, the electronic download is free of
charge.

https://www.nap.edu/download/24833

2. In August 2017, the large engineering company Siemens published a position paper on
cybersecurity, which I attach here. It came to my attention a couple of weeks ago, and I have
checked with the distributor that it is freely distributable. There are many points in it which
raise important questions. Siemens has almost half a million employees; given its market power,
these questions should be discussed.

a. Siemens wants to separate cybersecurity regulation from safety regulation. Those of us in the
standardisation process have been dealing with this position for a year or so. It is contrary to
the position expressed in the German electrotechnical guideline for IACS from VDE, VDE-AR-E 2802-10-1. The
VDE committee considered over many years how cybersecurity issues and measures may affect safety
functions (in the IEC 61508 sense) in IACS, and came to the conclusion that there are inevitably
mutual influences. VDE-AR-E 2802-10-1 proposes how to reconcile (prioritise) those influences. I
have examples in which cybersecurity and safety functionality are inevitably intertwined (in the
helpful phrase of Swartout and Balzer). Apparently Siemens doesn't want to do it like this, but I
have seen no concrete proposal for how to deal with the apparent counterexamples.
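
For concreteness, here is a minimal sketch of such an intertwining (my own toy example in Python;
all names are hypothetical, not drawn from any product or standard): a routine cybersecurity
measure, account lockout after repeated failed logins, can deny an operator a safety action, so
the two sets of requirements cannot be engineered in isolation.

    # Toy example: a lockout policy (security) can block an emergency
    # shutdown command (safety). Neither requirement is wrong on its own;
    # it is their interaction that needs to be reconciled/prioritised.

    MAX_FAILED_LOGINS = 3

    class OperatorStation:
        def __init__(self) -> None:
            self.failed_logins = 0
            self.authenticated = False

        def login(self, password_ok: bool) -> None:
            if self.failed_logins >= MAX_FAILED_LOGINS:
                return  # security: account locked out
            if password_ok:
                self.authenticated = True
            else:
                self.failed_logins += 1

        def emergency_shutdown(self) -> str:
            # Security requirement: only authenticated commands accepted.
            # Safety requirement: shutdown must always be available.
            if not self.authenticated:
                return "refused"  # safety function blocked by security
            return "shutdown initiated"

    station = OperatorStation()
    for _ in range(3):
        station.login(password_ok=False)  # e.g. mistyped under stress
    station.login(password_ok=True)       # correct password, but locked out
    print(station.emergency_shutdown())   # "refused"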

b. Siemens seems to want to separate responsibilities quite clearly between manufacturer, system
integrator, and operator. A manufacturer M produces product X with claimed properties P.X
(substantiated by third-party assessors, if you like). X is installed by a system integrator SI in
an IACS system S. In this particular environment, X is agreed to have claimed properties S.X (which
might in some ways differ from P.X: enhanced, reduced, or whatever). Operator O then operates
and maintains the plant. P.X and S.X are shared amongst everyone. Siemens seems to want SI to
shoulder the responsibility for S.X, and O to shoulder the responsibility for any problems arising
when the plant is in operation.
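
A minimal sketch of this property-claim chain (Python; the names and properties are mine, purely
illustrative) shows how P.X and S.X can diverge during integration, and that the difference is
exactly what lands on the operator's plate under the proposed division of responsibility:

    # P.X: properties claimed by manufacturer M for product X.
    # S.X: properties agreed for X as installed by integrator SI in system S.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Product:
        name: str
        claimed: frozenset  # P.X

    @dataclass(frozen=True)
    class InstalledProduct:
        product: Product
        added: frozenset = frozenset()    # properties gained in this environment
        removed: frozenset = frozenset()  # properties lost in this environment

        def effective(self) -> frozenset:
            """S.X: P.X as enhanced or reduced by the installation."""
            return (self.product.claimed - self.removed) | self.added

    x = Product("X", frozenset({"signed-firmware", "role-based-access"}))
    x_in_s = InstalledProduct(x, removed=frozenset({"role-based-access"}))

    # The gap between P.X and S.X, which O inherits in operation:
    print(sorted(x.claimed - x_in_s.effective()))  # ['role-based-access']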

The main issue here is that this seems to extend the "shrink-wrapped software" model to IACS kit. It
is widely accepted that that model has worked poorly in ensuring the quality of software, and the
worry must surely be that it will (continue to) work equally poorly in ensuring the appropriate
resilience of IACS kit.

An alternative model, practised in most markets for other kinds of engineered kit, such as cars and
airplanes, as well as consumer products more generally, is that M is responsible for the fitness
for purpose of X throughout its operational lifetime. (Of course issues arise concerning whether SI
and O behave appropriately according to the advice and requirements of M, concerning recalls and so
on, but this is a different matter. As is how one might define "operational lifetime"; the US
general aviation industry grappled with this liability issue in the 1980s-90s.) Cars and airplanes
have relatively good safety records, with airworthiness and safety recalls overseen by government
regulators. There are strong proponents of this model on this list.

It is worth recalling, maybe, that Siemens makes the Simatic S7 controller, a model of which was
compromised in the Stuxnet incident. Siemens lists the Simatic S7-1500 in a table of product
certifications, and says it has been certified to CSPN by ANSSI. I do know that the S7-1500 is used
in nuclear power applications, because I have a PhD student at Areva who is working on its security
analysis in that application in the SMARTTEST project.

c. A couple of other products apparently also carry ANSSI-CSPN certification, or TÜV TRUST IT
certification. For many others, there seems to be only self-certification against IEC 62443 or ISO
27001. Siemens explicitly advocates international standardisation as a means of ensuring
cybersecurity fitness for purpose.

Now, I have spent, and continue to spend, a lot of my time in general standardisation activity.
International standards for umbrella concepts such as safety and cybersecurity are, perhaps
inevitably, flawed. Two glaring examples from a twenty-year-old standard, IEC 61508:

a) There is apparently no term for expressing a flaw in which the requirements specification fails
to capture an operational situation, so that the system works differently from what is wanted or
expected. I have called this a "requirements flaw" or "requirements error". Such requirements
errors are responsible for the vast majority of mission-failure incidents in complex kit, and this
has been known for nearly three decades (cf. Robyn Lutz's work with NASA). How can we have had a
standard for two decades which doesn't explicitly treat the known main cause of mission failure?
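
A toy illustration (entirely mine, not drawn from Lutz's data): the code below satisfies its
written requirement exactly, and every test derived from that requirement passes, yet the
requirement failed to anticipate an operational situation, so the system misbehaves without
containing a single coding error.

    PRESSURE_LIMIT_BAR = 10.0

    def relief_valve_should_open(pressure_bar: float) -> bool:
        """Requirement: 'Open the relief valve when pressure exceeds 10 bar.'"""
        return pressure_bar > PRESSURE_LIMIT_BAR

    # Every test derived from the written requirement passes:
    assert relief_valve_should_open(12.0) is True
    assert relief_valve_should_open(8.0) is False

    # Unanticipated operational situation: a failed transducer reports a
    # negative value. The requirement says nothing about sensor failure,
    # so the valve stays shut whatever the actual pressure. The flaw is
    # in the requirements, not in the code.
    print(relief_valve_should_open(-1.0))  # False: per spec, yet unsafe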

b) The terms "verification" and "validation" mean assessing the fitness for purpose of kit in
general terms (verification) and in the specific application (validation). It has been known for
software, and made explicit, for half a century that testing shows the presence of flaws but not
their absence. Nevertheless, we continue to read in IEC 61508-3:2010 subclause 7.7.2.7 a) that
"testing shall be the main validation method for software". Were it to read "testing shall be the
main invalidation method for software", it would be OK.
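
To illustrate (again my own toy example): the function below passes every test written for it, so
"validation by testing" would pronounce it fit, yet it is wrong. A passing test suite has merely
failed to invalidate the code, which is not at all the same as validating it.

    def is_leap_year(year: int) -> bool:
        return year % 4 == 0  # flaw: ignores the century rules

    # The whole test suite passes:
    for year, expected in [(2016, True), (2019, False), (2020, True)]:
        assert is_leap_year(year) is expected

    # One further input shows the *presence* of a flaw; no finite run of
    # passing tests could have shown its absence.
    print(is_leap_year(1900))  # True, but 1900 was not a leap year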

Standards for such things as plugs and telecommunications protocols, where things have to work a
certain way and parties have to agree on dimensions, generally work very well. Standards concerning
more abstract, general engineering approaches, such as assuring safety and security, are more often
flawed than not. I have advocated expert peer review of proposed international standards. The
experience of the European Union in annually reviewing all of its large consortium research
projects has generally been that such review enhances the quality of the products. In an ideal
world, in which international standards on such things as safety, cybersecurity and resilience
were of very high technical quality, certification according to those standards would be one way
of assuring fitness for purpose. But there is no certification regime attached to IEC 62443 or to
IEC 61508. At best, there is third-party certification through institutions such as ANSSI and the
TÜVs. Self-certification according to IEC 62443, as Siemens advertises in a table, establishes
exactly what properties of their product? They don't say.

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de




Attachment: 2017_08_25-SiemensPositionPaper_ITSecurity_final-1.pdf (application/pdf, 302417 bytes)
URL: <https://lists.techfak.uni-bielefeld.de/mailman/private/systemsafety/attachments/20170919/ce338a5d/attachment-0001.pdf>

