[SystemSafety] Power Plants and Disruption

Peter Bernard Ladkin ladkin at rvs.uni-bielefeld.de
Wed Mar 23 07:55:14 CET 2016


There have been a couple of recent articles in the Risks Digest about the December 2015 disruption
of the power supply in parts of Ukraine: see https://catless.ncl.ac.uk/Risks/29.37.html and
https://catless.ncl.ac.uk/Risks/29.38.html . Much is being made of a couple of pieces of malware
which the disruptors used; indeed, in many of the reports of the incident on the WWW, this is most
of what is discussed. It is clear that there was a cyberattack. But the cyberattack per se did not
cause the electricity-supply outages, according to the SANS report at
http://ics.sans.org/media/E-ISAC_SANS_Ukraine_DUC_5.pdf .

The outages were caused by human action using valid authentication to the control systems. Nothing
new there, and not even much of interest to cybersecurity people. Apparently nobody knows who the
actors were. I noted in a submission to Risks that Ukraine is undergoing civil war. It could have
been legitimate personnel undertaking a hostile act, or it could have been intruders who obtained
authentication credentials by any of the well-known means.

Infrastructural cybersecurity, though, is a big issue in general. Half a decade after Stuxnet, it
appears that many operations personnel in civil nuclear facilities still think that systems which
are "air-gapped" are not vulnerable to malware, and plug their BYOD in. This comes from a recent
report by Chatham House at
https://www.chathamhouse.org/publication/cyber-security-civil-nuclear-facilities-understanding-risks
. The report caused quite a stir when published in October 2015, but our community seems to have
missed it. I participated last week in a consultation on it. I also have a blog post at
http://www.abnormaldistribution.org/2016/03/23/power-plants-and-cyberawareness/ which contains a
fair number of links.

You can see why cybersecurity is low on the list of concerns. Do a trivial, intuitive hazard
analysis. People in nuclear power plants have to worry about the condition and operation of valves,
pipes, turbines, pumps and pressure vessels; the maintenance of the reactor(s) in a steady state;
and the condition of the local fuel and spent-fuel storage. That is all stuff which causes real,
instant or semi-instant damage if it goes wrong. Compared with which, the computer systems? Well,
bah humbug. They don't blow up, distribute becquerels or emit any other nasties. The company pays
people to keep them clean and running, and they come and look at them every so often.

The trouble with this trivial, intuitive hazard analysis is that it is increasingly wrong about the
criticality of the computer-based systems, as the Chatham House report points out.

PBL

Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de






