[SystemSafety] Does "reliable" mean "safe" and or "secure" or neither?

Les Chambers les at chambers.com.au
Wed Apr 20 23:18:07 CEST 2016


One could take advice on defining integrity from the Latin integritās:
whole, intact, chaste, pure, correct, sound, uncorrupted.
In general integrity is associated with moral behaviour.

I don't think it's a virtue in itself; it's more a performance art - a virtuosity, manifested by the actions of the virtuous. It can be demonstrated by a thing, an individual or a group - usually under stress. In fact, integrity is always associated with a struggle between the good and bad sides of our nature. Which leads me to the cardinal virtues: temperance, courage, prudence and justice, where "cardinal" means to sustain in the face of difficulty. Studying the properties and behaviours of these virtues may be a good place to start if you need to delve further. They all represent metaphors that can be projected onto the properties of the "things" we build. This is even more important as the intelligence of these things asymptotically approaches, and one day may exceed, that of humanity. Example: Microsoft's failed chat bot Tay here: http://www.abc.net.au/news/2016-03-25/microsoft-created-ai-bot-becomes-racist/7276266 .

But here's the thing: any standards body that goes down this path will soon encroach upon the territory of established religion, whose moral codes often diverge even though their collective central core is probably the same. Playing mind games with the immortal soul is dangerous territory for the secular spiritualist. You don't want the Pope firing a shot across your bow. The complication is that moral codes differ as a function of geography, profession, religion and how much food you might have in the cupboard, and they are evolving rapidly through time. I doubt they can be internationally agreed upon, baselined and standardised, at least to the point where you can definitively say: comply or not comply.

It's better to stick with the abstract definition and, as far as possible, define concrete instances, albeit at a gross level. We have many. For example, a system that does not have a specification has no integrity. A system that fails in its mission has no integrity. A person who commits to a course of action and does not deliver has no integrity. In our knowledge domain we have a cornucopia of archetypal situations, actions and states of mind that can clearly be defined as having or lacking integrity. If a standard just listed those it would be a boon to society (put it in an appendix, PBL – it would be thick!). This is one justification for stacking standards bodies with industry representatives - people knee-deep in the evil that men do. Better still, corral the sinners themselves. They know who they are (this aggregate includes us all)!

On the brighter side, I've been wondering lately whether integrity actually exists in nature, or is a human construct which we can only achieve through struggle. I have reflected on the beautiful forests of Midwestern America. They made an impression on an Australian. We don't see the changes of season here as the Americans do, say in Michigan. In the fall the colours are stunning and in the spring the forests come back new, lush and green. Even in winter, when we put our Christmas tree out on the balcony in the snow, it stayed green for two months. From all this I'm concluding that a forest can have integrity if it's flourishing, if it's robust and if it can fight off corrupt influences: man, for example. Most of all, it has integrity if it's sustainable. With apologies to Latin scholars, we should probably add that property to integrity's abstract class.

Cheers
Les

-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Peter Bernard Ladkin
Sent: Thursday, April 21, 2016 2:30 AM
To: RICQUE Bertrand (SAGEM DEFENSE SECURITE); Andy Ashworth; 'Christopher Johnson'; systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Does "reliable" mean "safe" and or "secure" or neither?

A necessary discussion!

I understand Chris's point (through overexposure? :-) ), but I agree with Andy and Bertrand - probably because we're all on the IEC 61508-3 MT where these things have been hanging around in the air for a couple of years now, and it's clear what "integrity" means to us at the "court" compared with what it means to the poor unwashed outside the gates who only have IEC 61508:2010 to guide them :-) (That is a very large smile, BTW.)

But it's not just a matter of concepts and definitions. I've heard Chris's "scary" talk a couple of times now, in which he shows that the cultures around safety and the cultures around security are different and often lead to incompatible requirements. That's a real problem, on the ground, and he is very persuasive about it.

I didn't see why it should be so a couple of years ago, when he first said it at an SSS invited talk (probably because he mixes it in with derisive comments on causality, which always gets my goat :-) ). I have now seen for myself what he means. It was also emphasised in the Chatham House report on Cybersecurity in NPPs, and Roger Brunt (the former chief of security for the British nuclear regulator, and an author of the report) emphasised it during the discussion in March.

A key technical point comes out of this, which we will address at the German standards authority on May 4, along with German colleagues active in ICS safety+security and NPP safety+security within the IEC. And that is that the requirements for updating safety-critical software conflict with the usual update cycle for security and nobody - nobody - I have talked to knows how to solve that problem.
Roger is very aware of it. The recent IEC offerings on safety+security gloss over it. We've gotta solve it somehow. (For Bertrand, I mentioned this also to Gilles Deleuze. I know now that the French, the Brits and the Germans are all interested in a solution. Of course, being interested in one and getting one are two different things.)

PBL

On 2016-04-20 17:54, RICQUE Bertrand (SAGEM DEFENSE SECURITE) wrote:
> 1. adherence to moral principles; honesty
> 2. the quality of being unimpaired; soundness
> 3. unity; wholeness
> 
> Webster:
> 1: firm adherence to a code of especially moral or artistic values: incorruptibility
> 2: an unimpaired condition: soundness
> 3: the quality or state of being complete or undivided: completeness
> 
> I understand that in the technological field what we mean by "integrity" in engineering is both definitions 2 and 3.
> 
> Bertrand Ricque
> Program Manager
> Optronics and Defence Division
> Sights Program
> Mob : +33 6 87 47 84 64
> Tel : +33 1 58 11 96 82
> Bertrand.ricque at sagem.com
> 
> 
> -----Original Message-----
> From: Andy Ashworth [mailto:andy at the-ashworths.org]
> Sent: Wednesday, April 20, 2016 5:48 PM
> To: 'Christopher Johnson'; RICQUE Bertrand (SAGEM DEFENSE SECURITE); 
> 'Peter Bernard Ladkin'; systemsafety at lists.techfak.uni-bielefeld.de
> Subject: RE: [SystemSafety] Does "reliable" mean "safe" and or "secure" or neither?
> 
> If integrity is interpreted as applying to a system's ability to perform in accordance with the designer's intent, rather than focusing on data integrity, then the security definition works.
> 
> In my mind I further characterise security as measures to thwart a deliberate attack, while dependability usually considers random effects that can affect a system's behaviour; as such, the two concepts are complementary.
> 
> 
> 
> Andy Ashworth, P.Eng
> System Safety Certifier
> OLRT Constructors/Constructeurs
> Confederation Line
> 1600 Carling Ave. Ottawa, Ontario
> Suite 450, PO Box 20, K1Z 1G3
> 
> Office: 613.916.6706
> Cell: 613.314.6404
> Email: andy.ashworth at ottawa-lrt.com
> 
> 
> 
> 
> 
> -----Original Message-----
> From: systemsafety
> [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf 
> Of Christopher Johnson
> Sent: April-20-16 11:03 AM
> To: RICQUE Bertrand (SAGEM DEFENSE SECURITE) 
> <bertrand.ricque at sagem.com>; Peter Bernard Ladkin 
> <ladkin at rvs.uni-bielefeld.de>; 
> systemsafety at lists.techfak.uni-bielefeld.de
> Subject: Re: [SystemSafety] Does "reliable" mean "safe" and or "secure" or neither?
> 
> I don't think this is appropriate any more.
> 
> Security here seems to imply conventional IT systems - most of the breaches I work on in safety-related SCADA/ICS applications focus on the consequent loss of control, which is not characterised by concerns over either data integrity or confidentiality.
> ________________________________________
> From: systemsafety 
> [systemsafety-bounces at lists.techfak.uni-bielefeld.de] on behalf of 
> RICQUE Bertrand (SAGEM DEFENSE SECURITE) [bertrand.ricque at sagem.com]
> Sent: 20 April 2016 15:23
> To: Peter Bernard Ladkin; systemsafety at lists.techfak.uni-bielefeld.de
> Subject: Re: [SystemSafety] Does "reliable" mean "safe" and or "secure" or neither?
> 
> I would stick to the Laprie taxonomy:
> 
> Dependability = Availability + Reliability + Safety + Integrity (not the SIL one, the true one) + Maintainability
> Security = Integrity + Confidentiality
> 
> Bertrand Ricque
> Program Manager
> Optronics and Defence Division
> Sights Program
> Mob : +33 6 87 47 84 64
> Tel : +33 1 58 11 96 82
> Bertrand.ricque at sagem.com
> 
> -----Original Message-----
> From: systemsafety
> [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf 
> Of Peter Bernard Ladkin
> Sent: Monday, April 18, 2016 8:21 PM
> To: systemsafety at lists.techfak.uni-bielefeld.de
> Subject: Re: [SystemSafety] Does "reliable" mean "safe" and or "secure" or neither?
> 
> On 2016-04-18 18:25 , Chris Hills wrote:
>> What is the current thinking? Does "reliable" also infer safe or secure?
> 
> The system consists of the following.
> 
> You, tied up in a chair, fixed to the floor. Along with your nemesis, with a rifle, who is pointing it at you, and is an excellent shot, and intends to shoot. He/she pulls the trigger.
> 
> If the rifle is reliable, the system is unsafe.
> 
> If the rifle is completely unreliable, the system is safe.
> 
> PBL
> Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de
> 
> 
> 
> 
> 
> 
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
> 
> 
> 

--
Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de








