[SystemSafety] Another unbelievable failure (file system overflow)

Les Chambers les at chambers.com.au
Wed Jun 3 15:10:58 CEST 2015


Martyn

In my experience the presence of IEC 61508 has had a positive effect when
attached to a contract as a compliance constraint. It forces organisations
to clean up their act. I've seen the same thing happen with EN 50128 in rail
projects.

I think we get too tied up in the details sometimes and forget about the
overall positive impact of these standards.

I do believe that the pricing of IEC 61508 is an immoral act of greed and a
clear violation of clause 3 of a well-known standard, common to many faiths
in many civilisations over millennia.

Refer: http://en.wikipedia.org/wiki/Ten_Commandments

"Thou shalt not make unto thee any graven image."

Just as this standard will never stop greed, or murder for that matter, the
existence of a functional safety standard will not make any system totally
safe. It all lies with the people working within the FS framework. How
committed are they? It's exactly the same as being committed to a faith.
Faith fills a need in most of us. We like to believe (without proof) that we
are part of a master plan for which we do not make the rules. Some of us
like to reinforce it by attending a church/mosque/synagogue once a week and
reflecting on it for an hour or two. In the Middle East I worked with people
who reflected five times a day. Many Westerners would view this as an
unproductive waste of time, but I remember thinking at the time that it
wouldn't hurt us all to reflect, with that kind of frequency, on something
positive. The more reflection, the stronger the faith and the higher the
probability of righteous action when our faith is tested. This is why I keep
pushing this barrow of constant reflection on the safety discipline for
those whose actions could cause harm to others.

 

We should all cheer up. "The faith" had a good day today. Sepp Blatter
resigned and the US Congress wound back the Patriot Act. Things are looking
up for global moral standards. 

 

Cheers

Les

 

From: systemsafety-bounces at lists.techfak.uni-bielefeld.de
[mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of
Steve Tockey
Sent: Wednesday, June 3, 2015 3:36 AM
To: martyn at thomas-associates.co.uk;
systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Another unbelievable failure (file system
overflow)

Martyn,

I can't speak for IEC 61508, but I do agree that in general the weaknesses
you point out are at least borderline ethical issues.

-- steve

From: Martyn Thomas <martyn at thomas-associates.co.uk>
Reply-To: "martyn at thomas-associates.co.uk" <martyn at thomas-associates.co.uk>
Date: Monday, June 1, 2015 1:34 AM
To: "systemsafety at lists.techfak.uni-bielefeld.de"
<systemsafety at lists.techfak.uni-bielefeld.de>
Subject: Re: [SystemSafety] Another unbelievable failure (file system
overflow)

 

Les/Steve

Thanks for this. There's little discussion of professional ethics in any
forum that I read.

Do you think there's any hope that we might be able to make a small advance
in a focused area, such as IEC 61508? The standard isn't fit for purpose, in
that it largely ignores cybersecurity issues and does not provide a sound
basis for assessing whether safety-critical systems are safe enough for
their proposed application. It's also too long, inconsistent and too
expensive, and it can't be copied or republished for use in teaching,
research or professional debate. I see these weaknesses in the central
international standard for the safety of computer-based systems as an
ethical issue. Do you agree?

Regards

Martyn

On 31/05/2015 05:14, Les Chambers wrote:

Steve

Thanks for referencing the code of ethics. It should be brought up more
often. Unfortunately, for me, it makes for depressing reading, especially
when you come upon paragraphs such as:

 

3.12. Work to develop software and related documents that respect the
privacy of those who will be affected by that software.

 

Although he has probably never read it, there is a man who will probably
never see his homeland again because he took these sentiments to heart and
attempted his own corrective action. And what of the thousands of
scientists, engineers and technologists who contributed to the construction
of the software whose existence he exposed to the world?

 

My point is that non-compliance with this code of ethics is massive and
almost universal. In fact, any engineer maintaining strict compliance with
every paragraph of this code would be unemployable in our modern world.

 

Reading these paragraphs through the lens of experience, I am blown away by
their flippancy. From personal experience I can tell you that screwing up
the courage to implement even one of these items can be a massive,
life-changing event. That point would be lost on a graduate. They're all
perfectly reasonable statements of how one should behave, much like, "Thou
shalt not kill, thou shalt not commit adultery ...". The issue lies in the
moral courage to act on them.

 

There is no quick fix to this problem, as we are a decentralised, unorganised
and generally fragmented lot. We don't have the luxury of the medical
profession, which deals with a single organism. We can't simply state, and
righteously comply with, the notion of "Do no harm." In fact, for us, the
opposite is true: many of us work in industries whose primary purpose is
to kill other human beings, and with high efficiency (fewer soldiers kill
more of the enemy).

 

One thing we can do is deal with the problem at its root:

 

We are graduating incomplete human beings from science and engineering
courses. There is insufficient focus on the moral issues surrounding the
impact of our machines on humanity. For example, a study of applied
philosophy, including ethics, should be a non-negotiable component of all
engineering courses: not just a final-year subject, but a subject for every
year, with a weekly reflection on the content, much like the weekly safety
meetings I was forced to attend in the chemical processing industry.

 

I'm sure there will be howls of laughter at this, but let me tell you, it's
the only thing that caused me to back a senior manager about five levels
above my pay grade into a corner - he could not physically escape me short
of punching me out and stepping over my body - and berate him until he
promised to properly train his operators in the emergency procedures for a
safety-critical system.

 

Popping a few paragraphs up on the web would never have done the trick.

 

That experience was trivial compared to where we are headed. The massive
computing power now available means that our software is beginning to take
higher-level decisions away from human beings. Some of these decisions are
moral ones (refer to my previous post on lethal autonomous weapons systems):
"Shall I kill all humans associated with this structure, or not?"

 

At a recent engineering alumni meeting I asked the head of my old
engineering department how much philosophy is taught to undergraduate
engineers. He chuckled. "It is available as an elective, but less than one
percent participate," he said.

 

I plan to speak to him again soon.

 

Cheers

Les