[SystemSafety] a public beta phase ???

Matthew Squair mattsquair at gmail.com
Wed Jul 20 14:55:15 CEST 2016


Hi Peter,

Can I answer your questions in reverse order?

Regarding 'unhappy experiences'.

Vigilance systems are still widely used in Australia, the US, and Russia,
as we all have continental rail systems. I'll give you an illustrative
quote from a US NTSB accident report (RAR-06/03) where the at-fault train
was equipped with a vigilance system (the FRA calls it an alerter).

“[the engineer]... remained sufficiently alert to make train control
inputs [subverting the alerter system] yet was unaware to respond to
vitally important signal indications...”

From this and other rail accidents we know that even smart vigilance
systems, which monitor a range of driver control inputs, are imperfect.
Automaticity of behaviour is the fundamental problem here. I don't see any
indication that Tesla or other car manufacturers are smarter in this
regard (unless they're doing things like monitoring posture, eye closure
and so on with in-vehicle cameras). Distraction is also a challenge for
such systems (texting, for example, as at the Chatsworth accident).
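
To make the automaticity point concrete, here's a minimal sketch of the
basic input-reset alerter logic. This is my own illustration with made-up
parameters, not any actual FRA or Tesla design:

    import time

    class Alerter:
        # Illustrative vigilance timer: demand a penalty brake
        # application if no monitored control input arrives within the
        # timeout window. (Hypothetical 25 s timeout; real alerters
        # typically vary the window with speed.)
        def __init__(self, timeout_s=25.0):
            self.timeout_s = timeout_s
            self.last_input_s = time.monotonic()

        def on_control_input(self):
            # ANY monitored input -- throttle, brake, horn, acknowledge
            # button -- resets the timer, whether or not the driver is
            # actually attending to the signals ahead.
            self.last_input_s = time.monotonic()

        def penalty_brake_required(self):
            return time.monotonic() - self.last_input_s > self.timeout_s

The loophole sits in on_control_input(): an engineer making routine
inputs purely out of habit keeps resetting the timer without ever
attending to the signals, which is exactly the behaviour the NTSB quote
describes.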

Regarding 'profound consequences'.

Do you recall (a while ago) a video of an Airbus over Paris with the
pilot fighting the automation? I submit that we have transitioned from a
situation in which the automation in aircraft is 'subservient' to the crew
to one in which the human crew are either supervising or collaborating
(depending upon your POV) with the automation. That's a very different
relationship, and how accidents occur is also different (Cali or AF447 are
quite different from, say, Tenerife or BEA Flight 548). Changing the way
humans interact with a system, and therefore the way in which accidents
occur, is (at least to me) a profound consequence of such automation.




On Wed, Jul 20, 2016 at 9:10 PM, Peter Bernard Ladkin <
ladkin at rvs.uni-bielefeld.de> wrote:

>
>
> On 2016-07-20 12:30 , Matthew Squair wrote:
> > .... Aviation stands as a classic example of how automating traditional
> > operator tasks and radically changing the role of the operator can have
> > profound consequences,
>
> What "profound consequences" are you thinking of here?
>
> > ... Likewise in implementing vigilance systems, the
> > rail industry has a lot of experience (most of it unhappy) in how naive
> > vigilance system design
> > doesn't actually achieve what you want.
>
> What unhappy experience are you thinking of here? Whose rail industry?
>
> PBL
>
> Prof. Peter Bernard Ladkin, Bielefeld, Germany
> MoreInCommon
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319  www.rvs-bi.de


-- 
Matthew Squair
BEng (Mech) MSysEng
MIEAust CPEng

Mob: +61 488770655
Email: MattSquair at gmail.com
Website: www.criticaluncertainties.com

