[SystemSafety] Apple being sued for illegal use of Facetime

Les Chambers les at chambers.com.au
Fri Jan 6 00:42:21 CET 2017


Steve
The lawyer deals in the letter of the law. The engineer deals with the
spirit of the law, that is, the dictates of practical reality. The lawyer
would say one should read the manual from cover to cover and act responsibly
according to the operational guidelines therein. The spirit of the law says
that a manufacturer should not sell you a consumer product that can kill you
if you indulge in reasonably predictable human behaviour.

I cite MIT researcher Bryan Reimer:
"Having a human there to resume control is very difficult,” says Bryan
Reimer, an MIT researcher who studies driving behavior. Once relieved of the
burden of constantly paying attention, people are quick to lose focus, and
getting them back on task is difficult. Imagine you’re watching the final
moments of The Shining when someone suddenly turns on the light and tosses
you a Rubik’s cube. How quickly could you register what’s happening, let
alone attempt to solve the puzzle? Now you see the challenge of the
handoff."
(Refer to this excellent Wired magazine article:
https://www.wired.com/2017/01/human-problem-blocking-path-self-driving-cars/
)

It seems to me this Apple conversation is ignoring a pertinent issue. We are
talking about two separate and distinct environments that must have
different principles applied.
1. Uncontrolled environment (that of consumer products: automobiles, saw
benches, children's toys)
2. Controlled environment (aviation, chemical processing, rail
transportation, the SR-71 Blackbird)

Manufacturers deploying products in uncontrolled environments cannot depend
on operator training to make a machine safe. The manufacturer has no control
over where that product roams and who uses it. Example: You buy a Tesla and
attend a one-week intensive training course on this highly automated
machine. Then you toss the keys to your young daughter so she can go down
the shop ... OR the Tesla becomes a rental vehicle driven by any bunny who
walks in the door. I recently had this experience with a Hyundai SUV. I
stopped at a petrol station in the back blocks of Wales, engaged the
handbrake (which is not a lever but a little button) and, on attempting to
release the brake, spent 30 minutes finding the unique combination of
switches and push buttons that would achieve this goal. The car carried no
driver's manual.

Machines intended for uncontrolled environments must be "intuitively safe"
(I know ... defining that phrase would be a good subject for a Ph.D.)

In a controlled environment you have the undivided attention of the machine
operator. Self-interest is the key motivator. They stand to lose their jobs
if they don't follow detailed operational procedures. I've had this
experience with chemical processing reactors where the last line of defence
is a human being who must be awake, competent, committed ... and only able
to feed his children if he has memorised and can follow the manual. 
The problem with high levels of automation is that most of the time nothing
happens. They say that's why time was invented, so nothing happens all at
once ... and, further, they invented space because if it ever did happen all
at once it wouldn't happen to you. Unfortunately, when things do go
pear-shaped with an automated machine, it does happen all at once and it
does all happen to you, as Joshua Brown discovered.
So because of this boring life we lead with increasing levels of automation,
people in controlled environments need constant retraining on emergency
procedures. In an aircraft or chemical processing plant the machine also
changes its behaviour after upgrades (as does the Tesla). In my experience
the plant engineer would sit down with every one of the operators and go
through the changes one-on-one because the application was so safety
critical.

So what do we do about highly automated safety critical consumer products in
uncontrolled environments?
I had a great New Year reading the above Wired magazine article. At last,
some sanity!!! Companies such as Google have, from the start, determined to
just take the human being out of the loop and go straight to BASt/NHTSA
level 4/5 (Google: 'Summary of Levels of Driving Automation for On-Road
Vehicles'). This means getting the concept of driverlessness right before
you hand your product over to a consumer. This is an expensive strategy
confined to people with large product development budgets, capable of cash
burn for some years before any sign of income. Musk has excluded himself
from this club. 
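For anyone who hasn't chased up that summary, the levels boil down to a
single question: who is the fallback when the automation runs out? A minimal
sketch in Python (the one-line glosses are my own paraphrase, not the
standard's wording):

from enum import IntEnum

class DrivingAutomation(IntEnum):
    NO_AUTOMATION = 0      # human does all the driving, all the time
    DRIVER_ASSISTANCE = 1  # machine helps with steering OR speed
    PARTIAL = 2            # machine steers and manages speed; human monitors
    CONDITIONAL = 3        # machine monitors; human must take over on request
    HIGH = 4               # no human fallback within a defined operating domain
    FULL = 5               # no human fallback, anywhere

def human_is_fallback(level: DrivingAutomation) -> bool:
    # Levels 0 to 3 keep a human in the loop; Reimer's handoff problem
    # lives at levels 2 and 3. Google's strategy is to skip straight past them.
    return level <= DrivingAutomation.CONDITIONAL

Seen that way, the handoff problem is simply life on the wrong side of that
predicate.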
The Wired article indicates there is a consensus, at least among some
members of the auto industry, that the problem of monitoring drivers and
predicting what they might or might not do - then figuring out ways to
compensate for the outcome of bad behaviour - is just as difficult as
driving the sodding car with a computer. So why not go straight to the
"driving the car" solution, with the emotional, unpredictable, untrained,
unknown operator completely out of the loop?

Funny they should come to this conclusion. Forty years ago I worked with a
team automating chemical reactors that had a similar idea, and this was in a
controlled environment. We had a tubular reactor that was very difficult to
control on manual, so we simply shut the whole thing down if the direct
digital control system failed. The operator didn't get a look-in.
In general I have noticed over the years that as we add more intelligence to
machines we change their very nature, that is, their physical make-up. In
the chemical processing industry we reduced the number of reactor vessels
required to make a product because, with more intelligent monitoring and
control, we could more safely mix dangerous chemicals with potentially
explosive intermediate products of reaction. The big push will always be to
mitigate or remove the outcome of random human behaviour. And in the limit
we remove the human altogether. This is so true in most aspects of modern
aviation.
All of which raises an important philosophical issue. What other loops
should humans be removed from: child-rearing? Government? What will we be
left with? Free love and bicycles? Even they may be under threat.
My view is that, relieved of the mundane, we should focus more heavily on
pursuits where our human randomness is a benefit: creative discovery, for
example.

Meanwhile Steve, last week in my garage I opened my 14-year-old motor
vehicle's manual for the first time to figure out where to put a litre of
engine coolant. So sue me!

Cheers

Les


-----Original Message-----
From: systemsafety
[mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of
Steve Tockey
Sent: Thursday, January 5, 2017 1:06 PM
To: Tim Schürmann
Cc: The System Safety List
Subject: Re: [SystemSafety] Apple being sued for illegal use of Facetime


Isn't it called "reading the owner's manual"?



On Jan 4, 2017, at 4:07 PM, "Tim Schürmann"
<tschuerm at techfak.uni-bielefeld.de> wrote:

> Well, correct me if I'm wrong... but isn't the acquisition of your
> 'driving licence' the 'operator training'?
> Are you (so to speak) suggesting one should get specific training from
> the manufacturer of one's preferred car?
> 
> Kind Regards
> Tim
> 
>> On Wed, 2017-01-04 at 22:58 +0000, Brent Kimberley wrote:
>> Operator training can be an issue.
>> 
>> 
>> 
>> On Wednesday, January 4, 2017 4:20 PM, Mike Ellims
>> <michael.ellims at tesco.net> wrote:
> 
> 
>> I’m not so sure that’s a great idea. Our new Mondeo has a system that
>> is supposed to detect “driver fatigue”. The rate of false
>> positives is positively maddening and it took ages to find the tiny
>> little “OK” button on the steering wheel so I could get rid of the &^%
>> $£ stupid message in the middle of the dashboard display (an LCD
>> simulating analogue dials).
>> 
>> The number of little buttons (and I mean little as in small) is in
>> itself positively maddening...
>> 
> 
> [...Rest of the Mail can be found on the SystemSafetyList..]
> 
> -- 
> Kind Regards
> Tim Schürmann
> 
> 
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE



