[SystemSafety] fyi: IEEE article about ethics in driverless cars

Les Chambers les at chambers.com.au
Sat Jul 9 02:48:10 CEST 2016


Michael

The most telling paragraph in this article is Karl Iagnemma's statement:

"As of today, we don’t have any procedure for what we would commonly think of as ethical decision making. I’m not aware of any other group that does either. I think the topic is a really important one. It’s a question that’s very important to pose, but it’s going to take a while for us to converge to a technical solution for it. We’d love to be able to address that question today, but we just don’t have the technology."

 

I don't know Karl, and I may be doing him a disservice, but this just sounds like the words of a technocrat without a classical education. The problem isn't technical; it's intensely human. Even the act of conducting a survey about preferences for trolley-problem outcomes seems misguided. Giving any credibility to a finding such as "76% of participants thought that it would be more moral for AVs to sacrifice one passenger rather than kill 10 pedestrians" is nonsense.

 

At issue here is the never-ending internal human conflict between character and characterisation, where characterisation is how we wish to be perceived and character is what we really feel. Characterisation is how we present ourselves to the world, character is what we exhibit through our behaviour, especially under pressure. As Montaigne put it: "we are, I know not how, somewhat doubled in ourselves, so that what we believe we disbelieve, and cannot rid ourselves of what we condemn." 

 

The survey represents characterisation. We all want to be perceived as moral, humane, self-sacrificing actors. But we also have animal survival instincts and primal needs, the product of millennia of evolution. John F. Kennedy's democratic ideals didn't extend to his treatment of women. Steve Jobs was an anti-materialistic hippie who capitalised on the inventions of a friend who wanted to give them away for free.

 

My point is that I doubt very much that the 76 percent would, in the moment, do what they claimed was moral. These kinds of surveys are therefore useless as a requirements capture tool. Nor, at the point of purchase, do I believe the 76 percent would buy a vehicle programmed to sacrifice them or their passengers. Frankly, I wouldn't.

I don't have a solution for all this (other than a basic human need to retain the power of life and death in human decision-making, the much-discussed "kill switch"), but I just wish the debate were a little more insightful. That will require engineers to DO THE READING. You can find a good discussion of character and characterisation in John Yorke, Into the Woods: A Five-Act Journey into Story.

 

Cheers

Les


From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of C. Michael Holloway
Sent: Friday, July 8, 2016 11:25 PM
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: [SystemSafety] fyi: IEEE article about ethics in driverless cars

 

Greetings,

IEEE Spectrum's badly-named "Cars That Think" blog has an article that may interest some of you:  "People Want Driverless Cars with Utilitarian Ethics, Unless They're a Passenger."  

This link will take you to it - http://bit.ly/killmenot

-- 
cMh [ C. Michael Holloway | Senior Research Engineer | NASA Langley Research Center, MS 130, Hampton VA USA | Tel: +1.757.864.1701 ] 

[E]very difference of opinion is not a difference of principle.   Thomas Jefferson

(The words in this message are mine alone; neither blame nor credit NASA for them.)
