[SystemSafety] fyi: IEEE article about ethics in driverless cars

MARESCH Joachim Joachim.MARESCH at thalesgroup.com
Tue Jul 12 18:30:10 CEST 2016


In my opinion, you will never get a law or code regarding AV ethics accepted by AV users, other road users, governments, car manufacturers, and ... . From the government's point of view the solution has to be: "The responsibility for the AV lies with the car manufacturer." The consequences of any accident in which an AV is involved have to be compensated by the manufacturer. This forces the manufacturer to make a risk analysis taking into account the probability and the severity of an accident with its specific software. Additionally, the government could set limits for casualties, for example saying that hurting x number of people leads to the expiration of the car's type approval. Of course the car manufacturers do not like this way of handling things. The discussion of ethical behaviour seems to me a strategy to transfer the car manufacturers' responsibility to society.
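
To make the risk-analysis idea concrete, here is a minimal sketch (Python, with entirely hypothetical accident classes, rates, fleet size and cap; nothing below comes from any real regulation or manufacturer) of how expected casualties per year for one software release could be compared against a regulatory limit of the kind suggested above:

# Hypothetical figures only: accident probability per vehicle-year and
# expected casualties per accident, per accident class, for one software release.
accident_classes = {
    "low_speed_collision":  {"prob_per_vehicle_year": 1e-3, "casualties_per_accident": 0.01},
    "pedestrian_impact":    {"prob_per_vehicle_year": 1e-5, "casualties_per_accident": 0.8},
    "high_speed_collision": {"prob_per_vehicle_year": 1e-6, "casualties_per_accident": 1.5},
}

fleet_size = 100_000      # vehicles of this type on the road (assumed)
regulatory_cap = 10.0     # hypothetical casualties per year before type approval expires

# Expected casualties per year = sum over classes of (probability x severity), scaled to the fleet.
expected_casualties = fleet_size * sum(
    c["prob_per_vehicle_year"] * c["casualties_per_accident"]
    for c in accident_classes.values()
)

print(f"Expected casualties per year: {expected_casualties:.2f}")
if expected_casualties > regulatory_cap:
    print("Risk exceeds the cap: type approval would expire.")
else:
    print("Risk is within the cap.")

The point of the sketch is only that "probability times severity, summed over accident classes and the fleet" gives the manufacturer a number it can be held to; where the cap sits is a political decision, not a technical one.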

Joachim Maresch

-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of systemsafety-request at lists.techfak.uni-bielefeld.de
Sent: Sunday, 10 July 2016 12:00
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: systemsafety Digest, Vol 48, Issue 4



Today's Topics:

   1. Re: fyi: IEEE article about ethics in driverless cars
      (Matthew Squair)


----------------------------------------------------------------------

Message: 1
Date: Sat, 9 Jul 2016 21:45:42 +1000
From: Matthew Squair <mattsquair at gmail.com>
To: Les Chambers <les at chambers.com.au>,
	systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] fyi: IEEE article about ethics in
	driverless cars
Message-ID: <3C078632-BEC2-4D85-B047-0A8431A31C85 at gmail.com>
Content-Type: text/plain; charset="windows-1251"

Hi Les,

You know utilitarianism is not the only way we humans make decisions?  I'd recommend the work of Fiske and Tetlock on taboo trade-offs and hierarchical transaction theory. See the paper below.

http://www.sscnet.ucla.edu/anthro/faculty/fiske/pubs/Fiske_Tetlock_Taboo_Trade-offs_1997.pdf



Matthew Squair

MIEAust, CPEng
Mob: +61 488770655
Email: Mattsquair at gmail.com
Web: http://criticaluncertainties.com

> On 9 Jul 2016, at 10:48 AM, Les Chambers <les at chambers.com.au> wrote:
> 
> Michael
> The most telling paragraph in this article is Karl Iagnemma's statement:
> "As of today, we don?t have any procedure for what we would commonly think of as ethical decision making. I?m not aware of any other group that does either. I think the topic is a really important one. It?s a question that?s very important to pose, but it?s going to take a while for us to converge to a technical solution for it. We?d love to be able to address that question today, but we just don?t have the technology."
>  
> I don't know Karl, and I may be doing him a disservice, but this just sounds like the words of a technocrat without a classical education. The problem isn't technical; it's intensely human. Just the act of conducting a survey about preferences for trolley-problem outcomes seems misguided. Giving any credibility to the finding that "76% of participants thought that it would be more moral for AVs to sacrifice one passenger rather than kill 10 pedestrians" is nonsense.
>  
> At issue here is the never-ending internal human conflict between character and characterisation, where characterisation is how we wish to be perceived and character is what we really feel. Characterisation is how we present ourselves to the world; character is what we exhibit through our behaviour, especially under pressure. As Montaigne put it: "we are, I know not how, somewhat doubled in ourselves, so that what we believe we disbelieve, and cannot rid ourselves of what we condemn."
>  
> The survey represents characterisation. We all want to be perceived as moral, humane, self-sacrificing actors. But we also have animal survival instincts and primal needs, the results of millennia of evolution. John F. Kennedy's democratic ideals didn't extend to his treatment of women. Steve Jobs was an anti-materialistic hippie who capitalised on the inventions of a friend who wanted to give them away for free.
>  
> My point is that I doubt very much that the 76 percent would, in the moment, do what they say is moral. These kinds of surveys are therefore useless as a requirements capture tool. Also, at the point of purchase, I doubt very much that the 76 percent would buy a vehicle programmed to sacrifice them or their passengers. Frankly, I wouldn't.
> I don't have a solution for all this (other than a basic human need to retain the power of life and death in human decision-making - the much-discussed "kill switch"), but I just wish the debate were a little more insightful. This will require engineers to DO THE READING. You can find a good discussion of character and characterisation in John Yorke, Into the Woods: A Five-Act Journey into Story.
>  
> Cheers
> Les
>  
>  
>  
>  
>  
> From: systemsafety 
> [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf 
> Of C. Michael Holloway
> Sent: Friday, July 8, 2016 11:25 PM
> To: systemsafety at lists.techfak.uni-bielefeld.de
> Subject: [SystemSafety] fyi: IEEE article about ethics in driverless 
> cars
>  
> Greetings,
> 
> IEEE Spectrum's badly-named "Cars That Think" blog has an article that may interest some of you:  "People Want Driverless Cars with Utilitarian Ethics, Unless They're a Passenger."  
> 
> This link will take you to it - http://bit.ly/killmenot
> 
> --
> cMh [ C. Michael Holloway | Senior Research Engineer | NASA Langley Research Center, MS 130, Hampton VA USA | Tel: +1.757.864.1701 ]
> [E]very difference of opinion is not a difference of principle. -- Thomas Jefferson
> 
> (The words in this message are mine alone; neither blame nor credit 
> NASA for them.)
> 


