[SystemSafety] Autonomously Driven Car Kills Pedestrian

John Howard jhoward at roboticresearch.com
Wed Mar 28 18:15:38 CEST 2018


Thanks for clarifying, Steve.  Perhaps my perception is different because 
of the industry I am most closely involved with (defense).  My only 
insight into the rest of the industry was the Waymo report from last 
year, which I thought outlined a reasonable approach given the lack of 
any other software safety standards specific to the automotive 
industry.  (They at least claim to use ISO 26262 and MIL-STD-882E as the 
basis for their own System Safety Program.)  {On the Road to Fully 
Self-Driving: Waymo Safety Report, Oct. 2017}

In regard to our own processes, I obviously can't share details, but the 
governing safety standard is MIL-STD-882E.  It is not sufficient on its 
own, so we are also developing internal processes which borrow heavily 
from ISO 26262, and use MBSE (SysML with MagicDraw).  In addition, the 
systems we develop are evaluated by the Army Test and Evaluation Command 
(ATEC), which gives every system a safety rating related to hazard 
risk.  I need to be careful what I say here since I am fairly certain 
that some ATEC folks are also on this list. ;-)
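
(In case it's useful context for others on the list: the 882E hazard risk 
rating is essentially a severity-by-probability lookup, the standard's 
Table III risk assessment matrix.  A rough Python sketch of the idea is 
below; the category labels and cell values are from my reading of the 
published matrix and should be double-checked against the standard itself.)

# Sketch of a MIL-STD-882E Table III style risk lookup (values to verify).
# Severity: 1=Catastrophic, 2=Critical, 3=Marginal, 4=Negligible
# Probability: A=Frequent, B=Probable, C=Occasional, D=Remote, E=Improbable
RISK_MATRIX = {
    (1, "A"): "High",    (1, "B"): "High",    (1, "C"): "High",
    (1, "D"): "Serious", (1, "E"): "Medium",
    (2, "A"): "High",    (2, "B"): "High",    (2, "C"): "Serious",
    (2, "D"): "Medium",  (2, "E"): "Medium",
    (3, "A"): "Serious", (3, "B"): "Serious", (3, "C"): "Medium",
    (3, "D"): "Medium",  (3, "E"): "Medium",
    (4, "A"): "Medium",  (4, "B"): "Medium",  (4, "C"): "Low",
    (4, "D"): "Low",     (4, "E"): "Low",
}

def risk_level(severity: int, probability: str) -> str:
    # e.g. risk_level(1, "D") -> "Serious": a catastrophic but remote hazard
    return RISK_MATRIX[(severity, probability)]

ATEC's actual rating scheme is of course their own; the point is only that 
every hazard ends up at one of a small set of risk levels, which in 882E 
determine who is authorized to accept the residual risk.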

That said, I am curious how you think this should be done.  Should the 
entire industry wait for someone to develop a process similar to DO-178 
for the automotive industry?  Should software developers be prohibited 
from producing safety-critical software without some kind of 
certification?  I am very serious about these questions.  While I am 
skeptical that the industry is quite as bad as you make it seem, I can 
certainly agree it isn't what it should be.  I just don't know what the 
answer is, and am eager to learn from others to improve our own internal 
processes in the meantime.

-- 
John Howard, Sr. Systems Engineer
Robotic Research, LLC
22300 Comsat Dr.
Clarksburg, MD 20871
Office: 240-631-0008 x274

On 3/27/2018 5:03 PM, Steve Tockey wrote:
>
> John,
> You can interpret my email as a scathing indictment of the mainstream 
> software industry as a whole. I am already on record as saying,
>
> “We are an industry of highly paid amateurs”.
>
> The company I work for, Construx, interacts with hundreds of 
> “professional” software development teams each and every year. Having 
> met in person the people inside these companies (both high-profile 
> companies and not), it is clear that the vast majority of the people & 
> projects I have met with are squarely in the middle of “highly 
> paid amateur” syndrome. That includes the likes of, for example, 
> Google, Facebook, Amazon, Microsoft, Alibaba, and so on. Given how 
> pervasive corporate cultures are, I would be shocked and amazed if the 
> software developers in Google’s self-driving car team are in 
> any way different from those in the rest of that company. I could be 
> wrong, but I don’t think there is a very high probability of that.
>
> To hear that your organization is different is very good news. I am 
> very happy to hear that there is at least one organization in the 
> self-driving car software space that is actually taking things 
> seriously. I would also exclude from “highly paid amateur” syndrome, 
> generally speaking, avionics vendors (Honeywell, Rockwell Collins, . . 
> .) and to a slightly lesser extent the medical device companies 
> because of the externally imposed standards.
>
> That being said, are you willing to provide any more detail about how 
> your organization is doing things differently than in the 
> mainstream software industry? For example:
>
> *) Are you following any specific software safety standards, ISO 
> 26262, DO-178C, . . .? If so, which one(s)?
>
> *) Have you adapted your software development processes / procedures 
> around any of those standards? If so, are you willing to share how? 
> Specifically: DO-178C requires that teams produce 
> written requirements and design documentation, but it is left up to the 
> organization to determine what that documentation looks like. Might 
> you be willing to share what your document templates look like?
>
> *) How are you determining that any given software project is actually 
> complying with the applicable standards / processes / procedures? Are 
> there independent audits of any kind?
>
> *) Do you have any personnel qualifications for the developers? For 
> example, do you hire, train, and promote developers based on a set 
> of qualifications derived from the Software Engineering Body 
> of Knowledge (SWEBOK)?
>
> *) How is someone with, shall I say, legitimate “engineering” experience 
> and expertise involved in the software development activities? Are 
> said people actually licensed / chartered engineers? If so, 
> what engineering disciplines are they licensed / chartered in?
>
> *) Do the engineering teams determine realistic project schedules 
> based on given project scope (or, 
> alternatively, determine realistic scope based on given 
> project schedules)? Or, are project scopes and schedules imposed by 
> external stakeholders?
>
>
> — steve
>
>
>
>
> From: systemsafety <systemsafety-bounces at lists.techfak.uni-bielefeld.de> 
> on behalf of John Howard <jhoward at roboticresearch.com>
> Organization: Robotic Research LLC
> Date: Monday, March 26, 2018 at 7:35 AM
> To: systemsafety at lists.techfak.uni-bielefeld.de
> Subject: Re: [SystemSafety] Autonomously Driven Car Kills Pedestrian
>
> Perhaps I am misunderstanding.  Is this your perception of the 
> self-driving car industry?  If so, on what basis?
>
> I cannot speak for other companies, but I can assure you that none of 
> these statements apply to any of the autonomous vehicle projects I am 
> involved with.
>
> -- 
> John Howard, Sr. Systems Engineer
> Robotic Research, LLC
> 22300 Comsat Dr.
> Clarksburg, MD 20871
> Office: 240-631-0008 x274
> On 3/25/2018 1:39 PM, Steve Tockey wrote:
>>
>> FWIW, I found some interesting quotes in the following article:
>>
>> http://time.com/5213690/verruckt-slide-death-schlitterbahn/
>>
>>
>> Could these be applied in cases involving self-driving cars?
>>
>> “was never properly or fully designed”
>>
>> “rushed it into use and had no technical or engineering expertise”
>>
>> “complied with “few, if any” longstanding safety standards”
>>
>> “the . . . death and the rapidly growing list of injuries were 
>> foreseeable and expected outcomes”
>>
>> “desire to “rush the project” and . . . designer’s lack of expertise 
>> caused them to “skip fundamental steps in the design process.””
>>
>> “not a single engineer was directly involved in . . . engineering or 
>> . . . design”
>>
>>
>> It seems as though all of these statements would apply equally to any 
>> case involving self-driving cars.
>>
>>
>> Cheers,
>>
>> — steve
>>
>>
>>
>>
>> From: systemsafety <systemsafety-bounces at lists.techfak.uni-bielefeld.de> 
>> on behalf of Matthew Squair <mattsquair at gmail.com>
>> Date: Friday, March 23, 2018 at 5:03 PM
>> To: Peter Bernard Ladkin <ladkin at causalis.com>, 
>> systemsafety at lists.techfak.uni-bielefeld.de
>> Subject: Re: [SystemSafety] Autonomously Driven Car Kills Pedestrian
>>
>> I think Uber will come unglued in civil court.
>>
>> If, say, the court deems the driver not to have been in direct control 
>> but ‘supervising’, then Uber is still liable for devising a method of 
>> supervising an unsafe device that demonstrably doesn’t work, and it 
>> could be argued they could reasonably have known this in the 
>> circumstances*. If the argument turns on the driver being solely the 
>> culpable agent, then as he’s also an Uber employee/contractor, they’re 
>> still responsible for his actions. So whichever way it turns, Uber 
>> will carry the can, at least in a civil action, which is where this 
>> will get thrashed out, I’d guess.
>>
>> ‘Move fast and break things’ indeed…
>>
>> *As the conversation on this thread would indicate.
>>
>> On 24 March 2018 at 4:16:49 am, Peter Bernard Ladkin 
>> (ladkin at causalis.com) wrote:
>>
>>>
>>>
>>> On 2018-03-23 17:40 , Michael Jackson wrote:
>>> >
>>> > So the responsibility in overseeing autonomous driving is worse 
>>> than that of an old-fashioned
>>> > driving instructor in a dual-control car, teaching an untrusted 
>>> learner—you can’t even order
>>> > the software to slow down: in short, it is far more demanding and 
>>> stressful than driving the
>>> > car yourself.
>>> Spot on, as usual.
>>>
>>> Woods and Sarter, in their seminal study of pilots using A320 
>>> automation, found it was worse than
>>> that. When the situation got odd, rather than cutting out the 
>>> automation and taking control ("first,
>>> fly the airplane"), they found the crew inclined to try to debug the 
>>> automation.
>>>
>>> PBL
>>>
>>> Prof. Peter Bernard Ladkin, Bielefeld, Germany
>>> MoreInCommon
>>> Je suis Charlie
>>> Tel+msg +49 (0)521 880 7319 www.rvs-bi.de
