[SystemSafety] Another unbelievable failure (file system overflow)

Matthew Squair mattsquair at gmail.com
Thu Jun 11 04:04:04 CEST 2015


I'll throw in another little wrinkle: major software failures seem to
correlate with changes in the context in which the system is used or operated.

So let's take that Google robot car and drop it into snarly traffic mayhem
on a temporary bypass during afternoon peak hour in Marseille. Or
negotiating shared traffic space in Mumbai.

I wonder how well it would operate then?

Matthew Squair

MIEAust, CPEng
Mob: +61 488770655
Email: Mattsquair at gmail.com
Web: http://criticaluncertainties.com

On 11 Jun 2015, at 3:39 am, Martyn Thomas <martyn at thomas-associates.co.uk>
wrote:

Dear Brian

Will your safety evaluation be "white box" or "black box"?

Are you permitted to attack the vehicles, to provide a real-world
environment as part of the assessment?

Martyn

On 10/06/2015 16:38, Smith, Brian E. (ARC-TH) wrote:

FYI…  I’m a member of the NASA Integrated Product Team that is
evaluating the safety of the driverless car experiments being
conducted on the Ames campus here in Mountain View (the headquarters
of Google).  I also live in the city and see Google AVs driving by my
home almost daily.  Without specific “metrics” to evaluate how these
vehicles behave, from my subjective knothole they seem to respond to
various traffic situations just like cars with drivers.  Nissan is
also about to begin AV experiments here in our area.


_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE