[SystemSafety] Elephants, dinosaurs and integrating the VLA model

Les Chambers les at chambers.com.au
Fri Aug 4 05:19:36 CEST 2023


Steve
I see. Have we a technological variant of “The Problem of Dirty Hands” here?
https://plato.stanford.edu/entries/dirty-hands/
Indeed we may have a 21st-century channelling of Anthony Trollope’s novel,
The Way We Live Now, in which the praiseworthy deeds of the powerful escape
the normal categories of morality.

Lady Carbury: “If a thing can be made great and beneficent, a boon to
humanity, simply by creating a belief in it, does not a man become a
benefactor to his race by creating that belief?”
“At the expense of veracity?” suggested Mr. Booker.
“At the expense of anything?” rejoined Lady Carbury with energy. “One cannot
measure such men by the ordinary rule.”
“You would do evil to produce good?” asked Mr. Booker.
“I do not call it doing evil. You tell me this man may perhaps ruin hundreds,
but then again he may create a new world in which millions will be rich and
happy.”
“You are an excellent casuist, Lady Carbury.”
“I am an enthusiastic lover of beneficent audacity,” said Lady Carbury.


Les

> Les wrote:
> 
> “Case study: Elaine Herzberg, killed by a self-driving Uber in Tempe,
> Arizona in 2018. The system did not classify her as a pedestrian because
> she was crossing without a crosswalk; the neural net did not include
> consideration for jaywalking pedestrians.”
> 
> The whole story is a bit more complicated than that. Here is a good summary
> of the NTSB report:
> 
> <https://arstechnica.com/cars/2019/11/how-terrible-software-design-decisions-led-to-ubers-deadly-2018-crash/>
> 
> “Imagine yourself as an expert witness supporting Tesla in a similar
> situation. What section, subsection or footnote of IEC 61508 or ISO 26262 -
> or other standard - would you cite to prove Elon had applied best practice
> in his development life cycle?”
> 
> I find it hard to believe (nah, impossible actually) that I would ever
> agree to be an expert witness on the side of someone like Tesla. Rather, I
> think being an expert witness on the other side would be far easier. It
> would probably be almost trivial to prove Elon had NOT applied anything
> remotely similar to best practice in his development life cycle.
> 
> — steve
> 
> On Aug 3, 2023, at 7:48 PM, Les Chambers <les at chambers.com.au> wrote:
> 
> Peter
> Your comment:
> "Martyn already observed on 2023-06-27 that there are legal requirements 
which
> constrain deployment of safety-related systems. That legal requirement in the
> UK and Australia is 77 years old. Your question seems to be suggesting that 
you
> somehow think it, and other constraints, might no longer apply. Well, they 
do.
> As Martyn said "AI doesn't change that.
> In the UK or Australia, developer and deployer must reduce risks ALARP."
> 
> ... is righteous ... that is, if decreed by a king ruling by fiat (Latin
> for "let it be done").
> 
> Legal requirements are one thing, usually coming into play to the right of
> "bang"; keeping the public safe in the first place is another, more
> important issue.
> The interesting question is, how does one PROVE (to an auditor or a judge)
> that one has reduced risks ALARP if one's delivered system's behaviour is
> initiated from a neural network? A dataset that cannot be interpreted,
> verified or validated thoroughly in process, and that changes after
> delivery. AI aficionados admit they don't understand why NNs can work so
> well or fail so unpredictably.
> Witness: https://dawnproject.com/
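> 
> To make the verification problem concrete, here is a minimal sketch in toy
> Python of my own (numpy only; the network, the random weights and the eps
> threshold are all invented for illustration, not taken from any real
> system). A small ReLU net stands in for a trained perception model; near
> its decision boundaries, a perturbation far below sensor noise can flip the
> output class, which is why no finite, point-wise test campaign can by
> itself ground an ALARP claim about the network's behaviour.
> 
>     import numpy as np
> 
>     rng = np.random.default_rng(0)
> 
>     # Toy 2-layer classifier with fixed random weights, standing in for a
>     # trained perception network ("pedestrian" vs "other").
>     W1, b1 = rng.normal(size=(16, 8)), rng.normal(size=16)
>     W2, b2 = rng.normal(size=(2, 16)), rng.normal(size=2)
> 
>     def classify(x):
>         h = np.maximum(0.0, W1 @ x + b1)    # ReLU hidden layer
>         return int(np.argmax(W2 @ h + b2))  # 0 = "other", 1 = "pedestrian"
> 
>     # Probe random inputs; count how often a small perturbation changes the
>     # decision. Every flip is a case a point-wise test suite would miss.
>     eps, flips, trials = 1e-2, 0, 20000
>     for _ in range(trials):
>         x = rng.normal(size=8)
>         if classify(x) != classify(x + eps * rng.normal(size=8)):
>             flips += 1
>     print(f"{flips}/{trials} decisions flipped under an eps={eps} nudge")
> 
> The exact flip count is beside the point; the point is that it is not zero,
> and that no enumeration of test inputs can show where all such boundary
> points lie.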
> 
> Case study: Elaine Herzberg, killed by a self-driving Uber in Tempe,
> Arizona in 2018. The system did not classify her as a pedestrian because
> she was crossing without a crosswalk; the neural net did not include
> consideration for jaywalking pedestrians.
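> 
> The NTSB investigation also reported that each time the system changed its
> classification of Herzberg (vehicle, bicycle, other), it discarded the
> object's tracking history, so it never accumulated the trajectory that
> would have shown her moving into the car's path. A sketch of that failure
> mode, in hypothetical Python of my own devising, not Uber's code, with the
> Track class and all positions invented:
> 
>     from dataclasses import dataclass, field
> 
>     @dataclass
>     class Track:
>         label: str
>         history: list = field(default_factory=list)  # past (x, y) positions
> 
>         def update(self, label, position):
>             if label != self.label:   # object reclassified
>                 self.history.clear()  # <- the flaw: motion history discarded
>                 self.label = label
>             self.history.append(position)
> 
>         def crossing(self):
>             # Needs at least two retained points to infer lateral motion.
>             return len(self.history) >= 2 and \
>                 self.history[-1][0] < self.history[0][0]
> 
>     track = Track(label="unknown")
>     for label, pos in [("unknown", (9.0, 40.0)), ("vehicle", (8.0, 35.0)),
>                        ("bicycle", (7.0, 30.0)), ("bicycle", (6.0, 25.0))]:
>         track.update(label, pos)
>     # Four detections of steady lateral motion, but only the last two
>     # survive the reclassifications; the crossing is inferred far too late.
>     print(track.label, track.history, track.crossing())
> 
> The fatal behaviour came from an ordinary, reviewable design decision
> sitting on top of an unreviewable classifier.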
> 
> These systems are famous for not knowing what they don't know and imposing
> their ignorance on the real world. Hannah Arendt was prescient: "It's not so
> much that our models are false, but that they might become true."
> 
> Imagine yourself as an expert witness supporting Tesla in a similar
> situation. What section, subsection or footnote of IEC 61508 or ISO 26262 -
> or other standard - would you cite to prove Elon had applied best practice
> in his development life cycle?
> 
> Or, if you cannot pony up, would you agree that these standards are no
> longer fit for purpose in regulating the development of AI-integrated
> Safety-Critical systems?
> 
> And furthermore, please explain the purpose of these standards, if they
> cannot be instrumental in stopping the murder for money currently occurring
> on US roads?
> 
> Les
> 
> PS: I note that Tesla's full self-driving (FSD) feature is available in the
> UK as well as the US. It is not available in Australia or Germany.
> 
> ---------------------------
> On 2023-08-03 02:32, Les Chambers wrote:
> 
> "Can anyone on this list refer me to where in the standards one can obtain
> guidance on how to engineer such a system safely?"
> 
> That seems to be a question with a completely obvious answer.
> 
> Martyn already observed on 2023-06-27 that there are legal requirements
> which constrain deployment of safety-related systems. That legal
> requirement in the UK and Australia is 77 years old. Your question seems to
> be suggesting that you somehow think it, and other constraints, might no
> longer apply. Well, they do. As Martyn said "AI doesn't change that."
> 
> In the UK or Australia, developer and deployer must reduce risks ALARP.
> 
> How do you go about engineering any system such that risks are reduced
> ALARP, say in the UK? You follow sector-specific functional safety
> standards if there are some, as well as the engineering functional safety
> standard for E/E/PE systems, which is IEC 61508. This approach is regarded
> by the regulator, at least in the UK, as appropriate to fulfill the ALARP
> requirement (although of course the courts are the final arbiters of that).
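> 
> For readers outside the UK, the ALARP test is a disproportion argument, not
> a formula. A minimal illustrative sketch in Python, in which every number
> (VPF, delta_risk, exposed, years, factor, cost) is invented for the
> example; the HSE publishes no fixed values or formula:
> 
>     # Gross-disproportion check: a risk-reduction measure must be adopted
>     # unless its cost grossly exceeds the monetised safety benefit.
>     VPF = 2.2e6        # assumed value of preventing a fatality, GBP
>     delta_risk = 1e-4  # assumed cut in annual fatality probability
>     exposed = 500      # assumed people exposed per year
>     years = 10         # assumed remaining operating life
>     factor = 3         # assumed gross-disproportion factor
> 
>     benefit = VPF * delta_risk * exposed * years  # monetised benefit
>     cost = 1.5e6                                  # assumed cost of measure
> 
>     # Cost does not grossly exceed benefit, so ALARP demands the measure.
>     print(cost <= factor * benefit)   # True for these numbers
> 
> The contested question in this thread is whether delta_risk can be computed
> or defended at all when the behaviour in question comes from a neural
> network.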
> 
> PBL
> 
> Prof. i.R. Dr. Peter Bernard Ladkin, Bielefeld, Germany
> Tel+msg +49 (0)521 880 7319  www.rvs-bi.de
> 
> --
> Les Chambers
> les at chambers.com.au
> 
> https://www.chambers.com.au
> https://www.systemsengineeringblog.com
> 
> +61 (0)412 648 992
> 
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
> Manage your subscription: https://lists.techfak.uni-bielefeld.de/mailman/listinfo/systemsafety



--
Les Chambers
les at chambers.com.au

https://www.chambers.com.au
https://www.systemsengineeringblog.com

+61 (0)412 648 992



