[SystemSafety] Cognitive bias, safety and disruptive technology

Smith, Brian E. (ARC-TH) brian.e.smith at nasa.gov
Wed Mar 13 04:01:57 CET 2019


Bruce,

I’m so glad you brought up the phenomenon of cognitive bias.  It can affect social, technical, personal, governmental, and business life.  It’s desperately hard for me to admit that I see life through my own knothole.

There’s a fascinating variety of cognitive biases illustrated by the graphic at this site:

Cognitive Bias Codex - The Big Picture - Barry Ritholtz: https://ritholtz.com/2016/09/cognitive-bias-codex/

Cognitive biases affect how we see system safety.  Many are accustomed to practicing "preventive safety"; that is, identifying undesired system behavior and mitigating it at all costs.  This is the “Safety-I” lens/filter through which they see everything.  A fresh approach is emerging thanks to Hollnagel (among others).  It is called Productive Safety (also known as Safety-II) and seeks to support or facilitate what goes well by studying everyday successful performance and fostering those characteristics across a given system in order to “produce” safety.

Humans executing time-tested procedures are the primary source of Productive Safety in today’s aviation system, yet the processes by which human operators contribute to safety have been largely unstudied and poorly understood.  I wonder whether this situation is the result of a cognitive bias toward preventive safety, arising from what some might call the misleading statistic that human errors “cause” 80 to 90% of extremely rare accidents.  Are not airborne and ground-side humans involved in the millions of successful flights each year?  How and why do they accomplish this feat?
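
A back-of-the-envelope Python sketch makes the base-rate point concrete.  All figures here are round, assumed numbers chosen only for illustration, not official statistics:

# Illustrative base-rate arithmetic (all figures assumed, not official):
# even if human error is implicated in ~90% of accidents, successful
# human performance dominates by many orders of magnitude.

flights_per_year = 40_000_000    # assumed order of magnitude, worldwide
fatal_accidents_per_year = 10    # assumed: a handful per year
human_error_share = 0.9          # the oft-quoted "80-90%" attribution

accidents_with_human_error = fatal_accidents_per_year * human_error_share
p_per_flight = accidents_with_human_error / flights_per_year

print(f"P(human-error accident) per flight ~ {p_per_flight:.1e}")
# ~2.2e-07: roughly one flight in four million.  Meanwhile humans
# contribute to essentially all of the other flights that end safely -
# the everyday success data that Safety-II asks us to study.
print(f"Successful flights per accident ~ {flights_per_year / fatal_accidents_per_year:,.0f}")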

Safety solutions based solely on hazards and risks paint an incomplete picture of safety.  Someone once said: You shouldn’t give marriage advice based solely on studying divorces in your own neighborhood!  ;-)

Identifying, collecting, and interpreting data on the resilient performance of operators and systems, free of cognitive bias, is critical for developing integrated, optimized human/technology or autonomous systems.  The continued safe operation of the many socio-technical systems in which we are immersed demands that we surrender our preconceived notions.

The IEC 61508 requirements for competence, safety verification, independence of the safety justification, and software verification probably move in the direction of Productive Safety, especially for, as you say, novel and disruptive technology.

I came across this ironic comment by Jon Ralston: “Ever since I learned about confirmation bias, I’ve been seeing it everywhere. Everywhere!”

Brian Smith, NASA Ames

From: systemsafety <systemsafety-bounces at lists.techfak.uni-bielefeld.de> on behalf of Bruce Hunter <brucer.hunter at gmail.com>
Date: Tuesday, March 12, 2019 at 18:06
To: "systemsafety at lists.techfak.uni-bielefeld.de" <systemsafety at lists.techfak.uni-bielefeld.de>
Subject: [SystemSafety] Cognitive bias, safety and disruptive technology

Despite the irony of today's Twitter statement by Donald Trump on the 737 Max 8 crash - "often old and simpler is far better" - I never thought I'd say he may be partly correct about addressing novelty in design.

It's been interesting following the news and social media opinions about the Boeing 737 Max 8 crashes. We are often torn between "knee-jerk" reactions and methodical investigation with prudent correction of system faults. With this crash we even have flight information being made public before investigations have even begun (flightradar24.com, https://t.co/Uyvfp1x9Xb). "Correlation [still] does not mean Causation".

Similarly, with the NHTSA claim on the safety correlation of Tesla software, even authorities can fall into cognitive bias that clouds their thinking. I think cognitive bias is a real issue with disruptive technologies such as autonomous vehicles, IoT and others that are starting to be used in safety-related systems.
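
As a toy illustration of how an aggregate correlation can mislead, here is a Simpson's-paradox sketch in Python; every number is invented for the example, and the road-type confounder is purely hypothetical:

# Toy Simpson's paradox with invented numbers: a naive aggregate
# crash-rate comparison favours the new system, yet within each road
# type the new system is worse.  The confounder (where the system is
# used) flips the conclusion.

# (miles driven, crashes) - all values hypothetical
before = {"highway": (1_000_000, 10), "city": (9_000_000, 900)}
after  = {"highway": (9_000_000, 180), "city": (1_000_000, 140)}

def rate(data):
    miles = sum(m for m, _ in data.values())
    crashes = sum(c for _, c in data.values())
    return crashes / miles * 1_000_000   # crashes per million miles

print(f"aggregate before: {rate(before):.0f}, after: {rate(after):.0f}")
for road in ("highway", "city"):
    mb, cb = before[road]
    ma, ca = after[road]
    print(f"{road}: before {cb/mb*1e6:.0f}, after {ca/ma*1e6:.0f} per million miles")

In the aggregate the new system looks far safer (91 vs. 32 crashes per million miles), yet it is worse on both road types (highway 10 -> 20, city 100 -> 140); the aggregate figure reflects only where the system happened to be used.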

System safety does have these concerns addressed in the IEC 61508 novelty requirements for competence (part 1, 6.2.14e), safety verification (7.18.2.3), independence of the safety justification (part 2, 7.4.10.5), and software verification (part 3, 7.9.2.3).

In the hope of starting a discussion thread on the softer, human side of system safety, I'd like to offer a few examples of cognitive biases that impact system safety, especially with novel and disruptive technology (see the useful infographic at https://www.businessinsider.com.au/cognitive-biases-that-affect-decisions-2015-8).

  *   Confirmation bias:
     *   Definition - We tend to notice only information that confirms our preconceptions
     *   Example - This is very pertinent to the media discussion of the Boeing 737 Max 8 crashes and the NHTSA claim on the safety correlation of Tesla software
     *   Compensating heuristic - Don't jump to conclusions. Wait until the facts are known and tested.
  *   Bandwagon bias (Groupthink):
     *   Definition - The probability of one person adopting a belief increases with the number of people who already hold that belief
     *   Example - The dismissal of rubber seal deficiencies, which led to the ill-informed decision to launch the Challenger Space Shuttle
     *   Compensating heuristic - Independence, and acknowledging and dealing with all concerns despite other imperatives (saying no is sometimes very hard)
  *   Availability heuristic bias:
     *   Definition - Humans overestimate the importance of information available to them
     *   Example - Donald Rumsfeld's known-knowns and news/social media assumptions
     *   Compensating heuristic - Ensuring completeness in reviews and risk analysis
  *   Clustering illusion bias:
     *   Definition - The tendency to see patterns in random events
     *   Example - See Confirmation bias, and the sketch after this list
     *   Compensating heuristic - Correlation does not mean causation; see Confirmation bias
  *   Pro-innovation bias:
     *   Definition - A proponent of an innovation tends to overvalue its usefulness and undervalue its limitations
     *   Example - This seems to be the case with the NHTSA claim and the whole spectrum of disruptive technology. Coolness conquers caution (caution constrains convenience, and convenience causes coolness...). Hopefully this did not impact the 737 Max 8 design.
     *   Compensating heuristic - What could go wrong is more important than what could go right; test to fail, not necessarily to pass
  *   Conservatism bias:
     *   Definition - Favouring prior evidence over new evidence that has emerged
     *   Example - "We have always done it that way" may close off many opportunities for better solutions that may also be safer
     *   Compensating heuristic - Don't close off new options; manage them with facts and verification
  *   and many more...
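
To make the clustering illusion concrete, here is a minimal Python sketch (a simulated fair coin, no real data) showing that genuinely random sequences routinely contain runs long enough to look like patterns:

# Clustering illusion sketch: random coin flips contain long runs
# that an observer can easily mistake for a meaningful trend.
import random

random.seed(1)  # fixed seed so the illustration is reproducible
flips = [random.choice("HT") for _ in range(200)]

# length of the longest run of identical outcomes
longest, current = 1, 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

print("".join(flips[:60]))
print(f"longest run in 200 random flips: {longest}")
# The expected longest run in 200 fair flips is around 7 - long enough
# that someone watching an event log may "see" a pattern where none exists.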

Why is it that we are so quick to drop the rigour and existing standards, built up over time, that prevent our judgements from being blinded by biases?
Does anyone else have good advice on compensating for cognitive biases and preventing bad safety decisions?

Bruce
(old and simpler 😉)