[SystemSafety] Cognitive bias, safety and disruptive technology

Les Chambers les at chambers.com.au
Thu Mar 14 00:25:48 CET 2019


Bruce

Could I add a bias?

BIAS CLASSIFICATION BIAS:

Definition - where conclusions based on pattern matching from extensive experience are classified as confirmation bias and disregarded.

Example - Two 737 Max 8s crash within five months of each other, where:

1. The same variant of the same aircraft
2. At the same point in the sector - straight after takeoff
3. Exhibiting the same porpoising behaviour
4. In both cases the pilots reported problems controlling the aircraft prior to crashing
5. Causal analysis of the first crash blamed an aircraft defect

Compensating heuristic - trust your gut (born of experience) and ACT without delay.
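
For what it's worth, the heuristic can be sketched in a few lines of code. This is an illustration only: the feature names, the incident records and the threshold of four matches are my own invented stand-ins, not anyone's regulatory criteria.

# Illustrative sketch only: feature names, records and threshold are hypothetical.
LION_AIR = {
    "variant": "737 Max 8",
    "flight_phase": "straight after takeoff",
    "porpoising": True,
    "control_problems_reported": True,
    "first_crash_blamed_on_aircraft_defect": True,
}
ETHIOPIAN = dict(LION_AIR)  # the second crash matched on every salient feature

def matching_features(a, b):
    """Count the salient features two incident records share."""
    return sum(1 for k in a if k in b and a[k] == b[k])

def recommendation(a, b, threshold=4):
    """Trust the pattern: enough matches means act now, not analyse further."""
    if matching_features(a, b) >= threshold:
        return "ACT NOW: ground the fleet pending investigation"
    return "monitor: keep collecting data"

print(recommendation(LION_AIR, ETHIOPIAN))  # -> ACT NOW: ground the fleet ...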


Re the 737 grounding: It's interesting to analyse the action time delay for various nation states.

China - 1 day

Australia - 2 days

USA & Canada - 4 days 

Conclusion: the Americans had the most to lose. Courage takes time.

As an engineer with a 44-year background in software-intensive safety-critical systems, I would patternize the above as a candidate for immediate action. See-Tiger-run-from-Tiger. Do not hang around for further data analysis.
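
A back-of-envelope Poisson calculation shows why. Every figure below is my own rough assumption (fleet size, utilisation, and a benchmark fatal-accident rate for mature types), but under anything like these numbers, two hull losses in five months arising from chance alone is very improbable:

import math

# All figures are illustrative assumptions, not official statistics.
flights = 350 * 4 * 300        # ~350 aircraft x ~4 flights/day x ~300 days in service
rate_per_flight = 0.2e-6       # benchmark fatal-accident rate for a mature type
expected = flights * rate_per_flight

# Poisson probability of two or more fatal accidents occurring by chance
p_two_or_more = 1 - math.exp(-expected) * (1 + expected)
print(f"expected accidents if the type were mature: {expected:.3f}")      # 0.084
print(f"P(>= 2 accidents by chance):                {p_two_or_more:.4f}") # ~0.0033

On these assumptions, chance explains the two crashes less than one time in 250, so the pattern carries real evidential weight; it is not mere confirmation bias.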

What was the FAA thinking?

On deciding when to act, I have found Andy Grove's book Only the Paranoid Survive helpful. Back in the day, Intel Corporation was being trounced in the marketplace by cheaper, better memory chips from Asian manufacturers. Intel's senior executives, wedded to the concept of 'memories is where it's at', struggled with the decision to focus on something else: processor chips. Until one day Andy came up with a thought experiment. "What if they fired us all and brought in a new management team?" he said. "What would THEY do?" The answer was clear: get into the processor chip business, boots and all. So they acted.

As you age, experiential patterns become beliefs, and beliefs drive action. If the side-effects of your action involve financial loss (e.g. the cratering of Boeing's share price), the belief driving it must be strong. So in some cases bias is not such a bad thing.


Les


From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Bruce Hunter
Sent: Wednesday, March 13, 2019 11:07 AM
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: [SystemSafety] Cognitive bias, safety and disruptive technology


Despite the irony of today's Twitter statement by Donald Trump on the 737 Max 8 crash - "often old and simpler is far better" - I never thought I'd say he may be partly correct, at least when it comes to addressing novelty in design.

It's been interesting following the news and social media opinions about the Boeing 737 Max 8 crashes. We are often torn between "knee-jerk" reactions and methodical investigation with prudent correction of system faults. With this crash we even have flight data being made public before the investigation has properly begun (flightradar24.com https://t.co/Uyvfp1x9Xb). "Correlation [still] does not mean Causation".

Similarly with the NHTSA claim on the safety correlation of Tesla software: even authorities can fall into cognitive biases that cloud their thinking. I think cognitive bias is a real issue with disruptive technologies such as autonomous vehicles, IoT and others that are starting to be used in safety-related systems.

System safety does address these concerns in IEC 61508's novelty requirements: competence (part 1, 6.2.14e), safety verification (7.18.2.3), independence of safety justification (part 2, 7.4.10.5), and software verification (part 3, 7.9.2.3).

In the hope of starting a discussion thread on the softer, human side of system safety, I'd like to offer a few examples of cognitive biases that impact system safety, especially with novel and disruptive technology (see this useful infographic: https://www.businessinsider.com.au/cognitive-biases-that-affect-decisions-2015-8).

*	Confirmation bias:

*	Definition - We tend to notice only information that confirms our preconceptions
*	Example - This is very pertinent to the media discussion of the Boeing 737 Max 8 crashes and the NHTSA claim on the safety correlation of Tesla software
*	Compensating heuristic - Don't jump to conclusions. Wait until the facts are known and tested.

*	Bandwagon bias (Groupthink):

*	Definition - The probability of one person adopting a belief increases with the number of people who already hold that belief
*	Example - Ignoring the rubber seal deficiencies, which led to the ill-informed decision to launch the Space Shuttle Challenger
*	Compensating heuristic - Independence, and acknowledging and dealing with all concerns despite other imperatives (saying no is sometimes very hard)

*	Availability heuristic bias:

*	Definition - Humans overestimate the importance of information available to them
*	Example - Donald Rumsfeld's known-knowns and news/social media assumptions
*	Compensating heuristic - Ensuring completeness in reviews and risk analysis

*	Clustering illusion bias:

*	Definition - The tendency to see patterns in random events
*	Example - See confirmation bias
*	Compensating heuristic - Correlation does not mean causation; see confirmation bias

*	Pro-innovation bias:

*	Definition - When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations
*	Example - This seems to be the case with the NHTSA claim and the whole spectre of disruptive technology. Coolness conquers caution (caution constrains convenience, and convenience causes coolness...). Hopefully this did not impact the 737 Max 8 design.
*	Compensating heuristic - What could go wrong is more important than what could go right: test to fail, not necessarily to pass

*	Conservatism bias:

*	Definition - Favouring prior evidence over new evidence that has emerged
*	Example - "We have always done it that way" may close off many opportunities for better solutions that may also be safer
*	Compensating heuristic - Don't close off new options; manage them with facts and verification

*	and many more...

Why are we so quick to drop the rigour and existing standards that have been built up over time to prevent our judgements from being blinded by biases?

Does anyone else have good advice on compensating for cognitive biases and preventing bad safety decisions?


Bruce

(old and simpler 😉)
