[SystemSafety] Bounded rationality or ignorance?

Les Chambers les at chambers.com.au
Tue Oct 16 01:51:41 CEST 2018


Matthew

RE:
When it comes to making decisions about risk, there are a number of cognitive biases that can affect our decisions. For example:

- Confirmation bias (as Peter mentioned),
- Omission neglect,
- The narrative fallacy,
- Availability bias, and
- Framing effects/prospect theory

Can I add ...

The human failings inherent in this list of biases are not a problem; no, they represent an opportunity.
If you agree that, in the service of safety, any lie, any half-truth, any sneaky psychological trick is justified - read on.

A life-transforming experience convinced me of this: curtailment of testing in the service of project schedule caused two safety incidents that could have killed someone. If there had been a death I would never have forgiven myself. I would have spent the rest of my life wondering, "Should I have screamed louder? Should I have been a better persuader? Could I have been a better political actor?"
So, from that point on, I have resolved to take no prisoners when it comes to defending proper systems engineering process. And this, unashamedly, means preying on the weaknesses of those who would give in to pride and greed by shutting down testing of a safety-critical system before it is complete.

I encourage all systems engineers to study these human failings for they are fertile ground in the science of persuasion. Let me repeat them:

Narrative fallacy. The idea that the future can be predicted by generalising an anecdotal story from the past. (See also Reason-Respecting Tendency.)
https://fs.blog/2016/04/narrative-fallacy/

Availability bias. The notion that whatever can be easily recalled from memory must occur with higher probability.
https://fs.blog/2011/08/mental-model-availability-bias/

Omission neglect. Where people fail to reflect on what they do not know, underestimate the importance of missing information, and form strong opinions even when the available evidence is weak. 
https://psychology.iresearchnet.com/social-psychology/decision-making/omission-neglect/

Confirmation bias. Our tendency to cherry-pick information that confirms our existing beliefs or ideas. Example: 'Railway signalling engineers are smart people. Our guys have never had an incident in the past.'
https://fs.blog/2017/05/confirmation-bias/

Framing effects/prospect theory. Our tendency to have our choices influenced by the way they are framed. Different wordings, settings, and situations will have a powerful effect on decision-makers. Example: which would you prefer - a '95% effective' condom or a '5% failure' condom?
Framing often comes in the form of gains or losses, as in prospect theory (Kahneman & Tversky, 1979). This theory demonstrates that a loss is perceived as more significant, and thus more worthy of avoiding, than an equivalent gain. 
Example: "Sure, we will deliver on time if we stop testing ... But if we kill someone through system failure your career is over! You'll lose your company car. You won't be able to feed your family. You'll spend the rest of your life on the unemployment queue, in misery and regret. "
https://thedecisionlab.com/bias/framing-effect/
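
For the quantitatively minded: prospect theory makes loss aversion concrete with an S-shaped value function. Here is a minimal sketch in LaTeX notation; the parameter estimates are the commonly cited ones from Tversky and Kahneman's 1992 follow-up study, not figures from this thread:

    % Subjective value of a gain or loss of size x, measured from a reference point
    v(x) =
    \begin{cases}
      x^{\alpha}            & x \ge 0 \quad \text{(gains)} \\
      -\lambda (-x)^{\beta} & x < 0   \quad \text{(losses)}
    \end{cases}
    % Commonly cited estimates (Tversky & Kahneman, 1992):
    % \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
    % i.e. a loss stings roughly 2.25 times as much as an equivalent gain pleases

Which is why the 'you'll lose your company car' framing above bites harder than any promise of an on-time delivery bonus.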

This is why I travel around teaching engineers how to tell stories. Stories are the vehicle whereby we pull levers in people's heads, exploiting the narrative fallacy, availability bias, and framing effects. Think of it as weaponising human frailty in the service of safety.

It turns out that human beings make sense of the world by recognising patterns in situations and events. And the most common pattern we use is the story structure. A good story, well told, wraps an emotional charge around an 'undeniable' truth. The stronger the charge, the more likely we are to remember the pattern and have it inform our behaviour in the future (availability bias).
All stories are anecdotes, and yes, the fact that condition X occurred some time in the past and resulted in outcome Y does not mean that the same conditions occurring now will produce the same result. But the stronger the emotional charge, the less your audience will care. They will remember and they will act. This is sneaky, but it's effective.
I'll leave you with an example. I crafted this poem about Bhopal. It was accompanied by Pablo Bartholomew's image of a girl child being buried in the earth (https://www.worldpressphoto.org/collection/photo/1985/world-press-photo-year/pablo-bartholomew). Its purpose is to give managers pause when they consider cutting costs on plant maintenance.

Apology
What can I say 
To your dead eyes
For your lost child songs
For the sun you won't see rise

What can I say 
For your growing up denied
To the boy who's lost his sister 
To the man without a bride

I should say something
But, how can I say
That when I could have saved you
I looked the other way

And I murdered you for money
And I murdered you for pride
And then when asked to please explain
I lied

Now the earth falls upon you
And rain drops on leaves
And in the quiet between the notes
Nothing deceives

While the merchants count their money
The maker's keeping score
Of each wrong note the player wrote
For the tune we signed up for

So engineers, forswear your company loyalty
Piss on company pride
The only loyalty you owe is to this little one 
Who died

Let not your morals perish with her
But bare them deep where none can bar
And think not of engineer as what you do
But the sum of all you are

Cheers
Les
	 

_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE