[SystemSafety] Bounded rationality or ignorance?

Matthew Squair mattsquair at gmail.com
Fri Oct 12 00:28:12 CEST 2018


When it comes to making decisions about risk, there are a number of cognitive biases that can affect our decisions. For example:

- confirmation bias (as Peter mentioned),
- omission neglect,
- the narrative fallacy,
- availability bias, and
- framing effects/prospect theory.

The less information we have, the stronger these biases can become, which leaves us with a real problem when we try to reason about rare events.

In effect the rarer the event is the more bounded we are in our rationality. 


Regards, 

> On 12 Oct 2018, at 1:10 am, Peter Bernard Ladkin <ladkin at causalis.com> wrote:
> 
> 
> 
> On 2018-10-11 14:59 , Olwen Morgan wrote:
>> 
>> 
>> Here's an example of .....: A project has decided to use MCDC coverage as the test
>> coverage domain to be used in unit testing. The aim is to get 100% MCDC coverage of developed code
>> units. Owing to slippage, a manager decides that to make up for lost time, the project will stop
>> unit testing when 80% MCDC coverage has been achieved.
>> 
>> Here we have (typically) a manager who does not realise the risks involved in settling for only 80%
>> coverage. Is this a "cognitive limitation" or just ignorance?
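[To make the quoted scenario concrete: MC/DC requires that each condition in a decision be shown to independently affect the decision's outcome, i.e. for each condition there is a pair of tests differing only in that condition that produce different results. A minimal Python sketch of that independence check; the decision `a and (b or c)`, the function names, and the test vectors are illustrative, not from the original thread.]

```python
def decision(a, b, c):
    # Illustrative decision with three conditions.
    return a and (b or c)

def achieves_mcdc(fn, tests, n_conditions):
    """Return True if, for every condition, some pair of tests
    differs ONLY in that condition and flips the decision outcome
    (the MC/DC independence requirement)."""
    covered = set()
    for t1 in tests:
        for t2 in tests:
            diff = [i for i in range(n_conditions) if t1[i] != t2[i]]
            if len(diff) == 1 and fn(*t1) != fn(*t2):
                covered.add(diff[0])
    return covered == set(range(n_conditions))

# A minimal MC/DC test set: 4 tests instead of all 8 combinations.
mcdc_tests = [(True, True, False), (False, True, False),
              (True, False, True), (True, False, False)]
print(achieves_mcdc(decision, mcdc_tests, 3))       # True
print(achieves_mcdc(decision, mcdc_tests[:3], 3))   # False
```

[Dropping even one of the four tests leaves some condition's independence undemonstrated, which is roughly what stopping at 80% MC/DC coverage means: specific condition/outcome combinations go untested.]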
> 
> What about a case of acting on confirmation bias? There are only risks if you believe that the
> software is going to fail tests and you are trying to find the tests it will fail.
> 
> But a manager is unlikely to believe the software is going to fail tests. The manager has seen the
> inspections and what they achieved, believes in his/her prowess, experienced the progress of the
> project in terms of apparently-working LOC, seen individual unit tests, and believes the software more
> or less works because he/she has seen what effort has gone into it and what has come out (working SW,
> at some level). It works, doesn't it? And it's HIS/HER project, what he/she is paid to do.
> 
> Testing is an overhead; no more "product" (= LOC) is thereby produced. But you have to do some (due
> diligence; besides, acceptance testing is in the contract). But surely not more than "necessary" ...
> 
> PBL
> 
> Prof. Peter Bernard Ladkin, Bielefeld, Germany
> MoreInCommon
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319  www.rvs-bi.de
> 
> 
> 
> 
> 
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
