[SystemSafety] New paper on MISRA C

Paul Sherwood paul.sherwood at codethink.co.uk
Mon Sep 17 17:45:07 CEST 2018


Hi Olwen,

thank you very much for your email. Please see my comments inline.

On 2018-09-17 16:07, Olwen Morgan wrote:
>>>> So a simple question, and sorry for being blunt ... Why is MISRA C 
>>>> still considered relevant to system safety in 2018?
> 
> Safety and reliability are different dependability properties. A
> system can be unreliable yet safe - e.g. an airliner that won't power
> up properly and is sitting on the tarmac. It can also be reliable but
> unsafe - e.g. it may have instruments that keep on working but give
> incorrect readings.
> 
> As far as I am aware, except in limited cases, there is no robustly
> reproducible evidence that attributes of software components have any
> demonstrable correlation with overall system dependability properties.
> Anyone who claims that making code comply with a coding standard helps
> to make it safe or reliable is missing the point.

I think I may be quoting this statement for the remainder of my career.

Perhaps I should offer you dinner...

> The aim of coding standards is to mitigate the introduction of defects
> at the coding stage of the software life cycle. A defect in a software
> component may compromise system safety, system reliability or both.
> Think about Ariane 5. Its guidance algorithm led to an unrecoverable
> attitude deviation (unreliability), as a consequence of which it had
> to be destroyed (unsafe for anyone under the falling debris). Software
> defects can be precursors to both unreliability and unsafe states
> depending on overall system design.

Absolutely.

> There is, however, one area in which software reliability relates more
> directly to system safety. That is when a piece of software is
> designed solely to provide a safety function. The classic case here is
> in machine safety, where a PLC may implement a function that removes
> power from a machine shaft when certain conditions are detected, e.g.
> if a light-curtain is breached. For the particular sub-case of
> functions that are there exclusively for safety, there is a direct
> connection between reliability and safety. If the PLC does not
> reliably remove power when the light-curtain is breached, then one is
> liable to have to dismantle the machine to retrieve some poor
> individual's mangled body parts.

Got it, thank you.

> I am not aware of the MIT work to which you refer. Perhaps you could
> give a reference?

My starting point was Nancy Leveson's webcast [1] which was facilitated 
by Simone Whiteley, a contributor here. An hour well-spent IMO.

Engineering A Safer World [2] is very readable, but it does run to 500 
pages.

The STPA Handbook [3] provides a short background before diving into the 
details of the analysis method relatively quickly.

br
Paul

[1] https://www.youtube.com/watch?v=8bzWvII9OD4&t=277s
[2] https://mitpress.mit.edu/books/engineering-safer-world
[3] http://psas.scripts.mit.edu/home/get_file.php?name=STPA_handbook.pdf
