[SystemSafety] New paper on MISRA C

clayton at veriloud.com
Thu Sep 13 20:22:00 CEST 2018


Thanks Paul for referencing the Leveson book.  Much of this goes back further, to her study of the Therac-25 case, which I’m sure everyone on this list is familiar with (if not, they should be!).

> As I understand it MIT and others have successfully debunked the notion that system safety is correlated with component reliability

This too comes out of post-analysis of some horribly developed and designed systems, like the Therac-25, where a simple observation was made (surprisingly, still not obvious to many): it is not possible to test a system and find all of its bugs. Component reliability, where bugs are found and fixed and test cases are passed, does not account for the insidious failures that occur in the real world, where "component interaction" can be so complex that no amount of testing can anticipate it.

This is why best practices and disciplined software engineering processes are vital in developing safety-critical systems.  As Leveson and many others have stated, most failures arise out of system requirements flaws due to lack of rigor (e.g. poor hazard analysis and mitigation).  It should also be noted that a lack of disciplined process and review develops from non-technical factors as well, such as organizational attitudes and awareness.  So yeah, system safety doesn’t correlate with any one practice, much less MISRA-C.

In this email thread, it’s interesting that nobody seems to be commenting on the paper named in the subject line. Have you read it? It addresses systemic failure prevention as well as component reliability, but more importantly as part of an overall disciplined process (which includes many other activities). Here is a quote directly out of it:

"As said earlier, MISRA C cannot be separated from the process of documented software development it is part of. In particular, the use of MISRA C in its proper context is part of an error prevention strategy which has little in common with bug finding, i.e., the application of automatic techniques for the detection of instances of some software errors. This point is so rarely understood that it deserves proper explanation."

That last sentence has proven itself again, even on this safety mailing list.

> 
> Why is MISRA C still considered relevant to system safety in 2018?

Yes, system safety is not directly correlated with reliability (system, component, or otherwise). But what does that have to do with MISRA-C not being relevant?

Guess what? Applying MISRA-C (or any other accepted practice) outside a disciplined software engineering process can make a system more dangerous! This has been observed by many of us with experience watching MISRA-C being “blindly applied” over the years.  I suggest reading the paper, at least, to understand how MISRA-C:2012 tries to address these problems by promoting and facilitating a rigorous review process.  Without that process, that is, treating MISRA-C as a pass/fail box-ticking exercise, it becomes just another bow to the goddess Panacea.

So yeah, it can be relevant one way or the other ;-) 

It is true that safety-critical software has been developed successfully without complying specifically with MISRA-C, but it is not true that successful development in the C language can occur without the disciplined processes and guidelines that MISRA-C:2012 has codified.

Clayton Weimer
https://www.linkedin.com/in/weimer/



More information about the systemsafety mailing list