[SystemSafety] What measures do you use (was:Koopman replies to concerns over Toyota UA case)

paul_e.bennett at topmail.co.uk paul_e.bennett at topmail.co.uk
Wed Jan 3 20:20:43 CET 2018


On 03/01/2018 at 6:41 PM, "Steve Tockey" <Steve.Tockey at construx.com> wrote:
>
>Paul E. Bennett wrote:

[%X]

>My concern is how subjective a review becomes. Different reviewers
>may pass or fail the exact same code. The same reviewer may pass
>or fail the exact same code on different days. The goal should be
>to get as much subjectivity out of the review as possible so the
>developer has a "Clear, Concise, Correct, Coherent, Complete and
>Confirmable" (your words) understanding of the minimum acceptance
>criteria. Likewise the reviewer(s).

When I said "Sometimes it takes no more than a brief look to know that a
code sample is not good enough to include in the final product," I was
thinking of some of the worst code I have ever seen. It was submitted for
review, and its author said it worked. I and the other reviewers were
swift to send it back as a failed review for the following
reasons:-

  No comments (at all)
  No link to where the requirement for the code was derived from.
  Nesting was over seven deep.
  Nesting structure not clearly visible enough (we had to hunt for it).

The code presented was, in other words, just plain ugly, and not worthy
of much reviewer-team effort to work through in great detail.
The coder did eventually correct the above defects, and after another
two passes through review we managed to get the code into a
shape that met the coding rules, could pass a full Fagan inspection
regime, and performed correctly during functional testing, within the
limits published by the code author in their commented code.
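The deep-nesting defect is usually curable with guard clauses. A minimal C
sketch (the function name, error codes, and the 10-bit ADC range are invented
for illustration, not taken from the reviewed code):

```c
/* Hypothetical sketch: guard clauses keep the nesting to a single
 * level instead of seven-plus. Each precondition is checked and
 * rejected up front; the "real work" line runs only when all hold.
 * Returns 0 on success, a negative error code otherwise. */
static int store_sample(int *buf, int len, int idx, int value)
{
    if (buf == 0)                  return -1; /* no buffer supplied       */
    if (len <= 0)                  return -2; /* empty or invalid buffer  */
    if (idx < 0 || idx >= len)     return -3; /* index out of range       */
    if (value < 0 || value > 1023) return -4; /* outside 10-bit ADC range */

    buf[idx] = value;   /* all preconditions hold: one nesting level */
    return 0;
}
```

The structure is also trivially visible to a reviewer: each line is one
requirement-to-rejection pairing, which makes the review checklist almost
mechanical.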

If you view software as a component and demand that its embedded
header comments be a clear statement of the required performance
of that component's function, you get software that is much closer
to the hardware-component situation. You wouldn't think of using a
chip without consulting its datasheet for its performance figures and
the limits to its use, would you?
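A datasheet-style header might look like the following C sketch (the function,
the requirement reference REQ-FLT-012, and the figures quoted are all
hypothetical, made up to show the shape of such a header):

```c
/* ------------------------------------------------------------------
 * median3 -- return the median of three samples (spike filter).
 *
 * Derived requirement : REQ-FLT-012 (hypothetical reference)
 * Input range         : full int range; no arithmetic is performed,
 *                       only comparisons, so overflow is impossible
 * Worst-case timing   : at most 3 comparisons; no loops, no recursion
 * Side effects        : none (pure function)
 * ------------------------------------------------------------------ */
static int median3(int a, int b, int c)
{
    if ((a >= b && a <= c) || (a <= b && a >= c)) return a;
    if ((b >= a && b <= c) || (b <= a && b >= c)) return b;
    return c;
}
```

The point is that a reviewer, like an engineer reading a chip datasheet, can
check the stated limits against the implementation without reverse-engineering
the intent first.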

[%X]

>>Measures for requirements specification I usually phrase as 
>>meeting the six Cs. Clear, Concise, Correct, Coherent, Complete 
>>and Confirmable.

>I also claim that this means the requirements cannot be written in 
>a natural language, like English. UML State Chart? Sure (Right, 
>Les?)? English? No way.

Requirements at the top level need to be written not only in English
but also with diagrammatic representations that support the
clarification. If some formal math is applicable, then use it. I think
a requirements specification given in only one representation is not
going to be understandable by a wide enough cross-section of a
project's stakeholders. However you convey the intent of the
requirements specification, it has to be understood by the widest
possible audience (hence I prefer the mix).

>>As an aside, one function I saw had a McCabe score of 47. 
>>However, it was a very simple structure that was essentially a
>>non-nested CASE selector method and came with a very
>>reasonable justification as to why it was created that way.

>Agreed. I argue, however, that CASE needs to be treated differently
>than IF-THEN, for exactly the reason that it is so regular in structure.

If you look at how the CASE is implemented in machine language, you
are likely to find it is a very deeply nested IF -- THEN -- ELSE -- ENDIF
structure. However, that structure is auto-generated from the simpler
construct by the compiler. Using CASE just makes some of the intent
a little clearer and hence is a semantic simplification.
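The distinction is easy to see in C, where `switch` is the CASE construct. In
this hypothetical dispatcher (names invented for illustration), each arm adds
one to the McCabe count, yet the structure is flat and regular -- one selector,
N independent arms -- quite unlike seven-deep IF/ELSE nesting, even though a
compiler may lower it to exactly such a comparison chain or to a jump table:

```c
/* Hypothetical command dispatcher: a flat, regular CASE structure.
 * McCabe counts every arm as a decision, but a reviewer reads it as
 * a single table-like selection, not as nested control flow. */
enum cmd { CMD_START = 0, CMD_STOP, CMD_RESET, CMD_STATUS };

static const char *dispatch(enum cmd c)
{
    switch (c) {
    case CMD_START:  return "starting";
    case CMD_STOP:   return "stopping";
    case CMD_RESET:  return "resetting";
    case CMD_STATUS: return "ok";
    default:         return "unknown";   /* defensive arm */
    }
}
```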

>I don't have a definitive alternative right now, research is
>required on that one. But for complexity assessment purposes, it
>does need to be treated differently (not so for testing purposes;
>test cases must still exercise each CASE).

As I have already stated, a high McCabe score is a reason to look
a little deeper into the code to see whether things could have been
improved. It is not, of itself, an indicator of bad software. Sometimes
you will end up accepting a function with a high McCabe score; I just
hope that you have seen enough justification for doing so.

Regards

Paul E. Bennett IEng MIET
Systems Engineer
Lunar Mission One Ambassador
-- 
********************************************************************
Paul E. Bennett IEng MIET.....
Forth based HIDECS Consultancy.............
Mob: +44 (0)7811-639972
Tel: +44 (0)1392-426688
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
