[SystemSafety] McCabe’s cyclomatic complexity and accounting fraud

Derek M Jones derek at knosof.co.uk
Wed Mar 28 16:21:18 CEST 2018


Paul,

> There is the reported McCabe Complexity value for each function in a
> system. Yes, you can do things to reduce individual function complexity,
> and probably should. However, you then need to take the measure a step
> further. For every function that calls other functions, you have to sum the

I agree that the way to go is to measure a collection of functions
based on their caller/callee relationship.

This approach makes it much harder to commit accounting fraud (splitting
a function into smaller pieces so that each reported value looks low)
and might well produce more reproducible results.
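
As a rough illustration (a minimal sketch, with invented function names
and decision counts), splitting a function lowers its individual McCabe
value, which is decisions + 1, but a sum taken over the caller/callee
tree barely moves:

# A minimal sketch in Python; the function names and counts are invented.
# V(G) for a single-entry, single-exit function is (number of decisions) + 1.
def cyclomatic(decisions):
    return decisions + 1

# One function containing 9 decision points.
honest = {"process": {"decisions": 9, "calls": []}}

# The same 9 decisions spread over three functions, so each individual
# V(G) looks respectably low.
split = {
    "process":   {"decisions": 1, "calls": ["validate", "transform"]},
    "validate":  {"decisions": 4, "calls": []},
    "transform": {"decisions": 4, "calls": []},
}

def summed_complexity(funcs, name):
    # Sum V(G) over a function and everything it calls (ignoring shared
    # callees and recursion, to keep the sketch short).
    f = funcs[name]
    return cyclomatic(f["decisions"]) + sum(
        summed_complexity(funcs, c) for c in f["calls"])

print(max(cyclomatic(f["decisions"]) for f in honest.values()))  # 10
print(max(cyclomatic(f["decisions"]) for f in split.values()))   # 5, looks better
print(summed_complexity(honest, "process"))                      # 10
print(summed_complexity(split, "process"))                       # 12, no real gain

The per-function maximum can be driven down almost arbitrarily, but the
call-graph sum cannot, which is why it is harder to game and likely to
be more reproducible.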

> for the entire system on this basis. It becomes clear when you have too many
> functions with high complexity factors as it pushes up the average complexity
> value disproportionately. It still should not be the only measure though.

Where do the decisions in the code (that create this 'complexity') come
from?  From the algorithm that is being implemented.

If the algorithm has lots of decision points, the code will contain
lots of decision points.  The measurement process needs to target the
algorithm first, and then compare the complexity of the algorithm with
the complexity of its implementation.  The code only needs looking at
if its complexity is much higher than that of the algorithm.
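
For example (an invented sketch, not code from any particular system):
clamping a value into a range inherently needs two decisions, so any
implementation has V(G) of at least 3; only an implementation well
above that figure deserves a closer look.

# Minimal sketch in Python; both functions implement the same algorithm.
def clamp_minimal(x, lo, hi):
    # Two decisions, V(G) = 3, matching the algorithm's inherent complexity.
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def clamp_padded(x, lo, hi):
    # The same algorithm with redundant branches, V(G) = 5.  The excess
    # over the algorithm's own complexity (3) is what the measurement
    # process should flag, not the absolute value.
    result = x
    if x < lo:
        result = lo
    if x >= lo:
        if x > hi:
            result = hi
        else:
            if x <= hi:
                result = x
    return result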

-- 
Derek M. Jones           Software analysis
tel: +44 (0)1252 520667  blog:shape-of-code.coding-guidelines.com

