[SystemSafety] Qualifying SW as "proven in use" [Measuring Software]

Thierry.Coq at dnv.com
Thu Jun 20 11:19:23 CEST 2013


Hi Peter,

To answer your questions:
1°) Yes, there is some objective evidence of a correlation between a low SQALE index and good-quality code. For example, ITRIS has conducted a study, on industrial software actually used in operations, where code independently assessed as "good quality" is statistically linked to a lower SQALE index.

No, there is not enough evidence yet; we wish more people were working on gathering it.

But more than that, SQALE IS a quality model: it DEFINES what "good quality" code is. It defines objective, measurable REQUIREMENTS that can be achieved, and that can be checked objectively by both vendors and owners.
If a given check point is later found not to be relevant, there is no problem in discarding it.

2°) For highly critical software: yes, SQALE has been used on highly critical software, where it basically automates the (pre-)defined quality rules. It is a useful tool to help what you call "experienced critical-system programmers".
There is a published paper by Jean-Pierre Rosen and myself on Ada open-source code where you can find more information; we used open source because most applications of SQALE in industry are confidential.

In addition, we're always looking for check points that are clear, unambiguous (i.e. automatable) and without exceptions. For example, one of our current check points puts upper limits on the cyclomatic complexity V(G) of a single operation (method/procedure/function/whatever you call it), at 15 and 60. V(G) can be measured by many tools today, or approximated to an extent. The usual limit in the literature is around 10 (for example, here: http://www.sei.cmu.edu/reports/97hb001.pdf). The thresholds in SQALE are set high in order to reduce discussions about real-life implementations. Our measurements show that most operations fall within this limit.
But there are challenges to this check point. For example (a small sketch of how such a check could be automated follows this list):
a) V(G) actually measures size and not complexity. This fact is used in SQALE, as it uses the aggregate V(G) to compute densities (very interesting, by the way). SQALE does not have any check point on aggregate V(G), only on individual operations.
b) Some operations are complex, and breaking them down is "artificial". SQALE's approach is to require the breakdown as a DESIGN decision. In some implementations of SQALE it is possible either to require a rewrite, or to have a review and flag the particular operation as an "accepted non-compliance". A third approach is to accept a high V(G) in an operation if it follows a particular pattern: the whole V(G) originates from one "CASE ... OF" or "SWITCH" construct. However, a mix of "CASE OF", "IF", "WHEN" and other constructs would result in a non-compliance.
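To make the check point concrete, here is a minimal Python sketch of how such a rule could be automated. The 15/60 thresholds and the single-switch exception come from the description above; the data model (Operation, decisions_from_one_switch) and the severity labels are my own illustrative assumptions, not the SQALE definition.

from dataclasses import dataclass

@dataclass
class Operation:
    name: str
    vg: int                         # cyclomatic complexity of this operation
    decisions_from_one_switch: int  # V(G) contribution of its largest CASE/SWITCH (assumed available)

MINOR_LIMIT = 15   # first threshold quoted above
MAJOR_LIMIT = 60   # second threshold quoted above

def check_vg(op: Operation) -> str:
    """Classify one operation against the V(G) check point."""
    if op.vg <= MINOR_LIMIT:
        return "compliant"
    # Exception sketched in b): tolerate a high V(G) when essentially all of it
    # comes from a single CASE ... OF / SWITCH construct.
    if op.decisions_from_one_switch >= op.vg - 1:
        return "accepted non-compliance (single switch pattern)"
    return "non-compliant (minor)" if op.vg <= MAJOR_LIMIT else "non-compliant (major)"

if __name__ == "__main__":
    ops = [
        Operation("parse_header", vg=7, decisions_from_one_switch=0),
        Operation("dispatch_message", vg=42, decisions_from_one_switch=41),
        Operation("update_state", vg=80, decisions_from_one_switch=12),
    ]
    for op in ops:
        print(op.name, "->", check_vg(op))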

Another example is copy/paste. SQALE detects a non-compliance for identical code above 100 tokens. That is about 5-15 lines of code, depending on the language, IDENTICAL in two or more places. What we have observed is that there are clear differences between projects: some projects tend to have large copy/paste ratios, while other projects (including the two we analysed in our Ada paper) have really low copy/paste. We do know which has the better quality. Again we get challenges on the usefulness of the copy/paste check point, although most quality plans identify copy/paste as a non-compliant practice (a rough detection sketch follows this list):
a) "Copy/paste does not in fact increase the number of defects." SQALE's point of view is that copy/paste increases the test effort and reduces testability, so it is a non-compliance. Copy/paste may increase defects, but the literature is not clear. It also reduces maintainability, since a defect or change in one pasted instance must be corrected in every instance; but SQALE's model puts testability before maintainability, so it counts copy/paste only once.
b) "Copy/paste on generated code makes no sense." SQALE's point of view is that there is no reason people writing code generators should be exempt from the quality rules that apply to hand-crafted code. In fact, we would expect them to do a better job of writing code.
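For illustration only, here is a rough Python sketch of token-based duplicate detection over a 100-token window. The 100-token threshold is the one quoted above; the crude tokenizer and the sliding-window comparison are my own simplifying assumptions (a real tool would hash the windows, merge overlapping matches into maximal duplicated regions, and report each region once).

import re
from collections import defaultdict

DUPLICATE_WINDOW = 100  # identical-token run that triggers a non-compliance

def tokenize(source: str) -> list[str]:
    """Very rough lexer: identifiers, numbers and single punctuation marks."""
    return re.findall(r"[A-Za-z_]\w*|\d+|\S", source)

def duplicate_windows(files: dict[str, str]) -> dict[tuple, list[tuple[str, int]]]:
    """Map each 100-token window to the (file, token offset) pairs where it occurs, keeping duplicates only."""
    seen = defaultdict(list)
    for name, text in files.items():
        tokens = tokenize(text)
        for i in range(len(tokens) - DUPLICATE_WINDOW + 1):
            window = tuple(tokens[i:i + DUPLICATE_WINDOW])
            seen[window].append((name, i))
    # keep only windows found in two or more places
    return {w: occ for w, occ in seen.items() if len(occ) > 1}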

Basically, SQALE is a framework to harmonize and hierarchize quality check points. Each of those check points (and their thresholds) may be challenged. But when performing assessments, the 10 worst pieces of code, identified by their high aggregate SQALE index, are of much poorer quality than the remainder of the software, as assessed by humans. You can try it yourself. Deliberately, SQALE has been made freely available so that additional, independent results can be obtained. Several tool vendors have implemented SQALE to complement their measurement points, and the SQALE authors have no commercial ties to tool vendors.
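To show the shape of that computation, here is a minimal Python sketch under my own assumptions: each type of non-compliance is given a remediation cost (the costs below are invented for the example), the per-file index is the sum of those costs, and files are ranked so the worst can be compared with a human assessment. The real SQALE method defines its remediation costs and aggregation rules in the specification; this is only an illustration.

from collections import Counter

# hypothetical remediation costs, in minutes, per type of non-compliance
REMEDIATION_COST = {
    "vg_above_15": 60,
    "vg_above_60": 240,
    "copy_paste_block": 120,
}

def sqale_index(findings: list[str]) -> int:
    """Sum the remediation cost of all findings for one file (unknown kinds are ignored in this sketch)."""
    counts = Counter(findings)
    return sum(REMEDIATION_COST.get(kind, 0) * n for kind, n in counts.items())

def worst_files(findings_per_file: dict[str, list[str]], n: int = 10) -> list[tuple[str, int]]:
    """Return the n files with the highest aggregate index."""
    indexed = {f: sqale_index(fs) for f, fs in findings_per_file.items()}
    return sorted(indexed.items(), key=lambda kv: kv[1], reverse=True)[:n]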

Peter, any comments or suggestions for improvement from you would be really helpful. Or from this mailing list ;-).

Thierry Coq
PS. All opinions are my own, not necessarily those of my employer.

-----Original Message-----
From: systemsafety-bounces at techfak.uni-bielefeld.de [mailto:systemsafety-bounces at techfak.uni-bielefeld.de] On Behalf Of Peter Bernard Ladkin
Sent: mercredi 19 juin 2013 17:07
To: systemsafety at techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Qualifying SW as "proven in use" [Measuring Software]

Thierry,

is there any objective evidence that using SQALE improves the quality of the code?

If yes, is there any objective evidence that using SQALE improves the quality of high-quality code written by experienced critical-system programmers?

If yes, where would that be found in the engineering literature?

PBL

Prof. Peter Bernard Ladkin, Faculty of Technology, University of Bielefeld, 33594 Bielefeld, Germany
Tel+msg +49 (0)521 880 7319  www.rvs.uni-bielefeld.de




_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE

