[SystemSafety] Thorough Test Suite?

Dewi Daniels dewi.daniels at software-safety.com
Sun Nov 17 09:44:34 CET 2019


Martyn,

That link doesn't work for me. When I Google Crosstalk Magazine, I get
http://crosstalkonline.org/, which is an expired domain. While there is a
copy of the article available online at
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.638.6456&rep=rep1&type=pdf,
this isn't an official copy.

I contested the claims made in the article concerning DO-178B in a post on
the old Safety Critical mailing list that used to be hosted by the
University of York. I also tried contacting Andy German, who now works for
Atkins, but got no response.

As I remember, Crosstalk was not a refereed journal. In any case, the fact
that a claim is made in an article doesn't necessarily mean that it's true.

The static analysis described in the article was conducted by Lloyd's
Register and by Aerosystems in about 1996. Lloyd's Register was the lead
organisation. I was one of the team leaders at Lloyd's Register. While
working for Lloyd's Register, I also spent 6 months at Lockheed writing
part of the safety case for the Lockheed C-130J.

We only looked at software that had been assessed as DO-178B Level A or
Level B because it would have been too expensive to analyse all of the code
on the aircraft. The majority of the software that we analysed had not yet
been through the DO-178B verification process.

I personally analysed the worst C program, which had 500 anomalies per
thousand lines of code. The design didn't match the requirements and the
code didn't match the design. I was told afterwards that the supplier had
never developed safety-critical software before and that they had used a
single contractor to write the code. The software was rejected by the DER
and an alternative product was sourced from another vendor. It's ludicrous to
suppose that software that had been through the full DO-178B verification
process would have a residual defect rate of 500 anomalies per thousand
lines of code.
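
To put that figure in perspective: anomaly density is simply the anomaly
count divided by the code size in thousands of lines, so 500 anomalies per
thousand lines is one anomaly for every two lines of code. A minimal sketch
of the calculation (the function name and the 10,000-line figure are mine,
purely for illustration):

    def anomalies_per_kloc(anomaly_count: int, lines_of_code: int) -> float:
        """Anomaly density expressed per thousand lines of code (KLOC)."""
        return anomaly_count / (lines_of_code / 1000)

    # A hypothetical 10,000-line program would need 5,000 recorded
    # anomalies to reach the 500 per KLOC cited above.
    print(anomalies_per_kloc(5000, 10000))  # 500.0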

It should also be noted that we classified the anomalies we raised into
about seven categories, ranging from a software defect that could cause a
catastrophic failure condition to a spelling mistake in a document.

The combination of the static analysis being conducted in parallel with the
software development and the inclusion of all anomalies found, including
spelling mistakes, is why the reported anomaly rates are so high.

I think there are many conclusions that can be drawn from QinetiQ's
analysis of the C-130J data, but I do not believe the claims made in the
article concerning DO-178B are supported by the data. The article seems to
assume that all of the software had been through the full DO-178B
verification process (i.e. had passed SOI #4) before static analysis was
carried out, but this was not the case.

The data does show that the (single) SPARK program had a lower defect rate
than the average Ada program, and that the average Ada program had a lower
defect rate than the average C program. The data also shows that the
autogenerated code had a lower defect rate than hand-written code.

Yours,

Dewi Daniels | Director | Software Safety Limited

Telephone +44 7968 837742 | Email dewi.daniels at software-safety.com

Software Safety Limited is a company registered in England and Wales.
Company number: 9390590. Registered office: Fairfield, 30F Bratton Road,
West Ashton, Trowbridge, United Kingdom BA14 6AZ


On Thu, 14 Nov 2019 at 17:29, Martyn Thomas <martyn at thomas-associates.co.uk>
wrote:

> Dewi
>
> Yes, that's the paper. It's unfortunate that Andy German drew the
> conclusions he did, if the data was as poor as you say. Why on earth did he
> say that? Shouldn't the paper (which I think is still available on the
> www.stsc.hill.af.mil/Crosstalk website) be withdrawn and a correction
> published?  It would seem to be grossly misleading - it certainly misled me.
>
> Martyn
>
>