[SystemSafety] Qualifying SW as "proven in use" [Measuring Software]

Steve Tockey Steve.Tockey at construx.com
Tue Jul 2 18:28:43 CEST 2013


PBL wrote:

"That sounds like the wrong way to assure SW."

If that's the only thing that's done, then of course it's the wrong
way, even though that's how it's done in the vast majority of
(non-safety-critical) software organizations. As I have said many times:

    'Depending on testing alone as the sole means of assuring software
quality is a lost cause'.

And yes, you can quote me on that  :^)


"I guess it's OK if you don't mind if your SW croaks every hundred or
thousand hours or so. But that is hardly what one might term quality
assurance."

Most commercial software organizations don't actually care because it's
not their problem. Someone else has to suffer the consequences, not the
development organization. And further, the users essentially expect there
to be defects, so they are not surprised when defects appear. I am
constantly amazed by how insensitive people are to defective software.

I'd have to dig to find the reference, but I did read a paper, authored
by a tester at Microsoft, admitting that they have released products with
more than 3,000 known, open (i.e., not repaired) defects. Some people
interpret "Microsoft" on software as a product warning label.


"By which I take it you mean failures are seen. In which case not even the
above applies."

Again, I was commenting on *what normally happens*, not what *should
happen*.



"I had thought that the main point of testing a product which you hope to
be of moderate quality was to make sure you have the requirements right
and haven't forgotten some obvious things about the operating environment."

That depends...

First, it depends on what testing level you're talking about: unit,
integration/component, system, acceptance. What I think you're referring
to is some combination of system testing and acceptance testing.

But beyond that, there are better ways of making sure the requirements
are right, well before the code is even written.


-- steve




-----Original Message-----
From: Peter Bernard Ladkin <ladkin at rvs.uni-bielefeld.de>
Date: Monday, July 1, 2013 8:43 PM
To: Steve Tockey <Steve.Tockey at construx.com>
Cc: Les Chambers <les at chambers.com.au>,
"systemsafety at techfak.uni-bielefeld.de"
<systemsafety at techfak.uni-bielefeld.de>
Subject: Re: [SystemSafety] Qualifying SW as "proven in use" [Measuring
Software]



On 2 Jul 2013, at 00:47, Steve Tockey <Steve.Tockey at construx.com> wrote:

> I think that sounds good in theory but it may not work effectively in
> practice. The issue is that almost all of the test teams I know don't
> have inside (aka "white box") knowledge of the software they are testing.
> They are approaching it purely from an external ("black box")
> perspective. They can't tell if the code has high cyclomatic complexity
> or not.

That sounds like the wrong way to assure SW. If you want to be assured
that the SW is reliable to the average frequency of one SW-caused failure
in 10^X operational hours, you need to observe 3 x 10^X hours of
failure-free operation to be 90% confident of it, under the assumption
that you have perfect failure detection.
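
A rough sketch of where a figure of that order comes from, assuming a
constant failure rate (exponential model), perfect failure detection and
zero observed failures; the function name and the 10^4 target below are
illustrative only:

    import math

    def failure_free_hours_needed(mtbf_hours, confidence):
        # Failure-free operating hours needed to claim, at the given
        # confidence, that the true failure rate is no worse than
        # 1/mtbf_hours. If the rate really were 1/mtbf_hours, the chance
        # of seeing zero failures in T hours is exp(-T/mtbf_hours), so we
        # require exp(-T/mtbf_hours) <= 1 - confidence.
        return mtbf_hours * math.log(1.0 / (1.0 - confidence))

    print(failure_free_hours_needed(1e4, 0.90))  # ~2.3 x 10^4 hours
    print(failure_free_hours_needed(1e4, 0.95))  # ~3.0 x 10^4 hours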

I guess it's OK if you don't mind if your SW croaks every hundred or
thousand hours or so. But that is hardly what one might term quality
assurance.

> In principle, testers should be given the authority to reject a piece of
> crap product. That's their job in all other industries. But in software
> (non-critical, mind you), testing is usually window dressing that gets
> overridden if it threatens promised ship dates.

By which I take it you mean failures are seen. In which case not even the
above applies.

I had thought that the main point of testing a product which you hope to
be of moderate quality was to make sure you have the requirements right
and haven't forgotten some obvious things about the operating environment.

PBL

Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited


