[SystemSafety] Qualifying SW as "proven in use" [Measuring Software]

Les Chambers les at chambers.com.au
Mon Jul 1 23:36:30 CEST 2013


Steve
One way to achieve this is to empower test teams. Management issues an encyclical: SOFTWARE IS NOT A GIVEN. That is, "If it's too complex to test effectively, reject it. Don't waste your time composing feckless tests for crap software. Send it back to its heathen authors. Kill it before it gets into production."
Les

On 02/07/2013, at 3:16 AM, Steve Tockey <Steve.Tockey at construx.com> wrote:

> 
> Martyn,
> My preference would be that things like low cyclomatic complexity be considered basic standards of professional practice, well before one even started talking about a safety case. Software with ridiculous complexities shouldn't even be allowed to start making a safety case in the first place.
> 
> 
> -- steve
> 
> 
> From: Martyn Thomas <martyn at thomas-associates.co.uk>
> Reply-To: "martyn at thomas-associates.co.uk" <martyn at thomas-associates.co.uk>
> Date: Monday, July 1, 2013 10:04 AM
> Cc: "systemsafety at techfak.uni-bielefeld.de" <systemsafety at techfak.uni-bielefeld.de>
> Subject: Re: [SystemSafety] Qualifying SW as "proven in use" [Measuring Software]
> 
> Steve
> 
> It would indeed be hard to make a strong safety case for a system whose software was "full of defects". 
> 
> High cyclomatic complexity may make this more likely, and if a regulator wanted to insist on low complexity as a certification criterion, I doubt that many would complain. Simple is good - it reduces costs, in my experience.
> 
> But if a regulator allowed low complexity as evidence of an acceptably low defect density, as part of a safety case, then I'd have strong reservations.  Let me put it this way: if there's serious money to be made by developing a tool that inputs arbitrary software and outputs software with low cyclomatic complexity, there won't be a shortage of candidate tools - but safety won't improve. And if you have a way to prove, reliably, that the output from such a tool is functionally equivalent to the input, then that's a major breakthrough and I'd like to discuss it further.
> 
> Martyn
> 
> On 01/07/2013 17:18, Steve Tockey wrote:
>> Martyn,
>> 
>> "The safety goal is to have sufficient evidence to justify high
>> confidence that the software has specific properties that have been
>> determined to be critical for the safety of a particular system in a
>> particular operating environment."
>> 
>> Agreed, but my fundamental issue is (ignoring the obviously contrived
>> cases where the defects are in non-safety related functionality) how could
>> software--or the larger system it's embedded in--be considered "safe" if
>> the software is full of defects? Surely there are many elements that go
>> into making safe software. But just as surely, IMHO, the quality of that
>> software is one of those elements. And if we can't get the software
>> quality right, then the others might be somewhat moot?
> 
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
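[Archive note: Martyn's concern about tools that mechanically lower cyclomatic complexity can be illustrated with a small sketch. Cyclomatic complexity V(G) for a single-entry, single-exit function is the number of decision points plus one. The example below (function names invented for illustration) shows one function with V(G) = 4 split into helpers each with V(G) <= 2; the behaviour, and any defects, are unchanged, yet a per-function complexity gate is satisfied.]

```python
# A function with three decision points: V(G) = 3 + 1 = 4.
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    elif x < 10:
        return "small"
    else:
        return "large"

# What a hypothetical "complexity-lowering tool" might emit: each
# helper has at most one decision point (V(G) <= 2), so every
# function passes a low-complexity threshold, but the overall
# control logic, and its defect density, is exactly the same.
def _sign(x):
    return "negative" if x < 0 else None

def _zero(x):
    return "zero" if x == 0 else None

def _size(x):
    return "small" if x < 10 else "large"

def classify_split(x):
    # Chain the helpers; the first non-None answer wins.
    return _sign(x) or _zero(x) or _size(x)
```

The two versions are functionally equivalent on all inputs, which is precisely why a complexity metric applied per function cannot, on its own, serve as evidence of low defect density.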

