[SystemSafety] Scalability of Formal Methods [was: Logic]

Peter Bishop pgb at adelard.com
Thu Feb 27 14:03:37 CET 2014


I think Patrick has a point.
To demonstrate safety you need to justify every step in the chain
(a small sketch follows the list):

requirements
requirements to code
code to object code
runtime behaviour of the chip
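
As a toy illustration of why each step needs its own argument, consider
the C fragment below (the requirement and all names in it are invented
for the example). A proof over the source semantics discharges the
assertion, but by itself says nothing about the object code the compiler
emits or about what the chip does at runtime:

/* Invented requirement: "the demanded value never exceeds MAX_DEMAND".
 * A source-level proof can discharge the assert below, but that proof
 * is over the language semantics -- separate arguments are still needed
 * that the compiler preserved the property in the object code and that
 * the chip executes the object code correctly. */
#include <assert.h>

#define MAX_DEMAND 100

int clamp_demand(int demand)
{
    if (demand > MAX_DEMAND)
        demand = MAX_DEMAND;
    assert(demand <= MAX_DEMAND);  /* provable from the source semantics */
    return demand;
}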

For example, there was a formal proof of the requirements-to-code and
code-to-object-code steps for the Sizewell B primary protection system,
and the Viper chip was an attempt to prove correctness at the chip level.

But even if the whole chain is proved correct, to justify safety we also
need to show that the hardware/software system is robust to malfunction
(e.g. the single event upset that might have caused a malfunction in the
Toyota engine controller).
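
By way of illustration only (this is a generic pattern, not the
mechanism of either system mentioned above), one common software defence
against such upsets is to store safety-critical state redundantly and
check it before use:

#include <stdint.h>

typedef struct {
    uint32_t value;
    uint32_t check;   /* stored as the bitwise complement of value */
} guarded_u32;

static void guarded_set(guarded_u32 *g, uint32_t v)
{
    g->value = v;
    g->check = ~v;
}

/* Returns 0 and writes *out on success; returns -1 if the redundant
 * copies disagree, e.g. because a bit flip corrupted one of them, so
 * the caller can fail safe instead of acting on bad data. */
static int guarded_get(const guarded_u32 *g, uint32_t *out)
{
    if (g->value != (uint32_t)~g->check)
        return -1;
    *out = g->value;
    return 0;
}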

Peter Bishop
Adelard

Patrick Graydon wrote:
> On 26 Feb 2014, at 22:33, Peter Bernard Ladkin <ladkin at rvs.uni-bielefeld.de> wrote:
> 
>> Fetzer is a red herring. His argument rests on an equivocation and admits of a trivial counterexample. Proved-correct is not a unary predicate of programs. What does "correct" mean, when speaking of programs? Answer: that it does what you want. So you have to have some idea of what you want, some specification. And then you may ask whether the program is correct wrt this specification. Or, as we may say, whether the program fulfils the specification. Correctness is (at least) a binary predicate, of which one argument is a program and another a specification of some kind, a saying-what-is-wanted. 
>>
>> Fetzer claims to show that programs can never be proved correct. Here is a simple, trivial counterexample to that claim. Consider the specification "P OR (NOT P)", where "P" is any proposition you care to express. Constructivists, who may doubt excluded middle, may rather consider "NOT P OR (NOT NOT P)". Call this specification S. Take any program you like: Prog. There is a rigorous proof that Prog fulfils S. For the classical statement, the proof is a couple lines long (think of it as the two lines of the truth table); for the constructivist about double that.
>>
>> See? Logic helps. Here, in showing why somebody who is arguing that program verification should lose its research funding is canonically mistaken.
>>
>> Not that the CACM would print such a refutation. The editor's comment when I sent it in was that Fetzer's article was somewhat old (a couple of years); did I have a more contemporary contribution to CACM? That's when I realised the marketing people had taken over and I let my membership lapse.
> 
> My point in bringing up Fetzer’s article is that what I actually want to know is what my software will cause real chips to do in the real world.  What you are telling me is that you can prove what the software means according to idealised semantics.  I’m not even going to address that, because those are two different things … which is what I understood Fetzer to be saying.  Unless you can somehow deductively prove the behaviour of a real, physical chip?
> 
> I don’t disagree that knowing what software will do according to idealised semantics is useful.  But safety depends on knowing what will happen in the real world on real chips.  And if you can’t prove that, then you need to bring in a different form of evidence to complete the picture.  Some sort of inference based on observation.  Testing, in other words.
> 
> As to Martyn’s point, I care much less about whether responsibility for completing the evidence about what the software will cause real chips to do in the real world lies with a person who calls himself or herself a software engineer or any other job title (or combination of titles).  I care that the evidence *is* completed by someone.  I think showing that the software is correct according to some abstract semantics is a useful goal … provided that somebody covers the other gaps.  That begins with being clear that those gaps exist.  Since I have met several people who seemed to think that software proof was going to tell them what the real chips would do in all circumstances, I think a little reminder of what it does mean is useful from time to time.
> 
> — Patrick
> 
> Dr Patrick John Graydon
> Postdoctoral Research Fellow
> School of Innovation, Design, and Engineering (IDT)
> Mälardalens Högskola (MDH), Västerås, Sweden
> 
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
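
Incidentally, the trivial counterexample above is easy to machine-check.
A minimal sketch in Lean 4 follows (the theorem names are mine; for the
constructive case I prove the double-negated form, which goes through
without appeal to excluded middle):

-- Spec S = "P OR (NOT P)": every program fulfils S, because S is a
-- tautology -- the proof does not mention the program at all.
theorem spec_S (P : Prop) : P ∨ ¬P :=
  Classical.em P

-- Constructive variant: the double-negated form is provable without
-- excluded middle.
theorem spec_S_dn (P : Prop) : ¬¬(P ∨ ¬P) :=
  fun h => h (Or.inr (fun p => h (Or.inl p)))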

-- 

Peter Bishop
Chief Scientist
Adelard LLP
Exmouth House, 3-11 Pine Street, London, EC1R 0JH
http://www.adelard.com
Recep:  +44-(0)20-7832 5850
Direct: +44-(0)20-7832 5855

