[SystemSafety] Fwd: Re: Correcting the SECOND screw-up (grovelling apologies)

Olwen Morgan olwen at phaedsys.com
Tue Jul 14 17:52:59 CEST 2020


On 14/07/2020 13:48, Roberto Bagnara wrote:
> The code below is affected by undefined behavior, e.g., in the case:
>
>   sizeof(char) == sizeof(int) == 1
>   signedness of plain char == signed
>
> independently of the integer representation adopted.
> An example of such an architecture is TMS320C28x (where both chars and
> ints are 16 bits wide).  On that architecture, after a few iterations
> the loop will be entered with ch1 == 2730 and ch2 == 10922, so that
> the left shift by 2 positions (ch2 << 2) would have to compute
> 10922 x 2^2 = 43688, which does not fit into int, i.e., into the result
> type, whence the undefined behavior.
>
And thank you for redirecting me to the standard again. I misread my 
copy of C99. You're right that the relevant behaviour is stated to be 
undefined rather than unspecified, which is what I had originally said. 
In this case, however, the term "undefined" merely hides a cop-out. The 
standard could easily define the results for 1C, 2C, and SM 
(one's-complement, two's-complement, and sign-magnitude) representations 
and leave others undefined - but as far as I can see, WG14 simply 
couldn't be bothered. Your TMS320C28x example is the first I've been 
told of in which the program would not do its job. I'll try to fix the 
program.

The point of the example, however, is still borne out: it is 
possible to write a simple program that, *despite its serving an 
unambiguous purpose and being readily testable*, nevertheless creates a 
problem for a verifier because the relevant behaviour is either 
unspecified or undefined. In these cases it doesn't much matter whether 
the standard says it's unspecified or undefined. If unspecified, a 
prover would need to be told what the signedness is in its configuration 
files, so a proof that the program determines signedness, though 
technically possible, is nevertheless redundant. In the undefined case, 
the prover has no basis on which to complete a proof at all.

That's why I originally called them "Gödelian oddities".
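
To make the point concrete, here is a minimal sketch of the kind of 
signedness probe I have in mind - not the program from my earlier post, 
just an illustrative stand-in showing where the undefined behaviour 
bites:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    char ch = 2;    /* bit pattern 0b10 */
    int  i;

    /* Fill ch with the alternating pattern ...101010, two bits at a time. */
    for (i = 0; i < CHAR_BIT / 2 - 1; i++) {
        /* (ch << 2) is computed at type int after the integer promotions.
         * With 8-bit char and 32-bit int every intermediate value fits and
         * the only latitude is the implementation-defined conversion back
         * to char.  Where sizeof(int) == 1 and CHAR_BIT == 16 (TMS320C28x),
         * the iteration that shifts 10922 computes 43688, which exceeds
         * INT_MAX == 32767: undefined behaviour, C99 6.5.7. */
        ch = (char)((ch << 2) | 2);
    }

    /* If the pattern set the top bit and made ch negative, plain char is
     * signed on this implementation. */
    printf("plain char looks %s here\n", (ch < 0) ? "signed" : "unsigned");
    return 0;
}

The purpose is unambiguous and the program is trivially testable on any 
desktop C implementation, yet a prover confronted with it must either be 
configured with the very facts the program exists to discover or give up 
in the face of the undefined shift.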


> Two working groups that,
>     apparently, you can insult with impunity on this list.  Ah, I am
>     also an FM researcher and practitioner: bingo!)
>
... and not without just cause .... They've done f*ck all to help 
engineers of critical embedded systems by tightening up the C standard 
to support the development of provers. That leaves you with the need 
to *assume* "approximate" semantics as, for example, Astree does. Here, 
IMO, the eminently sensible French developers of Astree realised they 
could never know precisely what semantics the language standard actually 
intends in areas of implementation-dependency, so they got as close as 
they could and claimed no more than an approximation.

In standardisation you have two options: either you standardise a thing 
itself or you standardise how it is described. Given the latitude that 
the C standard allows the implementor, C direly needs a normative 
annexe specifying the form of the statements that describe 
implementation-defined, unspecified and undefined matters. It also 
needs a much clearer technical basis for choosing whether something is 
implementation-defined, unspecified, or undefined. For this purpose, 
the original Pascal classification of implementation-defined, 
implementation-dependent and erroneous is much better, especially if it 
is accompanied by requirements for the form of descriptions of such 
features. But if you went that way, the developers of C compilers would 
have to start producing much better documentation than they currently 
do. ...
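
To make the three categories concrete, here is one textbook instance of 
each - purely illustrative, not drawn from any particular compiler's 
documentation - of the sort of thing such an annexe would oblige an 
implementor to describe in a prescribed form:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* Implementation-defined: whether plain char can hold negative
     * values, i.e. whether CHAR_MIN is 0 or negative (C99 6.2.5).
     * The implementation must document its choice. */
    printf("CHAR_MIN = %d\n", CHAR_MIN);

    /* Unspecified: the order in which the two calls to puts() are
     * evaluated (C99 6.5.2.2).  Either order is conforming and the
     * implementation need not say which it takes. */
    printf("%d %d\n", puts("first?"), puts("second?"));

    /* Undefined: signed integer overflow (C99 6.5).  The standard
     * places no requirement whatsoever on what happens next. */
    int n = INT_MAX;
    n = n + 1;

    return 0;
}

Three categories, three quite different documentation burdens on the 
implementor - which is exactly where prescribed wording for the 
descriptions would earn its keep.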

... And, as we all know, the production of concise and precise technical 
English persistently eludes the degenerate spawn of immigrants, slaves, 
and mutinous colonists who hold sway over WG14.




