[SystemSafety] "Ripple20 vulnerabilities will haunt the IoT landscape for years to come"

Roderick Chapman rod at proteancode.com
Thu Jul 2 13:29:08 CEST 2020


On 01/07/2020 17:10, Olwen Morgan wrote:
> I'll defer to those more familiar with the Ada RM but I'd be very 
> surprised if the language specification were entirely free from 
> lacunae related to implementation-defined features that could, at 
> least in principle, lead to similar problems. Does the RM tie down the 
> compiler strictly enough to preclude different instances of 
> implementation-defined constructs behaving differently in different 
> places?

The Ada RM defines three levels-of-evil when it comes to ambiguous
features. These pose different challenges for verification tools. The
levels are:

1. "Implementation-defined". This is something that is left to an 
implementation to choose, but it must be documented and a compiler must 
make a single choice that never changes - essentially it's always the 
same for all versions of the same compiler, and will not change with 
optimization and so on. Obvious example: the range of values of the 
predefined type "Integer".  Things like this are unlikely to change for 
the lifetime of a project, unless you change (say) the target machine 
ISA and the compiler.  Static analysis tools can be "configured" 
(hopefully just once) to "know" these values.
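
For illustration, a tiny (hypothetical) example - the bounds can be
queried through attributes, and a static analysis tool has to be told
the same values:

   --  Hypothetical sketch: the bounds of the predefined type Integer
   --  are implementation-defined, but documented and fixed for a given
   --  compiler and target.
   with Ada.Text_IO; use Ada.Text_IO;

   procedure Show_Integer_Bounds is
   begin
      --  Typically -2**31 .. 2**31 - 1 on a 32-bit target; the RM only
      --  requires the range to include -(2**15 - 1) .. 2**15 - 1.
      Put_Line ("Integer'First = " & Integer'Image (Integer'First));
      Put_Line ("Integer'Last  = " & Integer'Image (Integer'Last));
   end Show_Integer_Bounds;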

2. "Unspecified" - Compiler has a choice from a set of behaviors but the 
set might be very large, and the compiler is under no obligation to 
document its behaviour. Compilers are expected to produce binary 
equivalence for the same program, but behaviour might change given an 
arbitrarily small change in a program. These things can also be 
context-dependent, so changing line N of a program can change the 
behaviour of line N+1. Obvious examples in Ada: expression evaluation 
order, and parameter passing mechanism for "in out" mode record types.
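
To make the evaluation-order point concrete, a small hypothetical
example (full Ada, not SPARK) - a legal compiler may print 11 or 1,
and the choice may change with the optimization level or with an
unrelated edit elsewhere in the program:

   --  Hypothetical sketch: F and G both modify Counter, so the value
   --  of F + G depends on which call is evaluated first - and Ada
   --  leaves that order unspecified.
   with Ada.Text_IO; use Ada.Text_IO;

   procedure Eval_Order is
      Counter : Integer := 0;
      Sum     : Integer := 0;

      function F return Integer is
      begin
         Counter := Counter + 1;
         return Counter;
      end F;

      function G return Integer is
      begin
         Counter := Counter * 10;
         return Counter;
      end G;
   begin
      Sum := F + G;   --  11 if F is evaluated first, 1 if G is
      Put_Line ("Sum = " & Integer'Image (Sum));
   end Eval_Order;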

3. "Erroneous" (This is equivalent to "undefined behaviour" in C). 
Anything goes, and all bets are off! Unrecoverable, in that once your 
program becomes erroneous then you can't recover to a well-defined 
state. Example in Ada: accessing an array outside its bounds if you have 
chosen to compile with runtime checks disabled.
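
A hypothetical sketch of that last case, with the index check
suppressed so the out-of-range write goes undetected:

   --  Hypothetical sketch: with Index_Check suppressed, the failing
   --  check is not performed, and execution becomes erroneous -
   --  anything at all may happen from this point on.
   with Ada.Command_Line;

   procedure Erroneous_Index is
      pragma Suppress (Index_Check);

      type Buffer is array (1 .. 4) of Integer;
      B : Buffer := (others => 0);

      --  5 at run time (with no arguments), but not a static constant,
      --  so the compiler cannot simply reject the program.
      I : constant Integer := Ada.Command_Line.Argument_Count + 5;
   begin
      B (I) := 42;   --  out of bounds: erroneous, all bets are off
   end Erroneous_Index;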

The approach taken by SPARK is essentially:

1. Implementation-defined behaviours are known to the toolset by 
default, or can be overridden through a simple configuration file.

Some care is needed here to get these things right, but once you've got 
it right, they are very unlikely to change for the lifetime of a project.

2. Unspecified features are either eliminated by simple subsetting, or 
are rendered harmless by some combination of subsetting and analysis. 
For example - any expression evaluation order is allowed and valid in 
SPARK (so a compiler can do anything it likes), but several other rules 
in the language work together to ensure that the expression always 
evaluates to the same value, so it doesn't matter.
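
As a hypothetical illustration of why the order cannot matter in
SPARK - functions may not write global state, so the operands of an
expression cannot interfere with each other:

   --  Hypothetical SPARK sketch: F and G are side-effect-free
   --  expression functions, so F (X) + G (X) has the same value
   --  whichever operand the compiler evaluates first.
   package Eval_Order_Spark with SPARK_Mode is

      subtype Small is Integer range 0 .. 1000;

      function F (X : Small) return Integer is (X + 1);
      function G (X : Small) return Integer is (X * 2);

      --  Any evaluation order gives the same result; the bound on
      --  Small also makes freedom from overflow easy to prove.
      function Sum (X : Small) return Integer is (F (X) + G (X));

   end Eval_Order_Spark;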

3. Erroneous behaviour. Eliminated by subsetting and/or various
analyses. For example: unchecked buffer overflow is eliminated by
data-flow analysis (to ensure variables are initialized), then
theorem-proving (to show that index expressions are in range), and so on.
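
A hypothetical sketch of the kind of code this supports - the index
subtype carries the bound, flow analysis insists the buffer is
initialized before use, and the prover discharges the index checks
statically:

   --  Hypothetical SPARK sketch: no unchecked buffer overflow is
   --  possible, even if the final build suppresses runtime checks.
   package Safe_Buffer with SPARK_Mode is

      subtype Index is Positive range 1 .. 64;
      type Buffer is array (Index) of Integer;

      --  Mode "out" obliges the body to initialize Buf fully; flow
      --  analysis checks that it does.
      procedure Fill (Buf : out Buffer; Value : Integer)
        with Post => (for all J in Index => Buf (J) = Value);

      --  I is of subtype Index, so the prover can show that Buf (I)
      --  is always within the array's bounds.
      function Element (Buf : Buffer; I : Index) return Integer is
        (Buf (I));

   end Safe_Buffer;

   package body Safe_Buffer with SPARK_Mode is

      procedure Fill (Buf : out Buffer; Value : Integer) is
      begin
         Buf := (others => Value);   --  whole-array aggregate
      end Fill;

   end Safe_Buffer;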

  - Rod

