[SystemSafety] OpenSSL Bug

Peter Bernard Ladkin ladkin at rvs.uni-bielefeld.de
Mon Apr 14 22:41:33 CEST 2014


I find a discussion about "empirical evidence" beside the point.

Suppose it is known that people make lots of mistakes of type X. Suppose technical methods T are known to avoid, definitively, mistakes of type X, and that T are practical. Suppose there is an area of engineering, SCS (say, safety-critical systems), in which mistakes of type X have potentially very serious consequences that we wish to avoid.

Then we say: in SCS, using T is essential/best practice/the way to avoid lawsuits/whatever.
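
For context on the subject line: the OpenSSL bug under discussion in April 2014 is the Heartbleed flaw, a missing bounds check on a peer-supplied length field. Here is a minimal C sketch of what a mistake of type X, and a method T that eliminates it, might look like in that setting. The routine and its names are invented for illustration; this is not OpenSSL's code.

#include <stdint.h>
#include <string.h>

/* Hypothetical illustration of a Heartbleed-class mistake: the record
 * carries a length field chosen by the peer, and copying that many
 * bytes without checking it against the actual payload size over-reads
 * the buffer and leaks adjacent memory. */
void echo_unchecked(const uint8_t *payload, size_t payload_len,
                    uint16_t claimed_len, uint8_t *out)
{
    (void)payload_len;                 /* ignored: that is the mistake */
    memcpy(out, payload, claimed_len); /* mistake of type X */
}

/* The same routine with the check that removes the mistake class:
 * reject any claimed length the payload cannot cover. */
int echo_checked(const uint8_t *payload, size_t payload_len,
                 uint16_t claimed_len, uint8_t *out)
{
    if (claimed_len > payload_len)
        return -1;                     /* method T: validate before copying */
    memcpy(out, payload, claimed_len);
    return 0;
}

A check of this kind does not merely reduce the frequency of the mistake; it removes the mistake class outright, which is the sense of "definitively" above.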

What would be the relevance of any "empirical evidence" that some subset of T is "effective" in avoiding X?

Since programming is a human endeavor, any "empirical evidence" that some set A of programmers in some artificial environment E, using some subset of T, produced programs P with fewer, or marginally fewer, instances of mistakes of type X than some other set B of programmers in E who didn't use T is subject to question on a number of fronts. What training/culture did A and B have in common? How does one determine that all relevant characteristics of E were taken into account? If it is possible to avoid X without using T, how do we know that most people in A weren't already cognisant of how to avoid X while few people in B were, quite independently of using T? Did the people in A and B know they were being assessed on avoiding X? If not explicitly, were they able to infer it covertly? And were the people in A more capable of so inferring than those in B? And how do we determine that people in A and B didn't covertly find out what the point of the test was and decide to vindicate it by, respectively, paying more and paying less attention to what they were doing? You can go on for ever.
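
For concreteness, the comparison such a study reduces to is roughly the arithmetic below: a minimal C sketch with invented defect counts and sample sizes. Note that nothing in this calculation captures training, culture, or covert awareness of the test's purpose; that is the point of the questions above.

#include <math.h>
#include <stdio.h>

/* Pooled two-proportion z-statistic comparing the rate of mistakes of
 * type X in group A (used methods T) and group B (did not). All the
 * numbers fed to it below are invented for illustration. */
static double two_prop_z(int defects_a, int n_a, int defects_b, int n_b)
{
    double pa = (double)defects_a / n_a;
    double pb = (double)defects_b / n_b;
    double p  = (double)(defects_a + defects_b) / (n_a + n_b);
    double se = sqrt(p * (1.0 - p) * (1.0 / n_a + 1.0 / n_b));
    return (pa - pb) / se;
}

int main(void)
{
    /* Hypothetical outcome: A made fewer mistakes of type X than B.
     * The statistic says nothing about whether T caused the
     * difference, or whether training, selection, or awareness of the
     * test did. */
    double z = two_prop_z(4, 50, 11, 50);
    printf("z = %.2f (|z| > 1.96 ~ 'significant' at the 5%% level)\n", z);
    return 0;
}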

It is much easier with statistical methods on human populations to show that something you presumed didn't or shouldn't matter actually does matter. As with much experimentation, demonstrating such a negative is straightforward; proving a positive is almost impossible.

PBL 

Prof. Peter Bernard Ladkin, University of Bielefeld and Causalis Limited

