[SystemSafety] Personal and corporate liabilities as a consequence of safety, security and other mistakes of similar importance

Peter Bernard Ladkin ladkin at causalis.com
Mon Oct 8 21:07:05 CEST 2018


Folks,

I recently wrote a paper with Martyn which is about to be submitted for publication (encouraged
after preliminary consultation with the editor) in a journal on ..... policy. There is a section on
programming culture, which largely consists of anecdotes. They are anecdotes which (I contend) are
persistent (over decades). And they are anecdotes which predict, in some sense, qualities of the
resulting SW.

An example. One such anecdote concerns what I call the "hero programmer". I met this creature in
1979 (at a university), endured its crowning years, 1984-1989 (and personally benefited from it in a
project), encountered it yet again in a (very large) lawsuit at the beginning of the decade, and
have heard of recent instances in a safety-related SW supplier this very year. A colleague suggests
it was previewed (with certain caveats) in Fred Brooks's The Mythical Man-Month.

Software engineering is riven with such phenomena, and they are different from cultural phenomena in
other areas of engineering.

There is virtually no "technically respectable" literature on how the phenomenology of programs
relates to the cultural aspects of programming.

I grokked the phenomena a quarter-century ago as a result of a couple of decades of experience,
resolved to avoid the disadvantages, and personally all but failed in my management activity.

Discussions of these phenomena take place on this list, and presumably in other venues like it. I
am very grateful for the considered contributions of SystemSafetyList contributors to the
discussion. My larger question concerns getting this intuitive knowledge and understanding into the
mainstream. For it is not there. There is no academic or pseudo-academic market that I know of for
technical papers on "programmers and their managers behave <this way>; here are the consequences for
systems".

To reiterate my canonical example: in the 1990s, 80% or more of CERT advisories concerned buffer
overflows in network-visible code. The phenomenon had been avoidable for thirty years by that point.
"Everybody" knew it. But AFAIK there is no definitive paper in which this is observed and acknowledged.

How are we going to get this material into the engineering-scientific literature? It surely cannot
remain as backchat amongst mailing-list subscribers, for it is far too consequential.

PBL

Prof. Peter Bernard Ladkin, Bielefeld, Germany
MoreInCommon
Je suis Charlie
Tel+msg +49 (0)521 880 7319  www.rvs-bi.de




