[SystemSafety] "Safe" and "Unsafe" Languages

Gergely Buday gbuday at gmail.com
Mon Jan 23 08:13:12 CET 2017


Regarding division by zero and similar run-time errors: type-safe languages
(this is the proper terminus technicus) handle them with exceptions. Whether
your program handles those exceptions correctly is another question; there
are tools for analyzing that, at least in the research domain.
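
Rust, discussed below, makes this concrete: it has no exceptions as such,
but the same discipline appears as panics and as Option/Result values. A
minimal sketch using the standard library's checked_div, which turns a
would-be division-by-zero panic into an ordinary value the program must
handle:

    fn main() {
        let (numerator, divisor) = (10i32, 0i32);
        // checked_div returns None instead of panicking when the divisor
        // is zero, so the error becomes a value the program must handle.
        match numerator.checked_div(divisor) {
            Some(q) => println!("quotient = {}", q),
            None => println!("division by zero handled at run time"),
        }
    }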

On 23 Jan 2017 at 06:42, "Peter Bernard Ladkin" <ladkin at causalis.com> wrote:

> There is an article in ACM TechNews from last week that talks about the
> programming language Nim, which apparently has the language Rust's
> "safe-by-default" behaviour (it will be item 3 in Risks 30.10 when this
> edition goes up on the Risks Forum Digest WWW site). I think it was about
> a decade ago that I read a short note from Andy Wellings which talked
> about a "safe" programming language.
>
> Apparently, it is common parlance amongst the programming language
> community to talk about "safe"
> languages. Is that right?
>
> Here is an example from Rust:
> https://doc.rust-lang.org/nomicon/meet-safe-and-unsafe.html
> They appear to mean something such as: write your programs in Rust and
> they'll not have run-time errors. I suspect a much weaker claim is
> justified, namely that there are different kinds of "X-safe", for example
> type-safe and memory-safe, and Rust is type-safe and maybe memory-safe.
> Pascal and Modula were type-safe; I don't know exactly what memory-safe
> means, but presumably it includes the absence of memory leaks. Note that
> it doesn't make complete sense to talk about a programming language as
> being type-safe simpliciter: whether a concrete program is type-safe also
> depends on whether your compiler reliably preserves the semantics of
> types.
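>
> A brief sketch of what Rust's safe/unsafe split amounts to in practice
> (illustrative only, not code taken from the cited page): safe Rust checks
> array indices at run time, while an unsafe block lets the programmer take
> over that obligation.
>
>     fn main() {
>         let v = vec![1, 2, 3];
>
>         // Safe Rust: an out-of-range index is checked; the next line,
>         // if uncommented, panics deterministically rather than reading
>         // arbitrary memory.
>         // let _x = v[10];
>
>         // Unsafe Rust: get_unchecked skips the bounds check, and the
>         // programmer assumes the obligation that the index is in range.
>         // With an out-of-range index this would be undefined behaviour.
>         let y = unsafe { *v.get_unchecked(1) };
>         println!("{}", y);
>     }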
>
> It is not clear to me what calling a language "safe" means in terms of
> concrete scientific assertions about the language. For example, I doubt
> it is practically possible to have a general-purpose language all of
> whose valid programs are guaranteed to be free of run-time errors. You'd
> have to ensure, for instance, that any expression used as the denominator
> in a division evaluates to a digitally non-zero value; there are
> industrially-practical ways of doing that for specific programs using
> Hoare logic, but I know of no way of ruling it out a priori in the
> definition of a practical language.
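>
> For a single interface, the non-zero precondition can be pushed into the
> type system, as the Rust sketch below shows; but that is a per-interface
> discipline, not a language-wide guarantee of freedom from run-time
> errors.
>
>     use std::num::NonZeroI32;
>
>     // The precondition "divisor is non-zero" is encoded in the argument
>     // type: a caller must establish it once, when constructing the value.
>     fn div(n: i32, d: NonZeroI32) -> i32 {
>         n / d.get() // cannot divide by zero here
>     }
>
>     fn main() {
>         // NonZeroI32::new returns None for zero, forcing the check.
>         if let Some(d) = NonZeroI32::new(4) {
>             println!("{}", div(12, d));
>         }
>     }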
>
> I see no problems with the use of terms such as type-safe or (if it has a
> concrete meaning)
> memory-safe. They clearly refer to specific properties and there is little
> or no chance of confusion
> with the general term "safe" as used in functional safety (although many
> people such as myself do
> not use that term unqualified; I prefer to speak of "adequately safe"
> rather than "safe"
> simpliciter, because safety is a matter of degree, not of a binary
> property which a system has or
> does not have).
>
> There is another use of the term "safety", in "safety property" in
> concurrency theory. Here, there
> is a theorem that any program property may be expressed as a conjunction
> of a "safety property" with
> a "liveness property". A safety property is an invariant, "always X"; a
> liveness property is a
> task-progression assertion "always eventually Y". They are often expressed
> as saying "nothing bad
> [that would be NOT-X] ever happens" and "something good (Y) always
> eventually happens". Here, the
> relevant term is "safety property", and people tend to talk about
> "fulfilling a safety property"
> rather than saying "safe".
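>
> In the usual temporal-logic notation (a sketch; the decomposition theorem
> is due to Alpern and Schneider), the informal readings above correspond
> to:
>
>     % Every program property P factors as a conjunction
>     %   P = S /\ L
>     % of a safety property S (an invariant) and a liveness property L:
>     P \;=\; S \wedge L, \qquad
>     S \;\equiv\; \Box X \quad \text{(``always $X$'': nothing bad happens)},
>     \qquad
>     L \;\equiv\; \Box\Diamond Y \quad \text{(``always eventually $Y$'')}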
>
> Every so often, and quite regularly, I come across a confusing use of the
> term "safe" by concurrency/algorithm/logical-program-reliability
> specialists, in which the correctness of a program is called "safety"
> because the program performs some important function. I often object that
> such people are talking about the reliability of a program, its ability
> to do what is specified of it, and not about safety, which is a system
> property partially and essentially defined by the environment in which
> the program runs. Sometimes this is acknowledged graciously; sometimes
> not. The people who use the term in this way are often distinguished
> computer scientists, so I do see it as important to say something, and am
> disappointed when the response is a metaphorical shrug.
>
> Standardised functional-safety conceptions don't necessarily help avoid
> this confusion between
> safety and reliability. Functional safety is assured in IEC 61508 through
> reliability conditions on
> specific functions introduced to mitigate a hazard with unacceptable risk.
>
> The phenomenon is probably difficult to avoid socially. Twenty years ago,
> in CERT's early days, when it was still possible to count
> vulnerabilities, some 80% of vulnerabilities were said to be due to
> buffer overflows, which are avoided if source code is written in a
> strongly-typed language and the compiler enforces strong typing
> correctly. Calling such languages "safe" comes from knowing that programs
> written in the language and compiled with a semantics-preserving compiler
> are *free from such unwanted behaviour*. For functional-safety people,
> the "unwanted behaviour" is that which causes possible "harm", and "harm"
> is damage to people, property, the environment, what have you, outside
> the vocabulary of the system itself.
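>
> To make the contrast concrete: in a memory-safe language the classic
> overflow turns into a checked, deterministic outcome rather than silent
> memory corruption. A small Rust sketch:
>
>     fn main() {
>         let buf = [0u8; 8];
>         let i = 12; // out-of-range index, e.g. from untrusted input
>         // buf[i] would panic with a bounds error rather than silently
>         // reading past the buffer (the classic overflow); the
>         // non-panicking form returns an Option instead.
>         match buf.get(i) {
>             Some(b) => println!("byte = {}", b),
>             None => println!("index {} rejected: no overflow", i),
>         }
>     }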
>
> Do people see such memes, cases in which a word with an important
> technical meaning acquires three or four technical meanings as well as a
> colloquial one, as problematic? If so, what is it reasonable to do about
> it?
>
> PBL
>
> Prof. Peter Bernard Ladkin, Bielefeld, Germany
> MoreInCommon
> Je suis Charlie
> Tel+msg +49 (0)521 880 7319  www.rvs-bi.de
>
> _______________________________________________
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
>
>