[SystemSafety] Component Reliability and System Safety
Olwen Morgan
olwen.morgan at btinternet.com
Sun Sep 16 16:54:46 CEST 2018
Yep - what Tom says matches my experience in industry. The only client
I've ever worked for who knew how to go about safety-critical systems
development was Altran-Praxis using SPARK Ada. And even there the
process was limited by the need to use MISRA C on autogenerated GUI code.
regards,
Olwen
On 14/09/18 23:08, Tom Ferrell wrote:
>
> Warning – long post; go get some coffee first…
>
> Since we cannot seem to move on from the “Why do people still use
> MISRA Coding standards in 2018” question, I want to respond to this
> latest post. Collectively, this community frequently, and I believe
> correctly, points to the success of DO-178C in helping minimize software
> defects, thus contributing to commercial air transport’s enviable
> safety record. Yet, what does DO-178C really say about coding
> standards? Let me be your tour guide.
>
> From 4.1e (Software Planning Process Objectives): “Software
> development standards consistent with the system safety objectives for
> the software to be produced are defined (see 4.5).”
>
> From 4.2b (Software Planning Process Activities): “The software
> development standards to be used for the project should be defined or
> selected.”
>
> From 4.2g (Software Planning Process Activities): “For the planning
> process to be complete, the software plans and software development
> standards should be under change control and reviews of them completed
> (see 4.6).”
>
> From 4.4 (Software Life Cycle Environment Planning): “Examples of
> how the software environment [generally interpreted as tools] chosen
> can have a beneficial effect on the software include enforcing
> standards, detecting errors, and implementing error prevention and
> fault tolerance methods.”
>
> From 4.4.1 (Software Development Environment): “The software
> verification process activities or software development standards,
> which include consideration of the software level, should be defined
> to reduce potential software development environment-related errors.”
>
> From 4.5 (Software Development Standards): This section is about half
> a page, so I am going to summarize using key phrases:
>
> “Software development standards include the Software Requirements
> Standard, the Software Design Standard, and the Software Code
> Standards. The software verification process uses these standards as
> a basis for evaluating the compliance of actual outputs of a process
> with intended outputs.”
>
> Standards “should comply with section 11” [see next]; “enable software
> components…to be uniformly designed and implemented;” “should disallow
> the use of constructs or methods that produce outputs that cannot be
> verified or that are not compatible with safety-related requirements;”
> and should consider “robustness.”
>
> “Note 1: In developing standards, consideration can be given to
> previous experience. Constraints and rules on development, design,
> and coding methods can be included to control complexity. Defensive
> programming practices may be considered to improve robustness.”
>
> From 4.6 (Review of the Software Planning Process): “Reviews of
> the software planning process are conducted to ensure that the
> software plans and software development standards comply with the
> guidance of this document [DO-178C] and means are provided to execute
> them.”
>
> At this point, I am going to switch from noting all references to
> ‘software development standards’ to focusing on Software Coding
> Standards, just as DO-178C shifts to more specific standards references.
>
> From 5.3.2b (Software Coding Process Activities): “The Source Code
> should conform to the Software Coding Standards.”
>
> I would suggest we pause here and note the following:
>
> 1. Aside from generalities on error prevention and consistency with
> system safety objectives, DO-178C has said absolutely nothing about
> Software Code Standard contents.
>
> 2. Reuse of standards (consideration of previous experience) has been
> suggested.
>
> 3. Review of standards is required, but nothing prohibits taking credit
> for an external review (say, by an industry consensus group).
>
> 4. A clear requirement is established for review of compliance to standards.
>
> Section 6 (Software Verification) of DO-178C outlines multiple layers
> of review and analysis activities including the following for “Reviews
> and Analyses of Source Code:”
>
> From section 6.3.4d: “Conformance to standards: The objective is to
> ensure that the Software Code Standards were followed during the
> development of the code, for example complexity restrictions and code
> constraints. Complexity includes the degree of coupling between the
> software components, the nesting levels for control structures, and
> the complexity of logical or numeric expressions. The analysis also
> ensures deviations are justified.”
>
> There are various incidental references to the Software Development
> Standards in terms of QA and CM of same. None of these impacts
> standard content, so we will zoom ahead to THE section that is
> supposed to drive Software Code Standard content, section 11.8. Since
> this is the section most relevant to our discussion, I am including it
> in its entirety:
>
> “Software Code Standards define the programming languages, methods,
> rules, and tools to be used to code the software. These standards
> should include:
>
> a. Programming language(s) to be used and/or defined subset(s). For a
> programming language, reference to the data that unambiguously defines
> the syntax, the control behavior, the data behavior, and side effects
> of the language. This may require limiting the use of some features
> of a language.
>
> b. Source Code presentation standards, for example line length
> restriction, indentation, and blank line usage and Source Code
> documentation standards, for example, name of the author, revision
> history, inputs and outputs, and affected global data.
>
> c. Naming conventions for components, subprograms, variables, and
> constants.
>
> d. Conditions and constraints imposed on permitted coding conventions,
> such as the degree of coupling between software components and the
> complexity of logical or numerical expressions and rationale for their
> use.
>
> e. Constraints on the use of coding tools.”
>
> For item a, companies generally call out a specific version of the
> Language Reference Manual. There are also some standard limitations
> that appear (e.g., no recursion, no dynamic memory allocation). It
> has been my experience that very few companies ever declare a true
> subset of Ada or C, but will occasionally adopt a defined subset for C++.
>
> For item b, this is generally regarded as a readability standard; it
> facilitates review but does not specifically restrict the language in
> any appreciable way.
>
> For item c, also generally a readability and maintainability aid, but
> no explicit limitation or requirement tied to software level or system
> safety.
>
> For item d, companies may impose some meaningful restrictions and
> formats, e.g. standards for pointer arithmetic, fixed point scaling,
> use of SI units, etc.
>
> For item e, the guidance generally addresses compiler and linker
> settings to ensure maximum verbosity in warnings, structured treatment
> of memory access, etc.
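[Editorial aside: a hypothetical item-e-style constraint might read as follows; the flag set is illustrative only and not taken from any particular standard.]

```shell
# Hypothetical sketch: a coding standard mandating maximum warning
# verbosity and warnings-as-errors for all deliverable object code.
gcc -std=c99 -Wall -Wextra -Wpedantic -Wconversion -Wshadow \
    -Werror -O2 -c module.c -o module.o
```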
>
> The real takeaway is that DO-178C leaves the majority of standards
> content decisions to the company applying the standard. This means
> that there is wide variation in what is included or excluded, and what
> is not even considered. I, along with the majority of FAA DERs that I
> know and interact with on a regular basis, push companies toward
> industry standards like MISRA because they help ensure at least a
> foundation is in place that has had some broader review and provides
> the rationale for why a restriction or guideline exists. We encourage
> companies to add items to their standards from their own experiences
> and lessons learned, and we tell them about forums like this one to
> learn more and understand both the underlying rationale and, just as
> importantly, the limitations of such industry standards. I, for one,
> got quite a lot out of the relatively recent exchange on cyclomatic
> complexity.
>
> Some of you may consider this all to be quite “banal” since it is not
> grounded in academic terms with footnotes to each and every seminal or
> esoteric paper on the topic, but this is the reality of coding
> standards as applied in civil aviation in 2018. Can we do better? Of
> course. Is that the trajectory we are on? No. The FAA has
> a current mandate to be less prescriptive, and yes, the statements
> above are considered by industry to be too prescriptive. Just ask
> your favorite representative of industry groups like GAMA and AEA. My
> plea to this erudite community is to offer that ‘something better’
> before tearing down the few things that are seemingly working for us now.
>
> Me <- climbing down off my soapbox now. Let the slings and arrows ensue.
>
> *From:*systemsafety
> [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] *On
> Behalf Of *David Crocker
> *Sent:* Friday, September 14, 2018 3:43 PM
> *To:* systemsafety at lists.techfak.uni-bielefeld.de; Paul Sherwood;
> Peter Bernard Ladkin
> *Subject:* Re: [SystemSafety] Component Reliability and System Safety
>
> Sorry, spelling autocorrect changed my attempt at entering "asserting"
> to "assisting".
>
> On 14 September 2018 20:40:16 BST, David Crocker
> <dcrocker at eschertech.com> wrote:
>
> >>
> Those people could **just** use static analysis tools, and get the
> same benefit.
> <<
>
> You are assisting that static analysis tools that don't enforce the
> MISRA guidelines provide as much safety as those that do. What are
> your grounds for that assertion? The MISRA guidelines have been
> produced by a group of people from different backgrounds with
> experience of critical software written in C, with input from a much
> larger group, and have gone through revisions. I doubt that any single
> vendor of a static analysis tool has the same breadth of experience.
> Of course there is a lot of overlap, and there are some MISRA rules
> that I find questionable; but I'd rather use a published set of rules
> that have undergone scrutiny than an unpublished or poorly-documented
> set of rules that perhaps just one individual working for a tool
> vendor thought were good and easy to implement.
>
> >>
> Your answer doesn't address the system safety part of my question at
> all, afaict...
> <<
>
> System safety requires at the very least a good set of requirements,
> an assembly of components that will meet those requirements if the
> components behave correctly, and components that behave correctly for
> all inputs they receive. MISRA helps with the last of those.
>
> On 14 September 2018 14:52:30 BST, Paul Sherwood
> <paul.sherwood at codethink.co.uk> wrote:
>
> On 2018-09-14 08:03, Peter Bernard Ladkin wrote:
> <snip>
>
> [Paul Sherwood, I think] Why is MISRA C still considered relevant to
> system safety in 2018?
>
> (Banal question? Banal answer!)
> I'm sorry you consider my question banal. I mentioned your comment to an
> eminent friend (who has had to deal with the human fallout from multiple
> accidents) and he said "There are no banal questions about safety.
> Anyone asking questions and interested in safety is to be applauded."
>
> Are list members here normally prone to sniping at each other? Is the
> community OK with that? I confess I can be quite harsh myself, but I try
> to give new contributors the benefit of the doubt.
> Because many people use C for
> programming small embedded systems and
> adhering to MISRA C coding guidelines enables the use of static
> analysis tools which go some way
> (but not all the way) to showing that the code does what you have said
> you want it to do.
> Those people could **just** use static analysis tools, and get the same
> benefit. Your answer doesn't address the system safety part of my
> question at all, afaict, but I found other answers more helpful in that
> regard.
>
> br
> Paul
> ------------------------------------------------------------------------
> The System Safety Mailing List
> systemsafety at TechFak.Uni-Bielefeld.DE
>
>
> --
> Sent from my Android device with K-9 Mail. Please excuse my brevity.
>
>
>
--
Olwen Morgan CITP, MBCS olwen.morgan at btinternet.com +44 (0) 7854 899667
Carmarthenshire, Wales, UK