[SystemSafety] A Common Programming Language for the Department of Defense

paul_e.bennett at topmail.co.uk
Tue May 2 00:43:05 CEST 2017


On 01/05/2017 at 6:57 PM, "Steve Tockey" <Steve.Tockey at construx.com> wrote:
>
>It’s interesting to me that most people even think that the 
>problems outlined in Section A.1. could ever be solved by a 
>programming language (IMHO):

Some of the problems are more system-wide than is alluded to there, so you are
correct in your thinking.

>*) Responsiveness—this is rooted in crappy requirements and 
>uninformed, amateur design practices (e.g., lack of application of 
>fundamental design principles)

This raises the argument of distributed networked systems versus centrally
controlled systems. There are standards in existence that specify the maximum
time a user should wait for confirmation that something is happening at the
user interface. Sadly, much of the software on our own PCs falls foul of those
standards. Sometimes a networked solution can be faster than a centralised
control solution, as much of the work can be hived off to the individual
controllers across the network. Of course, getting decent requirements is key
to permitting such thinking to be considered.

>*) Reliability—this is caused by not paying attention to code 
>semantics (e.g., lack of design-by-contract)

I have found that a "Component Oriented" method works well here. Each
function (a sub-routine performing a single, simple task) can be considered
a component with hard enough surfaces that a CofC can be applied just once,
on the proviso that the component rules are followed (think about the way
we deal with mechanical components and do the same for software).
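
Purely as an illustration of the idea (this is not HIDECS code, and the names
are invented), a component with a "hard surface" in C might look like the
following, with the contract checked at the boundary so that conformance only
has to be argued once for the component:

    /* Illustrative sketch only: a single-purpose component whose
     * "hard surface" is the contract checked at its boundary.      */
    #include <assert.h>
    #include <stdint.h>

    /* Contract: percent <= 100 and span > 0 on entry;
     * the result never exceeds span on exit.                       */
    uint16_t scale_percent(uint16_t percent, uint16_t span)
    {
        assert(percent <= 100u);        /* pre-condition  */
        assert(span > 0u);              /* pre-condition  */

        uint16_t result = (uint16_t)(((uint32_t)percent * span) / 100u);

        assert(result <= span);         /* post-condition */
        return result;
    }

In a certified build the assert calls would normally be replaced by the
project's own error-handling policy, but the principle of checking the surface
once and then reusing the component unchanged stays the same.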

>*) Cost—my data shows that about 60% of the cost of a typical 
>software project is reworking mistakes made earlier (caused by 
>crappy requirements and design and inattention to up-front quality)

I find most of my cost goes into testing. Then again, I start testing on day one
of receiving the requirements, because the first thing I test is the requirements
handed to me. The six C's guidelines hold sway in this territory, and all six C's
have to be in place for a requirement to be usable. [I have told this group what
the six C's are in previous posts but will list them if asked.]

>*) Modifiability—again, lack of application of fundamental design 
>principles

This comes from developing decent architectural frameworks that will allow
significant structural changes without weakening the structure. That is some of
the reason I tend towards distributed solutions.

>*) Timeliness—again, 60% rework caused by crappy requirements and 
>design drive most of this

There is also the matter of how honest we are allowed to be about our estimates
of the time required to accomplish a proper design. Impossibly short time-scales
should be resisted with extreme vigour.

>*) Transferability—we have to finally admit that code is 
>inherently un-reusable. Nearly 70 years and we still haven’t 
>solved it? It’s time to look for alternate solutions. . .

Now I find that an interesting statement. I have written code that has run on
several processors without modification, regardless of one being 16-bit, the
next 32-bit, and the final one 64-bit. The same software has also seen use on
18-bit and 21-bit processors too.

The reason I am able to accomplish the above is that I only have to design for
a simple abstract processor, one which has been implemented on almost every
processor produced (clue in the sig).
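
For anyone not wanting to go the Forth route, a rough analogue in C is to write
the application purely against fixed-width types and a small abstract
interface, so that only that thin layer has to be re-implemented per target.
A hypothetical sketch (all names invented for illustration):

    #include <stdbool.h>
    #include <stdint.h>

    /* The only surface the portable application code may touch;
     * each target board supplies its own implementation of it.     */
    typedef struct {
        void     (*set_output)(uint8_t channel, bool on);
        uint16_t (*read_input)(uint8_t channel);
        uint32_t (*millis)(void);       /* monotonic time, may wrap */
    } hal_t;

    /* Portable application logic: no register addresses and no
     * word-size assumptions beyond the fixed-width types.          */
    void pump_control_step(const hal_t *hal)
    {
        const uint16_t level = hal->read_input(0u);
        hal->set_output(0u, level < 200u);  /* run pump while level is low */
    }

The same application logic then moves between 16-, 32- and 64-bit targets by
swapping the implementation behind hal_t rather than touching the application
code.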

>*) Efficiency—this will always and forever be a problem. Moore’s 
>Law helps on the supply side, but the customers continue to demand 
>more complex applications that they would not have even dreamt of 
>10 years ago. Demand for high performance will always outstrip 
>supply

There is probably too much lazy programming going on. Why does it take so much
memory to update an app I already have on my phone? I can still do amazing
controls in under 64 KB, yet my phone app update will be 30 MB or more, and it
didn't seem to be that complicated an app in the first place. If we demanded
100% fully certified software, where its operation was a guaranteed success,
then we would halt the majority of the industry.

>These are not problems that were ever caused by one programming 
>language vs. another. With the exception of the last one, these 
>have always been—and will always be—methodological issues (HOW you 
>do requirements work, HOW you do design work, HOW you qualify 
>people to be doing work in these areas in the first place, . . .). 
>Until we fundamentally re-think HOW we should develop software in 
>the first place, none of these problems would ever be solved. 
>Thinking they can be solved by one magic programming language is 
>pretty darned naïve.

I can already produce robust code for very high integrity solutions in the
controls world. Most of the difficulty lies at the requirements end, and I have
re-written more than my fair share of requirements after asking clients what
they were really aiming for. After seeing some requirements documents, I have
even considered that there should be a McCabe Cyclomatic Complexity measure
applied to the requirements document, with an ideal aggregate number it should
fall below before it is deemed acceptable to an implementer. I note that there
are requirements specification tools around, but are they anywhere near a
solution to the complexity problem, or do they help exacerbate it?
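
Half-seriously, such a measure could be had by treating each requirement as a
node and each cross-reference between requirements as an edge, then applying
McCabe's V(G) = E - N + 2P. A toy sketch (entirely hypothetical, not a real
tool, with an invented cross-reference graph):

    /* Toy illustration: McCabe-style V(G) = E - N + 2P over a made-up
     * requirements cross-reference graph (nodes = requirements,
     * edges = "refers to" links). P is the number of connected parts. */
    #include <stdio.h>

    #define N_REQS 6

    static int parent[N_REQS];

    static int find(int x) { return parent[x] == x ? x : (parent[x] = find(parent[x])); }
    static void join(int a, int b) { parent[find(a)] = find(b); }

    int main(void)
    {
        const int edges[][2] = { {0,1}, {1,2}, {0,2}, {3,4} };
        const int n_edges = (int)(sizeof edges / sizeof edges[0]);

        for (int i = 0; i < N_REQS; i++) parent[i] = i;
        for (int i = 0; i < n_edges; i++) join(edges[i][0], edges[i][1]);

        int components = 0;              /* P in McCabe's formula */
        for (int i = 0; i < N_REQS; i++)
            if (find(i) == i) components++;

        printf("V(G) = E - N + 2P = %d - %d + 2*%d = %d\n",
               n_edges, N_REQS, components,
               n_edges - N_REQS + 2 * components);
        return 0;
    }

Whether the resulting number would correlate with how implementable a
requirements document actually is would, of course, need testing in its own
right.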

Regards

Paul E. Bennett IEng MIET
Systems Engineer

-- 
********************************************************************
Paul E. Bennett IEng MIET.....<email://Paul_E.Bennett@topmail.co.uk>
Forth based HIDECS Consultancy.............<http://www.hidecs.co.uk>
Mob: +44 (0)7811-639972
Tel: +44 (0)1392-426688
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************


