[SystemSafety] "Security Risk" and Probability

Andy Ashworth andy at the-ashworths.org
Sat Nov 4 00:42:39 CET 2017




The operational environment of a warship is totally different from that of an aircraft.
In an aircraft, a single person is responsible for the control inputs, backed up by a second pilot with duplicated controls.
On the bridge of a warship, the person responsible for determining course and speed has no direct input to the controls. Instead, the responsible officer issues verbal helm and engine orders to the quartermaster(s), who is/are the actual operator(s).
In the case of the McCain, my understanding is that the command had decided to split the helm and engine inputs between two operators, but the mode selection was not made correctly. The nominal helm operator then declared a loss of steering while the secondary operator actually had steering control but didn't realize it, as that station was only meant to be controlling the engines. In theory the bridge command process is relatively robust; in this case, however, the process being followed did not match the system configuration.
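
A rough sketch in Python of that failure mode as I understand it (the station names, the "split throttle" and "transfer all" operations, and their behaviour are illustrative assumptions on my part, not the actual bridge control software):

# Illustrative model only, not the real shipboard system: two bridge
# stations and a transferable steering assignment, showing how one
# operator can appear to lose steering while another station silently
# holds it.

class BridgeControlModel:
    def __init__(self):
        # Steering and throttle both start at the primary helm station.
        self.steering_station = "HELM"
        self.throttle_station = "HELM"

    def split_throttle(self, to_station):
        # Intended action: move throttle only, so two operators share the load.
        self.throttle_station = to_station

    def transfer_all(self, to_station):
        # The mis-selected mode in this sketch: steering silently follows
        # the throttle to the other station.
        self.steering_station = to_station
        self.throttle_station = to_station

    def rudder_order(self, station, angle):
        # Rudder inputs from a station without steering authority are ignored;
        # to that operator it looks like a steering failure.
        if station != self.steering_station:
            return f"{station}: rudder input ignored (no steering authority)"
        return f"{station}: rudder set to {angle} degrees"

model = BridgeControlModel()
model.transfer_all("LEE_HELM")             # intended: model.split_throttle("LEE_HELM")
print(model.rudder_order("HELM", 10))      # helm believes steering is lost
print(model.rudder_order("LEE_HELM", 10))  # lee helm actually holds steering

The point of the sketch is only that the authority assignment is invisible at the point of input, so the process and the configuration can diverge without either operator noticing.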
Andy
On Fri, Nov 3, 2017 at 6:56 PM -0400, "Pekka Pihlajasaari" <pekka at data.co.za> wrote:

Displaying of information is one thing; control authority is another. Aircraft either have mirrored controls (Boeing) or a priority scheme (appended AF 447) that permits the operators, through brute force or a priority takeover, to force the controls into the desired position. Naval vessels (appended USS John S. McCain), by contrast, explicitly identify one of multiple control stations as authoritative and require an assignment to shift authority.

The three schemes described each have different mechanisms for providing operator feedback and enforcing the transition of control between alternative locations. The differences between the operation of Airbus controls and those of an Arleigh Burke-class guided missile destroyer suggest a radical difference in operational culture. Context may well be the determinant: Boeing pilots believe in brute force, Airbus pilots prefer the mean input, and sailors insist on explicit transfer of control. All appear reasonable under specific circumstances. Perhaps mandated operator feedback needs to be highlighted to prevent loss of situational awareness.
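
A rough Python sketch of the three schemes as described above (the resolution rules, force-wins, mean-unless-priority, and single authoritative station, are simplifications assumed for illustration, not taken from any certified system):

def mirrored_controls(inputs):
    # Linked columns (Boeing): both stations move together; a conflict is
    # modelled here as the larger-magnitude input winning by brute force.
    return max(inputs.values(), key=abs)

def averaged_with_priority(inputs, priority_holder=None):
    # Independent sidesticks (Airbus): simultaneous inputs are combined as a
    # mean unless one pilot takes priority with the takeover button.
    if priority_holder is not None:
        return inputs[priority_holder]
    return sum(inputs.values()) / len(inputs)

def explicit_assignment(inputs, station_in_control):
    # Naval scheme: exactly one station is authoritative; all other inputs
    # are ignored until control is formally shifted.
    return inputs[station_in_control]

orders = {"station_A": 15.0, "station_B": -5.0}
print(mirrored_controls(orders))                 # 15.0
print(averaged_with_priority(orders))            # 5.0
print(explicit_assignment(orders, "station_A"))  # 15.0

Each rule is defensible in its own context; the hazard arises when an operator's mental model of which rule is in force does not match the system.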
 
Regards
Pekka Pihlajasaari
--
pekka at data.co.za	Data Abstraction (Pty) Ltd	+27 11 484 9664
--
AF 447			https://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp090601.en.pdf
USS John S. McCain	http://www.navy.mil/submit/display.asp?story_id=103130


-----Original Message-----
From: systemsafety [mailto:systemsafety-bounces at lists.techfak.uni-bielefeld.de] On Behalf Of Derek M Jones
Sent: 03 November 2017 14:42
To: systemsafety at lists.techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] "Security Risk" and Probability

Stuart,

> This is often a feature of display systems: displaying wrong information is a safety concern; occluding the display because the information is known to be suspect, or the display simply going off, is not. Thus availability is not necessarily a safety concern.

https://arstechnica.co.uk/information-technology/2017/11/uss-mccain-collision-ultimately-caused-by-ui-confusion/


-- 
Derek M. Jones           Software analysis
tel: +44 (0)1252 520667  blog:shape-of-code.coding-guidelines.com
_______________________________________________
The System Safety Mailing List
systemsafety at TechFak.Uni-Bielefeld.DE




