[SystemSafety] Risk Based Planning and Assessment - tank farms

Loebl, Andy loeblas at ornl.gov
Tue Dec 18 20:51:11 CET 2012


Thanks Carl;
I will get your paper and then see if I can get the book!
In my experience few people are willing to agree, but I sense that most users of risk analysis do so more out of desperation than because they trust the experts.  I have never yet experienced a truly well-grounded use of this process.  I certainly am appalled that such is the norm for nuclear power facilities; as Fukushima showed, it is anyone's guess whether we need to protect against a 50-year event or a 1000-year event.  What's worse is that many of these experts get lulled into believing that this is year 1 of any such judgment.
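(As an aside, here is a minimal back-of-the-envelope sketch of the arithmetic behind that "N-year event" language, assuming the usual model in which an N-year event has an independent 1-in-N chance each year; the function name and the 40-year horizon below are just illustrative, and the memoryless independence assumption is exactly what makes every year look like "year 1".)

# Python sketch: chance of at least one exceedance of a 1-in-N-year event
# over a fixed horizon, assuming independent years (memoryless model).
def prob_at_least_one(return_period_years, horizon_years):
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

for n in (50, 1000):
    # e.g. a hypothetical 40-year plant life
    print(n, round(prob_at_least_one(n, 40), 3))
# Roughly 0.55 for the 50-year event and 0.04 for the 1000-year event; under
# this model the probability never "remembers" how long it has been since
# the last event, which is the point about "year 1" above.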

Hi Jim;
I have served in several related positions in academia and research in the city/regional/industrial park/brownfields development arena.  I have been a graduate school of planning professor and also served as the idea person and lead for the brownfield reutilization of formerly used nuclear facilities and land for the U.S. Department of Energy.  I have worked a lot with city planning folks, since my Ph.D. is in sociology and demography.  As I mentioned to Carl, most of the planning folks seem to be looking into job risk mitigation.  I have also worked in areas of industrial chemical storage and security.  My experience is of the conventional-wisdom/colloquial-corroborating kind.  U.S. planners and land use management professionals have serious and conflicting issues and constituencies.  These are of several types:

*        "not in my back yard"

*        "I want it, but put it over there"

*        "Jobs are important but I'd rather commute."

*        There is the Love Canal syndrome

*        People and professionals seem to see Bhopal as one-of-a-kind; at that time, I worked for Union Carbide.

*        The manufacturing and storage of hazardous materials are widely distributed and, while largely isolated by fences and signage, there is normally little buffering or other official form of mitigation.  In our corn-belt states, for example, grain silo explosions are not uncommon.

*        I suppose a good analogy might be appropriate here.  I once knew a commercial airline pilot and asked him if he worried much about in-flight incidents.  He told me: "Well, Andy, the way I see it there is a minimum of 5000 feet between my airplane and any other airplane; that is a wide margin in my way of thinking.  (This was before the terrorist attacks of the 1970s.)  When you drive home there is only a 6-inch yellow line separating you from oncoming traffic, and your relative speed is the same as mine."

*        I believe all humans, whether in high-risk environments or not, become accustomed to their risks.  In doing so they become lax or lose their training edge.  They are also sometimes irrational, like the nuclear physicist who grew frustrated trying to screw the top onto a container holding a plutonium source by manipulating the process robotically and remotely.  Eventually he just lifted the hood, screwed the top on with his hands, then checked himself into the hospital.  I also think that the "experts" become accustomed to the importance of their risks but fail to account for the general insensitivity of humans and worker populations.  So in the urban planning environment, the "experts" recommended high-rise apartment buildings for those with poor housing and low incomes.  They knew nothing about crime, nor did they factor anti-social tendencies or proclivities into the housing environments they were creating.  Then, when crime went up in those facilities, even the police refused to patrol inside those buildings.  They were all eventually torn down long before their life cycle ended.  To summarize this bullet: in the U.S., people do not seem to be too concerned about how land is used, nor are they informed of potential risk, and they prefer it that way; they are much more concerned with appearances.  People even have the audacity to move into homes and apartments in dangerous environments: near military chemical weapons facilities, in flood plains, on the slopes of volcanoes, near fireworks facilities, etc.  Then, when they get concerned, they use the excuse that the facility should move, even though it preceded their decision, because they want to feel safer or maintain property values.

*        I cannot understand why people flock to living on the San Andreas fault in California.

*        I read Charles Perrow's book Normal Accidents and I am baffled by the risks people take and the risks 'programmed' into human activities, complex or not.  The 'experts' seem to need to support their employers or clients and contrive ways to appear safe that have little to do with safety.

*        Nancy Leveson taught me the difference between safety and security and why the two are so easily confused.

*        In an open society like America, it is very difficult to secure that which should be secured.  We have just had another tragic "gun" incident here, but it really was not about guns so much as about humans.  Plus, the risk of that happening in that particular community is really just as high there as anywhere else.  We have had 61 mass incidents in the last 30 years, and even so, we kill more people with our automobiles in one week than were killed in all of those incidents.  I would venture to say that drunk drivers alone kill more people in one week than all those incidents put together.
I often feel that these sorts of anomalies arise from the risk assessment thinking models of both the experts and the general population.

If you want more land use examples let me know.

andy

From: systemsafety-bounces at techfak.uni-bielefeld.de [mailto:systemsafety-bounces at techfak.uni-bielefeld.de] On Behalf Of Carl Sandom
Sent: Tuesday, December 18, 2012 5:12 AM
To: systemsafety at techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Risk Based Planning and Assessment - tank farms

Andy,


You're definitely not alone in your thinking. An interesting book on the topic of risk assessment which is in agreement with your view is The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb. I presented a paper at the Australian System Safety Conference in 2011 where I raised some of the problems I have encountered relating to risk assessments, one typically being an over-reliance on 'expert judgement' (you can find the paper at http://www.isys-integrity.com/Papers.htm if you're interested). In my experience there is nearly always a need to improve the validity of safety assurance (i.e. risk assessment) claims through the use of meta-evidence (e.g. evidence about evidence to answer questions such as: who made the expert judgements? How competent were they to do so?).

Best Regards
Carl
_________________________________
Dr. Carl Sandom PhD CEng FIET MIEHF
Director and Consultant
iSys Integrity Limited
10 Gainsborough Drive
Sherborne
Dorset, DT9 6DR
United Kingdom
_________________________________


From: systemsafety-bounces at techfak.uni-bielefeld.de [mailto:systemsafety-bounces at techfak.uni-bielefeld.de] On Behalf Of James Ronback
Sent: 18 December 2012 04:34
To: systemsafety at techfak.uni-bielefeld.de
Subject: Re: [SystemSafety] Risk Based Planning and Assessment - tank farms

Andy,

Thanks for your thoughts on the subject.

In addition to natural hazards, I'm also interested in how effectively city planning departments make use of risk-based planning and assessment when developing industrial areas near residential ones, e.g. jet fuel tank farms. I gather some recommendations were made in the UK after the Buncefield incident in 2005, but regulations are apparently still in the works since nothing has been published yet.

Jim

COMAH - Buncefield: Why did it happen? - HSE
http://www.hse.gov.uk/comah/buncefield/buncefield-report.pdf

Learning the Lessons from Buncefield
http://www.dsb.no/Global/Farlige%20stoffer/Dokumenter/Sevesokonferanse%202010/Sevesokonferansen%202010%20-%20Learning%20the%20lessons%20from%20Buncefield.pdf

-------- Original Message --------
Subject: RE: [SystemSafety] Risk Based Planning and Assessment
Date: Mon, 17 Dec 2012 10:51:41 -0500
From: Loebl, Andy <loeblas at ornl.gov>
To: James Ronback <jim_ronback at dccnet.com>



My experience in the United States is with the Department of Homeland Security and the Nuclear Regulatory Commission.  To me, risk-based planning and assessment processes do not hold my confidence.  Even probabilistic risk assessment methods, as employed here, seem both weak and unsubstantiated.  To me, the practice is based on "expert judgment", which I interpret as someone, somewhere, guessing, and somehow this is codified and legitimized.  Unless, of course, there is someone higher up in the decision-making process who disagrees with the expert judgment, for personal, political, or even professional reasons, and decides to change that judgment, which in turn becomes codified and legitimized.

For example, regardless of the probability, how many mega-quakes, like Yellowstone, are we willing to accept?  Since the average frequency has been ~600,000 years between eruptions and the last one was 640,000 years ago, does that mean we should move out of the areas likely to be affected, or what should those people living in the affected areas do?  Mt. Fuji is another example.  People in Japan have populated the hillsides of the volcano, and like Mt. St. Helens, Fuji will certainly erupt soon, in geologic time, but what should be done today?

In short, the only way to really have confidence in the guessing that goes into even the best risk assessments is to prevent users of those assessments from applying them at a scale for which they are not intended.  Determining risk values, as the U.S. Coast Guard and Homeland Security do, and quantifying the results on an interval scale is, to my way of thinking, taking 'judgment' too far.

I think I am alone in this thought.  I also have not studied my objections enough to offer alternatives.  I hope all this makes sense to you.

andy



-----Original Message-----
From: systemsafety-bounces at techfak.uni-bielefeld.de [mailto:systemsafety-bounces at techfak.uni-bielefeld.de] On Behalf Of James Ronback
Sent: Sunday, December 16, 2012 5:17 PM
To: systemsafety at techfak.uni-bielefeld.de
Subject: [SystemSafety] Risk Based Planning and Assessment

Which countries have risk based planning and assessment processes that you would recommend or deprecate?

Jim Ronback, P. Eng. (System Safety Engineer - retired)


