#98 – WATCH OUT FOR RISKY INTERRELATIONSHIPS – JAMES J. KLINE

The title sounds mundane and the advice like common sense. But in the field of risk, common sense and conventional wisdom are not always correct. The 1988 Piper Alpha oil platform disaster in the North Sea is a case in point. The disaster killed 165 workers and 2 rescuers and caused billions of pounds in property damage. Originally, the blame was placed on poor management. However, a fuller review indicated that the cause was a series of interrelated actions of the kind sociologist Charles Perrow calls “normal accidents”.

“Normal accidents” are not normal because of their consequences, but because of the seeming insignificance of the actions and interrelationships behind them. In the Piper Alpha case, there were four seemingly unrelated contributors. The first, and perhaps most important, was a Common Mode Failure (CMF) caused by the linkage of safety systems to a common electrical system. The second was the failure of the night shift to let the day shift know that they had taken several vapor pumps down for repair. As a result, the day shift used a valve without a safety seal, which allowed vapor to escape; the escaping vapor caused the fire. The third was the use of less-trained employees when regular employees were not available. Lastly, employees were not sufficiently trained in, or aware of, the consequences of turning off the automatic fire suppression system, which had been switched off to keep divers from being sucked into the underwater pumps.

Thus, often overlooked, mundane actions ended up causing billions of pounds of damage. In fact, with the exception of the CMF, it is difficult to see how a risk analyst would have caught most of them. How does one know what degree of training and knowledge is needed before individuals understand all the implications of their actions? From one day to the next, how much of a problem does a lack of communication between shifts create?

What makes it more problematic is that expert opinion frequently misestimates the impact of a variable. For instance, in academe, conventional wisdom holds that having a large endowment, many highly respected academics, and a large graduate student population leads to larger amounts of National Institutes of Health (NIH) funds and advancement in university ranking. Yet the regression results from my PhD dissertation indicate that none of those variables contributes in a statistically significant way to the receipt of NIH funds. A correlation analysis of those same variables shows a strong relationship with university ranking. Thus, these variables contribute to university ranking; university ranking, however, contributes little to the receipt of NIH funds. In this instance, the conventional wisdom of the professionals is off the mark.
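To make the two checks concrete, here is a minimal, purely illustrative sketch of that kind of analysis in Python: an ordinary least squares regression of funding on the “conventional wisdom” variables, alongside a correlation matrix against ranking. The column names, the synthetic data, and the pandas/statsmodels workflow are all assumptions for illustration only; they are not the dissertation's data, model, or results.

```python
# Illustrative sketch only: hypothetical column names and synthetic placeholder
# data, not the dissertation's dataset or its findings.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical number of universities

df = pd.DataFrame({
    "endowment": rng.lognormal(3.0, 1.0, n),     # placeholder, e.g. $ billions
    "star_faculty": rng.poisson(20, n),          # count of highly cited academics
    "grad_students": rng.normal(5000, 1500, n),  # graduate enrollment
    "ranking": rng.normal(50, 20, n),            # placeholder ranking score
    "nih_funds": rng.lognormal(4.0, 0.8, n),     # placeholder, $ millions
})

# Step 1: regress NIH funding on the "conventional wisdom" variables and
# inspect the p-values to see which coefficients are statistically significant.
X = sm.add_constant(df[["endowment", "star_faculty", "grad_students"]])
model = sm.OLS(df["nih_funds"], X).fit()
print(model.summary())

# Step 2: check the correlation of the same variables with university ranking.
# A variable can correlate strongly with ranking while adding nothing
# statistically significant to the funding regression.
print(df[["endowment", "star_faculty", "grad_students", "ranking"]].corr())
```

The point of the sketch is only that the two checks answer different questions, which is how the same variables can matter for ranking yet not for funding.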

Equally problematic is the fact that statistical (probabilistic) models may fail to identify a variable, or collection of variables, that can cause or contribute to a catastrophic failure, simply because the probability associated with some of those variables is so low.
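One way to see why, sketched below with entirely made-up numbers: if a catastrophic outcome requires several independently rare conditions to coincide, a simulation or dataset of modest size will often record zero occurrences of the combination, so the joint failure mode never surfaces. The probabilities, the independence assumption, and the simulation itself are hypothetical.

```python
# Minimal Monte Carlo sketch (hypothetical probabilities): a failure that needs
# several low-probability conditions to line up can easily show up as "zero
# occurrences" in a modest sample, even though its long-run rate is nonzero.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-shift probabilities of each contributing condition.
p_handover_missed = 0.01
p_undertrained_crew = 0.02
p_suppression_disabled = 0.05
joint_rate = p_handover_missed * p_undertrained_crew * p_suppression_disabled

for n_trials in (1_000, 1_000_000):
    # Each contributor occurs independently in this toy model.
    events = (
        (rng.random(n_trials) < p_handover_missed)
        & (rng.random(n_trials) < p_undertrained_crew)
        & (rng.random(n_trials) < p_suppression_disabled)
    )
    print(f"{n_trials:>9} trials: {events.sum()} joint failures "
          f"(theoretical rate = {joint_rate:.0e})")
```

With a thousand observations the expected count of joint failures is about 0.01, so the combination almost never appears in the data a model is fitted to, even though over enough exposure it is effectively certain to occur.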

There are no easy solutions to these types of issues. Ensuring that safety procedures are comprehensive and that employees are trained and practiced in them is one step. Using more than one risk analysis tool to identify potential problems is another. Talking to employees about issues and potential problems is also useful. In the end, however, what it comes down to is the talent of the risk analyst and the willingness to look at the possible interrelationships of seemingly insignificant variables.

Bio:

James Kline, PhD (ABD), is an ASQ-certified Six Sigma Green Belt and Manager of Quality/Organizational Excellence. He has consulted on quality and workforce development issues for the State of Oregon and the League of Oregon Cities. He currently works for the US Postal Service while completing his dissertation.
