This article discusses my impressions of the CERM Boot Camp I attended from February 29 to March 4, 2016. The presenters were Greg Hutchins and Ed Perkins. At the outset, I should note that I have known Greg for more than 30 years; we have also bid on a couple of jobs together. I met Ed a couple of times prior to the Boot Camp.
#122 – IS FAA AGREEMENT WITH BOEING A TEMPLATE FOR RISK? – JAMES KLINE PH.D.
In the past two years, a number of global companies have been fined for committing major violations of law or safety regulations. General Motors (GM) was fined $35 million. A US Congressional investigation determined that GM engineers, after looking into ignition switch complaints, made a design change but failed to issue a recall notice.
#121 – RISK AND POKA YOKE – JAMES KLINE PH.D.
On March 28, 1979, there was a cascading failure in reactor number 2 at Three Mile Island. This failure allowed large amounts of nuclear reactor coolant to escape. The accident galvanized the anti-nuclear movement and ultimately caused the decline in nuclear plant construction in the United States.
#111 – ISO 9001:2015 IS OUT: WHAT NOW? – JAMES KLINE
The ISO 9001:2015 standard is out. Two important questions are: How did we get here? And what happens now? Greg Hutchins, a risk and quality expert, argues in his book “ISO: Risk Based Thinking 2015 Edition” that we got here through an evolutionary process which, because of VUCA (Volatility, Uncertainty, Complexity, and Ambiguity), requires a greater assessment of the external environment than a quality focus alone provides.
#98 – WATCH OUT FOR RISKY INTERRELATIONSHIPS – JAMES J. KLINE
The title sounds mundane and the advice like common sense. But in the field of risk, common sense and conventional wisdom are not always correct. The 1988 Piper Alpha oil platform disaster in the North Sea is a case in point. The disaster killed 165 workers and 2 rescuers and caused billions of pounds in property damage. Originally, the blame was placed on poor management. However, a complete review indicated that the cause was a series of interrelated actions of the kind sociologist Charles Perrow calls “normal accidents.”
“Normal accidents” are normal not because of their consequences, but because of the seeming insignificance of the contributing actions and their interrelationships. In the Piper Alpha case, there were four seemingly unrelated contributors. The first, and perhaps most important, was a Common Mode Failure (CMF) caused by linking the safety systems to a common electrical system. The second was the night shift’s failure to let the day shift know that several vapor pumps had been taken down for repair. As a result, the day shift used a valve without a safety seal, which allowed vapor to escape; the vapor caused the fire. The third was the use of less-trained employees when regular employees were not available. Lastly, employees were not sufficiently trained in, or aware of, the consequences of turning off the automatic fire suppression system, which had been shut off to keep divers from being sucked into the underwater pumps.
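To see why the CMF was arguably the most important contributor, consider a back-of-the-envelope calculation. The failure probabilities below are illustrative assumptions, not Piper Alpha figures; the point is that a shared dependency erases the benefit of redundancy.

```python
# Back-of-the-envelope common-mode-failure arithmetic.
# The probabilities are illustrative assumptions, NOT Piper Alpha figures.

p_system = 0.01  # assumed failure probability of each redundant safety system
p_power = 0.01   # assumed failure probability of a shared electrical supply

# Truly independent redundancy: both systems must fail on their own.
p_independent = p_system * p_system  # 1 in 10,000

# Common-mode coupling: if the shared supply fails, both systems fail
# with it; otherwise both must still fail independently.
p_common_mode = p_power + (1 - p_power) * p_system * p_system

print(f"Independent redundancy: {p_independent:.6f}")  # 0.000100
print(f"Common-mode coupling:   {p_common_mode:.6f}")  # 0.010099
# The shared dependency makes simultaneous failure roughly 100x more
# likely, which is why tying safety systems to one electrical system
# is so dangerous.
```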
Thus, mundane, often-overlooked actions ended up causing billions of pounds in damage. In fact, with the exception of the CMF, it is difficult to see how a risk analyst would have caught most of them. How does one know what degree of training and knowledge individuals need before they understand all the implications of their actions? From one day to the next, how much of a problem does a lack of communication from one shift to another create?
What makes it more problematic is that expert opinion frequently misestimates the impact of a variable. For instance, in academe, conventional wisdom holds that having a large endowment, many highly respected academics, and a high level of graduate students leads to larger amounts of National Institutes of Health (NIH) funding and advancement in university ranking. Yet the regression results from my PhD dissertation indicate that none of those variables contributes in a statistically significant way to the receipt of NIH funds. A correlation analysis shows a strong relationship between those variables and university ranking; they do contribute to the ranking. University ranking, however, contributes little to the receipt of NIH funds. In this instance, the conventional wisdom of the professionals is off the mark.
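The mechanics of that comparison are straightforward. Below is a minimal sketch in Python, using synthetic data rather than the dissertation’s actual dataset, of how a correlation check and a regression can point in different directions: the inputs track ranking, yet none of them (ranking included) significantly predicts the funding variable.

```python
# A minimal sketch (synthetic data, not the dissertation's dataset) of how
# correlation and regression can point in different directions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

endowment = rng.normal(size=n)
faculty = rng.normal(size=n)
grad_students = rng.normal(size=n)

# By construction: ranking depends on the three inputs; NIH funding does not.
ranking = (0.6 * endowment + 0.5 * faculty + 0.4 * grad_students
           + 0.5 * rng.normal(size=n))
nih_funds = rng.normal(size=n)  # pure noise relative to the predictors

# Correlation: each input tracks ranking...
for name, var in [("endowment", endowment), ("faculty", faculty),
                  ("grad students", grad_students)]:
    print(f"corr({name}, ranking) = {np.corrcoef(var, ranking)[0, 1]:.2f}")

# ...but a regression of NIH funds on the same variables should find no
# systematically significant contributors (p-values typically above 0.05).
X = sm.add_constant(np.column_stack([endowment, faculty, grad_students, ranking]))
print(sm.OLS(nih_funds, X).fit().summary())
```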
Equally problematic is the fact that statistical (probabilistic) models may fail to identify a variable, or a combination of variables, that can cause or contribute to a catastrophic failure, precisely because the probability associated with each individual variable is low.
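A small simulation illustrates the point. Assume a failure mode with a true probability of 1 in 10,000 (an illustrative figure) and an operating record of 1,000 observations: most of the time the event never appears in the sample, so an empirical estimate of its probability is exactly zero.

```python
# Why a low-probability contributor can slip through an empirical screen.
# The true probability (1e-4) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(42)
p_true = 1e-4          # assumed true probability of the rare failure mode
n_observations = 1000  # size of the available operating record

events = rng.random(n_observations) < p_true
p_estimated = events.mean()

print(f"True probability:      {p_true}")
print(f"Estimated probability: {p_estimated}")
# The expected event count is only 0.1, so roughly 90% of the time the
# sample contains no events at all, and the estimate is exactly zero:
# the model reports no risk where real risk exists.
```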
There are no easy solutions to these types of issues. Ensuring that safety procedures are comprehensive and that employees are trained and practiced in them is one step. Using more than one risk analysis tool to identify potential problems is another. Talking with employees about issues and potential problems is also useful. In the end, however, what it comes down to is the talent of the risk analyst and the willingness to look at the possible interrelationships of seemingly insignificant variables.
Bio:
James Kline, PhD (ABD), is an ASQ-certified Six Sigma Green Belt and Manager of Quality/Organizational Excellence. He has consulted on quality and workforce development issues for the State of Oregon and the League of Oregon Cities. He currently works for the US Postal Service while completing his dissertation.