What is an FMEA? The acronym stands for Failure Modes and Effects Analysis. It is quite a mouthful, but the method takes a systematic engineering approach, asking how “it” can fail rather than how “it” can succeed. Continue reading
Category Archives: Tips&Tools@Risk™
#106 – 5 POINTS TO REMEMBER FOR A SMOOTHER MANAGEMENT SYSTEM AUDIT – WILL HUGGETT
My client, a leader in innovative rail friction management, was recently preparing for an integrated ISO 14001/OHSAS 18001 audit and I was asked to provide onsite training to prepare their employees. The company and its staff are well versed in audits, having been registered under ISO 9001 for several years. In addition to the required EHS awareness training, I was also asked to provide my thoughts on what to expect during the upcoming audit. Continue reading
#102 – BARCODE VERIFIERS AND ERM – JOHN NACHTRIEB
The tiny town I live in is a crossroads with a gas station, a post office, and a small diner where Saturday morning breakfast is a ritual for us and most of the town. Last spring an elderly gentleman was parking his pickup when his foot slipped from the brake to the gas pedal, and he crashed through the kitchen wall, injuring a cook and causing extensive damage to the building. Now, almost five months later, a crude sign in the window says “open soon.” Continue reading
#98 – WHY IS CORPORATE GOVERNANCE BROKEN? – GREG CARROLL
Last night I made the mistake of attending a local IT forum meeting. Beyond the usual cliché-ridden talk of establishing a “Silicon Valley” locally, the perceptions and strategies were 20 years out of date and premised on government taking the lead. Bureaucracy-driven innovation; now that’ll work! Needless to say, I left early. That got me thinking about the problem with our field of Governance, Risk and Compliance. Why, with the number of fertile minds in our field, is it still a case of an irresistible force meeting an immovable object? The paradox, I believe, like that of our would-be entrepreneurs above, is one of approach. Continue reading
#98 – WATCH OUT FOR RISKY INTERRELATIONSHIPS – JAMES J. KLINE
The title sounds mundane and the admonition common sense. But in the field of risk, common sense and conventional wisdom are not always correct. The 1988 Piper Alpha oil platform disaster in the North Sea is a case in point. The disaster killed 165 workers and 2 rescuers and caused billions of pounds in property damage. Originally, the blame was placed on poor management. However, a complete review indicated that the cause was a series of interrelated actions of the kind sociologist Charles Perrow calls “normal accidents”.
“Normal accidents” are not normal because of their consequences, but because of the seeming insignificance of the contributing actions and their interrelationships. In the Piper Alpha case, there were four seemingly unrelated contributors. The first, and perhaps most important, was a Common Mode Failure (CMF) caused by the linkage of the safety systems to a common electrical system. The second was a breakdown in shift-to-shift communication: the night shift was not told that a condensate pump had been taken down for maintenance with its pressure safety valve removed, so the pump was restarted behind an opening that lacked a proper seal. This allowed vapor to escape, and the vapor caused the fire. The third was the use of less-trained employees when regular employees were not available. Lastly, the automatic fire suppression system had been switched off to avoid divers being sucked into its underwater intakes, and employees were not sufficiently trained on, or aware of, the consequences of doing so.
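To see why the CMF may have been the most important contributor, consider a minimal sketch of the arithmetic; the failure probabilities below are illustrative assumptions, not figures from the Piper Alpha inquiry. Wiring two “redundant” safety systems to one electrical supply replaces the product of two small probabilities with a single point of failure.

```python
# A minimal sketch of why a common mode failure defeats redundancy.
# All probabilities are illustrative assumptions, not inquiry figures.

p_unit = 1e-3    # assumed failure probability of each safety system
p_common = 1e-4  # assumed failure probability of the shared electrical supply

# Two redundant safety systems on truly independent power:
# both must fail, so the probabilities multiply.
p_independent = p_unit * p_unit  # 1e-6

# The same two systems linked to one common electrical system:
# the shared supply alone can take out both.
p_linked = p_common + (1 - p_common) * p_unit * p_unit  # ~1.01e-4

print(f"independent redundancy: {p_independent:.2e}")
print(f"with common-mode link:  {p_linked:.2e}")
# The common link makes the "redundant" pair roughly 100 times
# more likely to fail together.
```

In this toy model the shared supply dominates: the nominal redundancy is worth little once both systems depend on the same upstream component.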
Thus, often-overlooked, mundane actions ended up causing billions of pounds of damage. In fact, with the exception of the CMF, it is difficult to see how a risk analyst would have caught most of them. How does one know what degree of training and knowledge individuals need before they understand all the implications of their actions? From one day to the next, how much of a problem does a lack of communication from one shift to another create?
What makes it more problematic is that expert opinion frequently misestimates the impact of a variable. For instance, in academe, conventional wisdom holds that having a large endowment, many highly respected academics, and a high number of graduate students leads to larger amounts of National Institutes of Health (NIH) funding and advancement in university ranking. Yet the regression results from my PhD dissertation indicate that none of those variables contributes in a statistically significant way to the receipt of NIH funds. A correlation analysis shows that those variables have a strong relationship with university ranking, so they do contribute to ranking. However, university ranking itself contributes little to the receipt of NIH funds. In this instance, the conventional wisdom of the professionals is off the mark.
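The statistical point generalizes. The sketch below uses synthetic data, not the dissertation’s dataset, and the variable names are hypothetical; it simply shows how predictors can correlate strongly with one outcome (ranking) while none of them significantly predicts another (funding).

```python
# Synthetic illustration (not the dissertation's data): three hypothetical
# predictors drive "ranking" but, by construction, have no effect on
# "NIH funding". Requires numpy and statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 200

endowment = rng.normal(size=n)
faculty = rng.normal(size=n)   # stand-in for highly respected academics
grads = rng.normal(size=n)     # stand-in for graduate-student levels

# Ranking really is driven by the three predictors (plus noise)...
ranking = 0.5 * endowment + 0.4 * faculty + 0.3 * grads \
          + rng.normal(scale=0.3, size=n)

# ...but NIH funding, in this toy model, is independent of all of them.
nih_funds = rng.normal(size=n)

X = sm.add_constant(np.column_stack([endowment, faculty, grads]))
ols = sm.OLS(nih_funds, X).fit()
print(ols.pvalues[1:])                        # all far above 0.05
print(np.corrcoef(endowment, ranking)[0, 1])  # strong correlation with ranking
```

A strong correlation with an intermediate outcome is easy to mistake for a causal path to the outcome one actually cares about; the regression makes the distinction visible.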
Equally problematic is the fact that statistical (probabilistic) models may fail to identify a variable, or a combination of variables, that can cause or contribute to a catastrophic failure, because the probability associated with each individual variable is low.
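A rough Monte Carlo sketch shows the sampling side of this problem, under assumed probabilities: a combination of individually minor events occurs so rarely that a simulation with too few runs never observes it, and the model effectively reports zero risk.

```python
# Sketch of the sampling problem: a rare combination of events can be
# invisible to a simulation-based risk model. Probabilities are assumed.
import numpy as np

rng = np.random.default_rng(0)
p_events = [0.01, 0.02, 0.005]  # three independent, individually minor events
true_joint = np.prod(p_events)  # 1e-6: the catastrophic combination

for n_runs in (1_000, 100_000, 10_000_000):
    draws = rng.random((n_runs, 3)) < p_events   # did each event occur?
    catastrophes = np.all(draws, axis=1).sum()   # runs where all three occur
    print(f"{n_runs:>10,} runs -> {catastrophes} joint failures observed "
          f"(true rate {true_joint:.1e})")
# With too few runs the observed count is zero, and a naive analyst
# concludes the combination poses no risk.
```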
There are no easy solutions to these types of issues. Ensuring that safety procedures are comprehensive and that employees are trained and practiced in these procedures is one step. Using more than one risk analysis tool to identify potential problems is another. Talking to employees about issues and potential problems is also useful. In the end, however, what it comes down to is the talent of the risk analyst and the willingness to look at the possible interrelationships of seemingly insignificant variables.
Bio:
James Kline, PhD (ABD), is an ASQ-certified Six Sigma Green Belt and Manager of Quality/Organizational Excellence. He has consulted on quality and workforce development issues for the State of Oregon and the League of Oregon Cities. He currently works for the US Postal Service while completing his dissertation.