Anyone who has conducted a search on ‘risk analysis’, ‘managing risk’, ‘risk management’ or any other permutation will have discovered that the subject of risk analysis has been around for a very long time and has been covered by numerous authors. Still, the daunting challenge remains: how can one conduct process risk analysis without the help of a PhD in statistics?
FMEA FUNDAMENTALS
A popular technique often invoked by various authors is Failure Mode and Effects Analysis, or FMEA, developed several decades ago. This simple but controversial technique relies on the assignment of subjective ordinal numbers (usually on a 1-10 Likert-type scale) to estimate three quantities: 1) the difficulty (D) of detecting a failure, 2) the severity (S) of the failure and 3) the likelihood of occurrence (O) of the failure. These three subjectively estimated ordinal numbers are multiplied to ‘compute’ Risk Priority Numbers, or RPNs, for various process steps. The RPNs are then ranked from highest to lowest, and the process steps with the highest RPNs are analyzed to see how process improvements can be designed to help reduce the RPN, ideally to zero.
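The RPN calculation and ranking just described can be sketched in a few lines of code. The process steps and their 1-10 scores below are hypothetical examples, not data from any real FMEA:

```python
# Hypothetical FMEA worksheet: each step gets three subjective 1-10 scores:
# D = difficulty of detection, S = severity, O = likelihood of occurrence.
steps = {
    "solder joint inspection": {"D": 7, "S": 8, "O": 4},
    "label printing":          {"D": 2, "S": 3, "O": 6},
    "final torque check":      {"D": 5, "S": 9, "O": 2},
}

# RPN = D x S x O for each process step
rpns = {name: s["D"] * s["S"] * s["O"] for name, s in steps.items()}

# Rank process steps from highest to lowest RPN; the top entries are
# the candidates for process improvement.
for name, rpn in sorted(rpns.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: RPN = {rpn}")
```

Note that the ranking, not the absolute RPN value, is what drives the analysis, which is precisely why the multiplication of ordinal scores discussed below is contentious.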
One of the criticisms of FMEA is that the ordinal values are subjectively assigned and are not based on factual or data-driven (objective) evidence. Unfortunately, data are not always available for each process step, and in such cases subjective estimation is the only means available. To help minimize estimation error one can, if possible, have several individuals, usually process engineers familiar with the process, provide their own subjective values, and the results can either be averaged or the low and high values can be eliminated, as is done in some Olympic scoring events. Still, a more serious objection has been raised by numerous authors, including Donald Wheeler, who points out that it is a mathematical absurdity to multiply ordinal numbers to generate RPNs: multiplication can only be performed on ratio-type numbers, that is, numbers for which an absolute zero point has been defined. For suggestions on how to improve scoring, readers are encouraged to refer to the short article posted by Donald J. Wheeler at:
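The two score-pooling schemes mentioned above, plain averaging and Olympic-style trimming, are easy to make concrete. The five ratings below stand in for five hypothetical engineers scoring the severity of one failure mode:

```python
def average_score(scores):
    """Plain average of all raters' scores."""
    return sum(scores) / len(scores)

def olympic_score(scores):
    """Olympic-style trimming: drop the single lowest and single
    highest score, then average the remainder."""
    trimmed = sorted(scores)[1:-1]
    return sum(trimmed) / len(trimmed)

# Hypothetical: five engineers rate severity on a 1-10 scale;
# one outlying rating (2) pulls the plain average down.
severity_ratings = [6, 7, 9, 7, 2]

print(average_score(severity_ratings))   # 6.2
print(olympic_score(severity_ratings))   # (6 + 7 + 7) / 3, about 6.67
```

Trimming blunts the influence of a single over- or under-enthusiastic rater, though of course it does nothing to answer Wheeler's deeper objection about multiplying ordinal numbers.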
http://www.qualitydigest.com/inside/quality-insider-article/problems-risk-priority-numbers.html
BAYESIAN STATISTICS
The assignment of subjective probabilities, or the estimation of probabilities based on subjective opinions, is not limited to FMEA; in fact, there is a vast field of statistics known as Bayesian statistics that relies on subjective a priori (i.e., before the fact) probabilities to compute more refined a posteriori (after the fact, once new evidence is available) probabilities. Bayesian statistics has staunch defenders who argue that it is the only valid framework for decision making.
Given the popularity of risk analysis and its obvious association with decision analysis, it is in fact surprising that Bayesian statistics has not yet been mentioned in commentary on ISO 9001:2015. Bayesian statistics is not complicated, but it is, in this author’s opinion, cumbersome and, for simple cases, often (though not always) proves the obvious through a roundabout and tedious computational process based on ratios of conditional probabilities that invariably leads to what is known as Bayes’ Rule.
No doubt Bayesian statisticians will seriously object to this dismissive assessment, and one must admit that Bayesian statistics is valuable in medical diagnosis (among other fields) for estimating the probability of having a particular medical condition given certain pre-existing conditions. Regardless, I don’t foresee industrialists embracing Bayesian statistics in the near future.
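For readers unfamiliar with it, Bayes’ Rule in the diagnostic setting just mentioned can be shown in a few lines. The prevalence, sensitivity and false-positive figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' Rule.

    prior               : P(disease), the a priori probability
    sensitivity         : P(positive | disease)
    false_positive_rate : P(positive | no disease)
    """
    # P(positive) by the law of total probability
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    # Bayes' Rule: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
    return sensitivity * prior / p_positive

# Hypothetical numbers: 1% prevalence, 95% sensitivity, 5% false positives
posterior = bayes_posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
print(round(posterior, 3))  # roughly 0.161
```

The result illustrates the classic and somewhat counter-intuitive point: even with a quite accurate test, a positive result on a rare condition yields only about a 16% posterior probability, because the prior is so low. This is the kind of refinement of an a priori estimate that Bayesian statistics delivers.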
RISKS OF RISK ANALYSIS
But there are other difficulties associated with any risk analysis study, and they have to do with what are referred to as known-unknowns and unknown-unknowns. There are many cases of known-unknowns: cases where we can enumerate or list what is unknown. For example, we may suspect that humidity, or even barometric pressure, affects a particular process, but we do not yet (and may never) know how these variables influence it. Even in less constraining cases, where a risk is known to exist but its probability of occurrence is either unknown or grossly underestimated, corrective measures are often not implemented.
A recent example of a known but apparently underestimated risk was the security debacle suffered by the retailer Target: a breach in security that exposed 40 million customers to cyber criminals. The possibility of cyber criminals siphoning information at various transaction points was known but was either wrongly assumed (or subjectively assessed) to be a low risk or was simply not taken seriously as a threat.
Why is the US the only major economy still using magnetic-stripe card technology developed in the 1960s when a better technology has long been available? Why not convert, as many countries did over twelve years ago, to smart credit cards loaded with a digital chip? In Europe, the use of such technology has dramatically reduced the risk of this type of card fraud.
UNKNOWN UNKNOWNS
The assignment of subjective probabilities is only possible when one deals with what is known; but how does one analyze what have been called ‘unknown-unknowns’, events popularized by financial analysts as ‘Black Swans’ and also known as extreme outliers? Unknown-unknowns include all the variables that might have an impact on a process but whose very existence, let alone influence on the process, is unknown to us.
How does one assess risks for unknown-unknowns (Black Swans)? One simply can’t, and yet these are the very risk events that could wipe you out in a few minutes or a few hours. It is impossible to give examples of unknown-unknowns except perhaps by invoking the imaginary world of science fiction.
Unknown variables or events, these infamous and notorious Black Swans, could never be considered, let alone analyzed, during even the most rigorous risk analysis! So what is the point? Well, I suppose one can always estimate the risk of what one thinks one knows and hope for the best about the rest, trusting that the probability of the unknown ever occurring is very, very small, or at least that it will not occur during your shift.
Bio:
James Lamprecht is a management consultant and Six Sigma Master Black Belt. In a career spanning over three decades, Dr. Lamprecht has worked as a consultant, teacher, and statistician. He has audited over one hundred companies in the US and abroad and has conducted hundreds of seminars and classes in applied industrial statistics, ISO 9001 and Six Sigma. He has authored 11 books, including Interpreting ISO 9001:2000 with Statistical Methodology (ASQ Quality Press, 2001), Applied Data Analysis for Process Improvement: A Practical Guide to Six Sigma Black Belt Statistics (ASQ Quality Press, 2005) and Dare To Be Different: Reflections on Certain Business Practices, with Renato Ricci (ASQ Quality Press, 2009). Dr. Lamprecht, who has consulted in Europe, Canada and Latin America, received his doctorate from UCLA.