#198 – FIVE PRIMARY REASONS FOR THE FAILURE OF PREDICTIVE ANALYTICS IN ERM – GREG CARROLL

Regardless of the hype surrounding Predictive Analytics, and despite the fact that there are some excellent and relatively inexpensive tools available, not only has its implementation been weak, but a 2017 Gartner survey found that in many areas investment is going backwards.

In previous articles on the Top 10 Disruptive Technologies that will change Risk Management as we know it in the 2020s I covered

10: Scenario Analysis – to provide operational management with decision making collateral

9: Big Data – to identify trends and evolving risk

8: Neural Networking – to identify and map real world interrelationships

This week I look at No. 7 – Predictive Analytics and the reasons for poor adoption rates to-date.

What is Predictive Analytics?

Predictive analytics is the use of data, statistical algorithms and machine learning techniques to identify the likelihood of future outcomes based on historical data. The goal is to go beyond knowing what has happened to providing a best assessment of what will happen in the future.
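As a minimal, hypothetical sketch of that idea, the snippet below fits a straight-line trend to historical monthly incident counts and projects the next period. The figures and the least-squares approach are purely illustrative; real Predictive Analytics tools use far richer models.

```python
# Illustrative sketch: fit a linear trend to historical data
# and project the next period (hypothetical figures).

def linear_trend(history):
    """Ordinary least-squares fit y = a + b*x over equally spaced periods."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical monthly near-miss counts
incidents = [4, 5, 7, 6, 9, 10]
a, b = linear_trend(incidents)
forecast = a + b * len(incidents)   # projected next month
print(f"trend slope: {b:.2f} incidents/month, forecast: {forecast:.1f}")
```

The point is the shift in question: not "how many incidents did we have?" but "how many should we expect next month, and is that number rising?"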

But with only 37% of Predictive Analytics projects making it into production and only 15% producing measurable benefits, Predictive Analytics has reached a watershed.

5 Reasons for Failure

The 5 primary reasons for this apparent abject failure are:

  1. Being IT oriented, not operationally based
  2. Attempting too much too soon, not supporting existing management
  3. Not having measurable outcomes
  4. Emphasis on data visualisation not operational use
  5. Poor selection of input factors, garbage in garbage out.

As Analysis is one of the primary tenets of ISO 31000, I don’t believe I need to justify the importance of Predictive Analytics, but there is a clear need to reassess our approach to its implementation.

Let’s look at each of these issues in turn.

Being IT oriented, not operationally based

The primary reason for the “failure to launch” of Predictive Analytics projects is that they tend to be IT-department projects rather than solutions to an operational need. Predictive Analytics needs to be firmly based on solving actual operational problems, initiated and managed by operational management. Look for a very specific issue requiring insight, not a company-wide or even department-wide solution; that will evolve with time.

Attempting too much too soon, not supporting existing management

The number two issue is trying to achieve an advanced Predictive Analytics solution involving Machine Learning, better referred to as ‘Augmented Analytics’. I will cover this more advanced method of Artificial Intelligence (AI) later in this series. With only 37% of Predictive Analytics projects making it into operational use, it’s a matter of learning to walk before you run. You need to put in place the appropriate management systems to collect, evaluate and disseminate the collateral gained from Predictive Analytics before jumping into the advanced “black-box” predictions provided by Machine Learning.

Basic Predictive Analytics are easily understood, implemented and available today. They have the distinct advantage that their findings are understood and trusted by operational management, as they are much more in line with gut feel.

Not having measurable outcomes

Let’s start with basics and I’ll tell you what isn’t an outcome: it isn’t a count or a percentage. At a minimum, an outcome involves a comparison to a baseline and an assessment of its direction. In its simplest form a trend diagram could be considered a predictive analytic. A ratio (percentage) isn’t, but Ratio Analysis is, as it involves a comparison. I believe the single most important metric in Predictive Analytics is Rate of Change, which can imply criticality.
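To make the distinction concrete, here is a small sketch of the baseline-plus-direction idea using hypothetical defect rates. A raw percentage on its own predicts nothing; its deviation from a baseline, and the rate at which that deviation grows, does.

```python
# Sketch of "comparison to a baseline plus direction" (hypothetical figures).

baseline = 2.0                      # historical average defect rate (%)
observed = [2.1, 2.4, 2.9, 3.6]     # defect rate over recent periods (%)

deviation = [x - baseline for x in observed]              # distance from baseline
rate_of_change = [b - a for a, b in zip(observed, observed[1:])]

print("deviation from baseline:", deviation)
print("rate of change per period:", rate_of_change)
```

Here the deviation is widening and the rate of change is accelerating, which signals rising criticality well before any absolute threshold is breached.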

In my next article I will look at typical operational outcomes that can be targeted with Predictive Analytics. The final key, though, is that the outcome must be measurable. Have a goal measurable in terms of change to either a tactical or strategic objective. “Awareness” is not an outcome.

Emphasis on data visualisation not operational use

This leads on to what I see as the major problem with current risk practices: the obsession with dashboards and visualisation. I believe this is a result of Risk Register based systems, as graphing data is their main purpose, hence the prevalence of heat maps. Since it is almost universally accepted that risk is not a discrete value, rendering it on a heat map is worthless. Visualisation is an awareness tool providing a “feel-good” to those who produce it, but adds little insight to operational decision making. It only adds to the information overload on today’s managers.

Output from Predictive Analytics needs to:

  1. Be incorporated in operational processes like workflow steps and email notifications. This gives it context and provides collateral for real-time decision making
  2. Be in plain English using tools such as NLQ/NLG – Natural Language Query (asking)/Generation (answering). Being in English not only makes it relevant and easily digestible but can be integrated into apps and speech aids.

Poor selection of input factors – garbage in garbage out

The term “risk profile” need not be daunting. Like a customer profile, it should capture the characteristics that uniquely identify the subject (like demographics and persona) and its needs (preferences and purpose of use). Forget the word “risk” and think about characteristics. Think about:

  • What characteristics uniquely identify the threat or situation?
  • How does the situation develop and evolve? (Yes, scenario analysis again.)
  • Who is involved, and how do you measure that involvement?
  • When does it reach its risk threshold, and how is that measured?
  • Why is it a problem, i.e. what are its effects and consequences?

Although this is not an exhaustive list, it will get you started.  Selection of inputs is where Big Data becomes invaluable.  Having defined the input factors, you then select appropriate metrics from your Big Data inventories.

Predictive Analytics is a numeric analysis, but there are methods of transforming qualitative values to numeric ones. The last question I listed is important: for some issues, or in some circumstances, the only warning you get is anecdotal; that is, you start to see the effects. Including consequential metrics just ensures you are covering all bases.
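One common way of transforming qualitative values to numeric is to map ordinal ratings onto a numeric scale, as sketched below. The scale and category names are illustrative assumptions, not a standard.

```python
# Hypothetical mapping of qualitative likelihood ratings to a numeric scale,
# so they can feed a numeric analysis such as trend or rate-of-change.

LIKELIHOOD = {
    "rare": 1,
    "unlikely": 2,
    "possible": 3,
    "likely": 4,
    "almost certain": 5,
}

# Hypothetical sequence of periodic assessments for one risk
assessments = ["possible", "likely", "likely", "almost certain"]
scores = [LIKELIHOOD[a] for a in assessments]
print("numeric series:", scores)
```

Once converted, the qualitative history becomes a numeric series that the same baseline and rate-of-change techniques can consume.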

As for the actual Predictive Analytics calculations and methodology, most depend on what you are looking for and the tool you are using. Here I would suggest seeking specific external advice (this is one area where consultants have real uses).

In Summary

To benefit from the real opportunities available from Predictive Analytics, ensure you:

  1. Identify a specific threat or area of prevention for improvement
  2. Develop a “risk profile” of that area detailing its drivers and influences
  3. Research and select a set of data points that are metrics for those drivers or influences
  4. Build a benchmark set of metrics against which actual observations can be matched
  5. Set up an operational management decision framework (checklist) which measures against this benchmark
  6. Identify points of operational decisions (coal face) where Analytics are to be delivered to affect outcomes
  7. Define analytic models to predict occurrence, severity or outcomes of the threat or improvement
  8. Develop NLQ/NLG interfaces to the analytic models and integrate into operational touch points
  9. Measure actual changes in outcomes and quantify achievements.
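Steps 4 and 5 above can be sketched in a few lines: build a benchmark set of metrics and check observations against it. The metric names and thresholds below are hypothetical.

```python
# Hypothetical benchmark for a risk profile, and a simple decision
# check that flags metrics where observations exceed the benchmark.

benchmark = {"overdue_actions": 5, "near_misses_per_month": 3, "staff_turnover_pct": 8.0}
observed  = {"overdue_actions": 9, "near_misses_per_month": 2, "staff_turnover_pct": 11.5}

def breaches(benchmark, observed):
    """Return the metrics where the observation exceeds its benchmark."""
    return {k: observed[k] for k in benchmark if observed[k] > benchmark[k]}

flagged = breaches(benchmark, observed)
print("metrics exceeding benchmark:", flagged)
```

In practice the flagged metrics would be delivered at the operational decision points named in step 6, not left on a dashboard.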

A lot of this might sound similar to Scenario Analysis decision making. That’s because it is, which is why I covered Scenario Analysis earlier: it provides the framework to actually exploit the benefits of Predictive Analytics. Conversely, the lack of a Scenario Analysis framework is a reason why a lot of Predictive Analytics systems are considered ineffective and little more than window dressing.

Next week I will continue with Part 2 of Predictive Analytics, covering suggestions on using Predictive Analytics in Risk Management.

Bio:

Greg Carroll 
- Founder & Technical Director, Fast Track Australia Pty Ltd.  Greg Carroll has 30 years’ experience addressing risk management systems in life-and-death environments like the Australian Department of Defence and the Victorian Infectious Diseases Laboratories among others. He has also worked for decades with top tier multinationals like Motorola, Fosters and Serco.

In 1981 he founded Fast Track (www.fasttrack365.com) which specialises in regulatory compliance and enterprise risk management for medium and large organisations. The company deploys enterprise-wide solutions for Quality, Risk, Environmental, OHS, Supplier, and Innovation Management.

His book “Mastering 21st Century Risk Management” is available from the www.fasttrack365.com website.
