#38 – MISUNDERSTANDING OF RISKS – UMBERTO TUNESI

It is generally accepted that modern industry and society are sufficiently well equipped to fight environmental risks, and that the practices in use are also sufficiently effective.

Nevertheless, tremendous mistakes have been made – and will certainly continue to be made – that have led, and will lead, to the destruction of whole forests and of many animal species.

It may sound like a bitter joke or a science-fiction script, but one day someone may come up with the idea that, to save the Environment, human genocide would be the best solution.

Solution – this is the problem: mankind and the scientific community still generally think of short-term solutions, and therefore of containment, with neither corrective nor preventive action.

It’s like having a toothache and having the tooth pulled out instead of treating it.

Though much is said about “doing for the Environment”, in general terms we – as a Species – have a scant understanding of what we mean by Environment: the Eskimo language has some eight different words for “snow”, because that is their environment; the Sahara Bedouins would probably take the same approach to sand or wind. But to the western observer, it’s all just “environment”.

There’s a further common misunderstanding: that Environment means only the Biosphere, thereby neglecting essential human life factors such as social life, economics, industry, and politics.

Since the mid-sixties boom, there have been economic crises every fifteen years or so; they are recurrent phenomena, therefore, like comets. But, apart from weeping over the crises’ consequences, no effective action has been taken to predict them and to reduce their adverse effects.

As a matter of fact, we know much more about comets.

In the technical business it is pretty much the same. In spite of years-long development of new models, the automotive industry has to recall tens of millions of cars because of dangerous failures. And the aerospace industry is even worse: besides very critical failures in design procedures, as in the case of the Airbus A380, effective or improved aircraft design and construction depend on crashes that cause hundreds of deaths.

Economic downturns are linked to social unrest: a first-hand, immediate analysis would conclude that they cause social unrest, so the diagnosis would be to inject new blood into the economy. But there is seldom evidence that social unrest is connected only to a poor economy, and assuming so is the most frequent mistake made when analyzing critical or risky situations. A holistic, non-teleological approach is seldom taken; in other words, a full vision of the problem is lacking and, even worse, so is an adequate understanding of it.

I’m not against Globalization; on the contrary, I find that – under certain conditions – it is a good way to do business.

But let’s just imagine that our Company sells snowboards all over the World; we’ve found a reliable manufacturer in a low-labor-cost Country, say in central Africa. We send him the necessary specifications and drawings, and provide him with the necessary machinery and raw materials. He sends us production samples, we test them, we find them OK, so we give him the green light for mass production.

Maybe we also audit his Company before starting mass production.

But while we know what a snowboard is and what it has to do, our supplier and his workforce have never seen snow in their lives: did we assess this risk? Specifications and drawings say a lot, but they don’t say everything: no training or instruction can effectively replace direct experience.

The snowboard example clearly shows how a priority risk was misunderstood, or underestimated.

During my training as an apprentice auditor I was obsessed with the question “what can go wrong?”; then, as the years went by, fashion changed and the key question became “what is likely to go wrong?”, implying some kind of statistical approach. We must not forget Disraeli’s quote: “There are three kinds of lies: lies, damn lies, and statistics.”
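
To make the shift from “what can go wrong?” to “what is likely to go wrong?” concrete, here is a minimal sketch of one common way that statistical flavor is operationalized: an FMEA-style risk priority number. The failure modes and ratings below are invented for illustration only, not taken from any real audit.

```python
# Illustrative sketch of an FMEA-style risk priority number (RPN).
# "What can go wrong?" produces the list; "what is likely to go
# wrong?" ranks it. All failure modes and ratings are invented.
from dataclasses import dataclass


@dataclass
class FailureMode:
    description: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (almost certain)
    detection: int   # 1 (easily caught) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Classic FMEA: risk priority = severity x occurrence x detection.
        return self.severity * self.occurrence * self.detection


modes = [
    FailureMode("Core delamination in cold weather", 8, 3, 4),
    FailureMode("Binding insert pull-out", 9, 2, 6),
    FailureMode("Cosmetic topsheet defect", 2, 7, 2),
]

# Highest-priority risks first.
for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {mode.rpn:4d}  {mode.description}")
```

Of course, the ranking is only as good as the estimated ratings behind it, which is exactly where Disraeli’s warning about statistics bites.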

My guess, for what it may be worth, is that we – both as risk-oriented auditors and as consultants – must always keep on the move and not restrict ourselves to the Standards’ requirements, unless those requirements support our knowledge.

Also because there’ll always be some hostile auditee, or his or her consultant, who’ll provocatively ask us “where is that written in the Standard?”.

As in any war – and risk management is much like a war – Intelligence plays a fundamental role; especially HUMINT – human intelligence – as compared to “device” intelligence.

Therefore, in planning and reviewing our risk management plans, let’s not fail to value the “humint” contribution to our plans’ success: working at a computer screen is one thing, working in the field is quite another.
