#361 – PROJECT EMERGENCIES: RESPONSE OR REACTION – MALCOLM PEART

In project management we can’t always control the environment around us.  We can only forecast rather than predict risk and, despite our ‘reasonable’ or even ‘best’ efforts to mitigate it, shit happens and emergencies ensue!  It’s not just physical emergencies but also those related to time and cost; remember, over-budget or late projects can create an emergency for shareholders and stakeholders alike.

Maybe it’s because we tend to look at the ‘big risks’ or the ‘top ten’ after some semi-quantitative assessment but then fail to consider that risks can change with time as more information becomes available.  Or maybe it is because only those risks that can be clearly defined and are ‘likely’ are communicated to the eyes and ears on the ground.  Low-probability, high-impact risks can slip under any risk radar, as the sketch below illustrates.
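To make the point concrete, here is a minimal, purely illustrative sketch in Python; the risks, probabilities and impacts are invented, and the scoring is a simple probability-times-impact ranking of the kind a semi-quantitative assessment might produce.  Ranked this way, the one catastrophic item falls off the short-list even though its worst-case exposure dwarfs everything that made the cut.

```python
# Hypothetical risk register: (name, probability, impact in $M).
# All values are invented for illustration only.
risks = [
    ("Late steel delivery",       0.60,   2.0),
    ("Design rework",             0.50,   3.0),
    ("Permit delay",              0.40,   4.0),
    ("Labour shortage",           0.35,   3.5),
    ("Ground conditions",         0.30,   5.0),
    ("Catastrophic seal failure", 0.002, 500.0),  # low probability, huge impact
]

# Semi-quantitative score: probability x impact (a crude 'expected value')
scored = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

# A score-based 'top N' cut-off quietly drops the catastrophic item...
TOP_N = 3
for name, p, impact in scored[:TOP_N]:
    print(f"Watch-list: {name:26s} score = {p * impact:5.2f}")

# ...even though its single worst case dwarfs everything on the watch-list
worst = max(risks, key=lambda r: r[2])
print(f"Largest single impact: {worst[0]} (${worst[2]:.0f}M at p = {worst[1]})")
```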

Then there is the matter of ‘optimism bias’ and, perhaps, an overconfident belief that ‘nothing can go wrong’ or that ‘risk only happens to other people’. Or perhaps it’s just a matter of management not appreciating what is going on and ‘taking their eye off the ball’, if they ever had their eye on the ball.

Consequential Disproportionality and Risk

Risks happen, hazards are realized and, because of our forecasting ability, some believe that everything can be managed or at least planned for.  But forecasts rely on incomplete information, experience and ‘lessons learned’, coupled perhaps with expert judgment and possibly a Monte Carlo simulation.  This is fine until the forecast is wrong, everybody sees it, and we suffer the symptoms of shock, denial and even anger.
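Since Monte Carlo simulation gets mentioned in passing, here is a minimal, hypothetical sketch of what such a forecast can look like for a simple three-activity schedule.  The activities, triangular distributions and durations are invented; the point is only that the output is a spread of plausible outcomes built from assumed inputs, not a guarantee.

```python
import random

# Hypothetical three-activity schedule; each activity gets a triangular
# duration distribution (optimistic, most likely, pessimistic) in weeks.
# All figures are invented for illustration.
activities = [
    ("Design",        8, 10, 16),
    ("Procurement",   6,  8, 14),
    ("Construction", 20, 24, 40),
]

def simulate_total_duration() -> float:
    """One Monte Carlo trial: sample each activity and sum (simple series logic)."""
    return sum(random.triangular(lo, hi, mode) for _, lo, mode, hi in activities)

random.seed(42)
trials = sorted(simulate_total_duration() for _ in range(10_000))

p50 = trials[len(trials) // 2]        # median outcome
p80 = trials[int(len(trials) * 0.8)]  # 80th percentile ("P80") outcome
print(f"P50 duration: {p50:.1f} weeks, P80 duration: {p80:.1f} weeks")
print(f"Worst simulated case: {trials[-1]:.1f} weeks")
```

If the assumed ‘pessimistic’ tails are themselves optimistic, the P80 will be too, which is exactly how a confident forecast turns out to be wrong.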

We then realise all too quickly that the risk mitigation planning was inadequate, a crisis is upon us, and a catastrophe is looming on a rapidly approaching horizon.  We are now up to our necks in alligators because we got it wrong and we have an emergency.  The as-advertised cool, calm and collected response is now superseded by events.  What ensues is, at best, a knee-jerk reflex reaction and, at worst, beheaded-chicken syndrome with, hopefully, well-concealed panic.

But why?  The consequences of a hazard may well be inversely proportional to the triggering event, as small or low-likelihood events are oftentimes overlooked or even ignored.  Many major disasters are a consequence of seemingly ‘small’ matters: NASA’s Challenger disaster resulted from the failure of an ‘O’-ring; Deepwater Horizon had a faulty pipe fitting; and a small but neglected leaking pipe at Union Carbide’s Bhopal plant caused devastation in India.  We all know the fictional Dutch story of how a little boy plugged a small hole in a dyke and saved Holland from flooding; this only goes to reinforce the concept of consequential disproportionality.

Engineering emergencies can cause millions of dollars’ worth of damage, disruption and delay, but emergencies can also arise from something as innocuous as office procedures; just look at Barings Bank in the 1990s.  The 2008 banking collapse cost trillions but was a consequence of many relatively small events culminating in a worldwide economic disaster; perhaps the sight of quick profits blinded the foresight of risk proponents.

Optimism Bias and Groupthink

The UK Government investigated “optimism bias” in 2003 in recognition that project appraisers have a systematic tendency to be overly optimistic.  There was a mandate that a project’s estimated costs and durations should be modified to reflect such optimism through empirically based adjustments and so create realistic expectations.  However, and despite further mandates, the costs and durations of major infrastructure projects continue to be exceeded.  Perhaps politicians and financiers need to be educated that if there is a need for infrastructure it inevitably costs what it will cost and takes its time too.  They also need to realise that those who develop infrastructure can only ‘sell’ their product with puffery and optimism, or a country’s development will stagnate.  Success and failure cannot always be measured in terms of time and cost, and yet the lessons based upon delays and outturn costs continue to go unheeded.
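As a rough illustration of what an ‘empirically based adjustment’ looks like in practice, a short sketch follows in which a percentage uplift is applied to the base estimate and is reduced only as specific risks are demonstrably managed down.  The uplift percentages and costs are invented placeholders, not the official guidance figures.

```python
# Hypothetical optimism-bias uplift: adjusted = base * (1 + uplift).
# The percentages below are illustrative placeholders, not official values.
def adjusted_estimate(base_cost: float, uplift: float) -> float:
    """Apply an optimism-bias uplift to a base cost estimate."""
    return base_cost * (1.0 + uplift)

base = 120.0            # base estimate in $M (invented)
starting_uplift = 0.40  # illustrative uplift at the early appraisal stage
managed_uplift = 0.15   # illustrative residual uplift once key risks are mitigated

print(f"Unadjusted estimate:      ${base:.1f}M")
print(f"With full uplift applied: ${adjusted_estimate(base, starting_uplift):.1f}M")
print(f"With residual uplift:     ${adjusted_estimate(base, managed_uplift):.1f}M")
```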

Despite the statistics, people believe that they will not be affected by negative events, and optimism bias prevails.  Such optimism is independent of gender, ethnicity, nationality and age but is, it seems, dependent on political persuasion and which political party is in power.  Even when risk is appraised by experienced and competent people there can be a tendency for the group to convince themselves that risks are unlikely, or that planned contingency measures will be so robust that nothing can go wrong – this is groupthink.

Most people want their ventures to be successful, and anybody who would ‘rock the boat’ or raise controversial issues may well alienate themselves from the group.  Unfortunately, by wanting to belong, group members make efforts to cultivate harmony and conformity and avoid conflict, even the constructive kind.  Decisions become a compromise, leading to dysfunctional behavior whereby alternative views are dismissed and critical thinking is curtailed.  Nay-sayers are cast out, and the remaining herd believes it has become stronger as everybody ‘goes with the flow’ and the ‘weaker’ non-conformists are ostracised.

Emergency Management

The recognized emergency management process is characterized by ‘mitigation’, ‘preparedness’, ‘response’ and ‘recovery’.  Mitigation identifies hazards and reduces their likelihood or impact, while preparedness requires obtaining equipment and technology and the training and drilling of personnel in expectation of the predicted emergency.  If the emergency happens there is the ‘response’: the controlled implementation of plans, which are then adapted as events unfold.  After any emergency there is the ‘recovery’, during which the aftermath is addressed.

If the planning is correct, then the response boils down to dealing with the situation according to plan, and all’s well that ends well.  However, if the emergency is unplanned and unexpected, the resultant shock can create panic, and effort is then wasted as people protect their position by blaming anybody and everybody who was also unable to see the future properly.

After any emergency comes the recovery; if the planning was correct and the response successful then the team merely did their job and the response was controlled.  Sometimes it may even be perceived that the emergency was ‘business as usual’; just look at the Y2K prediction of potential global mayhem because of a couple of digits in the date (another small thing, by the way).  However, if the emergency response is an uncontrolled reaction, an enquiry will inevitably be convened.  After all, and in true human behavior, blame needs to be apportioned, the guilty punished, and the non-involved can even be recognised or promoted; we only need to look at COVID and the current events (January 2022) that are unfolding.

In the aftermath of an enquiry there is usually the mandatory imposition of more rigorous controls, new rules and regulations, or even statutory legislation.  These may provide ‘assurance’ that the last emergency shouldn’t repeat itself, but a culture of blame and cover-up, and of finding ways around bureaucratic hurdles, can result while the real lesson behind these controls is sadly forgotten.  As Churchill said, echoing Santayana, “Those who fail to learn from history are condemned to repeat it”, and if we do not understand why we do something then the resultant blind obedience may lead to shortsightedness when it comes to foresight.  As they say, rules and regulations are ‘for the obedience of fools and the guidance of wise men’.

Conclusions

Risk identification is at the core of forecasting emergencies and making adequate preparation.  However, an overly optimistic view of risk will inevitably reduce the ability of a project team to respond to a crisis in an effective or efficient manner.  Effort may then be wasted as it is spent dealing with the shock, anger and denial of unplanned and unexpected events.  Effort is also wasted on blaming others as parties get in with their story first rather than dealing with the issue at hand.

If there is groupthink it can be difficult for any realist to advocate that risks can happen.  When one is part of a herd of optimists it can be easier to ‘go with the flow’ rather than risk being ostracised.  But when an emergency happens that flow may well become a raging torrent and saying ‘I told you so’ after the fact will not control the stampede as the herd runs for cover.

Emergencies happen and crises will occur if risks and scenarios are missed or, worse, ignored.  Low-probability events can hide away in a risk register and be overlooked by optimists, but low probability does not mean no probability.  All risks can materialise if you wait long enough, so it is important that probabilities are reviewed as projects progress.  Risk identification is not just a one-off exercise.  Any risk register and response planning must keep up with the times because forecasts are time-bound and, given that a forecast is merely a prediction of what should have happened if what actually happened hadn’t, we need to keep forecasts current or risk being ambushed by our own complacency.

Bio:

Malcolm Peart is a UK Chartered Engineer & Chartered Geologist with over thirty-five years’ international experience in multicultural environments on large multidisciplinary infrastructure projects including rail, metro, hydro, airports, tunnels, roads and bridges. His skills include project management, contract administration & procurement, and design & construction management as Client, Consultant, and Contractor.

 
