#42 – WHEN IS A BLACK SWAN NOT A BLACK SWAN? – GEARY SIKICH

INTRODUCTION
There seem to be a lot of sightings of ‘black swans’ lately. Should we be concerned, are we caught up in media hype, or are we misinterpreting what a black swan event really is? The term black swan has become a popular buzzword for many, including contingency planners, risk managers and consultants. But are there really that many occurrences that meet the criteria for a black swan, or are we just caught up in the popularity of the moment?

The definition of a black swan according to Nassim Nicholas Taleb, author of ‘The Black Swan: The Impact of the Highly Improbable,’ is:

A black swan is a highly improbable event with three principal characteristics: it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was. 

THE PROBLEM
There is a general lack of knowledge when it comes to rare events with serious consequences. This is due to the rarity of the occurrence of such events. In his book, Taleb states that “the effect of a single observation, event or element plays a disproportionate role in decision-making creating estimation errors when projecting the severity of the consequences of the event. The depth of consequence and the breadth of consequence are underestimated resulting in surprise at the impact of the event.”

To quote again from Taleb, “The problem, simply stated (which I have had to repeat continuously) is about the degradation of knowledge when it comes to rare events (“tail events”), with serious consequences in some domains I call “Extremistan” (where these events play a large role, manifested by the disproportionate role of one single observation, event, or element, in the aggregate properties). I hold that this is a severe and consequential statistical and epistemological problem as we cannot assess the degree of knowledge that allows us to gauge the severity of the estimation errors. Alas, nobody has examined this problem in the history of thought, let alone try to start classifying decision-making and robustness under various types of ignorance and the setting of boundaries of statistical and empirical knowledge. Furthermore, to be more aggressive, while limits like those attributed to Gödel bear massive philosophical consequences, but we can’t do much about them, I believe that the limits to empirical and statistical knowledge I have shown have both practical (if not vital) importance and we can do a lot with them in terms of solutions, with the “fourth quadrant approach”, by ranking decisions based on the severity of the potential estimation error of the pair probability times consequence (Taleb, 2009; Makridakis and Taleb, 2009; Blyth, 2010, this issue).” 
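
Taleb’s ‘fourth quadrant approach’ can be made concrete with a short sketch. The sketch below is my own paraphrase of his published description, not Taleb’s code; the function name and quadrant labels are mine, and the drilling example is only an illustration. The point is that decisions are riskiest where heavy-tailed domains (‘Extremistan’) meet complex, exposure-dependent payoffs, because there the estimation error of the pair probability times consequence can be enormous.

```python
# Illustrative sketch of Taleb's "fourth quadrant" (my paraphrase; the
# function and labels are hypothetical, not Taleb's own code). Decisions
# are ranked by how badly the estimation error of probability x
# consequence can hurt: thin-tailed domains with simple payoffs are
# statistically safe; heavy-tailed domains with complex payoffs are
# where model error dominates and extra robustness is needed.

def fourth_quadrant(heavy_tailed: bool, complex_payoff: bool) -> str:
    if not heavy_tailed and not complex_payoff:
        return "Q1: thin tails, simple payoff - statistics is reliable"
    if not heavy_tailed and complex_payoff:
        return "Q2: thin tails, complex payoff - statistics still usable"
    if heavy_tailed and not complex_payoff:
        return "Q3: heavy tails, simple payoff - proceed with care"
    return "Q4: heavy tails, complex payoff - fragile; rank as highest risk"

# Hypothetical example: deepwater-drilling liability arguably sits in Q4,
# since loss distributions are heavy-tailed and exposure is nonlinear.
print(fourth_quadrant(heavy_tailed=True, complex_payoff=True))
```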

There was intense media focus (crisis of the moment) on the eruption of the Icelandic volcano Eyjafjallajökull and the recent Deepwater Horizon catastrophe. Note that far less media attention was paid to the subsequent sinking of the Aban Pearl, an offshore gas platform in Venezuela, on 13 May 2010.

Some have classified the recent eruption of Eyjafjallajökull and the Deepwater Horizon catastrophe as black swan events. If these are black swans, then shouldn’t we also classify the Aban Pearl as a black swan? Or is the Aban Pearl not a black swan because it did not get the media attention that the Deepwater Horizon has been receiving? Note also that Taleb’s definition of a black swan consists of three elements:

“it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random” 

While the events cited above meet one of the criteria for a black swan – unpredictability – the massive impact of each is yet to be determined, and we have yet to see explanations that make these events appear less random.

CHALLENGES FOR PLANNERS, STRATEGISTS AND CEOS
I would not necessarily classify the recent eruption of Eyjafjallajökull, the Deepwater Horizon catastrophe or the Aban Pearl sinking as black swan events, although their impact (yet to be fully determined) may be far reaching. They are not unexpected: the volcano exists and has erupted before, and offshore rigs have exploded and sunk before (e.g., Piper Alpha, 6 July 1988, killing 167 and costing an estimated $1.27 billion). The three events cited do have black swan qualities when viewed in the context of today’s complex global environment. This, I believe, is the greatest challenge for the strategist, planner and CEO: to develop strategies that are flexible enough to adapt to unforeseen circumstances while meeting corporate goals and objectives. This requires a rethinking of contingency planning, competitive intelligence activities and cross-functional relationships, both internal and external.

Figure one, below, depicts the effect of an outlier event that triggers independent events and reactionary events that result in a cumulative black swan event/effect.


Figure one recognizes four elements (a minimal simulation sketch follows the list):

  • Agents (outlier events) acting in parallel
  • Continuously changing circumstances
  • Reactionary response creates potential cascades resulting in cumulative effects
  • Lack of pattern recognition leads to a failure to anticipate the future.
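
To make the cumulative effect concrete, here is a minimal simulation sketch assuming a toy model of my own devising; the agent count, trigger probability and damping factor are all hypothetical parameters, not drawn from the article. Outlier shocks arrive in parallel, each can trigger reactionary events, and those reactions can themselves cascade, so the cumulative effect exceeds the direct impact of the initial outliers.

```python
import random

# Toy cascade model (illustrative only; all parameters are hypothetical).
# Parallel outlier shocks trigger reactionary events, which can cascade
# further; the cumulative impact exceeds the sum of the initial shocks.

def simulate_cascade(n_agents=8, trigger_prob=0.25, damping=0.8, seed=7):
    random.seed(seed)
    initial = [random.expovariate(1.0) for _ in range(n_agents)]  # agents acting in parallel
    cumulative = sum(initial)
    frontier = list(initial)
    while frontier:
        shock = frontier.pop()
        for _ in range(n_agents):                 # continuously changing circumstances
            if random.random() < trigger_prob:    # reactionary response
                reaction = shock * damping * random.random()
                if reaction > 0.01:               # ignore negligible ripples
                    cumulative += reaction
                    frontier.append(reaction)     # cascades compound
    return sum(initial), cumulative

direct, total = simulate_cascade()
print(f"direct impact of outliers: {direct:.2f}; cumulative effect: {total:.2f}")
```

With the damping factor below the cascade threshold the process dies out, but the cumulative effect is still a multiple of the direct impact, which is the pattern the figure describes: it is the reactions, not the original outliers, that produce the black swan scale of loss.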

How does one overcome the cumulative effect of outlier events? We have to rethink business operations and begin to focus on what I will term ‘strategy at the edge of chaos.’ This should not be considered a radically new concept in management thinking; rather, it recognizes that while strategic concepts sit at the threshold of management theory, appropriate strategic responses do not always happen fast enough. Markets are not in a static equilibrium: the recent crisis in Europe has cascaded from Greece to concerns over the banking systems in Spain and Portugal, and may even see Germany leave the European Union. Markets and organizations tend to be reactive, evolving and difficult to predict and control.

COMPLEX ADAPTIVE SYSTEMS
Unpredictability is the new normal. Rigid forecasts, cast in stone, cannot be changed without reputational damage; strategists, planners and CEOs are therefore better served by making assumptions. An assumption can be changed and adjusted; assumptions are flexible and less damaging to an enterprise’s (or person’s) reputation. Unpredictability can be positive or negative. Never underestimate the impact of change (we live in a rapidly changing, interconnected world), inflation (not just monetary inflation, but also the inflated impact of improbable events), opportunity (recognize the ‘white swan’ effect) and the ultimate consumer (the effect of losing customers is the factor most often overlooked in contingency plans).

12 STEPS TO GET FROM HERE TO THERE AND TEMPER THE IMPACT OF BLACK SWANS
Michael J. Kami, author of the book ‘Trigger Points: How to Make Decisions Three Times Faster,’ wrote that an increased rate of knowledge creates increased unpredictability. Stanley Davis and Christopher Meyer, authors of the book ‘Blur: The Speed of Change in the Connected Economy,’ cite ‘speed – connectivity – intangibles’ as key driving forces. If we take these points in the context of the black swan as defined by Taleb, we see that our increasingly complex systems (the globalized economy, etc.) are at risk. Kami outlines 12 steps in his book that provide some useful insight. How you apply them to your enterprise may give you a greater ability to temper the impact of black swan events.

Step 1: Where Are We? Develop an External Environment Profile

Key focal point: What are the key factors in our external environment and how much can we control them?

Step 2: Where Are We? Develop an Internal Environment Profile

Key focal point: Build detailed snapshots of your business activities as they are at present.

Step 3: Where Are We Going? Develop Assumptions about the Future External Environment

Key focal point: Catalog future influences systematically; know your key challenges and threats.

Step 4: Where Can We Go? Develop a Capabilities Profile

Key focal point: What are our strengths and needs? How are we doing in our key results and activities areas?

Step 5: Where Might We Go? Develop Future Internal Environment Assumptions

Key focal point: Build assumptions, potentials, etc. Do not build predictions or forecasts! Assess what the future business situation might look like.

Step 6: Where Do We Want to Go? Develop Objectives

Key focal point: Create a pyramid of objectives; redefine your business; set functional objectives.

Step 7: What Do We Have to Do? Develop a Gap Analysis Profile

Key focal point: What will be the effect of new external forces? What assumptions can we make about future changes to our environment?

Step 8: What Could We Do? Opportunities and Problems

Key focal point: Act to fill the gaps. Conduct an opportunity-problem feasibility analysis; risk analysis assessment; resource-requirements assessment. Build action program proposals.

Step 9: What Should We Do? Select Strategy and Program Objectives

Key focal point: Classify strategy and program objectives; make explicit commitments; adjust objectives.

Step 10: How Can We Do It? Implementation

Key focal point: Evaluate the impact of new programs.

Step 11: How Are We Doing? Control

Key focal point: Monitor external environment. Analyze fiscal and physical variances. Conduct an overall assessment.

Step 12: Change What’s Not Working – Revise, Control, Remain Flexible

Key focal point: Revise strategy and program objectives as needed; revise explicit commitments as needed; adjust objectives as needed.

I would add the following comments to Kami’s 12 points and to Davis and Meyer’s point on speed, connectivity and intangibles. Understanding the complexity of an event facilitates the organization’s ability to adapt, provided it can broaden its strategic approach. Within the context of complexity, touchpoints that go unrecognized create potential chaos for an enterprise and for complex systems. Positive and negative feedback must be observed and acted on promptly. The single biggest threat to an enterprise is staying with a previously successful business model too long and failing to adapt to the fluidity of situations (i.e., black swans). Where weak cause-and-effect linkages go unrecognized, small and isolated changes can have huge impacts. Ever-growing complexity will make the strategic challenge more urgent for strategists, planners and CEOs.

Taleb offers the following two definitions. The first is ‘Mediocristan’: a domain dominated by the mediocre, with few extreme successes or failures, where no single observation can meaningfully affect the aggregate. In Mediocristan the present is described and the future forecast through heavy reliance on past historical information, with a heavy dependence on independent probabilities.

The second is ‘Extremistan’: a domain where the total can conceivably be impacted by a single observation. In Extremistan it is recognized that the most important events by far cannot be predicted, so there is less dependence on theory and more focus on conditional probabilities. Rare events must always be unexpected; otherwise they would not occur as surprises, and they would not be rare.
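
The contrast can be shown numerically. In the sketch below (my own example, not Taleb’s; the distributions and parameters are assumptions chosen only to contrast thin and heavy tails), the largest single observation is a negligible share of a Gaussian aggregate, while a single Pareto draw can dominate the total.

```python
import random

# Illustrative contrast (hypothetical parameters, not Taleb's data):
# in a thin-tailed domain (Mediocristan, e.g. human height) the largest
# observation barely moves the aggregate; in a heavy-tailed domain
# (Extremistan, e.g. wealth) one observation can dominate the total.

random.seed(1)
n = 100_000

heights = [random.gauss(175, 7) for _ in range(n)]       # Mediocristan
wealth = [random.paretovariate(1.1) for _ in range(n)]   # Extremistan

for name, sample in [("height", heights), ("wealth", wealth)]:
    share = max(sample) / sum(sample)
    print(f"largest single {name} observation = {share:.4%} of the total")

# Typical result: the tallest person contributes roughly 0.001% of total
# height, while the largest wealth draw can contribute several percent.
```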

When faced with the presence of the unexpected, normality believers (Mediocristanians) will tremble and exacerbate the downfall. Common sense dictates that reliance on the record of the past (history) as a tool to forecast the future is not very useful: you will never be able to capture all the variables that affect decision making, and we forget that something new in the picture can distort everything so much that it makes past references useless. Put simply, today we face asymmetric threats (black swans and white swans) that can include the use of surprise in all its operational and strategic dimensions, and the introduction and use of products/services in ways unplanned by your organization and the markets that you serve. Asymmetric threats (not fighting fair) also include the prospect of an opponent designing a strategy that fundamentally alters the market in which you operate.

CONCLUSION
Taleb, in the revised second edition of ‘The Black Swan,’ posits the following: “How much more difficult is it to recreate an ice cube from a puddle than it is to forecast the shape of the puddle from the ice cube?” His point is that we confuse the two arrows: ice cube to puddle is not the same as puddle to ice cube. Ice cubes and puddles come in different sizes, shapes, etc. Thinking that we can move freely from theory to practice and from practice to theory creates the potential for failure.

While the Icelandic volcano will have non-regulatory consequences that could yet be far reaching, the regulatory deluge to be expected as a result of Deepwater Horizon could be a watershed event for the offshore drilling industry, much as the Oil Pollution Act of 1990 changed many oil companies’ shipping operations.

It takes 85 million barrels of oil per day globally, as well as millions of tons of coal and billions of cubic feet of natural gas, to enable modern society to operate as it does. We fail to see transparent vulnerabilities because they are all too recognizable and are therefore dismissed all too readily. To escape the trap of transparent vulnerabilities we need to overcome our natural tendency toward diagnostic bias.

A diagnostic bias is created when four elements combine to form a barrier to effective decision making. Recognizing diagnostic bias before it debilitates decision making can make all the difference in day-to-day operations; in crisis situations it is essential to avoid compounding initial errors. The four elements of diagnostic bias are:

  • Labeling
  • Loss aversion
  • Commitment
  • Value attribution.

Labeling creates blinders: it prevents you from seeing what is clearly before your face because all you see is the label. Loss aversion is essentially how far you are willing to go (continuing on a course) to avoid loss. Closely linked to loss aversion, commitment is a powerful force that shapes our thinking and decision making; it takes the form of rigidity and inflexibility of focus. Once we are committed to a course of action it is very difficult to weigh objective data, because we tend to see what we want to see and cast aside information that conflicts with our vision of reality. Finally, first encounters and initial impressions shape the value we attribute and therefore shape our future perception. Once we attribute a certain value, it dramatically alters our perception of subsequent information, even when the value attributed (assigned) is completely arbitrary.

Recognize that we are all swayed by factors that have nothing to do with logic or reason, and that diagnostic biases create a natural tendency not to see transparent vulnerabilities. We make diagnostic errors when we narrow down our field of possibilities and zero in on a single interpretation of a situation or person. While constructs help us to quickly assess a situation and form a temporary hypothesis about how to react (an initial opinion), they are restrictive: they are based on limited exposure time and limited data, and they overlook transparent vulnerabilities.

The best strategy for dealing with disoriented thinking is to be mindful (aware) and observe things for what they are (situational awareness), not for what they appear to be. Accept that your initial impressions could be wrong. Do not rely too heavily on preemptive judgments; they can short-circuit more rational evaluations. Are we asking the right questions? When was the last time you asked, “What variables (outliers, transparent vulnerabilities) have we overlooked?”

My colleague John Stagl adds the following regarding value: value is the perception of the receiver regarding the product or service being offered. Value is therefore never absolute; it is set by the receiver.

Some final thoughts: 

  •  If your organization is content with reacting to events it may not fare well;
  •  Innovative, aggressive thinking is one key to surviving;
  •  Recognition that theory is limited in usefulness is a key driving force;
  •  Strategically nimble organizations will benefit;
  •  Constantly question assumptions about what is ‘normal’.

Ten blind spots: 

#1: Not Stopping to Think

#2: What You Don’t Know Can Hurt You

#3: Not Noticing

#4: Not Seeing Yourself

#5: My Side Bias

#6: Trapped by Categories

#7: Jumping to Conclusions

#8: Fuzzy Evidence

#9: Missing Hidden Causes

#10: Missing the Big Picture.

In a crisis you get one chance – your first and last. Being lucky does not mean that you are good. Luck runs out eventually.

About the author 

Geary Sikich is a Principal with Logical Management Systems, Corp., a consulting and executive education firm with a focus on enterprise risk management and issues analysis; the firm’s web site is www.logicalmanagement.com. Geary is also engaged in the development and financing of private placement offerings in the alternative energy sector (biofuels, etc.), multi-media entertainment and advertising technology and food products. Geary developed LMSCARVER, the ‘Active Analysis’ framework, which directly links key value drivers to operating processes and activities. LMSCARVER provides a framework that enables a progressive approach to business planning, scenario planning, performance assessment and goal setting. Geary is an Adjunct Professor at Norwich University, where he teaches enterprise risk management (ERM) and contingency planning electives in the MSBC program. He is presently active in executive education, where he has developed and delivered courses in enterprise risk management, contingency planning, performance management and analytics. Geary is a frequent speaker on business continuity and business performance management issues. He is the author of over 200 published articles and four books, his latest being ‘Protecting Your Business in a Pandemic,’ published in June 2008 (available on Amazon.com).

Geary is a frequent speaker on high profile continuity issues, having developed and validated over 1,800 plans and conducted over 250 seminars and workshops worldwide for over 100 clients. Geary consults on a regular basis with companies worldwide on business-continuity and crisis management issues.

gsikich@logicalmanagement.com or g.sikich@att.net

REFERENCES 

Apgar, David, Risk Intelligence – Learning to Manage What We Don’t Know, Harvard Business School Press, 2006.

Davis, Stanley M., Christopher Meyer, Blur: The Speed of Change in the Connected Economy, (1998).

Kami, Michael J., Trigger Points: How to Make Decisions Three Times Faster, McGraw-Hill, 1988, ISBN 0-07-033219-3

Levene, Lord, “Changing Risk Environment for Global Business.” Union League Club of Chicago, April 8, 2003.

Orlov, Dmitry, Reinventing Collapse, New Society Publishers, first printing edition (June 1, 2008), ISBN-10: 0865716064, ISBN-13: 978-0865716063

Oil Rig Disasters, http://www.oilrigdisasters.co.uk/

Sikich, Geary W., The Financial Side of Crisis, 5th Annual Seminar on Crisis Management and Risk Communication, American Petroleum Institute, 1994

Sikich, Geary W., Managing Crisis at the Speed of Light, Disaster Recovery Journal Conference, 1999

Sikich, Geary W., Business Continuity & Crisis Management in the Internet/E-Business Era, Teltech, 2000

Sikich, Geary W., What is there to know about a crisis, John Liner Review, Volume 14, No. 4, 2001

Sikich, Geary W., The World We Live in: Are You Prepared for Disaster, Crisis Communication Series, Placeware and ConferZone web-based conference series Part I, January 24, 2002

Sikich, Geary W., September 11 Aftermath: Ten Things Your Organization Can Do Now, John Liner Review, Winter 2002, Volume 15, Number 4

Sikich, Geary W., Graceful Degradation and Agile Restoration Synopsis, Disaster Resource Guide, 2002

Sikich, Geary W., “Aftermath September 11th, Can Your Organization Afford to Wait”, New York State Bar Association, Federal and Commercial Litigation, Spring Conference, May 2002

Sikich, Geary W., “Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty,” PennWell Publishing, 2003

Sikich, Geary W., “It Can’t Happen Here: All Hazards Crisis Management Planning”, PennWell Publishing 1993.

Sikich Geary W., “The Emergency Management Planning Handbook”, McGraw Hill, 1995.

Sikich Geary W., Stagl, John M., “The Economic Consequences of a Pandemic”, Discover Financial Services Business Continuity Summit, 2005.

Tainter, Joseph, “The Collapse of Complex Societies,” Cambridge University Press (March 30, 1990), ISBN-10: 052138673X, ISBN-13: 978-0521386739

Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Random House, 2007, ISBN 978-1-4000-6351-2

Taleb, Nassim Nicholas, The Black Swan: The Impact of the Highly Improbable, Second Edition, Random House, 2010, ISBN 978-0-8129-7381-5

Taleb, Nassim Nicholas, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, 2005; updated edition, Random House, October 14, 2008, ISBN-13: 978-1400067930

Taleb, N.N., Common Errors in Interpreting the Ideas of The Black Swan and Associated Papers; NYU Poly Institute October 18, 2009

Vail, Jeff, The Logic of Collapse, www.karavans.com/collapse2.html, 2006

Venezuelan natural gas rig sinks, BBC News, 13 May 2010, http://news.bbc.co.uk/go/pr/fr/-/2/hi/americas/8679981.stm

This article is Copyright© Geary W. Sikich 2010. World rights reserved. 

http://www.continuitycentral.com/feature0778.html
