#2 – ON RISK – THE HUMAN ASPECT OF TECHNOLOGICAL CHANGE – DAN DONAHOE – DECISIONS@RISK™

My favorite economics slide is a hand-drawn depiction of the Malthusian Dilemma (runaway population growth) by a 1996 Nobel Prize in Chemistry laureate, the late Richard Smalley, who co-discovered a spherical carbon structure he named buckminsterfullerene after Buckminster Fuller, the futurist who both coined the term “spaceship earth” and invented the geodesic dome. Smalley scrawled the title “Technology Got Us into This” above a historical plot of the earth’s population showing the population explosion (from fewer than 1 billion people to over 7 billion in just 200 years), hinting that mankind has made a Faustian bargain with the Devil. Modern man spent roughly 150,000 years living off the land before the dawn of the industrial age, and now our species has “sold its soul,” trapped in a never-ending need for an upward spiral of technology. Without continued technical evolution, we can expect shortages that will produce cataclysmic events, beginning with economic collapse and ending in potential extinction.

Moreover, the required improvements in technology must be revolutionary. The necessary technical evolution includes new sustainable technologies, because mankind is approaching natural limits. So-called “peak oil” [production] has either already been passed or soon will be. The reserves of many minerals are uncertain. Despite many doubters, mankind is affecting the balance of gases in the atmosphere. This should be no surprise: the earth’s atmosphere is only about 62 miles thick, and since the earth is roughly 7,920 miles in diameter, the ratio of thickness to diameter is less than 1%, while mountain climbers call five miles of elevation the “death zone.” We inhabit a thin zone on this planet.
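
As a quick back-of-the-envelope check of that ratio, using the rounded figures quoted above (about 62 miles of atmosphere against a diameter of about 7,920 miles):

\[
\frac{62\ \text{miles}}{7{,}920\ \text{miles}} \approx 0.0078 \approx 0.8\% < 1\%
\]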

Although these risks due to population pressure are relatively well understood, the anti-intellectual spectrum of political responses to these challenges suggests something interesting about the way humans reason. The question might be, “Why do people oversimplify problems?” So risk analysis requires more than good mathematical modeling, solid theoretical frameworks, and repeatable empirical evidence.

Fortunately, enlightening work on human cognition by two psychologists, the late Amos Tversky and Daniel Kahneman, was recognized by the 2002 Nobel Prize in Economics. Their work provided insights into the way we think and explained systematic errors hard-wired into our heads. They described how emotions and intuition play major roles in our reasoning by allowing for fast decision making, but these shortcuts also introduce systematic errors. Please note that we are not focusing on “smart” and “stupid” here; we are focusing on how we all think, with built-in biases. Examples include a bias against cutting losses by walking away after a bad bet, and a built-in focus on changes rather than a more rational assessment of end states. There is too much to cover here, and I encourage the interested reader to contact me for references.

Since we are all prone to systematic errors, how should we proceed in performing risk analysis or implementing policy based on risk assessment? The best answer is: first, perform solid analysis; second, understand the types of errors we humans systematically make; third, engage colleagues who openly share their concerns in rational discussion; and fourth, incorporate criticism with humility. Thus, we best address our human challenges by being human. Our future rests on our humanity.
