#258 – THIS IS YOUR BRAIN ON RISK – ANDREW SHEVES

There are lots of things that are hard about risk and risk management: you are often dealing with abstractions and potential events; showing success can be challenging when doing your job well sometimes means nothing happens; and you might not be seen as adding value.

But the biggest challenges always seem to concern discussions about risk. These discussions are hard, even when we have set out clear definitions for what we mean by risk and have everyone on the same page theory-wise.

They’re hard because our brains aren’t optimized for these kinds of conversations.  If we aren’t careful, we can end up skewing things to the point where our discussions aren’t effective. Sometimes, we can actually cause problems.

Let me give you two quick examples. Firstly, how people’s experiences shape the conversation.

After the Arab Spring in the early 2010s, many companies were forced to evacuate large numbers of staff from North Africa and from around the Middle East. In some cases, people had gone from living in a stable, quiet location to being in the middle of a war zone in a very short period.

In the 12-24 months afterwards, this experience was still very raw and top-of-mind even for staff who hadn’t been directly involved. This meant that discussions around this experience could dominate any ‘risk’ conversation you had with these firms, often at the expense of time spent considering any other risk.

Here’s another example but this time it’s how what you say influences the conversation.

I was managing a risk assessment workshop with a company’s leadership team when I said something I instantly regretted:

“For example, what if there was a road traffic accident involving senior staff?”

From that point on, road traffic accidents cropped up in the conversation at a much higher frequency than was warranted. I had accidentally dropped an anchor around which the conversation now circled.

I’m sure these examples aren’t that unusual and you may well have your own experiences of risk conversations gone askew.

But what’s actually happening here and how can you avoid skewing the conversation when you’re talking about risk?

What’s actually going on?

There are three main things to keep in mind when you are having a conversation about risk.

Firstly, as I noted before, our brains aren’t optimized for abstract thinking: we are hard-wired to react instinctively in many situations. Secondly, our thinking and reactions are influenced by our own experiences and culture. Finally, we react to the type of information that is presented to us, the way it is presented, and the environment we are in when we receive it.

These factors all fall broadly under the category of System One thinking: thinking that “operates automatically and quickly with little or no effort and no sense of voluntary control” (Daniel Kahneman, Thinking, Fast and Slow).

The key point is that this is an activity that is automatic and fast.  This is great if a saber-toothed tiger is chasing you but terrible if you need to have an objective, thoughtful discussion about abstract risks.

Instead, we need System Two thinking, which “allocates attention to the effortful mental activities that demand it, including complex calculations” (Kahneman). But how can we do that when System One thinking is automatic?

Four ways to have a better discussion

Here are four things to consider when planning or framing a risk discussion. These won’t overcome all your challenges, but they should help tamp down System One thinking to make room for more considered System Two thoughts.

Biases (1): differences we might see.

We all have biases, whether we like it or not. Some are unconscious, such as a tendency to feel more comfortable with people who look and sound like us, or to feel uncomfortable with people who appear to be different; there are also ingrained biases around sex, sexuality, and race. We need to be mindful if we sense that these are subconsciously influencing a discussion.

But we also have to consider more conscious biases, where the subconscious comfort we have with people like us becomes an active dislike of people who are different. These lead to racism, sexism, and religious discrimination, which are more serious problems and impediments to a conversation but are also easier to spot.

Biases (2): differences we might not see.

But what about more subtle cultural biases? Americans view things differently than the French do, but Hispanic Americans might also view some things differently from Irish Americans. Yankees fans (boo!) will react to hearing ‘Boston’ even if we aren’t talking about the Red Sox. (And Mets fans will also react to reading the word ‘Yankees’, it seems!) Engineers and liberal arts graduates think about things differently. Parents view things differently than DINK couples.

Importantly, we all inhabit multiple cultures simultaneously, and these will affect our thinking in different ways, at different times, depending on the situation. One minute we think like an engineer, the next like a Yankees fan. Imagine two engineers who are both Yankees fans, but one is a mom of Irish descent and the other is a Hispanic dad. They simultaneously inhabit multiple cultures, some of which are shared and some of which are different.

Think about how you frame things and which ‘culture’ you are appealing to in your conversation.

Think about and acknowledge experiences.

If a group has had a unique, shared experience, this can overshadow any discussion.  Instead of trying to avoid the topic (only to have it come up anyway), consider addressing it first.  Get the ‘elephant in the room’ out of the way first but then be strict about not allowing it to dominate the discussions. It will still be there at the back of people’s minds but you can eliminate some of the influence this way.

Similarly, someone who has experienced a significant personal event might bring those feelings into the conversation, but this won’t be as obvious as a higher-profile group experience. Be mindful of how people are talking and the points they focus on or return to. These can highlight other factors that are affecting their thinking.

And what about your relationship with the group: are you an outsider or an insider? And if you’re an outsider, how do you deal with the fact that you aren’t part of the in-group?  In the Arab Spring example I gave above, this was a problem and it could be difficult to work with some of the groups because I was an outsider: “you won’t get it”.

Mind your language.

Think about what you say, the examples you use and even how you present the information. Something as innocuous as the font used or writing a percentage instead of a fraction can elicit very different responses. Beware of ‘anchoring’ a conversation about a topic or point, especially when it comes to numbers.

Remember, any number you give in an example will influence people’s thinking. If you ask, ‘is this more or less than $1,000?’, the answers you get will tend to skew toward $1,000, even if the true figure is only in the tens of dollars. You have anchored their evaluation to $1,000.

Think about your environment.

Are people looking at posters reminding them of an anti-harassment policy? Did you just finish a fire evacuation drill? These are environmental triggers or experiences that are immediately available to the individual, meaning they will come to mind more easily as examples. Also, is their line manager in the room? That might also make a difference in how they behave.

If you’ve ever watched a ‘mind-reader’ at work, spoken to a car salesman or seen an ad for a product, you will have seen some of these techniques at work.  These can be very subtle but are very powerful.

As risk managers, we need to be careful that we don’t allow these factors to influence our risk discussions too much. Keeping these points in mind won’t overcome all the difficulties associated with risk discussions, but it should help take some of the subjectivity, bias, and System One thinking out of them.

It takes time and requires care and attention on your part, but it’s worth it to have a richer, more thorough, and more productive conversation about risk.

You can find a longer piece on risk perspective and communications here.

Andrew Sheves Bio

Andrew Sheves is a risk, crisis, and security manager with over 25 years of experience managing risk in the commercial sector and in government. He has provided risk, security, and crisis management support worldwide to clients ranging from Fortune Five oil and gas firms, pharmaceutical majors and banks to NGOs, schools and high net worth individuals. This has allowed him to work at every stage of the risk management cycle from the field to the boardroom. During this time, Andrew has been involved in the response to a range of major incidents including offshore blowout, terrorism, civil unrest, pipeline spill, cyber attack, coup d’etat, and kidnapping.

Andrew has distilled these experiences down to first principles to develop the KISS Risk Management framework, a straightforward, effective and robust approach to risk management. This aims to make high-quality risk management tools, resources, and training accessible to as many people as possible, particularly those starting out in the field of risk.  He has also developed the dcdr.io risk management software platform and several online assessment tools to complement the KISS framework.

Andrew has an MSc in Risk, Crisis and Disaster Management from the University of Leicester and has written articles for several publications, including the RUSI Journal, ASIS Security Management magazine, and the International Association of Emergency Managers Bulletin.

Email – andrew@andrewsheves.com
Website – https://andrewsheves.com
Software – https://dcdr.io
Linkedin – https://www.linkedin.com/in/sheves/
