#128 – ALARM FATIGUE – JEFF HARRIS

In the mid-1960s, a man witnessed an awful multi-car pileup in San Francisco. That man happened to hold a PhD in psychology and was also an inventor, and he decided to do something about rear-end collisions. The result was a third brake light installed at the top of the rear window in automobiles. Testing showed that the addition of the third light reduced rear-end collisions by 50 to 60%.

Alerts designed to notify us of an increased risk of an adverse outcome are all around us. My daughter’s car has an indicator that notifies her when the tire pressure is low. I get a notification from my cell phone provider when we approach our data limit. Arguably, the setting with the most alerts is the modern hospital. Physician order entry systems flag potential medication problems, which is fine, but one study put the number of alerts physicians at one hospital had to override to successfully enter a medication order at a whopping 17,000 per month. Pharmacists at the same hospital had to deal with 175,000 alerts per month. That sounds bad, until you consider a different study that found nurses in one ICU processing 381,560 alerts per month. That’s roughly 12,700 blinking, shrieking, or otherwise annoying alerts a day. In one of the most understated examples of modern nomenclature, this is known as “alert fatigue”, a term which suggests it is an annoyance rather than a safety issue.

Clearly, it is a safety issue. A 2011 Boston Globe investigation found 200 deaths nationwide due to alarm fatigue. There are numerous case reports of heart monitors being turned off because of over-sensitive alarms, followed by patient deaths because the alarm was never turned back on. It’s easy to pass judgment in these cases, but how would you cope with 12,700 alarms a day?

The human brain is very good at blocking out distractions when immersed in a cognitively heavy process, and after enough overrides, a distraction is exactly what alerts become. Software vendors and health care organizations are hesitant to reduce the number of alerts out of fear of liability. What would happen if, in the era of the “smart car”, the brake lights came on not when the brakes were applied, but when an algorithm decided they should come on to alert the drivers behind the vehicle? Of course, the auto companies would err on the side of caution, and liability concerns would keep the lights on far longer than before.

How long would it take for the human brain to filter out this as a distraction? I’m thinking not very long.
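To make that tradeoff concrete, here is a toy simulation of my own (not from the article, and every number in it is invented: the hazard rate, both thresholds, and the risk-score model are all assumptions for illustration). It shows why a “cautious” alerting threshold backfires: nearly every alarm it fires is a false one, which is exactly the kind of signal the brain learns to filter out.

```python
# Toy illustration (all numbers invented): how a liability-driven
# "err on the side of caution" threshold floods people with false alarms.
import random

random.seed(42)

TRUE_HAZARD_RATE = 0.001   # fraction of moments that are genuinely dangerous
CAUTIOUS_THRESHOLD = 0.2   # alarm fires on the faintest hint of danger
STRICT_THRESHOLD = 0.8     # alarm fires only on a strong signal

def simulate(threshold, n=100_000):
    """Count how many alarms fire and how many flagged a real hazard."""
    alarms = hits = 0
    for _ in range(n):
        hazard = random.random() < TRUE_HAZARD_RATE
        # Noisy risk score: real hazards tend to score high, safe moments low,
        # but the two ranges overlap, as they do for any imperfect sensor.
        score = random.uniform(0.5, 1.0) if hazard else random.uniform(0.0, 0.6)
        if score >= threshold:
            alarms += 1
            hits += hazard
    return alarms, hits

for name, t in [("cautious", CAUTIOUS_THRESHOLD), ("strict", STRICT_THRESHOLD)]:
    alarms, hits = simulate(t)
    precision = hits / alarms if alarms else 0.0
    print(f"{name:8s} threshold={t:.1f}: {alarms:6d} alarms, "
          f"{hits} real hazards ({precision:.1%} useful)")
```

With these made-up numbers, the cautious setting fires tens of thousands of alarms of which only a fraction of a percent mark real danger, while the strict setting fires a handful, all of them real, yet misses some genuine hazards. That miss rate is precisely the liability fear that keeps vendors from raising the threshold, and the flood of false alarms is precisely what trains people to ignore the light.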

Bio:

Jeff Harris is a pharmacist with over 25 years of leadership experience in hospital, retail, and home health environments. Due to a spinal cord injury, he is currently on long-term disability. Jeff is passionate about patient safety, risk management, and cybersecurity issues in healthcare. He continues to research and write about improving healthcare on a pro bono basis.
