Analyzing alarm data was no easy task
Simply gathering the data about clinical alarms wasn’t enough to help The Johns Hopkins Hospital improve patient safety. Those numbers had to be broken down into meaningful parts.
When Maria Cvach, MSN, RN, CCRN, assistant director of nursing clinical standards, receives an alarm data download from the clinical engineering group, she cuts the data down from 29 fields to six, including such key data as the bed number, why the alarm sounded, and how long it sounded.
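The field-trimming step Cvach describes can be sketched in a few lines. This is a hypothetical illustration, not the hospital’s actual tooling: the article names only three of the six retained fields (bed number, alarm cause, and alarm duration), so the other field names below are invented placeholders.

```python
import csv
import io

# Hypothetical field names. The article names only bed number, alarm cause,
# and alarm duration among the six fields kept from the 29-field download.
KEEP = ["bed", "alarm_cause", "duration_sec", "priority", "unit", "timestamp"]

def trim_rows(rows):
    """Keep only the six key fields from each raw monitor record."""
    return [{k: row.get(k, "") for k in KEEP} for row in rows]

# A tiny stand-in for a clinical engineering download (extra fields dropped).
raw = csv.DictReader(io.StringIO(
    "bed,alarm_cause,duration_sec,priority,unit,timestamp,extra1,extra2\n"
    "12A,SpO2 low,45,low,CCU,2010-01-04T08:15,x,y\n"
))
trimmed = trim_rows(raw)
```

Selecting a handful of meaningful columns up front, as in this sketch, is what turns an unwieldy download into data that can actually be reviewed unit by unit.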
“Every physiological monitor on every patient generates lots of data,” Cvach says. “For example, we had to fine-tune 315 separate parameters for the monitor default parameters. Once we saw a download, we began to understand how many alarm conditions were occurring on the units.”
Members of the alarms task force decided to use the average number of patient alarm conditions per bed per day as the key metric that would guide their efforts. That metric has proved useful and is still used today to evaluate improvement efforts. Whenever alarm changes are implemented, the team first records baseline alarm condition data and then measures changes to that key metric as the improvement efforts move forward.
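The task force’s key metric is simple arithmetic: total alarm conditions divided by beds and days in the observation window. The sketch below illustrates the baseline-then-remeasure pattern described above; the unit size, day counts, and alarm totals are invented for the example.

```python
def alarms_per_bed_per_day(event_count, beds, days):
    """The task force's key metric: average alarm conditions
    per bed per day over an observation window."""
    return event_count / (beds * days)

# Hypothetical numbers: a 12-bed unit logging 840 alarm conditions in 7 days.
baseline = alarms_per_bed_per_day(840, beds=12, days=7)

# After an improvement effort, the same unit is re-measured against baseline.
post_change = alarms_per_bed_per_day(588, beds=12, days=7)
reduction = (baseline - post_change) / baseline  # fractional improvement
```

Because the metric is normalized per bed and per day, it lets units of different sizes, measured over different windows, be compared on the same scale.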
Using that monitor data, Cvach was able to determine that a large share of the alarms were low-priority, nonactionable “nuisance” alarms. “And even many of the true alarms were not clinically significant,” she says.
Members of the task force decided to tackle the alarms problem on several fronts. Their goal was to eliminate as many nuisance alarm conditions as possible and quiet the cacophony of sounds coming from monitors, infusion pumps, ventilators, bed exit systems, and the multiple other devices that beep or buzz and combine to make hospitals anything but the quiet, healing environments they were intended to be.