‘Culture of safety’ yields dramatic results

System wins Eisenberg award

A pharmacist noticed that two patients with the same name were admitted to the hospital on the same day. The pharmacist notified each unit and posted signs in the pharmacy warning staff of the coincidence and encouraging them to use extra caution to verify that they were giving the correct medicine to the correct patient.

A nurse recognized a co-worker for being a great "wingman" when he stopped her from interrupting another nurse getting medications out of the Pyxis machine. She realized that by interrupting the nurse she could have caused additional stress which could have triggered an error — all for the desire to save what turned out to be 10 seconds.

These are two examples of the impact of a systemwide "culture of safety" which won Sentara Healthcare the 2005 John M. Eisenberg Patient Safety and Quality award from the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) and the National Quality Forum for "Innovation in Patient Safety and Quality at a Local or Organizational Level."

The initiative began at Sentara Norfolk General Hospital (SNGH) in the fall of 2002 and incorporated successful safety principles from the nuclear power and aviation industries. It has since been implemented systemwide in Sentara’s five other hospitals and other sites of care.

"It’s about every employee learning our systems to prevent errors, patient injuries, and workplace accidents, and actively applying that knowledge in their work," says Gary Yates, MD, chief medical officer. This involves not only clinical staff, but also housekeepers, dietary, security, and environmental services staff, he adds.

The initiative begins with a three-part assessment that includes a safety culture survey, a common cause analysis of past events, and an assessment of current improvement practices and outcomes. Specific recommendations are tailored to each site but always include development of behavior-based expectations (BBEs) and revision of the facility’s event analysis approach.

BBEs are developed by task forces of staff from various departments who work with the results of the common cause analysis and lists of proven error prevention tools from other high-reliability industries. Here are several BBEs:

  • Communicate clearly, using repeat backs and read backs, clarifying questions and phonetic and numeric clarifications.
  • Have a questioning attitude, using "validate and verify."
  • Hand off effectively by using the "5P" technique: What is the patient or project? What is the plan? What is the purpose? Are there any problems? Do you need to take any precautions?
  • Never leave your "wingman." "This is all about being a good team member and coworker, checking each other’s work along the way, and peer coaching, which is about encouraging safe behavior and discouraging unsafe behavior," says Carole Stockmeier, a director of the Sentara Safety Initiative.

For each BBE, there are tools and techniques, such as use of the mnemonic STAR (Stop, Think, Act, Review) for paying attention to detail. "This allows your brain to catch up to what your hands are ready to do," she says. "The BBEs have given us a new language. When you walk through the halls, you start to hear the language of the new culture, with people saying, ‘Let me ask a clarifying question.’"

To address physician behavior, a parallel process was developed by a group of physician leaders and nurses. "These teams also looked at the assessment results and added behaviors that were important for the physician community," says Stockmeier. These include physician-to-physician communication for all consultations and establishment of a coordinating physician.

Since multiple doctors are caring for each patient, one member of the team must be the point person for the family and also for the nursing staff. "If a problem comes up, they must know exactly who to go to, so there is no confusion about who exactly is the captain of the ship," says Gene Burke, MD, vice president for medical affairs at SNGH.

Generally, when a patient came to the hospital, the doctor acting as the coordinating physician was the one who admitted the patient. "But as we become more specialty focused, sometimes that broke down," says Burke. "And if the doctor who admits the patient feels that he or she isn’t the best person to do that job, he arranges with another doctor to do it. There is more formalized recognition that there is one doctor who is the ‘go-to’ person."

The organization’s event analysis program also was revised to be more efficient and effective in identifying root causes and determining common causes of past events. "We have revamped our approach to root cause analysis for serious events, so it is quicker and much more operationally based," says Shannon Sayles, RN, a director of the patient safety initiative.

Previously, all the responsibility for analyzing the event, developing the action plan, and tracking progress fell on the shoulders of the quality management department.

"Now it is much more embedded in operations, with a senior leader serving as a sponsor to that particular project," says Sayles. "There is now an incredible awareness across the organization about cause analysis that we didn’t necessarily have before. It’s no longer stepping outside your role — it is your role."

The organization also trained a large number of staff from a wide range of departments, including security staff, nurses, and pharmacist analysts, in a two-day workshop. They now participate in the various levels of event analysis that occur in the organization.

"The more diverse you make your problem solving team, the better. If you have people all of one mindset looking at a problem, they will have a bias for the solution," says Yates.

Instead of gathering staff together and using typical tools such as flow charts and fishbone diagrams, there is more focus on individual interviews of all involved people, even those who were only indirectly involved, says Sayles. Individual failures within the event are analyzed and coded using charts that outline the various types of human error and system interface failure modes.

"Before, we looked at root causes much more superficially," she says. "If we had a wrong-site procedure, we might have said the root cause was they didn’t follow the procedure for time out, and left it at that, and reeducated staff."

During interviews with front-line staff in a recent analysis project, it was discovered that staff did the time out procedure differently depending on the physician. "Some of the staff are willing to confront the physicians and some aren’t," says Sayles. "We would have never gotten that type of information before."

In addition to human error and process failure modes, the new system also looks at management system failure modes. Instead of saying that staff didn’t follow the time out procedure, the focus is on leadership responsibility to build accountability to be sure the procedure is always followed.

The results of event analysis, including individual failure modes, are put into a database. These data become part of periodic common cause analyses that look for common themes and issues across a wide variety of cases. "As we get more cases in the database, we can break it down into the specific types of human and system failure modes," says Sayles.

In order to reinforce the behavior expectations, a group of "safety coaches" is developed in each facility. "These are front-line employees from all departments. We train them in observation techniques and how to coach and give constructive feedback," says Stockmeier. "The focus isn’t to audit and catch people doing it wrong, but to observe them doing it right and tell them how well they are doing."

Results to date have been promising, such as a 50% decrease in the sentinel and serious event rate. Senior leadership is encouraged by these outcomes, but knows that they are just a "few miles into a long marathon," says Burke. "This is all about making it ‘stick,’ and not allowing the work to become a ‘flavor of the month,’" he says.