Measuring safety culture — can it be done?

Determine where culture needs strengthening

Everyone knows that to have the kind of hospital that earns an A safety grade from The Leapfrog Group, you need an organization whose culture values safety. But how do you know that you do? And is there a way you can measure it?

Yes, says David M. Gaba, MD, associate dean for immersive and simulation-based learning and director of the Center for Immersive and Simulation-based Learning (CISL) at Stanford University School of Medicine. He recently spoke on the topic at a conference of High Reliability Organizations (HROs) sponsored by The Joint Commission.

People often compare healthcare and its safety efforts to industries such as aviation or nuclear power — industries with an element of intrinsic hazard, where the kind of culture you have can determine whether you succeed or fail, and where failures can cost lives.

"People aren't airplanes, I know, but much of what we do is intrinsically hazardous, and we have regular situations that can harm patients, if not kill them," he says.

That's why people make the comparisons and aim to improve safety records to the reliability level of those industries. It will never be the same — he uses the analogy that if safety is a clock, then aviation is at four and healthcare is at eight. "We don't need to get to four o'clock, but we do need to change the time."

This emphasis on other industries has a parallel when you try to measure the culture of safety, Gaba notes, because to do so in a way that is even close to accurate, you have to use ethnographic and anthropological techniques, with embedded observers. "But that's expensive, hard to do on a large scale, and takes a long time."

So what do you do? Forget about measuring the entire culture. To use a weather analogy, you may want to describe a location's climate in general, but to get there, you need to know what the weather is like day to day, he says. Using interviews and surveys, you can learn how things look on the surface. "It's not very precise, but it can give you a really good handle on things and point you to some of the things you need to study further."

Gaba objects to looking for the easy or cheap way to do something — like measuring safety culture. "We have done easy and cheap stuff and we are still hurting people," he says. "Let's be realistic: There are constraints of both time and money. But we can either look at the same things we've always measured and not change, or do something harder, that's different and can promote change."

His objection to "easy" noted, Gaba says that the kind of study you want to do isn't really very hard, and can help you determine where your culture needs strengthening.

For example, in his research, Gaba and his co-investigators use the same kinds of questions you see in surveys done by organizations such as the Agency for Healthcare Research and Quality (AHRQ). But rather than look at the majority answers — those that might indicate you do something 80% of the time or that 90% of respondents think you have a really responsive organization — look at the minority answers.

"You want to know how big a minority answer needs to be before it makes a difference," he explains. In HROs, at a minority response of 10%, the powers that be get worried. At that level, the H in HRO is jeopardized. For an example of how Gaba has looked at those minority answers to gauge hospital safety, see his 2003 paper in Quality and Safety in Healthcare, "The culture of safety: results of an organization-wide survey in 15 California hospitals" (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1743680/pdf/v012p00112.pdf).

He also mentions a study of naval aviators and healthcare professionals where the difference in those minority "problematic answers" was considered. For naval aviators, the problematic response rate was 4%. For healthcare professionals, it was around three times that level. That comparison held for most of the 90 hospitals in the study, and for all work areas. A very few were down near the aviation percentage, and some had problematic response rates as high as 20%. "Who those were and what they do differently, we haven't figured out yet," he says.

So knowing your problematic response rate is a start. It might not tell you what's going on, he says, but it can give you an idea of the degree of problem.
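The minority-answer approach Gaba describes can be sketched in a few lines: for each work area, compute the fraction of respondents who gave a "problematic" answer and flag any area that crosses a worry threshold. This is only an illustration — the survey data, work-area names, and the 10% threshold here are hypothetical placeholders, not figures from Gaba's studies.

```python
def problematic_rate(responses):
    """Fraction of responses flagged as problematic (True = problematic answer)."""
    return sum(responses) / len(responses)

# Hypothetical survey results: one boolean per respondent for a single item.
survey = {
    "ICU":       [True] + [False] * 9,    # 1 of 10 problematic
    "Pharmacy":  [True] + [False] * 24,   # 1 of 25 problematic
    "Radiology": [True, True] + [False] * 8,  # 2 of 10 problematic
}

# Gaba: in HROs, "the powers that be get worried" around a 10% minority response.
THRESHOLD = 0.10

for area, answers in survey.items():
    rate = problematic_rate(answers)
    status = "investigate" if rate >= THRESHOLD else "ok"
    print(f"{area}: {rate:.0%} problematic -> {status}")
```

Run against the toy data above, only Pharmacy (4%) stays below the line; the ICU (10%) and Radiology (20%) would be flagged for a closer look — which is the point Gaba makes: the rate doesn't tell you what's wrong, only where to dig.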

Rewarding staff for speaking up

In another study, Gaba looked at senior executives and managers versus ordinary workers. "There is a 5% or so gap between them in how they view the safety culture," he says. Managers and executives think things are better, and safer, on average, than frontline staff do.

To Gaba, that means that those upper-echelon workers need to get out and about more often and see what really goes on. "If you ask them if they have seen x, y, or z, they say no more often than the staff who do the work. If you ask them if staff takes safety seriously, they say yes more often than other workers. They can't just sit in their offices. They have to get out and look under the rocks."

One way to improve the safety culture once you have an idea of how good it is, he says, is to actually reward people for speaking up and raising "credible safety concerns" even if it turns out they are wrong. If someone says something, it's checked, and it turns out to be nothing, Gaba says people will often grumble about wasted time or effort. But having someone who is willing to speak up regardless of the potential for causing a delay or some work that proves unnecessary in the end is exactly the kind of culture you should strive for. "We need to reward them for that, or next time they might not speak up."

Calls for help are good, and people who are willing to make them are good for the organization, he says. "That's the kind of culture we need to have."

"It's always a struggle to figure out how to measure dedication to safety," says Roger Paveza, manager of Illinois-based Assurance Safety Consulting, who has 18 years' experience in healthcare. One thing he thinks is helpful is to look first not at the responses of the common employee, but to start with management and executives. "Safety is a management process, not something that other people do," he says.

Consider how management implements policies and procedures, for instance. If they tout their great safety manual but do nothing to ensure that the information in it is used and flows down the chain to other employees — to every employee — then you have a gap.

Roxanne Osborne, RN, a risk management consultant for healthcare at the company, says you should also ask employees how they see the involvement of leadership and management in issues of safety. Like Gaba, she says there is valuable information in any gap between how employees view their organization's safety culture and leadership's role in it, and what management says about the same subject.

Perception surveys can also serve to open the eyes of management to deficiencies in safety culture. If the scores given by staff are low, says Paveza, leadership may re-evaluate its position, actions, and emphasis on safety.

"Questions about whether staff feels supported, if they feel mistakes are reviewed positively, if they feel comfortable bringing up issues — they are all telling about a safety culture," says Osborne. Measurement is important.

But Gaba emphasizes that you should not just look at the big numbers you get in such a survey, but the small minority proportion, as it can give you information that is just as vital.

For more information on this topic, contact:

  • David Gaba, MD, Associate Dean for Immersive and Simulation-based Learning and Director of the Center for Immersive and Simulation-based Learning (CISL) at Stanford University School of Medicine. Stanford, CA. Email: gaba@stanford.edu.
  • Roger Paveza, manager, and Roxanne Osborne, RN, Healthcare Risk Management Consultant. Assurance Safety Consulting. Schaumburg, IL. Telephone: (847) 463-7223.