11 mistakes that cripple your root-cause analysis

You might think you’ve done enough root-cause analyses (RCAs) that they’re old hat by now and you can just cruise through the process. If so, you’re probably making common mistakes that create inconclusive analyses that fall short of addressing the problem.

Even if you’re not cocky and still feel intimidated by the RCA process, you’re susceptible to the same fatal errors, according to experts who have conducted plenty of RCAs and seen more than a few fall flat. Vigilance is the way to keep your RCA skills sharp and avoid wasting your time, says Patrice Spath, RHIT, a consultant in Forest Grove, OR, who specializes in helping health care providers comply with the standards set by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO).

Risk managers have adopted RCAs as a standard tool, as JCAHO intended, but using it too frequently can dull its edge. "This is a tool that we all use on a regular basis and it’s easy to get complacent about how it’s done," Spath says. "Any risk manager would benefit by stepping back and assessing just how you’re doing this analysis. There probably is room for improvement."

Kenneth A. Hirsch, MD, PhD, a practicing psychiatrist in Honolulu and director of Medical Risk Management Associates, a consulting firm, seconds that warning. His company specializes in assisting providers with RCAs and other investigations. When Healthcare Risk Management asked Spath and Hirsch for the most common mistakes risk managers make with RCAs, they identified 11 errors to avoid:

1. Not doing enough RCAs.

Spath says the most common error occurs when selecting which incidents or issues require an RCA. Near misses are overlooked far too often, she says. "The biggest mistake with RCAs is that people don’t do enough of them," Spath says. "Because an RCA takes resources, people are reluctant to initiate an RCA and then they end up having a more severe event later on that could have been prevented if they had done the RCA on a near miss or a pattern of near misses."

Risk managers go wrong when they rely only on JCAHO’s criteria for when to do an RCA, Spath says. You’re better off if you consider those criteria the minimum standards for when an RCA is required and then create your own internal decision-making process, she says.

The criteria will differ for each health care provider, but Spath suggests setting certain benchmarks that signal a potential problem. For instance, you may want to flag all cases in which the length of stay was more than two days beyond the expected period.

Having clear, objective criteria will ensure that you do more RCAs on potential sentinel events, near misses, and other patterns that might otherwise fail to prompt a full investigation, she says. "When a situation comes to your attention, there should be a discussion among the risk manager and other senior leaders about whether an RCA would be valuable. Often that discussion doesn’t occur, or if it does you don’t have any objective criteria to base a decision on," she says. "So they look at their schedule, see that they’re busy already, and it’s easy to decide that an RCA isn’t really needed here."

2. Including too few people.

Organizations often attempt to do an RCA with only one or two people, instead of as a team, Hirsch says. It is more expensive to do the analysis as a team and it takes longer, but the quality of the product is infinitely better, he adds. The first major effort in any RCA is coming up with a very detailed sequence of events. If you don’t include all the people who were involved, major steps are overlooked, Hirsch notes. And it is not enough to have a manager saying, "Here’s what I think the staff do in that situation," or "Here’s what they’re supposed to do."

"You have to have the staff there to say, ‘I know we’re supposed to do it this way, but we can’t because we’re short staffed,’" he says. "Interviewing them is not a good substitute because they feel intimidated and end up just answering your specific questions. You need them to participate so they can relate what they know."

Otherwise, Hirsch says, you will miss variations from the process, and the real-world practices that people build in to make up for those shortfalls. "My RCAs are open to every person in the department," he says. "We make sure certain people attend the meeting, and then anyone else is free to come if they want. Some will attend only the first or second meeting, and that’s OK."

3. Not including your attorney.

If hospital counsel is present at the RCA meetings, the proceedings become attorney work product, which provides an added layer of protection from future disclosure, Hirsch says.

4. Failure to flowchart the processes.

People tend to avoid flowcharting during an RCA because it is tedious, he says. Maybe so, but you should do it anyway. "It’s a nuisance but the visual display really helps people more than just stating the sequence of events," Hirsch says. "And ask questions. Between these two blocks, is there anything else that happens? If you’re skipping steps, you’re missing possible sources of variation."

5. Focusing too much on a single root cause.

The name of the RCA implies, intentionally or not, that you’re trying to find a single cause. That’s rarely the case, Hirsch says. "I think we should focus more on contributory causes. It’s more likely a number of factors that came together, not just one huge thing," he says. "Look for the multiple things that increased the likelihood of this happening."

6. Performing an insufficient barrier analysis.

A good RCA should include time spent brainstorming on what barriers should have prevented or minimized the impact of the incident. The barrier can be physical, such as a handrail, or a policy or procedure that should have prevented the bad outcome. "Most people do a barrier analysis, but they don’t do enough," Hirsch says. "Ask what barriers existed and didn’t work optimally, and why. And what barriers didn’t exist but might have been helpful?"

7. Not going deep enough with the overall analysis.

RCA teams have a tendency to stop before getting deep enough, Hirsch says. Often, the analysis stops right at the point where the problem can be attributed to a lack of resources or shortcomings in leadership. That shouldn’t be the end unless you’ve determined why there was a lack of resources and what leadership could have done differently. "You can always ask another question, but know you’ve gone deep enough if you can no longer meaningfully ask why or how," he says. "You can stop when you just don’t have the information and can’t get it. Or when you have the answers and can make an impact by changing them."

8. Overlooking environmental issues.

RCAs often fail to address environmental issues, meaning all the factors that affected the individuals involved. Was the person distracted? Was it noisy in the area? Was the employee on mental overload because you were asking her to do too much at one time?

9. Not including leadership in your RCA.

For RCAs related to sentinel events, you always should have someone from senior leadership involved in every meeting of the team, Hirsch notes. "When JCAHO sees involvement of senior leadership, you’ve scored big points. And if they see that you had no leadership involvement, you’re going to get hit for that and deservedly so," he says. "Upper management doesn’t have to be all that active in the meetings, but they must be there and participating. Be sure to document who’s present in your meeting records."

For other RCAs, it is not as crucial to have senior leaders involved. But those meetings could be a good way for them to get experience and be ready for the sentinel event RCAs, Hirsch says.

10. Not following through.

Most facilities don’t have a good tracking system set up for the various improvements coming out of an RCA, Hirsch says. You identify the contributory factors and what to do about them, what must be measured in the follow-up, what deadlines must be met, but then there is no system to assess how all that turns out. "A lot of RCA teams will come up with final recommendations and leave it at that," he says. "They never really know if it was implemented, or, if it was, whether it had an impact."

Many RCA teams also do not adequately apprise senior leadership of their results, Hirsch says. It is typical for teams to specify many details of each recommendation, such as who is going to implement it and by what deadline, but Hirsch says there should be one more item on the checklist: leadership response.

11. Not spreading the word of what you learn.

People tend to make the most of the information gleaned from the RCA in the area in which the incident occurred, but those lessons learned often are not disseminated through the organization, Spath says. If the incident occurred in the intensive care unit, such as mixing medications improperly because the label can be misread easily, that information could be applicable to other units in the hospital. "Often the people who learn the most from the RCA are the ones in the units where that incident occurred, which sounds fine at first," Spath says. "An RCA should be much more useful to the organization as a whole. Too often there is no mechanism for providing that information to others in the organization."