Health care organizations are experiencing the mixed blessing of the best of times and the worst of times: Data are everywhere, providing a rich opportunity to streamline operations, improve outcomes, and save money. But it’s also a time when the sheer volume of requests for data and the necessity to report the results to various audiences are vexing many a hospital administrator.
"We work with hundreds of hospitals, and it is very common for all of them to gather data, try to explain them in too short a time, and not do anything with them afterwards," says Angie Merkel, MBA, director of the Ann Arbor, MI-based consulting firm The Medstat Group. "And as data become more important to the hospitals and more departments and people depend on them, more reports are generated."
The topic was a popular one at Medstat’s recent client conference. "Every head in this presentation was nodding," says Merkel. "Everyone understands and identifies with this challenge. Hospitals have a lot of sources of information and data and many users and many audiences with varied levels of sophistication. Most hospitals have to gather data and present them in a meaningful manner to many audiences so they can understand them, internalize them, and put them to use. It’s a real challenge."
But not all hospitals are up to the challenge, Merkel says. If your hospital can’t manage data effectively, "you can’t use your data as a quality improvement tool," she points out.
Andree Joyeaux, director of communications at Medstat, says that it’s just as bad to have the right data presented poorly as it is to have poor data presented well. "Or you can have the right data, well-presented, but without action steps. If you fall down on any one component, you lose the use of them."
There’s no point in doing all the work to collect data if you can’t use them, says Judy Sikes, PhD, CPHQ, director of accreditation/medical staff services at Parkview Medical Center in Pueblo, CO. Sikes gave a presentation at the Medstat conference based on her experience at Parkview, a 305-bed hospital. "There is so much information and only 15 minutes to talk about it," she says. "We have a very involved board, but I don’t think they have enough information to be a resource."
Remedial classes helped get board members up to speed. Sikes did inservices on how to analyze a chart and asked board members what they needed to make the data more meaningful. "Until that happened, I would spend all this time putting these data together into these nice reports. I’d give it to them, and even though they cared, it was just too much information in too short a period of time, with too little supporting information."
An example is the ORYX indicator on mortalities for acute myocardial infarctions (AMI). "I would present data showing that we were 1% lower than the national average [of 4%] and below comparable ORYX data," says Sikes. "They’d think that was great. But it’s not enough. We want to look at that 3% and see what we can do better."
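The arithmetic behind that point is simple, which is exactly why the number alone says so little. Here is a minimal sketch in Python; the rates and the benchmark comparison are illustrative assumptions, not Parkview’s actual figures:

```python
# Illustrative figures only: a 3% AMI mortality rate, 1 point below an
# assumed 4% national average.
HOSPITAL_AMI_MORTALITY = 0.03
NATIONAL_AVERAGE = 0.04

def beats_benchmark(rate: float, benchmark: float) -> bool:
    """Lower mortality is better, so beating the benchmark means coming in under it."""
    return rate < benchmark

if beats_benchmark(HOSPITAL_AMI_MORTALITY, NATIONAL_AVERAGE):
    # Sikes' point: clearing the benchmark is where analysis starts, not ends.
    print(f"Below benchmark, but {HOSPITAL_AMI_MORTALITY:.0%} of AMI patients "
          "still died; those cases deserve a closer look.")
```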
Another type of data used is documentation of the Glasgow Coma Scale for head injury patients in the emergency department. The hospital was about 42% compliant. "I looked at that and thought we could do better, but if you look at the national data, [that rate is] 37%, and [the rate for] ORYX data is 30%," she says. "You have to question if this is an important indicator, because it’s just not enough to be better than the rest."
Looking deeper into the data, Sikes and her team found that there was a problem in data collection. "When discharging people, we had put a code of 20 for discharge to home, but that was coming across as discharge to acute care bed in our software."
A manual chart review changed the data to show Parkview was doing even better than the 42%. "We are still working on it, but we think now we are at 58% to 60% compliant. We don’t have a lot of cases. There are a lot of exclusions for this indicator. But the board can really get into this story. It’s not just ‘We’re at 42%, end of story.’ They understand that it’s not enough to just be better."
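Sikes doesn’t describe how the mis-mapped code fed the indicator, so the sketch below is a hypothetical reconstruction: the field names, the code mapping, and the exclusion rule are all invented for illustration. It shows how re-mapping one discharge code can change who counts as eligible, and with it the compliance rate:

```python
# Hypothetical reconstruction of the discharge-code fix; field names,
# the code mapping, and the exclusion rule are invented for illustration.
RAW_TO_DISPOSITION = {20: "home"}  # code 20 actually meant discharge to home

def corrected_compliance(charts):
    """Re-map mis-coded dispositions, then recompute the GCS documentation rate."""
    eligible = []
    for chart in charts:
        disposition = RAW_TO_DISPOSITION.get(
            chart["discharge_code"], chart["recorded_disposition"])
        # In this sketch, transfers to an acute care bed fall outside the
        # ED head-injury indicator; home discharges are counted.
        if disposition == "acute_care_bed":
            continue
        eligible.append(chart)
    documented = sum(1 for c in eligible if c["gcs_documented"])
    return documented / len(eligible) if eligible else 0.0

charts = [
    {"discharge_code": 20, "recorded_disposition": "acute_care_bed",
     "gcs_documented": True},   # mis-coded: this patient really went home
    {"discharge_code": 1, "recorded_disposition": "home",
     "gcs_documented": False},
]
print(f"corrected rate: {corrected_compliance(charts):.0%}")  # 50% on this toy data
```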
The goal can’t simply be to reach 100% compliance for every target, Sikes explains. "The goal has to be doing better than we did last time." The number is important, she admits, "but it is more than simply a number. We have to try to think of the difference between enumerative and analytic statistics. If you say we are at a 3% mortality rate and want to know all about that 3%, you can find out how many patients came from out of town, who the insurers were, who the physicians were, and whether they got early referrals. We may find out that a lot of these mortalities were people who missed physician appointments. Maybe reducing that number further is as easy as instituting a reminder program."
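That enumerative-versus-analytic distinction maps naturally onto a stratified drill-down. A minimal sketch, assuming the mortality cases sit in a table whose columns (insurer, physician, and so on) are hypothetical:

```python
import pandas as pd

# Hypothetical mortality cases; every column name here is invented.
deaths = pd.DataFrame({
    "insurer": ["A", "A", "B"],
    "physician": ["Dr. X", "Dr. Y", "Dr. X"],
    "out_of_town": [True, False, True],
    "early_referral": [False, True, False],
    "missed_appointments": [3, 0, 2],
})

# Enumerative: the headline number the board usually sees.
print(f"{len(deaths)} AMI deaths in the period")

# Analytic: slice the same cases along each characteristic to find
# patterns worth acting on.
for column in ["insurer", "physician", "out_of_town", "early_referral"]:
    print(deaths.groupby(column).size(), "\n")

# If deaths cluster among patients who missed appointments, a reminder
# program may be the cheapest intervention, as Sikes suggests.
print("mean missed appointments among deaths:",
      deaths["missed_appointments"].mean())
```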
When board members hear more than numbers, Sikes adds, "they are more likely to help with resources and direction."
Another trick Sikes has learned is to weed out extraneous data. She sat down with the CEO and the vice president of medical affairs to try to come up with a way to provide more information in less time. "We just have to weed out the nonessential numbers. We can’t ask board members to take home everything. What is the information that is really vital?" She took a typical ORYX report and looked at it. "One data collection tool indicates the day of the week that an AMI patient came in. That’s five pages of graphs, but it doesn’t make a difference. I’d rather give them something about what causes the 3%."
Sikes asked department heads what might be potential causes for those mortalities. It’s really hard to get physicians to commit their time to serving on committees, she says. "But if you can take them data that are valid, they will look at them and respond. Show them what will make a difference to even one patient, show it concisely, and they will look at it."
Parkview now has more physician participation on the quality improvement committee than ever before. "One physician even wants to work on a policy on wrong-site surgeries and sentinel events. We’ve never even had this happen here, but he wants to create a system to make sure it never does."
If you don’t look at data from all sides, you can’t find relevant information, Sikes argues. "You have to collect and report these data for [the Joint Commission on Accreditation of Healthcare Organizations] anyway. You might as well make a difference with them. When we first started doing this in 1998, everyone collected data, but 90% wasn’t earth-shattering — things like how many phone calls do you get a day. Now we ask the people who do the jobs what data are important."
With such small profit margins in health care and declining reimbursements, "you can’t afford to stay in business if you aren’t doing productive things that make your organization better."
There are morale reasons for making the data count, too, she adds. "We have a shortage of nurses around the country, and there is more documentation and paperwork they have to do than ever. If you ask them to collect more data, but ensure [the information] is meaningful and will help the patient, they will do it. If not, they won’t like it."
Taking a single piece of data and acting on it without looking deeper into its meanings can lead to mistakes, too, says Sikes. "Your decisions may not be valid if you don’t have the whole picture. You can actually do harm if you only look at one characteristic of a piece of data."
The problem, Sikes says, is that there is too much paper and too little time. Also, different stakeholders often need different materials. While process improvement committees may need to see some data monthly, the whole board may only need to see them annually. Physicians may want to see quarterly data. "I’ve taught classes on data collection for years," says Sikes. "I have found it’s a real waste of time to use nonpertinent examples, both in teaching and in reporting." For instance, when she first started teaching senior managers, most of the examples were about manufacturing. "That was really hard to translate for us. But once you make it interesting, they are more likely to learn."
Now she uses a particular committee’s or department’s own reports for teaching. The same goes for reporting. Why give a surgeon information on claims denials? If it doesn’t speak to the listener, he or she won’t hear it. "All the data we take are tailored to the audience we take them to," she adds.
Make sure the design of your reports is appropriate. Sikes recommends extensive use of graphs and charts, accompanied by a short analysis, pertinent information, and any recommendations. Her suggested phrasing follows these lines: "We looked at this data and found X. Therefore we met with X and discovered X. Based on that information, we recommend the following action: XX."
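That finding-to-action structure is easy to enforce with a fill-in template. A minimal sketch; the field names are invented, and the filled-in example simply restates the AMI story from earlier in the article:

```python
# A fill-in template for Sikes' finding-to-action report structure.
# Field names are invented; the example values restate the AMI story above.
REPORT = (
    "We looked at {indicator} and found {finding}. "
    "Therefore we met with {stakeholders} and discovered {root_cause}. "
    "Based on that information, we recommend the following action: {action}."
)

print(REPORT.format(
    indicator="AMI mortality",
    finding="a 3% rate, below the 4% national average",
    stakeholders="the department heads",
    root_cause="several deaths followed missed physician appointments",
    action="institute an appointment reminder program",
))
```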
How can you tell if you need to overhaul your data collection and analysis systems? Sikes offers a telltale scenario: You report your data quarterly, make copies of graph pages, allow 15 minutes at a board meeting to present and answer questions, and get very few questions because no one understands the material. If that sounds familiar, something is wrong with your data reporting and analysis.
For more information, contact:
• Andree Joyeaux, Communications Director, and Angie Merkel, MBA, Director, The Medstat Group, 777 E. Eisenhower Parkway, Ann Arbor, MI 48108. Telephone: (734) 913-3000.
• Judy Sikes, PhD, CPHQ, Director of Accreditation/Medical Staff Services, Parkview Medical Center, 400 W. 16th St., Pueblo, CO 81003. Telephone: (719) 584-4650.