Benchmarking project becomes learning lesson
Lack of focus dooms joint effort
Benchmarking can lead to some wonderful improvements in your wound care program, but only if the study is done correctly. That’s the hard-learned lesson of participants in a benchmarking study that they later found lacked focus, good data collection, and follow-up.
After spending nearly a year to benchmark best practices in wound care in 1995, a group of six Southeastern hospitals abandoned the project. "We ended up with no guidelines of care from our basic results," says study participant Catherine Newhouse, RPT, director of rehabilitation services at the 588-bed Spartanburg (SC) Regional Medical Center.
"We were looking at a very broad area with very little in the journals that addressed best practices issues," notes participant Harriet Jeffords, RPT, HSHA, director of rehabilitation services at the 333-bed McLeod Regional Medical Center in Florence, SC. "But we weren’t able to draw any conclusions from the information we got."
Donna Beck, PT, director of rehabilitation at Anderson (SC) Area Medical Center, with 579 beds, adds, "Essentially, we did all this work and it sounded great, but we didn’t get anything back from it."
For a year, these program managers and their hospitals coordinated a plan that they believed would produce results, but which turned out to be a cruel disappointment. Study managers identified several reasons for the study’s failure:
• Insufficient data.
Not all participating hospitals collected data on the minimum number of diagnoses; those that fell short had to be dropped from the study.
• Inconsistent data.
The types of wounds for which data were collected ranged too widely.
• Too many caregivers.
Too many different caregivers were involved with wound care, and they did not all have the same skill levels in identifying the condition of wounds according to the data collection tool.
• Lack of carry-through.
Compliance with collection slipped as time went on.
Good start
The study began with a standardized data collection tool and a lot of input from the six hospitals. Caregivers chosen to collect the data were educated on how to use the tool to ensure data was collected uniformly, Newhouse says.
The hospitals also developed an initial assessment tool to identify:
• type and amount of tissue removed from the wound;
• stage of the wound;
• wound color;
• whether granulation was present;
• amount of tunneling;
• amount of undermining;
• edema;
• wound depth, width, and length in centimeters.
Assessments were to be done on the first visit, at every sixth visit, and again at discharge, Newhouse says.
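The assessment record and visit schedule described above can be sketched in code. This is a hypothetical illustration only — the field names, types, and staging scale below are assumptions, not the study’s actual data collection tool:

```python
from dataclasses import dataclass

@dataclass
class WoundAssessment:
    """One wound assessment record. Fields mirror the initial assessment
    tool described in the article; names and types are illustrative
    assumptions, not the study's actual schema."""
    visit_number: int
    tissue_removed: str       # type and amount of tissue removed from the wound
    stage: int                # wound stage (e.g., 1-4 is a common scale)
    color: str
    granulation_present: bool
    tunneling_cm: float       # amount of tunneling
    undermining_cm: float     # amount of undermining
    edema_present: bool
    depth_cm: float
    width_cm: float
    length_cm: float

def assessment_due(visit_number: int, is_discharge: bool) -> bool:
    """Assessments were done on the first visit, at every sixth visit,
    and again at discharge."""
    return visit_number == 1 or visit_number % 6 == 0 or is_discharge
```

A schedule helper like `assessment_due` makes the protocol explicit: `assessment_due(1, False)` and `assessment_due(12, False)` are due, while `assessment_due(5, False)` is not unless the patient is being discharged.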
Treatment options were broken down into categories such as:
• whether irrigation was done by hand or whirlpool;
• whether an antiseptic such as Betadine was added to the whirlpool;
• whether a topical debridement was done and whether it was done chemically or mechanically with scissors or tweezers.
Wounds were assigned a number according to their condition. This number was put into a computer for analysis. The hospitals also developed a tool for measuring the reliability of wound ratings to ensure caregivers were assessing wounds in the same way across all sites, Newhouse says.
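The article does not describe the hospitals’ reliability tool in detail, but checking that raters score wounds the same way is a standard inter-rater reliability problem. As one common approach — an assumption, not the study’s actual method — Cohen’s kappa compares observed agreement between two raters against the agreement expected by chance:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical ratings
    (e.g., wound condition numbers) to the same set of wounds.
    1.0 = perfect agreement; 0.0 = agreement no better than chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Proportion of wounds where the two raters agreed outright.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal rates.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    if expected == 1.0:          # both raters used a single category
        return 1.0
    return (observed - expected) / (1 - expected)
```

For example, two raters who assign identical ratings yield a kappa of 1.0, while raters whose agreement is no better than their category frequencies would predict yield 0.0 — a signal that, as the study later found, caregivers were not assessing wounds the same way.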
Despite the participants’ best efforts and education, the wound care data collection instrument proved to be just too complicated, says Beck. "I don’t think everybody consistently followed the guidelines for the initial assessment and post-treatment measures. Every time you saw someone, you were supposed to measure some basic things such as exudation and wound color. Some people just didn’t do it," Newhouse explains.
Lack of focus
Another problem was the lack of focus, participants say. Data should have been collected on one wound care diagnosis such as chronic ulcer or diabetic ulcer or burn, says Newhouse. "We did every wound that came in."
For example, data was collected on abscesses, amputations that hadn’t healed, arterial ulcers, gangrene, and pressure ulcers. "If we had just looked at our volumes of wounds and what area seemed to have the most problems, or if we had identified the most problematic patients across the sites, we could have focused our study on one wound condition and gotten information that would have been valuable," Newhouse emphasizes.
In addition, not enough education was provided to caregivers on how to collect data, Jeffords says. "It would have helped to have some type of film on data collection to provide reinforcement," she says. "We weren’t there to reinforce the practices as the study was being done."
Newhouse says, "We learned we needed to focus our study and improve data collection." In fact, some of the data was questionable. For example, one hospital reported it saw a patient 437 times and only got a 45% improvement in the wound. "We thought, no way, some of these numbers are screwy," she continues. "It was a good idea, but we did too much."