JCAHO implements quality indicator measurements
Revamped inspection criteria to begin next year
A tougher set of standards must be met by health systems seeking future accreditation from the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). For the first time, JCAHO has implemented requirements that consider quality of care. Meet these new requirements, and your health system will have the seal of approval for all to see, including payers shopping for a health system to contract with and patients looking for the best care. Fail to meet the quality requirements, and you may see payers and patients go to a competing health system.
The new quality indicator tracking system, called ORYX, will be phased in gradually over the next several years.
Meeting the requirements
Health systems seeking Joint Commission accreditation have until the end of 1997 to select an approved performance measurement system and a minimum of two quality indicators that relate to at least 20% of the health system’s patient population. (A list of about 60 approved reporting software vendors has been published on JCAHO’s Web site at http://www.jcaho.org.) Hospitals must begin reporting data to the commission no later than the first quarter of 1999, and quarterly thereafter, says Dennis O’Leary, MD, Joint Commission president.
"There’s a clear expectation of a demonstration of improvement," says O’Leary. "We’ve raised the crossbar."
While meeting the requirements will mean more work in the long run, hospitals will benefit from the objective feedback, he adds.
The benefits the Joint Commission cites for the stepped-up accreditation process include:
• Because the indicators will be monitored and compared over time, hospitals will get an "early warning" of problem areas, which can be corrected before they become more serious, and more costly.
• Areas of excellence within an organization will be identified, which will help hospitals select best practice criteria.
• Accreditation will serve as objective proof to payers and the public that quality care is being delivered in your institution, thereby increasing the amount of community trust in your organization.
Most hospitals aren’t expected to have any trouble meeting the initial requirements since many already benchmark quality indicators to some degree. "We fully expect that the first two indicators hospitals choose will be ones that make a favorable impression," O’Leary adds. "But the requirement levels will go up over time, so within a few years it won’t be so easy to identify such positive indicators. Our first priority is to get everyone on the train, then improve the process."
Some skepticism exists
Some professionals, however, express concern with the new initiative. "As I see it, the crux of the problem is trying to compare apples and oranges," says Patrice Spath, ART, a Forest Grove, OR, health care consultant specializing in quality and resource management issues. "If the Joint Commission takes measures from one [quality indicator measurement system] and bundles them with measures from another project, the outcome will be unreliable data because different projects maintain different data definitions. We would like to think that every indicator project defines, for example, patient falls in the same way. But they don’t. If I’m counting patient falls and using the definition applied by my project, my number of patient falls is not comparable to those of another project using a different data definition."
To minimize such discrepancies, O’Leary says the approved information collection systems will be audited continually to ensure data quality, and definitions will be standardized over time. This "should minimize anticipated problems," he adds. In the beginning, hospitals also will be compared only to other hospitals using the same data collection system.
Although Spath believes the collection of quality data on a local level is a good thing, she has reservations about the value of quality reporting on a broader level, especially considering the added cost to purchase, maintain, and operate such information systems. Hardware and software costs range from about $10,000 to more than $100,000. But that’s only the beginning. "Running them is resource intensive," Spath says. "Someone has to collect, format, and submit the data. This could result in a significant economic burden on hospitals."
In addition, she argues that hospitals with quality improvement staff usually already know where the problem areas are. "I’m all in favor of individual providers having comparative benchmarking data to evaluate their performance. At the local level, that data have a real impact on patient care improvement. But how will sending that data to the Joint Commission improve patient care quality?" Spath asks.
Whether it does or not, hospital and health system executives better get used to the idea of having quality comparisons made, says Dennis Hulet, FSA, principal of Milliman & Robertson in Seattle. In a decade or less, Hulet predicts, payers and the public routinely will seek those data to make health care purchasing decisions. Two major stumbling blocks, Hulet says, have been the sophistication of information collection systems and the lack of a central agency like the Joint Commission to distribute performance information. "I see both of those obstacles being overcome in the future."