JCAHO's ORYX system may have data flaws

Audit should improve quality

Some of the performance measurement systems linking up with the Joint Commission on Accreditation of Healthcare Organizations to report infection data do not perform risk adjustments, and are collecting data that are not recommended for comparison between hospitals, a Centers for Disease Control and Prevention official reported recently in Orlando at the annual conference of the Society for Healthcare Epidemiology of America (SHEA).

"I believe some of these so-called aggregating groups have aggregated their responsibilities," said Robert Gaynes, MD, director of the CDC's National Nosocomial Infection Surveillance (NNIS) system. "They must develop consistency in data collection. . . . The level of risk adjustment was either nonexistent, or at best, limited [or] clouded."

According to ORYX plans launched last year, each accredited hospital and long-term care organization in the United States must select (or already be participating in) one or more performance measurement systems that have been approved by the Joint Commission. Initial requirements call for institutions to select two clinical measures that cover at least 20% of their patient population. The number of measures and the percentage of patients covered will increase as the project is phased in over the next few years. After the selection process is over, the facilities are to begin submitting data by the first quarter of 1999 to the performance measurement systems, which will then report findings to the Joint Commission. Some of the systems will collect infection rate data and some will use other clinical measures.

Overall rates, administrative data found

In an attempt to catalog the various approaches proposed by the systems thus far, Gaynes and colleagues worked with the Joint Commission in contacting 211 of the systems that had signed on as of February 1998. The effort was somewhat hampered by confidentiality agreements the companies have made with the Joint Commission, and Gaynes did not specify which system reported which approach. Nonetheless, Gaynes reported that of the 130 systems that responded, 28 (21.5%) reported monitoring some aspect of infection rates.

Gaynes expressed concern about the approaches described by the systems. For example, one reported collecting only an overall nosocomial infection rate by admissions or discharges, two others reported collecting overall rates plus one additional infection measure, and six more indicated that "upon request" they will collect an overall rate of infections at all sites divided by the number of patients or patient days. The CDC recommends against using such overall rates for comparisons between hospitals, including, for example, an overall surgical-site infection (SSI) rate calculated across all types of procedures, he said. In addition, Gaynes reported that the systems rely heavily on coded administrative data on infections rather than on active clinical surveillance.
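The problem with such crude overall rates can be sketched with a small, purely illustrative calculation (the hospitals, infection counts, and patient-day figures below are hypothetical; the per-1,000-patient-day convention follows common surveillance practice). A hospital with a larger share of high-risk patients can show a worse crude rate even when its unit-level rates match its peer's:

```python
# Illustrative only: hypothetical counts showing why a crude overall
# infection rate can mislead when hospitals differ in case mix.

def rate_per_1000(infections, patient_days):
    """Infections per 1,000 patient-days."""
    return 1000 * infections / patient_days

# Hospital A: mostly low-risk general wards.
# Hospital B: large ICU population (higher intrinsic infection risk).
# Each entry maps a unit to (infections, patient-days).
hospital_a = {"ward": (10, 9000), "icu": (2, 1000)}
hospital_b = {"ward": (5, 4000), "icu": (12, 6000)}

for name, units in [("A", hospital_a), ("B", hospital_b)]:
    total_inf = sum(i for i, _ in units.values())
    total_days = sum(d for _, d in units.values())
    crude = rate_per_1000(total_inf, total_days)
    print(f"Hospital {name}: crude overall rate = {crude:.2f} per 1,000 patient-days")
    for unit, (i, d) in units.items():
        print(f"  {unit}: {rate_per_1000(i, d):.2f} per 1,000 patient-days")
```

In this made-up example both hospitals have identical ICU rates, yet Hospital B's crude overall rate looks markedly worse simply because more of its patient-days are ICU days; stratifying (or otherwise risk-adjusting) removes the distortion.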

NNIS can join through APIC link

Reviewing additional information available on 22 of the 28 systems that collect infection data, Gaynes found that 17 calculate SSI rates. Of those, however, 10 rely on administrative data sets rather than active surveillance.

"Nosocomial infection rate indicators are often based on administrative coded data, which is often biased since it usually functions for purposes of financial concern rather than purposes of clinical concern," he told SHEA attendees. "A number of studies have shown that less accurate information is obtained from coded data than by active surveillance by trained infection control practitioners. Second, it is surprisingly common that a number of rates are calculated that are currently not recommended for comparative purposes."

The CDC will report the findings to the Joint Commission, which plans to address such problems by developing "clinical core measures" to standardize data collection, Gaynes added. That is an area where SHEA and the Association for Professionals in Infection Control and Epidemiology (APIC) could provide important input, he said.

"If you have an indicator in the field of infection control, they will describe a half-dozen things that you might collect, but you have to collect it this certain way," he explained. "Now how are they going to determine that? That is potentially where the SHEA organization, perhaps APIC and others, may be able to help guide the Joint Commission."

In that regard, the CDC has linked up with APIC to form a new performance measurement system - NNIS-APIC - in order to allow the approximately 265 NNIS hospitals to voluntarily report infection data to the Joint Commission to meet ORYX requirements, Gaynes announced. The arrangement was made in part because NNIS system data are confidential under public health law, he noted.

"On the other hand, it does raise the second question, which is: If you are sending data to what is essentially the regulator of the hospital, is there the potential for a bias in the data?" he said. "If that's the case, could it, in fact, contaminate the database for those [NNIS] hospitals not choosing to participate? That is an unanswered question and one we are going to have to look at as we move into this. There is a variety of different view points and we agonized long and hard about whether we should participate."

Participation, even at arm's length, gives the CDC a chance to provide input into the ORYX program, and there are other favorable signs that the system will be improved considerably after a shake-out period. For example, plans call for an audit of the performance measurement systems by a Joint Commission advisory board that includes top national experts, noted Walter Hierholzer Jr., MD, epidemiologist at Yale New Haven (CT) Hospital.

"The bar to get in has been low, but an audit phase has started now," he told SHEA attendees. "The audit phase is going to conducted by this [Joint Commission] scientific advisory board. My expectation is that large numbers of these performance measurement providers are going to fall out. In fact, some of them have already."

As the ORYX systems and other infection rate reporting initiatives take shape, an essential element to preserve is the individual hospital's role in analyzing its own data, Gaynes noted.

"This cannot become a 'black box' where they send in data and occasionally get some sort of feedback that may or may not be relevant to what they actually want to look at," he said. "The individual hospital must be allowed to analyze their own data or else you're going end up with poorer and poorer quality."