JCAHO’s ORYX could set national benchmarks

But managers are leery of new costs

If you are an optimist, the new outcomes reporting requirements announced in February by the Joint Commission on Accreditation of Healthcare Organizations in Oakbrook Terrace, IL, could be seen as a benchmarking bonanza. That’s especially true if the new reporting effort, ORYX, leads to more standardization in outcomes reporting.

On the other hand, if your hospital system is not set up to collect and report outcomes electronically, getting ready for the new reporting requirement could put a big hole in your budget. And even if you are already using one of the approved vendors to assimilate data, your costs may go up because of Joint Commission requirements.

Then, there’s the pessimistic perspective: The Joint Commission tried this in 1996, with the ill-fated IMSystem (indicators measurement), and that effort has gone nowhere.

ORYX, named for an African antelope, is the initiative that integrates performance measures into the accreditation process. By the end of 1998, all hospitals and long-term care facilities must report two outcomes measurements electronically to the Joint Commission through one of a list of approved vendors.

Thereafter, hospitals will report quarterly. In turn, the Joint Commission will provide quarterly reports from its database to all hospitals. Ad hoc reports will be available at an additional cost. Other health care organizations accredited by JCAHO will have similar reporting requirements phased in over the next four to six years.

Starting up ORYX, the Joint Commission says, will cost hospitals about $10,000 on average, but that doesn't include the hidden costs of adopting a new system. Hospitals may need to hire new personnel or at least train current staff to perform this function. Ongoing maintenance costs should run about $11,000 annually.

Eventually, JCAHO would become the repository of immense databases of outcomes from which it would pull benchmarks. Fortunately, a majority of hospitals have little to fear from the first stage of the new requirements.

"Most hospitals are already benchmarking their performance," notes Indun Whetsell, RN, quality improvement coordinator at the Regional Medical Center of Orangeburg (SC) and Calhoun Counties. "If a hospital is already involved in some type of comparative project, such as South Carolina’s Quality Indicator Project or the Maryland Hospital Quality Indicator Project, that reporting system needs only to be blessed by the Joint Commission, and a hospital can continue doing what it’s doing."

Hidden costs are substantial

Others are less comfortable with the new requirement, however. "Some of the approved systems are relatively inexpensive," comments Patrice Spath, ART, consultant in health care quality and resource management in Forest Grove, OR. "Others cost up to $100,000. But costs go beyond buying hardware and software. Running them is resource-intensive. Someone has to collect and format and submit the data. This could result in a significant economic burden on hospitals."

"We anticipate that what we charge hospitals will increase now," says Karen Reeves, RN, vice president of professional services for the South Carolina Hospital Association in Columbia, "because the Joint Commission will have specific requirements for data transmission and analysis."

A couple of years after South Carolina’s Quality Indicator Project started, it charged hospitals $400 annually to participate in the acute care indicators. Today, it charges $800 each year. "But tomorrow?" asks Reeves. "I don’t know exactly what we’ll have to charge because the contract we executed didn’t specify all the future programming changes that may be required for data submission.

"We’ve been trying to ascertain exactly what the Joint Commission will be requiring of participants. Will the Joint Commission be responsible for developing charts, graphs, and reports, or will hospitals? None of that is outlined in the contract."

Costs will be passed on

"We did commit to payments of $5,000 per year over the next three years," says Reeves, "in order to be considered on the list of 60 systems that met the initial screening criteria. We’ll have to pass on some of those costs to our participating hospitals."

Dennis S. O’Leary, MD, president of the Joint Commission, says ORYX should not trigger survey fee increases by itself, but he does warn hospitals that the Joint Commission has gone three years with no survey fee increase, and some change is likely.

Can accreditation be denied based on performance measures? With ORYX, the Joint Commission will be reviewing performance data quarterly, so the evaluation process will be more continuous. At first the performance data will be compared in the context of a system-defined cohort — that is, among other hospitals using the same system, O’Leary says.

A clear and present scrutiny

"There’s a clear expectation of a demonstration of improvement," says O’Leary. "We’ve raised the crossbar. We’ll be looking at trend lines, and when we see one going in the wrong direction, we’ll ask for a written analysis of the problem. Ultimately, we’ll send out a team to investigate." The Joint Commission will expect evidence of improvement within six to 12 months of data collection.

But Spath questions how all of these data will affect patient care in an already tight financial environment. "How is the ORYX initiative actually going to improve patient care?" she asks.

"In this environment of cost containment, we can’t afford to spend dollars on things that are meaningless. I’m all in favor of individual providers having comparative benchmarking data to evaluate their performance," she says. "At the local level, those data have a real impact on patient care improvement. But how will sending those data to the Joint Commission improve patient care quality?

"If, for example, one facility’s C-section rates are higher than another’s, the QI personnel know that," Spath says. "The facility may have higher-risk patients. What action can the Joint Commission take on that issue? Will ORYX prevent sentinel events? Most sentinel events occur in low-volume patients. By definition, that population wouldn’t be a part of one of our indicators for the Joint Commission."

Eventually, according to the Joint Commission, specific indicators will be standardized nationally so everyone is comparing apples to apples. But is that task possible?

"As I see it, the crux of the problem is trying to compare apples to oranges," Spath says. "If the Joint Commission takes measures from, for example, the Maryland Quality Indicator Project and bundles them with measures from another project, the outcome will be unreliable data because different projects maintain different data definitions.

"We would like to think that every indicator project defines, for example, patient falls the same way," she says. "But they don’t. If I’m counting patient falls and using the definition applied by my project, my number of patient falls is not comparable to those of another project using a different data definition."

"The systems and the Joint Commission will conduct periodic data quality audits. That should minimize anticipated problems," says O’Leary. "ORYX PLUS will contain a single set of core measures, all of which have been tested for reliability, validity, and discrimination capability."

An accelerated option, ORYX PLUS will be available on a voluntary basis to facilities that track data at levels beyond the basic ORYX requirement. The Joint Commission found that 69% of hospitals already participate in at least one performance measurement system; some are using as many as 24 indicators.

Participants commit to public disclosure

Under ORYX PLUS, participating hospitals will choose at least 10 of 25 identified acute care measures and commit to future public disclosure of meaningful performance data. ORYX PLUS hospitals will constitute their own cohort; they will only be compared to one another, regardless of which system their data come through.

"We developed [ORYX PLUS] at the request of hospitals at an advanced stage in their use of performance measures," says O'Leary. "The system will utilize a common set of fully tested performance measures and will lead to the creation of a national database." The fee structure for ORYX PLUS has not been determined, and the Joint Commission has not estimated how many hospitals may eventually participate.

All agree that performance measures are essential for quality improvement, but opinions vary as to the initiative's efficacy. Some quality improvement personnel don't see the need for additional labor and reporting mechanisms for the Joint Commission unless they result in real gains.

"I'm happy that the Joint Commission is making an effort to eventually coordinate us," says Reeves. "If they can accomplish that, this initiative will be helpful. I do think it could be done faster and cheaper, however. I've been on different Joint Commission task forces, including their IMSystem advisory council. The Joint Commission is a political body. ORYX is definitely a money-making initiative."

[A list of approved reporting software manufacturers has been published on the Joint Commission’s World Wide Web site (http://www.jcaho.org).

The site offers information profiling each performance measurement system and guidance for selecting one that best meets your needs.]