Use ORYX to achieve your quality improvement goals

Maligned initiative can be benchmarking gold mine

It’s no secret that ORYX is a four-letter word at most health care institutions. Hospitals have been protesting the program since its inception, besieging the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) in Oakbrook Terrace, IL, with complaints of added work, too much expense, and no return on investment.

Still, says Janet Barrett, MSN, CPHQ, a hospital quality consultant and former director of quality systems at a rural medical center in Washington state, "with ORYX we find we are comparing results within communities and affecting care all over, not just within our own little group. Hospitals were used to measuring things, zillions of things, but in isolation within our own institutions. ORYX has required that we compare results with our colleagues and that our measurements be compared with other hospitals."

It’s been a year since the Joint Commission, using feedback from the health care field and dialogue among key stakeholders, ultimately focused on five core areas for hospitals to address:

• acute myocardial infarction;

• congestive heart failure;

• pregnancy and related conditions;

• pneumonia;

• surgical procedures and complications.

Advisory panels were set up to identify the clinical logic relevant to each focus area. The use of these core measures, according to the Joint Commission, will allow for benchmarking based on processes and actual outcomes of patient care. By having similar health care organizations use common measures, the Joint Commission hopes to provide meaningful performance measurement data that can be made available to the public.

Barrett points out that this can have a positive effect within a community and ultimately across the entire country. As an example, she cites a mid-sized town in Washington with three hospitals. "Take cardiology, for instance," she says. "They’re all treating heart patients, but there are unique system issues in each hospital." When those issues are identified and then compared among the three hospitals, she points out, everyone gains. Weaknesses and adjustments that are going on in Hospital A can be a heads-up for Hospitals B and C. Strengths in the cardiology department at Hospital B can be used as models for the other two. "Now when these hospitals go through the process of measuring data, they know they’ll be comparing that data with the other two institutions," Barrett adds. "It won’t just be a statement about their own individual programs."

In her work with one hospital, she uses the hospital’s first ORYX report as an example of the positive results ORYX reports can bring to health care. In that report, the hospital measured four areas:

• overall complications of care;

• surgical hemorrhage or hematoma;

• common complications for obstetrics;

• unplanned avoidable readmissions for obstetrics.

The good news? Recognized obstetrics measures were below expected rates and stable. The hospital was home free on that issue.

The bad news? Surgical hemorrhage and hematoma turned out to be the No. 1 surgical complication, with 60 incidents that year — which seemed high. The data also showed 15 accidental punctures in the operating room in a single quarter. This was an alarming statistic. "The physicians were appalled at the rates," Barrett says. "Their initial reaction was that it must be a coding snafu or a software problem." With even greater outrage, some claimed, "It’s the Joint Commission’s fault." Quality personnel responded by presenting outcomes analyses at every meeting they could and offering to run any report the doctors wanted in order to illustrate the data. They also organized mini-coding sessions for the physicians to illustrate the coders’ role and competence.

Then the hospital went to work to address the problems. Physicians and coders worked together to define accidental puncture incidents, hemorrhage or hematoma, and acute postoperative anemia. New technology was added to reduce the likelihood of punctures in the operating rooms for neurosurgery and orthopedics.

Drugs for surgery patients were reviewed, and the information revealed that patients over 65 were often given drugs that caused falls, confusion, or gastrointestinal symptoms. By adjusting these drug orders, the hospital reduced the incidence of postoperative confusion by 70%. The pharmacy and coders are now trying to find a system to communicate adverse drug reactions to clinicians.

"Why was ORYX important?" asks Barrett. "Because it focused the medical staff and organization on hard clinical data." The ORYX data were tough to argue with, tough to debunk, and, most of all, "tough to live with unless we did something."

The data also validated the hospital’s outstanding OB service and provided a source of ongoing measurement and improvements. "Finally," Barrett says, "it recognized the real relationship among care, outcomes, patient satisfaction, and resource utilization." She points out that the cost of an ORYX review is less annually than one accidental laceration in the OR. The solid comparative measures also help the hospital meet mission requirements. And they improve management of time, people, and resources, as well as staff satisfaction.

Barrett emphasizes the need to communicate information obtained from ORYX data to the medical staff, and to share the data outcomes and educate everyone from the CEO to RNs.

Barrett sees the ORYX program as a path to population-based statistics. "If we’re going to spend all this money, we need to come up with good statistics," she says. Citing one hospital she’s worked with, she notes that "until ORYX, this kind of data was hidden within the facility and released to selected administrators. Now, over a three-year period, they have opened up to other layers of management in the facility. Midlevel managers are now tuned into what is happening, from finance to falls. They are aware of the weak points and the action plans related to improvement."

That kind of open communication, of course, makes for far better morale within an institution.

Barrett sees good things ahead if the system pursues statistics in a well-organized manner. She expects that hospitals soon will get results from the whole groups of hospitals with which they are compared. "And in a few years, we should be able to get reports on birth rates, fall rates, surgery results, and so forth from everywhere in the country."