The ORYX core measurement pilot project is yielding two conclusions from the participating hospitals. The good news: The data look valuable and should help improve health care. The bad news: It takes a lot of work to get those data, according to some participants. Many health care providers probably won’t be ready when the time comes to start collecting the data nationwide, says Karen Reeves, vice president of professional services with the South Carolina Hospital Association in West Columbia. The association is one of several helping the Joint Commission on Accreditation of Healthcare Organizations to conduct a trial run.
"I don’t think administrators are ready for how people are going to come to them and say we need more people; we need more resources,’" Reeves says. "We’ve gotten used to using automated systems to do reviews, and now we’re going back to in-depth chart reviews. I’m afraid it’s going to take a lot more work than people are anticipating right now."
(Editor’s note: The August 2001 Hospital Peer Review erroneously reported that data collection started this year. As savvy HPR readers know, only those centers participating in the pilot project are collecting data now. Other accredited facilities will begin collecting the data in July 2002.)
Officials with the Joint Commission offer reassuring words and stress that the final ORYX system may not be identical to the pilot project. They say the reports of such labor-intensive experiences are just anecdotal so far, and even if the system does take a lot of work, there could be changes in the coming year that would ease the burden.
Jarod Logue, PhD, the Joint Commission’s vice president for research and performance measurement, confirms that pilot project participants are reporting it takes a lot of work to collect the data, but he cautions peer review professionals not to assume that means their own work in 2002 will be unreasonably hard. "Are these measures labor-intensive? The answer is absolutely," Logue says. "But this is a work in progress. What the pilot project participants are doing today is not necessarily what everyone else will do in 2002. These are groundbreakers, and they’re doing yeoman’s work to shape the process."
The core measure pilot project is a collaborative effort among the Joint Commission, five state hospital associations, five listed measurement systems, and 83 hospitals in nine states. The project is intended to give the Joint Commission, health care organizations, and performance measurement systems an early base of experience in implementing a limited number of core performance measures, which have been grouped into these five sets: acute myocardial infarction (AMI, nine measures), heart failure (HF, six measures), community-acquired pneumonia (CAP, nine measures), pregnancy and related conditions (PRC, three measures), and surgical procedures and complications (SPC, two measures).
The project began in September 1999, when Dennis O’Leary, MD, Joint Commission president, sent a letter to all of the state hospital associations soliciting interest in a pilot project that would assess the implementation of core measures by hospitals. Of the eleven qualified state hospital associations that expressed an interest, five were randomly selected to participate. Each was asked to identify a performance measurement system and 10 to 25 hospitals that also were interested in testing the implementation of core measures.
For the pilot project, participating hospitals have begun the data collection process for AMI, HF, and CAP measures, and the first transmission of data to the Joint Commission was received mid-April 2001. Throughout 2001, the Joint Commission, state hospital associations, measurement systems, and hospitals participating in the core measure pilot project have been working together to meet the objectives of the project and prepare for the national implementation of hospital core measures in 2002.
Hospitals will begin collecting core measure data for patient discharges beginning July 1, 2002. That date will come all too soon for some providers, Reeves says. "This is another case of the Joint Commission developing something that is, in its opinion, the reasonable, rational thing to do, but [the organization] underestimates how executing [the project] will have a major effect on hospitals as far as cost and staffing," Reeves says.
The Joint Commission explains that each accredited organization will be required to select core measure sets based on the health care services it provides. Surveyors will assess a health care organization’s use of its selected core measure sets in its performance improvement activities during the on-site survey process. Over time, the Joint Commission will use core measure data to help focus on-site survey evaluation activities.
The total number of measure sets an organization will be required to use will be relatively small. Organizations for which the core measure sets identified for their type are not applicable will continue to use noncore measures to meet ORYX requirements until applicable core measure sets are identified.
Other types of accredited organizations will continue using noncore measures to meet ORYX requirements until core measures are identified for them. Home care core measure identification will focus on adopting Outcome and Assessment Information Set-derived measures for home health agencies, while long-term care core measure identification will focus on the adoption of Minimum Data Set-derived measures. Core measures for home care and long-term care organizations are expected to be consistent with Centers for Medicare and Medicaid Services (formerly Health Care Financing Administration) requirements in order to reduce duplication in performance measurement activities.
For long-term care, home care, hospital, and behavioral health care organizations that are becoming accredited for the first time, the requirements to participate in a measurement system and transmit data to the Joint Commission have been suspended for two years or until core measures for these programs are implemented, whichever comes first. These organizations will satisfy performance measurement requirements by selecting and using six measures from the universe of relevant measures. For these organizations, Joint Commission surveyors will evaluate their use of performance measurement data for quality improvement during the on-site survey activities.
One peer review administrator in the pilot project tells HPR that it will yield "great, super data," but that it is "very labor-intensive." Indun Whetsell, RN, BSN, is quality management coordinator at The Regional Medical Center of Orangeburg (SC) and Calhoun County. Her facility is one of three participating in South Carolina. Staff at that facility are collecting data on the congestive heart failure and myocardial infarction measures. "They’re very specific data, which makes them really useful, but it’s all manual extraction," she says. "It literally means pulling a chart and sitting down to go through it and finding the right information. The data are at a much higher level than when we were just pulling it off the bills, but that [quality of information] comes at a price."
Staff members in the medical records department pull the chart and record the appropriate data. The chart review can take between 60 and 90 minutes of combined staff time, which includes everything from initially pulling the chart to abstracting the data and entering the information into the computer. That time-per-chart adds up quickly, she says.
To improve the process and make it faster, Regional Medical Center developed a form that nurses can use when providing patient education for congestive heart failure and myocardial infarction. The form lists all the education steps that are supposed to take place. The nurse checks off each step on the form, which is added to the patient’s chart. That puts all the patient education information in one place so the chart reviewer can find it more easily.
"The extraction is a lot simpler when they use that form," Whetsell says. All the information is centrally located. That’s the only tool we’ve put in place so far, but I think we’ll end up doing more, and I’d recommend that hospitals look for ways like that to make the process easier."
The process also requires physicians to check for things such as contraindications, information systems staff to pull demographic data, and clerical staff to input data. The resulting data are immediately useful, she says, but the pilot project has put a strain on the hospital. "It takes a lot of people to get good data," Whetsell says. "I think it’s going to be a shock for people next year. You know it’s coming, but until you do it, I don’t think you realize how much work you have to do and what’s involved."
According to Reeves, the other two South Carolina hospitals in the pilot project are reporting the same difficulty. When full-scale reporting starts in July 2002, many health care providers will find that they did not adequately budget for the increased labor, she predicts. "Hospitals will have to find a way to accommodate the labor requirements, and there is no vendor out there who can do it in an automated format," she says. "This is not the kind of data that can be captured on billing or automated data sets."
Aside from the staffing issues and the costs related to that problem, Reeves says hospitals are likely to find additional expenses that will blow their budgets. "We still don’t know how much it’s going to cost, and that’s extremely frustrating," she says. "No vendor can tell you the exact price for transition to the core measures, and hospitals are well into their budget planning. You can’t go to the administration at the last minute and say you need a lot more money. Hospitals are going to have to make last-minute decisions, and administrators will be surprised at the money they’re asking for."
Logue disputes Reeves’ assessment of how much time is required and says other hospitals have reported satisfaction with the process. He notes that a hospital’s experience could depend on what information processing systems already are in place, as well as other variables such as the hospital’s size and the level of support from the administration. (For recommendations on how to prepare for the July 2002 implementation, see "Prepare now by studying data sets, enlisting support from top," in this issue.) "Remember also that there is a learning curve in every hospital," Logue says. "It will not take as long to do it next month as it did this month, and it will take even less time the following month."
Whatever additional work is required, Logue says, it’s worth the effort. "Remarkably, the imposition of core measurement activities doesn’t seem to result in a significant difference in the work these hospitals are doing already to meet Joint Commission requirements," he says. "That doesn’t mean there hasn’t been an increase in full-time equivalents, but the key question is whether the juice is worth the squeeze," Logue adds. "The answer we’re hearing is, ‘Yes, it is.’"