Collecting data for ORYX: Here’s how to stay consistent, out of trouble
Quality managers should lead data-gathering teams
By now you’ve chosen your outcomes to report for ORYX, but the next step will be one of the riskiest in your quality improvement career: collecting, measuring, and reporting ORYX data consistently.
There are ways to make the process easier, or at least to ensure you will not waste your time collecting unreliable data for ORYX, a new initiative required of all hospitals accredited by the Joint Commission on Accreditation of Healthcare Organizations in Oakbrook Terrace, IL.
First, you’ll have to decide what role you personally will take in the process. The quality manager should take the lead in data collection, although experts say the process should be a team effort.
"I can’t see how any quality manager could collect this data alone," says Janet Houser, RN, MN, MS, regional director of professional practice for Mercy Regional Health System of Greater Cincinnati. Mercy is a nonprofit, integrated health system with 29 facilities in Cincinnati. "I think the quality manager is the natural place for the lead." Houser says the team should include medical staff, administrators, materials managers, information systems managers, and others.
The quality manager may be the coordinator who is involved in identifying the best source of data elements, but it might be the health information managers who are best suited to collect the data, says Patrice Spath, ART, BA, a health care and quality management consultant in Forest Grove, OR.
The quality manager’s role likely will depend on the size of the institution, agrees Janice Schriefer, RN, MSN, MBA, director of outcomes management for Butterworth Health System in Grand Rapids, MI. "In smaller institutions, the quality managers tend to wear multiple hats, including risk management and utilization management," Schriefer says. "But ideally it will be a team effort in all organizations."
Another good reason for taking a team approach is that it reduces data collection problems if the quality manager leaves the hospital system, Schriefer adds. "When people are on vacation or someone switches jobs, you should have a lot of people who know how to access the database."
Some measures are easier for a health system to collect consistently than others, and each hospital will have to live with its selections. Deborah Nadzam, PhD, RN, vice president for performance measurement at the Joint Commission, suggests quality managers find out what rules their vendors have for submitting and collating data. Then ask these questions:
• How does data submission work?
• Are there checks in place to make sure that each user of the measurement system collects data in the same way?
• Are the measures risk-adjusted?
What next? The experts offer these guidelines for gathering data in a way that is consistent, reliable, and valid:
• Create a central dictionary for measures. Consistent data collection means that each facility within a health care system must collect the information in the same way. This begins with identifying measures and giving them clear, measurable definitions, such as those found in a central dictionary, Houser says.
"That data dictionary needs to be the source for your system," Houser says. "It helps to start with an inventory of what data you have and what data you need."
It’s crucial to make certain that everyone enters the data the same way, following certain criteria, Spath says. "Make sure you understand the data definitions used by the people who put them in your own information management system, and don’t assume that the definitions provided by your vendor are the same definitions used by a business office or the people in the health information management department," Spath advises.
If a health system decided to measure mammography rates, for instance, the central dictionary would need to define exactly what goes into the numerator and the denominator. Candidate denominators could include all people in a managed care plan, all women in the community, or all women in the community who are at an age when they should have regular mammograms.
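To make the point concrete, here is a minimal sketch of what one data-dictionary entry and its rate calculation might look like. All field names and population figures are hypothetical, invented for illustration; they do not come from any real measurement system.

```python
# Hypothetical central-dictionary entry for a mammography-rate measure.
# The entry pins down exactly which counts belong in the numerator and
# denominator, so every facility in the system computes the same rate.
measure = {
    "name": "mammography_rate",
    "numerator": "women aged 50-69 with a mammogram in the past 2 years",
    "denominator": "all women aged 50-69 enrolled in the plan",
}

def mammography_rate(numerator: int, denominator: int) -> float:
    """Rate = screened women / eligible population."""
    if denominator <= 0:
        raise ValueError("denominator must be positive")
    return numerator / denominator

# Different denominator choices yield different rates for the same
# numerator -- which is why the dictionary must fix one definition.
screened = 420
plan_members = 600          # illustrative: women 50-69 in the plan
community_women = 1000      # illustrative: all women 50-69 in town

print(mammography_rate(screened, plan_members))    # 0.7
print(mammography_rate(screened, community_women)) # 0.42
```

The design point is that the definition, not the code, does the work: once numerator and denominator are fixed in the dictionary, two facilities cannot report incomparable rates by accident.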
• Make sure your vendor’s definitions are clear. Most vendors have systems in place for reliability that include clear definitions for coding information, Schriefer says.
Quality managers need to know the level of data quality control in the measurement system that they’ve chosen for the ORYX project, Spath says. "They need to find out if there’s a mechanism in place for the owner of the performance measurement system to validate whether they’re getting accurate data," Spath adds. "Or does the information go in with an assumption of accuracy?"
It’s important to find out whether the vendor’s data elements have standard or published definitions based on expert research, Spath and Schriefer suggest. For example, an infection indicator might use a definition from an authoritative body such as the Centers for Disease Control and Prevention in Atlanta.
"So they’ll say, ‘This is what we consider a heart attack; this is how we define an infection,’" Schriefer says. Butterworth’s vendor spells out how to collect the different indicators and defines what a readmission is and what a cesarean is. "It’s very clear and consistent, and that helps you improve validity and reliability," Schriefer says.
• Share the data with physicians and clinical employees. This is an important way to make sure your data are valid.
"If a data analyst doesn’t do day-to-day patient care, he or she may not notice if an indicator looks really funny," Schriefer says.
For example, a physician might notice that an infection rate does not appear to reflect what he or she has observed in the unit. When the physician questions these data, the quality manager might double check and discover that there was an error in coding the information that skewed the results.
It also makes good sense to share data with the people who know how and where quality improvements can be made, Schriefer says. "Plus it lets them know how they’re performing."
Houser suggests the information be shared with the people who have the most influence on improving quality and reducing costs. "By collaborating with those individuals, you are able to integrate the information about the risk of the population served and about the clinical outcomes you’ve achieved and at what cost," Houser adds. "Then you can target your improvement opportunities appropriately."
[Editor’s note: To find out more about the measurement systems collected by JCAHO, FACCT, and NCQA, you may visit their Web sites at, respectively: www.jcaho.org; www.facct.org; www.ncqa.org.
Or contact Doug Davidson, Director of Communications, Foundation for Accountability, 520 SW Sixth Ave., Suite 700, Portland, OR 97204. Telephone: (503) 223-2228. Janet Houser, RN, MN, MS, Regional Director of Professional Practice, Mercy Regional Health System Greater Cincinnati, 4340 Glendale Milford Road, Cincinnati, OH 45242. Telephone: (513) 956-5867. Deborah Nadzam, PhD, RN, Vice President for Performance Measurement, Joint Commission on Accreditation of Healthcare Organizations, 1 Renaissance Blvd., Oakbrook Terrace, IL 60181. Telephone: (508) 799-2100. Janice Schriefer, RN, MSN, MBA, Director of Outcomes Management, Butterworth Health System, 100 Michigan St., Grand Rapids, MI 49503. Telephone: (616) 391-2974. Patrice Spath, ART, BA, Consultant in Health Care and Quality Management, P.O. Box 721, Forest Grove, OR 97116. Telephone: (503) 357-9185. Web site: http://www.brownspath.com.]