Are your data collection efforts getting results?
Reduce redundant data collection
Misleading data that serve only to confuse administrators. Multiple departments collecting the same data without realizing it. Collecting large amounts of data but never addressing the problems they identify.
Do you struggle with these problems?
"As organizations are becoming more data-driven, there is an increased focus on the effectiveness of data collection activities," says Judy Homa-Lowry, RN, MS, CPHQ, president of Homa-Lowry Healthcare Consulting, based in Metamora, MI. "You may have a lot of measures or indicators that really don’t tell you much about patient care outcomes, or there may be duplication of data collection efforts," she adds.
What does this mean? Mainly that your organization is spending a lot of money on data collection that isn’t doing anything to improve patient care. "It’s very expensive to have people doing data collection if they are not going to use the data, or if the data are not credible," explains Homa-Lowry.
Here are strategies to improve data collection efforts from quality leaders:
- Make sure data are analyzed.
"There are still a fair number of organizations that are not doing analysis of the data," says Homa-Lowry. "They do a lot of data collection, but sometimes the data have not been evaluated for validity and reliability, or analyzed appropriately."
At Covenant HealthCare in Saginaw, MI, Six Sigma "black belts" currently are training "green belts" to run Six Sigma teams, says Ann D. Law, RN, outcomes specialist. "They are especially helpful in teaching us how to use data in the most meaningful way," she says. "The black belts are a significant asset to our hospital."
You need to consider more than just average values for key metrics, says Sheri McClintic, one of the Six Sigma black belts working with Covenant HealthCare. "Understanding the standard deviation and its effect on the process is important for making process improvements and prioritizing improvement efforts," she notes.
For instance, when analyzing turnaround time for test results, reporting the average times for 1,000 tests only shows part of the picture — you also need to know how often turnaround times were too long. "Analyzing the standard deviation gives a better understanding of how well you are meeting customer expectations," says McClintic.
To be sure that standard deviations are considered, as opposed to only average values, she recommends reporting both the percentage of compliance and average performance, such as "The average turnaround time is 55 minutes, and 74% of the tests meet the 60-minute turnaround time specification."
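McClintic's point can be sketched in a few lines of Python. The turnaround times below are hypothetical, but they show how an acceptable average can coexist with a sizable share of out-of-spec results:

```python
import statistics

# Hypothetical turnaround times (minutes) for a batch of lab tests
turnaround_times = [42, 48, 51, 55, 58, 60, 62, 65, 70, 49]
SPEC_LIMIT = 60  # customer expectation: results within 60 minutes

mean_time = statistics.mean(turnaround_times)
std_dev = statistics.stdev(turnaround_times)
pct_compliant = 100 * sum(t <= SPEC_LIMIT for t in turnaround_times) / len(turnaround_times)

# Report both numbers, as McClintic recommends
print(f"Average turnaround: {mean_time:.0f} min")          # 56 min: looks fine
print(f"Standard deviation: {std_dev:.1f} min")
print(f"Within {SPEC_LIMIT}-min spec: {pct_compliant:.0f}%")  # only 70% compliant
```

Here the average (56 minutes) beats the 60-minute target, yet nearly a third of tests miss it; reporting only the mean would hide that gap.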
- Eliminate redundant data collection.
By reducing unnecessary data collection, you can redirect those resources to actual data analysis, says Homa-Lowry.
"This is a constant challenge for our organization, as in most, I’m sure," says Pam Hurley, RN, BSN, CPHQ, director of performance improvement and disease management at NorthEast Medical Center in Concord, NC.
The organization pulled together all the various individuals and departments involved in outcomes reporting and called the group the "Outcomes Information Forum." "We spent months learning from each other — what our specific roles were, what data reports are produced, who gets them, and what the source of information was," she says.
As a result, many opportunities were identified to streamline data collection by appointing key resources for certain reports. "We also recognized the need to pull the information services decision-support staff into the performance improvement department, as the analysts in both departments were producing clinical quality outcomes," adds Hurley.
Members of senior administration, such as the vice president of information systems and the vice president of clinical effectiveness, were able to come together to support a common vision for data collection and reporting, she notes. "We still struggle with information overload, but we are constantly striving to better align resources."
The majority of clinical quality data now are filtered through the performance improvement department, with outcomes communicated through all levels of the organization, Hurley points out. "When we identify duplication of data collection or reporting, we try to isolate that process and refine it to avoid ongoing duplication," she says.
At Covenant HealthCare, staff send data to multiple agencies, including the Centers for Medicare & Medicaid Services, HealthGrades, The Leapfrog Group, and Solucient, Law explains. "It is amazing, the amount of data being sent to these various sources, and all have different criteria," she adds.
Health information management, risk management, finance, data analysis, pharmacy, and outcomes are just some of the departments pulling data, Law notes. "The problem is, one person may be doing a similar study as another," she says. "We have made several attempts over the years to come up with a creative solution, but it still reverts back to the same process."
One possible solution is to list all current studies on the organization’s intranet, Law adds. "Then, if someone wanted a study done, they could look to see if the information was already being collected," she says.
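The intranet registry Law describes is not specified beyond the idea of "look before you collect," but a minimal sketch (with hypothetical study names and owners) might work like this:

```python
# Hypothetical sketch of an intranet "study registry": before launching
# a new data collection effort, check whether an existing study already
# covers the topic and, if so, which department owns it.

studies = {
    "expired medications": "Pharmacy",
    "turnaround time, lab results": "Laboratory",
    "patient falls": "Nursing / Quality",
}

def find_existing_study(topic):
    """Return the owning department if the topic is already studied, else None."""
    return studies.get(topic.strip().lower())

owner = find_existing_study("Expired Medications")
if owner:
    print(f"Already collected by {owner}; request the data instead.")
else:
    print("No existing study; register a new one before collecting.")
```

Even this trivial lookup captures the intent: duplication is avoided not by policing, but by making existing collection efforts visible at the moment someone plans a new one.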
The problem is that multiple departments may be collecting the same data, such as pharmacy and a nursing unit both collecting data on expired medications, Homa-Lowry notes.
"A department may not have confidence in the data collection activities of another department, so they will collect it or may not know the other unit is collecting it — or may not know that they can get the data out of the information system," she says.
There needs to be oversight to make sure that this overlap is not occurring, Homa-Lowry adds. She suggests forming a "data oversight team" to look at data collection activities and doing random audits to periodically review the reliability of data. "Somebody in the organization should develop and maintain a data repository, updated on a regular basis," she advises.
For example, if you have implemented a new computer system that will give you certain data, you need to stop collecting the data manually. "People will often continue to do this until somebody tells them not to," Homa-Lowry says. "You can then put these resources toward analyzing data to develop a more comprehensive plan of action to address problems or deficiencies."
- Define data fields.
Data fields may be defined differently by various departments, so a clear definition of all fields should be included in reports, says Law, giving the example of patient days.
"Our clinical resource department includes observation patients in that number and finance does not," she says. "If someone does not know that, they could have confusing data."
It is key to clearly define the metrics under investigation, since a clear definition allows for proper analysis, McClintic notes.
- Avoid manual data collection if possible.
There is less room for error when using data that are available electronically, she says. When retrieving electronic data, it also is helpful to have integrated software systems that allow data sets to be merged, McClintic says.
- Make sure data are accurate.
At NorthEast, the performance improvement decision support team has outlined in detail many of the data collection and reporting processes, to ensure validity and reliability of the information, Hurley points out.
When collected data must meet public reporting or core measure requirements, staff are educated up front on the exact criteria for data submission.
"We also deal with vendors that have established their own process for data reliability and spend a good bit of time identifying exactly how to meet our customers’ expectations of meaningful data," she continues. "Internally, we use software that allows us to monitor our data for statistical significance over time, which might provide a red flag if unusual trends are noted."
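The internal software Hurley mentions is not named, but one common way to raise such a "red flag" is a control-chart rule: flag any new value outside the baseline mean plus or minus three standard deviations. A minimal sketch, with hypothetical monthly counts:

```python
import statistics

def flag_outliers(baseline, new_values, n_sigma=3):
    """Flag values outside baseline mean +/- n_sigma standard deviations."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    lower, upper = mean - n_sigma * sd, mean + n_sigma * sd
    return [v for v in new_values if not (lower <= v <= upper)]

# Hypothetical monthly event counts: a stable baseline, then a spike
baseline = [4, 5, 3, 6, 4, 5, 4, 5]
recent = [5, 4, 12]
print(flag_outliers(baseline, recent))  # only the spike (12) is flagged
```

A flagged point is a prompt to investigate, not a verdict: as Hurley notes below, an "unusual trend" sometimes reflects how the data were collected rather than a real change in care.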
In the world of quality, your customers include physicians, clinical leaders, senior leadership, and the board of directors, says Hurley. "It has been our experience over the years that it is the first natural response to outcomes reported that the data must not be accurate, the source or collection methodology questionable, or the target an impossible stretch — especially if results show opportunity for improvement at the provider or unit level," she adds.
When problems do arise with misleading or inaccurate data, the best approach is honesty, says Hurley. "We attempt to tackle this directly and make efforts to correct the information as quickly as possible," she says. "We have realized over the years that sometimes the perception of misleading or inaccurate data is more related to the user’s understanding of how the data are collected and reported, so much ongoing education is required."
As a quality leader, you must be knowledgeable about external criteria for data collection and reliability of internal methodologies, so you can explain what the results mean. "We have seen an ongoing need to have continuous education and reeducation regarding principles of outcomes management, and we provide this at all levels of the organization," says Hurley.
To address this, NorthEast provides education on reading graphs and interpreting data, and quality leaders meet with physicians and others to hear direct feedback and answer questions, building trust in the reported outcomes.
- Do a data inventory.
First, make sure you have a good handle on all the data being collected throughout your organization, Homa-Lowry notes.
"Then somebody needs to make a decision about what doesn’t need to be collected," she adds. "A lot of times, people just need to hear they don’t need to do this anymore."
Homa-Lowry recommends beginning with data required for JCAHO, Medicare Conditions of Participation, and any contractual obligations, such as the quality incentives many payers now require.
"There is a difference between needing to know and wanting to know. If you make sure you are in compliance with regulatory and contractual requirements, that is probably giving you plenty of data and should be addressed first," notes Homa-Lowry. "You probably will have enough right there to measure. And if those things are being consistently collected and analyzed, you will probably have some good outcomes."
[For more information, contact:
- Judy Homa-Lowry, RN, MS, CPHQ, President, Homa-Lowry Healthcare Consulting, 560 W. Sutton Road, Metamora, MI 48455. Phone: (810) 245-1535. Fax: (810) 245-1545. E-mail: [email protected].
- Ann D. Law, RN, Outcomes Specialist, Covenant HealthCare, 1447 N. Harrison St., Saginaw, MI 48602. Phone: (989) 583-4060. Fax: (989) 583-4822. E-mail: [email protected].
- Sheri McClintic, Six Sigma Black Belt, Covenant HealthCare, 1447 N. Harrison St., Saginaw, MI 48602. Phone: (989) 583-4553. E-mail: [email protected].
- Pamela Spach Hurley, RN, BSN, CPHQ, Director, Performance Improvement & Disease Management, NorthEast Medical Center, 920 Church St. N., Concord, NC 28025. Phone: (704) 783-4009. Fax: (704) 783-2080. E-mail: [email protected].]