Comparing apples to apples yields healthy crop of QI action plans
OASIS outcomes reports direct agencies’ assessment efforts
The elusive apples-to-apples comparison of home health outcomes for agencies participating in HCFA’s Outcome-Based Quality Improvement (OBQI) project is finally nearing reality. In January, agencies received the first outcomes report associated with the project. Two more reports will come later.
It was a time for introspection, self-evaluation, and celebration. And then it was time to get to work.
The report presented data that agencies could use to identify how they compared with the other agencies in the project, thereby showing both their strengths and weaknesses.
"They have responded to these reports with their own internal quality improvement activity, and we’re at a point where we’re beginning to summarize some of the information we’ve gained from them," says Kathryn S. Crisler, MS, RN, senior research associate for the Center for Health Services and Policy Research in Denver, which runs the OBQI project. "Agencies have written plans of action. They are implementing those plans of action, and we have interviewed those agencies about how their CQI [continuous quality improvement] activities have gone and what they’ve learned."
But you’ll have to wait until the center compiles the information in a report in late summer to learn from the agencies’ experiences. "It will be very valuable information for agencies because everyone measured different outcomes," Crisler says.
However, to give you a little preview, Homecare Quality Management interviewed staff at St. Mary’s Home Care Services in Grand Junction, CO, to find out how its recent outcomes report has directed its quality improvement actions.
When the agency was made part of the demonstration project in August 1995, some people wondered whether the increased amount of information could actually help improve patient care. After the initial data were returned to the agency in early February, the answer was a resounding "yes," according to Tracie White, RN, quality improvement coordinator at St. Mary’s.
Since looking at the report’s bar graphs, White and her staff at St. Mary’s have been able to improve patient education and assessment by coming up with new flowcharts and teaching forms that help the nursing staff deliver the best possible care to approximately 1,000 clients annually.
When the reports first arrived, White quickly gathered interested staff to look at the reports and see how the agency fared against others. She was only able to get nursing and physical therapy staff involved, however. She would have liked to include home health aides, and she says with hindsight that having some of the hospital’s continuous quality improvement staff on board would have been a good idea.
Once she had enough people willing to participate, White and a group of supervisors had to decide which two or three outcomes the agency would study. Every participant was supposed to look at hospitalization, so White formed an Unplanned Hospitalization Investigation Team (UHIT).
For the additional study, each agency was to identify an area where it needed to improve and develop a plan of action.
White noted that the agency fared poorly compared to other agencies in pain improvement and stabilization. A second team, Pain Management Stabilization (PMS), was formed to combat this problem.
Chart review discloses problems
The teams started by looking for outliers in the two identified areas through chart reviews. "We knew we had three weeks to turn this information around," White says, referring to the requirement for action to be taken on the outcomes reports by March 1. The teams met daily in order to beat that deadline.
For the UHIT group, chart reviews showed glaring problems in documenting education, a lack of initial nutritional assessments, and a lack of assessment for securing emergent aid. "There was just no question about the latter on the assessment form," notes White.
UHIT developed new guidelines that required a complete head-to-toe assessment on each patient visit. A nutritional assessment screen was included in the start-of-care assessment, as was a question on how to secure emergent aid.
The agency then developed a new teaching pathway that was based on Joint Commission requirements. (See copy of general teaching pathway, p. 87.) Nurses use the new form throughout the course of treatment. "There is some teaching done every visit," White explains, "but we don’t tell them when to talk about specific things." For example, the priority for teaching a patient who is taking a variety of medications would be to talk about those medications, their side effects, and danger signs. "We let the nurse prioritize the teaching according to the patient." Then, as each item is touched upon, it is checked off. At a glance, a caregiver knows what has been covered and what remains. "This is all a permanent part of our documentation," says White.
The PMS team started by reviewing the objective criteria for pain. "We found that the assessments between nurses were not always the same," she says. "We had unsatisfactory discharge assessments compared with the start-of-care pain assessments, when the notes indicate there has been an improvement."
White says there was a pain description box on the nursing notes, but it didn’t allow for a severity rating of 1 to 10 to be included. "We clarified that," she says. The start-of-care form was also changed so it would better describe the pain by noting its severity, location, acuity, and what exacerbated it. "We gave the staff prompts for more detail on each of those items."
The agency also added a teaching pathway for pain management that provided more specific information on pharmacological and non-pharmacological control of pain. (See copy of pain teaching pathway, p. 88.)
Time to assess again draws near
The changes were implemented in the spring, and White says she is still trying to determine how successful they have been. "We are doing inservices and asking for input," she says. "We want to see how people will actually use this."
Initially, there were large meetings that explained OBQI, the project, and why it was undertaken. Then, each team (orthopedics, rehabilitation, oncology, and cardiology) was given follow-up seminars. Initial staff reaction has been good. "The new pathway gives at a glance what has been taught," says White. There are also plans to develop more specific teaching pathways, such as for home IV therapy, diabetic patients, and catheter care. "It is all branching out into the agency."
White says St. Mary’s has had a positive experience with using the Outcome and Assessment Information Set (OASIS). "The amount of information we get out of it is great. It’s great because it makes sure you are comparing apples to apples, not apples to oranges."
All patients receive nutritional assessments
The project to date is so promising that she is eager to see additional outcomes reports. "An interim report is due in June or July, which will be really helpful to us in seeing whether our changes have effected any positive change," White says. There are some indications that the reforms have been beneficial. "We don’t have a single patient any more who hasn’t had a nutritional assessment. Our internal audit process of reviewing charts of new opens and first visits catches these people immediately."
Patient education is also coming around, and there is now one easily read form that shows caregivers what has been taught and what is still needed. "We have better charting procedures," says White. She hopes that when the next patient satisfaction survey is done, there will be further corroboration of her impression that patient care has improved.
The next outcomes report will not only include St. Mary’s data from this year and the national average, but will also have a third section that shows the previous year’s data. "We will have a way to compare ourselves to ourselves. That’s how we’ll know if we have been successful in the areas we chose to address." Until then, White says she has to content herself with anecdotal information and positive remarks from staff and patients.
Susan Kerschen, MS, director of home care for St. Mary’s, says agencies have to constantly think about quality and satisfaction, and outcomes reporting facilitates that. "You have to realize that everything is part of a work flow. Patient care and outcomes studies are part of the same thing," she says.
Kerschen hopes big savings will accrue from their work. "There is a dollar cost to this, in terms of training staff to collect the data and getting up to speed. But if we can prevent one hospital admission, then it all balances out. I hope when we get our next report we find that hospitalizations have declined, patient satisfaction is up, and the length of stay in home care has diminished. That would prove that the time and money we spent on this was well worth it."