Develop action plan for using OASIS data
Here’s how to make OASIS part of your QI efforts
Using data from the Outcome and Assessment Information Set (OASIS) may move your quality improvement efforts up a notch. But the million-dollar question is: Do you have an action plan for using the data?
A written and detailed action plan will give you an outline of each quality improvement step you’ll need to take for a particular indicator. It’s also a good way to make sure your QI process stays on track and doesn’t end when the first desired outcomes are achieved (see related story, p. 6).
The Center for Health Services and Policy Research (CHSPR) at the University of Colorado in Denver developed OASIS and the Outcome-Based Quality Improvement (OBQI) process with funding from the Baltimore-based Health Care Financing Administration (HCFA) and the Robert Wood Johnson Foundation in Princeton, NJ. CHSPR suggests an action plan incorporating these six elements:
• target outcomes;
• target care behavior;
• statement of "best practices;"
• interventions to implement or reinforce "best practices;"
• responsible persons, time frame;
• monitoring and follow-up activities.
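The six elements above could be captured in a simple record so an agency's written plan stays consistent from indicator to indicator. This is only an illustrative sketch; the field names are invented here, not part of the CHSPR materials.

```python
from dataclasses import dataclass, field

# Hypothetical structure mirroring the six CHSPR action-plan elements.
# Field names are illustrative, not taken from the OBQI materials.
@dataclass
class ActionPlan:
    target_outcome: str                  # e.g., improvement in urinary incontinence
    target_care_behavior: str            # the clinical behavior under review
    best_practices: list = field(default_factory=list)      # keep to 3-4 items
    interventions: list = field(default_factory=list)       # implement/reinforce
    responsible_person: str = ""         # who reviews the plan
    review_dates: list = field(default_factory=list)        # time frame
    monitoring_activities: list = field(default_factory=list)

plan = ActionPlan(
    target_outcome="improvement in urinary incontinence",
    target_care_behavior="patient teaching on bladder training",
    responsible_person="QI manager",
)
plan.best_practices.append("assess incontinence at start of care")
print(len(plan.best_practices))  # 1
```

Keeping the plan in one structure like this makes it easy to check, for example, that no more than three or four best practices have been listed.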
The action plan serves as an agency’s guideline, says Karin Conway, MBA, RN, senior researcher at CHSPR. Conway and other CHSPR colleagues, including director Peter W. Shaughnessy, PhD, and associate Kathryn Crisler, MS, RN, spoke about using OASIS data to measure outcomes and improve quality at the Washington, DC-based National Association for Home Care (NAHC) conference held recently in Atlanta.
Sharon Johnson, MSN, RNC, CNA, director of clinical practice and outcomes management at Jefferson Home Care Network in Ardmore, PA, also spoke about action plans at the NAHC conference.
Conway says the action plan needs to be put in writing so everyone can understand exactly what it says. "In a way, it is kind of a guidepost so that everybody can use the same information," Conway told home care directors and quality managers at NAHC.
Conway and Johnson suggest agencies use these guidelines to develop a plan of action:
1. Target outcomes.
Jefferson Home Care Network is one of three home care agencies of the Jefferson Home Health/Main Line Hospitals network. The home care agency was one of the original 50 agencies involved in the OASIS demonstration project and has been collecting OASIS data for three years.
"We have gone through two cycles of process of care investigations," Johnson says, adding the outcomes are in the areas of hospitalization, stabilization of dyspnea in cardiac patients, and improvement in urinary incontinence.
The first step in targeting outcomes was to form a multidisciplinary team, Johnson says. The team represented a continuum of care, including discharge planners from the hospital.
"We did some brainstorming and asked some pointed questions, such as, ‘What kinds of patients do you see readmitted to the hospital from home care, and why?’ and ‘What are some services we need to provide in home care to keep people out of the hospital?’" Johnson explains.
Conway suggests quality managers focus on the patient, patient care, and the target outcome.
"This may seem obvious, but we found last year in the national demonstration project that those agencies that did not improve in their target outcome for the previous 12 months weren’t exactly clear that it was patients they were focusing on," she says.
2. Target care behavior.
Agencies should avoid turning a focus on staff behavior into a witch hunt of who did what, because this won’t get everyone on the same team, Conway says. "What you want to be able to do is look at a neutral subject, such as patients, and say the patients are not being taught, or the patients are not being assessed by such and such. Give that kind of detail."
Home care agencies have some control over what and how their employees are teaching patients, so this makes a good target area, she adds.
Johnson created a fishbone-shaped cause and effect diagram, with the head serving as the outcome and the fins as the issues. For example, when the agency looked at hospitalization, she collected about 45 ideas from the team brainstorming session. She narrowed those down to four categories: people issues, information issues, procedure issues, and environment issues.
"From there, we could actually list common causes and begin to identify some of the actions to correct," Johnson adds.
3. Write statement of best practices.
Jefferson Home Care Network’s best practices were developed from the cause and effect diagram, Johnson says. The diagram also was useful in preventing the multi-disciplinary team from jumping to the wrong conclusions about various outcomes.
For example, when the team looked at the dyspnea outcome, most of the nurses were positive that the outcome was less than desirable because the therapists did not know how to assess dyspnea at discharge. Johnson did not put this on the best practices statement because the cause and effect diagram proved this was not the case.
"We were able to prove to them that this was not the problem," she says. "You have to really be careful not to jump on those quick solutions and problem-solve before you really know what the problem is."
The best practices should include precise clinical activities, Conway says. She suggests quality managers take the following steps:
• identify the patient population;
• identify actual clinical processes, including assessment, evaluation, and teaching;
• make sure you have the clinical discipline that is supposed to be involved in that clinical intervention or process;
• determine how often it’s going to happen, when it is going to happen, and what to do about it;
• know who is going to do the best practices and what patient population is going to be focused on;
• make sure you have only three or four best practices.
"Any more than that really gets in the way," Conway says. "People start seeing a massive list of things that have to be improved, and they start feeling that this is not attainable."
4. Implement interventions or reinforce best practices.
Jefferson Home Care Network uses a focused record review process to assess how the agency is doing on its hospitalization outcome goals. "We looked at patients’ rehospitalization within 14 or 30 days of the start of care," Johnson says.
Initially, the agency looked at 120 charts. Now each quarter, the agency reviews 25-30 charts, looking at these specific areas:
• Were the patients in the right specialty program?
• Were the disciplines in there, if needed?
• Were the clinical nurse specialists used if they were needed on a case?
• How many nurses were in to see that patient?
• How many visits did the patient have?
• Was the visit pattern changed appropriately for the patient’s needs?
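The quarterly review described above hinges on one calculation: whether a patient was rehospitalized within 14 or 30 days of the start of care. A minimal sketch of that tally, assuming each chart record carries a start-of-care date and an optional readmission date (field names are invented here for illustration):

```python
from datetime import date

# Illustrative sketch: count rehospitalizations within 14 and 30 days of
# start of care across a batch of chart records. Field names are assumed.
def rehospitalization_counts(charts):
    counts = {"within_14": 0, "within_30": 0}
    for chart in charts:
        soc = chart["start_of_care"]
        readmit = chart.get("readmission_date")
        if readmit is None:
            continue  # patient was not rehospitalized
        days = (readmit - soc).days
        if 0 <= days <= 14:
            counts["within_14"] += 1
        if 0 <= days <= 30:
            counts["within_30"] += 1
    return counts

charts = [
    {"start_of_care": date(1999, 1, 4), "readmission_date": date(1999, 1, 12)},
    {"start_of_care": date(1999, 1, 6), "readmission_date": date(1999, 2, 1)},
    {"start_of_care": date(1999, 1, 8), "readmission_date": None},
]
print(rehospitalization_counts(charts))  # {'within_14': 1, 'within_30': 2}
```

A 14-day readmission also counts toward the 30-day total, which matches how such windows are usually nested in outcome reports.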
"We pulled out ideas from brainstorming, the cause and effect diagram, and collected data on that," Johnson says.
One area the team targeted involved nurses reporting that congestive heart failure (CHF) patients were not going to the cardiopulmonary team, but instead were going to the medical-surgical team. This led to the team writing a criterion: Was the patient referred to the appropriate specialty team?
"Basically, our intake department wasn’t following referral guidelines for specialty teams," Johnson says.
One guideline is that any patient with two episodes of CHF within six months will be placed on the cardiopulmonary team.
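That referral guideline is simple enough to express as a rule. The sketch below is only illustrative (the guideline as stated doesn't define "six months" precisely, so 182 days is assumed here):

```python
from datetime import date, timedelta

# Sketch of the referral guideline described above: two CHF episodes within
# six months route the patient to the cardiopulmonary team. The six-month
# window is approximated as 182 days; names are illustrative.
def specialty_team(chf_episode_dates):
    dates = sorted(chf_episode_dates)
    for earlier, later in zip(dates, dates[1:]):
        if later - earlier <= timedelta(days=182):
            return "cardiopulmonary"
    return "medical-surgical"

print(specialty_team([date(1998, 3, 1), date(1998, 7, 15)]))  # cardiopulmonary
print(specialty_team([date(1998, 1, 1)]))                     # medical-surgical
```

Encoding the rule this plainly is one way an intake department could check referrals against the guideline instead of relying on memory.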
The other problem relating to repeated hospitalizations was that the agency was under-utilizing occupational therapists. "We did education for the staff on the role of the occupational therapist," Johnson says.
One of the agency’s biggest interventions during the first year was to implement weekly interdisciplinary rounds focused on criteria every team would discuss. Each team talks about any patient who has been rehospitalized, asking these questions:
• Why were they rehospitalized?
• Was there something we could have done differently?
• Could we have prevented that hospitalization?
• Was it truly a new diagnosis?
• Was it truly something that was out of our control?
• Or, could we as clinicians have prevented that hospitalization?
"We also talk about every new admission that has complex needs," Johnson says. "Any patient who has a home situation that puts them at risk, any patient who has two or more disciplines in the home is discussed at start of care to be sure that we are targeting them and moving them in the right way through their episode of care."
Other interventions included additional staff and patient education. The agency created a patient education sheet that tells patients when to call their home care nurse and agency.
"We make it clear to patients that there are times when the nurse can help them, rather than to have them immediately go to the emergency room," Johnson adds.
Conway suggests quality managers make sure their staffs know about any changes. From the beginning of the process of care investigation, the agency should tell staff where the agency currently stands and where it is progressing, and ask the staff what they think about it.
"Make sure that the people who are actually doing the work are invested from the beginning; the only way you can actually get people invested is to get them involved," Conway explains.
5. Select responsible persons and a time frame.
Identify who is responsible for reviewing the plan of action and set the dates on which that person will review it. Also, state on the action plan when the next outcome report will be available, Conway says.
"So many times people want to know, ‘Okay, we are progressing; we have this monitoring in place; the interventions are going very well, but how do we know the effect on the actual patients?’" she adds. "The best way to do that is the outcome report; you want to make sure that you have the date of when to expect that."
6. Monitor and provide follow-up to all changes.
Monitoring is very important to make sure that the plan of action implementation occurs as planned, Conway says. "You can have a perfect plan; unless you put it into action, it is worthless," she adds.
In fact, she says, monitoring was identified by many of the national demonstration agencies as one of the chief reasons they were able to improve outcomes.
Jefferson Home Care Network places a strong emphasis on continual monitoring through its chart reviews.
"We developed our own monitoring tool that was used during this initial investigation," Johnson says.
The agency also monitors its progress through quarterly record reviews, spot checks, and interdisciplinary rounds.
Here are some examples of Jefferson Home Care Network’s monitoring activities:
• During the agency’s 1998 work on pain outcomes, pain management guidelines were developed into a patient education tool that would go into every patient’s chart.
"We found when we did our investigation that many of our patients have pain, even if that is not their primary problem and the primary reason they are on home care," Johnson says.
If the pain guidelines do not apply to a particular patient, the clinician can use the guidelines for another case. But if they’re needed, they are readily available.
• The agency ensures coordination of care and the interventions related to those best practices by having the staff attend weekly interdisciplinary rounds.
"We also developed a documentation sheet for those rounds," Johnson says. "A copy is left in the team’s minutes, so we can also take a look at those just to be sure that the content of the discussion at those rounds is focused in the area that we want it to be focused."
• The agency revises items on the action plan according to what is discovered through the monitoring process. One problem in 1997 was that the agency was behind in the implementation of a dyspnea patient education packet.
"It was very easy to address that by simply re-prioritizing our staff development specialist’s project list," Johnson says. "I moved this to the top of the list, and it happened a lot quicker."
• Karin Conway, MBA, RN, senior researcher; Kathryn Crisler, MS, RN, associate; Peter W. Shaughnessy, PhD, director and professor, Center for Health Services and Policy Research, University of Colorado, 1355 S. Colorado Blvd., Suite 306, Denver, CO 80222. Telephone: (303) 756-8350.
• Sharon Johnson, MSN, RNC, CNA, director of clinical practice and outcomes management, Jefferson Home Care Network, c/o Bryn Mawr Hospital, 130 S. Bryn Mawr Ave., Bryn Mawr, PA 19010. Telephone: (610) 526-3837.