Joint Commission looking for outcomes measurement

How do you know education is effective?

Patient education managers should make outcomes measurement a high priority, says Barbara Moore, MPA, CPHQ, an instructor at the Amarillo (TX) Veterans Affairs Health Care System. These measurements are important when seeking funding or resources for a new program, as well as renewed funding for an existing one. Administrators want to know if the programs are worth funding and if they are effective, she explains.

The Oakbrook Terrace, IL-based Joint Commission on Accreditation of Healthcare Organizations wants to see outcome measurements as well. "When the Joint Commission came here in the spring, their No. 1 patient education question was, how do you know your patient education is effective?" says Moore.

To gather outcome data, managers first must determine what outcome they want to achieve. For example, if many patients who are seen by physicians at the health care facility have high cholesterol, a class might be implemented to teach them how to lower their cholesterol. In that case, the goal of the class, or desired outcome, would be to lower patients’ cholesterol.

Don’t make measures too global

When selecting an outcome measure, select something clearly defined that either happens or doesn’t, says Moore. "If you are too global, then it might be difficult to collect the data. So you need to be very clear and precise. You also need to think about outcomes that will happen on a regular basis," she says. If outcomes are infrequent, it takes too long to accumulate enough numbers to analyze the data.

There are many examples of clear and precise outcomes. For example, if a program to teach new mothers to breast-feed their babies has been implemented, a patient education manager might measure how many women can successfully nurse their newborns at discharge. Or perhaps people coming in for surgery are not prepared. The effectiveness of a pre-surgery education program could be measured by looking at the increased number of patients who are prepared for surgery as a result of the teaching.

Ideally, a baseline measure should be taken at the beginning of the program so there are numbers to measure the outcome data against. "One of the mistakes that a lot of people make is they leap right into a program, and they haven’t taken much of a measurement prior to the implementation of that program," says Moore.

It also is important to define the population that the expected outcome will come from, she says. For example, if the effectiveness of a class that teaches patients to lower cholesterol is being measured, then everyone who attends the class is the population.

Determining how to collect the data is important as well. The data sometimes can be collected from charts. To determine if an outpatient class is effective in teaching patients to reduce cholesterol, the follow-up cholesterol figures would have to be collected from the physicians.

The final step once the numbers are collected is to analyze the data, says Moore. To make this process simpler, don’t just count the number of patients who reduced their cholesterol following the class; instead, calculate a monthly rate of patients who lowered their cholesterol, such as 2% or 10%, she advises. In this way, percentages can be compared from month to month.
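The rate calculation Moore describes amounts to dividing the number of patients who improved by the number who attended each month. A minimal sketch, using hypothetical counts invented for illustration:

```python
# Hypothetical monthly counts for a cholesterol-education class.
# Both the attendee totals and the improvement counts are illustrative.
monthly = {
    "Jan": {"attended": 50, "lowered": 10},
    "Feb": {"attended": 40, "lowered": 12},
    "Mar": {"attended": 45, "lowered": 18},
}

# Express each month as a percentage rate rather than a raw count,
# so months with different class sizes can be compared fairly.
rates = {
    month: 100 * counts["lowered"] / counts["attended"]
    for month, counts in monthly.items()
}

for month, rate in rates.items():
    print(f"{month}: {rate:.0f}% of attendees lowered their cholesterol")
```

Note that the raw counts alone would be misleading here: February's count (12) is higher than January's (10), but only because of the smaller class; the rates make the trend visible.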

"Once you have your measurement or rate, you have a lot of options on how to analyze the data," says Moore. One way is to plot the data along a line from month to month to see if the program is achieving the desired outcome.

"It is really important to know where you began before the intervention started, and in that way you can support your case that your intervention was directly related to the outcome," she says.

Comparisons help to show that the educational intervention is effective. The most basic comparison is with the baseline measurement, says Moore. However, comparisons also can be made between units within the facility, or with other institutions that treat similar types of patients. "You need to make sure that the way you count patients and the way you are figuring those rates are similar enough to justify what you are doing," she cautions.
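Moore's baseline comparison boils down to subtracting the pre-intervention rate from each post-intervention rate and watching the difference over time. A short sketch, with all figures hypothetical:

```python
# Hypothetical rates (percent of patients who lowered their cholesterol);
# the baseline was measured before the class was offered.
baseline_rate = 15.0
monthly_rates = [20.0, 30.0, 40.0]  # months after the class began

# Each month's gain over the pre-intervention rate.
improvements = [rate - baseline_rate for rate in monthly_rates]
print(improvements)
```

A steadily positive and growing difference is the kind of evidence Moore suggests for arguing that the intervention, rather than chance, is driving the outcome.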

Be sure to present clear message

When presenting findings to top management, give them the whole picture but make sure that the tool used, whether a chart or a graph, has a clear message, advises Moore. For example: "60% of patients who attended the class lowered their cholesterol."

The process of gathering data to support patient education programs isn’t too complicated. "Think carefully about the outcome measure, take a baseline, find a comparison, and watch it over a period of time. Don’t try to draw conclusions too quickly," Moore advises.