Study: Fewer heart deaths when QI efforts are made

Patients less likely to need readmission

Heart failure patients are less likely to die after they go home from the hospital if the hospital has participated in an organized quality improvement program, compared with patients treated at hospitals where such efforts aren’t undertaken, a new study from the University of Michigan Health System finds. They’re also less likely to need another hospital stay.

Those are among the key findings from a two-year study involving more than 2,500 heart failure patients treated at 14 community hospitals in and around Flint, MI.

The findings were presented at the recent Scientific Sessions meeting of the American Heart Association.

Significantly lower death rates in the month after hospitalization were seen among patients treated at eight hospitals that cooperated to find ways to deliver proven care and educate patients about their treatment, compared with six hospitals that didn’t take part in the cooperative effort.

Rehospitalization rates dropped by 22% when doctors and nurses used a "toolkit" of heart failure-specific standard admission orders, inpatient clinical pathways, and discharge checklists to make sure that their patients didn’t miss out on treatments or counseling.

The study was sponsored by the American College of Cardiology as part of its Guidelines Applied in Practice, or GAP project, which seeks to ensure that all hospitalized heart patients receive proven treatments, counseling for lifestyle changes, and education that can help them care for themselves after they go home.

In all, 30-day rehospitalization rates for patients treated at the participating hospitals fell from 26.1% at the start of the project to 21.7% by the end, compared with a slight increase among patients treated at the nonparticipating hospitals. The 30-day mortality rates fell from 9.4% at the beginning to 7% at the end at participating hospitals, compared with a jump from 8.5% to 10.7% in nonparticipating hospitals.

One of the keys to the success of participating hospitals was their intimate involvement in the design of the QI toolkit, says Todd Koelling, MD, a heart failure expert at the University of Michigan Cardiovascular Center in Ann Arbor.

"The keys to success were, one, that this study learned a lot of lessons from the initial GAP study," he notes. "They found they got the best responses from participation if the hospitals were able to design the instruments themselves. If I had designed it myself, the reactions from the hospitals would have been less than enthusiastic. Instead, we held meetings where the project leaders and physician champions broke into groups and designed it themselves."

Even after they came out with a common template, he notes, each hospital was allowed to change the documents to meet its own specific needs. "They felt it was theirs," Koelling comments.

In general, the tools were based on ACC/AHA guidelines, which recommend treatments based on medical evidence from research studies. Some of the tools and tactics were patterned after those already in use for heart failure treatment at the University of Michigan Health System.

The other key, says Koelling, was the strength of the Greater Flint Health Coalition. "They really helped organize the entire study and kept it on track," he says.

Physician participation also was critical. "The key here was that we not only involved cardiologists, but early on we tried to include as many different types of providers as possible — hospitalists, family practitioners and ER physicians, as well as nursing leadership," Koelling explains. "By doing that, no group was put in position of perhaps being defensive about their own particular type of patient population if a different physician group tried to change their treatment behaviors."

When the tools were developed, their impact was not immediately measured, he notes. "We used this iterative 'plan, do, check, act' method," he notes. "We put the tools in place, and tried to measure the effect. But not all hospitals were using them." Then, each hospital began sharing information in group meetings — what they did, how they handled reluctant physician groups, and so forth.

"By doing that with these successive meetings, the adoption of these tools really increased significantly," says Koelling. "That’s why we used a 15-month period (beginning in 2003) between baseline and remeasurement; we used the extra meetings to get the use of those tools increased."

The participation of quality managers in these efforts was invaluable, he adds. "Almost all of the project leaders were quality managers of some sort. Also, one of the collaborators in the study was the Michigan peer review organization, and each project leader was in many cases the same person who would be entering data into the Q-net for their hospitals. So they were quite familiar with measuring quality."

Koelling and his colleagues soon will make the toolkit elements available on the web at www.acc.org. Meanwhile, the heart attack toolkit and framework for implementing it are now available online at content.onlinejacc.org/content/vol46/10_Suppl_B/.

Even though the study has been completed, says Koelling, the participating hospitals are still using the tools. "These hospitals wanted to participate," he observes. "They wanted to improve. When you look at them at the baseline, they were really quite average, but at the end they were nowhere near average. They would almost be benchmark hospitals."