Autonomous teams keep CQI on track, effective
There is no one way of doing things
Continuous quality improvement is truly continuous at The Ohio State University Medical Center in Columbus. At any given time, dozens of QI efforts are ongoing throughout the facility, requiring Gail B. Marsh, MHA, RRA, administrator for quality and operations improvement, to function somewhat like a traffic cop.
"We are always looking for ways to improve. We have a shotgun approach to [continuous quality improvement]. Every department is required to do ongoing quality measurement, but the [rules] are not the same for everyone. There isn’t just one way of doing things," she emphasizes. "We look at how we can improve treatment and patient satisfaction and then we tailor [the CQI process] to individual areas."
It’s important to use the resources that are already available, Marsh says. For example, if there are automated systems that are collecting data already, use them. "Sometimes, quality and operations improvement [professionals] will approach an area on their own if they [benchmark against global data] and see something that needs work. Other times, [a department] will see that something isn’t working properly, and they’ll come to us. Then, if it’s something that can be handled by one or two people, they’ll work as a team to fix it. All the teams determine their own meeting schedules."
That was the approach that created code blue teams to provide faster treatment for cardiac arrest patients outside the intensive care setting and emergency department (ED) admission teams designed to cut patient wait times. How these quality teams work is typical of the CQI efforts at Ohio State.
The code blue team, made up of a physician leader, clinicians, a quality improvement facilitator, and a hospital administrator, was established in November 1992 to better evaluate the management of code blue episodes. The QI efforts encompassed all of Ohio State’s on-campus sites, including 42 clinics, four specialty hospitals, and the medical center’s ambulatory surgery sites.
The team devised a code blue evaluation form that enabled clinicians to document non-ICU code blue incidents and assigned a code nurse coordinator to help gather the data. The data collected included average response time, code duration, individuals assisting with the code, availability of equipment, patient disposition, and overall assessment of the code.
What followed was a series of significant improvements. From 1993 to 1996, average response time fell from about two minutes to less than one minute, while average code duration decreased to 25 minutes, down from the 31 minutes reported in 1993. Patient mortality decreased from 61% to 56%. (See evaluation of code blues, inserted in this issue.)
"There were definite improvements, but we didn’t stop there," Marsh explains. "The [team] had quarterly reviews for codes that they thought were less than satisfactory and worked on improving them more."
The code blue team also reviewed the equipment typically found on crash carts and made any necessary changes. "They looked to see if the equipment on the crash carts was appropriate. For example, we typically don’t treat children. But sometimes we will, and we have to be ready for that. We needed to ask ourselves, ‘Do we have the appropriate equipment necessary to handle these cases?’" Marsh adds.
Different problems, different team solutions
During this same time frame, ED personnel worked on a few shortcomings of their own. Patient satisfaction surveys indicated that admission wait times were less than stellar. The department conducted a time study that confirmed the findings: approximately 68.5% of patients waited more than one hour for a bed.
In 1992, the ED set out to reduce patient admission waits on two fronts: the time it took for patients to be seen by a physician and the time it took to place them in a bed. The department formed a team consisting of a physician leader, nurses from both the ED and inpatient areas, a quality and operations improvement facilitator, and an administrative representative. The team measured its progress in terms of average wait time, the causes of delays, and patient satisfaction.
According to 1992 data, the average wait time to find a room and notify the ED was 54 minutes. One year after the program’s implementation, the department slashed that time by 20 minutes. From 1992 to 1993, the average time it took for a patient to be placed in a bed fell from two hours and 18 minutes to one hour and 15 minutes. And on a scale of one to five, patient satisfaction for wait time jumped from 2.99 to 3.20. (See chart, inserted in this issue.)
"There have been major improvements in this area, but I think there is still more opportunity for improvement," says Marsh. "We can’t just stop trying when we reach a goal."
To help keep quality improvement efforts on the front line of patient care, Marsh regularly recruits clerical staff and medical students to gather much of the comparative data. The information is then handed over to the departments for analysis.
"Someone’s always involved in [quality improvement]. But we find that if people are working to collect and analyze their own data, they are more likely to pay attention to it and look for ways to improve," Marsh explains.
Regardless of any incremental improvements recognized by hospital personnel, the true test comes in terms of patient satisfaction. To effectively monitor improvements, the medical center queries its patients via an automated telephone survey shortly after discharge.
"We try to find out if they thought the environment was pleasant, if they felt the clinicians and physicians showed care or concern, if the nurses were responsive, or if their treatment options were clearly discussed," Marsh says.
"On a scale of one to 10, we were at 8.6 two years ago. Now we’re at 9.3. Scores are increasing, which tells us we’re doing something right. We have always tried to excel in clinical quality, but we haven’t always put the same emphasis on customer service," she adds. "But that’s all changing now. We have to look at what we’re doing through the patient’s eyes."
[For more information, contact: Gail B. Marsh, MHA, RRA, The Ohio State University Medical Center, Hospital Administration, 410 West 10th Ave., 168 Doan Hall, Columbus, OH 43210-1228. Telephone: (614) 293-4349. E-mail: [email protected].]