JCAHO award for stroke measures compliance

Measures are kept in front of leadership

At Seattle-based Swedish Medical Center, stroke outcomes improved dramatically as the result of a program that deploys a coordinated team to ensure comprehensive, timely, and efficient acute stroke care.

This data-driven focus on rapid evaluation of stroke patients has significantly increased the share of patients whose emergency department evaluation is completed within 45 minutes, from just 40% at baseline to consistently above 80%. Over the same period, the average door-to-thrombolytic time has dropped from 84 minutes to 79 minutes.

The organization was awarded the JCAHO’s Ernest Amory Codman Award, which recognizes excellence in the use of outcomes measurement by health care organizations to improve patient safety and quality.

"At the start of this initiative, we were like other hospitals nationally in that we struggled to actually implement the stroke guidelines that were available," says Tammy Cress, RN, MSN, stroke program manager.

For example, compliance with the standard order set rose from 55% to 98%, and the proportion of patients processed very rapidly for rescue therapies rose from 42% to more than 80%, Cress reports.

During site visits by the JCAHO for Primary Stroke Center Certification, surveyors were impressed that data was collected concurrently, as opposed to retrospectively.

"We have implemented a model wherein the staff providing the care collect and manage the data," says Cress. "This allows us to use data collection as a check and balance to guarantee that each patient gets every opportunity for optimal care management."

Data checklists

Staff use data checklists to remind them of quality goals for each patient, so there is an opportunity to correct or modify care as necessary. "We also have real-time data to share with the leaders and grassroots providers daily," says Cress. "We then celebrate a job well done but also learn from this data in various forums where we select and focus on quality improvement efforts."

Stroke nurses who assist in the coordination of stroke care are responsible for the data collection. "When we began the program in 2001, there weren't any standardized instruments available for data collection, so we developed a homegrown database," says Jennifer Harville, the organization's director of clinical effectiveness. The stroke nurse collected data on a scannable form, and the data were scanned into a database.

When the American Heart Association’s "Get With the Guidelines — Stroke" database was released in 2004, the stroke nurses were able to collect the data electronically using the online tool.

For analysis, the data were downloaded into an Access database, which allowed more flexible querying and reporting. Each month, cases are downloaded and compared against a list generated from administrative ICD-9-coded data to ensure full case capture.
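The monthly reconciliation step could look something like the minimal sketch below. The field names and the ICD-9 code prefixes are illustrative assumptions, not the center's actual schema; the idea is simply a set comparison between registry cases and administratively coded stroke discharges.

```python
# Hypothetical sketch of a monthly case-capture check: compare cases
# entered in the stroke database against an administrative list of
# discharges carrying stroke ICD-9 codes (prefixes here are illustrative).

STROKE_ICD9_PREFIXES = ("430", "431", "433", "434", "436")  # assumed code set

def find_missed_cases(registry_ids, admin_records):
    """Return admin record IDs coded as stroke but absent from the registry."""
    captured = set(registry_ids)
    return sorted(
        rec["id"] for rec in admin_records
        if rec["icd9"].startswith(STROKE_ICD9_PREFIXES) and rec["id"] not in captured
    )

registry = ["A101", "A103"]
admin = [
    {"id": "A101", "icd9": "434.91"},  # captured
    {"id": "A102", "icd9": "436"},     # missed -> flag for retrospective review
    {"id": "A103", "icd9": "431"},     # captured
    {"id": "A104", "icd9": "250.00"},  # not a stroke code, ignored
]
print(find_missed_cases(registry, admin))  # ['A102']
```

Cases flagged this way would then be reviewed retrospectively by the stroke team, as described below.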

"We would provide feedback to the stroke team on cases not captured in real time, and they would review these charts retrospectively," says Harville.

To analyze and report data, quality leaders worked early on with the stroke leadership team to identify the outcome measures they considered indicators of performance in delivering stroke care.

A balanced set of about 15 measures was selected based on the Dartmouth Clinical Compass, which measures clinical quality, cost/utilization, functional quality, and satisfaction/professional enablement. A "dashboard" with all of the selected measures is produced for the monthly leadership team meeting, with data from the previous month.

"This rapid turnaround of the data into the dashboard has been a critical factor in the success of a data-driven program," says Harville.

A list of "variances" — stroke cases that didn’t meet the requirement — also is reported, with comments on any circumstances or mitigating factors to help the team better understand why the variances were occurring and how the next PDSA (Plan, Do, Study, Act) cycle of improvement should be approached.
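The variance report described above could be generated with a sketch like the following. The 45-minute ED evaluation goal comes from the program itself; the field names, threshold variable, and comment convention are assumptions for illustration.

```python
# Minimal sketch of a variance report: list cases that missed a measure's
# target, carrying a free-text comment on mitigating circumstances so the
# team can judge why the variance occurred (field names are hypothetical).

ED_EVAL_TARGET_MIN = 45  # door-to-evaluation goal cited by the program

def variance_report(cases):
    """List cases whose ED evaluation exceeded the target, with comments."""
    return [
        {"case": c["id"],
         "minutes": c["ed_eval_min"],
         "comment": c.get("comment", "")}
        for c in cases
        if c["ed_eval_min"] > ED_EVAL_TARGET_MIN
    ]

cases = [
    {"id": "S-01", "ed_eval_min": 38},
    {"id": "S-02", "ed_eval_min": 62, "comment": "CT scanner down; transferred"},
]
print(variance_report(cases))
# [{'case': 'S-02', 'minutes': 62, 'comment': 'CT scanner down; transferred'}]
```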

"We also analyze the data in a variety of ways, based on the needs of the team, areas of focus, and data available," says Harville.

However, at times improvement is needed in areas not included in the dashboard, such as rapid evaluation of stroke patients in the emergency department.

"When we began to focus performance improvement activities in this area, we reported the data to the leadership team every two weeks, along with root cause analysis findings from failures to complete the evaluation within the desired time frame," says Harville.

Initially, data analysis for this area was done separately from the dashboard, but it has since been incorporated into the dashboard. It no longer requires the same intensive focus but still needs to be "on the radar screen," says Harville.

"This is how we sustain the gains — we keep it in front of the leadership team," says Harville. "If any of the measures begin to travel in the wrong direction, the team asks that we bring back a more detailed analysis, and we move into more active PDSA cycles of improvement."

For example, on one occasion when length of stay, a cost/utilization measure that had been steadily improving over time, appeared to be trending upward, the leadership team requested further analysis of the issue. "This analysis identified discharge coordination as the issue of focus for the next PDSA cycle for this particular measure," says Harville.
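The "wrong direction" trigger the team describes can be sketched as a simple comparison of recent monthly values against the preceding months. The window size and the length-of-stay figures below are assumptions for illustration, not the center's data.

```python
# Illustrative trigger for deeper analysis: flag a measure whose recent
# monthly values trend against its desired direction. Window size and the
# sample length-of-stay values are assumptions, not actual program data.

def trending_wrong(values, lower_is_better, window=3):
    """Compare the mean of the last `window` months to the prior `window` months."""
    if len(values) < 2 * window:
        return False  # not enough history to call a trend
    recent = sum(values[-window:]) / window
    prior = sum(values[-2 * window:-window]) / window
    return recent > prior if lower_is_better else recent < prior

# e.g., average length of stay in days, where lower is better:
los = [5.1, 5.0, 4.8, 4.9, 5.3, 5.6]
print(trending_wrong(los, lower_is_better=True))  # True -> request detailed analysis
```

A flag like this would prompt the more detailed analysis and active PDSA cycle the team describes, rather than acting as an automatic intervention.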

[For more information, contact:

Jennifer Harville, Director, Clinical Effectiveness, Swedish Medical Center, 747 Broadway, Seattle, WA 98122-4307. Telephone: (206) 386-3762. Fax: (206) 386-3546. E-mail: Jennifer.Harville@swedish.org.]