Tracking data is key

To monitor progress during the project, each ED tracked the following criteria: total length of stay, length of stay for admitted patients, length of stay for ED patients, length of stay for fast track, presentation to physician evaluation, and admission cycle time. "Not every ED has a tracking system, so you need to learn how to collect data easily and in real time, with the least effort possible," says Linda Kosnik, RN, MSN, CEN, unit manager for the ED at Overlook and co-chair of the collaborative.

For EDs without an electronic information system, a data collection form was provided that incorporated sampling, says Kevin Nolan, a statistician at Silver Spring, MD-based Associates in Process Improvement, who worked with participating EDs. "A sample of patient records was selected each day, with the data summarized weekly using a median," he explains. "The medians are plotted on a run chart to monitor progress, along with data on clinical cycle times and patient satisfaction."
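The method Nolan describes (daily record pulls summarized by a weekly median, with each median becoming one point on a run chart) can be sketched in a few lines of Python. The daily length-of-stay values below are invented for illustration; the median is used because it resists distortion from the occasional extreme stay.

```python
import statistics

# Hypothetical daily samples: total ED length of stay in minutes
# for a handful of records pulled each day of one week.
daily_samples = {
    "Mon": [142, 198, 120, 175],
    "Tue": [160, 133, 210, 148],
    "Wed": [125, 180, 155, 199],
    "Thu": [170, 140, 165, 190],
    "Fri": [150, 135, 172, 160],
}

# Pool the week's samples and summarize with a single median.
week_values = [m for day in daily_samples.values() for m in day]
weekly_median = statistics.median(week_values)

# Each week's median becomes one point on the run chart.
run_chart = []  # list of (week_label, median_minutes)
run_chart.append(("Week 1", weekly_median))
print(run_chart)  # → [('Week 1', 160.0)]
```

Over successive weeks the list grows by one point per week, and plotting it shows the trend at a glance.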

Several weeks before the collaborative started, each ED was sent information that outlined the basics of data collection, including sampling. "At the first learning session, we talked to the whole group, showed examples, and worked individually with those groups that had problems or questions," says Nolan. "As the teams send in their progress reports, we give them ongoing feedback on their data collection and visual displays."

Sampling data is a simple, efficient way to understand how a system is performing, says James Espinosa, MD, FACEP, FAAFP, chairman of the department of emergency medicine at Overlook Hospital in Summit, NJ. "It's astonishing that even relatively sophisticated ED managers do not yet appreciate the power of sample data displayed as a trend," he stresses. "We have, by and large, been reared on pre- and post-intervention systems, comparing two cuts of data and looking for the difference between the two groups. That is the shape that tends to emerge from clinical trials."

The work of daily sampling calls for a different skill set than many managers are used to, says Espinosa. "The whole point is turning data into a picture that is simple and compelling," he explains. "Charts can be enlarged, colorized, and placed near a coffee machine, which creates a storm of buy-in from staff."

At Nash General, sampling data and trending it over time proved a cost-effective way to measure progress, says Kirk Jensen, MD, medical director of the department of emergency medicine. "At first we thought we needed to wait for a $30,000 computer system. We do this all by hand, and have learned that if we sample four charts a day over time, we do get an accurate view of where we are going," he explains. Charts are selected several times per day for three types of patients: emergency, fast track, and admissions.
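The daily pull Jensen describes amounts to a small random sample drawn separately for each patient type. A minimal sketch, assuming a per-type sample size of four (chosen here for illustration) and a hypothetical visit log:

```python
import random

# Hypothetical one-day visit log: (chart_id, patient_type).
visits = [
    ("C101", "emergency"), ("C102", "fast track"), ("C103", "admissions"),
    ("C104", "emergency"), ("C105", "emergency"), ("C106", "fast track"),
    ("C107", "admissions"), ("C108", "fast track"), ("C109", "emergency"),
    ("C110", "admissions"), ("C111", "fast track"), ("C112", "emergency"),
    ("C113", "admissions"),
]

CHARTS_PER_TYPE = 4  # assumed sample size per patient type

def draw_daily_sample(visits, seed=None):
    """Randomly pull up to CHARTS_PER_TYPE charts for each patient type."""
    rng = random.Random(seed)
    sample = {}
    for ptype in ("emergency", "fast track", "admissions"):
        pool = [cid for cid, t in visits if t == ptype]
        sample[ptype] = rng.sample(pool, min(CHARTS_PER_TYPE, len(pool)))
    return sample

todays_sample = draw_daily_sample(visits, seed=1)
for ptype, charts in todays_sample.items():
    print(ptype, charts)
```

In practice the "random number generator" can be as simple as pulling every nth chart from the rack, which is why the method works by hand.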

The method has proven itself over time. "Eighteen months later, we are still doing it by hand with secretaries, student volunteers, and people who answer the radio," says Jensen. "In fact, the less involved the individual is with the department, the more pristine the figures."

There is no need to invest in high-priced tools, Espinosa advises. "In emergency medicine, an immense amount of attention is paid to selecting sophisticated digitized tools with a cycle time from implementation measured in years. But there is very little practical experience or evidence on how these tools are going to change systems," he says. "The data does not jump out of a box graph by itself and find its way to an administrative office, or to the hearts and minds of workers in the ED."

Teams measured their progress with an annotated time series. (See Emergency Care Center Turnaround Times graph on page 115.) "Making both the problem and progress visible builds support for continuing to make the changes necessary to reduce delays," says Nolan.
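An annotated time series of this kind pairs each weekly median with a note recording when a change idea was introduced, so the chart shows both the trend and its likely causes. A minimal plain-text sketch, with invented medians and annotations:

```python
# Hypothetical weekly medians (minutes) with notes marking when each
# change was introduced -- the "annotations" on the time series.
series = [
    ("Week 1", 160, None),
    ("Week 2", 152, "Started bedside registration"),
    ("Week 3", 147, None),
    ("Week 4", 138, "Added fast-track hours"),
]

def render(series):
    """Return the annotated run chart as plain-text lines."""
    lines = []
    for week, median, note in series:
        suffix = f"  <- {note}" if note else ""
        lines.append(f"{week}: {median} min{suffix}")
    return lines

for line in render(series):
    print(line)
```

The same data could feed a plotting tool, but even a hand-drawn version makes the problem and the progress visible, which is the point Nolan emphasizes.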