Collecting, sharing data tops list of challenges for quality managers

How quality is defined, measured seen as more vexing than ever

What keeps quality managers up at night? What do they consider their greatest challenges? A survey of quality professionals reveals a decided commonality, with concerns centering on how quality is defined, measured, and shared with national organizations and the public.

"One area we all struggle with is defining what quality is," says Tom Knoebber, CPHQ, Six Sigma Black Belt and director of performance improvement for Mission Hospitals in Asheville, NC. He notes that quality managers are besieged by a host of quality "definers" — "from the many public vendors announcing their new awards for excellence to CMS with their top performers, as well as hospital boards that want a high-quality, low-cost hospital but are unable to define a problem beyond anecdote."

"We are faced with requirements from our regulatory agencies, payers, purchasers, and the public," adds Kathy Schumacher, MSA, CPHQ, director of quality, safety, standards, & outcomes, William Beaumont Hospital, Royal Oak, MI. "They all want many things from us as a health care system, and we have an obligation to be responsive to those needs — like the need for more transparency for our data and outcomes. That's one of our greatest demands."

Sandra Trotter, MBA, MPHA, CPHQ, patient safety program director, Lucile Packard Children's Hospital, Stanford University Medical Center, Palo Alto, CA, agrees. "The greatest challenge facing us at Packard is the increasing number of regulations that are sometimes at counter purposes," she notes. "There are a large number of regulatory, accreditation, and other organizations with hospital guidelines and requirements — many of which don't align."

All of this pressure to provide more and more information in an atmosphere of economic woes and downsizing makes the challenge even greater, notes Patrice L. Spath of Brown-Spath & Associates, Forest Grove, OR. "The challenge is how to maintain a quality department in a downsizing environment and maintain viability," she asserts.

For many of these challenges, quality professionals have discovered at least partial answers. For example, Spath notes, the answer to succeeding in a downsizing environment is to create a quality department as an "internal consultant," as opposed to one that does all the data-collecting work for all the hospital departments.

"The hospital departments and process owners have a lot of ownership for collecting their own data," she asserts. "That is one of the keys in being successful in spreading the wealth. Of course, everybody else is very busy, so there will be a lot of push-back from department managers, which means that leadership has to intervene and create that environment."

That can be a challenge, she concedes, and involves knowing just what it costs to collect a data element, "so when they make decisions about data to collect, they're able to equate it with the resources needed to collect it, and the decision can be made at the leadership level as to which is the most efficient place from which the data should be gathered."

The quality department, Spath continues, should have a keen understanding of all the information sources in the hospital, so if someone asks, for example, for data on CT scans, you can tell them that the radiology department routinely collects those data, and they should tap into their resources.

"The quality department should act as an internal consultant not just for the collection of data, but also to help people design data collection in other departments or to help people formulate the study question they are looking at," Spath says. "So, for example, when physical therapy decides they want to monitor patient outcomes relative to pain, they should be encouraged to meet with the quality people to help them design that study. In general, I don't think many organizations regard the quality department as a resource they should tap into."

Culture change required

Schumacher takes a slightly different approach. "A lot of what we are doing is becoming much more transparent with data and outcomes; we are very open about outcomes and performance data," she notes. "These data are out there, and as an institution and as a leader we have the obligation to provide them."

How does she do that? "Not with more staff — just a change in culture; a change in the way we do things," she responds.

"We've been on this mission a number of years and we continue to work on it," Schumacher adds. "It's not so much a focus on individual blame but on redesigning processes and systems factors. We've instituted a lot of collaboratives to help us do that."

For example, she notes, "We're very integrated into the work of the Michigan Health & Hospital Association's Keystone initiative's ICU program to reduce infections. We're a beta site for the surgery initiative and also very engaged in their HAI [hospital-acquired infection] program."

Beaumont also has focused on outcomes as well as PI initiatives, she says. "For that we've participated in a lot of national programs like NSQIP (the National Surgical Quality Improvement Program); and we're focused on bariatric outcomes as a bariatric center of excellence. It's great to initiate PI programs, but if you do not have a way to measure outcomes they almost become just something else you're doing," she says.

Defining quality

"It is difficult to find that one measure that would define a quality hospital," notes Knoebber. "If you choose to focus your efforts on what you internally believe are quality metrics and a new study comes out citing a new outcome, hospitals are forced to react to the public and media to ensure they are 'on the list' if it's a new measure of quality."

Knoebber says that internally his department has tried to address "this constant diversion" by conducting an annual assessment of all public databases, internal rankings, and projects, assigning each a score similar to an FMEA Risk Priority Number.

"We ask an objective team to rate an assessment of these macro measures or projects — this past year we had 52 — individually using three questions on a scale from 1-10," he shares. The questions are:

1: Value — How important or relevant is it to employees, the system, or patients? (10 high, 1 low)

2: Opportunity — Do we have opportunity; are we at risk? (10 high, 1 low)

3: Cost — What resources are required to achieve the goal or attain high performance? (10 low-cost, 1 high-cost)

"The product of these three numbers lets us rank where we should focus," Knoebber explains. "Within this model, we are able to include mandatory items, and public reporting tends to score high as a value to the system. We can then use this to communicate what our priorities are, or where their individual projects fit."
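The scoring model Knoebber describes can be sketched in a few lines of code. This is an illustrative sketch only, not the hospital's actual tool; the project names and ratings below are hypothetical, and the cost rating is scored inversely (10 = low cost, 1 = high cost) as the questions above specify, so a higher product always means a stronger candidate for focus.

```python
# Sketch of the three-question prioritization model described above.
# All project names and ratings are hypothetical examples.

def priority_score(value: int, opportunity: int, cost: int) -> int:
    """Multiply the three 1-10 ratings, FMEA-RPN style (range 1-1000).

    Note: cost is rated inversely (10 = low cost, 1 = high cost),
    so higher products consistently indicate higher priority.
    """
    for rating in (value, opportunity, cost):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must be on a 1-10 scale")
    return value * opportunity * cost

# Hypothetical assessments of a few "macro measures or projects":
# (name, value, opportunity, cost)
assessments = [
    ("Public mortality reporting", 9, 7, 6),
    ("Internal hand-hygiene audit", 6, 8, 9),
    ("New vendor excellence award", 3, 4, 5),
]

# Rank by the product of the three ratings, highest first.
ranked = sorted(
    ((name, priority_score(v, o, c)) for name, v, o, c in assessments),
    key=lambda item: item[1],
    reverse=True,
)

for name, score in ranked:
    print(f"{score:4d}  {name}")
```

With these made-up ratings, the hand-hygiene audit (6 × 8 × 9 = 432) outranks public mortality reporting (378), while the low-value award scores only 60, which is the kind of ranking an objective team could then use to defend resource decisions.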

Having said this, Knoebber adds that "there is always 'executive privilege,' and as new things do pop up we always work them through — but this at least helps us defend or justify the resources needed to support our defined quality focus."

Does coding determine quality?

Unfortunately, some challenges don't lend themselves easily to solutions — take coding, for example. "I think probably the biggest challenge from a quality perspective is how we are going to manage being paid for quality based on coding," says Bev Cunningham, MS, RN, vice president, clinical performance improvement, at Medical City Dallas Hospital. "What concerns me is that the decision on mistakes is based on ICD-9 coding originally designed for billing only. Now what's happened is that it's being attached to quality."

The good news, she says, is that coding appears to be "catching up a little bit." For a long period of time, she notes, pressure ulcers were covered by a single ICD-9 code, "so you couldn't tell if a patient came in with one." Now, says Cunningham, "You can indicate if they are present on admission or not, and pressure ulcers are divided into four stages, so there is a little better definition." Unfortunately, she adds, "A lot of other conditions are not defined like that."

Re-admissions also are problematic, she says. "If a patient goes home and chooses not to follow the physician's directions, there's really no ICD-9 code that describes that," she observes.

The move to ICD-10 coding may help, says Cunningham, "but everything goes so slowly, and I'm not sure that will answer all our questions."

There's not much quality managers can do if coding is not fixed, Cunningham notes. "I wish we had an answer, but I don't," she says. "Being verbal through our hospital associations and even aligning with physicians might help, because physicians will be graded the same way. Wait until they figure that out!"

[For more information, contact:

Bev Cunningham, MS, RN, Vice President, Clinical Performance Improvement, Medical City Dallas Hospital, Dallas, TX. Phone: (972) 566-6824. E-mail:

Tom Knoebber, CPHQ, Six Sigma Black Belt, Director of Performance Improvement, Mission Hospitals, 509 Biltmore Avenue, Asheville, NC 28801. Phone: (828) 213-9194. E-mail:

Kathy Schumacher, MSA, CPHQ, Director of Quality, Safety, Standards, & Outcomes, Director of the Surgical Learning Center, William Beaumont Hospital, Royal Oak, MI. Phone: (248) 551-9707. Fax: (248) 551-9700. E-mail:

Patrice L. Spath, Brown-Spath & Associates, P.O. Box 721, Forest Grove, OR 97116. Phone: (503) 357-9185. E-mail:

Sandra Trotter, MBA, MPHA, CPHQ, Patient Safety Program Director, Lucile Packard Children's Hospital, Stanford University Medical Center. Phone: (650) 725-0631. Fax: (650) 497-8465. E-mail: STrotter@LPCH.ORG.]