Baldrige criteria provide framework for assessment

Grid pinpoints improvement priorities

"To determine where we want to go, we need to get a feel for where we are now," says Myrtle Kimble, RN, MPPA, director of the office of quality assessment at the Department of Veterans Affairs Medical Center in Jackson, MS.

That’s why a team at the medical center developed some innovative tools to conduct a complete and objective organizational assessment, assist in determining performance of key processes, and aid in setting priorities for strategic planning. The organizational assessment was the first step in developing the strategic plan.

In 1995, when these tools were developed, quality leaders at the medical center wanted the strategic planning process to be a little different from prior years, says Toni Layer, RN, MHCA, CPHQ, quality assessment specialist. They wanted the quality manager to be involved in the process because she had global knowledge of the organization; they wanted the assessment team to rely on existing data sources instead of re-collecting data that already existed; and they wanted to design tools that would provide a systematic framework for the assessment process, Layer says.

A working team consisting of the associate director, a representative of the chief of staff office, the nurse leader, the environmental manager, and the quality manager was responsible for the assessment. They decided to use the seven Malcolm Baldrige National Quality Award criteria as the basis for their efforts, Kimble says. "Each of the criteria became our broad headings for information collection."

The next step was to decide, through brainstorming, whether the medical center had existing data sources that would tell team members how well or poorly they were performing in each of those areas. "That’s why we said the quality manager should be on this team, because she has a good feel for knowing what the data sources are."

Going through this process served another purpose too: It showed team members where the hospital’s measurement systems were weak, Kimble adds. The categories and data sources were:

• leadership — Joint Commission survey results, the Quality System survey (a VA tool by which employees assess the organization in the same seven categories), bottom-up peer evaluations;

• information and analysis — information management committee employee survey and annual report, Joint Commission survey results;

• strategic planning — previous year’s strategic plan and whether those goals were achieved;

• human resources — employee surveys, employee database, exit interviews, employee suggestions, the Quality System survey;

• customer results and satisfaction — hospitalwide and department-level customer satisfaction data, VA patient surveys, patient contact tracked through the patient representative program, patient focus groups;

• process management (emphasizing Joint Commission functions) — committee meeting minutes, service-level performance improvement plans (the QM department had to integrate information from services to analyze processes and functions);

• organizational performance results — patient satisfaction survey, medical/administrative database.
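The category-to-source mapping above is, in effect, a simple lookup structure. As a hypothetical sketch only (the team worked on paper; the dictionary below just restates the article's list in code form), it might look like:

```python
# Illustrative sketch: the seven Baldrige categories mapped to the
# existing data sources the team identified for each. Structure and
# naming are hypothetical; content is taken from the article's list.
data_sources = {
    "leadership": [
        "Joint Commission survey results",
        "Quality System survey",
        "bottom-up peer evaluations",
    ],
    "information and analysis": [
        "information management committee employee survey and annual report",
        "Joint Commission survey results",
    ],
    "strategic planning": [
        "previous year's strategic plan and goal attainment",
    ],
    "human resources": [
        "employee surveys", "employee database", "exit interviews",
        "employee suggestions", "Quality System survey",
    ],
    "customer results and satisfaction": [
        "customer satisfaction data", "VA patient surveys",
        "patient representative program contacts", "patient focus groups",
    ],
    "process management": [
        "committee meeting minutes",
        "service-level performance improvement plans",
    ],
    "organizational performance results": [
        "patient satisfaction survey", "medical/administrative database",
    ],
}
```

A structure like this also makes the gap-finding step concrete: any category whose source list comes up empty points to a weak measurement system.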

Team members’ next step was to make judgments about the meaning of the data, Layer says. They developed a grid on which they could list each category, the information source, any issues arising from the data, a judgment about whether each issue — there were about 50 — represented a strength or a weakness, and a rating of the hospital’s performance level (low, average, or high) for each category. For example:

• Under information and analysis, the Joint Commission survey stated that improvement was needed in documentation of allergies in outpatient records — a weakness.

• Under organizational performance results, customers had noted on their surveys that waiting times in selected clinics were better than expected — a strength.

• Under human resources, an employee database showed a low turnover rate for selected positions — a strength.

Performance levels were defined as:

• High:

— continuous efforts to improve;

— positive trends;

— high levels of satisfaction.

• Average:

— improvement with problems;

— variation in performance;

— customer complaints.

• Low:

— no improvement in process;

— poor performance;

— frequent customer complaints.
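The grid can be pictured as a small table of records. The sketch below is purely illustrative — field names, sample rows, and the performance ratings attached to them are assumptions for demonstration, not the medical center's actual tool:

```python
from dataclasses import dataclass

# Hypothetical sketch of one row of the assessment grid described above.
# Field names are illustrative; the performance ratings assigned to the
# sample rows are assumed for demonstration only.
@dataclass
class GridRow:
    category: str     # one of the seven Baldrige categories
    source: str       # existing data source consulted
    issue: str        # issue arising from the data
    rating: str       # "strength" or "weakness"
    performance: str  # "low", "average", or "high"

rows = [
    GridRow("information and analysis", "Joint Commission survey",
            "documentation of allergies in outpatient records",
            "weakness", "low"),
    GridRow("organizational performance results", "patient satisfaction survey",
            "clinic waiting times better than expected",
            "strength", "high"),
    GridRow("human resources", "employee database",
            "low turnover rate for selected positions",
            "strength", "high"),
]

# Pulling out the weaknesses gives the raw material for priority setting.
weaknesses = [r.issue for r in rows if r.rating == "weakness"]
```

With roughly 50 issues on the real grid, a filter like the last line is the step that feeds the prioritization matrix described next in the article.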

Next, team members took priorities as determined by a leadership group and designed a prioritization matrix. (The leadership team prioritized issues as critically important, very important, or important based on 10 critical success factors such as cross-functional impact, customer focus, high risk, and high volume.) All organizational processes were placed on a matrix according to their importance and the hospital’s performance. (See sample of matrix, p. 29.)

This simple grid assisted in determining where improvements should be made — in those areas that were ranked as critically important but low or average in performance, Kimble says.
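The selection rule Kimble describes reduces to a simple test on the two matrix axes. As a hypothetical sketch — the function name and the sample processes below are invented for illustration, using the importance and performance levels named in the article:

```python
# Illustrative sketch of the prioritization matrix's selection rule:
# improvement targets are processes rated critically important whose
# performance is low or average. Sample data is hypothetical.

def improvement_targets(processes):
    """Return names of processes that are critically important
    but low or average in performance."""
    return [name for name, importance, performance in processes
            if importance == "critically important"
            and performance in ("low", "average")]

sample = [
    ("outpatient record documentation", "critically important", "low"),
    ("clinic waiting times", "very important", "high"),
    ("staff turnover tracking", "critically important", "high"),
]

print(improvement_targets(sample))  # ['outpatient record documentation']
```

Processes that are critically important but already performing at a high level drop out of the result, which is exactly what keeps the improvement agenda focused.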

In the past, organizational assessment was unstructured, and no one knew the steps or what the result should be, Kimble continues. "The new process helped share the workload, helped service chiefs get a clearer picture and feel some involvement in the process, and helped us meet the Joint Commission standard for a systematic assessment process. In the past, it was more subjective and involved guesswork."

[Editor’s note: For more information, contact: Toni Layer, Quality Assessment Specialist, VA Medical Center, 1500 E. Woodrow Wilson, Jackson, MS 39216. Telephone: (601) 362-4471, Ext. 5577.]