Competency testing gains in priority, but struggles with lack of standards

ICUs search for criteria in designing suitable performance exams

How effective is your competency assessment tool? Frazzled by a lack of both time and materials, busy managers are using homegrown creativity to assess their ICU staffs’ abilities by fashioning assessment tools pulled together from inside and outside their organizations.

In the absence of national guidelines or standards covering competency assessment, many departments are resorting to internally devised interviewing tools and exams when evaluating their staff’s technical know-how.

"You usually end up developing your own because each unit as a whole operates in different ways," says Georgiann Homuth, RN, MS, CCRN, a clinical nurse specialist and competency assessment expert at Swedish American Health System in Rockford, IL.

Homuth emphasizes that exams should test nurses on the less obvious, less-often practiced and high-risk areas of patient care. "It’s the things that nurses don’t do often that you want to test them on."

However, not all hospitals are in agreement with that approach.

At the Department of Veterans Affairs Medical Center (VA) in West Palm Beach, FL, for example, nurses developed an interviewing tool that emphasizes detailed problem-solving situations designed to test a broad range of nurses’ clinical knowledge.

The tool assigns one of four grades, ranging from negative and neutral to positive and double positive, based on how well a nurse can evaluate patients’ problems and prescribe appropriate, effective interventions, according to the VA.

In developing the tool, nurses at the VA drew chiefly on nurse certification exams and similar outside sources. The tool was comparable to the critical care nursing certification exam offered by the American Association of Critical-Care Nurses (AACN) in Aliso Viejo, CA.

Competency tool based on hospital need

But a large part of it was based on the contents of the nursing literature and what the hospital needed to accomplish in recruiting and retaining good nurses, says Sara Moore, RN, MSN, CCRN, a critical care nurse who helped develop the VA hospital’s testing tool.

In 1997, Moore won a National Teaching Institute Creative Solutions Abstract Award offered by AACN for her work in devising the competency-interviewing tool at the VA.

Whether efforts such as those are reliable is open to question, given the lack of national competency-testing standards. What is known is that ICUs are assessing nurse competencies in a variety of ways and using various testing criteria.

But no one is certain how well those tools actually gauge nurse competencies, according to nurses contacted by Critical Care Management.

For this reason, many RNs say that managers need to begin the process by developing relevant standards and objectives to guide them in assessing competency.

According to Moore, departments vary widely in the criteria they bring to the process. One hospital may differ markedly from another. And in many cases, some nurses speculate, the quality of competency testing may range from good to nonexistent.

Yet abundant resources are available, including the Internet, private consultants, and nursing organizations, for managers who want to create accurate, intelligent skills-assessment tools.

Short of fixed national standards, these resources do a good job, Homuth says. "But they involve some research and footwork," she adds. The difficulty helps to explain the variety of assessment approaches, Homuth says.

For many nursing departments, competency assessment is the least-liked, most onerous responsibility that managers are required to perform, according to veteran managers. (For a few reasons and factors that may make competency testing more difficult, see the article on p. 135.)

The chief complaint is that the process is slow, labor-intensive, and often doesn’t accurately reflect the day-to-day clinical realities of nurses’ responsibilities.

Testing tools should target objectives

But that can occur when nurses put too little time or effort into constructing relevant, effective testing tools, says Homuth. The first place to begin is by creating guidelines that target desired objectives.

Testing new nurses or recent college graduates differs markedly from assessing veteran nurses with five or more years in critical care, she observes.

The content of the test is important. But "whatever your objectives, the guidelines you develop should incorporate more than the actual test design," Homuth adds.

The guidelines should include:

creation of pretesting materials;

preparation protocols, including necessary inservice coursework;

actual testing schedules and intervals between tests;

criteria for selection of nurses to be tested;

how often the test should be scheduled during the year;

an equitable grading system;

provisions for training and retesting nurses who don’t pass the first time.

Here are some additional ideas on guideline development suggested by Homuth and Moore:

Emphasize actual nurse practice.

Incorporate questions and testing scenarios into the assessment tool that concentrate on what nurses do every day, and what is done most often.

But also focus on situations that seldom occur, Homuth says. This can be done by reviewing the hospital’s database and patient records, and by surveying the staff on its most common bedside procedures.

Testing nurses on less-traveled areas may not reflect what you do 90% of the time, but it sharpens nurses’ skills so they can perform as expected in the remaining 10%, she observes.

Make exams challenging

Avoid written tests.

Design tests that involve oral questions and answers. This format allows the tester and test-taker the opportunity to fully cover each assessment area. The test may be offered in written form, but should be taken orally in the presence of a nurse assigned to do the competency interviewing, says Moore.

At the VA, nurses are tested in two key areas: 1) knowledge, skills, and abilities (KSA) that cover technical and procedural areas of expertise; and 2) work orientation factors (WOF) that involve problem-solving, interpersonal skills, and family visitation scenarios.

Challenge the test-taker.

Create realistic and complex scenarios that require the test-taker to explain in detail the problem-solving steps involved in answering the questions, says Moore.

One option, suggested by Homuth, is to observe a nurse during the shift or give the test orally during the course of bedside care. Of course, this alternative depends on the size, staffing, and amount of activity during the day.

But the actual, hands-on situation can yield a better assessment than a simulation-based test, Homuth notes.

Be comprehensive.

The assessment tool may focus on what nurses actually do in the unit. But it should also cover the entire gamut of nursing skills. "Make a detailed list of things nurses should know," says Homuth.

The list should range from basic skills, such as Foley catheter insertion, to higher-order skills, such as performing textbook nursing care in a pericardiocentesis.

Also, design your tool equitably so nurses without sufficient experience aren’t held accountable for higher-order skills, Homuth says.

Use external resources.

Homuth strongly advocates borrowing from outside sources, especially the AACN’s core curriculum for critical care nurses, in developing assessment tools.

Carol Hartigan, RN, a certification specialist with AACN Certification Corp. (AACNCC) says the core curriculum can furnish a basis for developing assessment questions.

But it won’t provide much more than that. As a reference, however, it does reflect what ICU nurses actually do in the field and can be valuable in helping nurses with test designs.

Another useful source, according to Homuth, is the certification exam for critical care nurses. The test itself is not available for the purpose of competency testing.

But AACNCC provides a blueprint on the Internet that managers may use in developing guidelines, Hartigan says. The blueprint can be accessed by logging on to the Web site: www. (The chart, left, has an excerpt of the CCRN certification blueprint.)

Of course, most ICUs have a procedure manual or "black bible" that outlines in detail the unit’s clinical policies and procedures. The value of the black bible is that it documents only steps and procedures used within the unit, which focuses directly on what nurses most need to know, says Homuth.

Competency assessment has come a long way, says Moore. "From self-study programs and low accountability, the process is becoming much more disciplined and formal," Moore adds. "But hospitals are still largely doing things their own way."