Provide brief, effective performance reviews

Self-evaluations included in process

Sometimes an IRB director will notice that board members lack interest in evaluation processes. Any attempt to assess how each member is doing may be pushed to the back burner by very busy people. So what can an IRB do to improve its member evaluation process and obtain board buy-in?

The answer, one expert says, is to design a very short but effective review process.

“Before this program was implemented, we had a program where IRB chairs evaluated the members,” says Virginia Rhodes, RN, MSN, human research protection program coordinator at the Durham VA Medical Center in Durham, NC.

“People didn’t take it seriously,” she adds. “When we started this program and had peer reviews, interest skyrocketed; people want to be held accountable and do a good job.”

The Durham VA Medical Center’s new IRB member evaluation program provides a brief but thorough review of all IRB members. It has four parts: self-evaluation, peer evaluation, IRB chair evaluation, and research office staff evaluation.

“We didn’t want to create a process that creates more work for people who already are busy,” Rhodes says. “But we want a process that shows how the IRB has functioned as a group.”

Here’s how the organization created an effective and popular evaluation program:

Brainstorm to identify the best ideas. Rhodes met with the IRB chair to discuss the evaluation process and to point out that the existing process was insufficient.

“We were talking about the evaluation process, and he said, ‘It’s kind of a chore; people don’t get much out of it,’” Rhodes recalls. “So we brainstormed about how to make it better, and we thought by involving members we could have more positive feedback on the whole process.”

Create a survey. “We didn’t want the survey to be overly burdensome,” Rhodes says.

The evaluations ask for ratings from 1 (poor) to 5 (excellent) on each question. For peer evaluations, IRB members rate each of their peers on two questions. And each member rates him- or herself on the same two questions, which are as follows:

- How would you rate the quality of your [this member’s] protocol reviews?

- How would you rate the quality of your [this member’s] contributions to IRB discussions?

For the IRB chair, there are three questions, as follows:

- How would you rate the quality of this member’s protocol reviews?

- How would you rate the quality of this member’s contributions to IRB discussions?

- List briefly any contributions this member made above and beyond protocol review and discussion, especially those that contributed to improving the IRB’s policies and procedures.

The IRB staff were asked to complete these questions:

- Out of 12 meetings (regular and urgent) held by the Durham VAMC IRB between April 30, 2011, and May 1, 2012, how many meetings did this member attend?

- How proactive is this member in identifying concerns with IRB review packets/individual submissions?

- How quickly did this member answer PI [principal investigator] requests?

The two IRB co-chairs evaluate each other and do self-evaluations, as well. The administrative staff also evaluates the chairs, Rhodes says.

Look at individual IRB members’ statistics. The administrative staff’s evaluations of IRB members yield useful data that can be used to give feedback to individual members, Rhodes notes.

“Five members of the office rate all reviewers,” she says. “The administrative staff gives an assessment of each member to the chair, and the chair can bring that up with the member at a one-on-one meeting if it’s an issue.”
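The article does not describe how the office tallies these scores, but the arithmetic it implies is simple: each of the five staff raters scores every member from 1 to 5, and the averages go to the chair, who follows up where an average looks low. A minimal sketch of that tally, with hypothetical member names, scores, and follow-up threshold:

```python
# Hypothetical sketch of the staff-rating tally described in the article:
# five staff raters score each IRB member from 1 (poor) to 5 (excellent),
# and the per-member average is summarized for the chair. The names,
# scores, and the follow-up threshold below are illustrative assumptions.
from statistics import mean

# ratings[member] = one score from each of the five staff raters
ratings = {
    "Member A": [5, 4, 5, 4, 5],
    "Member B": [3, 2, 3, 3, 2],
}

THRESHOLD = 3.0  # assumed cutoff for raising the issue at a one-on-one

for member, scores in ratings.items():
    avg = mean(scores)
    note = "discuss at one-on-one" if avg < THRESHOLD else "no follow-up needed"
    print(f"{member}: average {avg:.1f} ({note})")
```

The point of averaging across five raters, rather than relying on a single staff opinion, is that no one rater’s impression drives the feedback the chair delivers.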

Staff base their ratings on the two questions (how proactive the member is in identifying concerns and how quickly the member answers PI requests) on several factors, including these:

- Were there any issues with the packets they received, and how quickly did they look at the packets and ask the staff about any identified problems?

- How quickly do they respond to administrative requests, and do they get their responses back to the office quickly?

- Do they respond to PI requests in a timely manner?

“Once a review is done, investigators can send their questions directly to the IRB reviewer,” Rhodes explains. “PIs do not know initially who did the review, but if they have any questions about recommendations after it’s done, they can direct these questions to the member who did the review.”

Use evaluation results to improve performance. The IRB chair meets one-on-one with members to discuss their evaluations. It gives the chair a good sense of each person’s strengths and weaknesses and helps to identify items to work on, Rhodes says.

“If the member’s evaluation is great then that member can share what he or she is doing with other members, and we’ll talk about it at meetings,” she says.

This could be part of the meetings’ continuing education focus in which members share new things they’ve learned, she adds.

Address IRB review issues that arise. “One of the issues that came up involved timeliness in reviews and getting information back to the office,” Rhodes says. “The administrative staff crunched numbers, looking at how many meetings IRB members attended, how quickly they responded to the principal investigator.”

The data revealed some concrete examples of areas that could be improved.

“Our chairs could say to an IRB member, ‘Is there something we could do to help you be more proactive? Can we get this material to you in a way that is easier for you to handle?’” Rhodes says. “We could deal with this on a case-by-case basis.”

Data collected through the evaluations has made it easier to identify problems.

“With this process we can pinpoint where there might be a problem, and we can address the problem,” Rhodes says. “Before, it was easy for people to complain that someone was slow in responding, but unless you have data showing how far off the mark they are, you can’t do anything about it.”

Handle remaining challenges. “The biggest challenge is finding time for the IRB chair and members to spend 10 minutes or 20 minutes talking about the evaluations,” Rhodes notes. “That’s the hardest part.”

They decided the solution would be to set up these discussion times right before and after IRB meetings, she says.

“That seems to have worked well, and we’ll do that again for the next cycle,” Rhodes adds.

“The evaluation change has been a really positive experience for our members, chairs, and even the office staff,” Rhodes says. “The take-home message is the accountability: You’re accountable to your peers, your chair, to your office; the evaluation says, ‘This is serious; this is important, and I want to contribute to the group, and I also want to do great reviews for our researchers.’”