How IRBs can improve investigators' opinion of the review process

IRBs should strive toward transparency

IRBs often err too much on the side of caution, creating a situation in which investigators do not trust their judgment and doubt their fairness, an expert says.

There are a number of ways an IRB can improve the review process, making it appear more equitable and reasonable to principal investigators, says Gerald Koocher, PhD, dean of the school of health studies and professor of psychology at Simmons College in Boston. Here are Koocher's suggestions:

1. Demonstrate consistency. "One thing IRBs could do is publish statistics every once in a while on how long the average protocol took to be reviewed and how many corrections were required," Koocher says.

Other statistics that would be helpful are:

— What percentage of protocols were declined?

— What were the most common mistakes needing corrections?

"Simple descriptive statistics in a report that principal investigators and others could read might be one example to attest to fairness," Koocher says.
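The kind of report Koocher describes requires nothing more than simple aggregation. As a minimal sketch, assuming an IRB keeps per-protocol records (the field names and sample values below are hypothetical, not from any real IRB system):

```python
from collections import Counter
from statistics import mean

# Hypothetical review records; fields and values are illustrative only.
protocols = [
    {"days_to_review": 21, "declined": False, "corrections": ["consent wording", "recruitment flyer"]},
    {"days_to_review": 35, "declined": True,  "corrections": ["consent wording"]},
    {"days_to_review": 14, "declined": False, "corrections": []},
    {"days_to_review": 28, "declined": False, "corrections": ["consent wording", "data security plan"]},
]

# The three figures Koocher suggests publishing:
avg_days = mean(p["days_to_review"] for p in protocols)
pct_declined = 100 * sum(p["declined"] for p in protocols) / len(protocols)
common_fixes = Counter(c for p in protocols for c in p["corrections"]).most_common(2)

print(f"Average review time: {avg_days:.1f} days")
print(f"Protocols declined: {pct_declined:.0f}%")
print("Most common corrections:", common_fixes)
```

Publishing even these few numbers annually would give investigators a baseline against which to judge whether their own protocol's handling was typical.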

2. Invite investigators to IRB meetings. "At one institution I was at, the IRB never invited investigators to be present," Koocher recalls. "At another IRB, investigators were asked to be available on call during the IRB meeting so if the IRB had questions, they could be invited in to answer them, and you can guess which IRB was seen as fair."

IRBs need to have a degree of transparency and to project a sense that all protocols are treated the same, whether they come from a chair of the department or a junior investigator, Koocher suggests.

3. Have a mechanism for corrections. It doesn't have to be a formal appeals process, but there should be some mechanism by which investigators can ask the IRB to review or reconsider its decision, Koocher says. This could be as simple as an informal challenge to the changes the IRB views as necessary in a protocol or consent form.

For example, Koocher once had a federal grant to create an intervention with families who had lost a child. The intervention would provide preventive psychotherapy to 120 families over a four-year period. When Koocher submitted the consent form, someone on the IRB sent a message back that it needed to be revised to tell participants about how investigators would be required to report any knowledge of child abuse according to the state's mandatory child abuse reporting laws.

"I challenged that," Koocher says. "If you have a family that has lost a child and has surviving children and you tell them that if they tell you they're abusing the surviving children that you'll call social services, then that's not helpful to them."

The statement itself could harm the participants, he adds. "I looked up statistics and saw that the probability of having a family that abuses children come into the study was less than one in 1,000, so with 120 families we wouldn't even expect it," Koocher says. "And if we did see child abuse, we'd stop the study and say that anything we saw would result in a report."
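The arithmetic behind that judgment can be made explicit. The figures (a per-family probability under 1 in 1,000, and 120 families) come from the article; the binomial "at least one" step is an added illustration:

```python
# Back-of-the-envelope check of Koocher's argument: with a per-family
# abuse probability of under 1 in 1,000 and 120 enrolled families,
# how many abusing families would the study expect to encounter?
p_abuse = 1 / 1000      # upper bound cited in the article
n_families = 120

# Expected number of abusing families across the whole sample.
expected_cases = n_families * p_abuse   # 0.12 -- well below one

# Probability that at least one such family enrolls at all
# (complement of the binomial "zero cases" outcome).
p_at_least_one = 1 - (1 - p_abuse) ** n_families

print(f"Expected cases: {expected_cases:.2f}")
print(f"P(at least one): {p_at_least_one:.3f}")
```

With an expectation of roughly a tenth of a case over the entire four-year study, the mandated warning would burden every family to cover an event the design made very unlikely.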

Finally, Koocher asked the IRB to consider whether every other study involving families that it had reviewed, including medical studies, had been required to include the child abuse reporting disclaimer.

"If they're asking me to do something risky, why aren't they asking everyone?" he says.

In the end, the IRB changed its decision and did not require the child abuse reporting phrase in the informed consent. But without the opportunity to question the IRB's decision, Koocher would have been left with what he considered only bad choices.

4. Make the IRB representative. IRB members should have knowledge in the research content areas of protocols they review, Koocher says.

"One way IRBs can do that is to bring in ad hoc members," he suggests. "If you have an IRB composed of physicians and nurses and you receive a behavioral research study, it would be important to have a behavioral scientist to talk with about that study."

Likewise, an IRB that reviews protocols from a psychiatry department might need an endocrinologist as a consultant for when there are biomedical studies to review, Koocher adds.

"When the content area of the study is not represented in the expertise on the IRB, the board can enhance the sense of fairness by inviting in a consultant who has the necessary expertise," he explains.

"I don't know if that happens enough," Koocher notes. "I've been on IRBs where we've done that, such as one where we had a proposal about domestic violence of kids, and we had no one who knew about domestic violence, so we found an expert who could speak with us by speaker phone."

IRBs should be diverse, representing members of the community and the range of specialties present at a research institution, Koocher says.

"This optimizes the likelihood that different viewpoints are represented," he says.

5. Work toward transparency. "IRBs can publish who is on the IRB, and they can describe what their procedures are," Koocher says. "They can publish minutes of the meeting without attribution."

This can be done without violating investigators' privacy or identifying exactly which IRB member made what comment, he adds.

Koocher was once involved with an IRB, and when he asked who the members were, he was stonewalled. Finally, he informed the IRB that the members' names were public information and asked whether they intended to make him file a Freedom of Information request.

For IRBs that would like to know how they perceive themselves and how investigators perceive them in the area of justice and fairness, there's a free, downloadable IRB assessment tool available on the web site, Koocher says. (See sample of assessment tool, left.)

"Click on 'forms' at left and then scroll down to the IRB RAT rating assessment tool," Koocher says. "We're giving it away, and it's not copyrighted or anything we're trying to sell."

The instrument, which already is being used at many institutions, is likely to be a good tool for improving an IRB's image simply because it's rare for IRBs to ask investigators to tell them how they're doing, Koocher notes.

"One thing that was interesting is a lot of people who downloaded the instrument asked us to tell them how they compare," Koocher says. "We say we did not intend this to be used as a standardized instrument, and we don't want them to compare themselves with other institutions."

The goal is for IRBs to use the tool in their own institution, try it for a year, change what needs to be changed, and then take the assessment again, he explains.