Over-regulating can put subjects at risk

Research ethicist offers ways to lessen that risk

In their zeal to protect research participants from undue risk, are IRBs actually making them more vulnerable, by causing frustrated researchers to circumvent the IRB system?

Research ethicist Patricia Keith-Spiegel, PhD, suggests that dynamic may be in place at some troubled research institutions. She worries that an IRB's overzealousness or uncooperative attitude could drive some research underground.

And she says there's even a model that explains this type of behavior: organizational justice theory, which predicts that employee (or in this case, researcher) misconduct will increase when a system is perceived to be unfair, arbitrary or biased.

But Keith-Spiegel suggests there are things IRBs can do to prevent such incidents, simply by changing the way they deal with researchers and making their processes more transparent.

"I think in some ways, if IRBs are amenable to it, it's a fairly simple fix," Keith-Spiegel says. "It doesn't require a lot of staff. I think IRBs would look at [the suggestions] and say, 'We could do that a little better,' and it might really pay off in ways that they don't even recognize."

Researchers get around IRBs

Keith-Spiegel, of Aptos, CA, is a consulting editor of the journal Ethics and Behavior and is past chair of the ethics committee of the American Psychological Association (APA). She says she became interested in the workings of IRBs when she began hearing anecdotally about social-behavioral researchers at various institutions who had found ways to avoid what they believed was onerous IRB oversight.

Several such anecdotes were published last year in an article Keith-Spiegel wrote for the journal Ethics and Behavior:1

  • An investigator considers her IRB unresponsive and arrogant, and so collects most of her research data as "regular educational assignments" carried out in her classroom or through required homework. When she finds data that interests her, she submits a protocol to the IRB requesting use of data already collected for non-research purposes.
  • An investigator who has had several run-ins with his IRB over issues he believes are picky and unnecessary submits elaborate protocols he knows will bore readers, while distorting or omitting elements he believes the IRB might object to.
  • An investigator who believes his IRB's consent requirements are too strict simply doesn't seek approval for his protocols. In doing so, he takes the chance that a publication based on a protocol never submitted to the IRB might be noticed.

Most of the incidents Keith-Spiegel learned about involved social-behavioral researchers, in part because she is a social-behavioral researcher herself. She says the more extreme examples of IRB circumvention are probably less likely among biomedical researchers, where the risk to subjects and the potential for discovery may be greater.

Since the article was published, she's heard about even more examples of angry researchers circumventing their IRBs.

"There are people who have taken some really major risks, which rather surprised me," she says. "It seems like the more draconian the IRB is, especially when it's accompanied by arrogance and perceived incompetence, those are the institutions that are really at risk for this phenomenon."

When Keith-Spiegel learned more about organizational justice theory, these incidents began to make "a sad sort of sense."

The theory is often used to study fairness and perceptions of fairness in the workplace, and how those perceptions affect employees. There are three facets to organizational justice:

  • Procedural justice: appraising the process used to make decisions, whether they're seen as fair, consistent and accurate, and whether employees have a voice in decisions.
  • Interactional justice: how people are treated by decision makers. Even when there's a negative decision, an employee is more likely to accept it if he or she feels respected and treated with care.
  • Informational justice: full explanations of the decision-making process, so that an employee understands the basis for an unfavorable decision.

All of these facets play into negative perceptions that researchers have about IRBs, Keith-Spiegel says. In fact, she notes that when researchers were surveyed about what qualities they wanted to see in their own IRBs, qualities of organizational justice topped the list, ranking even above protection of human subjects.2

"This was a huge national sample of federally funded investigators," she says of the survey, which was published this year in the new Journal of Empirical Research on Human Research Ethics. "Protecting research participants, although important, was No. 7 on the list. That's absolutely fascinating, how important this stuff is to researchers."

Keith-Spiegel says IRBs need to understand that for researchers, the ability to conduct research is more than a career or money issue.

"It's about their self-concept, the research is really who they are," she says. "I think IRBs need to understand that that's why researchers may be so anxious and so obnoxious. Because to them, it's so critically important."

Dissatisfaction about the way an IRB operates can make an investigator more prone to excuse misconduct. Keith-Spiegel has conducted a study in which researchers read a vignette about an investigator who was turned down by an IRB, but with varying degrees of "fair" or "unfair" treatment. The investigator in the vignette then responded with differing types of scientific misconduct.

"Participants who felt the researcher had been treated unfairly were much more sympathetic to what he did," she says. "Even though they disagreed with what he did, there was tremendous empathy for his plight, in the situations where he was treated poorly."

Assessing attitudes about IRB

How can IRBs even know if researchers believe them to be unjust, let alone correct the impression?

Keith-Spiegel believes an IRB should take steps to see how it is viewed by its researchers. She and co-author Gerald B. Koocher, current president of the APA, have developed the Institutional Review Board Researcher Assessment Tool (IRB-RAT) that IRBs can use to determine how investigators perceive their IRB. The tool is available at www.ethicsresearch.com/forms.html.

"I think it would be really good for IRBs to take their temperature every so often," she says. "I've got maybe 40 or 50 universities using our IRB-RAT to see how the community feels about the job they're doing."

She also suggests that IRBs examine their processes to see areas where they could improve:

Transparency. Keith-Spiegel says the procedure used by the IRB has to be stated clearly and widely disseminated so that every researcher understands what happens when a protocol is submitted.

"I think it really helps if the whole process is demystified — everybody knows when you hand it in, here's what happens first, here's what we do next. Instead of seeing the IRB as this mysterious group of people in there tearing things up."

Avoiding bias. Keith-Spiegel says IRB members need to be alert to the possibility of bias in their decisions and those of other members. "It's so easy to be biased, especially if somebody is assigned the protocol to present to the rest of the group," she says.

"[IRB members] really have to watch out that they're really focusing on the protocol itself and not other extraneous factors. Whenever somebody finds themselves saying 'I think,' that's something you should look at."

Respect. She says this factor is key to improving perceptions of researchers about their IRB. "This really seems to be the most important when it comes to the problems of people getting angry and then trying to circumvent the IRB," Keith-Spiegel says.

"It costs nothing to be pleasant and respectful, even in the face of an obnoxious researcher. I think sometimes IRBs think that to be respectful and pleasant is a sign of weakness, but it's really not."

Information. She says researchers need to get a full explanation of an IRB's decision, particularly if it's unfavorable. Keith-Spiegel says one colleague told her that the colleague's IRB delivers bad news in a form letter with an X in a box marked "Disapproved."

"That's just absolutely unacceptable," Keith-Spiegel says.

She says that an IRB should take the time to write out an explanation when a protocol is disapproved, a practice that would also help weed out bias. "As you're writing out those decisions and the justifications for those decisions, if it doesn't look right, it probably isn't," she says. "I think sometimes biases can be picked up at that point."

Giving researchers a voice. She says researchers whose protocols have been rejected should have the chance to address the IRB to make their objections known.

"Even if you get shot down again, if you felt like people listened to you, that's really helpful," she says.

"These are simple things that IRBs can do that will really increase the perception that the procedure is fair, that it lacks bias, that people making decisions are competent, that they're not just coming from their own idiosyncratic place," Keith-Spiegel says. "That's where you get so much resentment."

References:

  1. Keith-Spiegel P, Koocher GB. "The IRB Paradox: Could the Protectors Also Encourage Deceit?" Ethics and Behavior 2005;15(4):339-349.
  2. Keith-Spiegel P, Koocher GB. "What Scientists Want From Their Research Ethics Committee." Journal of Empirical Research on Human Research Ethics 2006;1(1):67-82.