By Melinda Young

If an IRB sets a goal of greater efficiency, giving researchers self-assessment tools and applying self-auditing tools to its own operations is one method that can work.

These tools can help study coordinators and investigators turn their protocol submissions from a hot mess into packages that are mostly compliant and easier to pre-review. IRBs can use self-auditing tools to ensure their human research protection program (HRPP) complies with regulations, and the results can inform quality improvement projects.

“What any program has to do is make sure they themselves are in compliance,” says Lisa Denney, MPH, CIP, deputy director in the research compliance office at Stanford University. “We try to create consistency. But, still, every research study has its own nuance, so we do a self-assessment on our own program to see if we’re consistent and compliant and make sure we’ve gotten the required documentation.”

Whether IRBs create tools for self-audits of IRB operations or for investigators to use, the goal is uniformity. (See story on study site self-auditing tools in this issue.)

“You will see one uniform approach for how we do those,” says Sana Khoury-Shakour, PhD, CCRP, director of the office of research compliance review at the University of Michigan. “We make sure the look and feel of the tools is the same. It starts with an introduction on how to use the tool, and we include cross-references to available resources.”

Often, when IRBs receive studies from investigators, they’re not ready to be reviewed. “Sometimes, they’re a hot mess; people just put down their ideas,” Denney says.

That is where IRB offices can help through pre-review processes. Making self-assessment tools available for researchers also can help. “We help investigators create a concise, harmonious package,” Denney says.

This requires considerable work, attention to detail, and communication with investigators, who might say IRBs are nitpicking or annoying.

“That’s why we have to perform quality reviews,” Denney explains. “It takes effort. Sometimes, things are complicated, and sometimes, things get missed, so we have to get them fixed.” Consent form templates might be unclear, or checklists might be missing an item.

Stanford University has eight IRBs, each staffed by two people. Combined with management, there are about 20 HRPP employees, Denney says. To keep staff consistent when they assess protocols for regulatory compliance, senior IRB staff train and mentor newer staff.

“It’s a lot of work to train a new person,” Denney says. “It’s part of senior people’s job.”

Some senior IRB staff enjoy training others, and Denney ensures they take on more of that responsibility. New IRB employees review protocols, forward them to senior IRB staff for review, then send the protocol and any changes to the study team. This might continue for several months.

“They perform the review and the trainer shadows them, and that’s the bulk of the training,” Denney explains. “We shift work to what people like. Training is continuous.”

IRBs also can provide self-auditing tools to investigators. The University of Michigan offers a dozen checklists and self-assessment tools on its research ethics and compliance webpage, including checklists for registration, data security, Department of Defense research, eligibility criteria, and others.

Stanford University’s HRPP maintains consistency in IRB staff reviews of protocols by asking IRB members to follow a four-page protocol checklist for medical research.

The checklist includes dozens of items, such as:

  • Whether the study uses a non-significant risk device, with justification;
  • Whether an appropriate child risk determination is indicated and justified;
  • Whether a plan is provided for reviewing responses to questionnaires asking about suicidal ideation;
  • Whether protocols target participants with impaired decision-making capacity;
  • Whether the study is listed as both a multisite study and a collaboration.

Consistency and quality improvement policies and practices are important among IRB staff, but they should be part of the culture, not seen as punishment, Denney says.

“Continuous quality improvement should not be punitive; it should be affirmative,” she explains. “You can say, ‘That’s great! We’re on target, so there’s good news.’”

The IRB can hold standing staff meetings to review hot-button issues and talk about interesting protocols so the staff can learn from each other, Denney says. For example, one new issue that comes up with studies is related to the COVID-19 pandemic. Some pharmaceutical studies will require participants to undergo a COVID-19 test, she explains.

With these tests, the question IRBs might ponder is whether the COVID-19 test is just for the study and should be paid by the study, or whether it is standard of care because the person might have gone to the hospital for other care anyway, she says.

“We have to unpack this, and we don’t have a precedent for this situation,” Denney adds. “These are the kinds of things our managers get together and talk about.”

Whatever decision the IRB makes about new situations that arise with studies, it should be applied consistently. One way to maintain quality and consistency is to create a real-time staff review of protocols under continuing review. “We work on continually reviewing our process, but doing a real-time review,” Denney says.

For example, rather than hearing about a study problem that occurred two years or even two months ago, the IRB will catch it and address it in real time, helping investigators fix the issue well before the continuing review date occurs.

“At the time of continuing review, we do a spot check on them,” Denney says. “This is a pretty fast-paced process, depending on what the issues are.”

Another way to assess the IRB’s work is to perform targeted audits of protocol reviews to ensure they have been completed properly.

“A person can run a report, identify a certain number of studies, and review to see if they met the review criteria,” Denney explains. “This is a more programmatic review of our program.”