Use researcher feedback to improve IRB turnaround

NIH project helped identify best practices

IRBs can learn a great deal from each other. One research institution's hard-earned lesson and resolution can be another organization's best practices.

This is one of the philosophies underlying the Roadmap for Medical Research, launched eight years ago by the National Institutes of Health (NIH) in Bethesda, MD. The Clinical and Translational Science Awards (CTSA) program emerged from this initiative. CTSA brings together research leaders at major academic research institutions to tackle clinical research issues facing the industry, including the problem of slow IRB turnaround times on protocol reviews.

Under CTSA, there was an IRB task force that brought together major research institutions, including the University of Michigan and Yale University.

"Under the task force, we spoke with several colleagues about the things they do in their institutions to collect metrics and turn around the time from when investigators first dream of the idea of a study or receive a protocol proposal from a sponsor to when the first patient is enrolled," says Kathleen T. Uscinski, MBA, CIP, director of the office of human research protection in the office of research administration at Yale University in New Haven, CT. Uscinski was a co-chair of the task force.

The NIH recognizes that the cost of conducting research is high and that the benefits gained from research can take far too long to reach the bedside.

"So we wanted to know if there was a way to do things more efficiently from the point of administrative tasks, writing proposals, negotiating contracts, processing grants, reviewing a proposal by an IRB," Uscinski says.

The task force collected data from 37 research institutions, focusing on protocols that required review by a fully convened IRB.

"We looked at data from institutions that had inordinately long times and from those that did it very fast," Uscinski says. "A good group of them hovered around the middle group, and a few did it exceptionally well."

After considerable analysis, the task force identified time points that everyone had in common. They found that while IRBs shared these time points, the length of time it took them to process information varied greatly, from six days to 90 days.

With a second NIH award, Yale worked with the Mayo Clinic in Rochester, MN, to further explore best practices. Together the two institutions came up with six key initiatives for improving efficiency in the review process.

"The one I'm most active on is IRB processing," Uscinski says. "We compared similar categories of research and identified the ones that were more efficient."

Uscinski spent several hours speaking with Mayo Clinic's IRB staff, discussing their quality improvement initiatives that helped reduce turnaround time.

"Mayo has a series of projects where the staff and IRB evaluate work flow and business handling," she says.

They identified ways to make processes more efficient, focusing first on the low-hanging fruit. The goal was to find projects that could produce quick results at Yale.

"I asked them: 'Of all the process improvements you did, what gave you the most significant decrease in turnaround time?'" Uscinski recalls. "They said they were focused on their application form."

The application form can cause delays if investigators misunderstand its questions or leave items blank. The IRB then sends the application back to investigators, requesting revisions, clarifications, and additional information.

"So one of the most time-consuming obstacles to getting the protocol approved was the back-and-forth with the investigator," Uscinski explains. "What we found out from further analysis is it's not necessarily the processes and unnecessary steps, but the quality of the applications that is causing the IRB to ask investigators for further clarification."

The problem is in how the information is written and provided to the IRB.

One potential solution is to engage focus groups to review the application form and suggest changes. The form can then be rewritten in a way that makes it more understandable to the research community.

A less resource-intensive quality improvement project is to identify the sections of the application that cause the most problems for investigators and the IRB, show those questions to researchers, and ask for input on how they could be improved, Uscinski suggests.

This process can help the IRB see the application form in a different light.

"This is where the IRB staff says, 'Oh wow! I didn't know that question was so far off,'" she says.

For example, the IRB might ask investigators for information about who is approaching potential research subjects and whether there is a primary health care provider involved, Uscinski says.

"Those details and responses are critical to the IRB as it considers the protection of human subjects," she says.

For instance, if a researcher approaches someone because he or she is the sibling of a person with a medical condition, the sibling might feel violated, as though there had been a breach of privacy, she says.

"These are the types of things an IRB evaluates," Uscinski says.

Perhaps the investigator has the most ethical approach for contacting potential study participants, but that approach is not fully communicated in the IRB application. It takes back-and-forth communication to draw that information out.

"So what we wanted to do is write the application so the principal investigator fully understands what is expected, minimizing the amount of rewrites and communication between the investigator and IRB," Uscinski says. "What we're trying to do is make sure the PI writes a good solid protocol the first time it's submitted to the IRB so the IRB has the necessary information and can approve the research more quickly."