Protocol review consistency is a hot topic as IRBs, research organizations, and investigators struggle with balancing quality and efficiency in the review process.
Consistency serves both goals, and it prevents complaints that a protocol was approved at one IRB review while a similar study was flagged at a later one.
"At IRB meetings, sometimes you’ll have [a change in] members, and different people will look at protocols differently," says Maria J. Arnold, CIP, IRB clinical research manager at Baptist Health South Florida in Miami.
But most of the time, inconsistency in IRB reviews is the result of investigators giving inaccurate or inconsistent information, Arnold adds.
For instance, an investigator submits a protocol listing the study personnel who will obtain informed consent. An IRB staff member notices that the names differ from those listed on an ongoing study the same investigator submitted a few months earlier. If the research site has new personnel for its ongoing studies, the change in personnel on the earlier study should have been submitted to the IRB, Arnold explains.
"This means there has been a change in personnel, and the investigator failed to notify the IRB," she says. "The new study reflects the new personnel, but the older study hasn’t been changed to reflect the new people."
These types of inconsistencies could be missed without an IRB pre-review process. (See story about creating a pre-review process that improves consistency, page 124.)
Another way to cut down on institutional variation in protocol reviews is to ask investigators to note previous studies that are similar to the current submission, says Paul Reitemeier, PhD, chair of the human research review committee and an associate professor of philosophy at Grand Valley State University in Allendale, MI.
According to Reitemeier, IRBs could give investigators instructions that say, "IRB membership changes periodically, and a similar project to one that was reviewed and approved may not be reviewed by the same people. So if this study is similar to a previously reviewed project, please indicate the review number."
Members of an IRB might not recall the last time they reviewed a similar study by a researcher, so asking principal investigators (PIs) for this information can provide historical perspective and improve consistency, Reitemeier says.
Reitemeier and Arnold are scheduled to speak in December about institutional memory and consistency in protocol review at the 2014 Public Responsibility in Medicine & Research (PRIM&R) conference in Baltimore. They are planning to hold an interactive session, presenting four protocol vignettes and asking the audience what they would like to know to make an approval decision, Reitemeier says.
"One of us will write down what people say and we’ll put it on an Excel spreadsheet," he explains.
Although it’s difficult for an IRB to be consistent when some studies have rare qualities, there is a strategy that will help improve an IRB’s institutional memory of how a similar atypical case might have been handled in years past, Reitemeier says.
Keep a separate list of these cases in a spreadsheet, he suggests.
"We have had cases from our own institution that were new and atypical, and we weren’t sure how to proceed with our review," he says. "As we stumbled through it, we took lots of notes, and I kept a file."
Over time, the file grew, and some of the cases had similarities. This type of file works best as a spreadsheet because it is easier to search for key features and to compare notes on the PI, the location, the kind of review required, which federal agencies are involved, who the funders are, and the lessons learned, he adds.
"You can note on the spreadsheet whether there were any other institutions or officers at your institution who you thought should be notified about the study," Reitemeier explains. "These include risk management, legal counsel, and pastoral care."
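The searchable case log Reitemeier describes can be kept in any spreadsheet program, but the idea is simple enough to sketch in code. The snippet below is a minimal illustration, not his actual file: the column names and the sample entry are assumptions drawn from the features he mentions (PI, location, review type, federal agencies, funders, offices notified, lessons learned), and an IRB would adapt them to its own records.

```python
import csv
import io

# Illustrative columns only, based on the features Reitemeier lists;
# a real IRB case log would use its own institution's fields.
FIELDS = ["case_id", "pi", "location", "review_type", "federal_agencies",
          "funders", "offices_notified", "lessons_learned"]

# Hypothetical sample entry for demonstration purposes.
SAMPLE = [
    {"case_id": "2012-04", "pi": "Example PI", "location": "Allendale, MI",
     "review_type": "full board", "federal_agencies": "OHRP",
     "funders": "internal",
     "offices_notified": "legal counsel; risk management",
     "lessons_learned": "Verify procedure certification before approval."},
]

def write_log(rows, fields=FIELDS):
    """Serialize the case log as CSV text, as a spreadsheet export would."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def search_log(csv_text, keyword):
    """Return every logged case whose fields mention the keyword."""
    reader = csv.DictReader(io.StringIO(csv_text))
    needle = keyword.lower()
    return [row for row in reader
            if any(needle in (value or "").lower() for value in row.values())]

log = write_log(SAMPLE)
matches = search_log(log, "legal")  # finds the case that notified legal counsel
```

The keyword search is the point of keeping the file structured: when an unusual protocol arrives, staff can quickly pull up every prior case that involved, say, legal counsel or a particular federal agency.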
Reitemeier offers two examples of atypical cases:
- Investigators proposed doing a procedure for which they had no certification.
One example of an atypical case that made it to the spreadsheet involved an exercise study posing slightly greater than minimal risk. The PI was a faculty member who was trained in the United Kingdom and had 20 years of research experience, including some years in the U.S., Reitemeier says.
"He wanted to draw blood and insert a two-centimeter wire into the muscle to measure temperature, but he was not licensed [for that procedure] in the U.S. or U.K.," he explains. "He had a colleague from Canada who would do the wire insertion."
The dilemma was that the study called for having two non-licensed, non-U.S. individuals perform a procedure that was slightly more than minimal risk.
"They were willing to come under our regulatory oversight and follow U.S. requirements," Reitemeier says.
The IRB checked into what was required for phlebotomy in Michigan and found that training was left to hospitals. So the IRB asked whether any hospitals would train the two investigators or allow them to demonstrate competency for certification. Every institution said no, concerned about legal liability for a non-employee, Reitemeier says.
"Even within our own institution they wouldn’t provide training for people for whom they had no oversight," he adds. "It was difficult to get these guys empirically qualified to do what they needed to do for their research."
The study was altered considerably after some weeks of the IRB working with the investigator, he says.
Adding this case to the spreadsheet will help the IRB answer some questions more quickly when similar issues arise, including these:
- What are state regulations for phlebotomy qualifications and training?
- Which departments and offices might have answers to questions about having procedures conducted by legal but non-certified personnel?
- Has risk been minimized in a protocol with an unusual procedure?
- Study involved deception that backfired.
Another atypical case involved a student-led project with the goal of assessing agency and government responses to Freedom of Information Act (FOIA) requests. One issue was that the people who would receive the FOIA requests were not going to be told that this was for a research project because investigators wanted to assess compliance, Reitemeier says.
"This one didn’t work out as well as they hoped because of its timing: It coincided with the first day of spring break, and a lot of schools receiving the FOIA requests had no one available to answer them," he explains. "The timing caused a burden for them because they were really understaffed."
Another challenge was that FOIA requests often were sent to an attorney who worked with the organization, and sometimes one attorney would handle several different organizations.
When an attorney saw several of the same type of request, he put out a query to other attorneys, saying, "I just got FOIA requests from a number of institutions, and these have the same wording, and I’m not sure what it is."
The researchers had mailed out 350 FOIA requests, and once attorneys began comparing notes on the strange requests, the attorneys traced them all to the same source, even pinpointing the university and faculty member involved, Reitemeier says.
The IRB learned from this experience that sending out many identical FOIA requests can allow recipients to trace them back to the investigator, which undermined the study’s attempt to avoid bias.
"We had anticipated some public relations fallout, but had not contacted our legal counsel about the study, and, in retrospect, we should have done that," Reitemeier says. "We also learned that FOIA requests are holiday-sensitive."
The study had included a statement asking FOIA recipients to call a specific number if they found the request burdensome or would have difficulty fulfilling it. However, it turned out that many recipients had not noticed that sentence, so another lesson learned was that this type of statement should be placed more prominently in the communication, Reitemeier adds.
"We started getting calls from people who were very upset that this FOIA request was a student research project, requiring them to disrupt their work and spring break," Reitemeier says. "So we recognized that the sentence about giving us a call if handling the request would be difficult needed to be highlighted."