IRB seals fate by approving fake protocol in federal sting
"To say this is just a problem of [some] IRBs is very naïve."
A federal undercover investigation into possible vulnerabilities in the IRB review system has led to the closure of one independent IRB, and has left other IRBs wondering whether the due diligence they exercise is diligent enough.
Coast IRB, based in Colorado Springs, CO, announced in April that it would close its doors as soon as oversight of its protocols was transferred to other IRBs. Coast had been one of the targets of a Government Accountability Office (GAO) sting, in which investigators submitted a phony protocol on behalf of a non-existent medical device company.
Two other independent IRBs had rejected the fake study of a post-surgical healing device for women, which had purposely included multiple characteristics of "significant risk" from existing FDA guidance, according to a GAO report on the investigation.1 Comments from those IRBs had called it "junk" and "the riskiest thing I've ever seen on this board," the report found. However, Coast IRB approved the study, said Gregory Kutz, managing director of forensic audits and special investigations for the GAO in Washington, DC.
The GAO investigators provided information to the FDA, which "determined that Coast IRB committed several violations of the laws and regulations intended to protect the rights and welfare of human research subjects in clinical trials and that the company failed to perform the robust review needed to approve a study," according to an FDA statement.
While criticizing the sting operation as illegal and unnecessary, Coast did voluntarily agree to stop accepting new studies and to put in place a number of improvements, including a new board chair, new members and new standard operating procedures. Shortly afterward, however, the company announced it would stop operating.
Kutz says that since the investigation became public, he's received calls from other IRBs asking for help in determining whether they're doing all they can to protect subjects.
"We've done some teleconferencing with IRBs telling them about our investigation and giving them some advice," he says.
Marjorie Speers, PhD, executive director of the Association for the Accreditation of Human Research Protection Programs (AAHRPP), says the investigation should serve as a wake-up call for all IRBs, not just commercial or independent IRBs.
"I would say, based on looking at the IRBs that come to us for accreditation, there are a number of IRBs that are not functioning at the level they ought to be functioning at," Speers says. "To say that this is just a problem of central or independent IRBs I think is very naïve."
Committee requests investigation
The GAO investigation came at the request of the U.S. House Energy and Commerce Committee, which Kutz says has a long-standing interest in human subjects protection issues.
The GAO actually looked at three aspects of the human subjects protection system – establishing and registering an IRB with the U.S. Department of Health and Human Services (HHS); obtaining a Federalwide Assurance (FWA) from HHS; and getting protocols passed by IRBs.
To do this, they created a fictitious IRB and medical device company, using phony documents to apply for real certifications. Using these methods, the investigators were able to register their fake IRB with HHS and obtain an FWA for the device company (see accompanying story).
They also used this company to submit a protocol for a fictitious surgical adhesive gel – a protocol rife with problems, Kutz says.
"The study should have raised red flags – there were no animal studies, the chief investigator wasn't licensed to practice medicine," he says. "There were sterility issues. The details of what (the device) did were deliberately confusing."
He says the protocol was submitted to three independent IRBs after searching online for those that had less burdensome paperwork requirements. They submitted fake CVs and a fictitious medical license number for the principal investigator.
Two of the IRBs sent back so many questions and concerns about the protocol that the investigators could not keep up the deception. One asked for documentation of animal testing that the investigators claimed had been conducted on their product. In the end, the two boards refused to approve the study, one voting unanimously to reject the protocol.
Kutz noted that even those IRBs, while recognizing the obvious deficiencies in the study, failed to realize that the company didn't exist and that the medical license number was fraudulent.
"No one caught the fact that it was a bogus study," he says.
IRBs aren't necessarily set up to catch the type of elaborate fraud used in this case, says Felix A. Khin-Maung-Gyi, PharmD, MBA, CIP, chief executive officer of Chesapeake Research Review Inc., a Columbia, MD, independent IRB that is receiving some of the studies being transferred from Coast.
"That type of scheme is something that was pretty elaborate," he says. "For an IRB to uncover that would have been a stretch of the imagination."
He says the fact that two of the three IRBs approached rejected the protocol out of hand is significant.
"What it says is there was one bad apple," he says. "The system is not perfect but it is working."
Speers says central IRBs accredited by her organization – none of the three IRBs approached in the GAO investigation were AAHRPP-accredited, she says – are supposed to check the credentials of investigators because in many cases those researchers wouldn't be known to the board.
"They actually do a good job of evaluating the credentials of the investigator and they do look at issues around medical licensing," she says. "Some of the central IRBs actually go out and do site visits."
In fact, she says, those central IRBs are often more rigorous about checking the bona fides of investigators than are institutional IRBs, which may make assumptions about investigators based on the fact that they are associated with the hospital or institution.
"An institutional IRB generally does not check credentials of an investigator or research staff – they just assume that if they're part of the institution, they pass muster," she says. "And they shouldn't be making that assumption."
Speers says the faked protocol in this case had obvious deficiencies that should have been picked up by an IRB.
"For example, there was no data and safety monitoring plan," she says. "It was a fairly short protocol, given that this was going to involve the implantation of a device that involves significant risk."
But she's not confident that all IRBs would have recognized those deficiencies.
"I think there are institutional IRBs that could have done the same thing (Coast did)," Speers says. "AAHRPP knows from looking at hundreds of applications that, for example, IRBs are unclear what they should be looking for in a data and safety monitoring plan, so that could easily be missed."
In his report, Kutz noted that his investigators conducted all their communications with the IRBs by fax or through the Internet.
"As a result, our investigators were never exposed to real-time activities, such as telephone conversations, face-to-face meetings, or site inspections, which would have revealed their lack of expertise, lack of an actual facility, and other fraudulent representations," the report says.
As more IRBs go electronic, Speers says it's important to ensure appropriate oversight. She says that effort is twofold.
"One is the importance of education upfront, so that investigators and research staff know what they're supposed to do, how to interact with the IRB," she says. "And likewise, the IRB is educated so it knows how to interact with investigators. Through education, you set up a particular culture."
Secondly, she says, it's important to establish post-approval monitoring.
"So if either the investigator or the IRB staff needs to pick up the phone and make a phone call or they need to make a visit, that occurs."