IRBs need to become better at collecting outcomes, cost data
Objective outcomes data would be useful
In lean times, IRBs will have to become more disciplined. And part of their discipline will need to include collecting data about their own efficiency, outcomes, and cost-effectiveness, an expert says.
"As research facilities try to figure out how to save money, they'll come back to the IRB and say, 'Tell me why we should keep you guys,' and the burden then falls on the IRB administrator to show value," says Todd Wagner, PHD, a health economist in the U.S. Department of Veterans Affairs in Menlo Park, CA. Wagner also is a consulting associate professor at Stanford University in Stanford, CA.
IRBs could view the current economic downturn as an opportunity to make long-term changes that will improve their efficiency over time, he suggests. "We need to figure out how to collect better IRB data," Wagner says. "Most IRBs I talk to don't collect enough data in-house and aren't able to compare themselves to other IRBs."
There needs to be a data collection system that would enable an IRB to evaluate its failures and successes. "We need a system to say why this protocol fell through the cracks," Wagner says.
Wagner has researched the costs of operating IRBs and the economies of scale from which they might benefit. But it's difficult to track information on their performance, he notes.
"We're doing two studies now," Wagner says. "The VA is creating its own central IRB, and we're evaluating that." Every time a principal investigator reports an action, researchers survey the IRB, he says.
"We're getting fairly good response rates from investigators, but the IRB is impenetrable, and that's partly on purpose," Wagner says. "We're getting very low response rates, so self reporting will never be the system we need."
What's needed is objective data, preferably collected electronically, and each IRB should collect this data, at the very least for its own quality improvement purposes.
For example, IRBs should be collecting this type of information:
• How timely are the processes?
If a protocol is submitted on April 1, when is it first processed? Wagner asks.
If an IRB has an electronic protocol-tracking system, it should collect the dates protocols are submitted and processed, who reviewed each protocol, and when the review and decision were made, Wagner suggests.
"You'd want to know processes and who handled it," he adds.
When IRBs are able to track protocols at this level of detail, it's very reassuring to institutional officials and investigators who want to know where their protocol is in the system, Wagner says.
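The milestone tracking Wagner describes can be sketched as a simple record per protocol. This is only an illustrative outline, not an actual IRB system; the field names and the example dates are assumptions:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ProtocolRecord:
    """Key milestone dates for one protocol (illustrative fields)."""
    protocol_id: str
    submitted: date
    first_processed: Optional[date] = None
    reviewed_by: Optional[str] = None       # who handled the review
    decision_date: Optional[date] = None

def days_between(start: date, end: Optional[date]) -> Optional[int]:
    """Elapsed days between two milestones; None if not yet reached."""
    return (end - start).days if end is not None else None

# Example: a protocol submitted April 1, first processed two days later.
rec = ProtocolRecord(
    protocol_id="2024-0042",                # hypothetical identifier
    submitted=date(2024, 4, 1),
    first_processed=date(2024, 4, 3),
    reviewed_by="Panel B",
    decision_date=date(2024, 4, 19),
)

time_to_processing = days_between(rec.submitted, rec.first_processed)  # 2 days
time_to_decision = days_between(rec.submitted, rec.decision_date)      # 18 days
```

With dates captured electronically at each step, turnaround metrics like these fall out automatically, and an official asking "where is my protocol?" can be answered from the record.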
• What types of trials are handled?
"Ideally, you'd like basic information about the type of trial," Wagner notes. "Unfortunately, unlike in medicine where we have diagnostic codes that are standardized, there is no IRB system saying this is a blood trial."
If possible, IRBs should divide protocols into groups that are homogeneous in resource consumption, Wagner says.
"We don't have a system for neatly dividing them into groups now, so they'll have to figure out their own system," Wagner adds.
The point is to be able to identify protocols that require more resources, so an IRB can plan for these or make adjustments.
"If this one went through in three days and this one went through in 18 days, what was the difference between the protocols?" Wagner says. "Maybe one truly is higher risk."