Using metrics to track, improve performance
Monitoring review times, training, other measures
To learn the difference collecting and analyzing metrics can make for an IRB, it may be helpful to be a proverbial fly on the wall at the offices of the Vanderbilt University Human Research Protection Program (HRPP).
The Nashville, TN-based institution has been capturing metrics for its IRB since 2001, says Director Denise Roe, MSM, CCRP, CIP, RAC. Roe says she and Associate Director Julie Ozier, MHL, CHRC, CIP, don't just refer to the numbers periodically; they use them continuously to track performance.
"We use our metrics, if not on a daily basis, then three or four times a week to make decisions about what we need to do," Roe says. "We really embrace data. It's important to us, it can be the first indicator that there might be an issue that we might want to look further at."
One major indicator that Vanderbilt collects for its four IRB committees is the length of time it takes from submission of a protocol through pre-review to the first committee action letter or approval letter from the IRB.
"We do that for all types of studies, those that have to go to full committee, as well as those that are expedited," Roe says.
That measure, along with monthly metrics about the number and type of new submissions, is published on the Vanderbilt HRPP's website.
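For illustration, here is a minimal sketch of how such a turnaround metric might be computed from submission records. The record layout, field names, and sample dates are assumptions made for the example, not Vanderbilt's actual system.

    # Minimal sketch: median days from protocol submission to the first
    # committee action letter, per committee. Field names are illustrative.
    from datetime import date
    from statistics import median

    submissions = [
        {"committee": "IRB 1", "type": "full board",
         "submitted": date(2015, 3, 2), "first_action": date(2015, 3, 30)},
        {"committee": "IRB 2", "type": "expedited",
         "submitted": date(2015, 3, 5), "first_action": date(2015, 3, 16)},
    ]

    def turnaround_days(record):
        # Days from submission through pre-review to first action letter.
        return (record["first_action"] - record["submitted"]).days

    by_committee = {}
    for rec in submissions:
        by_committee.setdefault(rec["committee"], []).append(turnaround_days(rec))

    for committee, days in sorted(by_committee.items()):
        print(f"{committee}: median {median(days)} days ({len(days)} submissions)")

The same records could be grouped by review type (full committee, expedited, exempt) to produce the monthly breakdowns the office publishes.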
She says that while Vanderbilt does collect figures on how long it takes until final approval of a study, those numbers aren't published, and she and Ozier don't review them as often.
"The metrics are a measurement of how well we're turning things around, our response times," she says, noting that the final approval is dependent on many outside factors, including time spent dealing with contracts and managing conflicts of interest.
However, Roe says that the recent release of IRB metrics by the Association for the Accreditation of Human Research Protection Programs (AAHRPP) has caused her to rethink that strategy.
"Seeing that makes me think maybe we should pay more attention to that end result," she says. "It may not be something that we personally at the IRB can change. But perhaps we could be facilitators in other areas to help improve that turnaround time. For example, if I'm waiting for a billing plan to be finalized, is there a way that we can intervene early on and have a discussion with the department of finance to see what their struggles are?"
'Let's not let that happen'
When her office first began using metrics, Roe says, team leaders would examine them monthly to see whether the numbers were staying within pre-established thresholds. If they didn't, "we had to provide a justification: Why did this take longer than our target?"
But she says that didn't turn out to be particularly helpful.
"It was happening after the fact 'why did that happen?' Now we're using more of a 'Let's not let that happen' approach."
She says team leaders now use real-time monitoring to see whether the metrics are getting close to the thresholds, and intervene before they are crossed.
For example, Vanderbilt has cut the number of lapsed studies by monitoring studies that are getting close to expiring and notifying investigators.
"We'll make a phone call and say 'What can we do to help you?'" Roe says. "Four years ago, we might be sending out four lapsed studies, now we're pretty much down to zero."
The department also used metrics to manage the heavy expedited study workload of its behavioral/social science IRB.
"In their expedited and exempt turnaround time, we'd seen significant time lags for them," Roe says. "If you broke it down and looked at the behavioral/social science team, they were maybe twice as long as the others."
First, she says, they changed the process from two reviewers to one for expedited reviews, which led to an improvement, but the review times still exceeded those of the other committees.
"About a year and a half later, they were still a little high, so we implemented a protocol analyst doing the exempt reviews, and wrote our policies to support that. And they are now right in line with the rest of the other committees as far as turnaround time for expedited and exempt reviews."
Obviously, this ongoing monitoring requires a sophisticated system that collects data electronically and makes the numbers available whenever staff need them.
"We have dashboards, where we can click a button and it will pull all of our data and put it into graphs and pie charts, so we can get an understanding of the performance, all the way down to each staff member," Roe says.
Investigators also can access information about their studies and their training via the Vanderbilt website. When they call up their studies, gauges will tell them how close the study is to expiring and how much time is left before they need to complete their annual training, says Gene Gallagher, MSPH, CIP, RAC, VHRPP associate director for compliance and quality improvement.
"We now have fewer incidents when something comes in for continuing review, to have to go back and tell them their training has lapsed," Gallagher says.
Roe says the university's informatics team has been very helpful in making their use of metrics possible. It has even enabled them to link to other departments to get shared measures.
"They really look at integrating throughout the system," she says. "They don't come in and say, 'This is the system you're going to use.' They say, 'What is your system, and how can we pull information out of it?'
"We've been really fortunate to have some great informatics groups that really work well together."
To see the current performance metrics for the Vanderbilt Human Research Protection Program, visit its website at http://www.mc.vanderbilt.edu/irb/metrics/