Smaller sponsors may beget better data and metrics

Sponsor has ability to act instantly

The clinical research industry's well-documented inefficiency is like a rhinoceros stampede: everyone recognizes it and regrets it, but no one can figure out how to stop it. A potential solution, however, may lie in an approach to clinical trial planning that is catching on, starting with smaller research organizations and sponsors and working its way up.

Some newer biomedical and research companies are being built with flexibility and closer attention to performance data and metrics in mind. They track real-time data so they can react quickly and troubleshoot when issues arise, and they know at any point in time which clinical trial sites are performing well and which are below par.

"Cost and time are key things we look at for our clients, and that's always a metric we're tracking," says Lisa Sanders, PhD, RAC, senior clinical strategy scientist for Cato Research, a clinical research organization (CRO) in Durham, NC.

The industry's biggest players might be slower to change, but the trend of relying on metrics to improve efficiency is gaining momentum.

"I see the industry moving this way," says Steven Sweeney, director and head of clinical operations for Infinity of Cambridge, MA, a small sponsor that has three compounds in full development and which also contracts out services.

"Companies like Infinity have a massive advantage because we're starting small and can build in metrics as we build our company," he explains. "It is a lot more complex for global CROs and pharma alike because they're already complicated."

Infinity, founded in 2001 as a discovery company, has been collecting metrics and adjusting operations to improve efficiency for the past decade, he adds.

While quality is a clinical research organization's chief goal, time and cost data are important for project planning, Sanders says.

"We continually look at whether we're in our budget, and if we're not, then we determine why and what we can do to adjust our activities to be closer to our budget," Sanders says.

Data collection also is important for tracking clinical activities, including site monitoring visits and site enrollment.

"We collect metrics at the site level and can troubleshoot down to the site level," Sweeney says.

"I can dig into a site and look at the patient level to see, 'Wow—we're really behind on these patients,'" Sweeney says.

"What I've realized and experienced is I don't need a lot of metrics, but I need good metrics I can react on quickly," he adds. "The art is collecting metrics that drive behavior on a day-to-day basis."

For example, site monitoring visits cost Infinity $3,500 to $4,500 per visit. Unfortunately, these visits often are made when a site is behind in its data entry, in which case they cost the sponsor money without yielding any data or other benefit, Sweeney explains.

"If a site is way behind in entering data into the system, then don't bother doing the visit and wasting time," he says. "So now our monitors go into the system and make those determinations between visits to maybe increase intervals so they'll have a more productive visit."

By keeping real-time metrics on each site's performance, a sponsor or clinical research associate (CRA) could pull up graphs showing how each site is doing on subject recruitment and use that information to adjust site monitoring visit schedules.

"They could drop an email to a site coordinator and say, 'Hey, this is your CRA; here's a visual. It looks like we're behind on data entry,'" Sweeney says. "'Do you think we could get caught up before the monitoring visit that's scheduled for next week?'"

Or, in the case of a site that is enrolling above expectations, the CRA might say, "You've enrolled like crazy; there's no way we can take care of this in one day. Do you mind if we extend the visit to three days so we can stay ahead of the data curve?" he says.

These types of fast changes can save thousands or even millions of dollars over time.

"Even with eliminating a quarter of the wasted visits for most companies, it could save millions of dollars," Sweeney says. "Even at Infinity it works out to substantial cost savings for us."

Infinity's data collection is comprehensive and lets Sweeney compare data across multiple studies with innovative visualizations, such as visit report compliance across multiple types of study set-ups, he says.

"I can see outliers and look at data quality and source data verification," he says. "Because I can look at multiple set-ups, I can see why we are behind on monitoring visiting report compliance, for example."

When Sweeney speaks with vendors, he can use this information to give them details about their own performance, help them work out problems, or offer positive feedback when all is going well.

It's the proactive component of this level of metrics that holds the most potential for making a particular study more efficient.

"I really care about getting data pushed out to people in the field, like CRAs, so they can react before it becomes an issue," Sweeney says. "They can use the information to adjust on the fly."

For example, a site monitor could learn from the metrics that she's behind on source data verification and then correct that problem by expanding visits, he explains.

Sponsors no longer need to wait for weekly reports to come out and then troubleshoot after the fact, he adds.

Sometime soon, the sponsor might even share its clinical trial operational metrics with sites, Sweeney says.

"We think we'll have that in place probably in the next year," he says. "First we have to get all the internal metrics' definitions and visualizations worked out so it will work extraordinarily well and be accessible to the sites."

These data include speed of data entry, data quality, turnaround time for queries, etc., he says.
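One of those metrics, query turnaround time, could be computed along these lines. The record format and dates here are assumptions for illustration, not a description of any actual clinical data management system.

```python
# Hedged sketch of one per-site metric mentioned above: average query
# turnaround, measured as days from query opened to query resolved.
# The (opened, resolved) record format and dates are invented.

from datetime import date

queries = [
    (date(2012, 3, 1), date(2012, 3, 4)),    # opened, resolved: 3 days
    (date(2012, 3, 2), date(2012, 3, 3)),    # 1 day
    (date(2012, 3, 5), date(2012, 3, 12)),   # 7 days
]

turnaround_days = [(resolved - opened).days for opened, resolved in queries]
avg_turnaround = sum(turnaround_days) / len(turnaround_days)
print(round(avg_turnaround, 1))   # (3 + 1 + 7) / 3 ≈ 3.7 days
```

Shared with sites alongside data-entry speed and data-quality figures, a number like this lets coordinators see the same picture the sponsor sees and adjust quickly.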

"I want to give people doing the everyday work access to the same information I'm working on so they can adjust quickly," Sweeney says.

"The flaw of metrics right now is most of them catch things that have already happened, and there's a substantial lag before you take action on it—it's a reactionary system," he says. "It'd be better to see trends over time so I can prevent them from tripping a threshold or being put on a dashboard report that would not be favorable."