Institution's reviews proactive, educational, and boost compliance
Catch PIs before they make mistakes
The University of Pittsburgh in Pittsburgh, PA, has developed a thorough review process that serves both as an educational opportunity and as a way to prevent clinical research problems.
Here's how it works:
• Schedule initial visit: The initial visit often occurs before the trial site has enrolled subjects, although some occur later in the study cycle, says Cindy Kern, RN, CCRC, CIP, education and compliance coordinator and assistant director for the office of investigator-sponsored IND/IDE support at the University of Pittsburgh.
"If there are subjects enrolled we may be at the site for six to eight hours, depending on the nature of the study," Kern says. "If there are no subjects, we're generally out in two hours."
• Time reviews to enhance teaching moments: "In the past we would go out after a study had been ongoing for a couple of years," Kern says. "Now we try to meet with them soon after the IRB approval, if possible, and prior to subject enrollment."
Called RISE, for Research Investigator Start-up Education, the process is proactive in preventing problems and serves as education for clinical research investigators and coordinators.
Reviewers help investigators assess whether they have all necessary data in place, which also makes a post-enrollment review less critical.
"We may go back six months, eight months, maybe a year later to see if they've put everything in place," Kern says. "But most times we don't go back unless an issue arises."
In the four years since starting RISE, the organization has seen positive outcomes: "The investigators we've gone back to assess have actually implemented a lot of things we've suggested, and their records are in very good shape," Kern says.
The support office surveys researchers anonymously four times a year, and about 75% have returned positive comments, she adds.
• Assess trial site compliance: "Generally, we can tell by the tone with which we're received when we have an interview," Kern says.
"I can honestly say that almost all the folks with whom we've met have been very receptive and appreciative of the tools we've provided," she adds. "A few won't feel they need any assistance, but that's generally the exception."
What to leave in, what to leave out
One area that has shown improvement involves inclusion/exclusion criteria.
"We try to stress with folks that they need to address not only the presence of inclusion criteria, but also the absence of exclusion criteria," Kern explains.
"An example is a person with poor kidney function," she says. "We need to see documentation that the kidney function was assessed, and the subject does indeed have appropriate kidney function to participate in the study."
In another example, if a study participant must use an approved method of birth control, reviewers want to see which method is being used and whether the participant is willing to use it throughout the trial.
"We tell them they need to show the details and not just check a box saying they're using birth control," Kern says. "We're seeing improvement in that area."
• Review safety assessment plan: "When we go in to meet with investigators we want to assess how they are making sure they're following the protocol with a minimal amount of risk for subjects," Kern says. "We want to know who is doing this assessment and how they are assessing safety on an ongoing basis."
For instance, is a site following a data and safety monitoring (DSM) plan?
Some investigators will write in their protocols that they plan to follow a DSM plan, but then they do not follow through with actually creating one, she notes.
"We say, 'We want you to describe what you're willing to do to assess ongoing patient safety, and be realistic,'" Kern says. "Once we have them thinking about this process, we might find they have a good approach but need to give it more thought and document it when they're done."
• Suggest study modifications when needed: Sometimes a study needs to be modified because of issues that arise, and the support office will make this recommendation as needed.
For example, there might be a study in which adverse events are occurring at a greater-than-expected frequency, Kern says.
The support office will suggest the investigator assess all factors that might influence adverse events. If the assessment reveals no problem other than the adverse events occurring at a greater frequency than anticipated, then the investigator will need to inform subjects and modify the informed consent document, Kern says.
"So the investigator would have to change the protocol and consent document to reflect the greater occurrence," she adds. "Then we also tell them they will have to come up with a modified consent to provide to participants who already are enrolled in the study, reflecting the new finding."