Approval time dropped 10-plus days
When an IRB process is routinely misunderstood and cumbersome, it might be time to ask the users what they think would work better.
At least that's what the human research protection office at the University of California, San Francisco did when planning to revamp the chart review research IRB application. And it worked: The improved application had a post-implementation mean time to approval of 18.6 days, more than 10 days shorter than the pre-implementation mean time to approval of 29.3 days.1
Before the changes, the application process had many problems, says Liz Tioupine, CIP, iRIS system administrator, human research protection program (HRPP), UCSF.
"We had one chart review study returned six times for corrections," she recalls. "People could not understand what we were asking in our application."
The application's wording was called "IRB speak" by some critics, and IRB analysts found that most submissions for chart review research were poorly prepared, with about 82% of submissions in 2012 returned for corrections and nearly one-quarter returned more than once.1
Confusion over the form's questions was so commonplace that the HRPP office could only conclude that the application was not user-friendly, Tioupine says.
The IRB enlisted help from the form's users, faculty of different clinical research areas, asking them for their thoughts on which questions were confusing and which were unnecessary.
The IRB and researchers worked together to create a streamlined chart review application form with the goal of making it simple enough for a novice investigator to complete quickly. To achieve this goal, they eliminated 32 unnecessary questions and reworded many others, Tioupine explains.
"We'd read the application to them, and they'd say 'That doesn't make sense for a chart review study,'" Tioupine recalls. "We got rid of questions like, 'State the hypothesis,'" she says. "The study design and hypothesis boiled down to one simple question of 'What do you hope to accomplish with this study?'"
The IRB also asked researchers for help in rewriting questions to get rid of the IRB-speak.
"We had thought the general terms we had for the main application could be applied to all different types of research, including social-behavioral research, and we found that these terms had very specific connotations," Tioupine says.
For example, the IRB found that researchers seemed unclear about how to answer the question regarding inclusion/exclusion criteria.
The original form said, "Describe the inclusion criteria and describe the exclusion criteria." The IRB had found that researchers often had difficulty with this question, saying it didn't apply to their study because the study involved a chart review and not visits with patients, Tioupine says.
"We asked them, 'How do you decide what charts to look at?' and they would say, 'We're looking for people with high blood pressure who are on two or more medications,'" Tioupine says. "And we'd say, 'Okay, that's what we mean.'"
The researchers' response was to suggest the IRB simply ask for that rather than use the terms "inclusion" and "exclusion," which investigators believe means something very specific, such as the process of screening patients for a clinical trial, she adds.
So the application's question was changed to, "Describe the population being studied."
Here are some examples of other questions on the revised chart review IRB application form:
• What do you hope to accomplish with this study?
• Indicate if the primary study population includes:
- children or neonates;
- pregnant women;
- none of the above.
• Types of records or biospecimens being reviewed/analyzed:
- medical record or other health record (identify source below);
- data repository (IDR) or The Health Record Data Service (THREDS);
- existing research records (including OnCore) or identifiable biospecimens (identify source below);
- records open to the public (identify source below).
• List all variables you are collecting from the records.
• Approximate number of individuals whose records or biospecimens you will review/analyze. If you cannot estimate the number of records you will need, explain why.
After streamlining and improving the application form, they tested it with a focus group and fine-tuned it before deploying it in a pilot test system, Tioupine says.
Researchers also helped the IRB realize that the application form needed to address collaborations with outside sites.
"We did not realize that any chart review research could involve other sites, and other sites might rely on us to be their IRB," Tioupine says.
So the form now has a section for adding information about collaborating non-UCSF sites.
Less than a year after the application form was revised, the IRB found a number of positive outcomes, including satisfaction among investigator users, Tioupine notes.
"They all loved it," she says. "The time to complete the new application was within 30 minutes for almost half of the [researchers]."
Specifically, before the new form was implemented, the mean time for creating the application to the time it was submitted was 20.1 days. After the implementation, this time had dropped to 6.8 days.1
The process resulting in the improved IRB application proved so successful that the IRB plans to convene other faculty researcher user groups to streamline other types of research processes.1
"Without sitting down with researchers and having them ask us, 'What does this mean?' we wouldn't have known what was wrong," Tioupine says.
Since implementing the improvement, Tioupine has shared the revised form with other IRBs, finding considerable interest in it: "One guy sent me his grandmother's marinara sauce recipe as a thank-you," she says.