Many IRBs have had quality improvement processes and electronic IRB systems in place for a number of years now, but are they working optimally?
Just having a quality improvement (QI) program in place is not the same as having continuous quality improvement or a system that regularly identifies issues and prevents problems. What human research protection programs (HRPPs) and IRBs might need is a quality improvement project aimed at the QI program itself, along with their other processes and workflows, according to Candice Yekel, MS, associate vice president for research, and Sara Horn, CIP, assistant director of the Office for Research Protections at Pennsylvania State University in University Park.
“We’ve had an official QI program for at least six or eight years, and we’ve been building it over the years to do a number of things, including post-approval reviews and quality checks,” Yekel notes.
“Most recently, QI activities helped us to recognize that our system hadn’t been meeting the program’s needs,” Yekel adds.
“We either needed to enhance the homegrown system and put a lot of resources into it, or we needed to buy a commercially available product,” she adds. “We thought figuring out how to put in an electronic system was a QI activity, and we put together a task force to determine whether to use the homegrown system or purchase a new one.”
There were two major deficiencies in the old electronic system, Horn says.
“It was great for managing protocols in our office, but it had no meeting management functionality,” she explains. “So we had to use other mechanisms with anything that had to go to the full board.”
For example, there were no mechanisms for handling unanticipated problems, noncompliance, and other things that are now referred to as reportable new information, Horn says.
Although everyone agreed the electronic system needed to be improved, it wasn’t clear whether that meant replacing the existing system with something new or patching and repairing the existing system. So they put together a task force to help make the decision, Yekel explains.
When the existing system was created, there were few commercially available options that fit the institution’s HRPP needs, she adds.
“Our mindset at the time was that we could build our own system, but we learned very quickly that it costs a lot of money to build a system and maintain it,” Yekel says. “And we had very limited information technology [IT] resources.”
This resulted in IRB analysts doing some of the IT work, which was unsustainable as a solution, she notes.
“We were building this system for a long time, and it was good, but it wasn’t good enough,” Yekel says.
The task force, along with institutional leadership, helped them reach the decision to purchase a new system and start over.
“Once we made the decision to purchase a commercial product, we felt it would be an excellent time to review our HRPP process to make sure we had maximum efficiency,” Yekel says.
This is where the second phase of a quality improvement process began: “We evaluated all IRB staff and processes and looked for where there were inefficiencies and did some process mapping.”
“We were charged by the vice president for research at the time to make this process transformational, and so we embraced that process of transformation,” Yekel says. “We made significant changes in how we thought about work, monitoring ourselves, and continuous quality improvement.”
The changes resulted in shorter review timelines and a more transparent and streamlined IRB process, Yekel says.
“Investigators know where their submission is in the process,” she adds. “They can see real data on how the IRB and IRB staff are performing.”
The new system also makes it easier to collect QI data, such as answers to the following questions:
• How long does it take to get an initial response from the IRB?
• How long do investigators take to make revisions to the protocol?
Part of the transformation was a philosophical change, Yekel says. The legacy system was designed to ask hundreds of questions with each answer branching off into a new batch of questions. “It was like a tree that keeps expanding,” she says.
The problem was that the additional questions did not result in better IRB submissions or make the process better, she adds.
“We needed the right information, so we adopted an HRPP toolkit that included a series of quality measures and procedures that were compliant with federal regulations and with the standards of AAHRPP,” she says.
The toolkit includes standard operating procedures (SOPs), worksheets, checklists for IRB analysts, and workflow diagrams, Horn says.
“IRB analysts and members can use those workflow diagrams to help them understand how a review occurs and how a submission moves through the review process,” Horn explains. “The electronic system and the review process mirror the workflow in the toolkit.”
The tools were incorporated into the software so the electronic system would be able to collect and analyze the most useful data. The tools also were used by IRB staff prior to use of the electronic system to help them become accustomed to the new workflows, Horn says.
“Previously, we had our own SOPs, checklists, and worksheets for IRB review,” she explains. “Then think of it as we threw all of those out the window and started over; that’s what we did.”
They created a responsibility matrix team with outlined responsibilities for individual team members. “That team was responsible for evaluating the HRPP toolkit document and determining where we might make some small tweaks,” Horn says. “We did a lot of work on those documents before we began to use them.”