By Melinda Young

Research institutions that undergo accreditation preparation quickly learn that having separate policies and procedures (P&Ps) for each IRB is a problem.

Accrediting organizations want documentation, compliance with regulations, and consistency between an IRB’s policies and its actual practices. Having several different P&Ps for the same processes increases the likelihood of survey findings and problems.

Accreditation is an opportunity to improve an organization’s operations, says Michael Mahoney, director of research operations and services and human research protection program (HRPP) administrator at the University of Florida in Gainesville. Mahoney previously served as a coordinator for one of the IRBs that had undergone accreditation.

“When the University of Florida made the decision it wanted to pursue accreditation, I had a thorough understanding of what that meant,” Mahoney says. “Even though one IRB mostly met the standards, we had two other IRBs, and we didn’t have synchronized policies across all of the IRBs.”

That was one of the biggest problems the institution faced as it sought accreditation from the Association for the Accreditation of Human Research Protection Programs (AAHRPP).

The reason one IRB was further along in meeting accreditation standards was because it had been an IRB of record for an accredited Veterans Administration organization and had changed its policies a decade earlier to meet accreditation standards, says Gailine McCaslin, MS, HRPP coordinator at the University of Florida.

“Since that time, the P&Ps were never changed or revisited, so we were in alignment with accreditation policies in some ways, but not in others,” she says.

So when the University of Florida decided to seek AAHRPP accreditation, each IRB needed not only to update and adjust its own P&Ps, but also to align them with the other boards’.

The University of Florida has three foundational IRBs, one of which is on a distant campus, and also uses an independent IRB. They often work in silos, so the accreditation process was a rare opportunity for the IRBs to talk with each other, McCaslin says.

“It didn’t happen before,” she says. “We had different sets of policies and procedures and types of research reviewed; staff had different internal practices that might not have always been shared.”

Even best practices weren’t always communicated.

Mahoney first met with the three IRB chairs, top-level administrative IRB staff members, and a quality assurance team, and introduced McCaslin. From there, the group began the policy revision process.

“We needed to make sure when we drafted a new or revised policy that it would work for all of the boards,” McCaslin says.

McCaslin provided stakeholders with a spreadsheet report that had been vetted first by Mahoney.

“He’s extremely familiar with the boards and had served as the assistant director of IRBs, and he knew the key players and their practices,” McCaslin explains. “He also knew the gravity of changing policies and the reasons for certain gaps and things like that.”

Mahoney acted as facilitator, explaining when gaps would be more challenging to address, providing reasonable resolutions, and identifying when the IRBs would have to change procedures.

“There were times when people would drag their feet or have differences in opinions, and when we ran into resistance, I’d be a diplomat and make sure we found common ground on issues related to compliance,” Mahoney says.

IRB chairs were accustomed to having autonomy, so it was challenging for them to work in unison with the other boards to follow the same policies and processes, Mahoney adds.

“We have a biomed IRB, a sociobehavioral IRB, and a biomed IRB that serves an indigent population, and they all have different justification for why they do things the way they do,” he explains. “But it was too cumbersome to have three different P&Ps. We need a single way to deal with these issues.”

During the accreditation process, the IRBs began to share information. All of the boards also moved toward full electronic submission systems, and that helped with getting all of them on the same page, McCaslin says.

Another strategy was to send each IRB regular emails and electronic reports about their progress in aligning their policies and procedures. McCaslin scheduled meetings with IRBs to discuss the changes and progress.

Sometimes, IRB chairs disagreed about the items on the gap analysis list. One chair thought some of the gaps were just differences in how the IRB liked to do things. But each procedure now had two standards to meet: regulatory rules and accreditation rules.

“We’re trying to uphold ourselves to higher standards for AAHRPP accreditation, and we need to be on the same page,” McCaslin says. “I would give them the highlights, a 30,000-foot overview of AAHRPP standards, and Michael would help translate that into practice.”

Aligning the various boards was a challenge, she notes.

“We had one meeting, which blew everyone’s minds because there are so many standards to follow,” McCaslin says. “We presented a long Excel spreadsheet, which outlined where we met standards, where we sort of met standards, and where we had a gap.”

Mahoney helped IRB chairs find a middle ground when there were disagreements among them about how a policy should be worded. It was sometimes difficult to reach a consensus.

“You need a constructive argument for whatever the middle ground is,” he says. “In some cases, I had to play the tiebreaker.”

Despite disagreements, the meetings were never contentious, he says. “From the beginning, we set expectations that we needed to do changes for accreditation, and everyone understood it wasn’t an option, but we’d grow in the same direction.”

The process worked. Policies and procedures became more consistent across the boards, which improved day-to-day operations and boosted morale, Mahoney says.

“Researchers could use any one of our IRBs and instead of learning different ways of doing things, they now had a single mechanism for all IRBs on campus,” he explains. “It improves compliance and simplifies human research protection.”

Prior to the change, investigators would complain about meeting the requirements of one IRB only to have to start over again with another. “They’d say, ‘I learned what I learned to protect subjects over here at IRB 1, so now why go through different training courses for IRB 2?’” Mahoney says. “It’s a valid point.”

The changes resulted in a rebranding of the institution’s HRPP, McCaslin says.

“We created an HRPP website to provide the full rollout of UF HRPP and to create buzz,” she says. “We started laying the foundation of a message that it’s not just the IRB that makes up the HRPP, but various other stakeholders, such as institutional leadership, compliance offices, sponsored programs, researchers and research staff, and any other entities that played a role in human subject research.”

As the revised P&Ps were approved, new tools were created, and the AAHRPP application was being completed, McCaslin kept a list of what still needed to be done.

“You have a puzzle piece, and you know what the puzzle has to look like, and one by one, you put in each piece,” she says.

“Gailine compiled all documents that met the accreditation standard and coordinated the drafting of various policies, guidance, and guideline documents that would fill those gaps previously identified,” Mahoney says. “We knew the payoff would be in the long-term, so there were no problems in getting people to participate in the meetings and provide feedback on the changes.”