Research organizations improve QI with electronic data collection

Consistency, efficiency, and reliability are chief aims

When research organizations establish quality improvement and auditing processes, the chief aims are to improve consistency and efficiency in identifying and preventing noncompliance. One way to do this is through use of an electronic data collection tool.

"We had been using an Excel spreadsheet for reviewing compliance, but we were missing things," says Paula Bistak, RN, MS, CIP, CHRC, executive director for human subjects protection program at the University of Dentistry and Medicine (UDMNJ) of New Jersey in Newark, NJ.

The institution already had a long-standing quality improvement (QI) program for auditing investigators, but its IRB auditing was less established and formal, so the institution worked to create an electronic form that would make these audits consistent, systematic, efficient, and comprehensive, Bistak says.

They developed the solution, a smart form built in Microsoft InfoPath 2007, after holding brainstorming sessions with information technology specialists and research program auditors.

"We came up with an auditing tool that would be consistent, and we did that with limited staffing," Bistak says.

It's important to have an auditing tool that could be used by both experienced and novice auditors, she adds. "We wanted to use this so we could go through our routine faster."

"What we wanted to do was develop a procedure that was systematic, comprehensive, specific, measurable, and efficient," says Cheryl A. Forst, RN, BSN, CCRP, senior compliance analyst in the human subjects protection program at UDMNJ.

"We set out to develop a system to expand our program, and we needed to do it on a zero budget," Forst adds.

The result met all goals, Forst says.

The team of analysts and informatics specialists created a smart form in Microsoft InfoPath that provides drop-down choices that can be molded according to institutional needs, she explains.

"In order to make a form that's measurable we had to include quantifiable questions and identify trends," Forst says. "We did audits prior to visiting investigators, and if we saw a trend, say they were missing an IDE annual report when they were doing device studies, we'd let them know they were missing documentation in their file."

For example, the electronic tool's section on adverse events includes these questions:

  • Are reviewer sheets attached to each report of internal/local SAE/UAE as required?
  • Are reviewer sheets signed and dated?
  • If the IRB has requested clarifications/follow-up of individual SAEs/UAEs, has each request been appropriately reconciled and documented with PI response and IRB review?
  • If an SAE/UAE requires change to current informed consent document, has the revised consent been reviewed, documented, and processed appropriately?
  • Is each report in the IRB protocol file appropriately entered into the IRB database?
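A checklist section like the one above is essentially a list of yes/no questions whose "no" answers become audit findings. A minimal sketch in Python (the actual tool is an InfoPath form; the function name here is hypothetical, and the question wording is condensed from the article):

```python
# Hypothetical sketch of one audit checklist section as data -- not the
# actual InfoPath implementation. Questions condensed from the article.
ADVERSE_EVENT_QUESTIONS = [
    "Are reviewer sheets attached to each report of internal/local SAE/UAE?",
    "Are reviewer sheets signed and dated?",
    "Has each IRB request for SAE/UAE clarification been reconciled and documented?",
    "If an SAE/UAE required a consent change, was the revised consent processed?",
    "Is each report in the IRB protocol file entered into the IRB database?",
]

def findings(questions, answers):
    """Return the questions answered 'no' -- the potential audit findings."""
    return [q for q, a in zip(questions, answers) if a == "no"]
```

Pairing each question with a recorded answer and collecting the "no" responses gives the auditor a consistent, repeatable list of findings for every file reviewed.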

A poster based on the tool received a first-place award when it was presented at the annual meeting of the Society of Research Administrators International in October 2008 in National Harbor, MD.

Here is how the tool works:

• It's simple to use: The electronic form is color-coded and has check boxes. Auditors answer questions sequentially, Bistak says.

"The boxes turn red or yellow depending on how critical the information is," Forst says.

"So if you have a study that had funding listed, and they forgot to tell you who the funding source was, the box turns red so you'd know it was a piece of information that you tell the IRB director was missing," Forst explains.

The electronic tool is in XML format, so any university could have access to it, Forst notes.

The form has conditional formatting as well as optional and repeating sections, she adds.

Auditors take laptop computers with them on their audits, Bistak says.

Most of the electronic tool's questions have "yes" or "no" answers, but some are open-ended questions, Bistak says.

"If you answer 'yes' it'll take you down another trail," she adds. "If something is missing, it takes you down another set of questions."

The electronic tool's questions cover every aspect of a study and its IRB review, and the tool has pull-out sections that inquire about risk, Bistak says.

"The tool uses plain text and is readable," Forst says. "We used our own platform to design it, and it was independent — not tied to any other programs in the university."

The form makes it as easy as clicking a corner to replicate a box that needs to be repeated, such as a continuing review box, and it can self-populate when repetitive information is needed, Forst says.

"When you enter something in the box, it has your drop-down choices, asking you what kind of review this is – a full board, expedited, or exempt review," Forst says.

"It's a very easy to use form that many institutions have and many research administrators can use," she adds. "If they like this form in any institution they can pick any Smart Form and create it, molding it according to their institutional needs."

• Shorten form for efficiency: "We found out in the testing phase that the form was very time consuming," Forst says. "It was okay for smaller files, but when you had a protocol study that consisted of 10 or more files it would take three hours to complete that form."

Initially, auditors thought this was fine, and they'd be able to audit every file, but they quickly decided a shortened version would be more practical.

"It's a very manageable tool," Bistak says. "We have 3,000 active studies over all of our campuses, and the auditors don't have hours and hours to review all of these studies."

While the review is useful for conducting for-cause audits, its real value comes when the office wants to conduct targeted audits, such as reviewing principal investigator-sponsored studies or answering a specific question, Bistak adds.

"So the whole form is shortened up," Forst adds. "We took away the fields we didn't need."

If an auditor selects "devices" from the "view" section of the toolbar menu, the form automatically removes all the areas that do not pertain to device studies, shortening the form, Forst explains.

"So mainly it goes to study population, approval information, sponsor, and funding source, and the next box would be IRB determination," Forst says.

It will ask these kinds of questions:

— Is the device recorded on the sheet?

— What is recorded?

— Was there a serious risk with the device?

"Then it asks a couple of pediatrics and vulnerable population questions, and that's mostly what you need to do for that audit," Forst says.

With the short form, auditors can pull all protocol files of a particular type, such as those with pediatric subjects who are wards of the state or investigators holding INDs, Forst explains.
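The "devices" view Forst walks through is essentially a filter over the form's sections: each section belongs to one or more views, and selecting a view keeps only its sections. A hedged sketch (section and view names are hypothetical, loosely based on the areas named in the article):

```python
# Hypothetical mapping of form sections to the audit views they belong to.
SECTIONS = {
    "study_population":     {"full", "devices", "pediatrics"},
    "approval_information": {"full", "devices"},
    "sponsor_and_funding":  {"full", "devices"},
    "irb_determination":    {"full", "devices"},
    "drug_accountability":  {"full"},
}

def short_form(view):
    """Keep only the sections that pertain to the selected audit view."""
    return [name for name, views in SECTIONS.items() if view in views]
```

Tagging sections rather than maintaining separate forms means the long form and every short form stay in sync: removing a field once removes it from every view that used it.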

The long form still is used as needed.

"It's just that initially it seemed like a great tool to use to review every file, and that proved to be very time consuming," Forst says.

"So we decided to do it according to categories, tests, or areas that we consider hot topics," Forst adds.

• Generate reports from tool: "The final goal was to generate reports, and that's where we are now," Forst says.

"We can forward information to IRB directors to use for the purpose of educating their staff," Forst says.

"We're not doing any trend reporting yet," Forst notes.

But the electronic tool can be used for a variety of reporting purposes.

With help from the information technology experts, the form can sort information into different folders, where it can be used to identify trends and issues, such as missing information, Forst says.
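Trend identification of the kind Forst describes, such as spotting that device studies often lack an IDE annual report, boils down to counting findings across audits. A sketch, assuming each audit yields a list of missing-item labels (the data format is an assumption):

```python
from collections import Counter

def trend_report(audit_findings):
    """Count how often each missing item appears across all audits.

    `audit_findings` is a list of audits, each a list of missing-item labels.
    """
    return Counter(item for findings in audit_findings for item in findings)
```

Sorting the resulting counts highest-first gives the "hot topics" the office could forward to IRB directors for staff education.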

"We make sure files are complete so if we had an external audit we wouldn't be caught with missing information," she adds.

If an audit reveals that an investigator has issues in a particular area, then the findings could lead to a corrective action plan, Bistak notes.

"Part of the reason for using the electronic form is to let investigators or even the IRB know we're not being arbitrary in what we're looking for," Bistak says. "We show them that this is what we checked, and we want to do it the same way each time because investigators are sensitive to being singled out – so it's the same for everybody."