Best Practices Spotlight

Focus on what your IRB client wants

Submission forms need reworking

IRBs and human research protection staff engage in important, sometimes life-protecting work. Their purpose and goals are on high ethical and moral ground. So it’s easy to forget that it’s also an enterprise with clients or customers to satisfy.

“Over the past few years, we’ve really been focused on listening to the voice of the customer: the research community, investigators,” says James Riddle, MCSE, CNE, CIP, assistant director of the institutional review office at Fred Hutchinson Cancer Research Center in Seattle.

“We now ask: ‘What can we do to help you get your research done more easily and faster?’” he says.

The point is that IRBs can improve both the quality of their research reviews and their clients’ satisfaction by focusing on how they and their forms communicate with researchers.

“The primary thing we learned was that most IRBs — and ours was no exception — write their forms to help them keep their records, stay in compliance, and to reflect the perspective of the regulator,” Riddle says. “What we were missing was the utilization of the forms by the end user and how they might choose to interact with the forms.”

The organization hired an expert with a master’s degree in linguistics and a master’s degree in human-centered design and engineering to help the IRB design data structures that would yield the most accurate data while remaining user friendly, Riddle says.

“She was invaluable,” he says. “Her advice to us was to create forms that have a thoughtful progression of the information.”

At the expert’s suggestion, the IRB now has a trained IRB professional create forms using four guiding principles, Riddle notes. The principles are as follows:

1. Write clearly and plainly.

“They should be understandable to a reader with some level of familiarity with research and regulations, but who is not a regulatory expert,” Riddle says.

IRBs now pay more attention to the wording in informed consent forms, and they should apply some of the same strategies to making their submission forms simpler and more readable, he adds.

“Make sure the forms are focused on your customer,” Riddle says.

Here are some question examples from the IRB’s new submission form:

“Is there a separate research protocol, synopsis, or other document detailing the study’s research procedures?

- “Yes – please submit protocol

- “No – please describe research plan in detail:”

Another plainly worded question on the new form is this: “How long will individual participants be in the study, and how long do you expect the entire study will take to complete?”

2. Target questions to the expected user.

One form does not fit all, Riddle says.

“Don’t ask questions the end user doesn’t need to complete,” he advises. “Ask only questions that pertain to the user completing the form and that user’s particular kind of research.”

For example, a Fred Hutchinson researcher who is doing epidemiological research about prostate cancer biomarkers does not need to answer questions about investigational drug research or experimental devices, Riddle explains.

“What we found was our forms were so regulatory-focused that they asked the same questions of everybody,” he says. “Everyone had to answer questions about devices, whether they were using them or not. That meant users had to read and decipher and understand a question they never needed to answer in the first place.”

Even when IRB forms are electronic and contain decision trees that allow users to skip a question based on how they answered the previous one, researchers can find the branching confusing, Riddle says.

“We found users were even getting confused by questions that allowed them to skip other questions,” he explains. “By having more targeted forms, we were able to eliminate a number of those decision points so the end user will not have to waste time on this.”

3. Do not duplicate questions on forms.

“Do not ask the same question twice, and don’t require duplicate data,” Riddle says.

Prior to Fred Hutchinson IRB’s quality improvement project, all IRB forms asked researchers to record inclusion and exclusion criteria for the research protocol on the form, he notes.

“We had them enter those data in the form although the data exists in a separate research protocol,” he explains. “So they’d cut and paste it into the IRB form.”

Forcing users to duplicate data can introduce errors unnecessarily, he says.

The solution is for the IRB to go to the source data for this information.

“We found it was more effective to ask in the IRB form, ‘Where is this data in the protocol?’” Riddle says. “That eliminated a lot of error and it eliminated a lot of angst from the users.”

For example, the IRB’s new submission form includes a question stating: “What are the objectives that will be met? If this information is clearly described in the protocol, reference the specific page(s) where applicable:”

4. Create forms that can be audited.

“The forms had to be auditable — that’s the term we came up with,” Riddle says. “What that means is the forms must provide the researchers, the IRB, and the center/institution overall with some means to transfer information from the principal investigator to the IRB in a way that is compliant and auditable by federal regulatory agencies.”

Forms can be simplified but not eliminated entirely. Also, all IRB communications with researchers need to be documented so an auditor can see what has occurred, he adds.

“We came to the conclusion that you could not have some sort of unstructured, unfettered communication,” Riddle says. “You have to have a form at some level.”

Regulators look for signed forms and documents they can review, he adds.

Since re-engineering the IRB’s application process and forms, there has been a 20% reduction in errors, researchers report spending less time completing the forms, and IRB staff spend less time reviewing the forms, Riddle says.

“And the forms are more accurately filled out,” he adds.