A focus on QA results in better informed consent

Explanation of costs was the focus

Research institutions put considerable time and resources into developing informed consent (IC) templates and tools to assist researchers as they seek ethics board approval, but they sometimes neglect to focus on the part that describes cost to participants.

This is an important part of informed consent from both a regulatory perspective and the participant's perspective. A poorly worded form could lead a participant to believe that every procedure performed during a study is free to the participant and his or her insurance company, when that is not always the case. This creates an additional financial risk for the research participant.

Vanderbilt University in Nashville, TN, has addressed this issue by creating a template that assists investigators in describing a study's costs to participants. The template was then evaluated and improved through a quality assurance (QA) program.

The goal was to improve how research costs were explained in informed consent documents and promote consistency across study documents.1

Before the QA process was implemented, Vanderbilt's human research protection program determined that approximately 20% of studies could improve how submitted IC forms captured research procedures and content. After implementation, fewer IC forms required improvement, with a 10% increase in forms accurately capturing research procedures and related costs.

"Any time you do a quality assurance process in your programs it is a value-added process," says Barbara Gibson, BSN, RN, CIP, associate director of the human research protection program at Vanderbilt University.

"It's important for everyone to evaluate how they're doing and to identify strengths and areas for improvement," Gibson says. "It's a very important process to have in place."

The QA process involved assessing the content of the IC document and comparing it to the billing plan submitted to the department of finance.1

Investigators could use template language describing study costs to participants, but the template alone sometimes proved inadequate.

"We found that some principal investigators were not sure about how to select the correct phrases from the template we created," says LuEllen Davie, BSN, RN, CIP, research subject advocate in the human research protection program at Vanderbilt University.

Gibson and Davie describe how they developed the QA process that helped them improve the IC template's language regarding costs to subjects:

Review studies' cost language: "In order to develop a quality assurance process we had to understand our measuring sticks," Gibson says. "We needed to determine what criteria we were going to use, so we took a large number of studies and looked at the cost language in each consent form for these various studies."

Then they made notes on their findings and narrowed them down to key criteria for how cost language should be used.

The objective was to ensure the cost language appropriately informed research participants of their potential cost exposure, Gibson explains.

The three key criteria were these:

—Is the appropriate template language being used?

—Are research procedures being identified adequately?

—Is there consistency between consent forms and other study documents?
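The three criteria above amount to a simple pass/fail checklist applied to each consent form. As a rough sketch only, such a checklist could be expressed as follows; all field and function names here are illustrative assumptions, not Vanderbilt's actual review system:

```python
# Hypothetical sketch of the three QA criteria as a checklist.
# Field and function names are illustrative assumptions.

CRITERIA = [
    "uses_approved_template_language",
    "research_procedures_identified",
    "consistent_with_other_study_documents",
]

def review_consent_form(form: dict) -> dict:
    """Return pass/fail for each criterion plus an overall result."""
    results = {c: bool(form.get(c)) for c in CRITERIA}
    results["passes_all"] = all(results[c] for c in CRITERIA)
    return results

# Example: a form that meets two criteria but is inconsistent
# with other study documents fails the overall review.
form = {
    "uses_approved_template_language": True,
    "research_procedures_identified": True,
    "consistent_with_other_study_documents": False,
}
print(review_consent_form(form)["passes_all"])  # False
```

Structuring the review this way makes each failure reason explicit, which supports the targeted feedback and education described later in the article.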

Once they determined the measurement criteria, they applied them to new sets of data, Davie says.

"We found that probably 81% of what we were looking at was on target and the balance could be tweaked to be improved," she adds.

Implement process improvement strategies: One important strategy was to provide voluntary education about the IC form and cost language.

"We provided information internally and externally to the research community," Gibson says.

The educational sessions typically resulted in improvements in the IC forms, she says.

"Barbara and I also created an electronic database that helped us track information and identify the weak spots," Davie says. "We also developed a flow chart on how to select the appropriate cost language."

The flow chart was one of the most significant changes, and it made the process simpler for investigators, Gibson says.

"It took existing cost template language from the template and charted out pieces of that template," Gibson says.
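As a rough illustration of what flow-chart logic for selecting cost language might look like (the article does not describe the actual decision points, so the branch conditions and phrases below are assumptions, not Vanderbilt's template):

```python
# Illustrative sketch of flow-chart logic for choosing consent cost
# language. Branch conditions and wording are invented for this
# example and are NOT Vanderbilt's actual template content.

def select_cost_language(sponsor_pays_all: bool,
                         standard_of_care_billed: bool) -> str:
    """Walk a simple decision tree to a candidate cost phrase."""
    if sponsor_pays_all:
        return "The study will pay for all research procedures."
    if standard_of_care_billed:
        return ("Routine care costs will be billed to you or your "
                "insurance; research-only procedures are paid by the study.")
    return "You or your insurance may be billed for study procedures."

print(select_cost_language(False, True))
```

Encoding the chart as explicit branches mirrors the benefit Gibson describes: investigators answer a few yes/no questions rather than choosing among template phrases unaided.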

Obtain input from various departments when writing template language: Developing IC cost language for the template proved complicated and required input from different departments, Davie says.

"The template language was built with consideration of other departments' needs," she adds. "We wanted to make it easier for them to understand what it was they should select."

Gather data at set time periods: "We gathered data in three-month increments," Davie says.

They measured baseline data and compared it to each three-month period.

When the comparison was unfavorable they would provide more targeted education to investigators, Davie says.

Then they would compare again and find an improvement, which indicated that the education was having a positive impact.
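The quarterly comparison described above can be sketched as a small check against the baseline pass rate. The 81% figure echoes Davie's earlier estimate of forms that were "on target"; the quarterly numbers below are made up purely for illustration:

```python
# Hypothetical sketch of comparing quarterly QA pass rates to a
# baseline. The baseline echoes the ~81% "on target" figure quoted
# in the article; quarterly values are invented for illustration.

BASELINE_PASS_RATE = 0.81

def needs_targeted_education(quarter_pass_rate: float,
                             baseline: float = BASELINE_PASS_RATE) -> bool:
    """Flag a quarter whose pass rate fell to or below baseline."""
    return quarter_pass_rate <= baseline

quarters = {"Q1": 0.84, "Q2": 0.79, "Q3": 0.88}
flagged = [q for q, rate in quarters.items()
           if needs_targeted_education(rate)]
print(flagged)  # ['Q2']
```

A quarter that dips to or below baseline triggers more targeted education, and the next comparison shows whether that education had the intended effect.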

Data comparisons are ongoing: "We continually assess our processes and see how we're doing," Gibson says. "That will continue, although it won't be as frequent as every three months because we observed a consistency in our processes, and we think it indicated that we're doing a good job."

Communicate results: "One important thing with the QA process is that it's not just a matter of identifying areas for improvement," Gibson says. "It's important for people to hear when they're doing well."

The QA process is well suited to identifying both areas for improvement and areas of success. Seeing both makes it clear where a department should focus its energy.

"We provide feedback to the team and protocol analyst so they would know how they're doing," Gibson says.

Reference:

  1. Gibson B, Davie L. Implementation and evaluation of a quality assurance process. Poster presented at: Public Responsibility in Medicine and Research (PRIM&R) 2010 Advancing Ethical Research Conference; December 6-8, 2010; San Diego, CA.