Study needed on consent process
Personal presentations more effective
A review of nearly 40 years of research on interventions intended to improve informed consent for research participants found mixed results for some of the most popular methods studied. It also revealed a significant lack of methodologically sound studies of these interventions to determine which ones work best, says one of the authors.
"I think each one of them [the interventions] could use a very, very good rigorous test," says Ezekiel Emanuel, MD, PhD, chief of clinical bioethics at the NIH’s Warren G. Magnuson Clinical Center.
"That would be a great way to spend money, and they’re not expensive studies to do, comparatively. If I had enough money, I would invest the money in making sure each of the studies is really good methodologically, so that the data are really reliable at the other end."
In the meantime, Emanuel says, IRBs can draw some conclusions about what works to improve informed consent — and what doesn’t.
"For IRBs, if there is one take-home message, it’s not at all clear that more, and especially more boilerplate informed consent documents, helps," Emanuel says. "And there is some suggestion from the literature that more boilerplate doesn’t help, that actually you can improve understanding by just getting rid of that boilerplate."
Extended discussion helpful
The review,1 which was published in the Journal of the American Medical Association, looked at 30 studies from 1966-2004 that described 42 trials of practices intended to improve research participants’ understanding.
The authors broke down the various studies into categories — multimedia projects, enhanced consent forms, extended discussion with study staff, testing participants on their knowledge and a group of miscellaneous interventions.
They examined each type of intervention to see which resulted in the greatest improvement in understanding. Among their findings:
— The "extended discussion" option — having the research participant talk about the trial with study staff — resulted in significant increases in understanding in three of the five studies in which it was used. The interventions ranged from a 30-minute telephone conversation with a nurse to multiple counseling sessions lasting up to two hours.
— Multimedia projects, which asked patients to watch videos or interact with a computer program, showed less success, with only three trials out of 12 showing significant improvement in understanding.
— Enhanced consent forms, with improvements ranging from condensed formats to larger type and more graphics, resulted in improvements in only six out of 15 trials in which they were used. The study notes that five of the six successful trials were of "limited quality," casting doubt on their usefulness.
— Test/feedback interventions, in which patients were tested on their knowledge of the trial, showed successful results in all five trials in which it was used. But the authors said that those patients were tested using the exact wording in the intervention and questionnaire. "This is a serious methodological flaw, because any improvement in the test score would reflect rote memorization of the answers to questions, rather than increases in real understanding," the study notes.
Researchers attracted to high tech
Emanuel says it’s not surprising that the most-tested methods of intervention — multimedia presentations and enhanced consent forms — were ultimately the least successful in trials.
"There’s an attraction to new high-tech solutions for all sorts of reasons," he says. "They don’t require extra labor — you can give it to a person and you don’t have to actually have a [staffer] there. And lots of people, not just in the research world but in the education world, somehow think that computers enhance learning."
Conversely, Emanuel says, the extended discussion intervention is more labor-intensive, and more complicated to study. But he says it shouldn’t come as too much of a surprise that it was a more successful intervention in actual use.
"What this data did for me was to make me stand back and say: ‘Do I learn better by looking at a computer?’ No. A computer tends to be pretty passive. I tend to go to sleep. I don’t actually like it, personally," Emanuel says.
"When did I learn most? Well, mostly sitting in classrooms where I could interact with people and ask questions," he says. "In some sense, the result becomes intuitively obvious, once you’ve been forced to step back and look at the data."
Emanuel says if he were pursuing a very complicated study, with important risks he wanted to ensure participants understood properly, "I would say you ought to recommend to the researchers that they ought to consider more time or second contact with these subjects, just to see if they have more questions or anything. That would be my recommendation."
More studies needed
But Emanuel says this review is hardly the last word on the subject of informed consent interventions. For one thing, many of the studies were methodologically flawed, so it’s unclear whether the results might have been different had the studies been better designed. Some of the studies reviewed were never even peer-reviewed.
Even with a finding such as the success of the extended discussion intervention, it’s not clear how much discussion helps or how it is best carried out.
Emanuel says he’s still interested in studying enhanced consent forms, particularly those that have been condensed to rid them of what he considers unnecessary and complicating boilerplate language.
"We, as a department, still have a great interest in doing a study of short forms," he reports. "But I actually think, for example, that none of the short form studies [reviewed in the study] are very good; in part, because I don’t think their short forms were very well designed."
He does think that the findings regarding multimedia projects should make IRBs sit up and take notice.
"I’ve seen some organizations that require videos or computer interaction, and they’re willing to pay $100,000 to create it," Emanuel says. "At least from this data, I think I’m in a strong position to say there’s no evidence it’s going to work.
"If you see someone who wants to make a video, I would say as an IRB, ‘Why don’t we evaluate how effective that is?’"
Ultimately, Emanuel would like to see several well-designed, rigorous studies of each of the interventions identified in this review.
"I think you could do that for just a few million bucks, and it would be a huge benefit to the research community," he says. "We’re spending billions, if not tens of billions on clinical research. A few million to figure out which of these interventions give you the biggest bang for the buck would be well spent."
Reference
- Flory J, Emanuel E. Interventions to improve research participants’ understanding in informed consent for research. JAMA 2004; 292:1593-1601.