Boilerplate language for informed consent documents is standard fare, but it is not always easy for study participants to understand. IRBs aim to help researchers simplify the words and scientific jargon used to describe studies to participants, but it is unclear how best to accomplish this.

One solution is the Consent Language Explicit And Reasonable (CLEAR) Initiative. Investigators of the CLEAR Initiative surveyed 120 people from the lay and scientific communities to see which words and phrases they preferred to see in consent documents.1

“We have boilerplate language, referring to issues of privacy and in the case of injury or harm during research,” says Ilene Wilets, PhD, CIP, IRB chair, program for the protection of human subjects at Icahn School of Medicine at Mount Sinai in New York City. Boilerplate language often contains a lot of jargon that many potential research volunteers would find difficult to understand, she notes.

An IRB might attempt wordsmithing, but how can it know what will work and what falls flat? The answer is to ask people which words and phrases they prefer to read in a consent document. “We call this project the CLEAR Initiative,” Wilets says. “I like the word ‘reasonable’ in the CLEAR acronym because we want to protect people, but the pendulum has swung so far. We have so much in there that our text is lengthy and dense, and it confuses them,” she says. “We’re trying to simplify it, so we asked community members, through a survey, which words they like better.”

Survey Revealed Preferred Terms

For instance, what should research participants call the person who conducts research: researcher, investigator, or study doctor? The answer, according to the CLEAR study, is that laypeople prefer using the word “doctor,” as in “study doctor” or “research doctor.”1

What would research participants prefer to be called: subject, volunteer, or participant? “We looked at the language consent forms typically use,” Wilets says. “We’re trying to understand which words are most meaningful to study candidates.”

Researchers conducted the survey at the medical center. “A lot of people approached our kiosk, and they wanted to voice their opinion,” she says.

Then, they analyzed the results, comparing the people who worked in healthcare research with the people who had no connection to the healthcare industry. If investigators had more time, they might have started the CLEAR Initiative with focus groups to gather more exhaustive information, she notes. “We ran out of time to do that.”

The survey found that about two-thirds of respondents were employed in medicine, research, or healthcare. Their perspective differed from that of the one-third who were not, Wilets says. For instance, people who worked in healthcare or research knew what HIPAA was; many laypeople did not.

“We’re still analyzing the results, but the first thing that jumped out was when we asked survey respondents about what to call a researcher,” Wilets says. “In our consent form, all different terms are used. It’s not that any of these is incorrect, but we were wondering if one term was better than the others,” she explains. “It might be incorrect if you said ‘study doctor’ because the person conducting research might not have an MD, and maybe is not medically trained.”

But the survey results showed that members of the community felt comfortable calling researchers “study doctors,” while people in the research and healthcare community preferred the terms “investigator” or “researcher.”

If the CLEAR investigators had conducted focus groups, they might have discovered why laypeople preferred that term. “Maybe people are conflating research with clinical care,” Wilets says. “We would like more data to find out why they preferred ‘study doctor.’”

The findings can help IRBs improve consent documents. For instance, among all survey respondents, healthcare and research professionals as well as laypeople, the most popular term for people who volunteer in research was “study participant.” Respondents also preferred the term “experimental drug” to describe a new medication used in a clinical trial.

“The findings have helped us tweak our consent form,” Wilets says. “We talked about the findings with our board members, and we had an IRB retreat recently — a full-day conference with members of the research community and IRB members. We talked about simplifying informed consent forms and looking at literacy levels.”

“We know illiteracy is high, and we just wanted to simplify it as much as possible so people from all walks of life can understand consent language,” she continues. “It’s too much to ask investigators to keep track of different versions, based on different populations.”

The goal is to present information in a way that people can understand. “CLEAR is not a template at this point,” Wilets explains. “This is more about exploring the phenomenon of word preferences in our community, and we’re still learning what those preferences are.”

The study does not definitively select the best words and phrases to use in informed consent documents, but it suggests some simple changes IRBs can make, and it should prompt more thinking about this issue.

“It is more of a springboard for thought in our office, and we want to spin it into something more definitive,” Wilets says.

REFERENCE

  1. Eshikena M, Richmond M, Joseph Y, et al. CLEAR: Consent Language Explicit And Reasonable. Presented at the 2019 PRIM&R Advancing Ethical Research Conference, Nov. 17-20, 2019, Boston. Poster: 26.