Technology is moving far faster than federal human research protection laws and regulations. But there are a few things IRBs can do that will help protect study participants.
“Any investigator who comes up with a project proposal that uses that data or biospecimen will have some minimal review,” says Stephen Rosenfeld, MD, MBA, president of Freeport Research Systems in Freeport, ME. Rosenfeld is the chair of the Secretary’s Advisory Committee on Human Research Protections (SACHRP).
“Most IRBs would not accept a consent for creation of a repository that says you can do anything you want,” he says. “It does get complicated. It’s a bit of a mess.”
These are potential ways to improve protection of research participants:
• Recruit a data scientist to the board. “IRBs need to have access to information technology expertise,” says Michele Russell-Einhorn, JD, chief compliance officer and institutional official for Advarra in Columbia, MD.
This might involve recruiting an information technology expert to the board. Or, IRBs could ask a data scientist to consult with the board. “Big data requires a certain level of expertise, and IRBs should look at their membership to make sure they have that level of expertise,” she adds.
Many IRB members are limited in their knowledge and understanding of data collection, storage, and de-identification. “These are things I’m surprised that researchers don’t think about more often,” says Stephanie Malia Fullerton, PhD, professor of bioethics and humanities at the University of Washington School of Medicine.
Data scientists understand how to evaluate big data sets. If an IRB is unfamiliar with these issues, they should ask a data scientist to help them evaluate such studies and prepare for the future, says James Riddle, MCSE, CIP, CPIA, CRQM, vice president of institutional services with Advarra.
“They might also be statisticians or biostatisticians,” he explains. “They understand how to combine the data and could advise the board on how likely it is a combined data set might become identifiable.”
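The linkage risk Riddle describes can be sketched in a few lines of code. The example below is purely illustrative, with invented records and hypothetical field names: it shows how a “de-identified” study data set that retains quasi-identifiers (ZIP code, birth year, sex) can be joined against an outside roster to put names back on records.

```python
# Hypothetical illustration of re-identification by record linkage.
# All names, ZIP codes, and diagnoses below are invented.

# A "de-identified" study extract: direct identifiers removed,
# but quasi-identifiers (ZIP, birth year, sex) retained.
deidentified_study = [
    {"zip": "04032", "birth_year": 1961, "sex": "F", "diagnosis": "T2D"},
    {"zip": "04032", "birth_year": 1985, "sex": "M", "diagnosis": "asthma"},
]

# A second, publicly available data set (e.g., a voter or consumer
# roster) that carries the same quasi-identifiers plus names.
public_roster = [
    {"name": "Jane Roe", "zip": "04032", "birth_year": 1961, "sex": "F"},
    {"name": "John Doe", "zip": "04101", "birth_year": 1985, "sex": "M"},
]

def link(study, roster):
    """Join the two data sets on their shared quasi-identifiers."""
    matches = []
    for s in study:
        key = (s["zip"], s["birth_year"], s["sex"])
        for r in roster:
            if (r["zip"], r["birth_year"], r["sex"]) == key:
                matches.append((r["name"], s["diagnosis"]))
    return matches

# The first study record is unique on its quasi-identifiers, so the
# join re-attaches a name to a diagnosis.
print(link(deidentified_study, public_roster))
```

A data scientist or biostatistician on the board can estimate how often such joins produce unique matches in a given data set, which is exactly the kind of identifiability assessment Riddle suggests.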
• Make sure informed consent matches the protocol. “One of the things that happens is you have an informed consent form that says your data will be de-identified,” Russell-Einhorn says. “Then, you read the protocol and see what they’re doing, and you realize that it’s not really an accurate statement.”
IRBs should be cautious about accepting the informed consent language as written when it describes the use of data. “You probably need context to figure out whether it’s an accurate statement,” Russell-Einhorn says. “The IRB should review the protocol and consent, and make sure there’s at least enough information to tell someone how their data will be collected, shared, stored, and used. Also, there should be a requirement that people are told of new information and its impact on their continuing to participate in the research.”
IRBs should clarify their responsibility to re-evaluate protocols and consent forms when study changes might change the accuracy of an informed consent document’s wording about de-identified data, Russell-Einhorn adds.
Large data sets could include detailed information held by multiple stakeholders with different interests, similar to biobank research, where data use has no defined end point, Fullerton says. “We need to be transparent about that. The nominal right to withdraw is a limited one.”
Once a person’s data are used, the person cannot take them back in this environment, Fullerton adds.
• Grant waivers cautiously. When databases are used for purposes consistent with the original informed consent, investigators probably can proceed, Rosenfeld says.
“But we’re talking in general about things that were not thought of in the original consent,” he adds.
When a study is proposed that is inconsistent with the original consent, researchers and IRBs can consider waiver of informed consent under certain conditions. For instance, the waiver cannot adversely affect the rights and welfare of subjects, Rosenfeld says.
“I think people are used to saying, ‘I don’t see how that will impact the rights or welfare,’” he adds. “But I don’t think anyone knows what that means.”
In 2008, SACHRP said the interpretation of rights and welfare should be calibrated against an individual’s expectations. “Would someone object if they knew of the waiver?” Rosenfeld asks. “Would the study population in general think the waiver could cause adverse consequences for their welfare or well-being?”
Another criterion is that the research could not practicably be carried out without the waiver. In addition, whenever appropriate, subjects or legally authorized representatives will be provided with additional pertinent information after participation.
“People say, ‘We published results or let the community know,’” Rosenfeld says. “That’s easy to satisfy or justify not satisfying.”
• Teach the difference between research and consumer data. “On the consumer side, you would be appalled at how much the consumer products folks know about you specifically,” Riddle says. “Others can go in and purchase those data.”
Research participants might misunderstand in several ways how de-identification works with large databases, which is why this education matters.
• Consider service and use agreements. Research participants could have misconceptions about the broader implications of their data and privacy, particularly when data come from a large commercial database or the use of wearable technology.
“They often can conflict with what’s said in informed consent for a research study,” Rosenfeld says.
One example is a participant who believes their data, drawn from a commercial database, are kept private through de-identification by investigators. While researchers can do their best to de-identify commercial data, it is always possible the company that holds the data does something with the same information that makes identification possible.
“What is the role of the IRB in terms of reviewing things like terms of service and end-user license agreements?” Rosenfeld asks. “There are a lot of circumstances where these documents are written to not be understood; they’re written in ways to discourage people from paying attention to them.”
For example, SACHRP studied several cases in which study populations used Apple Watches before enrolling in a research study. Investigators wanted to collect data, like steps taken, from those devices, Rosenfeld says.
If the data collected are for an app that people already were using, it means they signed a service and use agreement with that vendor. The bar the IRB has to clear is pretty low, he notes.
“They still will need to review the study and they need informed consent for research,” he says.
But if the end-user license agreement already mentions the data might be used for commercial research, then adding an academic research study does not add much more risk for participants. The key point is to note all this in the informed consent.
The trickier scenario is when a study provides participants with a wearable device, and people have to sign an end-user license agreement in addition to the informed consent document, Rosenfeld says. The end-user license agreement might be much broader than the study’s purpose for its use.
“The IRB has to pay a little more attention to the privacy rights they’re asking people to sign away,” Rosenfeld explains. “It’s that kind of thing: What should IRBs do, practically, when using a device or app requires people to do certain things to use it?”
This issue is broader than informed consent. IRBs should scrutinize the risk of these situations, he adds.