Dana-Farber Cancer Institute goes beyond regs with its QA program

Human subjects protection garners two awards

At the Dana-Farber Cancer Institute in Boston, human subjects protection is treated as a quality assurance issue. The institute takes steps to ensure that protections extend throughout the life of a protocol via data monitoring and auditing, and is working to improve informed consent through a validated questionnaire.

The emphasis on quality paid off recently, when Dana-Farber received two awards for its human protections innovations from the Health Improvement Institute, a nonprofit organization dedicated to improving the quality of America’s health care. It was the only institution to receive two awards.

Dana-Farber/Harvard Cancer Center, a collaboration between the cancer institute and six other Massachusetts health institutions, won an award of excellence for best practice for the Quality Assurance Office for Clinical Trials (QACT), a comprehensive system to ensure protection of human subjects.

The Health Improvement Institute named the cancer institute’s Quality of Informed Consent questionnaire as the winner of its award for excellence for innovation.

Monitoring life cycle of protocols

The Quality Assurance Office for Clinical Trials is a department that helps monitor the entire life cycle of a protocol, says Jane E. Russell, quality assurance officer for clinical trials.

"It’s kind of a unique department," Russell says. "Other people do have different structures — they might have a clinical trial office that does all this but in a different way."

The department began as the Quality Control Center in 1986 and has evolved as the organization grew.

Dana-Farber/Harvard Cancer Center is a collaborative effort with Beth Israel Deaconess Medical Center, Brigham and Women’s Hospital, Children’s Hospital Boston, Harvard Medical School, Harvard School of Public Health, and Massachusetts General Hospital.

QACT serves the research functions of the cancer center with a staff of 15, Russell says. Among the department’s responsibilities:

  • protocol registration and eligibility checking for all patients joining clinical trials;
  • computerization of clinical trial data for the institutions’ in-house trials;
  • an internal auditing component;
  • data safety monitoring.

The QACT department reports to the director of clinical trials support and through him to the senior vice president for research, Russell says. The department works with the institute’s IRB and its support office, the Office for the Protection of Research Subjects.

"I’m a member of the IRB and we receive all the new protocols that are activated," she says. "We manage the data safety monitoring plan and the data safety monitoring boards for the institute, and we submit those reports to the IRB."

The computerization of clinical trial data occupies much of the department's staff: six of the 15 people. But it's the audit function that provides what Russell calls "the big bang for the buck."

Three staffers look closely at open protocols, reviewing the medical records for compliance with the protocols. They conduct 40-45 such audits each year.

"They’re reviewing whether the protocol is followed appropriately," Russell says. "Were dose modifications made appropriately? Was adverse event reporting done when required, and was the treatment given according to protocol?"

When the audit is completed, a formal report is sent to the Clinical Investigations Policy and Oversight Committee, which oversees both OPRS and QACT.

The audit system has helped Russell's department identify problems and correct them. For example, an audit turned up confusion in how to report treatment deviations and violations, and so the forms were revised. After another audit identified a problem with consistency when a principal investigator is replaced on a protocol, the department began developing a policy to address it.

The QACT is continuing to evolve to meet the needs of the organization. Russell says the department currently is working on implementing a computerized clinical trial management system.

With the computerized system, data managers or clinical research coordinators will enter data electronically. The system can be used to create calendars for patients and researchers, to track such things as which tests need to be ordered.

"A lot of people are probably already using them but most of the academic centers are not because they’re very expensive," she says.

While the QACT program has been a component of Dana-Farber’s human subjects protection for years, the second innovation cited by the Health Improvement Institute is relatively new.

The Quality of Informed Consent (QuIC) questionnaire (see chart) is a validated instrument used by researchers at Dana-Farber Cancer Institute to examine how well research participants understand the clinical trials in which they're involved.

Steven Joffe, MD, MPH, attending physician and hospital ethicist at the institute, says the QuIC program was developed because of concerns that the informed consent process wasn’t well understood.

"I think the first step in improving any process is really studying it, understanding it, evaluating it well," Joffe says. "If you want to study outcomes of the informed consent process, you need tools and some agreement on how you go about studying it. And that was where we perceived there to be a big hole."

As he looked at the existing research on the subject, he found that it was difficult even to decide exactly what to study.

"We really hadn’t figured out how to go about evaluating the informed consent process — what questions to ask, how to answer them, how to do research, how to evaluate the process in a way that was reproducible, that could be done from one place to the next," Joffe says, "so that two people, working in different places, for example, could ask the same questions, use the same methods, and come to answers that could be compared."

Comprehension is measured

The team wanted to develop a survey that could be used across multiple types of trials, even different diseases, testing basic core information that human subjects would need to know about participation in a clinical trial.

They started with the basic elements of informed consent spelled out in federal regulations — purposes of the research, foreseeable risks, benefits, etc. Questions on the survey were framed in the broadest possible terms, Joffe says. For example, participants wouldn’t be asked about specific physical risks of a treatment, but rather whether there are additional risks of any kind associated with the trial.

Each question is framed as a declarative statement with three possible answers: Agree, Disagree, or Unsure. Some statements are true, and some are false.
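As a rough illustration of how responses to an instrument in this style might be scored, here is a minimal sketch in Python. The statements, their truth values, and the scoring scheme below are invented for the example; they are not taken from the actual QuIC instrument.

```python
# Sketch of scoring an agree/disagree/unsure questionnaire in the QuIC style.
# Items and scoring are hypothetical, not the real QuIC content.

# Each item pairs a declarative statement with whether the statement is true.
ITEMS = [
    ("Taking part in this trial is voluntary.", True),
    ("The trial treatment is already proven to be the best one for my disease.", False),
    ("There may be additional risks of any kind associated with the trial.", True),
]

def score(responses):
    """responses: one of 'agree', 'disagree', or 'unsure' per item.

    Returns the fraction of items answered correctly. An answer is correct
    when it agrees with a true statement or disagrees with a false one;
    'unsure' is never counted as correct.
    """
    correct = 0
    for (_, is_true), answer in zip(ITEMS, responses):
        if (answer == "agree" and is_true) or (answer == "disagree" and not is_true):
            correct += 1
    return correct / len(ITEMS)

print(score(["agree", "disagree", "unsure"]))  # 2 of 3 items answered correctly
```

In practice an instrument like this would carry many more items spanning the regulatory elements of consent, but the agree/disagree/unsure structure with mixed true and false statements is what allows a simple objective score of comprehension.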

Joffe says that after his team developed the questionnaire, it was reviewed by a panel consisting of a physician/oncologist/ethicist, a senior biostatistician, and a nurse ethicist. They helped refine the survey, which first went into use in 1999.

Since the institute began using QuIC, Joffe says it has helped identify factors that can lead to improved understanding of informed consent — and to changes in practices at the institute:

Removal of language barriers. People whose first language was not English but who received their consent information in English had more problems than other subjects.

Joffe says that realization has led to a heightened sensitivity among researchers at Dana-Farber of the need for translators. "You have to be very careful when you’re consenting somebody, even when you think their English is reasonably good," he says. "You may need a translator for the consent conference, even if you wouldn’t in the course of an ordinary clinic visit."

Time to think it over. "We found that those who signed up on day one were less well informed than those who took their consent forms home and thought about it for at least a day," he says.

Joffe says that whenever possible, researchers now encourage subjects not to sign up right away, but to go home, discuss the matter with family, perhaps have another conversation with the research nurse.

"There are lots of things that can happen in those few days that might make a difference," he says.

Perceptions of understanding. One troubling finding, Joffe says, is that neither the patients nor their doctors seemed to be good judges of how much the patients understood.

"We asked patients these questions that allowed us to classify their level of understanding and then we asked them how well they think they understand," he says. "And there wasn’t a lot of relationship between how well you think you understand it and how well you do, at least by our measures."

Surveys of doctors showed that same type of disconnect.

"If that’s true, then it’s hard for me, when I’m sitting down with my patient to consent them for a trial, to gauge their understanding and to know how to proceed — when I should be worried and be doing more and when I should be comfortable."

Importance of the form itself. Joffe says that when he began, he hypothesized that patients weren’t reading forms very carefully, and so the content of the forms themselves wouldn’t make much difference in how well patients understood them.

"We thought it was going to be mostly about the conversations that people had with the doctors and the nurses," he says. "I think we proved ourselves wrong on that one. Most patients said they read their consent forms fairly carefully, and we found that those who did read them carefully did a lot better."

He says that an informed consent template offered by the National Cancer Institute, one that used a question-and-answer format, led to better understanding of the trials.

Dana-Farber's team is still using and refining the survey, adding more detailed questions, particularly ones aimed at patients in randomized trials.

But Joffe sees a value for QuIC that goes beyond its use as a research tool. He says an investigator or even an IRB could administer this survey as a way of monitoring its informed consent process to make sure it’s working as it should.

"People are talking about IRBs monitoring consent as it happens, and this may be a tool that will allow them to do that," he says. "Maybe this is one way that they can do some sort of rational assessment of the consent process and whether it’s working.

"I’ve had a number of people contact me to talk about using it in their own work. Some of them as researchers, others as IRB members or consent monitors who are thinking about how to bring it into their setting."