Data integrity should be top priority
Protect data to help protect subjects
As research institutions become more thoroughly electronic in their IRB and clinical trial systems, they should not lose sight of the importance of securing all transmitted data, an expert says.
“Institutions need to know what sites can do in collaboration with sponsors to promote data integrity through the life of data analysis,” says Michelle Stickler, DEd, CIP, director of the office of research subjects protection at Virginia Commonwealth University in Richmond.
“The ultimate goal is to protect research participants, but at the same time we cannot produce new data without data integrity,” Stickler says. “As we move to more use of electronic systems and electronic source documentation, there’s going to be a greater need to think about who has access to these systems, is entering the data, and how we are making sure it’s the person we think it is.”
The trend toward electronic data suggests that research sites will one day be required by federal regulatory agencies to submit everything electronically, Stickler predicts.
“As we move to requirements to submit everything electronically, there’s greater emphasis on collecting e-source documentation and data to get through the system and to pay more attention to the computer systems we’re designing,” she says.
Research institutions and IRBs also should know answers to these questions:
- How is information encrypted?
- Have secure transmission systems been established from site to sponsor?
- How can the institution verify that original information is entered into the electronic source document?
- What kind of system is in place to prevent problems from occurring?
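One way a site and sponsor can answer the secure-transmission question is to compare checksums on each transmitted record, so any corruption or tampering in transit is detected. A minimal Python sketch, assuming a simple JSON-serializable record (the field names and function name here are hypothetical, not from any specific trial system):

```python
import hashlib
import json

def payload_checksum(record: dict) -> str:
    """Serialize a data record deterministically and return its SHA-256 digest.

    The site computes this before transmission; the sponsor recomputes it on
    receipt and compares, detecting any change to the payload in transit.
    """
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical record from a site visit.
record = {"subject_id": "S-001", "visit": 3, "systolic_bp": 128}
digest = payload_checksum(record)

# Sponsor side: recompute over the received record and compare.
assert payload_checksum(record) == digest
```

A checksum complements, but does not replace, encrypted transport: encryption protects confidentiality in transit, while the digest lets either party verify integrity after the fact.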
The Food and Drug Administration (FDA) uses the acronym ALCOA as a guide for electronic and other source evidence, and it is a good framework for assessing the integrity of an institution’s electronic data system, Stickler suggests.
Using ALCOA as a guide, the data need to be:
- Attributable: Data must be attributable to a source; institutions should attribute each data entry to a specific person, device, or mechanism, Stickler says.
- Legible: Data have to be something the FDA can interpret, whether it’s electronic or handwritten.
- Contemporaneous: When researchers record observations in their study, data should be entered in real time, without a time lag during which discrepancies and inaccuracies could creep in.
- Original: Research records should include the first, most accurate, and reliable recording of data.
- Accurate: Make sure what is being collected is, indeed, what actually occurred, Stickler says.
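The ALCOA principles can be reflected directly in how a record is structured. A minimal sketch, assuming a Python-based capture system (the class and field names are illustrative, not from any real product):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: the original recording is never overwritten
class SourceRecord:
    # Attributable: the unique identifier of the person or device entering data.
    entered_by: str
    # Legible / Original / Accurate: the first, readable recording of the value.
    value: str
    # Contemporaneous: timestamped at the moment of entry, not back-filled.
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Hypothetical reading from a device with its own unique identifier.
reading = SourceRecord(entered_by="bp-monitor-07", value="128/82 mmHg")
```

Making the record immutable means corrections must be captured as new entries, which preserves the original and feeds the audit trail discussed later in the article.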
“We have to have secure systems,” Stickler says.
While sponsors are responsible for much of this, sites also should maintain data security through knowing precisely who is responsible for creating data entered in the system, she adds.
“Whether it’s a nurse coordinator or another person, everyone who enters data should have a unique identifier in the electronic system,” she says. “If data are coming from electronic sources, like a blood pressure machine, then each piece of machinery needs to have its own unique identifier, and the site should maintain that list.”
Sponsors also should maintain the same list, and the two lists should be cross-checked whenever data arrive from a source that does not appear on them, Stickler says.
“You should make sure all information is from a reliable source,” she explains. “This is part of a mechanism to make sure we’re not getting tainted data and that we know where it’s coming from, so we’re not getting a random reading from someone who is not enrolled in the study, for example.”
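The cross-check Stickler describes can be sketched as a simple set comparison: any entry whose source identifier is not on both the site's and the sponsor's lists gets flagged for review. A hypothetical Python illustration (function and field names are assumptions):

```python
def flag_unknown_sources(entries, site_list, sponsor_list):
    """Flag data entries whose source identifier is missing from either
    the site's or the sponsor's list of authorized people and devices."""
    authorized = set(site_list) & set(sponsor_list)
    return [e for e in entries if e["source_id"] not in authorized]

# Hypothetical example: one authorized device, one unknown source.
entries = [
    {"source_id": "bp-monitor-07", "value": "128/82"},
    {"source_id": "unknown-cuff", "value": "140/90"},
]
suspect = flag_unknown_sources(
    entries,
    site_list=["bp-monitor-07", "nurse-coord-01"],
    sponsor_list=["bp-monitor-07"],
)
# `suspect` now holds the entry from "unknown-cuff" for follow-up.
```

In practice the flagged entries would trigger a query back to the site rather than automatic rejection, since a missing identifier may simply mean a list is out of date.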
Another important component of ensuring data integrity is source verification.
“This is the responsibility of the site, as well as the sponsor,” Stickler says. “The clinical research associate is the first line of attack for verification.”
Institutions should ensure that study participants meet all inclusion criteria, that records capture all of the medications subjects are taking, and that lab results make sense, Stickler suggests.
“If one patient is making progress and then the next time is not doing very well, it needs to be checked out,” she says.
The risk of having data inconsistencies and errors is that the FDA will not accept the data, she adds.
If this happens, it could lead to the sponsor failing to have adequate data to support claims about a new product, and the site might be blackballed by sponsors due to its poor data integrity, Stickler says.
From a human research protection perspective, this means the time subjects spent participating in the study was wasted, and it may leave them unwilling to participate in future research.
“Potentially, we may have increased risk to research participants for very little value in the end,” Stickler says.
Data storage is another big data integrity issue.
“I’ve been reading about some potential issues with data storage when everything is electronic,” Stickler says. “While we think it’s safe and long lasting, and we think we can hold everything electronically, there are papers out saying when you go back to retrieve data there could be some chunks missing.”
Research sites must maintain data for at least two years after the product is evaluated, so they may need to re-evaluate their data storage systems to ensure they can hold data for the required retention period, and test them for long-term integrity issues.
“This is an issue for IRBs, sites, and medical centers,” Stickler says. “Do they have adequate data capacity and retention systems to maintain data electronically for that long?”
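The "chunks missing" problem Stickler raises can be tested for periodically: record a checksum for each file when it is archived, then re-verify the archive on a schedule. A minimal Python sketch, assuming a manifest of filename-to-digest pairs (all names here are hypothetical):

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """SHA-256 of a stored data file, recorded at archive time."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_archive(manifest: dict, root: Path) -> list:
    """Compare each archived file against its recorded checksum and
    return the names of files that are missing or have changed."""
    damaged = []
    for name, expected in manifest.items():
        f = root / name
        if not f.exists() or fingerprint(f) != expected:
            damaged.append(name)
    return damaged
```

Running such a check against each back-up location on a regular cycle is one way a site could demonstrate, years later, that its retained data are still intact.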
The standards suggest having a minimum of two back-up storage locations, she notes.
“Here, we’re looking at a double back-up system with servers here at this institution and another back-up that is within our general region, our city area,” she explains. “A second back-up is in a completely different country.”
The reason for the remote back-up is that if a region is struck by a natural disaster, the data stored locally could be affected.
“You want it in a different weather region,” Stickler says.
An example is Hurricane Katrina, which caused massive destruction when it hit New Orleans, interrupting clinical trials and destroying data stored locally in flooded buildings.
“In a situation like New Orleans, the data could be gone with nothing to reconstruct,” she adds. “All research subjects have gone through the study, and it’s all been for naught.”
Another aspect to data integrity involves custody documentation and the audit trail.
Research institutions should look at making these two chief improvements, Stickler suggests:
- Limit access: Restrict system access to audited users; this keeps data accountable.
- Use common sense: “When a programmer leaves a station, log off so data are not sitting there on the screen,” Stickler says. “If a system is idle for five minutes or more, it should be automatically logged off, locking the computer.”
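The five-minute auto-logoff Stickler describes amounts to tracking the time of the last user action and locking the session once the idle limit is exceeded. A simplified Python sketch of that policy (the class and method names are illustrative):

```python
import time

IDLE_LIMIT_SECONDS = 5 * 60  # five minutes, per the policy above

class Session:
    """Tracks user activity and locks itself after the idle limit."""

    def __init__(self, user_id: str):
        self.user_id = user_id
        self.last_activity = time.monotonic()
        self.locked = False

    def touch(self) -> None:
        """Record user activity (keystroke, click, data entry)."""
        self.last_activity = time.monotonic()

    def check_idle(self) -> bool:
        """Called periodically by the system; locks after the idle limit."""
        if time.monotonic() - self.last_activity >= IDLE_LIMIT_SECONDS:
            self.locked = True
        return self.locked
```

A real implementation would sit at the operating-system or application-server level, but the logic is the same: every interaction resets the clock, and a background check enforces the lock.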
These are small actions that many institutions may not have on their radar, but implementing them in the research and IRB workplace could improve data integrity.
“In terms of the audit trail, we look at keeping track of all the changes made to the paper trail, providing explanations of any changes, and the same is true of the electronic trail,” Stickler explains. “Find out when a change is made, who is making the change and have them log in and capture unique information, tying it to the change so it can help us create an audit trail.”
Anyone who changes or adds to electronic data must be required to log in so their input can be captured and traced back to them.
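An audit trail of the kind Stickler describes is typically an append-only log: each change records who made it, what changed, when, and why, and entries are never edited or deleted. A minimal Python sketch (the field names and `record_change` helper are hypothetical):

```python
from datetime import datetime, timezone

audit_trail = []  # append-only: entries are never edited or deleted

def record_change(user_id, field_name, old_value, new_value, reason):
    """Log who changed what, when, and why, tying the change to the
    logged-in user's unique identifier."""
    audit_trail.append({
        "user_id": user_id,
        "field": field_name,
        "old": old_value,
        "new": new_value,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# Hypothetical correction by a logged-in coordinator.
record_change("nurse-coord-01", "systolic_bp", 120, 128,
              "transcription correction against source document")
```

Because old and new values are both kept, the full history of a data point can be reconstructed, which is exactly the "story from start to finish" described below.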
“I think the key to data integrity is being able to tell a story from start to finish so we have all the details, including where data came from and being able to double-check data to verify the data’s quality,” Stickler says.