QI office runs more efficiently with good data collection
Consistency is the key
A research institution's quality improvement program can greatly improve its processes and quality by following two principles: standardize activities, and collect and track accurate, comprehensive data.
A QI program's database is the hub of its information, and it's important to have policies and processes in place that describe how data are collected and organized, says Sarah White, MPH, CIP, assistant director of the human research quality improvement program at Partners Healthcare in Boston, MA.
The Partners Healthcare QI program office serves as a regulatory support and education group for the Partners research community and, among other services, often provides onsite audits at the request of investigators and clinical trial sites. The electronic, online service request form, which collects demographic information about the person making the request and other visit details, makes this a much easier task than if it were handled with telephone calls or letters. A standard list of observations is an efficient way to document and track what was observed during the audit.
As a whole, the database can be queried to provide metrics that track service demand and identify trends in noncompliance.
"We've standardized our program activities to accurately track services," White adds. "We've also spent a lot of time thinking about how we are communicating this information to different groups within Partners."
Since the QI program standardized activities and observations and developed a way to automatically generate the reports, the staff time spent on drafting a QI report has greatly decreased.
"We went from maybe two or three hours in writing the report to 30 minutes," White says. "It's freed up a lot of time, which allows us to provide more education and service to the clinical sites."
The next step would be to make the process even more efficient by having QI specialists take laptops to site reviews and key in information as they go through the review. When this happens, it would be possible to generate a report almost immediately, White notes.
QI reviewers have to provide some personal input in the reports, but most of the information is generated from the data collection fields, she adds.
Making data collection improvements such as these is key to building a more efficient and useful database. A QI program can collect multiple gigabytes of data, but if the data are not structured in a way that is efficient and logical to communicate to others, they are not very useful.
One question to ask when structuring data collection is whether categories are consistent with those used by other groups in the institution.
"We also have a standardized way of tracking department and funding source information," White says. "We have made those categories consistent with other groups in the institution, such as the IRB, so if we want to collaborate and pull metrics together, we have similar lists."
Once a site agrees on the main features of standardizing data, then the electronic data collection tool can be modified to make it easier to collect what is necessary.
For instance, a site might want a drop-down list for selecting dates on a calendar. This could reduce the number of errors made by staff inputting the wrong month or year.
"Dates were one of our biggest problems, which is why we use the drop-down calendar," White says. "Doing this ensures a consistent date entry, so when we want to run metrics between certain dates, the data are clean and queried efficiently."
Using menus of options for data input creates a more consistent data-tracking process.
"When you're tracking information you find that it's very easy for four different people to enter information in four different ways," White explains. "The goal is to get the information in consistently, in order to get it out efficiently."
That way, when a department or institution asks the QI program to summarize data on a particular item, the request can be handled quickly and accurately.
"We've spent a lot of time in the last 18 months creating a very robust database to collect information," White notes. "It starts with an online service request form that the online community can fill out with the standard fields for last name, first name, service they wish us to provide, institution's name, department's name, funding source, and years involved in research."
When the information is entered, it's automatically uploaded to the institution's web-based database, where it's available for QI staff to review.
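The standard request fields White lists could be modeled as a simple record; a minimal sketch in Python, with illustrative field names and sample values rather than Partners' actual schema:

```python
from dataclasses import dataclass, asdict

# Illustrative record for the online service request form; the field names
# and sample values below are hypothetical, not Partners' actual schema.
@dataclass
class ServiceRequest:
    last_name: str
    first_name: str
    service: str            # e.g., "onsite audit"
    institution: str
    department: str
    funding_source: str
    years_in_research: int

# A submitted form becomes one structured row in the database.
req = ServiceRequest("Doe", "Jane", "onsite audit", "Example Hospital",
                     "Oncology", "Federal", 12)
print(asdict(req))
```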
"Then we communicate with the site to schedule a QI visit and enter visit details into the system," White says.
The next step is to standardize the information observed at the site visit.
QI specialists compare a site's clinical trial (CT) activities with requirements in federal regulations, institutional policies, and good clinical practice.
"If they are doing something not in compliance with those things, we make an observation, which generally is standardized in some way so we can track them," White explains. "We've taken a lot of time to build an observation index that has broad observations categorized in different areas of noncompliance and documentation."
When QI specialists make the observation, they use a drop-down list to find the correct category and observations, and the observation is logged into the database.
The Partners QI office has about 100 different observations and eight different categories.
"Those are all tracked and linked to a specific regulatory citation and to a suggested corrective action," White says. "That's our greatest efficiency that our database can generate a report for them."
Despite a standard observation index, each report generally requires some additional information specific to the site, such as examples of the observations of noncompliance, White notes.
However, this is a small piece compared with the entire report, she adds.
It takes some planning, time and work to build an observations catalog. Each time a QI reviewer makes an observation that is novel, it is considered for addition to the index.
But there are rewards for this level of planning and detail. For instance, a QI program can check monthly for trends in the observations reported. Also, QI programs can scale the observations according to the seriousness of the noncompliance.
"We scale observations based on an internal scale of one to three," White says. "The scale is a way we communicate internally: if you have 15 different observations, which are the critical ones?"
The scale is flexible based on the context.
For instance, during an onsite audit a QI specialist could observe that adverse event tracking and assessment is inadequate. In discussion with the site, the QI specialist could learn that adverse events are in fact tracked and assessed; however, documentation is poor. Or they could learn that, in fact, adverse events are not tracked and assessed on a routine basis, which would be a major cause of concern.
In the former case, the observation would be scaled to 1; the latter, to 3.
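Logging each observation with its 1-to-3 severity makes both the monthly trend check and the "which are the critical ones?" question straightforward to answer; a hypothetical sketch (the observation IDs, categories, and counts are invented):

```python
from collections import Counter

# Hypothetical observation log: (observation_id, category, severity 1-3),
# mirroring the internal scale described above.
log = [
    ("OBS-001", "Documentation", 1),
    ("OBS-002", "Adverse events", 3),
    ("OBS-001", "Documentation", 1),
    ("OBS-003", "Adverse events", 3),
    ("OBS-004", "Adverse events", 2),
]

# Monthly trend check: which noncompliance category recurs most often?
category_counts = Counter(cat for _, cat, _ in log)
print(category_counts.most_common(1))   # [('Adverse events', 3)]

# Triage: out of all findings, which are the critical (severity-3) ones?
critical = [oid for oid, _, sev in log if sev == 3]
print(critical)                         # ['OBS-002', 'OBS-003']
```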
CT investigators have responded positively to the electronic data collection and reports, partly because the QI program now can more easily identify their noncompliance trends and correct them before they're audited by external regulators.
"They can have an internal group look at their data in an objective way and tell them if they're doing anything wrong," White says. "In addition, we are able to provide research management with information to understand where noncompliance is occurring and if there are educational gaps and unmet needs."