Expert: Develop process to assess needs
Performance metrics help
Human research office managers sometimes need to identify ways to improve workflow and efficiency. One way to do this is to develop a process for evaluating resources and assessing needs, an expert advises.
A research department that oversees various regulatory offices has to weigh competing needs with limited resources, notes David Wynes, PhD, vice president for research administration at Emory University in Atlanta.
"We all have limited resources, so the key thing I have to rely on is a timely program evaluation," he says.
Any human research protection program (HRPP) or IRB office that has not done a clear program assessment probably doesn't know what kind of resources and opportunities are available, he adds.
"You need to know where you have weaknesses and where your needs are," Wynes says.
Program assessments should be a regular occurrence since regulatory and institutional changes make it counterproductive to maintain the status quo, he adds.
For instance, with the advance notice of proposed rulemaking (ANPRM) involving the Common Rule, now is a good time to perform a program assessment.
Wynes offers these suggestions for how to perform a program assessment:
• Identify goals: "What is it in theory that you're trying to accomplish?" Wynes says. "How is the program supposed to operate?"
Other questions to ask are as follows:
— Does your goal include achieving compliance with the Common Rule, Food and Drug Administration (FDA) regulations, or the accreditation standards of the Association for the Accreditation of Human Research Protection Programs (AAHRPP) of Washington, DC?
— Do you include in your goal statement the purpose of protecting research subjects?
— What activities does your program perform that go beyond its goals and mission?
— Have you identified IRB mission drift?
"You look at it and say, 'Are we doing what we set out to do or are we doing more?'" Wynes says. "If you are doing more, then you need to ask if that's intentional or accidental."
• Look for dynamic situations: "There are dynamic situations where maybe the theory you started with is no longer there," Wynes says.
For example, the regulations do not require human research protection programs to have a post-approval monitoring program, he says.
"I would maintain that if you don't have a team to conduct post-approval monitoring, then you don't know what's happening in the field," Wynes says. "If you rely strictly on the paper or electronic reports that come in from the field, you won't have a true picture of what's happening at your institution."
So in situations like this, a program might go beyond its original mission, but this extra work serves the overall purpose, he adds.
In another situation, an IRB might have a rule never to provide expedited review for studies; instead, every protocol submission receives full board review.
"In those cases, I would hope they'd look at it and say, 'Why do we do things that way?'" Wynes says. "Maybe they have a reason, but larger institutions might crumble under the weight of that process."
The program evaluation should include a look at the theory behind the rule and its implementation. Then the HRPP manager should assess the impact and effectiveness of continuing the current practice, he adds.
• Design performance metrics: With metrics, an HRPP director could quickly spot problem areas.
"Are we doing the turnaround on an expedited review that we should be doing? Are we doing turnaround on a full-board review?" Wynes says. "You have to have metrics and customer feedback on what is working or not working in the process."
The metrics could include FDA inspections, sponsor audits of investigational new drug trials, and other data that already have been collected.
Also, the metrics should include efficiency and workflow measures.
"We all know there are multiple ways of doing things, so look at the workflow," Wynes says.
One way to assess workflow is to have managers develop process maps, he suggests.
"This past fall I brought in someone to conduct a workshop on process mapping for my senior leadership," Wynes says. "I asked all of them to do process mapping, and then we'll look at those maps as we go along."
The purpose of a process map is to outline what employees are doing and why they are doing it.
"Then you challenge and question it and look for critical information that gets you to the point where you need to be," Wynes says. "If we go through process mapping, and I find I have a resource problem, then that puts me in a better position to find solutions."
For instance, Wynes can use the process mapping findings to make a better case for hiring new staff or to shift resources from a less critical area to an emerging need.
"Look at what metrics are available, and there should be many as we move to electronic systems," he says. "There are metrics embedded in databases that most people might not think about."
• Use data to provide staff feedback: The goal is to find information that could lead to small operational changes and efficiencies, Wynes says.
"What your program theory says doesn't sync with what people are doing because there will be mission drift over time, and people have competing demands," he says.
"I've done things like asking people to say during annual reports what their goals are for the coming year," he explains. "Then we talk with their directors about their goals and how these fit with the institution's mission."
Six months later, directors could come back with reports on how the employees did in meeting those goals and provide feedback.
"Ask for their goals once a year and check on them against the previous year," Wynes says. "I want this to be intentional; if something changed, that reason should be articulated."