How does the evidence rate?
Knowing what's good and what's not
If you read it in a peer-reviewed journal, it must be right — right? And if there is an evidence-based practice, then the evidence must be stellar. Not so fast, says Lisa Spruce, DNP, RN, ACNS, ACNP, ANP, CNOR, director of evidence-based perioperative practice at the Association of periOperative Registered Nurses (AORN) in Denver. Spruce is a strong advocate of healthcare stakeholders becoming critical readers who understand exactly what kind of data makes for good evidence. Doing so can make anyone better at determining which practices to adopt or adapt to local needs, and which can safely be ignored.
AORN has some 30 different recommended practices, with new ones written as the need arises. All of them are developed through a defined process, Spruce says. First, the author of a proposed practice will sit down with a librarian and go over topics, key words, clinical questions, and the scope of those questions. "We might be looking at interventions, education, or existing best practices about something to see what the evidence shows."
AORN has adopted the Johns Hopkins nursing evidence appraisal tools, she says (available at http://www.nursingworld.org/Research-toolkit/Johns-Hopkins-Nursing-Evidence-Based-Practice). Each appraisal starts with an initial literature search that may bring up between 200 and 1,000 articles, says Spruce. All have to come from peer-reviewed journals and may have to fall within a particular time period. An author and an appraisal reviewer — usually a doctorally prepared nurse — read through the abstracts to see what might be applicable. Any studies that speak to the proposed practice are then gathered and read in full to evaluate the strength and quality of the study and its data.
She explains that randomized controlled studies receive the highest rating, level 1. Quasi-experimental or non-experimental studies get lower scores. The reviewers look at the sample size and whether it applies to the population for which a proposed evidence-based practice is being considered: a study of children in Africa wouldn't be useful if the practice relates to vitamin D deficiency in older women in Northern Europe. They also look at whether the results are consistent and clear, and whether the study used sound methodology.
Each reviewer gives every study a quality grade of A, B, or C, she says. "If there is disagreement on the grade, a third person is brought in to give an opinion." Each resulting recommended practice includes a list of articles that support its use, along with the strength and quality rating associated with each of them. "The idea is to be completely transparent about the process," Spruce says.
The practices are all given a rating, she says. Most recently, AORN has been using the Oncology Nursing Society model (available online at http://www.ons.org/Research/media/ons/docs/research/outcomes/weight-of-evidence-table.pdf), but Spruce says AORN is developing its own model to rate practices. Some practices have strong evidence, some moderate, and some none at all — possibly because you can't ethically run the kind of experiment you'd need to collect good data. Still, if the benefits of following the practice outweigh the potential harm, then even a practice with no evidence behind it can get a strong recommendation, Spruce says.
For instance, there is a recommendation that anyone having throat surgery with both electrocautery and oxygen in use should have wet packs placed in the throat to reduce the risk of fire. "It's not supported by high levels of evidence," she says. "But you can't do an experiment where you see if you start a fire in a patient's throat when you don't have wet packs."
Critically appraising evidence isn't hard to do, she says. And doing it even once gives you a much clearer view of the maxim that you shouldn't believe everything you read. How a study was done, who paid for it, and the quality of the data can all be eye-opening. It may prove to you that the way you've always done something is still the best way, or it could show you that a new method is safer or faster or has better outcomes. "New evidence becomes available, and we need to think of that as we care for our patients. You need to learn to critically assess what you do and why you do it." Spruce says once you've gone through the process you'll "read articles differently every single day."
If you want more information on how to do your own research for evidence-based practices, she suggests looking to specialty societies and other organizations that promote evidence-based protocols. "Look at posters at conferences," she says. "Often they give you a clearer idea about how research is done. Most importantly, read journal articles. Start a journal club or join one, where you can learn how to read research with a critical eye."
For more information on this topic, contact Lisa Spruce, DNP, RN, ACNS, ACNP, ANP, CNOR, Director, Evidence-Based Perioperative Practice, Association of periOperative Registered Nurses (AORN), Denver, CO. Telephone: (303) 755-6304.