Poorly done research can have a big impact on public perception, policy

CT findings can be misrepresented

It's up to researchers and research journals to provide checks and balances for important investigative findings, but what happens when bad research slips through the cracks and even becomes widely reported as the truth?

Several investigators say they've encountered situations where studies with poorly done methodology or other problems have captured the public's attention and had a broader impact than warranted.

For instance, an unpublished and statistically inaccurate dissertation that said grief counseling actually harms people seeking help was cited in a major journal article and led to public skepticism of grief counseling. Worse, some health care organizations used the faulty research as justification for cutting back on grief counseling services, says Dale G. Larson, PhD, a professor in the department of counseling psychology at Santa Clara University in Santa Clara, CA.

Larson recently published a paper refuting the flawed research, saying the negative characterization of grief counseling has no empirical grounding.1

The research process is supposed to catch flawed studies through peer-reviewed journals and follow-up research by other investigators. But the process doesn't always work as well as expected, and sometimes a flawed study can capture the public eye in a way that gives it a long life.

This happened with the negative reports on grief counseling. Even after Larson began to speak publicly about the flaws of the initial research and after he told a reporter from Newsweek that there really is no scientific evidence that grief counseling causes harm, the national news magazine published a major story that was critical of grief counseling, he says.

"The June 18 [Newsweek] article said that grief counseling was harmful, and it led with those [incorrect] statistics," Larson says.

Larson and other investigators have found that it's a lot more difficult to correct a public misconception based on flawed research than it is to gain attention for a study that says something provocative and new.

"The news media hypes interesting news on page one, and if it turns out to be wrong, the retraction is printed deep back in the first section of the paper," says George Diamond, MD, FACC, senior research scientist, emeritus, at Cedars-Sinai Medical Center in Los Angeles, CA.

Studies that show problems with a popular drug or technology or that refute a common therapy can lead to sensational headlines.

"The media always loves a black and white headline, but the reality is always gray," says Sanjay Kaul, MD, director of the cardiology fellowship training program and director of the vascular physiology and thrombosis research laboratory at Burns and Allen Research Institute, Cedars-Sinai Medical Center in Los Angeles, CA. Kaul also is a professor at the David Geffen School of Medicine at the University of California, Los Angeles.

Diamond and Kaul testified on July 30, 2007, at an FDA advisory committee hearing reviewing data on the safety of the Type 2 diabetes drug rosiglitazone (Avandia®). They were two of the three authors of a paper on the uncertain effects of rosiglitazone on the risk of myocardial infarction, which is slated for publication in the Oct. 16, 2007, issue of the Annals of Internal Medicine.2

The hearing was prompted by a May 21, 2007, article published in the New England Journal of Medicine that linked rosiglitazone to a significantly increased risk of heart attack.3 The NEJM article was a meta-analysis of 42 trials of the drug, and it contended that the drug produced a statistically significant 43% increase in the risk of myocardial infarction, as well as a borderline significant increase in cardiovascular death, Diamond explains.

"The study received a huge amount of press the day the article came out," Diamond says. "I looked more carefully at the article as posted on the New England Journal of Medicine's web site, and I was struck by the paucity of the number of events in the trials that were analyzed."

Across the 42 trials and 28,000 patients, there were fewer than 150 events, and many trials had no events in either the treatment or control arm, he explains.

"So the study's conclusion was based upon a paucity of event data and that gave me pause," Diamond says. "I discussed it with Dr. Kaul and he was similarly impressed, and on that basis we redid an analysis of the data."

Diamond and Kaul say they undertook their analysis as cardiologists interested in finding out the truth about the drug, not because of any financial incentive or tie to rosiglitazone's manufacturer, GlaxoSmithKline.

Using a more sophisticated statistical methodology that could deal with zero-event trials, Diamond and Kaul determined that the odds ratio for cardiovascular death or myocardial infarction was substantially less than what was reported.
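The zero-event problem Diamond describes can be sketched with a toy calculation. The trial counts below are invented for illustration (they are not the actual rosiglitazone data), and the 0.5 continuity correction shown is just one common way to handle zero-event trials, not necessarily the exact method Diamond and Kaul used:

```python
import math

# Toy 2x2 counts per trial: (events_treated, n_treated, events_control, n_control).
# Invented illustrative numbers, NOT the rosiglitazone trial data.
trials = [
    (2, 500, 1, 500),
    (0, 300, 0, 300),   # zero-event trial: contributes nothing unless corrected
    (1, 400, 2, 400),
    (3, 600, 1, 600),
]

def pooled_or(trials, correction=0.5, drop_zero=False):
    """Fixed-effect (inverse-variance) pooled odds ratio on the log scale."""
    num = den = 0.0
    for a, n1, c, n2 in trials:
        b, d = n1 - a, n2 - c
        if drop_zero and (a + c == 0):
            continue  # the simple approach: discard zero-event trials entirely
        if min(a, b, c, d) == 0:
            # add 0.5 to every cell so the odds ratio is defined
            a, b, c, d = a + correction, b + correction, c + correction, d + correction
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log odds ratio
        num += log_or / var
        den += 1 / var
    est = num / den
    se = math.sqrt(1 / den)
    return math.exp(est), math.exp(est - 1.96 * se), math.exp(est + 1.96 * se)

or_all, lo_all, hi_all = pooled_or(trials)
or_drop, lo_drop, hi_drop = pooled_or(trials, drop_zero=True)
print(f"with correction:  OR={or_all:.2f}  95% CI ({lo_all:.2f}, {hi_all:.2f})")
print(f"dropping 0-event: OR={or_drop:.2f}  95% CI ({lo_drop:.2f}, {hi_drop:.2f})")
```

With so few events, the confidence interval spans roughly an order of magnitude either way — the point estimate alone says very little, which is the "paucity of event data" concern.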

"The confidence analysis was very wide, and there were no grounds for the national hysteria engendered by the publication of this report," Diamond says.

Diamond and Kaul presented their analysis before the FDA advisory committee, some of whose members were intent on removing the diabetes drug from the market.

FDA statisticians had done a similar analysis and reached the same conclusion Diamond and Kaul had.

When the committee finally decided on rosiglitazone's future, it voted 22 to 1 to keep the drug on the market, Diamond says.

"A number of physicians rely on this drug to maintain their patients' blood glucose," Diamond says.

Although future studies might suggest greater risk of cardiovascular problems with the use of the drug, the available literature is inconclusive, so the drug shouldn't be pulled from the market based on one flawed meta-analysis, Diamond and Kaul say.

Sometimes flawed studies are promoted by people who have a political agenda, Diamond notes.

In the case of the study questioning rosiglitazone's safety, there were members of Congress and others who wanted to use this as an example of why the FDA needs restructuring, he says.

And with so many different methodological techniques, research tools, and other strategies for analyzing data, it's easy to raise false alarms, Diamond says.

"Many eyes and many tools used can lead to a number of false alarms, but they can also lead to the early detection of a number of real fires that can be put out," he adds.

There are pros and cons, and the best strategy might be for the FDA to establish a proactive task force that looks at the problem in isolation from emotion and hysteria and comes up with standards for monitoring drug safety before and after approval, Diamond suggests.

"Skepticism is a very useful tool for navigating through the labyrinth of hype and misinformation," Kaul says. "Be skeptical and try to verify a study, and if you cannot replicate it, then you have to dump it."

That's the beauty of the scientific process, Kaul notes.

"There's always a correction factor," he says. "The person who does an analysis has to be skeptical of his or her own analysis, and this has to be done in a dispassionate manner."

Not every study that contradicts previous perceptions or findings is wrong. In fact, studies that find flaws in accepted treatments or medications often are part of the scientific process, filling in needed details within the existing body of research.

For example, Kaul also has investigated and testified before an FDA advisory panel on the topic of medicated stents, which in the past few years quickly replaced the older uncoated stents in cardiac procedures.

"Typically the bare metal stents would be successful in 80 to 90 percent of cases, but in 10 to 20 percent of cases [the arteries] would reblock," Kaul says.

So cardiologists embraced the new coated stents, which greatly reduced reblockage, or restenosis. The problem was that the new stents caused more blood clotting, Kaul explains.

"Stent thrombosis, even if it's an infrequent occurrence, can cause a heart attack in up to 70 percent of cases," Kaul says. "So we raised questions of whether trading stent thrombosis for reblockage was a worthwhile trade-off."

Once a paper was published linking coated stents to stent thrombosis and the FDA convened on the issue, cardiologists began using coated stents less often, Kaul says.

"There are certain patients where the risk profile may favor the benefits of medicated stents," Kaul says. "And those are the types of patients where its use is beneficial."

Medical professionals sometimes embrace new technology too quickly and overuse it as enthusiasm exceeds evidence, Kaul notes.

"Whenever there's a technology trigger it quickly leads to a peak of inflated expectations, followed by a rapid fall, and trough of disillusionment and plateau of productivity," Kaul explains.

The key is to remain scientifically skeptical and to keep in mind that most new research is not replicated, meaning it doesn't prove to be fact over time, Kaul adds.


  1. Larson DG, Hoyt WT. What has become of grief counseling? An evaluation of the empirical foundations of the new pessimism. Prof Psychol Res Pract 2007;38:347-355.
  2. Diamond GA, Bax L, Kaul S. Uncertain effects of rosiglitazone on the risk for myocardial infarction and cardiovascular death. Ann Intern Med 2007;147(8). Epub 2007 Aug 6.
  3. Nissen SE, Wolski K. Effect of rosiglitazone on the risk of myocardial infarction and death from cardiovascular causes. N Engl J Med 2007;356:2457-2471. Epub 2007 May 21.