Special Feature

Disease Management in Oncology: Where Does Oncology Stand? Hey, Let's Manage the Case, Not the Dollars

By Thomas J. Smith, MD

As part of a larger project exploring the quality of care for cancer patients in the United States, I have had the opportunity to look for peer-reviewed data on "disease management." For the perplexed, disease management promises to save everyone a lot of money and improve quality of care by creating a system that takes the patient from one process to another, with set steps for good outcomes along the way. Good disease management programs track results, so that if quality is not present, it can be added. As Brook says, "managed care is not the problem, quality is."1

All this fits with the increasing application of standard business principles to health care. As medical malpractice attorney Angela Holder once said, bemoaning the commercialization of practice, "Give me a pound of health care." Now she would want a pound of quality health care.

Is cancer different from heart disease?

Bettina Kurowski, one of the seminal thinkers in the design of better cancer care systems, argues that it is. She recently outlined six ways that cancer is different, after noting that it has a bull's-eye on it due to high cost, variations in management, and discrete (bean-countable) episodes of care. (I would recommend someone in your practice read this article. No, I do not get royalties.)2

First, cancer is not one disease like myocardial infarction.

Second, long-term survival is the goal. A good program should change the practice of those practitioners whose patients are failed by first-line, curative treatment. That would need to start before treatment begins, and would require a high volume of curable illnesses for each practitioner. Not possible, except for adjuvant therapy or the rare specialist in curable cancer.

Third, oncology incomes are dependent on interventions. "Chemotherapy is the only commodity sold in large quantities in doctors' offices in the United States." Many guidelines would restrict the use of chemotherapy and supportive care upon which current physician incomes are based. It is not likely that increased payments for "J" codes and "E and M" codes will make up the difference.

Fourth, most disease management models focus on patient compliance, e.g., asthma and diabetes. We don't do very much of that, and have little evidence that it helps. It might be because we have not looked.

Fifth, the high revenues and profits that can be generated in the current system have encouraged investors to seek maximization of revenue rather than lower-cost disease management. I guess that is the good news.

Finally, mistakes in cancer management, if they happen, are really terrible, such that the fear of litigation or bad results has led to fear of upsetting the status quo.

Does disease management work?

Changes in disease management have been reported to show some dramatic improvements, but the data may be proprietary and not available. For instance, coordinated disease management by an expert team is reported to have expanded home care services for AIDS patients by 600% but decreased total costs by nearly 50%; however, there are no actual data in the report.3 And beware of glowing reports. The same drive that makes us report our first 21 patients with a 56% response rate is found in business managers, too. And they get just as many rewards for showing their programs don't work. The publication bias against "Well, we tried to improve our practice but it is still dysfunctional" is strong.

What are the data?

Most of it is from small pieces of oncology, mostly surgery. There is remarkably little known about how medical and radiation oncologists practice in their offices, and the processes and results of their care.

Litwin and associates tried to improve the care and lower the costs of radical prostatectomy.4 After implementation of a clinical pathway, length of stay (LOS) in the hospital decreased from 5 to 3.6 days, and costs decreased from $7916 to $6934, a 12% savings. Most of this was due to standardized preoperative and postoperative management. In one of those "let the buyer beware" statistical reviews, they compared the LOS during the pre- and post-implementation periods. (There is no other way to do it. Randomized trials of this sort of experiment are impossible.) The hospital LOS had already begun to decrease from 7, to 6.1, to 5, to 3.6 days, even before the guideline. Is the decline real? Certainly. Due to the clinical pathway? Some part, almost certainly.
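For readers who like to check the arithmetic, the reported figures work out as follows (a sketch using only the numbers quoted above; the code is illustrative, not from the study):

```python
# Reported per-case costs before and after the clinical pathway (Litwin et al.)
pre_cost, post_cost = 7916, 6934
savings = (pre_cost - post_cost) / pre_cost
print(f"Cost savings: {savings:.1%}")  # prints "Cost savings: 12.4%", the ~12% cited

# LOS was already falling in successive periods before the guideline took effect
los_by_period = [7.0, 6.1, 5.0, 3.6]  # days
drops = [round(a - b, 1) for a, b in zip(los_by_period, los_by_period[1:])]
print("Period-to-period LOS declines (days):", drops)  # [0.9, 1.1, 1.4]
```

The accelerating decline in the last interval is what makes attributing "some part" of the effect to the pathway plausible, while the pre-existing trend cautions against crediting it entirely.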

They next asked, were patients harmed? After clinical pathway implementation, formal quality-of-life measures stayed high, as did patient satisfaction; however, there were no comparable data from before the clinical pathway.5 Of note, the decreased LOS did not lower patient satisfaction. (Or, alternatively, the instrument is not sensitive, or the differences were missed.)

And further improvements can be made, based on data from Edmonton, Alberta, where patients undergoing transurethral resection of the prostate (TURP) were compared in a home vs. hospital stay study. There were no differences in any important medical outcomes: readmission to hospital, use of health care services, complications, etc. were all the same. Even patient satisfaction stayed good. Based on the success of the pilot program, this early discharge program was implemented community-wide.6

One Georgia cancer group redesigned their practice to put medical, radiation, surgical, and all other services in one system. They made clinical pathways and treatment protocols for all common illnesses, tracked them, and had final accountability for making sure things got done well and inexpensively. Now, this sounds too good to be true: 1) in three years, the practice grew from 16 to 24 physicians, and the number of offices increased from 12 to 17; 2) physician compensation went up 20%; 3) the cost of service was reported to be reduced by 50%, mostly through reduced hospitalization; 4) patient encounters doubled; and 5) clinical research referrals went up 300%. According to Feinberg and Feinberg, this model has preserved autonomy, decreased variance, and facilitated clinical research.7 However, this is the equivalent of an abstract of a promising phase II trial, with little actual data presented, and it needs to be confirmed. I wonder how much had to do with entrepreneurship and accountability, two essential components of successful programs, as much as with the treatment protocols.

On the academic cancer side, Morris and colleagues at M. D. Anderson saw the need to reduce variance and cut costs.8 So, in 1994 they established practice guidelines and collaborative care paths for gynecologic oncology surgery. Headed by a physician who really wanted to improve care, the team included four gynecologic oncologists and one nurse practitioner. There was a strong administrative mandate to control costs and maximize patient outcomes. They maintained accountability by documentation along each care path, including standard forms, standard data collection sets, patient education forms, etc.

The comparison is also one of those "let the buyer beware" convenience samples, but there is no other way to do it. The results from the first 30 patients following the path were compared to 29 patients matched for age, indications for surgery, stage, and attending surgeon. These patients were chosen from the time when the care paths were being discussed, to see if there was a "Hawthorne Effect" of better behavior while under observation. A second control group of 73 patients was chosen from the time before the care paths were being discussed.


Results of the Care Path Implementation

[Table: outcomes of interest (laboratory tests, professional fees, and length of stay in days) compared across three groups: patients treated before path planning, during path planning, and on the path. All differences were significant at p < 0.002 or better; the numeric values in the original table are not recoverable here.]

Complication rates remained low, and patient satisfaction remained high, even with the shorter LOS. The care path, and being held accountable to results from it, improved outcomes, decreased length of stay, decreased costs, and kept patient satisfaction high. They further reduced length of stay to three days.

Morris et al made some specific suggestions:

· Involve the doctors who are providing the services. Care paths must be physician-driven to work.

· The team approach is essential. Docs are important, but don't know all that goes on.

· The paths should be based on outcomes (e.g., discharge when pain is controlled, not on day 4).

· The system must be able to track the outcome.

· Care paths should be defined using formal methods, based on evidence and consensus.

There are some other small-scale examples, all dealing mostly with surgery.9,10 I could not find any in the peer-reviewed literature for routine, non-surgical care.

Take Home Message

Disease management should work in most of cancer care, but the data are scarce. It does work with surgical care. We need more data about medical and radiation oncology practice, from actual chart audit about the processes and outcomes of care, before specific recommendations can be made. We can bet that disease management companies will continue to spring up, as long as there is substantial profit to be made in the system. And, we need to remain open to management models that promise to help the care, rather than just cut costs or maximize the profit at each step.


      1. Brook RH. Managed care is not the problem, quality is. JAMA 1997;278:1612-1614.

      2. Kurowski B. Six key challenges in oncology disease management. Dis Man 1998;1:99-101.

      3. The Boston Consulting Group: The promise of disease management, Boston, The Boston Consulting Group, Inc.; 1995.

      4. Litwin MS, et al. Cost-efficient radical prostatectomy with a clinical care path. J Urology 1996;155:989-993.

      5. Litwin MS, et al. Patient satisfaction with short stays for radical prostatectomy. Urology 1997;49(6):898-906.

      6. Wilson DE, et al. Caremap management for postoperative prostatectomy care at home: A comparative study. Can J Surg 1997;40(1):39-43.

      7. Feinberg B, Feinberg I. Overall survival of the medical oncologist: A new outcome measurement in cancer medicine. Cancer 1998;82(10 Sup):2047-2056.

      8. Morris M, et al. An outcomes management program in gynecologic oncology. Obstet Gynecol 1997;89:485-492.

      9. Katterhagen G. Physician compliance with outcome-based guidelines and clinical pathways in oncology. Oncology 1996;10:113-121.

    10. Patton MD, Katterhagen JG. Critical pathways in oncology: Aligning resource expenditures with clinical outcomes. J Oncol Man 1994;July/August:16-21.