Sildenafil Citrate for Erectile Dysfunction in Men Receiving Multiple Antihypertensive Agents
Source: Pickering TG, et al. Am J Hypertens. 2004;17:1135-1142.
Erectile dysfunction (ED) is common in middle-aged and older men, the same period of life during which the incidence of hypertension (HTN) begins to rise steeply. Because ED is widely recognized as a vasculopathy, in large part related to endothelial dysfunction, HTN can contribute to the burden of ED. Additionally, antihypertensive medications themselves can cause ED.
Sildenafil can produce dramatic, potentially disastrous reductions in blood pressure if administered concomitantly with organic nitrates (eg, nitroglycerin), and is thus contraindicated in that setting. The efficacy and safety of sildenafil in men already taking multiple antihypertensive agents have been little studied.
Pickering and colleagues studied middle-aged men with ED (n = 562), all of whom were taking 2 or more antihypertensives. Men were randomized to sildenafil (titrated to the most efficacious, best-tolerated dose) or placebo, and were monitored for medication efficacy as well as adverse events.
Sildenafil restored erectile function with efficacy similar to that seen in prior trials that included non-hypertensive patients. It was also well tolerated, and no sildenafil-related hypotension was identified. These data are encouraging that sildenafil may be used safely and effectively in men already receiving multidrug regimens for HTN.
Risk of Fracture after Androgen Deprivation for Prostate Cancer
Source: Shahinian VB, et al. N Engl J Med. 2005;352:154-164.
In the population of men with advanced prostate cancer (PCA) as a whole, androgen deprivation, achieved medically or by surgical orchiectomy, produces benefits in morbidity and mortality. In 2 distinct PCA populations—men with localized disease, and men with post-prostatectomy PSA elevations—gonadotropin-releasing hormone (GnRH) agonists are finding increased use, despite the lack of a proven survival advantage. Because androgen deprivation results in rapid bone loss, an increased risk of fracture would be anticipated. Data from the National Cancer Institute's Surveillance, Epidemiology, and End Results program and Medicare provided the information from which 50,613 men with prostate cancer could be studied.
Between 1992 and 1997, the relative risk of any fracture for men who had received 9 or more doses of a GnRH agonist was increased approximately 1.5-fold. The relative risk of fracture requiring hospitalization was even greater. Indeed, men who underwent orchiectomy or received more than 9 doses of a GnRH agonist in the year after diagnosis had a lower fracture-free survival rate than those who did not receive androgen deprivation. Noting these risks, Shahinian and colleagues suggest caution, at least in the distinct populations mentioned above, in the use of androgen deprivation until the benefits of treatment are more clearly delineated.
Chronic Stress Accelerates Ultraviolet-Induced Cutaneous Carcinogenesis
Source: Parker J, et al. J Am Acad Dermatol. 2004;51:919-922.
Long-term consequences of stress in human beings are difficult to measure, but excessive emotional stress is believed to contribute to important adverse health outcomes such as myocardial infarction. Little is understood about whether stress affects cancer. To elucidate this issue, Parker and colleagues studied carcinogenesis in animals.
Mice were stressed by being placed, without opportunity for escape, in a compartment containing the odor of fox urine (the fox being one of their natural predators) for 60 minutes daily. After 2 weeks of daily stress, the frequency was reduced to thrice weekly for 30 weeks. During the study period, animals were irradiated with UV light at doses known to induce skin lesions.
Animals exposed to stress began to manifest neoplasms dramatically earlier than control animals (8 weeks vs 21 weeks), and with greater frequency (a 5-fold increase in stress-subjected animals). Stressed animals in which tumors developed survived less well than controls who also developed tumors. Although animal studies do not translate directly to humans, these data support the conceptual framework of a deleterious impact of stress upon carcinogenesis, as well as upon survival from established carcinoma.
Air Contrast Barium Enema, CT Colonography, and Colonoscopy
Source: Rockey DC, et al. Lancet. 2005;365:305-311.
Clinicians share a common goal with respect to early diagnosis and treatment of colon cancer, yet the optimum pathway by which to reach this goal is controversial. Colonoscopy (COL), air contrast barium enema (ACBE), and computed tomographic colonography (CTC, also known as virtual colonoscopy) all have their advocates, but no comparative trial of these methods in the same population has yet been carried out.
Rockey and colleagues recruited a group of patients acknowledged to be at high risk for colon cancer: subjects with a positive FOBT, a recent history of rectal bleeding, iron deficiency anemia, or a strong family history of colon cancer. This selected population (n = 614) underwent all three investigations, completed within 14 days of the ACBE.
As you may have suspected intuitively, colonoscopy won out on all counts. For colonic lesions greater than 10 mm, there was no statistically significant difference between ACBE and CTC (sensitivities 48% and 59%, respectively), but COL had a sensitivity of 98%. A similar picture emerged for smaller lesions (6-9 mm). CTC did offer the diagnostic advantage that additional extracolonic abnormalities were discovered in over 50% of persons, although most of these abnormalities ultimately were not considered clinically important. On the other hand, 16 highly significant findings were uncovered on CTC: 12 abdominal aortic aneurysms and 4 malignant masses. The results of this trial support colonoscopy as the preferred investigation, though future evolution of CTC may ultimately challenge this conclusion.
Risk Stratification for In-Hospital Mortality in Acutely Decompensated Heart Failure
Source: Fonarow GC, et al. JAMA. 2005;293:572-580.
Congestive heart failure (CHF) remains the most common diagnosis resulting in hospital admission in the United States. Because the risk of mortality among CHF patients at the time of acute decompensation is substantial, it would be valuable to discern which individuals with acutely decompensated CHF are at greatest risk, and to direct correspondingly intensive management to their care (eg, intensive care unit monitoring rather than telemetry). Risk factors for increased mortality in the setting of chronic heart failure are fairly well defined, and include such parameters as age, ejection fraction, and BNP levels.
The ADHERE registry (Acute Decompensated Heart Failure National Registry) has captured data from hospitalized patients at 263 sites in the United States. Using data from 33,046 hospitalizations, a mortality risk factor profile was developed. Subsequently, a similar population (n = 33,229) was used to validate the risk stratification system derived retrospectively from the first population.
The single best predictor of mortality was an elevated BUN (> 43 mg/dL). The next best predictors were low SBP (< 115 mm Hg) and elevated creatinine (> 2.75 mg/dL). CHF patients stratified using these 3 readily measured items demonstrated a marked 12.9-fold increase in mortality odds ratio for the highest-risk group compared with the lowest-risk group, as assessed by the same markers. Identifying acutely decompensated CHF patients with unfavorable BUN, SBP, and creatinine at admission may provide a useful tool for risk stratification and resource allocation.
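The three thresholds form a simple bedside decision tree: BUN first, then SBP, then creatinine. A minimal sketch of that ordering follows; the function name and the tier labels are assumptions for illustration, not the published ADHERE nomogram, which reports observed mortality rates at each node rather than named tiers.

```python
def adhere_risk_tier(bun_mg_dl: float, sbp_mm_hg: float,
                     creat_mg_dl: float) -> str:
    """Illustrative tiering of an acutely decompensated CHF admission
    using the three ADHERE bedside markers (thresholds from the study:
    BUN > 43 mg/dL, SBP < 115 mm Hg, creatinine > 2.75 mg/dL).
    Tier labels are hypothetical, not from the source."""
    if bun_mg_dl <= 43:
        # BUN not elevated: SBP separates the two lower tiers.
        return "low" if sbp_mm_hg >= 115 else "intermediate"
    if sbp_mm_hg >= 115:
        # Elevated BUN but preserved SBP.
        return "intermediate"
    # Elevated BUN and low SBP: creatinine identifies the highest tier.
    return "high" if creat_mg_dl <= 2.75 else "highest"
```

A patient with all three adverse markers (eg, BUN 50 mg/dL, SBP 100 mm Hg, creatinine 3.0 mg/dL) lands in the highest tier, the group carrying the 12.9-fold mortality odds ratio relative to the lowest tier.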
Obesity, Weight Gain, and the Risk of Kidney Stones
Source: Taylor EN, et al. JAMA. 2005; 293:455-462.
The lifetime prevalence of kidney stones in American men and women (10% and 5%, respectively) merits attention from the clinical community to help identify modifiable risk factors. It has been suggested that insulin resistance, a consistent concomitant of obesity, favors formation of calcium stones and alters metabolism of ammonium, which can unfavorably impact urine pH. Overweight adults excrete more uric acid, which also may favor formation of urate stones.
In an effort to identify relationships of weight, weight gain, BMI, and waist circumference with subsequent kidney stone formation, Taylor and colleagues performed a prospective study that included more than 245,000 individuals, comprising cohorts from the Health Professionals Follow-up Study and the Nurses' Health Study.
Over a combined total of 46 years of follow-up, clear patterns of increased risk for stone formation emerged: men or women who weighed more than 220 pounds, men or women who gained more than 35 pounds after young adulthood (age 21 for men, 18 for women) regardless of the actual weight attained, and those with a BMI over 30 all had an increased relative risk of kidney stones.
Excessive weight gain and obesity are associated with an augmented risk for kidney stones, and provide another reason to intervene in patients who have weight management issues.