A Catalogue of Cognitive Dispositions to Respond (CDRs) that May Lead to Diagnostic Error

Aggregate bias: When physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are invoking the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission; e.g., ordering x-rays or other tests when guidelines indicate none are required.

Anchoring: The tendency to lock on perceptually to salient features in the patient's initial presentation too early in the diagnostic process and to fail to adjust this initial impression in the light of later information. This CDR may be severely compounded by the confirmation bias.

Ascertainment bias: Occurs when a physician's thinking is shaped by prior expectation; stereotyping and gender biases are examples.

Availability: The disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time, it may be underdiagnosed.

Base-rate neglect: The tendency to ignore the true prevalence of a disease, either inflating or reducing its base rate, and distorting Bayesian reasoning. However, in some cases, clinicians may (consciously or otherwise) deliberately inflate the likelihood of disease, as in the strategy of "rule-out-worst-case scenario" to avoid missing a rare but significant diagnosis.
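
A brief worked example may make the distortion concrete. The test characteristics and prevalence below are hypothetical, chosen only for illustration; the Python sketch simply applies Bayes' theorem to show how a low base rate keeps the post-test probability far below the test's sensitivity:

```python
# Hypothetical illustration of base-rate neglect; the test characteristics and
# prevalence are invented for clarity, not drawn from the source.
sensitivity = 0.95   # P(positive test | disease)
specificity = 0.90   # P(negative test | no disease)
prevalence = 0.01    # true base rate of the disease

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
posterior = sensitivity * prevalence / p_positive
print(f"P(disease | positive test) = {posterior:.2f}")  # ~0.09, far below the ~0.95 often intuited
```

Ignoring the 1% prevalence amounts to treating the post-test probability as if it equaled the sensitivity of the test.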

Commission bias: Results from the obligation toward beneficence, in that harm to the patient can only be prevented by active intervention. It is the tendency toward action rather than inaction, and is more likely in overconfident physicians. Commission bias is less common than omission bias.

Confirmation bias: The tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.

Diagnosis momentum: Once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries (e.g., patients, paramedics, nurses, and physicians), what might have started as a possibility gathers increasing momentum until it becomes definite and all other possibilities are excluded.

Feedback sanction: A form of ignorance trap and time-delay trap CDR. Making a diagnostic error may carry no immediate consequences, as considerable time may elapse before the error is discovered, if ever, or poor system feedback processes prevent important information on decisions from getting back to the decision maker. The particular CDR that failed the patient persists because of these temporal and systemic sanctions.

Framing effect: How diagnosticians see things may be influenced strongly by the way in which the problem is framed; e.g., physicians' perceptions of risk to the patient may be influenced strongly by whether the outcome is expressed in terms of the possibility that the patient may die or that they might live. In terms of diagnosis, physicians should be aware of how patients, nurses, and other physicians frame potential outcomes and contingencies of the clinical problem to them.

Fundamental attribution error: The tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. Psychiatric patients, minorities, and other marginalized groups are especially likely to suffer from this CDR.

Gambler's fallacy: Attributed to gamblers, this fallacy is the belief that if a coin is tossed 10 times and is heads each time, the 11th toss has a greater chance of being tails (even though a fair coin has no memory). An example would be a physician who sees a series of patients with chest pain in clinic or the ED, diagnoses all of them with an acute coronary syndrome, and assumes the sequence will not continue.
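
A minimal simulation (the parameters are arbitrary and not part of the original text) makes the independence point concrete: for a fair coin, the chance of tails on the 11th toss is about 0.5 whether or not the first 10 tosses were heads.

```python
import random

# Sketch: estimate P(tails on toss 11 | first 10 tosses were heads) for a fair coin.
random.seed(0)
n_sequences = 2_000_000
streaks = 0             # sequences whose first 10 tosses were all heads
tails_after_streak = 0  # among those, how often toss 11 came up tails
for _ in range(n_sequences):
    if all(random.random() < 0.5 for _ in range(10)):  # first 10 tosses all heads
        streaks += 1
        tails_after_streak += random.random() < 0.5    # toss 11 is tails
print(f"{streaks} streaks; P(tails | 10 heads) ~ {tails_after_streak / streaks:.2f}")  # ~0.50
```

By the same logic, a run of patients with acute coronary syndrome does not, by itself, lower the pretest probability of that diagnosis in the next patient with chest pain.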

Gender bias: The tendency to believe that gender is a determining factor in the probability of diagnosis of a particular disease when no such pathophysiological basis exists.

Hindsight bias: Knowing the outcome may profoundly influence perception of past events and prevent a realistic appraisal of what actually occurred. In the context of diagnostic error, it may compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision maker's abilities.

Multiple alternatives bias: A multiplicity of options on a differential diagnosis may lead to significant conflict and uncertainty. The process may be simplified by reverting to a smaller subset with which the physician is familiar, but this may result in inadequate consideration of other possibilities. One such strategy is the three-diagnosis differential: "It is probably A, but it might be B, or I don't know [C]." While this approach has some heuristic value, if the disease falls into the C category and is not pursued adequately, the chances of making certain serious diagnoses are minimized.

Omission bias: The tendency toward inaction, rooted in the principle of nonmaleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that may be attributed directly to the action of the physician. The bias may be sustained by the reinforcement often associated with not doing anything, but may prove disastrous.

Order effects: Information transfer is a U-function: we tend to remember the beginning part (primacy effect) or the end (recency effect). The primacy effect may be augmented by anchoring. In transitions of care, where information transferred from patients, nurses, or other physicians is being evaluated, care should be taken to give due consideration to all information, regardless of the order in which it was presented.

Outcome bias: The tendency to opt for diagnostic decisions that will lead to good outcomes, rather than those associated with bad outcomes, thereby avoiding chagrin associated with the latter. It is a form of value bias in that physicians may express a stronger likelihood in their decision making for what they hope will happen rather than what they really believe might happen. This may result in serious diagnoses being minimized.

Overconfidence bias: There is a universal tendency to believe we know more than we do. Overconfidence reflects a tendency to act on incomplete information, intuitions, or hunches. Too much faith is placed in opinion instead of carefully gathered evidence.

Playing the odds: Also known as frequency gambling. The tendency in equivocal or ambiguous presentations to opt for a benign diagnosis on the basis that it is significantly more likely than a serious one.

Posterior probability error: Occurs when a physician's estimate for the likelihood of disease is unduly influenced by what has gone before for a particular patient. It is the opposite of the Gambler's fallacy in that the physician is gambling on the sequence continuing.
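
A hedged sketch (the probabilities below are invented for illustration) of how explicitly weighing the current findings can counter a strong prior built from a patient's past presentations:

```python
# Hypothetical numbers: the patient's previous headache presentations were all migraine,
# so the prior for "migraine again" is high; but suppose the current findings are three
# times as likely under a serious alternative diagnosis.
prior_migraine = 0.95
prior_serious = 1 - prior_migraine
p_findings_given_migraine = 0.10
p_findings_given_serious = 0.30

evidence = (p_findings_given_migraine * prior_migraine
            + p_findings_given_serious * prior_serious)
posterior_serious = p_findings_given_serious * prior_serious / evidence
print(f"P(serious cause | current findings) = {posterior_serious:.2f}")  # ~0.14, not negligible
```

Gambling on the sequence continuing amounts to acting on the 0.95 prior alone and never performing this update.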

Premature closure: A powerful CDR accounting for a high proportion of missed diagnoses. It is the tendency to close the decision-making process prematurely, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim, "When the diagnosis is made, the thinking stops."

Psych-out error: Psychiatric patients appear to be particularly vulnerable to the CDRs described in this list, and to other errors in their management, some of which may exacerbate their condition. They appear especially vulnerable to fundamental attribution error. A variant of psych-out error occurs when serious medical conditions are misdiagnosed as psychiatric conditions.

Representativeness restraint: Drives the diagnostician toward looking for prototypical manifestations of disease: "If it looks like a duck, walks like a duck, and quacks like a duck, then it is a duck." Yet restraining decision making along these pattern recognition lines leads to atypical variants being missed.

Search satisficing: Reflects the universal tendency to call off a search once something is found. Comorbidities, second foreign bodies, other fractures, and coingestants in poisoning may all be missed.

Sutton's slip: Takes its name from the apocryphal story of the Brooklyn bank robber Willie Sutton, who, when asked by a judge why he robbed banks, is alleged to have replied, "Because that's where the money is!" The diagnostic strategy of going for the obvious is referred to as Sutton's Law. The slip occurs when possibilities other than the obvious are not given sufficient consideration.

Sunk costs: The more clinicians invest in a particular diagnosis, the less likely they may be to release it and consider alternatives. This is an entrapment form of CDR more associated with investment and financial considerations. However, for the diagnostician, the investment is time and mental energy, and, for some, ego may be a precious investment.

Triage cueing: The triage process occurs throughout the health care system, from self-triage by patients to the selection of specialists by the referring physician. In the ED, triage is a formal process that results in patients being sent in particular directions that cue their subsequent management. Many CDRs are initiated at triage, leading to the maxim, "Geography is destiny."

Unpacking principle: Failure to elicit all relevant information (unpacking) in establishing a differential diagnosis may result in significant possibilities being missed.

Vertical line failure: Routine, repetitive tasks often lead to thinking in silos: predictable, orthodox styles that emphasize economy, efficacy, and utility. Though often rewarded, the approach carries the inherent penalty of inflexibility. In contrast, lateral thinking styles create opportunities for diagnosing the unexpected, rare, or esoteric. An effective lateral thinking strategy is simply to pose the question: "What else might this be?"

Visceral bias: The influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, involving both negative and positive feelings toward patients, may result in diagnoses being missed.

Yin-yang out: When patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been "worked up the yin-yang." The yin-yang out is the tendency to believe that nothing further can be done to throw light on the dark place where, and if, any definitive diagnosis resides for the patient; i.e., the physician is let out of further diagnostic effort.

Source: Adapted with permission from Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775-80. There is considerable overlap among CDRs, some being known by other names. These, together with further detail and citations for the original work, are described in Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184-1204; and Hogarth RM. Judgment and Choice: The Psychology of Decision. Chichester, England: Wiley; 1980.