To err is human: A case-based review of cognitive bias and its role in clinical decision making


Siraj Mithoowani
Andrew Mulloy
Augustin Toma
Ameen Patel

Abstract

Cognitive biases, or systematic errors in cognition, are important contributors to diagnostic error in medicine. In our review, we explore the psychological underpinnings of cognitive bias and highlight several common biases using clinical cases. We conclude by reviewing strategies to improve diagnostic accuracy and by discussing controversies and future research directions.

Résumé

Les biais cognitifs, ou erreurs systématiques de la cognition, sont des contributeurs importants à l'erreur diagnostique en médecine. Dans notre revue, nous explorons les fondements psychologiques du biais cognitif et soulignons plusieurs biais courants à l'aide de cas cliniques. Nous concluons en examinant les stratégies visant à améliorer la précision diagnostique et en discutant des controverses et des orientations futures de la recherche.


Research in the field of behavioural psychology and its application to medicine has been ongoing for several decades in an effort to better understand clinical decision making.1 Cognitive biases (systematic errors in cognition) are increasingly recognized in behavioural economics2 and more recently have been shown to affect medical decision making.3 Over 100 such cognitive biases have been identified and several dozen are postulated to play a major role in diagnostic error.4 Cognitive errors can take many forms and in one study contributed to as many as 74% of diagnostic errors by internists.5 Most of these errors were due to “faulty synthesis” of information, including premature diagnostic closure and failed use of heuristics.5 Inadequate medical knowledge, on the other hand, was rare and mostly identified in cases concerning rare conditions.5 Professional organizations such as the Royal College of Physicians and Surgeons of Canada and the Canadian Medical Protective Association have since been working to raise awareness of cognitive bias in clinical practice.6

In our review, we explore the role of cognitive bias in diagnostic error through the use of clinical cases. We also review the literature on de-biasing strategies and comment on limitations and future directions of research.

The Dual Process Theory

A prevailing theory to explain the existence of cognitive bias is the dual process theory, which asserts that two cognitive systems are used in decision making, herein called System 1 and System 2 (Table 1).2,7

System 1 can be thought of as our intuitive mode of thinking. It generates hypotheses rapidly, operates below conscious awareness and makes judgments that are highly dependent on contextual cues. System 1 is characterized by heuristics (shortcuts, or “rules of thumb”) and is an important component of clinical judgment or expertise. In contrast, System 2 is slow, deliberate, analytical and more cognitively demanding. It applies rules that are acquired through learning and can play a “monitoring role” over System 1, overriding heuristics when their use is inappropriate.

The dual process theory implies that errors result when inappropriate judgments generated by System 1 fail to be recognized and corrected by System 2. Maintaining constant vigilance over System 1 would be both impractical and time consuming for routine decisions and would diminish the value of intuition. It follows that a more practical way of improving reasoning is to identify the most common biases of System 1 and to recognize situations when mistakes are most likely to occur.2

Alternative Theories of Cognition

Variations of dual process theory have further refined our understanding of medical decision making. Fuzzy trace theory, for example, proposes that individuals process information through parallel gist and verbatim representations.8 The “gist” is analogous to System 1 and represents the bottom-line “meaning” of information. This representation is subject to an individual’s worldview, emotions and experiences. In contrast, verbatim representations are precise, literal and analogous to System 2. Fuzzy trace theory is particularly useful in explaining how patients might interpret health information. Proponents of this theory contend that in order for information to lead to meaningful behavioural change, physicians must appeal to both gist and verbatim representations when communicating with patients.8 Other models, such as dynamic graded continuum theory, do away with the dichotomy of System 1 and System 2 and instead represent implicit, automatic and explicit cognitive processes on a continuous scale.9 These single system models are useful to compare against dual process theory but have not replaced it as a well-established framework for understanding and mitigating cognitive bias in clinical decision making.7

Case 1: A 55-Year-Old Male with Retrosternal Chest Pain

A 55-year-old non-smoking male was assessed in a busy Emergency Department (ED) for retrosternal chest pain. His past medical history was significant for osteoarthritis, for which he took naproxen. On review of his history, the patient had made multiple visits for retrosternal chest pain in the previous two months. At each encounter, he was discharged home after a negative cardiac workup.

Vital signs in the ED were within normal limits except for sinus tachycardia at 112 beats per minute. On exam, the patient was visibly distressed. Cardiac and respiratory exams were normal. There was mild tenderness in the epigastrium. Basic blood-work revealed leukocytosis (16.0 × 10⁹/L), a mildly elevated high sensitivity cardiac troponin, and no other abnormalities. An ECG revealed T wave flattening in leads V3–V4.

The patient was referred to the internal medicine service with a diagnosis of non-ST-elevation myocardial infarction and treated with aspirin, clopidogrel, and fondaparinux. Several hours later, the patient became more agitated and complained of worsening retrosternal and epigastric pain. On re-examination, his heart rate had increased to 139 beats per minute, his blood pressure had dropped to 77/60 mmHg, and he had a rigid abdomen. Abdominal radiography revealed free air under the right hemi-diaphragm and the patient was rushed to the operating room, where a perforated gastric ulcer was detected and repaired.

The case above illustrates numerous cognitive biases, including:

1. Premature diagnostic closure: the tendency to accept a diagnosis before it is fully verified.4

2. Anchoring: the tendency to over-emphasize features in the patient’s initial presentation and to fail to adjust the clinical impression after learning new information.4

3. Confirmation bias: the tendency to look for confirming evidence to support a diagnosis, rather than to look for (or explain) evidence which puts the diagnosis in question.4

In this case, the physician based the diagnosis of myocardial infarction primarily on symptoms of chest pain and an elevated cardiac troponin. However, several other objective findings were present and when taken together, suggested a diagnosis other than myocardial infarction. These included a tender epigastrium, leukocytosis, and resting sinus tachycardia. These symptoms/signs were not explicitly explained or investigated before a treatment decision was made. Premature diagnostic closure is one of the most common cognitive biases underlying medical errors5 and it affects clinicians at all levels of training.10 It is multifactorial in origin5 and is especially common in the face of other cognitive biases such as anchoring and confirmation bias.

The physician in this case “anchored” to a diagnosis of cardiac chest pain given the patient’s previous ED visit history and his/her best intentions of ruling out a “worst case scenario.” Anchoring can be especially powerful in the face of abnormal screening investigations that have been reviewed even before the physician has acquired a history or performed a physical examination. If the physician had reviewed the screening investigations before seeing the patient, he/she might have narrowed the differential diagnosis prematurely, failed to gather all the relevant information and failed to adjust the clinical impression based on new information.

The physician demonstrated confirmation bias by failing to explain the abnormalities that put the diagnosis of myocardial infarction in question (e.g. tender epigastrium, leukocytosis). Confirmation bias arises from an attempt to avoid cognitive dissonance, a distressing psychological conflict which occurs when inconsistent beliefs or theories are held simultaneously.11 In one study evaluating clinical decision making amongst 75 psychiatrists and 75 medical students,12 13% of psychiatrists and 25% of medical students demonstrated confirmation bias when searching for information after having made a preliminary diagnosis. In this study, confirmation bias resulted in more frequent diagnostic errors and predictably impacted subsequent treatment decisions.

An appropriate consideration of all diagnostic possibilities is the first step in avoiding diagnostic error. While acquiring information, physicians should step back and consolidate new data with the working diagnosis, as failure to do so can result in confirmation bias.13 All abnormal findings and tests, especially those considered clinically relevant, should be explained by the most probable diagnosis. An alternate diagnosis, or the possibility of more than one diagnosis, should be considered when an abnormal finding or test cannot reasonably be explained by the working diagnosis.

Tschan et al observed a team of physicians working through a simulated scenario involving diagnostic ambiguity.14 Two approaches were found to be effective in reducing the effect of confirmation bias: explicit reasoning and talking to the room. Explicit reasoning involves making causal inferences when interpreting and communicating information. Talking to the room is a process whereby diagnostic reasoning is explained in an unstructured way to a team member or colleague in the room. This gives the clinician the opportunity to elaborate on their thoughts, and observers the opportunity to point out errors or suggest alternate diagnoses within a shared mental model.

Case 2: A 30-Year-Old Male with Confusion and Seizures

A 30-year-old homeless male is found confused on the street by paramedics and brought to the ED for assessment. Empty bottles of alcohol are noted at the scene. The CIWA (Clinical Institute Withdrawal Assessment for Alcohol) protocol is initiated and he is given several doses of lorazepam to minimal effect. Several hours after the patient is admitted, a resident on-call is paged for elevated CIWA scores on the basis of diaphoresis and agitation. Several additional doses of lorazepam are ordered, which fail to completely resolve the symptoms. Gradually, the patient becomes more obtunded. The on-call resident orders a capillary blood glucose, which measures 1.1 mmol/L. Intravenous D50W is promptly administered, the blood glucose normalizes and the patient’s level of consciousness improves.

The case above illustrates the following biases:

1. Availability bias: the tendency to weigh a diagnosis as being more likely if it comes to mind more readily.4

2. Diagnostic momentum: the tendency for labels to “stick” to patients and become more definite with time.4

Although the symptoms of diaphoresis and agitation are not specific to alcohol withdrawal, this diagnosis was deemed most likely based on how readily it came to mind, the empty alcohol bottles at the scene, and potentially on the patient’s demographics. The unproven diagnosis of alcohol withdrawal “stuck” with the patient despite minimal improvement after a therapeutic trial of benzodiazepines.

Availability bias has been shown to affect internal medicine residents. In one single-centre study,15 18 first-year and 18 second-year residents were exposed to case descriptions with associated diagnoses as part of an exercise. They were then asked to diagnose a series of new cases, some of which appeared similar to those they had previously encountered but with pertinent differences that made an alternate diagnosis more likely. Second-year residents had lower diagnostic accuracy on these similar-appearing cases, a result consistent with availability bias. First-year residents were less prone to this bias because of their limited clinical experience. Most importantly, subsequent reflective diagnostic reasoning countered the bias and improved accuracy.

General Strategies to Avoid Cognitive Bias

Interventions aimed at mitigating diagnostic error due to cognitive bias take several approaches.

1. Improving clinical reasoning

2. Reducing cognitive burden

3. Improving knowledge and experience

Despite a large number of proposed interventions, there is a lack of empirical evidence supporting the efficacy of many de-biasing strategies.16 What follows is a brief review of the current evidence.

Improving Clinical Reasoning

Several “de-biasing” strategies have been proposed to improve clinical reasoning. These strategies assume that System 1 processes are more prone to bias because of their heavy reliance on heuristics, and that the solution is therefore to activate System 2 at critical points in decision making. De-biasing occurs in several stages: first, an individual is educated about the existence of a cognitive bias; next, they employ strategies to eliminate that bias; finally, they maintain those strategies in the long term.17

Metacognition, or “thinking about thinking,” involves reflecting on one’s own diagnostic reasoning. Internal reflection along with awareness of potential biases should allow the clinician to identify faulty reasoning. However, the evidence underlying reflective practice is mixed.16 Several studies have tried to encourage reflective practice and System 2 processes by instructing participants to proceed slowly through their reasoning18 or by giving participants the opportunity to review their diagnoses.19 These studies have found minimal or no impact on reducing the rate of diagnostic error. On the other hand, some studies have shown improved diagnostic accuracy when physicians are asked to explicitly state their differential diagnoses along with features that are consistent or inconsistent with each diagnosis.20 These results suggest that if reflective practice is to be effective, it must involve a thorough review of the differential diagnosis as opposed to simply taking additional time.

Reducing Cognitive Burden

Tools that reduce the cognitive burden placed on physicians may reduce the frequency of diagnostic errors. One suggestion has been to incorporate the use of checklists in the diagnostic process. These checklists would be matched to common presenting symptoms and include a list of possible diagnoses. One randomized controlled trial failed to show a statistically significant reduction in the diagnostic error rate with the use of checklists, except in a small subgroup of patients treated in the ED.21 These findings challenge the results of two other studies that found checklists to be effective in improving scrutiny22 and diagnostic accuracy23 when interpreting electrocardiograms. More advanced forms of clinician decision support systems have also been studied.24 Software programs such as DXplain generate a list of potential diagnoses based on a patient’s chief complaint. In one study, when the software provided physicians a list of possible diagnoses before evaluating patients, diagnoses were 1.31 times more likely to be correct.25 The use of diagnostic support tools may grow in the future as they are integrated into electronic medical record systems.

Improving Knowledge and Experience

A combination of experience, knowledge and feedback is integral to developing the clinical intuition that produces the best hypotheses. Experience without feedback can lead to overconfidence, which is itself a cognitive bias. The evidence supporting feedback is strong. Fridriksson et al showed a significant reduction in diagnostic error when referring doctors were provided feedback on the identification of subarachnoid hemorrhage.26 A systematic review of 118 randomized trials concluded that feedback was effective in improving professional practice.27 The specific characteristics of the most effective feedback remain unclear. In general, however, feedback was thought to be most effective when it was explicit and delivered close to the time of decision making.

Conclusions

In our review, we explore clinical decision making through the lens of dual-process theory. However, multiple dual-processing models are still being explored and fundamental questions remain under debate. For example, some experts believe that instead of focusing on de-biasing strategies, the key to improving intuitive (System 1) processes is simply to acquire more formal and experiential knowledge.19 Other unanswered questions include the impact and magnitude of cognitive bias in actual clinical practice, which biases are most prevalent in each medical specialty, and which strategies are most effective in mitigating bias. Further study is also needed to assess the impact of novel educational methods, such as case-based and simulation-based learning, which are promising avenues through which trainees may identify and correct cognitive biases in a directly observed setting.

References

1. Norman G. Research in clinical reasoning: past history and current trends. Med Educ 2005;39:418–27.

2. Kahneman D. Thinking, fast and slow. Farrar, Straus and Giroux; 2011.

3. Croskerry P. From mindless to mindful practice — cognitive bias and clinical decision making. N Engl J Med 2013;368:2445–8.

4. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine 2003;78:775–80.

5. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9.

6. Parush A, Campbell C, Hunter A, et al. Situational awareness and patient safety - a short primer. Ottawa ON: The Royal College of Physicians and Surgeons of Canada; 2011.

7. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online 2011;16.

8. Reyna VF. A theory of medical decision making and health: fuzzy trace theory. Med Decis Making 2008;28:850–65.

9. Osman M. An evaluation of dual-process theories of reasoning. Psychon Bull Rev 2004;11:988–1010.

10. Dubeau CE, Voytovich AE, Rippey RM. Premature conclusions in the diagnosis of iron-deficiency anemia: cause and effect. Med Decis Making 1986;6:169–73.

11. Nickerson RS. Confirmation bias: A ubiquitous phenomenon in many guises. Rev Gen Psychol 1998;2:175.

12. Mendel R, Traut-Mattausch E, Jonas E, et al. Confirmation bias: why psychiatrists stick to wrong preliminary diagnoses. Psychol Med 2011;41:2651–9.

13. Pines JM. Profiles in patient safety: confirmation bias in emergency medicine. Acad Emerg Med 2006;13:90–4.

14. Tschan F, Semmer NK, Gurtner A, et al. Explicit reasoning, confirmation bias, and illusory transactive memory: a simulation study of group medical decision making. Small Group Res 2009;40:271–300.

15. Mamede S, van Gog T, van den Berge K, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA 2010;304:1198–203.

16. Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf 2012;21:535–57.

17. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013.

18. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med 2014;89:277–84.

19. Monteiro SD, Sherbino J, Patel A, Mazzetti I, Norman GR, Howey E. Reflecting on diagnostic errors: taking a second look is not enough. J Gen Intern Med 2015;30:1270–4.

20. Bass A, Geddes C, Wright B, Coderre S, Rikers R, McLaughlin K. Experienced physicians benefit from analyzing initial diagnostic hypotheses. Can Med Educ J 2013;4:e7–e15.

21. Ely JW, Graber MA. Checklists to prevent diagnostic errors: a pilot randomized controlled trial. Diagnosis 2015;2.

22. Sibbald M, de Bruin ABH, Yu E, van Merrienboer JJG. Why verifying diagnostic decisions with a checklist can help: insights from eye tracking. Adv Health Sci Educ Theory Pract 2015;20:1053–60.

23. Sibbald M, de Bruin ABH, van Merrienboer JJG. Checklists improve experts' diagnostic decisions. Med Educ 2013;47:301–8.

24. Garg AX, Adhikari NKJ, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005;293:1223–38.

25. Kostopoulou O, Rosen A, Round T, et al. Early diagnostic suggestions improve accuracy of GPs: a randomised controlled trial using computer-simulated patients. Br J Gen Pract 2015;65:e49–54.

26. Fridriksson S, Hillman J, Landtblom AM, Boive J. Education of referring doctors about sudden onset headache in subarachnoid hemorrhage. A prospective study. Acta Neurol Scand 2001;103:238–42.

27. Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD. Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Qual Saf Health Care 2006;15:433–6.
