Thinking about thinking.
Because the patient was homeless and had nystagmus, he was assumed to be intoxicated, and a blood alcohol level was drawn. When it came back zero, a neurologist was consulted. The neurologist noted vertical nystagmus, slight impairment of lateral gaze and, upon walking the patient, pronounced midline ataxia. Acute Wernicke's disease was diagnosed, and the patient made a complete response to thiamine.
A young woman was seen in the emergency department, agitated and crying, complaining of tight limbs and an inability to keep her neck from spontaneously extending. She had been ill with the "flu."
A diagnosis of hysteria and anxiety was made. Another physician, passing by the room, immediately recognized the posturing and elicited a history of past prochlorperazine use for vomiting. A diagnosis of acute dystonic reaction due to phenothiazine was made, and the patient responded quickly and completely to anticholinergic medication.
Virtually every honest medical worker will admit to having stereotyped patients as "hysterical female," "drunk," "homeless," "secondary gain," "malingerer," "crock" or "frequent flier."
Those who say they would never put such a label on a patient are likely never to have been fatigued, overworked or stressed in a busy emergency department at 2 a.m. on a weekend.
As physicians, part of our job is to make a diagnosis. And especially in an emergency department, there is a significant pressure to make it quickly and accurately. In general, we do quite well because of our human ability to see patterns and to look at constellations of symptoms and signs as being part of a specific diagnosis, be they Wernicke's disease, acute dystonic reactions, congestive heart failure, gastroenteritis or strep throat.
Unfortunately, as with the "face on Mars"--which is nothing more than a geological feature--we sometimes see the wrong pattern, or a pattern where none exists.
Ascertainment bias occurs when prior expectation shapes our thinking. In our training, we see many patients who fit various uncomplimentary stereotypes. We may take this sort of thinking further by blaming patients for their illnesses: motorcycle accident victims for not wearing helmets, psychiatric patients who stop taking their medication for relapses of their underlying condition, or obese individuals on whom we have difficulty performing procedures.
This judgmental behavior, which may affect both our interaction with the patient and the treatment, is called fundamental attribution error. It is most human to behave in this fashion, but it may lead to errors of omission.
Those who doubt that they themselves have behaved this way should answer this question: "What is your first thought when asked to see a consult on the psychiatric (or corrections) unit?"
A middle-aged man presented to an ED with syncope. No obvious cardiac or neurological cause was found. A young physician elicited the history that the patient had driven 1,500 miles nearly nonstop two days previously. Recalling an association between pulmonary embolus and syncope, the physician ordered a lung scan, which revealed a massive perfusion defect.
Most cases of syncope, of course, are not due to pulmonary embolism; indeed, many are difficult to diagnose at all. The fact that a diagnostic workup may often not reveal any abnormality, however, should not dissuade an appropriate attempt to find one.
Violation of the unpacking principle (failing to elicit all relevant information) may lead to a missed diagnosis.
A related issue is "Yin-Yang out," a tendency to believe nothing further can be learned by studying the patient ("They've been worked up the (or 'out the') Yin-Yang").
This may ultimately prove true, but adopting such a strategy at the outset invites a variety of errors. (1)
No one's immune
This collection of heuristics, biases and familiar patterns of diagnosis has been called cognitive dispositions to respond (CDRs) by Croskerry. (1-3) His three recommendations were to:
1. Recognize that these errors are common and affect all medical workers. Physicians are not immune from biased thinking any more than they are immune to disease. Biases aren't inherently bad, but some may lead us in the wrong direction.
2. Remove the sense of inevitability that these cognitive errors have to occur at the same frequency that they currently do.
3. Remove the pessimism that we can't decrease cognitive bias or errors. The fact that something is difficult to achieve does not mean that it is impossible. We have made strides in technology and in human behavior that a generation ago would have been considered impossible.
Here are some approaches that begin to deal with the problem.
* Develop awareness and insight into the issue
Simply being aware that we are prone to cognitive bias that could adversely affect diagnosis and management of patients is important and a significant step.
* Ask yourself about biases
"Am I biased because of how I feel about the patient, what happened to me yesterday, or what happened in a similar case 10 years ago after which I felt badly or was sanctioned?"
Asking this question may stop or slow diagnostic momentum and allow for consideration of other possibilities. Diagnostic momentum is the tendency for initial statements to snowball into a diagnosis that may not be appropriate.
Another homeless patient may have a potentially treatable condition missed. Chest pain can turn into "crushing" chest pain, and the continued use of the language by others seeing the patient may snowball to the point where the patient is considered to have a myocardial infarction before it is truly proven.
"Anxiety" may be considered psychogenic as the descriptive term is passed from caregiver to caregiver. Anxiety often is psychogenic; occasionally, however, it is the presenting manifestation of acute respiratory insufficiency.
Take a few moments to examine the thinking process, not just what needs to be done.
* Decrease reliance on memory
Human memory is excellent, but it is subject to selective bias. If you have ever kept a diary, it is worth rereading it to see how different a specific situation was from the one you remember.
* Try to minimize time pressures
If you are too busy to do something right, you are too busy. Not everything is a life-or-death emergency; insisting on doing everything immediately raises costs that are not only monetary but personal, through stress.
* Try to remove the authority gradient (3)
Most physicians can recall being embarrassed by a nurse or family member who turned out to have made the correct diagnosis. In aviation, crew resource management is a technique used to marshal all available knowledge in the event of a crisis. When in doubt, consider asking for advice, even from those junior to you.
* Develop a feedback system to find out what happens to specific patients (4)
Most physicians are grateful to learn what happened to an individual they saw only once. Often, the initial diagnosis was changed or modified and the information never conveyed to the first treating physician.
As long as feedback is used to learn, and not to judge (remember hindsight bias), it should be helpful and likely received with gratitude. In medicine, we do learn from anecdotes about specific patients, for they are powerful, often staying with us for years. But if inaccurate or remembered poorly, the conclusions will be wrong, staying with us for years, too!
* Know your "lawyer zones"
A lawyer zone is an area of the body or a diagnosis in which pathology is difficult to detect, easily missed and disastrous when missed. Examples include the apex of the lung, the axilla on a mammogram, the navicular bone, the head of the pancreas, the ovary and the sudden, severe headache.
1. Croskerry P. "The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them." Academic Medicine 78(8), 2003.
2. Croskerry P. "The Cognitive Imperative: Thinking About How We Think." Academic Emergency Medicine 7(11), 2000.
3. Croskerry P. "Profiles in Patient Safety: Authority Gradients in Medical Error." Academic Emergency Medicine, 11(12), 2004.
4. Croskerry P. "The Feedback Sanction." Academic Emergency Medicine, 7(11), 2000.
By Michael S. Smith, MD, MS
Michael S. Smith, MD, MS, a statistician, wants to help people in the medical community use statistics to make better, faster and easier decisions. He is self-employed and may be reached at 520-410-7917 or email@example.com
Safety Check column, May 1, 2005.