By Charles A. Pilcher MD FACEP
American Medical News published an informative essay by Kevin B. O’Reilly on December 13, 2010, about errors in diagnosis and why doctors make them. According to Gordon Schiff, MD, associate director of the Center for Patient Safety Research and Practice at Brigham and Women’s Hospital, “The problem of diagnostic errors has gotten short shrift in the broader patient safety movement.” The article focused on “thinking mistakes” as opposed to “system errors,” and was both refreshingly honest and depressingly true.
None of us is without error. We all make mistakes. Sometimes we can blame a fault of the “system,” but most often we have only ourselves to blame. So if we back up a step and ask, “What happened that led me to make the error for which I must now accept blame?” we begin to learn something about ourselves as physicians – and maybe even as attorneys, too.
But I’ll get to that in a moment.
Another recent article, in the New England Journal of Medicine by Dr. David C. Ring, has garnered a lot of press. In it, Dr. Ring recounts the time he performed the wrong operation on a patient (a carpal tunnel release instead of the intended trigger finger release). While the scenario leading up to the error was evaluated in detail – communication errors, personnel changes in the OR, last patient of the day, etc. – the factors analyzed seem to be superficial excuses. The article fails to mention the overriding fact that the surgery schedule that day was simply too busy. The department was trying to operate – literally – at more than capacity. There was no margin. There was no time to regroup, to thoughtfully consider next steps, to ensure that everyone was on the same page and all was in order.
Margin is crucial. That’s why emergency departments are such hectic, potentially high-risk places to work. The ED doesn’t have a “surge protector.” Staff can’t be scheduled for the maximum anticipated volume, only for the average. Even then there is down time, and the better staffed the department, the more down time there is. Staffing to the average means that some days there’s simply no margin, and it’s on those days that the opportunities for diagnostic error need to be monitored most closely.
Back to the American Medical News article…
Error occurs. About 5% of autopsies find clinically significant conditions that were missed and could have affected the patient’s survival, according to O’Reilly. And 40% of malpractice suits are for “failure to diagnose.” These are rarely “system errors,” like misfiling a pathology report showing that a tumor was malignant; more often they are “thinking errors.”
There are several reasons why we make mistakes in our thought processes even when we have the knowledge and ability to think correctly. As listed in a 2003 article in Academic Medicine, these “thinking errors” include:
- Anchoring bias – locking on to a diagnosis too early and failing to adjust to new information.
- Availability bias – thinking that a similar recent presentation is happening in the present situation.
- Confirmation bias – looking for evidence to support a preconceived opinion rather than looking for information to prove oneself wrong.
- Diagnosis momentum – accepting a previous diagnosis without sufficient skepticism.
- Overconfidence bias – over-relying on one’s own ability, intuition, and judgment.
- Premature closure – similar to confirmation bias, but closer to simply jumping to a conclusion.
- Search-satisfying bias – the “eureka” moment that stops all further thought.
The most fascinating and most common of these is “anchoring bias.” According to Dr. Schiff, “We jump to conclusions. We always assume we’re thinking about things in the right context, and we may not be. We don’t do a broader search for other possibilities.”
As thinking errors move to the forefront of patient safety, many medical schools are beginning to teach “metacognition,” or “thinking about thinking.” The busier the OR or the ER gets, the more important this becomes. It’s second nature to work up a chest pain patient for an MI when the waiting room is full, but it’s more important than ever to keep a broader perspective and consider a couple of other killers, such as pulmonary embolism and dissecting aortic aneurysm.
Some experts say that information technology will help us overcome our biases, broaden our perspective and avoid diagnostic errors. Perhaps. But health IT has its own biases. Remember GIGO – garbage in, garbage out. A simple example is over-reliance on “template charting,” whether electronic or on paper. Let’s say the patient tells the triage nurse, “I’ve been vomiting and my chest hurts.” If one chooses the template for “Vomiting,” “Gastroenteritis,” or “Abdominal Pain” too early, one can easily lead oneself and others astray, overlooking the fact that what the patient really meant to say at triage was “I started having this heavy chest pain and have been vomiting ever since.” If the template is too focused, the patient may well be discharged with an undiagnosed MI – or worse.
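To make the GIGO point concrete, here is a minimal sketch of how an early template choice constrains what ever gets asked. Everything in it – the template names, the prompted fields, the `chart` function – is invented for illustration and is not how any real EHR works:

```python
# Hypothetical illustration: once a charting template is chosen, only
# its prompts are ever surfaced. The templates below are invented.

TEMPLATES = {
    "Vomiting": ["onset", "frequency", "blood in vomitus", "last oral intake"],
    "Chest Pain": ["onset", "character", "radiation", "diaphoresis",
                   "associated vomiting", "cardiac risk factors"],
}

def chart(template_name: str, complaint: str) -> None:
    """Print the prompts the chosen template surfaces for a complaint."""
    prompts = TEMPLATES[template_name]
    print(f"Complaint: {complaint!r}")
    print(f"Template : {template_name}")
    for prompt in prompts:
        print(f"  asks about: {prompt}")
    # The bias: nothing outside the template is ever asked.
    if "cardiac risk factors" not in prompts:
        print("  (no cardiac questions will ever be prompted)")

# Triage hears "vomiting" first and anchors on that template...
chart("Vomiting", "I've been vomiting and my chest hurts")
# ...even though the chest pain template would have prompted the
# very questions that catch an MI.
chart("Chest Pain", "I started having heavy chest pain, then vomiting")
```

The point of the sketch is that the “garbage in” happens at the moment of template selection: every downstream question, and therefore every answer in the chart, is filtered through that first anchoring choice.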
“Thinking problems” can be at least partially avoided simply by being aware that they exist. And “metacognition,” practiced by both physicians and attorneys, can help both make fewer “diagnostic errors.”