I am almost never right: a doctor admits how he thinks

One of the principles that drives my curiosity is that the first and second (and third) thing I think is likely wrong.  Driving past this insecurity into a confident stance of ignorance is hard.  At the foundation is accepting my failures and using them as a springboard toward my next magnificent failure.  When I am finally right, it is a pleasant surprise.

As a doctor, the time scale that my wrongness exists on is compressed.  When you come to me and I have to figure out why, for example, your feet are swelling up, my mind creates a list of things I think I know.  We call it a differential diagnosis.  I start with things that are common or things that are likely to kill you, like heart failure or a blood clot in your leg veins.  I’ll ask more questions to find out if you are at risk for these conditions.

Multitasking is a myth.

Dr. O, to justify his own shortcomings.

While I am thinking, I might not hear what you say because I am working out a plan to make sure I’m not going to kill you or let you die.  Doctors are smart, but no one can pay full attention to two separate stimuli at the same time.  Multitasking is a myth.

I’ll try to switch back and forth between hearing your concerns and listening to my own thoughts fast enough to consider both of them, but this is hard.  The more practice I have had, the less I have to think and the more I may be able to listen… or the less I will listen, because the more I will think I already know what you are going to say.

Other than immediately life-threatening causes of foot swelling, I might also have some thoughts about obstruction of your lymphatics or maybe failure of the valves in your legs that stop blood from flowing backwards.  These thoughts are like pitchers in the bullpen… except that they are NOT like pitchers in the bullpen.  They cannot throw pitches unless I pay attention to them.  There is very little, if any, unconscious processing of complex ideas.  I have to attend to every thought so it can make a difference.

The common story that our brains think primarily unconsciously, while the thoughts we notice are just the tip of the iceberg, is a convenient illusion.

Dr. O, to justify his apparent inability to figure things out without trying.

I have to intentionally practice for years as a medical student and resident, running through scenarios over and over, consciously, with purpose, so that when I access my knowledge the answers are nearly automatic, the pathways clear, no warm-up required.  If I were to rely on my unconscious to do the bulk of the processing, my patients would drop like flies.

Dealing with limited information is hard.  One way to do it is to use Bayesian reasoning.  At its core, Bayesian reasoning is simple: you change the presumed probability of an outcome based on the evidence.  This is something we do intuitively, but poorly.

A joke among Emergency Physicians is that a sign outside the door stating “ALL THOSE WHO SIGN IN MUST SUBMIT TO A RECTAL EXAM,” would greatly increase the probability that the patients they see would actually have a life threatening condition.

Let’s make it real.

If I choose a patient at random from all of my patients and I know that 5% of my patients suffer from heart failure, then without any more information, I should have a 5% suspicion that Mrs. Random has heart failure.  However, if I have seen a lot of heart failure patients recently, then I am likely to overestimate the chance that Mrs. Random has heart failure.  I expect it because it is the diagnosis most present in my mind.

If, after my initial assumptions, I ask a question or perform a test and the result is not consistent with heart failure, my suspicion of heart failure should diminish.  To do the real math requires statistics, but I don’t have time for that.  I’m going to order more tests just to be sure.
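The math Dr. O doesn’t have time for is just Bayes’ theorem.  Here is a minimal sketch, assuming a 5% pretest probability of heart failure and a hypothetical test whose sensitivity (90%) and specificity (80%) are invented for illustration, not taken from any real study:

```python
def update_probability(prior, sensitivity, specificity, test_positive):
    """Update a disease probability given a test result, via Bayes' theorem."""
    if test_positive:
        # P(disease | +) = P(+ | disease) P(disease) / P(+)
        p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
        return sensitivity * prior / p_positive
    else:
        # P(disease | -) = P(- | disease) P(disease) / P(-)
        p_negative = (1 - sensitivity) * prior + specificity * (1 - prior)
        return (1 - sensitivity) * prior / p_negative

prior = 0.05  # Mrs. Random's pretest probability of heart failure
# A negative result on the hypothetical test drops suspicion below 1%:
after_negative = update_probability(prior, 0.90, 0.80, test_positive=False)
print(f"{after_negative:.3f}")  # prints 0.007
```

A positive result on the same test would only raise the probability to about 19% — one reason a single screening test on a low-risk patient is rarely conclusive on its own.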

Who wants to be sued?  But more importantly, who wants a patient to die?

Much of medicine is algorithmic, which means we follow an automatic decision tree.  If I get this result, then I do this.  These algorithms are meant to reduce error in decision making.  They also prevent us from missing life-threatening things we may be underestimating.  They implicitly use Bayesian reasoning, which holds that prior information (and all new information) affects the probability of an outcome.
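An “if I get this result, then I do this” tree can be sketched in a few lines.  This is a toy illustration with invented thresholds, not clinical guidance — and note that, as the next paragraph admits, nothing stops it from following more than one branch at once:

```python
def swelling_workup(bnp, d_dimer):
    """Toy decision tree for foot swelling based on two test results.

    Thresholds (100 pg/mL for BNP, 0.5 mg/L for D-dimer) are illustrative.
    """
    next_steps = []
    if bnp > 100:  # elevated BNP raises suspicion of heart failure
        next_steps.append("order an echocardiogram")
    if d_dimer > 0.5:  # elevated D-dimer: cannot rule out a clot
        next_steps.append("order a leg-vein ultrasound")
    if not next_steps:  # life threats less likely; consider the slower causes
        next_steps.append("consider venous insufficiency or lymphedema")
    return next_steps

print(swelling_workup(bnp=150, d_dimer=0.9))  # follows two branches at once
```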

However, because medicine is complicated, communication is often not that great, and tests can give false results, doctors like me are likely to follow two or three branches of the tree at the same time, to cover our arses.

All of this gets expensive fast.  Very, very, very expensive.

Also, the more tests I perform, the more likely I am to get a false positive and treat something that doesn’t need treatment.  This incurs more risks for the patient.

Whew! I’m gonna stop now and hope you learned at least these things:

  • Dr. O cannot multitask. He needs to pay attention.
  • Dr. O’s subconscious is not going to find the right answer.
  • Bayesian reasoning requires that Dr. O adjust the probability of his suspected diagnoses based on prior and developing evidence.
