Decision Making Has Its Semmelweis, Now It Needs a Gawande
What’s the difference between a physician and a surgeon?
If you were sick in the Middle Ages you had three options: the church, the local healer, and the physician. A physician was the most expensive and least effective. If you had a migraine, for example, he would cut you open and bleed you. The local healer would give you an amulet to wear on your head and an herbal tincture made from walnuts. You were better off with her because she did not bleed you – same with the church. Sadly, this was the state of affairs for most of human history. As the medieval scholar James Hannam notes, “the history of medicine until the 19th century… is a history of failure.”
Surgeons are a different story. They made great advancements starting with the ancient Greeks. By the Middle Ages they could provide stitches, set bones, remove tumors and perform successful amputations. Why? Surgeons had access to something physicians did not: clear feedback. Physicians struggled because they didn’t know that they were getting it wrong – the source of an illness and the effectiveness of a treatment were obscure. When surgeons operated it was clear – sometimes fatally clear – when something did not work.
Improving human judgment and decision-making is challenging because, like a physician before the modern era, it’s difficult to know when you’ve screwed up. Here’s the problem: awareness doesn’t seem to help (no one ever became more rational after reading a few decision-making books). Some research shows that we can overcome some biases in the short term. But we always return to our default state, in which we are ignorant of our ignorance. The question is: if awareness doesn’t work, what does?
Let’s return to medicine.
In 1847 Ignaz Semmelweis was working in the Maternity Department in a Vienna hospital when he noticed something: women in a ward run by doctors were contracting puerperal fever (or childbed fever) and dying at a rate nine times higher than women in a ward run by midwives. What caused the asymmetry? Semmelweis acquired a vital clue when his colleague Jakob Kolletschka, a doctor, cut his finger while conducting an autopsy and died of puerperal fever a few days later. Midwives did not perform autopsies, Semmelweis reasoned, so doctors must be transferring the fever from the corpses to the mothers. He was right. The solution? Hand washing.
Despite Semmelweis’ research, every year in the United States about 2 million people contract an infection they did not have before entering the hospital. Worse, hundreds of thousands of patients die in hospitals each year from avoidable mistakes. Many of these mistakes, just like the microbes that cause puerperal fever, are invisible.
Consider the following real story. A patient was undergoing an operation to remove a tumor from his stomach when his heart stopped. The cause was a mystery. The patient was not losing blood, his lungs were receiving oxygen, and the doctors were not detecting any abnormalities. That’s when the anesthesiologist remembered that the patient had a low potassium level and that he had accidentally given him a dose one hundred times larger than intended. It was a lethal amount. When the team realized this they administered drugs to counteract the potassium. The patient’s heart started again and he pulled through.
This story and others like it come from Atul Gawande’s bestseller The Checklist Manifesto. Drawing from strategies used in the aviation and engineering industries, Gawande suggests that surgical teams adopt checklists. The purpose of the checklist is not instructional; it exists to prevent trivial mistakes by forcing surgical teams to avoid “errors of ineptitude” (mistakes we commit because we don’t make proper use of our knowledge). It works. In 2012 Gawande reported that in eight hospitals that adopted the checklist strategy, complication rates fell 35 percent and death rates fell 47 percent.
In other words, knowledge isn’t the problem – we know how to treat most illnesses today. The problem is human error, and a checklist is an easy solution. In this way, Semmelweis identified the problem and Gawande showed us how to avoid it.
This brings me back to human rationality. Research on judgment and decision-making has its Semmelweis (Kahneman and Tversky) but it lacks a Gawande.
Last week I finished Francesca Gino’s Sidetracked: Why Our Decisions Get Derailed, and How We Can Stick to the Plan. Like most pop decision-making books, it uses the Story-Study-Lesson format to highlight human error for a lay audience. It’s good, but we’ve heard it before: systematic biases distort our decisions. Now we need to know what to do about it. Awareness is not enough.
One solution comes from Decisive: How to Make Better Choices in Life and Work by Chip and Dan Heath. The Heath brothers propose a four-step strategy summed up by the acronym WRAP: Widen Your Options (to avoid narrow framing), Reality-Test Your Assumptions (to avoid confirmation bias), Attain Distance Before Deciding (to avoid short-term emotion), Prepare To Be Wrong (to avoid overconfidence). The beauty of WRAP is that it is a checklist. Its effectiveness is unclear, but I think it’s the right idea.
People unfamiliar with decision-making research are like physicians before Semmelweis: without clear feedback they don’t know that they are screwing up. Research from Kahneman and Tversky and books like Sidetracked highlight our mistakes. But the nature of biases means that’s not enough. We need tools to compensate for them, just like medical professionals needed Gawande’s checklist. Improving decision-making does not mean studying biases. We need to figure out how to use our knowledge of our ignorance.
Image via Robert Kneschke/Shutterstock