
The Sartre Fallacy Part II: Is It Inevitable?

If we know that we are bad at predicting and can account for the underlying psychology, then why do we continue to make bad predictions?

Years before my run-in with Monsieur Sartre I landed a summer job in the painting business. If you’ve painted houses, perhaps you ran into the same problem I did: poor planning. One summer I discovered that a one-week job took closer to two weeks; a three-week job lasted about a month and a half; and so on. I devised a rule of thumb: double your completion date. The problem is that I didn’t stick to this heuristic even though I knew it was true. Why? Experience and knowledge do not necessarily improve judgment; we’ve seen, in fact, that sometimes the opposite occurs. The mind is stubborn – we stick to our intuitions despite the evidence.


Let’s go beyond the anecdote. In the spring of 2005 Bent Flyvbjerg, Mette K. Skamris Holm, and Søren L. Buhl published an article in the Journal of the American Planning Association that presented “results from the first statistically significant study of traffic forecasts in transportation infrastructure projects.” The paper gathered data from rail and road projects undertaken worldwide between 1969 and 1998. They found that ridership was overestimated in over 90 percent of rail projects and that 90 percent of rail and road projects fell victim to cost overruns. Worse, although it became obvious that most planners underestimate the required time and money, their accuracy actually declined over the years. Today, a sizable engineering project completed on time and within budget is practically imaginary.

In Thinking, Fast and Slow, Daniel Kahneman describes the planning fallacy as “plans or forecasts that are unrealistically close to best-case scenarios.” Two dramatic examples come to mind. In 1957 the Sydney Opera House was estimated to cost $7 million (Australian dollars), with completion set for early 1963. It opened in 1973 with a price tag of $102 million. Boston’s Big Dig was nearly a decade late and roughly $12 billion over budget. The one exception I can think of from the engineering world is New York’s Empire State Building, completed in 410 days, several months ahead of schedule, at $24.7 million, close to half of the projected $43 million.

Around the time I was painting houses I discovered more examples of the planning fallacy in other domains. I eventually landed on this question: If we know that we are bad at predicting and can account for the underlying psychology, then why do we continue to make bad predictions? Kahneman suggests that to improve predictions we should consult “the statistics of similar cases.” However, I realized that the two biases that contribute to the planning fallacy, overconfidence and optimism, also distort any effort to use similar cases to generate more objective projections. Even when we have access to the knowledge required to make a reasonable estimate, we choose to ignore it and focus instead on illusory best-case scenarios.
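Kahneman’s advice amounts to taking what forecasters call the “outside view”: scale your inside-view estimate by what actually happened in similar past projects. As a rough illustration only – the overrun ratios below are invented, not drawn from any real dataset – a minimal sketch of that adjustment might look like this:

```python
# A minimal sketch of consulting "the statistics of similar cases":
# scale an inside-view estimate by the typical overrun observed in
# comparable past projects. All figures here are hypothetical.

from statistics import median

# Overrun ratios (actual duration / planned duration) from similar
# past projects -- invented for illustration.
past_overrun_ratios = [1.4, 1.6, 1.8, 2.1, 2.5]

def adjusted_estimate(naive_estimate_weeks: float) -> float:
    """Adjust a best-case estimate using the statistics of similar cases."""
    typical_overrun = median(past_overrun_ratios)  # 1.8 for the data above
    return naive_estimate_weeks * typical_overrun

# A "three week" paint job, adjusted by the reference class: about 5.4 weeks.
print(adjusted_estimate(3))
```

The catch, as the rest of this post argues, is that nothing forces us to believe the adjusted number once we have it.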

This idea returns me to my last post, where I coined the term the Sartre Fallacy to describe cases in which acquiring information that warns or advocates against X influences us to do X. I named the fallacy after de Beauvoir’s lover because I acted like a pseudo-intellectual, thereby living less authentically, after reading Being and Nothingness. I noticed other examples from cognitive psychology. Learning about change blindness caused participants in one study to overestimate their ability to notice the visual mistake. They suffered from change blindness blindness. The planning fallacy provides another example. When planners notice poor projections made in similar projects they become more confident instead of making appropriate adjustments (“We’ll never be that over budget and that late”). This was my problem. When I imagined the worst-case scenario my confidence in the best-case scenario increased.

After I posted the article I was happy to notice an enthusiastic response in the comment section. Thanks to the sagacity of my commenters, I identified a problem with the Sartre Fallacy. Here it is; follow closely. If you concluded from the previous paragraph that you would not make the same mistake as the participants who committed change blindness blindness, then you’ve committed what I cheekily term the Sartre Fallacy Fallacy (or change blindness x3). If you conclude from the previous sentence that you would not commit the Sartre Fallacy Fallacy (or change blindness x3), then, mon ami, you’ve committed the Sartre Fallacy Fallacy Fallacy (or change blindness x4). I’ll stop there. The idea, simply, is that we tend to read about biases and conclude that we are immune to them because we know they exist. This is of course itself a bias, and as we’ve seen it quickly leads to an ad infinitum problem.

The question facing my commenters and me is whether the Sartre Fallacy is inevitable. For the automatic, effortless, stereotyping, overconfident, quick-judging System 1 the answer is yes. Even the most assiduous thinkers will jump to the conclusion that they are immune to innate biases after reading about innate biases, if only for a split second. Kahneman himself notes that after more than four decades researching human error he (his System 1) still commits the mistakes his research demonstrates.

But this does not imply that the Sartre Fallacy is unavoidable. Consider a study published in 1996. Lyle Brenner and two colleagues gave students from San Jose State University and Stanford fake legal scenarios. There were three groups: one heard from one lawyer, the second heard from another lawyer, and the third, a mock jury, heard both sides. The bad news is that even though the participants were aware of the setup (they knew whether they were hearing only one side or the entire story), those who heard one-sided evidence gave more confident judgments than those who heard both sides. However, the researchers also found that simply prompting participants to consider the other side’s story reduced their bias. The deliberate, effortful, calculating System 2 is capable of rational analysis; we simply need a reason to engage it.

A clever study by Ivan Hernandez and Jesse Lee Preston provides another reason for optimism. In one experiment liberal and conservative participants read a short pro-capital-punishment article. There were two conditions: the fluent condition read the article in 12-point Times New Roman; the disfluent condition read it in an italicized, bold, light-gray Haettenschweiler font. It was difficult to read, and that was the point. Hernandez and Preston found that participants in the latter condition “with prior attitudes on an issue became less extreme after reading an argument on the issues in a disfluent format.” We run on autopilot most of the time. Sometimes offsetting biases means pausing and giving System 2 a chance to assess the situation more carefully.

One last point. If the Sartre Fallacy were inevitable, we could not account for moral progress. The Yale psychologist Paul Bloom observes in a brief but cogent article for Nature that rational deliberation played a large part in eliminating “beliefs about the rights of women, racial minorities and homosexuals… [held] in the late 1800s.” Bloom’s colleague Steven Pinker similarly argues that reason is one of our “better angels” that helped reduce violence over the millennia:

Reason is… an open-ended combinatorial system, an engine for generating an unlimited number of new ideas. Once it is programmed with a basic self-interest and an ability to communicate with others, its own logic will impel it, in the fullness of time, to respect the interest of ever-increasing numbers of others. It is reason too that can always take note of the shortcomings of previous exercises of reasoning, and update and improve itself in response. And if you detect a flaw in this argument, it is reason that allows you to point it out and defend an alternative.

When Hume noted that “reason is, and ought only to be the slave of the passions” he was not suggesting that since irrationality is widespread we should lie back and enjoy the ride. He was making the psychological observation that our emotions mostly run the show, and advising a counter-strategy: we should use reason to evaluate the world more accurately in order to decide and behave better. The Sartre Fallacy is not inevitable, just difficult to avoid.

Part I Here

Image via Wikimedia Commons


