
26 Years Later: What the Challenger Disaster Teaches Us

In his book Blind Spots, Professor Max Bazerman of Harvard Business School argues that the Challenger disaster exposed inconsistencies in the brain’s decision-making machinery.
What’s the Big Idea?


The space shuttle Challenger was destroyed on January 28, 1986, seventy-three seconds after lifting off from Cape Canaveral. The ship disintegrated in midair, sending six astronauts and the schoolteacher Christa McAuliffe plunging into the Atlantic Ocean. The tragedy transfixed America and ended NASA’s golden years.

In the days and months that followed, the crash became shrouded in myth and rumor, and misperceptions persist to this day. Many Americans believe, for instance, that the shuttle exploded and that the astronauts were killed instantly. In fact, Challenger broke apart, and only sections of it were destroyed. The occupants of the crew cabin were still alive—although most likely unconscious—when they hit the water at roughly 200 miles an hour.

Many Americans claim to have watched the explosion live on television, but this too is untrue. Only one channel—CNN—was carrying the launch when the tragedy occurred; the major networks aired the accident only on tape delay. Another common myth is that the Environmental Protection Agency had banned a sealant that could have made Challenger safer. The list goes on and on.

Of all the inaccuracies about the disaster, perhaps the most dangerous is the idea that accidents of this kind are an unavoidable part of space exploration. Traveling to outer space is immensely complex, so the thinking goes, and something is bound to go wrong once in a while. But follow-up investigations found that the tragedy was not the result of a chaotic, low-frequency event; it was the result of an obvious oversight. Flight engineers should have flagged Challenger’s mechanical flaws long before the shuttle took off.

In his book Blind Spots, Professor Max Bazerman of Harvard Business School argues that the Challenger disaster exposed inconsistencies in the brain’s decision-making machinery. Bazerman is an expert in “behavioral ethics,” which seeks to explain how people actually behave when faced with ethical dilemmas. He argues that NASA’s leadership failed because they did not view the launch decision in ethical terms, as a matter of the crew’s lives. Instead, they allowed political and managerial considerations to drive their decision-making.



What’s the Significance?


Historians—and journalists—tend to assume that people recognize an ethical dilemma when it is presented to them. When writing about tragedies like the Challenger disaster, we often imply that those who behaved immorally did so deliberately. Bazerman, however, argues that ethical lapses are usually unconscious. In his view, our emotional needs can be so great that they drown out ethical considerations completely.

We are also prone to “groupthink,” the tendency to favor unanimity over careful reasoning. As a result, we often behave immorally without even realizing it. This is why good people do bad things.

Luckily, people and the organizations that employ them are not slaves to human nature. Bazerman believes there are several steps leaders can take to ensure ethical decision-making among their employees. For instance, he tells executives to monitor the incentives and managerial structures they impose on their employees, lest conflicts of interest emerge. They should also pay close attention to data that might reveal their organization’s biases: leaders should use hard numbers, not gut feeling, to confirm that their companies are actually hiring women and minorities at adequate rates.

In striving to improve our ethical decision-making, it also helps to be aware that thinking clearly is a lot more difficult in the heat of the moment. During the planning phase of a decision, we tend to rely on cool-headed rationality. When a crisis hits, however, this kind of thinking takes a back seat to powerful emotions—what Bazerman calls the “want” self. In Blind Spots, he writes that thinking through your likely emotional response to a situation beforehand can help you prepare for contingencies. “Thinking about your motivations at the time of a decision can help bring the ‘want’ self out of hiding during the planning stage and thus promote more accurate predictions,” he writes.
The purpose of such visualization exercises is not to surrender to the “want” self, Bazerman says. Rather, it is to prepare you for the self-interested emotional inputs that you are likely to experience when a certain scenario arises. By thinking through a scenario beforehand, we can ensure that we will behave ethically when the time comes.

