Putting the Disaster in Natural Disasters: Why Do We Live in Harm’s Way?


As extreme weather events seem to be increasing around the world, in ways consistent with what climate change is predicted to cause, many people are realizing something for the first time: hundreds of millions of people choose to live in areas prone to severe floods, forest fires, hurricanes, earthquakes, and volcanic eruptions. Such violent natural events become ‘disasters’, in fact, only because so many people have chosen to live in harm’s way. Why do people expose themselves to such danger? And how does the psychology of this sort of risk-taking challenge the hazard managers responsible for mitigating these perils and keeping people safe?

A central cause of this problem is our misunderstanding of probability. As certain as these events are to occur over the long term, for any given place and any given time, earthquakes and tornadoes and floods are low-probability events. The chance of the next “Big One” earthquake in southern California, magnitude 7 or greater, is close to 100% sometime in the next 50 years, but it’s only two or three percent for any one of those years. Those low short-term odds, over the time frame we care about most, play right into some serious cognitive limitations that challenge our ability to make intelligent choices about risk.
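To see how tiny annual odds and near-certain long-term odds can both be true, a simplified sketch helps. Assuming, purely for illustration, that each year carries the same independent probability p of the quake (real seismic hazard models are more sophisticated than this), the chance of at least one event in n years is

$$P(\text{at least one event in } n \text{ years}) = 1 - (1 - p)^n$$

With p = 0.03 and n = 50, that works out to about 0.78, and it keeps climbing toward certainty as the horizon lengthens. The same event looks negligible through a one-year window and nearly inevitable through a lifetime one.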

We pay more attention to, and worry more about, not necessarily the greatest threat but the most salient one. A risk can be salient because some example of it is in the news – on our radar screen at the moment – or because we have memories of the risk from particularly emotional or meaningful experiences. Such memories are burned more deeply into our brains and come to mind more readily and more powerfully, and our brain knows, in essence, that it had better pay attention when such meaningful memories surface. The people who study the suite of mental shortcuts that subconsciously help us turn partial information into judgments and decisions call this the Availability Heuristic.

Well, rare events are mostly NOT salient. They are rarely on our radar screen, and many who live in potentially dangerous areas have never lived through a “Big One” earthquake, or a Category 4 or 5 hurricane, or an out-of-control forest fire burning down their house. They have no powerful personal emotional experience with these threats, no memories that would make such dangers more ‘available’ to their brain, more compelling, more salient. In the absence of that salience, another facet of risk perception psychology takes over: Optimism Bias, the feeling we have about lots of risks all the time that “It won’t happen to me.”

Optimism Bias lets us do all sorts of risky things we want to do, from bungee jumping to texting while driving to living in places we want to live even when those places are dangerous. It lets us buy the house in the beautiful but fire-prone forests of the Colorado mountains, or on the beautiful but hurricane-prone shores of North Carolina. It lets us take the job in LA or San Francisco, where a devastating earthquake in the next 50 years is a near certainty. It helps explain why we live in the lahar zones of volcanoes like Mt. Rainier in Washington or Mammoth Mountain in California. It even helps explain why we don’t buy insurance against low-probability risks. Fewer than 20% of California homeowners in earthquake-prone areas have earthquake insurance. The vast majority of Americans living in areas at risk of severe flooding have no flood insurance. Why buy insurance against low-probability risks, they wonder, even if those risks would be disastrous? After all, “It won’t happen to ME.”

And in truth, it probably won’t. Probably. But that’s a whole additional problem: we assume all sorts of things about probability that are just plain wrong. For example, our judgments of what is likely to happen next are based on way-too-small samples of what has happened before. If you flip a fair coin and it comes up heads ten times in a row, how will it land on the next flip? Many people think tails MUST be more likely. Nope. The coin has no memory, and ten flips is far too small a sample to tell us anything about the next one.
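To make the reasoning explicit: for a fair coin, every flip is independent of the ones before it, so

$$P(\text{tails on flip 11} \mid \text{10 heads in a row}) = P(\text{tails}) = \tfrac{1}{2}$$

The streak itself was unlikely before it began, about (1/2)^10, or roughly 1 in 1,000, but once it has happened it tells the next flip nothing.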

Gamblers who think this way lose. But it’s precisely the same gamble taken by people who live in disaster-prone areas. They think that after deadly tornadoes two years in a row in Moore, Oklahoma, or the “100-year flood” in Missouri in 2011, the odds MUST be lower that it will happen again anytime soon, right? Wrong! Next year’s weather hasn’t gotten that memo yet.
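The “100-year flood” label invites exactly this mistake. It means a flood with roughly a 1% chance in any given year, not a flood that arrives on a 100-year schedule. Treating years as independent, a simplification, the chance of at least one such flood over a 30-year mortgage is

$$1 - 0.99^{30} \approx 0.26$$

about one in four, and last year’s flood does nothing to lower this year’s 1%.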

We also have problems with the arithmetic of risk. Which risk is bigger: 1 in a thousand, 1 in a million, or 1 in ten? A widely cited study found that one person in five couldn’t answer that seemingly simple question correctly (1 in ten is the biggest, i.e., most likely, risk). And 80% of the sample had a high school education or better.
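Writing those odds as decimals makes the comparison immediate: 1 in ten is 0.1, 1 in a thousand is 0.001, and 1 in a million is 0.000001. The bigger the number after “in,” the smaller the risk.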

So we misunderstand probabilities, and we worry less about rare risks that aren’t salient, risks with which we have no powerful firsthand experience. As a result, we put ourselves in harm’s way. And it’s getting worse. The financial losses from such choices are increasing dramatically as populations grow in places where nature gets violent.

These cognitive limitations also make it harder for hazard managers to keep us safe with wise risk mitigation policies. It costs money to reinforce buildings in California or build safe-room shelters in Tornado Alley, but if people aren’t worried enough, they won’t spend it. People don’t want to be told they can’t rebuild a home where the old one blew down in a hurricane or burned down in a forest fire, so they resist intelligent building codes and new, more accurate flood maps. They don’t even want to be told they must pay more for insurance if they choose to live in risky areas where rates should be higher, which means the rest of society ends up paying for the recovery when the flood, quake, or storm hits.

We get risk right in lots of ways, but our perceptions also lead us to make mistakes, sometimes dangerous ones, a phenomenon that in my book I call the Risk Perception Gap. We worry about some risks more than the evidence merits, and we don’t worry about others as much as the evidence warns. Natural disasters are an example of not worrying enough, and another example of how the way we perceive and respond to risk is something to worry about all by itself.

