National Opinions

Inherent biases keep us unprepared for disasters

A rescue team member from the North Carolina National Guard rescued a young child Friday as rising floodwaters from Hurricane Florence threatened his home in New Bern, N.C. (Associated Press)

In the wake of Hurricane Florence, thousands of residents have found their homes flooded, only to discover they don’t carry flood insurance. More will face days (or weeks) without electrical power — and then realize that they failed to gather sufficient supplies to endure the post-storm recovery period.

Some will have chosen not to evacuate despite warnings to do so, only to find themselves trapped in their houses, praying that the structures will survive the wind and storm surge. Some may needlessly and tragically lose their lives because of these mistakes.

Lack of preparation helps to explain why the material losses we have experienced after recent disasters have been severe, even when people have been forewarned. And lack of preparation, the research shows, is caused by cognitive biases that lead people to underplay warnings and make poor decisions, even when they have the information they need.

When Hurricane Sandy hit New York and the Mid-Atlantic states in 2012, for example, 40 people drowned because they failed to heed warnings to evacuate from flood-prone coastal areas. Yet the storm had been accurately forecast and people believed the forecasts.

In this case the cognitive bias of excessive optimism kicked in: Residents knew that a storm was at their doorstep and that many people would be affected — they just thought it wouldn’t affect them.

The bias of herd thinking compounded the problem. Looking around and seeing that few others were making preparations, residents felt no social pressure to do more.

In addition to over-optimism and a herd mentality, several other psychological biases undermine preparation. Consider myopia. Sound preparation for disasters requires us to make short-term costly investments (buying insurance, for instance) to stave off a future loss. But most of us tend to be shortsighted, focusing on the immediate cost or inconvenience of preemptive action rather than the abstract penalty for failing to act.

Amnesia is also evident in people's reactions. Even when we have been through a disaster before, we tend to forget what it felt like: the discomfort of being without power for days, the challenges of repairs. While we may remember the facts, it is emotions that drive action, and those emotional memories fade the fastest.

Inertia and simplification are also enemies of sound decision-making. When we are unsure of what to do in the face of an incoming storm, we tend to do nothing. Before Hurricane Sandy, for example, 90 percent of residents secured supplies, but typically only enough to get them through a single day without power.

It may be discouraging to hear how our minds work to defeat us. (To be sure, there are reasons beyond psychology that people fail to act. They can lack the financial means to do so, or are limited by age or disability.) But there is a silver lining: Knowing why we under-prepare is the first step to knowing how to avoid these mistakes.

The key is to design preparedness measures that anticipate our biases. Consider simplification: the tendency for people to consider themselves prepared after taking one or two actions. The fix? Officials shouldn't distribute long, generic checklists of preparedness measures. Rather, they should issue ordered lists: Tell people, "If you are going to do only one thing to prepare for a storm, it should be this. If you are going to do three, you ought to …"

Recent years have seen tremendous advances in our ability to predict natural disasters. But reducing the costs of those events will require a better understanding of the psychological biases that shape how people make decisions, and better preparedness systems that anticipate and work around these biases.

Robert J. Meyer is the Ecker/MetLife professor of marketing and co-director of the Wharton Center for Risk Management and Decision Processes at the University of Pennsylvania. He is co-author, with Howard Kunreuther, of "The Ostrich Paradox: Why We Underprepare for Disasters."