
Worst-Case Scenarios: The Problem of Neglect

Countless people reacted with great prudence to Hurricane Sandy, taking sensible and appropriate precautions. But as Hurricane Katrina demonstrated in New Orleans, and as Hurricane Sandy has shown more recently, some people badly underreact to worst-case scenarios, ignoring official warnings or acting as if the risks are being wildly overhyped. The results can be tragic. When people fail to take appropriate precautions, what’s the explanation? What can be done to get their attention? Behavioral economists and cognitive psychologists have identified three important clues.

The first involves unrealistic optimism. Some of us show an optimism bias; we believe that we are peculiarly immune from serious risks. In some studies, the overwhelming majority of drivers have been found to think that they are better than the average driver and less likely to be involved in a serious accident. When facing a potential danger, some of us believe that whatever happens, we will be just fine.

Human beings are also subject to availability bias. When people evaluate risks, they tend to ask whether other, similar events come to mind, using what social scientists call “the availability heuristic.” If you immediately think of cases in which officials exaggerated risks and asked people to take precautions that proved unnecessary, you might decide that there’s nothing to fear: a serious problem for some citizens of New Orleans who neglected the risks associated with Katrina. Anecdote-driven judgments can make people think that a boy is falsely crying wolf, even though the wolf is right there at the door.

The final problem is motivated reasoning. For many of us, evaluating risks is not just a matter of calculation. Our motivations, in the sense of our emotions, desires, and wishes, count too, and they can greatly affect our judgments. If you are strongly motivated to believe that capital punishment deters crime, you are likely to credit the studies that support that belief (even if they’re pretty lousy). So too with worst-case scenarios. If you hate thinking about them, you might dismiss them even if prudence suggests that you shouldn’t.


Taken together, unrealistic optimism, availability bias, and motivated reasoning can produce unjustified complacency. The problem is especially serious when these three biases combine with social influences. When people in a particular group or social network start to insist that nothing can go badly wrong, their complacency can be contagious, leading to widespread neglect of serious dangers.

For those who are showing such neglect, can anything be done to promote sensible precautions? Fortunately, the availability heuristic can be the solution, not the problem. People are far more likely to protect themselves if they are reminded of cases in which things really did go wrong. And when a bad outcome triggers strong emotions, people tend to stop thinking about probabilities and to focus directly on the outcome itself. If people are asked how much they would pay to avoid an electric shock, what matters most is the shock itself. If people’s attention can be focused directly on the worst-case scenario, they are likely to try to avoid it.

Of course, overreactions, no less than underreactions, can be a serious problem. We don’t want people to take pointless or unreasonable precautions. The good news is that in the face of looming disasters, an appreciation of how people actually evaluate risks can help us combat both hysteria and neglect.

Cass R. Sunstein teaches at Harvard Law School. From 2009 until this August, he was Administrator of the White House Office of Information and Regulatory Affairs.