Undercover Economist

The psychological biases that leave us unprepared for disaster

This column was written and first published a week before Hurricane Harvey struck the US coast. – TH

Who saw the global financial crisis coming, who didn’t and who deserved blame for the forecasting failure? After a decade of debating these questions, I wonder whether we shouldn’t be asking a different one: even if we had clearly seen the crisis coming, would it have made a difference? Perhaps — but perhaps not.

Consider New Orleans in 2004. With a terrible hurricane bearing down on the city, officials realised that the situation was grim. The levees were in disrepair and a storm surge could flood the low-lying city. A hundred thousand residents would be unable to evacuate without help, and not enough help was available. A plan was hatched to evacuate families to the Superdome, a sports stadium, but managers there warned that it simply could not house so many. If only there had been more warning of disaster.

Some readers will recall, though, that the catastrophe of Hurricane Katrina took place in 2005. The storm of 2004 was Hurricane Ivan, which, after lashing the Caribbean, weakened and turned aside from New Orleans. The city had been given almost a full year’s warning of the gaps in its defences.

The near miss led to much discussion but little action. When Hurricane Katrina hit the city, evacuation proved as impractical and the Superdome as inadequate as had been expected. The levees broke in more than 50 places, and about 1,500 people died. New Orleans was gutted. It was an awful failure but surely not a failure of forecasting.

Robert Meyer and Howard Kunreuther, in The Ostrich Paradox, argue that it is common for institutions and ordinary citizens to make poor decisions in the face of foreseeable natural disasters, sometimes with tragic results. There are many reasons for this, including corruption, perverse incentives or political expediency. But the authors focus on psychological explanations. They identify cognitive rules of thumb that normally work well but serve us poorly in preparing for extreme events.

One such mental shortcut is what the authors term the “amnesia bias”, a tendency to focus on recent experience. We remember more distant catastrophes but we do not feel them viscerally. For example, many people bought flood insurance after watching the tragedy of Hurricane Katrina unfold, but within three years demand had fallen back to pre-Katrina levels.

We cut the same cognitive corners in finance. There are many historical examples of manias and panics but, while most of us know something about the great crash of 1929, or the tulip mania of 1637, those events have no emotional heft. Even the dotcom bubble of 1999-2001, which should at least have reminded everyone that financial markets do not always give sensible price signals, failed to make much impact on how regulators and market participants behaved. Six years was long enough for the lesson to lose its sting.

Another rule of thumb is “optimism bias”. We are often too optimistic, at least about our personal situation, even in the midst of a more generalised pessimism. In 1980, the psychologist Neil Weinstein published a study showing that people did not dwell on risks such as cancer or divorce. Yes, these things happen, Professor Weinstein’s subjects told him: they just won’t happen to me.

The same tendency was on display as Hurricane Sandy closed in on New Jersey in 2012. Robert Meyer found that residents of Atlantic City reckoned that the chance of being hit was more than 80 per cent. That was too gloomy: the National Hurricane Center put it at 32 per cent. Yet few people had plans to evacuate, and even those who had storm shutters often had no intention of installing them.

Surely even an optimist should have taken the precaution of installing the storm shutters? Why buy storm shutters if you do not erect them when a storm is coming? Messrs Meyer and Kunreuther point to “single action bias”: confronted with a worrying situation, taking one or two positive steps often feels enough. If you have already bought extra groceries and refuelled the family car, surely putting up cumbersome storm shutters is unnecessary?

Reading the psychological literature on heuristics and bias sometimes makes one feel too pessimistic. We do not always blunder. Individuals can make smart decisions, whether confronted with a hurricane or a retirement savings account. Financial markets do not often lose their minds. If they did, active investment managers might find it a little easier to outperform the tracker funds. Governments, too, can learn lessons and erect barriers against future trouble.

Still, because things often do work well, we forget. The old hands retire; bad memories lose their jolt; we grow cynical about false alarms. Yesterday’s prudence is today’s health-and-safety-gone-mad. Small wonder that, 10 years on, senior Federal Reserve official Stanley Fischer is having to warn against “extremely dangerous and extremely short-sighted” efforts to dismantle financial regulations. All of us, from time to time, prefer to stick our heads in the sand.

Written for and first published in the Financial Times on 18 August 2017.

My new book is “Fifty Inventions That Shaped The Modern Economy”. Grab yourself a copy in the US or in the UK (slightly different title) or through your local bookshop.
