Tim Harford The Undercover Economist

Other Writing

Articles from the New York Times, Forbes, Wired and beyond – any piece that isn’t one of my columns.


Book of the Week 26 – How to Do Nothing by Jenny Odell

This thoughtful – and thought-provoking – book takes on a different undertone when read in the light of lockdown. Odell takes her time describing her slow meanderings around the Rose Garden in Oakland, California, sitting and listening to birdsong. Last year that might have felt slightly quirky; this year, we’ve all been doing it.

I was expecting Odell’s book to be something like Cal Newport’s excellent Digital Minimalism, a book I absolutely loved for its practically minded reframing of our troubled relationship with technology.

But Odell is trying to do something different – to reflect on the political and economic structures that surround us and try to monetise our attention. (I don’t think Newport ever used the word ‘neoliberalism’; it springs naturally to Odell’s pen.) I wasn’t won over by this, but perhaps I’m starting in too different a place. For example, when Odell quotes Audre Lorde, “Caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare”, I really don’t grasp what she’s driving at. Evidently I need to read Audre Lorde’s words in their original context.

If you approach this book hoping for a self-help manual you may be disappointed. (Try Newport.) But you may well enjoy it if you approach it for what it is – an extended reflection on nature, humanity and the challenges of the society we have built, from a thoughtful and politically engaged artist and technology writer.

UK: Amazon | Blackwell’s

US: Amazon | Powell’s

My NEW book The Next Fifty Things That Made the Modern Economy is NOW OUT. Details, and to order on Hive, Blackwells, Amazon or Waterstones. Bill Bryson comments, “Endlessly insightful and full of surprises — exactly what you would expect from Tim Harford.”

Receive these posts by email

(You can unsubscribe at any time)

14th of July, 2020 | Other Writing | Comments off

Can the pandemic help us fix our technology problem?

We have a technology problem.

By that, I mean that we currently lack the technology to deal with the coronavirus pandemic. We don’t have a cheap, easy, self-administered test. We lack effective medicines. Above all, we don’t have a vaccine.

But I also mean something vaguer and more diffuse. We have a technology problem in the sense that scientific and technological progress has been sputtering for a while. That is evident in the data. Productivity growth in the UK over the decade 2010-19 was the lowest in the past couple of centuries, and coronavirus can take no blame for that.

If productivity statistics do not speak to your poetic soul, go into your kitchen and look around. You’ll see little there that you couldn’t have seen 50 years ago. The same could not be said of, say, the 50 years between 1920 and 1970. Or ponder air travel, if you can remember what that is like. Between 1920 and 1970, we went from aviator goggles and fabric-covered biplanes to the Boeing 747 and Concorde. Not only have we failed to surge forward since then, one could even argue that we’ve gone backward.

Given how much we keep being told about the disruptive pace of innovation and the boundless creativity of Silicon Valley, the reality is both surprising and disappointing. After several years pondering the history of inventions and inventors, I wondered whether these two problems might shed light on each other — what can we learn from the pandemic about technology, and what does the history of technology teach us about the pandemic?

Get the incentives right

In 1795, the French government offered a prize of 12,000 francs for inventing a method of preserving food. Napoleon Bonaparte was an ambitious general when the prize was announced. By the time it was awarded, he was France’s emperor, and two years away from his disastrous invasion of Russia. Napoleon may or may not have said: “An army marches on its stomach,” but he was keen to broaden his soldiers’ provisions from smoked and salted meat.

One of the hopefuls who tried his hand at winning the prize was Nicolas Appert, a Parisian grocer and confectioner credited with the development of the stock cube and — less plausibly — the recipe for chicken Kiev. Through trial and error, Appert found that if you put cooked food in a glass jar, plunged the jar into boiling water and then sealed it with wax, the food would keep — all this before Louis Pasteur was born.

Having solved the problem, Monsieur Appert duly claimed his reward.

This is by no means the only example of an innovation prize, a policy tool that has waxed and waned over the years. The most famous was the 1714 Longitude Prize, for solving the problem of determining how far east or west a ship was. The Royal Society for the encouragement of Arts, Manufactures and Commerce, the RSA, also awarded prizes on a frequent basis, often for safety measures that were regarded as unprofitable but socially valuable. Anton Howes, author of Arts and Minds, a history of the RSA, reckons that the society awarded more than 2,000 innovation prizes between the mid-1700s and the mid-1800s. Some were “bounties”, ad hoc recognition for good ideas; many, however, were classic innovation prizes like that awarded to Appert, which posed an important problem and promised to reward the person who solved it.

Nowadays such prizes are out of fashion. Governments tend to favour a combination of direct support for researchers and the award of an intellectual monopoly, in the form of a patent, to those who develop original ideas. But just like the innovations the RSA rewarded, rapid vaccines can be unprofitable but socially valuable. So a group of the world’s leading economists believes that if we are to maximise the chances of producing that vital coronavirus vaccine at the speed and scale that is required, we need to bring innovation prizes back in a big way.

This team, known as “Accelerating Health Technologies”, includes Susan Athey, the first woman to win the prestigious John Bates Clark medal, and Michael Kremer, a Nobel laureate.

“Whoever discovers the vaccine first is going to get such a big hug,” joked the Financial Times cartoonist Banx. It’s safe to say that they would get much more than that, but would they get enough? Major pharmaceutical companies have been scarred by earlier experiences, where they sank money into vaccines for diseases such as Zika or Sars, or in 2009 rushed to fulfil large orders for flu vaccines, only to find that demand had ebbed.

The problem is that most vaccine research programmes do not produce successful vaccines, and so companies — understandably — try to keep a lid on their spending until one is proven to work. Anthony Fauci, director of the US’s National Institute of Allergy and Infectious Diseases, lamented the problem in February: “Companies that have the skill to be able to do it are not going to just sit around and have a warm facility, ready to go for when you need it,” he told an Aspen Institute panel.

We need the leading vaccine contenders to invest vastly more in trials and production than they normally would, even though much of that investment will ultimately be wasted. And of course, they already are investing more — up to a point. That is partly an act of good corporate citizenship and partly in response to subsidies from governments or the Gates Foundation. But it may not be sufficient.

After all, the cost of failure will be borne mainly by the companies involved, while the benefits of success will be enjoyed by all of us: the IMF estimates the benefits are more than $10bn for every day that widespread vaccine delivery is hastened. Any inducement the rest of us can offer might be money well spent. So Athey, Kremer and their colleagues have proposed a kind of prize called an “advanced market commitment”, a promise to buy hundreds of millions of doses of a vaccine for a premium price.

This is not an untried idea. In 2004, Kremer and Rachel Glennerster, the current chief economist of the UK’s Department for International Development, proposed the concept of an advanced market commitment (AMC). In 2010, donors promised $1.5bn as an AMC for a pneumococcal vaccine for low-income countries; this dramatically accelerated the rollout of successful vaccines and saved hundreds of thousands of lives. But the AMC is really just a sophisticated variant on the innovation prizes of the 18th and 19th centuries, such as the one claimed by Nicolas Appert. Incentives are not the only thing that matter — but matter they do. If we want a solution that badly, we shouldn’t hesitate to commit to rewarding those who produce it.

It is not such a leap from food preservation to a vaccine.

Don’t overlook what seems simple

On August 4 1945, as the US and USSR were manoeuvring for position in a postwar world, a group of boys from the Young Pioneer Organisation of the Soviet Union made a charming gesture of friendship. At the US embassy in Moscow, they presented a large, hand-carved ceremonial seal of the United States of America to Averell Harriman, the US ambassador. It was later to become known simply as “the Thing”. Harriman’s office checked the heavy wooden ornament for signs of a bug, but concluded that, with neither wires nor batteries, it could do no harm. Harriman mounted the Thing proudly on the wall of his study. From there, it betrayed his private conversations for the next seven years.

Eventually, a British radio operator stumbled upon the US ambassador’s conversations being broadcast over the airwaves. These broadcasts were unpredictable: scan the embassy for radio emissions, and no bug was in evidence. It took yet more time to discover the secret. The listening device was inside the Thing. And it was so subtle, so simple, as to have proved almost undetectable.

The Thing had been designed — under duress in a Soviet prison camp — by none other than Léon Theremin, famous even then for his eponymous musical instrument. Inside it was little more than an antenna attached to a cavity with a silver diaphragm over it, serving as a microphone. There were no batteries or any other source of power. The Thing didn’t need them. It was activated by radio waves beamed at the US embassy by the Soviets, at which point it would broadcast back, using the energy of the incoming signal. Switch off that signal, and it would go silent.

The US agents who examined the Thing for bugs did not understand its potential to do them harm. It seemed too simple, too primitive, to matter. And I worry that we often make the same mistake. When we think about technology, we think of the flashy, sophisticated stuff. We overlook the cheap and the simple. We celebrate the printing press that produced the Gutenberg Bibles, but not the paper that many of those Bibles were printed on. Alongside paper and the RFID tag (a passive device descended from designs like the Thing), place the brick, the postage stamp and, for that matter, the humble tin can: inventions that are transformative not because they are complicated but because they are simple.

We should remember the same lesson when it comes to the innovations that fuel public health. The simplest technologies — such as soap and gloves, and, it seems increasingly likely, cloth masks — have proved invaluable, and are much-missed when in short supply.

And those are just the obvious technologies. The UK and the US stumbled in their efforts to scale up testing in the crucial early weeks of the epidemic. It will take post-pandemic inquiries to establish exactly why — and incompetence is clearly one explanation — but reporters highlighted a shortage of the chemical reagents necessary to conduct the test, the protective gear needed to shield the medical staff and even something as simple as cotton swabs.

Even now, it is too easy to dismiss the potential of truly cheap and simple testing. The economist Paul Romer, another Nobel memorial prize winner, argues that if everyone in a country could be tested twice a month — the equivalent, in the UK, of more than four million tests a day — that should provide enough information to suppress the virus whenever there was an outbreak. That is a vast leap beyond our current testing capacity — but the benefits could be enormous. Imagine a reliable test that was cheap and self-administered, like a pregnancy test or a thermometer. Highly sophisticated is good, but being cheap has a sophistication of its own.
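Romer’s arithmetic is easy to check. A rough sketch, where the UK population figure (about 67 million in 2020) is my assumption rather than the article’s:

```python
# Back-of-envelope check of Paul Romer's testing proposal:
# test everyone in the country twice a month.
uk_population = 67_000_000  # assumed; approximate UK population in 2020

tests_per_month = 2 * uk_population   # two tests per person per month
tests_per_day = tests_per_month / 30  # spread over a 30-day month

print(f"{tests_per_day:,.0f} tests per day")  # roughly 4.5 million
```

That comes out at just under four and a half million tests a day, consistent with the “more than four million” figure above.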

Contact tracing is another simple but vital approach. An age-old idea that requires little more than a phone, a notebook and a small army of persistent and diplomatic people, it was abandoned in the UK for the three gravest months of the crisis, apparently on the basis that the army had yet to be recruited and so the tracing system could cope with no more than five new cases a week. Since the lockdown was eased, we have well over a thousand a day.

Then there are the everyday logistical miracles made possible by other simple inventions, the barcode and the shipping container. Nobody cares about logistics until things go wrong. It has been remarkable to see how resilient retail supply chains have been in the face of the most extraordinary disruption. At a time when much of the world’s population was told not to venture beyond their own front doors, we saw little more than a brief awkwardness in sourcing flour, pasta and toilet paper.

But it has not been so straightforward to duplicate this feat when it comes to testing. Embarrassed by the early deficiency, the UK government set ambitious targets. Ministers then claimed to hit them, first by including testing kits that had merely been posted out, and then by bragging about “capacity”. Meanwhile, the government simply stopped reporting how many people had been tested at all. The logistics of conducting, or even counting, the tests proved challenging enough that for the purposes of meeting targets, logistical problems were simply assumed away.

In our desperation to develop high-tech solutions such as drugs or contact-tracing apps, there is a risk that we ignore the simple technologies that can achieve a lot. As Averell Harriman discovered, it is a mistake to overlook technologies that seem too simple to matter.

Manufacturing matters too

There is more to innovation than a good idea. The food-preserving “Appertisation” technology did not stay in France for long — it migrated across the Channel to seek London’s entrepreneurialism and venture capital, allowing production to scale up. (This was a time when the British were, evidently, not too proud to borrow a good idea from the French.) Appert himself was also trying to expand his operations. He invested his prize money in a food-preservation factory, only to see it destroyed by invading Prussian and Austrian armies. Ideas matter, but factories matter too.

Factories are likely to prove equally fateful for vaccine production. Developing a successful vaccine is far more than just a manufacturing problem, but manufacturing is undoubtedly the kind of challenge that keeps experts awake at night. The candidate vaccines are sufficiently different from each other that it is unfeasible to build an all-purpose production line that would work for any of them, so we need to build several in parallel.

“Imagine that your life depended on completing a home construction project on time,” Susan Athey told the Planet Money podcast. “Anyone who’s ever done a construction project knows that none of them had ever been completed on time . . . literally, if your life depended on it, you might try to build five houses.”

Or to put it another way, if your life depends on a letter being delivered on time, send multiple copies of the letter by as many methods as you can find.
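Athey’s five-houses logic is just the probability that at least one of several independent attempts succeeds. A minimal sketch, where the per-candidate success probability is an illustrative assumption:

```python
# If each vaccine candidate (or house, or letter) succeeds independently
# with probability p, the chance that at least one of n succeeds
# is 1 - (1 - p)^n.
def at_least_one_success(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 0.2  # assumed per-candidate success probability, for illustration
for n in (1, 5, 10):
    print(n, round(at_least_one_success(p, n), 3))
```

With a one-in-five chance per candidate, five parallel attempts lift the odds of at least one success to about two-thirds, and ten attempts to nearly 90 per cent.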

In the case of a coronavirus vaccine, setting up multiple redundant production lines costs money — tens of billions of dollars. But remember that an accelerated vaccine is worth more than $10bn a day. Any reasonable subsidy would be value for money, assuming it increased the probability of quick success. Some subsidies are already available — for example, as part of the US “Warp Speed” project, and from the Gates Foundation. But Michael Kremer wants to see more international co-ordination and more ambition. “We think the scale of the problem and the risks associated with each candidate warrant pursuing a substantially larger number of candidates,” he told me.
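The cost-benefit claim is stark when worked through. The $10bn-a-day benefit is the IMF estimate quoted above; the programme cost is an illustrative figure within the “tens of billions” range, not a number from the article:

```python
benefit_per_day = 10e9  # IMF estimate: >$10bn per day of earlier delivery
programme_cost = 50e9   # illustrative: "tens of billions of dollars"

breakeven_days = programme_cost / benefit_per_day
print(f"Pays for itself if it speeds up the vaccine by {breakeven_days:.0f} days")
```

On these assumed numbers, a $50bn programme of redundant production lines breaks even if it brings widespread vaccination forward by just five days.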

Alex Tabarrok, another member of the team, added: “Bill Gates is doing the right thing but even Gates can’t do it all. Governments are acting too slowly. Every week that we delay a vaccine costs us billions.”

Athey, Kremer, Tabarrok and the rest of the team behind the Advanced Market Commitment proposal want to supplement it with generous 85 per cent subsidies for the immediate construction of vaccine factories. The calculation here is that firms are the best judges of their own prospects. A firm with a marginal vaccine will not build much capacity, even with an 85 per cent subsidy. But anyone with a decent chance at producing a vaccine will see the prize on offer, and the subsidies, and start building factories at once.
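The self-selection logic behind the 85 per cent subsidy can be sketched as a simple participation condition. All the numbers here are purely illustrative:

```python
# A firm builds capacity if its expected reward from the advance market
# commitment exceeds its own (post-subsidy) share of the construction cost.
def builds_capacity(p_success: float, reward: float,
                    capacity_cost: float, subsidy: float = 0.85) -> bool:
    return p_success * reward > (1 - subsidy) * capacity_cost

reward = 2e9         # illustrative AMC payout for a successful vaccine
capacity_cost = 1e9  # illustrative cost of a production line

print(builds_capacity(0.30, reward, capacity_cost))  # promising candidate: True
print(builds_capacity(0.05, reward, capacity_cost))  # marginal candidate: False
```

Even paying only 15 per cent of the construction cost, the firm with a marginal candidate stays out, while anyone with a decent chance of success builds at once, which is exactly the screening the proposal relies on.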

On the principle of not overlooking what seems simple, even the most sophisticated vaccines rely on ingredients that are all too easy to take for granted. Consider the supply of glass vials. Several doses can be included in a single vial, but that still suggests a demand for hundreds of millions of them if a successful vaccine is made. The vaccine industry is used to operating at scale, but this would be something new: vaccines simply aren’t given to everyone in the world all at once.

Or perhaps the hold-up won’t be the glass, but something else. James Robinson, a vaccine manufacturing expert, told the science writer Maggie Koerth: “A vaccine manufacturer . . . might source several thousand ingredients to make a vaccine. But each material is coming from factories with hundreds of sources, and those sources have sources.”

For example, GlaxoSmithKline uses an extract from the soap-bark tree to produce a vaccine-enhancing ingredient called an adjuvant; for some of the vaccines now in development, the adjuvant may enhance their effectiveness or make a certain quantity stretch to more doses. As Koerth noted, however, the bark is harvested in Peru, Chile and Bolivia during the summer months of the southern hemisphere. Last year’s crop was harvested before the coronavirus had become a household name; this year’s harvest will not begin until November.

Disruption can help

It hasn’t just been the past few decades in which apparently remarkable technologies have made an underwhelming impression on the productivity figures. Consider the history of electrification in American factories. In the 1890s, the potential for electricity seemed clear. Thomas Edison and Joseph Swan independently invented usable lightbulbs in the late 1870s. In 1881, Edison built electricity-generating stations at Pearl Street in Manhattan and Holborn in London. Things moved quickly: within a year, he was selling electricity as a commodity; a year later, the first electric motors were used to drive manufacturing machinery.

Yet by 1900, less than 5 per cent of mechanical drive power in US factories was coming from electric motors. Most factories were still in the age of steam. This was because when manufacturers replaced large steam engines with large electric motors, they were disappointed with the results.

I’ve written about the work of economic historian Paul David before. He argued it wasn’t enough merely to replace steam engines with electric motors. The capabilities of those new motors could only be used fully if the factories were redesigned.

While replacing a large steam engine with a large electric motor had achieved very little, electric motors could be efficient at a smaller scale. That meant that each worker could have a small motor at their bench. Wires could replace driveshafts; factories could spread out into lighter, airier spaces; the flow of product could be optimised, rather than being constrained by proximity to the power source.

But a fascinating part of David’s argument is that all this was catalysed by a crisis. After 1914, workers became more expensive thanks to a series of new laws that limited immigration into the US from a war-torn Europe. Manufacturing wages soared and hiring workers became more about quality, and less about quantity. It was worth investing in training — and better trained workers were better placed to use the autonomy that electricity gave them. The recruitment problem sparked by the immigration restrictions helped to spark new thinking about the design of the American factory floor.

Some of the modern parallels are obvious. We have had email, internet and affordable computers for years — and more recently, video-conferencing. Yet until the crisis hit, we had been slow to explore online education, virtual meetings or telemedicine. 3D printing and other agile manufacturing techniques have moved from being curiosities to life-saving ways to meet the new demand for medical equipment. We are quickly learning new ways to work from a distance because suddenly we have had no choice. And we are learning about resilience.

There is no guarantee that a crisis always brings fresh ideas; sometimes a catastrophe is just a catastrophe. Still, there is no shortage of examples of necessity proving the mother of invention, sometimes many times over.

The Economist points to the case of Karl von Drais, who invented an early model of the bicycle in the shadow of “the year without a summer” — when in 1816 European harvests were devastated by the after-effects of the gargantuan eruption of Mount Tambora in Indonesia. Horses were starved of oats; von Drais’s “mechanical horse” needed no food.

It is a good example. But one might equally point to infant formula and beef extract, both developed by Justus von Liebig in response to the horrifying hunger he had witnessed in Germany as a teenager in 1816. Or, if we are to recognise art as well as science, there is Mary Shelley’s masterpiece Frankenstein, written that same rainy summer beside Lake Geneva; the creature’s isolation mirrors that of the starving peasants she saw, begging for food. One crisis may lead to many creative responses.

The same may be true of this pandemic. Disruptions — even calamitous ones — have a way of bulldozing vested interests and tearing up cosy assumptions, jolting people and organisations out of the status quo.

It is just possible that future generations will point to 2020 as the year the innovation slowdown ended. Even economists need to be able to hope.

Written for and first published in the Financial Times on 11 June 2020.

My NEW book The Next Fifty Things That Made the Modern Economy is NOW OUT; this article is based in part on ideas in that book. Details, and to order on Hive, Blackwells, Amazon or Waterstones. Bill Bryson comments, “Endlessly insightful and full of surprises — exactly what you would expect from Tim Harford.”

Receive these posts by email

(You can unsubscribe at any time)

1st of July, 2020 | Highlights | Other Writing | Comments off
Other Writing

Book of the Week 22 – The Biggest Bluff by Maria Konnikova

I’ve been a fan of Maria Konnikova’s writing for a while. She’s a Harvard-educated academic psychologist who switched to writing and turned out to be even better at that than psychology – her book, The Confidence Game, is a modern classic with a great mix of psychological research and true storytelling. Just my kind of thing.

The new book, The Biggest Bluff, sees Konnikova taking on the world of professional poker in the hope of learning something about psychology – and perhaps in the hope that her psychological training might give her an edge. 

In some ways this is a cross between James McManus’s astonishing true story Positively Fifth Street (journalist goes to Vegas to cover a murder trial, accidentally goes deep into the world’s biggest poker tournament) and Annie Duke’s excellent Thinking in Bets, in which an academically-trained poker champion shares what she’s learned. No bad thing, that, because both are great books.

Konnikova has a compelling story to tell about her rollercoaster ride through the world of high-stakes poker, and she tells it well, effortlessly weaving in the academic insights in between her lessons from her mentor, Erik Seidel, and the dizzying highs and lows of the table. I loved it.

UK: Blackwells   Amazon

US: Powell’s   Amazon


My NEW book The Next Fifty Things That Made the Modern Economy is NOW OUT. Details, and to order on Hive, Blackwells, Amazon or Waterstones. Bill Bryson comments, “Endlessly insightful and full of surprises — exactly what you would expect from Tim Harford.”

Receive these posts by email

(You can unsubscribe at any time)

20th of June, 2020 | Other Writing | Comments off

Book(s) of the Week 20: The Next Fifty Things That Made The Modern Economy

Okay, this week I’m plugging my own brand new book, The Next Fifty Things That Made The Modern Economy. 

At least, a little bit. But I have some other books to tell you about too.

One of the joys of writing this book was to be able to pick up two or three wonderful books on each topic, learn all about the history and the characters involved, and then try to figure out how to use what I’d learned to tell a story with a particular lesson about how the economy works. In the case of the Langstroth Beehive, for example, I was able to talk about the Nobel laureate James Meade and his discussion of ‘positive externalities’, the long obsession of economists with bees, as well as the long-standing relationship between bees and humans.

Or when it came to the QWERTY keyboard I could discuss the raging controversy over the topic of ‘technological lock-in’, a crucial issue in the debate over how to regulate big tech companies – while at the same time puncturing some myths about the invention of the typewriter.

I loved writing this book and I hope you’ll love reading it. Click here for more information and links to buy from Hive, Amazon, Blackwell’s or Waterstones. (If you are reading in North America, sorry – only the previous book Fifty Inventions That Shaped The Modern Economy is available.)

Do please consider buying, gifting and/or reviewing the book. It’s not an easy time to be publishing a book – or to be a bookseller – and early support makes a big difference.

Now, I promised OTHER books. Here are some of the many, many books that I consulted while writing The Next Fifty Things and which stuck in my mind.

On Bricks: Brick: A World History by James Campbell and Will Pryce – gorgeous coffee-table photographs of brick structures from around the world.

On Beehives: The Hive by Bee Wilson. Quick, accessible history of the long and ever-changing relationship between humans and bees.

On Tulips: Tulipmania by Anne Goldgar (perfectly punctures the tulipmania myths) and Tulipomania by Mike Dash (further great stories and colourful details).

On GPS: PinPoint by Greg Milner, a book that will also be familiar to fans of Cautionary Tales.

On the ChatBot: The Most Human Human by Brian Christian. One of my favourite books of the decade.

On the Bicycle: The Mechanical Horse by Margaret Guroff. Full of telling social observations.

Lots of others that there is no time to discuss here – but the references tell all.

Stay safe, thanks for reading this post – and if you’ve decided to buy my book, thank you for that, too.


Receive these posts by email

(You can unsubscribe at any time)

27th of May, 2020 | Marginalia | Other Writing | Comments off
Other Writing

The Next Fifty Things That Made the Modern Economy – book and talk

“Endlessly insightful and full of surprises — exactly what you would expect from Tim Harford.”

Bill Bryson

I’m delighted to announce the launch of my new book, The Next Fifty Things That Made the Modern Economy. It’s a sequel to the original in which I presented a selection of fifty radical inventions that changed the world – in the form of surprising, instructive and untold stories about the people and ideas behind these technologies.

Now, in this new book, I’m back with another array of remarkable, memorable, curious and often unexpected ‘things’ – inventions that teach us lessons by turns intimate and sweeping about the complex world economy we live in today.

They range from the brick, blockchain and the bicycle to fire, the factory and fundraising, and from solar PV and the pencil to the postage stamp.

Hopefully it won’t be too long before you can stroll into a bookshop and pick up a copy, but you can also pre-order online at Amazon, Blackwell’s, Hive and Waterstones.


More? You want more? Okay. I’ll be speaking at the Hay Festival online on Monday 25 May at 4pm UK time. Because of the times we live in I’ll focus in particular on What the Pandemic Teaches Us About Innovation. Come along!

At the moment the book is not available in the US – sorry – but you can always pick up the first in the series, the US title being Fifty Inventions That Shaped The Modern Economy.

Receive these posts by email

(You can unsubscribe at any time)

21st of May, 2020 | Other Writing | Comments off

Why we fail to prepare for disasters

You can’t say that nobody saw it coming. For years, people had warned that New Orleans was vulnerable. The Houston Chronicle reported that 250,000 people would be stranded if a major hurricane struck, with the low-lying city left 20ft underwater. New Orleans’s Times-Picayune noted the inadequacy of the levees. In 2004, National Geographic vividly described a scenario in which 50,000 people drowned. The Red Cross feared a similar death toll. Even Fema, the Federal Emergency Management Agency, was alert: in 2001, it had stated that a major hurricane hitting New Orleans was one of the three likeliest catastrophes facing the United States.

Now the disaster scenario was becoming a reality. A 140mph hurricane was heading directly towards the city. More than a million residents were warned to evacuate. USA Today warned of “a modern Atlantis”, explaining that the hurricane “could overwhelm New Orleans with up to 20ft of filthy, chemical-polluted water”.

The city’s mayor, Ray Nagin, begged people to get away. He was reluctant to make evacuation mandatory because more than 100,000 people had no cars and no way of leaving. The roads out were jammed, anyway. Thousands of visiting conference delegates were stranded; the airport had been closed. There were no emergency shelters. Nagin mooted using a local stadium, the Louisiana Superdome, as a temporary refuge — but the Superdome was not necessarily hurricane-proof and Nagin was warned that it wasn’t equipped to be a shelter.

But then, the storm turned aside. It was September 2004, and New Orleans had been spared. Hurricane Ivan had provided the city, and the nation, with a vivid warning. It had demonstrated the need to prepare, urgently and on a dozen different fronts, for the next hurricane.

“In early 2005, emergency officials were under no illusions about the risks New Orleans faced,” explain Howard Kunreuther and Robert Meyer in their book The Ostrich Paradox. But the authorities did not act swiftly or decisively enough. Eleven months later, Hurricane Katrina drowned the city — and many hundreds of its residents. As predicted, citizens had been unable or unwilling to leave; levees had been breached in over 50 places; the Superdome had been an inadequate shelter.

Surely, with such a clear warning, New Orleans should have been better prepared to withstand Hurricane Katrina? It’s easily said. But as the new coronavirus sweeps the globe, killing thousands more people every day, we are now realising that New Orleans is not the only place that did not prepare for a predictable catastrophe.


In 2003, the Harvard Business Review published an article titled “Predictable Surprises: The Disasters You Should Have Seen Coming”. The authors, Max Bazerman and Michael Watkins, both business school professors, followed up with a book of the same title. Bazerman and Watkins argued that while the world is an unpredictable place, unpredictability is often not the problem. The problem is that faced with clear risks, we still fail to act.

For Watkins, the coronavirus pandemic is the ultimate predictable surprise. “It’s not like this is some new issue,” he says, before sending over the notes for a pandemic response exercise that he ran at Harvard University. It’s eerily prescient: a shortage of masks; a scramble for social distance; university leaders succumbing to the illness. The date on the document is October 12 2002. We’ve been thinking about pandemics for a long time.

Other warnings have been more prominent. In 2015, Bill Gates gave a TED talk called “The next outbreak? We’re not ready”; 2.5 million people had watched it by the end of 2019. In 2018, the science journalist Ed Yong wrote a piece in The Atlantic titled “The Next Plague Is Coming. Is America Ready?” Now we know the answer, and it wasn’t just the Americans who were unprepared.

Officialdom had also been sounding the alarm. The World Health Organization and the World Bank had convened the Global Preparedness Monitoring Board (GPMB), chaired by Elhadj As Sy of the Red Cross and Gro Harlem Brundtland, a former director of the WHO. The GPMB published a report in October warning of “a cycle of panic and neglect” and calling for better preparation for “managing the fallout of a high-impact respiratory pathogen”. It noted that a pandemic “akin to the scale and virulence of the one in 1918 would cost the modern economy $3 trillion”.

Alongside these authoritative warnings were the near misses, the direct parallels to Hurricane Ivan: Sars in 2003; two dangerous influenza epidemics, H5N1 in 2006 and H1N1 in 2009; Ebola in 2013; and Mers in 2015. Each deadly outbreak sparked brief and justifiable alarm, followed by a collective shrug of the shoulders.

It is understandable that we have too few doctors, nurses and hospital beds to cope with a pandemic: spare doctors are expensive. It is less clear why we have so few masks, are so unprepared to carry out widespread testing and didn’t do more to develop coronavirus vaccines after the Sars epidemic of 2003, which involved a strain related to the current outbreak. (There was a flurry of activity, but interest waned after 2004.) We were warned, both by the experts and by reality. Yet on most fronts, we were still caught unprepared. Why?


Wilful blindness is not confined to those in power. The rest of us should acknowledge that we too struggled to grasp what was happening as quickly as we should. I include myself. In mid-February, I interviewed an epidemiologist, Dr Nathalie MacDermott of King’s College London, who said it would likely prove impossible to contain the new coronavirus, in which case it might well infect more than half the world’s population. Her best guess of the fatality rate at the time was a little under one per cent. I nodded, believed her, did the maths in my head — 50 million dead — and went about my business. I did not sell my shares. I did not buy masks. I didn’t even stock up on spaghetti. The step between recognising the problem and taking action was simply too great.

Nor did the broadcast of my radio interview with MacDermott on the BBC seem to spark much in the way of disaster planning. Psychologists describe this inaction in the face of danger as normalcy bias or negative panic. In the face of catastrophe, from the destruction of Pompeii in AD79 to the September 11 2001 attacks on the World Trade Center, people have often been slow to recognise the danger and confused about how to respond. So they do nothing, until it is too late.

Part of the problem may simply be that we get our cues from others. In a famous experiment conducted in the late 1960s, the psychologists Bibb Latané and John Darley pumped smoke into a room in which their subjects were filling in a questionnaire. When the subject was sitting alone, he or she tended to note the smoke and calmly leave to report it. When subjects were in a group of three, they were much less likely to react: each person remained passive, reassured by the passivity of the others.

As the new coronavirus spread, social cues influenced our behaviour in a similar way. Harrowing reports from China made little impact, even when it became clear that the virus had gone global. We could see the metaphorical smoke pouring out of the ventilation shaft, and yet we could also see our fellow citizens acting as though nothing was wrong: no stockpiling, no self-distancing, no Wuhan-shake greetings. Then, when the social cues finally came, we all changed our behaviour at once. At that moment, not a roll of toilet paper was to be found.

Normalcy bias and the herd instinct are not the only cognitive shortcuts that lead us astray. Another is optimism bias. Psychologists have known for half a century that people tend to be unreasonably optimistic about their chances of being the victim of a crime, a car accident or a disease, but, in 1980, the psychologist Neil Weinstein sharpened the question. Was it a case of optimism in general, a feeling that bad things rarely happened to anyone? Or perhaps it was a more egotistical optimism: a sense that while bad things happen, they don’t happen to me. Weinstein asked more than 250 students to compare themselves to other students. They were asked to ponder pleasant prospects such as a good job or a long life, and vivid risks such as an early heart attack or venereal disease. Overwhelmingly, the students felt that good things were likely to happen to them, while unpleasant fates awaited their peers.

Robert Meyer’s research, set out in The Ostrich Paradox, shows this effect in action as Hurricane Sandy loomed in 2012. He found that coastal residents were well aware of the risks of the storm; they expected even more damage than professional meteorologists did. But they were relaxed, confident that it would be other people who suffered.

While I realise some people are paranoid about catching Covid-19, it’s egotistical optimism that I see in myself. Although I know that millions of people in the UK will catch this disease, my gut instinct, against all logic, is that I won’t be one of them. Meyer points out that such egotistical optimism is particularly pernicious in the case of an infectious disease. A world full of people with the same instinct is a world full of disease vectors. I take precautions partly because of social pressure and partly because, intellectually, I know they are necessary. But my survival instinct just isn’t doing the job, because I simply do not feel my survival is at stake.

The fact that the epidemic started in China, among ethnically Asian people, can only have ­deepened the sense of personal invulnerability in the west. As epidemiologist Neil Ferguson told the FT: “What had happened in China was a long way away, and it takes a certain type of person to take on board that this might actually happen here.”

The virus started to feel real to Europeans only when Europeans were suffering. Logically, it was always clear that the disease could strike middle-class people who enjoy skiing holidays in Italy; emotionally, we seemed unable to grasp that fact until it was too late.

A fourth problem, highlighted by Meyer’s co-author Howard Kunreuther, is what we might call exponential myopia. We find exponential growth counterintuitive to the point of being baffling — we tend to think of it as a shorthand for “fast”. An epidemic that doubles in size every three days will turn one case into a thousand within a month — and into a million within two months if the growth does not slow.
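The arithmetic behind that sentence is easy to verify. Here is a minimal sketch (the function name and structure are illustrative, not from the article), assuming a fixed doubling time and unchecked growth:

```python
def cases(days, doubling_time=3, initial=1):
    # Number of cases after `days`, starting from `initial`,
    # if the count doubles every `doubling_time` days.
    return initial * 2 ** (days // doubling_time)

print(cases(30))   # 1024: "a thousand within a month"
print(cases(60))   # 1048576: "a million within two months"
```

Thirty days is ten doublings (2^10 = 1,024); sixty days is twenty (2^20, just over a million) — which is exactly why linear intuition underestimates so badly.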

Donald Trump’s boast, on March 9, that there had been only 22 deaths in the US, was ill-judged in light of what we know about exponential growth, but he is hardly the only person to fail to grasp this point. In 1975, the psychologists William Wagenaar and Sabato Sagaria found that when asked to forecast an exponential process, people often underestimated by a factor of 10. The process in that study was much slower than this epidemic, doubling in 10 months rather than a few days. No wonder we find ourselves overtaken by events.

Finally, there’s our seemingly limitless capacity for wishful thinking. In a complex world, we are surrounded by contradictory clues and differing opinions. We can and do seize upon whatever happens to support the conclusions we wish to reach — whether it’s that the virus is being spread by 5G networks, is a hoax dreamed up by “the Dems” or is no worse than the flu.

Both Robert Meyer and Michael Watkins made an observation that surprised me: previous near misses such as Sars or Hurricane Ivan don’t necessarily help citizens prepare. It is all too easy for us to draw the wrong lesson, which is that the authorities have it under control. We were fine before and we’ll be fine this time.

This, then, is why you and I did not see this coming: we couldn’t grasp the scale of the threat; we took complacent cues from each other, rather than digesting the logic of the reports from China and Italy; we retained a sunny optimism that no matter how bad things got, we personally would escape harm; we could not grasp what an exponentially growing epidemic really means; and our wishful thinking pushed us to look for reasons to ignore the danger.


The true failure, however, surely lies with our leaders. We are humble folk, minding our own business; their business should be safeguarding our welfare, advised by expert specialists. You or I could hardly be expected to read Gro Harlem Brundtland’s October Global Preparedness Monitoring Board report, and if we did, it is not clear what action we could really take. Surely every government should have someone who is paying attention to such things?

Margaret Heffernan, the author of Uncharted, warns that the same mental failings that blind us to certain risks can do the same to our leaders.

“We hang around with people like ourselves and if they’re not fussed, we’re not fussed,” she says. “Gro Harlem Brundtland lives inside a global health institution, so she cares. Most politicians don’t.”

While politicians have access to the best advice, they may not feel obliged to take experts seriously. Powerful people, after all, feel sheltered from many everyday concerns. Heffernan argues that this sense of distance between the powerful and the problem shaped the awful response to Hurricane Katrina. Leaked emails show the response of Michael Brown, then the director of Fema.

One subordinate wrote: “Sir, I know that you know the situation is past critical. Here some things you might not know. Hotels are kicking people out, thousands gathering in the streets with no food or water… dying patients at the DMAT tent being medivac. Estimates are many will die within hours…”

Brown’s response, in its entirety, was: “Thanks for update. Anything specific I need to do or tweak?” That’s a sense of distance and personal impunity distilled to its purest form.

Sometimes, of course, the feeling of invulnerability is an illusion: in early March, the British Prime Minister Boris Johnson jovially declared that people would be “pleased to know” that he was shaking hands with everybody at a hospital tending to patients with coronavirus, and inviting people to make their own decisions about such matters. It was a shamefully irresponsible thing to say — but it also spoke volumes about his misplaced intuition that he could come to no harm. Within weeks, the story of Johnson had become a classical tragedy, the hero laid low by his own larger-than-life qualities.


We should acknowledge that even foreseeable problems can be inherently hard to prepare for. A pandemic, for example, is predictable only in broad outline. The specifics are unknowable. “What disease? When? Where?” says Heffernan. “It’s inherently unpredictable.”

The UK, for example, ran a pandemic planning exercise in October 2016, dubbed “Exercise Cygnus”. That forethought is admirable, but also highlights the problem: Cygnus postulated a flu pandemic, perhaps a strain of the H1N1 virus that killed tens of thousands in 2009, and many millions in 1918. Covid-19 is caused by a coronavirus instead, a relative of the Sars-Cov strain from the 2003 outbreak. Some of the implications are the same: we should stockpile personal protective equipment. Some, such as the danger of flu to young children, are different.

In any case, those implications seem broadly to have been ignored. “We learnt what would help, but did not necessarily implement those lessons,” wrote Professor Ian Boyd in Nature in March. Boyd had been a senior scientific adviser to the UK government at the time. “The assessment, in many sectors of government, was that the resulting medicine [in terms of policy] was so strong that it would be spat out.”

Being fully prepared would have required diverting enormous sums from the everyday requirements of a medical system that was already struggling to cope with the nation’s needs. The UK’s National Health Service was short of staff before the crisis began, seems to have had woefully inadequate stores of protective equipment for doctors and nurses, and has long pursued a strategy of minimising the use of hospital beds.

It’s this quest for efficiency above all else — in the NHS, and modern organisations in general — that leaves us vulnerable. The financial crisis taught us that banks needed much bigger buffers, but few carried the lesson over to other institutions, such as hospitals.

“On a good day, having 100 per cent of your intensive care beds in use looks efficient. The day a pandemic strikes is the day you realise the folly of efficiency. You’ve got to have a margin,” says Heffernan.

These margins are hard to maintain, though. In 2006, Arnold Schwarzenegger — then governor of California — announced an investment of hundreds of millions of dollars in medical supplies and mobile hospitals to deal with earthquakes, fires and particularly pandemics. According to the Los Angeles Times, emergency response teams would have access to a stockpile including “50 million N95 respirators, 2,400 portable ventilators and kits to set up 21,000 additional patient beds wherever they were needed”. It was impressive.

But after a brutal recession, Schwarzenegger’s successor, Jerry Brown, cut the funding for the scheme, and the stockpile is nowhere to be found. Brown isn’t the only one to look for something to cut when funds are tight. Managers everywhere have long been promoted on their ability to save money in the short term.

I spoke to a friend of mine, a senior NHS consultant who had contracted Covid-19 as he tended his patients. Recovering in self-isolation, he reminisced about the days that he was told to find cuts of five to 10 per cent — and the fact that his hospital was no longer providing coffee for staff meetings as a cost-saving exercise. That seems like a memo from another era — but it was just a few weeks ago. As the cost-saving measures were being introduced in the UK, Italians had started to die.


The pandemic has offered us few easy choices so far. Nor are there many easy answers to the general problem of preparing for predictable catastrophes. It is too tempting to look at a near miss like Hurricane Ivan or Sars and conclude that since the worst did not happen then, the worst will not happen in the future. It is tempting, too, to fight the last war: we built up reserves in banking after the financial crisis, but we did not pay attention to reserve capacity in health, vaccine production and social care.

Preparedness is possible. Margaret Heffernan points to Singapore, a tiny country with front-line experience of Sars, acutely aware of its geographical vulnerability.

“The foresight unit in Singapore is the best I’ve ever encountered,” she says. “There are serious people working through very serious scenarios, and there’s a diversity of thinking styles and disciplines.”

Serious scenarios are useful, but as the UK’s Exercise Cygnus demonstrated, serious scenarios are no use if they are not taken seriously. That means spending money on research that may never pay off, or on emergency capacity that may never be used. It is not easy to justify such investments with the day-to-day logic of efficiency.

Singapore isn’t the only place to have prepared. Almost four years ago, philanthropists, governments and foundations created the Coalition for Epidemic Preparedness Innovations. Cepi’s mission is to support and develop technologies and systems that could create vaccines more quickly. While the world chafes at the idea that a vaccine for the new coronavirus might take more than a year to deploy, such a timeline would have been unthinkably fast in the face of earlier epidemics. If such a vaccine does arrive within a year — there is no guarantee it will arrive at all — that will be thanks to the likes of Cepi.

Still, we are left wondering what might have been if Cepi had existed just a few years earlier. In October 2019, for example, it started funding vaccine “platform” technologies to enable a more agile, rapid response to what it called “Disease X… a rapidly moving, highly lethal pandemic of a respiratory pathogen killing 50 [million] to 80 million people and wiping out nearly 5 per cent of the world’s economy”. That’s preparedness; alas ­Disease X may have arrived just a little too soon for the preparedness to bear fruit.


And what of New Orleans? In the summer of 2017, it was underwater again. A vast and expensive system of pumps had been installed, but the system was patchy, under-supplied with power and unable to cope with several weeks of persistent rain. It does not inspire confidence for what will happen if a big hurricane does strike.

Robert Meyer says that while the city has learnt a lot about preparation, “Katrina was not close to the worst-case scenario for New Orleans, which is a full category-five storm hitting just east of the city”.

The same may be true of the pandemic. Because Covid-19 has spread much faster than HIV and is more dangerous than the flu, it is easy to imagine that this is as bad as it is possible to get. It isn’t. Perhaps this pandemic, like the financial crisis, is a challenge that should make us think laterally, applying the lessons we learn to other dangers, from bioterrorism to climate change. Or perhaps the threat really is a perfectly predictable surprise: another virus, just like this one, but worse. Imagine an illness as contagious as measles and as virulent as Ebola, a disease that disproportionately kills children rather than the elderly.

What if we’re thinking about this the wrong way? What if instead of seeing Sars as the warning for Covid-19, we should see Covid-19 itself as the warning?

Next time, will we be better prepared?

Written for and first published in the FT Magazine on 18/19 April 2020.

My NEW book The Next Fifty Things That Made the Modern Economy is out in the UK in two weeks and available to pre-order; please consider doing so online or at your local bookshop – pre-orders help other people find the book and are a BIG help.

Receive these posts by email

(You can unsubscribe at any time)

9th of May, 2020 | Highlights | Other Writing | Comments off

Remembering Peter Sinclair

Peter Sinclair died yesterday, after many days in hospital with covid-19. It’s a heavy blow. Peter was an inspirational economics teacher and a wonderfully kind man. Peter inspired a generation of great economists and economics journalists, including Dave Ramsden (long-time head of the Government Economic Service), Camilla Cavendish, Tim Leunig, Evan Davis, and Diane Coyle, who posts her own memories. He also taught David Cameron. I’m envious of all of them because Peter left for the University of Birmingham before I had the chance to have the full benefit of his teaching. My loss, Birmingham students’ gain.

But even in a few short months he had a profound influence on me. When I was floundering in my Oxford entrance interviews – I hadn’t got a clue what was going on as I was being grilled by the formidable philosophy tutor – Peter was the one beaming and nodding and encouraging, as though everything was going brilliantly. And when I decided to drop economics and specialise in philosophy, Peter took the trouble to send a long, handwritten letter, full of encouragement, gently suggesting that I reconsider. I remember it vividly. I took his advice. It changed my life.

Peter had many friends, and went to great lengths to keep in touch with and support his former students. They will all be grieving today. My thoughts are very much with his wife Jayne, his family, and his close friends.

I bumped into him on the street a couple of years ago. He was smiling, bobbing around, waving enthusiastically, behaving as though there was nobody in the world he’d rather see. That extravagant friendliness was so very like him. It is how I’ll remember him.

1st of April, 2020 | Other Writing | Comments off

How not to lose your mind in the Covid-19 age

There are as many responses to the Covid-19 pandemic as there are people to respond. Some of us have children to home-school. Some of us have elderly relatives to worry about; some of us are the elderly relatives in question. Some of us have never been busier; others have already lost their jobs.

One experience is common, however: wherever the virus has started to spread, life is changing radically for almost everyone. It’s a strange and anxious time, and some of the anxiety is inevitable. For many people, however, much of the stress can be soothed with – if you will pardon the phrase – one weird trick.

First, a diagnosis. Most of us, consciously or not, have a long list of things to do. As the virus and the lockdowns have spread, many of the items on the to-do list have simply evaporated. At the same time, a swarm of new tasks have appeared, multiplying by the day: everything from the small-yet-unfamiliar (“get toilet paper” and “claim refund on cancelled holiday”) to the huge-and-intimidating (“organise an inspiring home-school curriculum” or “find a new job”).

The change is so fast and comprehensive that for most of us it is unprecedented. Even a divorce or an international relocation is more gradual. The death of a spouse might be the only experience that comes close. No wonder that even those of us who are safe and well and feel loved and financially secure find ourselves reeling at the scale of it all.

To the extent that the problem is that the to-do list is unrecognisable, the solution is oddly simple: get the to-do list back in order. Here’s how.

Get a piece of paper. Make a list of all the projects that are on your mind. David Allen, author of the cult productivity manual Getting Things Done, defines a project as “any multistep outcome that can be completed within a year”. So, yes: anything from trying to source your weekly groceries to publishing a book.

That list should have three kinds of projects on it.

First, there are the old projects that make no sense in the new world. For those that can be mothballed until next year, write them down and file them away. Others will disappear forever. Say your goodbyes. Some part of your subconscious may have been clinging on, and I’m going to guess that ten seconds of acknowledging that the project has been obliterated will save on a vague sense of unease in the long run.

Second, there are the existing projects, some of which have become more complicated in the mid-pandemic world. Things that you might previously have done on automatic may now require a little thought. Again, a few moments with a pen and paper will often tell you all you need to know: what’s changed? What do I now need to do? What, specifically, is my next action? Write it down.

Third, there are brand new projects. I, for example, need to rewrite the introduction to my forthcoming book (How To Make The World Add Up, since you were wondering). It’s going to seem mighty strange without coronavirus references in it. Many of us need to devote more than a little attention to the sudden appearance of our children at home. Some of us need to hunt for new work; others, for a better home-office set-up. Many of us are now volunteering to look after vulnerable neighbours. In each case, the drill is the same: sketch out the project, ask yourself what the very next step is, and write it down.

Occasionally, you may encounter something that’s on your mind – the fate of western civilisation, for example, or the fact that the health service desperately needs more ventilators and more protective equipment. For my family, it’s an elderly relative, suffering from dementia, in a locked-down nursing home. We can’t visit him. He can’t communicate on the phone or comprehend a video chat. There is, for now, literally nothing we can do but wait and hope. Acknowledging that fact – that there is no action to be taken – is itself a useful step.

I won’t pretend that in this frightening time, working through your to-do list in a systematic way will resolve all anxieties. It won’t. But you may be surprised at how much mental energy it saves – and at the feeling of relief as all these confusing and barely acknowledged new responsibilities take shape and feel more under your control.

Or so it seems to me. Good luck, and keep safe.


Oh – and in case it wasn’t obvious, this week’s Book of the Week is David Allen’s superb Getting Things Done.

My NEW book The Next Fifty Things That Made the Modern Economy is out in the UK in May and available to pre-order; please consider doing so online or at your local bookshop – pre-orders help other people find the book and are a huge help.


29th of March, 2020 | Marginalia | Other Writing | Resources | Comments off

Book of the week 11: Uncharted by Margaret Heffernan

“The sagacious businessman is constantly forecasting,” said the great economist Irving Fisher, a man thoroughly convinced of the power of data to make the future legible. Fisher transformed economics and made millions as an entrepreneur, but died in penury. He is now best remembered as the tragic figure who, shortly before the cataclysmic Wall Street crash of 1929, informed the nation: “Stocks have reached what looks like a permanently high plateau.”

Poor Professor Fisher appears early on in Uncharted. Margaret Heffernan’s book is less a smackdown of failed forecasts than an engaging ramble across our attempts to predict, control, explore or embrace an uncertain future. Heffernan is admired for books that question the received wisdom of how management works; she is a business guru who brings the stern discipline of good sense to the business book genre. In this book, she turns her attention to a topic that absorbs most business leaders — and the rest of us too: how to think about what the future holds. Gazing into the future is not fruitless, she argues, but it is unnerving and hard work. Lazy and fearful, we are far too quick to reach for overblown gurus, or misleading data or other useless guides. Even a good tool, such as GPS, can dull our senses.

“What matters most isn’t the predictions themselves but how we respond to them, and whether we respond to them at all,” she writes. “The forecast that stupefies isn’t helpful, but the one that provides fresh thinking can be.”

And fresh thinking is what Heffernan wishes to provoke, mostly through storytelling, occasionally through rhetoric. Are we trapped by history? Only if we let our own narratives confine us. Can parents use an app to “predict life outcomes and . . . maximise the life-long potential of your child”? No. She finds the idea appalling.

Better, she suggests, to explore, empower, experiment. Whether you’re running a multinational, pondering a career change or being a parent, the same wisdom applies: sometimes things go wrong, or go right, and we don’t know why. Keep your eyes open. Stay engaged. Listen to others. Don’t be afraid to change course. Contribute to your community, and make connections before trouble strikes: “Don’t exchange business cards in a crisis.”

At times, Uncharted resembles a collection of secular sermons illustrated with a story. Heffernan stands in the pulpit quietly admonishing us to be a little wiser, reflect a little more, to do the things that deep down we already know we should be doing.

Moments of counterintuitive astonishment are scarce, but the book is probably better for that. And it largely avoids the usual suspects: Apple, Google, 3M, the US military. Instead, we find ourselves in the shoes of a disillusioned Catholic priest, realising he has fallen in love and getting no help from the Church. Or in a room with a diverse group of Mexicans, from mobsters to senators, as they try to explore the future with a scenario-planning exercise. Or with the management of Nokia, wondering if there is life after cell phones. These are subtle tales of struggle and compromise.

The storytelling is not without its flaws. Physicist Marzio Nessi morphs into a Mr Messi, who is surely a different kind of genius. A discussion of fresh ideas in healthcare required multiple re-readings to sort out who was doing what, where, and whether these were diverse experiments across the nation. More than once I checked the index because I assumed I’d missed something. These are small things, but in a book that tries to flow so freely across so many stories, they are barnacles that produce a drag.

That said, Heffernan is generally a deft storyteller and the book’s reliance on such stories is a strength. Bad “smart thinking” books offer 2×2 matrices and jargon; good ones offer theory and evidence. Heffernan steps outside the category entirely. She wants us to engage with the particularities of people, places and the problems they faced — to empathise with them, reflect on our own lives and our own careers, and to draw our own conclusions.

Uncharted is not a book to skim in the business class lounge. Heffernan’s approach is more like a music lover trying to broaden the appreciation of a patient friend. “Here’s an example; listen to this; here’s another. Compare, contrast. Now do you see what I’m getting at?” It is messy, and occasionally frustrating, but wise and appealingly human.

UK: AmazonBlackwell’s

US: Amazon – Powell’s (Publishes Sep 2020)
Written for and first published in the Financial Times on 19 February 2020.

Catch up on the first season of my podcast “Cautionary Tales” [Apple] [Spotify] [Stitcher]

My book “Fifty Things That Made the Modern Economy” (UK) / “Fifty Inventions That Shaped The Modern Economy” (US) is out now in paperback – feel free to order online or through your local bookshop.


16th of March, 2020 | Marginalia | Other Writing | Comments off

The changing face of economics

Robert Solow, the Nobel laureate economist, says he had long been “bothered” by the fact that most people — even educated people — “had no clear idea of what economics is, and what economists do”.

Solow was born in Brooklyn in 1924, to what he has described as a “lower-middle-class family”, and grew up during the Great Depression.

Although his father always had work, Solow has said that from about the age of eight onwards, he was conscious that his parents were constantly worrying, “and their worries were purely economic: what was going to happen, could they continue to make ends meet”.

This awareness would shape his thinking throughout his life. He won a scholarship to Harvard at 16 and began an academic career that would see him reach the top of his field, winning the Nobel in 1987 for his contributions to the theory of economic growth.

Yet despite such acclaim, Solow, who is now 95, felt that his subject remained frustratingly opaque to the general public.

Then, a few years ago, he was seated by chance next to the photographer Mariana Cook at a friend’s dinner party. Cook had recently completed a project photographing 92 mathematicians, ranging from Fields Medal winners to promising young men and women at the start of their careers.

Solow suggested that she embark on a similar series of portraits, but of economists — and Cook agreed.

As he writes in the introduction to the resulting book, which contains 90 black-and-white portraits shot by Cook over the course of three years: “The idle thought became a reality, and I found myself involved in many ways. Naturally, I had to ask myself: was making a book of portraits of academic economists a useful or reasonable or even a sane thing to do?”

It is a fair question. Economics remains a perplexing discipline. It is often regarded as purely the study of money. (Far from it: indeed, some critics complain that economists aren’t as interested in studying money as they should be.) It is easily caricatured as overly mathematical, full of absurdly unrealistic assumptions, elitist and corrupted by proximity to business and finance.

And, as with any caricature, there is some truth in all of these complaints.

So what actually is economics? Alfred Marshall began his enduringly influential 1890 book Principles of Economics: “Political economy or economics is a study of mankind in the ordinary business of life; it examines that part of individual and social action which is most closely connected with the attainment and with the use of the material requisites of wellbeing.”

“The ordinary business of life.” It is not a bad definition, even now. But economics has changed since Marshall’s day. What is being studied has changed, and how, and even who does the studying.

Start with the “what”. It might seem obvious that economists should stick to the study of the economy — the production and consumption of goods and services that are either traded in markets or could be. They never really did stay in their lane: Thomas Robert Malthus was a proto-environmentalist and an inspiration for Charles Darwin; John Stuart Mill was a philosopher; John Maynard Keynes was intellectually promiscuous.

But it was Gary Becker and his followers who systematically applied the methodological tools of economics to social issues such as racial discrimination, the family and addiction.

Some of the ideas Becker championed — notably the use of education to improve “human capital” — became so mainstream as to be a cliché. Others remain controversial.

But nobody bats an eyelid when the economist Emily Oster publishes books of advice on pregnancy and parenting, when Steven “Freakonomics” Levitt opines on when to rob a bank, or even when the Financial Times publishes a column using economics to give tips on dating and etiquette. Economic imperialism is here to stay.

The “how” is also changing. Twenty years ago, the economist Ed Lazear published a paper, “Economic Imperialism”, with Becker at its centre.

Lazear argued that economic imperialism had been a success because “economics stresses three factors that distinguish it from other social sciences. Economists use the construct of rational individuals who engage in maximising behaviour. Economic models adhere strictly to the importance of equilibrium as part of any theory. Finally, a focus on efficiency leads economists to ask questions that other social sciences ignore.”

This is, I think, a fair summary of the state of play in 1999. But two decades on, economics is no longer quite so taken with the assumption of rationality. With Nobel memorial prizes for behavioural economics going to Daniel Kahneman (2002), Robert Shiller (2013) and Richard Thaler (2017), it has now become perfectly acceptable to publish economics papers with an alternative view of human decision-making.

That is not the only change in the toolkit of economics. The first modern randomised clinical trial was run by a man trained in economics, Austin Bradford Hill, in the late 1940s — but the methodology did not become widespread in economics until the 21st century.

The randomistas — most prominently the 2019 Nobel laureates Abhijit Banerjee, Esther Duflo and Michael Kremer — put the experimental results centre stage; the considerations that Lazear highlighted are not forgotten, but they are left in the wings.

Other economists are broadening the tools of economics by taking advantage of huge datasets and operating on the fringes of computer science. Two prominent examples are Susan Athey — the first female winner of the John Bates Clark Medal — and Raj Chetty, who won the same prize at the tender age of 33. Among the sources of this new data rush are internet traffic, cell-phone metadata, satellite imagery and the ballooning administrative datasets used by large organisations to run their businesses.

If the “how” is changing quickly, the “who” is stubbornly resistant to change. Economists used to be white and male. Now they are mainly white or Asian, and male.

Of course, there are some spectacular exceptions: in 2005, when I began writing my column for the FT, no woman had yet won the Nobel memorial prize in economics. There are now two.

Even more perplexingly — given that the award is for younger researchers — no woman had yet won the John Bates Clark Medal. There are now four, which is progress. Women such as Elinor Ostrom, Claudia Goldin and Janet Yellen have reached the very top of the profession, as did the late Alice Rivlin.

But economics still lacks the diversity it needs to reach its full potential. The Royal Economic Society has launched a “Discover Economics” campaign to address this, but it will take more than a recruitment drive: a 2014 study, “Women in Academic Science”, concluded that while other academic disciplines had been levelling the playing field, economics was an exception. We need to do better.

Economics is a controversial discipline, and that is not likely to change. Whereas scientists only occasionally have to dip their toes into political waters such as climate change or vaccination, most of what economists study — from inequality to immigration, trade to taxation — lies squarely in the middle of the political battlefield.

Still, some of us are doing our best, and all of us are human, as these portraits show. It is nice to be reminded of that.

Mariana Cook’s book is “Economists”.
Written for and first published in the Financial Times on 21 December 2019.




20th of January, 2020 | Other Writing