Tim Harford: The Undercover Economist

My weekly column in the Financial Times on Saturdays, explaining the economic ideas around us every day. This column was inspired by my book and began in 2005.

Undercover Economist

When doing nothing is the best option

The leaders of the free world are returning from their holidays. Must they? Surely no good can come of this.

While on vacation Donald Trump managed to eject most of his advisers, threaten a nuclear war with an unabashed North Korea, and display an unnerving willingness to see things from the Nazi point of view. Goodness knows what he will do now he’s fully back on the job. Theresa May returned from her Easter holiday with the splendid idea of calling a snap general election, so I can hardly contain my excitement as I wait for her latest brainstorm.

A flawed leader leaves us grateful for the quiet days, and one of the saving graces of Mr Trump’s administration is that, while he has many bad ideas, he is not always committed to them. Promising to build a wall, rip up Nafta and discriminate against Muslims and transgender people is damaging enough, but at least the follow-through has been patchy. It is a fragile mercy, but Mr Trump seems to prefer complaining about the US government to leading it.

Mrs May’s lack of leadership is more valuable. The British people have dealt the British establishment an unplayable hand: a parliament strung out between several lunatic fringes, and a referendum result that is hard to interpret and even harder to deliver. With the prime minister powerless, her ministers are showing signs of quiet realism. Yes, the country is chugging towards a train-crash Brexit, but at least our politicians are tying fewer hostages to the tracks.

Since I disagree with most of what Mrs May and Mr Trump are trying to do, I might be expected to celebrate every day on which they do not do it. But there may be a deeper principle here: in many areas of life we demand action when inaction would serve us better.

The most obvious example is in finance, where too many retail investors trade far too often. One study, by Brad Barber and Terrance Odean, found that the more retail investors traded, the further behind the market they lagged: active traders underperformed by more than 6 percentage points (a third of total returns) while the laziest investors enjoyed the best performance.

This is because dormant investors not only save on trading costs but avoid ill-timed moves. Another study, by Ilia Dichev, noted a distinct tendency for retail investors to pile in when stocks were riding high and to sell out at low points.

It would be nice to recommend laziness as a universal principle, but alas many companies have turned consumer inertia into a revenue stream. Sometimes we must rouse ourselves to cancel a gym membership or find a cheaper insurance policy. Still, there are many situations where doing nothing is a sound tactic.

The same can be said of medicine. It is a little unfair on doctors to point out that when they go on strike, the death rate falls. Nevertheless it is true. It is also true that we often encourage doctors to act when they should not. In the US, doctors tend to be financially rewarded for hyperactivity; everywhere, pressure comes from anxious patients. Wiser doctors resist the temptation to intervene when there is little to be gained from doing so — but it would be better if the temptation was not there.

Some politicians expertly dodge demands for action. Tony Blair was often accused of recycling announcements, turning a single policy into a dozen press releases. But better one decent policy announced a dozen times than a dozen half-baked policies each announced once.

The argument for passivity has been strengthened by the rise of computers, which are now better than us at making all sorts of decisions. We have been resisting this conclusion for 63 years, since the psychologist Paul Meehl published Clinical vs. Statistical Prediction. Meehl later dubbed it “my disturbing little book”: it was an investigation of whether the informal judgments of experts could outperform straightforward statistical predictions on matters such as whether a felon would violate parole.

The experts almost always lost, and the algorithms are a lot cleverer these days than in 1954. It is unnerving how often we are better off without humans in charge. (Cue the old joke about the ideal co-pilot: a dog whose job is to bite the pilot if he touches the controls.)

Perhaps it is no coincidence that many august institutions are designed not to support wise action but to prevent foolishness. Supreme courts, independent central banks and the EU are often at their best when applying the brakes. No wonder so many of the deepest Eurosceptics — from Jeremy Corbyn to Marine Le Pen — are the politicians with the longest list of self-harming policies.

It is human nature to believe something must always be done. Yet we overrate our ability to do it, and it is awfully hard to make the case for passivity. The task is not made easier by campaigners wanting a policy, newspapers wanting a story or patients wanting a pill. Who dares to offer them nothing?

Written for and first published in the Financial Times on 1 September 2017.

My new book is “Fifty Inventions That Shaped The Modern Economy”. Grab yourself a copy in the US or in the UK (slightly different title) or through your local bookshop.


Undercover Economist

Trump, Bannon, and the terrible lure of zero-sum thinking

As visual metaphors go, it wasn’t bad: Donald Trump ignoring expert advice and risking calamity by staring up at the sun as the moon’s shadow passed across America. Self-destructiveness has become a habit for this president — and for his advisers. A recent example: former White House chief strategist Steve Bannon called Robert Kuttner, a prominent progressive journalist, to declare that his internal foes in the administration were “wetting themselves”. Shortly after Mr Kuttner wrote about the conversation, Mr Bannon was out.

But the truly harmful temptation here is not eclipse-gazing or indiscreet interviews. It’s another idea that Mr Bannon proposed to Mr Kuttner: that the US was in “an economic war with China”. It seems intuitive; many ordinary Americans feel that they cannot win unless China loses. But the world economy is not like a game of football. Everyone can win, at least in principle. Or everyone can lose. Falling for Mr Bannon’s idea of economic war makes the grimmer outcome far more likely.

Like many dangerous ideas, there is some truth in it. The American middle class has been suffering while China has been booming. Branko Milanovic, author of Global Inequality, has produced a striking elephant-shaped graph showing how, since the late 1980s, the rich have been doing well, as have many other groups, including the Asian middle class. But earnings near, though not at, the top of the global income ladder have stagnated. That does not demonstrate harm from China: there is the fall of the Soviet Union to consider, and the struggles of Japan.

However, another study, from David Autor, David Dorn and Gordon Hanson, has shown the lasting impact of the “China shock”. It was no surprise that competition from China put some Americans out of work, but Mr Autor and his colleagues showed that the effects were more locally concentrated, deeper and more enduring than expected. These are important and worrying findings.

But Mr Bannon’s “economic war” is a cure far worse than the disease, and a misdiagnosis of how the world economy works. America has still benefited from trade with Asia, and attacking China — even metaphorically — will do nothing for the American middle class. This is because it is surprisingly hard to find a zero-sum game in the real world.

Most commercial transactions offer benefits to both sides, otherwise why would they take place at all? A trip to a restaurant provides good food and a pleasant evening for me, gainful employment for the waiting staff and the chef, and a lively environment for the neighbourhood. Everyone can gain. There are zero-sum elements to the affair: every penny I hand over is a loss to me and a gain to the restaurant staff or owner. But it is best all round not to obsess too much over such matters.
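A toy calculation makes the point; the numbers are mine, purely illustrative. Suppose I value the evening at £40, the meal costs the restaurant £25 to provide, and the bill comes to £30. Then

\[
\underbrace{(40-30)}_{\text{my gain}} \;+\; \underbrace{(30-25)}_{\text{restaurant's gain}} \;=\; 40-25 \;=\; 15,
\]

so £15 of value is created whichever price is charged. The bill only decides how that £15 is split: the split is the zero-sum element, buried inside a transaction that is positive-sum overall.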

Zero-sum thinking apparently makes for good politics but bad policy. The UK government has shown an unnerving tendency to treat its EU negotiations as a zero-sum affair, in which the Europeans can “go whistle”, in the words of foreign secretary Boris Johnson.

In the Brexit referendum the Vote Leave campaign turned on a zero-sum claim: we send money to the EU, we should spend it on ourselves. The form of the argument was as powerfully misleading as the details: the focus on membership fees pulled the attention of voters away from the idea of the EU as a club of co-operating nations.

Populists of all stripes focus on zero-sum arguments because they’re easy to explain and emotionally appealing. Any toddler understands the idea of grabbing what someone else has; most adults prefer a situation where everyone gains.

The theory of zero-sum games was developed by the mathematician John von Neumann and the economist Oskar Morgenstern in their famous 1944 book, Theory of Games and Economic Behavior. It works fine for analysing chess and poker, but by itself zero-sum thinking is not much use to an economist who analyses a world full of win-win situations, of gains from trade.

Zero-sum thinking is not even that helpful to a military strategist. Von Neumann was a cold war hawk: “If you say bomb the Soviets tomorrow, I say why not today?”, Life magazine quoted him as saying. “If you say bomb them at five o’clock, I say why not one o’clock?” He was a genius, but it does not take a genius to see the blind spot in his thinking.

The populists may lack the genius but they have the same blind spot. Not coincidentally, the focus on zero-sum rhetoric has drawn attention away from more plausible solutions, many of which are purely domestic: higher quality education, publicly funded infrastructure investment, antitrust action to keep markets functioning competitively, and a more constructive welfare state that supports and encourages work rather than stigmatising and punishing idleness.

The biggest risk is that zero-sum thinking becomes self-fulfilling. Given oxygen by years of slow growth, it will lower growth further. By emphasising conflict it will intensify it. The US is not in an economic war with China, but could start one. That might help Mr Trump. It would not help those he claims to defend.

Written for and first published in the Financial Times on 25 August 2017.

My new book is “Fifty Inventions That Shaped The Modern Economy”. Grab yourself a copy in the US or in the UK (slightly different title) or through your local bookshop.


Undercover Economist

The psychological biases that leave us unprepared for disaster

This column was written and first published a week before Hurricane Harvey struck the US coast. – TH

Who saw the global financial crisis coming, who didn’t and who deserved blame for the forecasting failure? After a decade of debating these questions, I wonder whether we shouldn’t be asking a different one: even if we had clearly seen the crisis coming, would it have made a difference? Perhaps — but perhaps not.

Consider New Orleans in 2004. With a terrible hurricane bearing down on the city, officials realised that the situation was grim. The levees were in disrepair and a storm surge could flood the low-lying city. A hundred thousand residents would be unable to evacuate without help, and not enough help was available. A plan was hatched to evacuate families to the Superdome, a sports stadium, but managers there warned that it simply could not house so many. If only there had been more warning of disaster.

Some readers will recall, though, that the catastrophe of Hurricane Katrina took place in 2005. The storm of 2004 was Hurricane Ivan, which, after lashing the Caribbean, weakened and turned aside from New Orleans. The city had been given almost a full year’s warning of the gaps in its defences.

The near miss led to much discussion but little action. When Hurricane Katrina hit the city, evacuation proved as impractical and the Superdome as inadequate as had been expected. The levees broke in more than 50 places, and about 1,500 people died. New Orleans was gutted. It was an awful failure but surely not a failure of forecasting.

Robert Meyer and Howard Kunreuther, in The Ostrich Paradox, argue that it is common for institutions and ordinary citizens to make poor decisions in the face of foreseeable natural disasters, sometimes with tragic results. There are many reasons for this, including corruption, perverse incentives or political expediency. But the authors focus on psychological explanations. They identify cognitive rules of thumb that normally work well but serve us poorly in preparing for extreme events.

One such mental shortcut is what the authors term the “amnesia bias”, a tendency to focus on recent experience. We remember more distant catastrophes but we do not feel them viscerally. For example, many people bought flood insurance after watching the tragedy of Hurricane Katrina unfold, but within three years demand for flood insurance had fallen back to pre-Katrina levels.

We cut the same cognitive corners in finance. There are many historical examples of manias and panics but, while most of us know something about the great crash of 1929, or the tulip mania of 1637, those events have no emotional heft. Even the dotcom bubble of 1999-2001, which should at least have reminded everyone that financial markets do not always give sensible price signals, failed to make much impact on how regulators and market participants behaved. Six years was long enough for the lesson to lose its sting.

Another rule of thumb is “optimism bias”. We are often too optimistic, at least about our personal situation, even in the midst of a more generalised pessimism. In 1980, the psychologist Neil Weinstein published a study showing that people did not dwell on risks such as cancer or divorce. Yes, these things happen, Professor Weinstein’s subjects told him: they just won’t happen to me.

The same tendency was on display as Hurricane Sandy closed in on New Jersey in 2012. Robert Meyer found that residents of Atlantic City reckoned that the chance of being hit was more than 80 per cent. That was too gloomy: the National Hurricane Center put it at 32 per cent. Yet few people had plans to evacuate, and even those who had storm shutters often had no intention of installing them.

Surely even an optimist should have taken the precaution of installing the storm shutters? Why buy storm shutters if you do not erect them when a storm is coming? Messrs Meyer and Kunreuther point to “single action bias”: confronted with a worrying situation, taking one or two positive steps often feels enough. If you have already bought extra groceries and refuelled the family car, surely putting up cumbersome storm shutters is unnecessary?

Reading the psychological literature on heuristics and bias sometimes makes one feel too pessimistic. We do not always blunder. Individuals can make smart decisions, whether confronted with a hurricane or a retirement savings account. Financial markets do not often lose their minds. If they did, active investment managers might find it a little easier to outperform the tracker funds. Governments, too, can learn lessons and erect barriers against future trouble.

Still, because things often do work well, we forget. The old hands retire; bad memories lose their jolt; we grow cynical about false alarms. Yesterday’s prudence is today’s health-and-safety-gone-mad. Small wonder that, 10 years on, senior Federal Reserve official Stanley Fischer is having to warn against “extremely dangerous and extremely short-sighted” efforts to dismantle financial regulations. All of us, from time to time, prefer to stick our heads in the sand.

Written for and first published in the Financial Times on 18 August 2017.

My new book is “Fifty Inventions That Shaped The Modern Economy”. Grab yourself a copy in the US or in the UK (slightly different title) or through your local bookshop.


Undercover Economist

Challenge is all too easily ducked by today’s knowledge workers

“The man whose whole life is spent in performing a few simple operations, of which the effects, too, are perhaps always the same . . . generally becomes as stupid and ignorant as it is possible for a human creature to become.” This anxiety about the stupefying effects of cog-in-a-machine manufacturing sounds like a line from Karl Marx. It is, in fact, from Adam Smith’s The Wealth of Nations.

As the anniversary of Smith’s death was this week, it seemed like a good moment to reflect on the Scottish philosopher’s warning about the deadening effect of repetitive work. Smith knew that specialisation and the division of labour weren’t about to disappear, so he advocated publicly funded schools as a path to more fulfilling work and leisure.

The emergence of mass production lines made Smith’s words seem prophetic; but many repetitive jobs have since been taken by machines. So, has his warning about stultifying work been rendered obsolete?

The Wealth of Nations is almost a quarter of a millennium old, and we should not expect every word to ring true today. But correctly read, Smith’s anxiety continues to resonate — not just for people with repetitive jobs, but for knowledge workers too.

The modern knowledge worker — a programmer, a lawyer, a newspaper columnist — might appear inoculated from Smith’s concern. We face not monotony but the temptations of endless variety, with the entire internet just a click away. All too easily, though, we can be pulled into the soothing cycle of what slot-machine designers call a “ludic loop”, repeating the same actions again and again. Check email. Check Facebook. Check Instagram. Check Twitter. Check email. Repeat.

Smith would not have dreamt of a smartphone, but what is a ludic loop but “performing a few simple operations, of which the effects, too, are perhaps always the same”?

Smith was concerned about jobs that provided no mental challenge: if problems or surprises never arose, then a worker “has no occasion to exert his understanding, or to exercise his invention, in . . . removing difficulties which never occur.”

For the modern knowledge worker, the problem is not that the work lacks challenge, but that the challenge is easily ducked. This point is powerfully made by computer scientist Cal Newport in his book Deep Work. Work that matters is often difficult. It can be absorbing in mid-flow and satisfying in retrospect, but it is intimidating and headache-inducing and full of false starts.

Email is easier. And reading Newport’s book I realised that email posed a double temptation: not only is it an instant release from a hard task, but it even seems like work. Being an email ninja looks professional and seems professional — but all too often, it is displacement activity for the work that really matters.

A curious echo of Smith’s warning comes in Robert Twigger’s new book Micromastery. Mr Twigger sings the praises of mastering one small skill at a time: not how to cook, but how to make the perfect omelette; not how to build a log cabin, but how to chop a log. There is much to be said for this. We go deep — as Newport demands — but these sharp spikes of skill are satisfying, not too hard to acquire and a path to true expertise.

They also provide variety. “Simply growing up in the premodern period guaranteed a polymathic background,” writes Twigger. To prosper in the premodern era required many different skills; a smart person would be able to see a problem from many angles. A craft-based, practical upbringing means creative thinking comes naturally. “It is only as we surge towards greater specialisation and mechanisation that we begin to talk about creativity and innovation.”

I draw three lessons from all this. The first is that learning matters. Smith wanted schooling for all; Twigger urges us to keep schooling ourselves. Both are right.

The second is that serious work requires real effort, and it can be tempting to duck that effort. Having the freedom to avoid strenuous thinking is a privilege I am glad to have — but I am happier when I don’t abuse that freedom.

The third lesson is that old-fashioned craft offered us something special. To Smith it was the challenge that came from solving unpredictable problems. To Twigger it is the variety of having to do many small things well. To Newport, it is the flow that comes from deep immersion in a skill that requires mastery. Perhaps all three mean the same thing.

Smith realised that the coming industrial age threatened these special joys of work. The post-industrial age threatens them too, in a rather different way. Fortunately, we have choices.

“The understandings of the greater part of men are necessarily formed by their ordinary employments,” wrote Smith. So whether at work or at play, let us take care that we employ ourselves wisely.

Written for and first published in the Financial Times on 21 July 2017.

My new book is “Fifty Things That Made The Modern Economy” – out last week in the UK and coming in a few days in the US. Grab yourself a copy in the US (slightly different title) or in the UK or through your local bookshop.


Undercover Economist

Think like a supermodel if you want to win from the gig economy

Are we misunderstanding the endgame of the annoyingly named “gig economy”? At the behest of the UK government, Matthew Taylor’s review of modern working practices was published this week. The title could easily have graced a report from the 1930s, and the review is in many ways a conservative document, seeking to be “up to date” while preserving “enduring principles of fairness”.

Mr Taylor, chief executive of the RSA and a former policy adviser to the Blair government, wants to tweak the system. One proposal is to sharpen up the status of people who are neither employees nor freelancers, calling them “dependent contractors” and giving them some employment rights. In the US, economists such as Alan Krueger — formerly the chairman of Barack Obama’s Council of Economic Advisers — proposed similar reforms.

There is nothing wrong with this; incremental reform is often wise. Quaint ideas such as the employer-employee relationship are not yet obsolete. But they might yet become so, at least in some industries. If they do, I am not sure we will be ready.

The obsolescence I have in mind was anticipated by Silicon Valley’s favourite economist, Ronald Coase. Back in 1937, a young Coase wrote “The Nature of the Firm”, calling attention to something strange: while corporations competed within a competitive marketplace, corporations themselves were not markets. They were hierarchies. If you work for a company, you don’t allocate your time to the highest bidder. You do what your boss tells you; she does what her boss tells her. A few companies dabble with internal marketplaces, but mostly they are islands of command-and-control surrounded by a sea of market transactions.

Coase pointed out that the border between hierarchy and market is a choice. Corporations could extend their hierarchy by merging with a supplier. Or they could rely more on markets, spinning off subsidiaries or outsourcing functions from cleaning and catering to IT and human resources. Different companies make different choices and the ones that choose efficiently will survive.

So what is the efficient choice? That depends on the nature of the job to be done. A carmaker may well want to have the engine manufacturer in-house, but will happily buy bulbs for the headlights from the cheapest bidder.

But the choice between hierarchy and market also depends on the technology deployed to co-ordinate activity. Different technologies favour different ways of doing things: the bar code made life easier for big-box retailers, while eBay favoured the little guy, connecting buyers and sellers of niche products.

Smartphones have allowed companies such as Uber and Deliveroo to take critical middle-management functions — motivating staff, evaluating and rewarding performance, scheduling and co-ordination — and replace them with an algorithm. But gig workers could install their own software, telling it where they like to work, what they like to do, when they’re available, unavailable, or open to persuasion. My app — call it GigBot — could talk to the Lyft app and the TaskRabbit app and the Deliveroo app, and interrupt me only when an offer deserves attention.

Not every job can be broken down into microtasks that can be rented out by the minute, but we might be surprised at how many can. Remember that old line from supermodel Linda Evangelista, “We don’t wake up for less than $10,000 a day”? GigBot will talk to your alarm clock; $10 or $10,000, just name the price that would tempt you from your lie-in.
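To make the idea concrete, here is a minimal sketch of the reservation-price rule such an agent might apply. GigBot is the column’s hypothetical, and every class, platform name and threshold below is invented for illustration; no real gig platform exposes such an interface.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    platform: str   # e.g. "Lyft", "TaskRabbit", "Deliveroo" (illustrative)
    task: str       # the kind of work on offer
    fee: float      # payment offered, in pounds
    hour: int       # proposed start time, 0-23

class GigBot:
    """Hypothetical agent: interrupt the worker only for offers
    that clear their stated reservation price for that task."""

    def __init__(self, reservation_prices, asleep_until=8, wake_premium=3.0):
        self.reservation_prices = reservation_prices  # task -> minimum acceptable fee
        self.asleep_until = asleep_until              # hour before which we'd rather lie in
        self.wake_premium = wake_premium              # multiplier for being woken early

    def worth_interrupting(self, offer: Offer) -> bool:
        floor = self.reservation_prices.get(offer.task)
        if floor is None:
            return False                              # work this user never does
        if offer.hour < self.asleep_until:
            floor *= self.wake_premium                # name the price that beats a lie-in
        return offer.fee >= floor

bot = GigBot({"delivery": 12.0, "driving": 18.0})
print(bot.worth_interrupting(Offer("Deliveroo", "delivery", 15.0, hour=13)))  # True
print(bot.worth_interrupting(Offer("Deliveroo", "delivery", 15.0, hour=6)))   # False: 6am needs >= 36
```

The economics sits in one line: an offer is worth waking for only if the fee clears a reservation price that varies with the worker’s circumstances.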

It is easy to imagine a dystopian scenario in which a few companies hook us in like slot-machine addicts, grind us in circles like cogs, and pimp us around for pennies. But it is not too hard to imagine a world in which skilled workers wrest back control using open-source software agents, join electronic guilds or unions and enjoy a serious income alongside unprecedented autonomy.

Nothing empowers a worker like the ability to walk out and take a better offer; in principle the gig economy offers exactly that. Indeed both scenarios may come true simultaneously, with one type of gig for the lucky ones, and another for ordinary folk.

If we are to take the best advantage of a true gig economy, we need to prepare for more radical change. Governments have been content to use corporations as delivery mechanisms for benefits that include pensions, parental leave, sick leave, holidays and sometimes healthcare — not to mention the minimum wage. This isn’t unreasonable; even a well-paid freelancer may be unable to buy decent private insurance or healthcare. Many of us struggle to save for a pension. But if freelancers really do start to dominate economic activity — if — the idea of providing benefits mostly through employers will break down.

We will need governments to provide essential benefits, perhaps minimalist, perhaps generous, to all citizens. Above that safety net, we need portable benefits — mentioned warmly but briefly by Mr Taylor — so that even a 10-minute gig helps to fill a pension pot or earn time towards a holiday. Traditional corporate jobs have been socially useful, but if you push any model too far from reality, it will snap.

Written for and first published in the Financial Times on 14 July 2017.

My new book is “Fifty Things That Made The Modern Economy” – out now in the UK and coming very soon in the US. Grab yourself a copy in the US (slightly different title) or in the UK or through your local bookshop.


Undercover Economist

Fantasy gaming can be better than reality

“The only thing that can make me happy is computer games!” So declares my five-year-old son, tears streaming down his cheeks, with a vehement desperation that merely encourages me to ration this potent experience. I don’t recall my own parents restricting the time I spent gaming, but then I didn’t have access to computers until I was nearly 10, when an Oric-1 arrived with eight glorious colours and a magnificent 48K of memory.

Maybe I didn’t spend much time on the computer anyway — my ability to play games would have been limited by the fact that my mother, something of a hacker, would be hogging it. Or perhaps I found the games less addictive than my son does, although I seem to remember many hours as a teenager playing the magnificent interstellar odyssey Elite. It cannot be denied that games are getting better: varied, beautiful, narratively engaging, and often social, too, with millions logging into online worlds, forging alliances and waging battles, all in character and alongside friends. Some games are dreadful clickfests, little better than slot machines. But that should not discredit all games any more than Fifty Shades of Grey discredits Finnegans Wake.

I stopped playing computer games in 1999, at the age of 25. Increasingly realistic games on ever-larger screens gave me motion sickness — few disincentives are quite as visceral as nausea. Then, in 2004, I met Edward Castronova. He is an economist who had enjoyed some media attention after calculating the gross domestic product per capita of Norrath, the entirely imaginary setting of an online game called EverQuest. Mr Castronova pointed out that players who slogged away at the mundane parts of the game could earn real money. In-game achievements — stronger characters, magical abilities — could be sold to other players who wanted a short-cut. The wage was about $3.50 an hour, not a lot for a New Yorker, but serious money if you lived in Dar es Salaam.

The surprising idea that people could earn a living slaying orcs attracted some attention, but Mr Castronova pointed out something else: some gamers were spending serious amounts of time online. They played not for money but for fun, devoting many hours a week to engaging, challenging and persistent online roles. And if games are really such fun, who needs reality? Why work in Starbucks when you could command a starship? Mr Castronova published a book, Exodus to the Virtual World, in 2007, describing people turning their back on the physical world and spending more time in virtual ones. It seemed highly speculative at the time. But data from the US labour market increasingly suggest that Mr Castronova was on to something.

Four economists — Mark Aguiar, Mark Bils, Kerwin Kofi Charles and Erik Hurst — have published their latest research paper studying the impact of awesome computer games on the US job market. The basic observation is this: the unemployment rate in America is at its lowest level for 16 years; if it drops a little further it will be at its lowest level since 1969. Yet some people — young men in particular — are completely disengaged from the labour market. (They don’t count as unemployed because they’re not looking for work.) In 2016 — excluding full-time students — 15 per cent of men in their twenties did not work a single week in the entire year. In the year 2000, the last time unemployment was this low, the comparable figure was 8 per cent.

So at a time when most of the people looking for jobs find them, why are so many young men not even looking? One explanation is that they think there is no hope, but another explanation is that they would rather be playing a game. Food is cheap; living with your parents is cheap; computer games are cheap. Why work? Distinguishing the two hypotheses is not easy, but Prof Aguiar and his colleagues make a good case that the pull of video games is an important part of the story. Women and older men — who spend less time playing games — are more engaged with the labour market.

This is an alarming trend: if basement-dwelling videogamers are turning their backs on reality, they are missing a vital opportunity to pick up the skills, experience and contacts they will need if they’re ever to earn a proper living. The long-term prognosis is worrying.

Then again, good games do bring happiness. Joblessness is usually a reliable predictor of misery, yet men under 30 are far less likely to be unhappy than in the early 2000s. The proportion saying they’re “very happy” or “pretty happy” has risen from 81 to 89 per cent, almost halving the rate of unhappiness. The reverse is true for men over 30.

Without exception, the longest and firmest friendships in my life are with other gamers. I favour face-to-face games with dice and pencils, but those games still involve me and my friends stepping into a fantasy world. I am worried that so many young men are disconnected from the job market. But perhaps we should stop blaming the games, and see if we can get reality to pull its socks up.

Written for and first published in the Financial Times on 7 July 2017.

My new book is “Fifty Things That Made The Modern Economy” – out now in the UK and coming very soon in the US. Grab yourself a copy in the US (slightly different title) or in the UK or through your local bookshop.


Undercover Economist

We are still waiting for the robot revolution

The cash machine turned 50 this week — old enough, I think, to teach us a few lessons about the dawning of a new machine age. It seems a good advertisement for practical innovation that makes life a little easier. But with its very name a promise to replace a human being, the “automated teller machine” seems a harbinger of mass technological unemployment.

The story of the robot takeover has become familiar: robots came first for the bank tellers, and I did not speak out, for I was not a bank teller. Then overnight the robots were driving trucks, performing legal research and interpreting mammographic X-rays. The only jobs remaining were those writing books with titles such as Race Against The Machine and Rise of the Robots.

The difficulty with these visions of technological joblessness is that there are plenty of jobs around at the moment. In the UK, the employment rate is nearly 75 per cent; it hasn’t been higher since records began in 1971. The job situation is not quite so rosy in the US, where participation in the labour market is down since the Clinton and Bush years. Still, the unemployment rate is at a 16-year low. A job apocalypse this is not.

The ATM may help to explain this apparent puzzle. James Bessen of Boston University points out that the ATM did not, in fact, replace bank tellers — there are more bank teller jobs in the US now than when the ATM was introduced.

This should not entirely be a surprise: the original story of the cash machine is that its inventor John Shepherd-Barron had the door of his local bank slammed in his face on a Saturday lunchtime, and was frustrated that there was no way to get his money until Monday morning. Mr Shepherd-Barron didn’t invent a replacement for human tellers so much as a way to get cash at any time of the day or night. Banks opened more branches and employed humans to cross-sell loans, mortgages and credit cards instead. The automated teller worked alongside more human tellers than ever.

The ATM is no outlier here. Mr Bessen found that in the 19th century, 98 per cent of the labour required to weave cloth was automated — yet employment in the weaving industry increased as our demand for clothes more than offset the labour-saving automation. The same process seems to be at work today in the legal sector: artificial intelligence is increasingly being deployed to do tasks once done by legal clerks, but employment of those clerks is up, not down. Mr Bessen has found only one clear example of automation entirely eliminating a human job: elevator operators. There are other jobs that haven’t been taken by robots, but nevertheless have disappeared — Hansom cab driving, or operating a telegraph. Banks are not being replaced by cash dispensers so much as bypassed entirely by contactless payments and online accounts.
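The arithmetic behind the weaving result is worth a line. Employment is labour per yard times yards demanded, so automation cuts jobs only if demand fails to keep pace; the numbers below are stylised, mine rather than Mr Bessen’s.

\[
L = \ell \times Q, \qquad
\frac{L_{\text{after}}}{L_{\text{before}}} = \underbrace{0.02}_{\text{98\% less labour per yard}} \times \underbrace{60}_{\text{60-fold rise in demand}} = 1.2,
\]

an industry that automates 98 per cent of the work and still ends up with 20 per cent more weavers.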

Overall, though, machines have been tools that have enhanced human productivity. They automate some routine tasks. This expands output and it also boosts demand for humans to perform complementary, non-routine tasks. This leads to better pay, more interesting work and as many jobs as ever overall.

Or at least — it should. Our chief economic problem right now isn’t that the robots are taking our jobs, it’s that the robots are slacking off. We suffer from slow productivity growth; the symptoms are not lay-offs but slow-growing economies and stagnant wages. In advanced economies, total factor productivity growth — a measure of how efficiently labour and capital are being used to produce goods and services — was around 2 per cent a year in the 1960s, when the ATM was introduced. Since then, it has averaged closer to 1 per cent a year; since the financial crisis it has been closer to zero. Labour productivity, too, has been low.
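For readers who want the bookkeeping: total factor productivity is not observed directly but measured as a residual. The textbook growth-accounting identity (the standard Solow residual, with purely illustrative numbers) is

\[
g_A \;=\; g_Y \;-\; \alpha\, g_K \;-\; (1-\alpha)\, g_L,
\]

where \(g_Y\), \(g_K\) and \(g_L\) are the growth rates of output, capital and labour, and \(\alpha\) is capital’s share of income, roughly one-third in advanced economies. If output grows at 2 per cent a year while capital grows at 2 per cent and labour at 1 per cent, then \(g_A = 2 - \tfrac{1}{3}\times 2 - \tfrac{2}{3}\times 1 \approx 0.7\) per cent: the part of growth that better technology and organisation must explain.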

Plenty of jobs, but lousy productivity: imagine an economy that was the exact opposite of one where the robots took over, and it would look very much like ours. Why? Tempting as it may be to blame the banks, a recent working paper by John Fernald, Robert Hall and others argues that productivity growth stalled before the financial crisis, not afterwards: the promised benefits of the IT revolution petered out by around 2006. Perhaps the technology just isn’t good enough; perhaps we haven’t figured out how to use it. In any case, results have been disappointing.

There is always room for the view that the productivity boom is imminent. A new policy paper from business economists Michael Mandel and Bret Swanson argues that we are starting to find digitally driven efficiencies in physical industries such as energy, construction, transport, and retail. If this happens, Silicon Valley-style innovation will ripple through the physical economy. If.

It is in the nature of exponential growth that the near future can easily outweigh the recent past. But we are still waiting. For now, the machine has stalled and the error message reads: “Sorry: this robot takeover could not be completed at present.”

Written for and first published in the Financial Times on 30 June 2017.

My new book is “Fifty Things That Made The Modern Economy” – out now in the UK and coming soon in the US. Grab yourself a copy in the US (slightly different title) or in the UK or through your local bookshop.


Undercover Economist

Why we need to build more homes

If we were living in a movie, the ash-blackened cage looming over West London would be a metaphor for something. Instead, the Grenfell Tower disaster — so catastrophic that we are told we may never know how many people died — is a distinctly un-metaphorical national disgrace.

The least we can do now is learn the lessons of the fire, as we did not after the Lakanal House fire of 2009, which killed six people. Some of those lessons should emerge from the public inquiry. Grenfell Tower was built in 1974 but had recently been renovated. In a sane world such a renovation should have improved safety standards. Apparently we do not live in such a world.

But beyond the life-and-death details of fire safety rules and enforcement, a bigger picture has long been apparent: the British housing system, particularly in London, is in a shocking state. Decades of policy failure have left us with unaffordable housing. That is why the residents of unsafe housing feel trapped and voiceless, unable to afford to move, and powerless to demand change.

A better politician than Theresa May might have used this tragedy to justify housing reform, announcing a bold plan to build a million new quality homes before 2020. That target is less ambitious than it sounds, merely making up many years of undersupply. Having many more decent homes on the market would lower rents and make other housing policy goals — choice, fairness, quality, safety — easier to achieve. But perhaps that is expecting too much of Mrs May. Stronger prime ministers in luckier circumstances have failed to make headway on housing.

What of Jeremy Corbyn, the man who is keen to remind us that he’d be quite willing to run the country? The Labour leader certainly made a better job of appearing prime ministerial after the fire, showing concern where Mrs May seemed distant. But this — the performance part of the job — is not what matters most. Boris Johnson could also have played the necessary role to perfection, but that would hardly qualify him to lead the country.

What we really need from our politicians is a willingness to advocate and execute wise policies. Mr Corbyn, instead, focused on grabbing and redistributing property. “There are a large number of deliberately kept vacant flats and properties all over London,” he told ITV journalist Robert Peston, arguing that these properties should be used to house the victims of the fire. When pressed for detail he added, “Occupy, compulsory purchase it, requisition it, there’s a lot of things you can do.”

This is a telling statement — even leaving aside the use of the word “occupy”, which seems to wink at the idea of breaking into other people’s homes. Since Mr Corbyn’s remarks are often uncharitably interpreted by the British press, let us assume that was not his goal.

Still, there is little ambiguity in the word “requisition”. This reflects a consistent theme in Mr Corbyn’s thinking: that the British public can best be served by forcing others — including international corporations and foreign property investors — to bear most of the cost.

It’s not hard to see why this seems appealing. It taps into the same ideas exploited by Donald Trump and the Leave campaign: that there’s money being left on the table, that foreigners are rigging the game against us. Time to give the “Gnomes of Zurich” a poke in the eye.

Such xenophobia-tinged ideas have taken nations to very dark places in the past, but today it is more likely that they will simply lead to inept policies and bad outcomes. Seizing foreign-owned property in London — even proposing seizure — will reduce the tax base and do yet more harm to the reputation of the UK as a grown-up country. (Perhaps that ship has sailed.)

Rich and apparently wasteful foreigners are an easy scapegoat for the problem of high prices and high rents in London. But they have become a target based on anecdotes. What limited data we have — it is admittedly patchy — suggests that the idea of widespread “buy to leave” is a myth. Wealthy foreign investors did not become so by squandering rental income.

We should be taxing all property, with expensive properties taxed at higher rates, occupied or not, foreign-owned or not. There is no need to be vindictive. And we should be building more. There is still scope in London — to say nothing of the leafy south-east — to build safe, pleasant apartments in high-rise and medium-rise blocks. There are costs of doing so, but the costs of not doing so are far greater.

As for the survivors of the Grenfell Tower fire, plans have been announced to offer them homes in a new local development. Good. Surely the British taxpayer is willing and able to do what is right, and pay from our own pockets to rehouse them swiftly and well. There is no need to requisition anything.

This disaster should provoke a rethink not just of rules on sprinkler systems, but the entire rotten edifice of British housing policy. I am not optimistic; I fear we’re getting the politics — and the politicians — we deserve. But the residents of Grenfell Tower deserved so much better.

Written for and first published in the Financial Times on 23 June 2017.

My new book is “Fifty Things That Made The Modern Economy“.


Undercover Economist

Wishful thinking in politics is a recipe for foolishness

Shortly after the hilarious UK election, I had the opportunity to ask a Conservative politician (retired, centrist) what he made of the result. “Good news for a soft Brexit,” was one of his conclusions. That was my impression, too. Since both of us were Remainers, it was a comforting thought.

And then I reflected for a moment. Back when Theresa May was expected to secure a stonking majority, wasn’t that also supposed to be good news for a soft Brexit? I reflected on how I’d felt when Mrs May called the election. My dismay at the political choices on offer was offset a little by the hope that, with a clear victory, Mrs May would be unafraid of the extremist Eurosceptics.

I was forced to admit that I could take almost any scenario and produce a soothing interpretation of it. Wishful thinking is a powerful thing, and common. Recently the data journalist Nate Silver wrote that “Donald Trump is making Europe liberal again” and explored the possibility that Mr Trump’s toxic reputation in Europe was helping to suppress the far right, from Austria’s Freedom party and France’s Front National to Britain’s UK Independence party. Mr Silver made a good case, but I couldn’t help feeling that it was widely shared among my European liberal friends because it’s exactly what they wanted to believe.

A more scientific illustration of the same tendency in US politics comes courtesy of the psychologists Ben Tappin, Leslie van der Leer and Ryan McKay. The researchers wanted to tease apart two psychological tendencies that are often conflated: confirmation bias and desirability bias. Desirability bias is wishful thinking: we see what we want to see. Confirmation bias is our tendency to see what we expect to see (now that I’m aware of confirmation bias, I see it everywhere).

A month before the US presidential election, Mr Tappin and his colleagues recruited US-based experimental participants in four categories: Hillary Clinton supporters who expected her to win, Trump supporters who expected him to win, and supporters of each candidate who expected to be disappointed. Then the researchers showed their participants fresh opinion poll data that suggested either that Mrs Clinton was likely to win, or that Mr Trump was.

The research suggested that people seized not upon what they expected to see, but upon what they hoped to see. They tended to focus on encouraging evidence, whether or not it was surprising. They ignored unwelcome evidence. Desirability bias was stronger than confirmation bias.

There is a place in the world for wishful thinking — for example, in business. Without the sunny overconfidence of entrepreneurs few good ideas would ever get off the ground, because the chances of failure are so high.

Even the successes tend to generate more benefits for customers than shareholders. One study, by the great economist Bill Nordhaus, concluded that US companies retained less than 4 per cent of the social value of their innovation. The other 96 per cent went to customers. Given their inability to profit even when things go well, rational entrepreneurs would never quit their day jobs. Business innovation is built on the back of giddy optimism.

Wishful thinking even has its role in politics. The quest for marriage equality, for civil rights, for votes for women and for the abolition of slavery were all once distant dreams. Emmanuel Macron has surfed to success on a wave of optimism that an untested centrist can fix what ails France; I hope he succeeds, but hope alone will not be enough.

In many cases wishful thinking in politics is a recipe for foolishness. Much of Mr Trump’s appeal lay in the idea that the people who said policy was complicated were lying. Solutions were simple if you were strong and smart. It turns out that wishful thinking does not solve problems, but creates them.

Wishful thinking infects the political left, too. Many diehard Democrats seem bewildered that Republicans have had five full months to impeach their own man and still haven’t done it. In the UK, Jeremy Corbyn and his fan base have been enthused by his better than expected performance but seem not to have noticed that he lost the election.

Last year’s Brexit campaign was based on a simple piece of wishful thinking: Boris Johnson’s idea that the UK could have its cake and eat it. How, exactly, was never quite clear, but desirability bias gave a foolish idea more credibility than it deserved. Voters hoped that Mr Johnson was right, and so they began to believe him: it is so much easier to believe what we already wish is true.

That glib optimism stood in stark contrast to what experienced technocrats were saying behind the scenes. They warned that the UK simply didn’t have the time, the people or the expertise we needed to handle the process of leaving and then forging new trade agreements.

Mr Johnson told us that things were easy; the mandarins cautioned that they were difficult. I have my suspicions as to who will be proved correct, but we already know which proposition resonated with the voters. We need to be careful what we wish for.

Written for and first published in the Financial Times on 16 June 2017.

My new book is “Fifty Things That Made The Modern Economy” – out last week in the UK and coming soon in the US. Grab yourself a copy in the US (slightly different title) or in the UK or through your local bookshop.


Undercover Economist

If your country lets you down, make a new one

Forget Brexit: I’m declaring independence from the rest of the UK. I’ll always have a deep and special connection with the land where I was born and grew up. Nevertheless, I’m taking back control, just as soon as I can figure out what that means.

I also need to choose a name for the divorce process. “Texit” sounds like a fajita laced with pesticide. “Hexit” is better, with something of the evil eye about it. Then there’s just the simple formality of the exit negotiations — if I can find someone in the British government with enough authority to negotiate.

To be clear, this isn’t because of petulance about the election, which did at least give us an amusing twist at the end. Well, it is — but not the result so much as the campaign, and what that campaign revealed. Both Labour and the Conservatives have traditions that I find admirable, but both seem to have concluded that none of those admirable traditions would win votes. The spin-doctors had decided that they could either appeal to me, or to the typical British voter, but not both.

But as a wise man once sang, it seems like I’m not alone in being alone. Many people feel like strangers in their own country. Many Scots would rather not be British citizens; there are secessionists in California; and of course, the now-familiar threats of American liberals that they will move to Canada. (In a poetic display of trolling, Emmanuel Macron responded to Donald Trump’s rejection of the Paris agreement by inviting Americans to come to France to work on climate change.)

Such woes are felt across the political spectrum. Many Leave voters felt that Britain was being dragged away from them by a strange coalition of Brussels bureaucrats, the Westminster establishment and a gang of Lithuanian fruit-pickers. Any referendum result that would have pleased me would have left them feeling more alienated than ever. No result could have made both sides happy.

Still, that’s democracy for you: everyone has a vote, and if most people don’t see things your way, you just have to take it on the chin. Or do you? That’s not how things work in other parts of life. If you don’t like the coffee at Starbucks, go elsewhere or brew a cup at home. If your boyfriend is a slob, dump him and find someone better.

So if you don’t like the turn your country has taken, why not join another country — or even start up your own? I live in Oxford, surrounded by people who ride bicycles, marry foreigners and voted Remain. I don’t agree with my neighbours about everything but we do seem to be on the same wavelength. An Ivory Tower city state? Or maybe we could carve out a larger independent nation: London, Oxford, Cambridge, Brighton. Maybe Bristol, too. I love Bristol. And since the rest of the UK never seemed to like London much, maybe it’s best all round if we separate.

Of course, I’m not very serious about this idea. But then Brexit was never a very serious idea either. It’s going to happen, nevertheless.

A more flexible breakaway nation would float. Libertarian Patri Friedman — grandson of economist Milton and protégé of PayPal co-founder Peter Thiel — champions the idea of “seasteading”, setting up ocean-going colonies, powered by waves, wind and the fresh air of freedom. If you don’t like the way your particular floating city is going, unplug and find another one. It’s an intriguing alternative to traditional democracy. If you can’t live with the decisions of your fellow citizens, find some new fellow citizens.

Cory Doctorow’s new dystopian novel Walkaway explores what happens when some people decide they’d rather fend for themselves than accept life in a default corporate world. It turns out that scavenger drones and 3D printers can get you a wonderful quality of life — until someone powerful doesn’t like what you’re doing and calls an air strike.

This is awkward. Being small makes you vulnerable. Ask Qatar. Oxford doesn’t have much of an air force; I don’t think we even have any tanks. I’m not sure what we’d do if Milton Keynes invaded.

Still, Iceland, Monaco and Singapore all seem to be surviving. Globalisation and peace do make it easier to be small. If the typical nation has a stiff border tariff and 10 infantry divisions, it pays to be one of the big boys. Yet as trade barriers fall, micro-nations become viable. Two decades ago the economists Alberto Alesina, Enrico Spolaore and Romain Wacziarg pointed out that there was a clear negative correlation between the average tariff rate and the number of independent countries in the world. Economic integration allows political disintegration.

There’s a cruel irony in all this. Small nations rely on being able to plug into a liberal global economy shielded from protectionists, pirates and week-long waits to clear customs. The very forces that make people like me worry for our own nations — nationalism, illiberalism and xenophobia — also make the world a more difficult place for breakaway states.

On reflection, perhaps it would be better to stay and try to sort things out. But the predicament is harder to swallow than a pesticide-laced fajita.

Written for and first published in the Financial Times on 9 June 2017.

My new book is “Fifty Things That Made The Modern Economy” – out now in the UK! If you want to get ahead of the curve you can pre-order in the US (slightly different title) or through your local bookshop.
