Tim Harford The Undercover Economist

Undercover Economist

My weekly column in the Financial Times on Saturdays, explaining the economic ideas around us every day. This column was inspired by my book and began in 2005.

Undercover Economist

Fatal Attraction of Fake Facts Sours Political Debate

He did it again: Boris Johnson, UK foreign secretary, exhumed the old referendum-campaign lie that leaving the EU would free up £350m a week for the National Health Service. I think we can skip the well-worn details, because while the claim is misleading, its main purpose is not to mislead but to distract. The growing popularity of this tactic should alarm anyone who thinks that the truth still matters.

You don’t need to take my word for it that distraction is the goal. A few years ago, a cynical commentator described the “dead cat” strategy, to be deployed when losing an argument at a dinner party: throw a dead cat on the table. The awkward argument will instantly cease, and everyone will start losing their minds about the cat. The cynic’s name was Boris Johnson.

The tactic worked perfectly in the Brexit referendum campaign. Instead of a discussion of the merits and disadvantages of EU membership, we had a frenzied dead-cat debate over the true scale of EU membership fees. Without the steady repetition of a demonstrably false claim, the debate would have run out of oxygen and we might have enjoyed a discussion of the issues instead.

My point is not to refight the referendum campaign. (Mr Johnson would like to, which itself is telling.) There’s more at stake here than Brexit: bold lies have become the dead cat of modern politics on both sides of the Atlantic. Too many politicians have discovered the attractions of the flamboyant falsehood — and why not? The most prominent of them sits in the White House. Dramatic lies do not always persuade, but they do tend to change the subject — and that is often enough.

It is hard to overstate how corrosive this development is. Reasoned conversation becomes impossible; the debaters hardly have time to clear their throats before a fly-blown moggie hits the table with a rancid thud.

Nor is it easy to neutralise a big, politicised lie. Trustworthy nerds can refute it, of course: the fact-checkers, the independent think-tanks, or statutory bodies such as the UK Statistics Authority. But a politician who is unafraid to lie is also unafraid to smear these organisations with claims of bias or corruption — and then one problem has become two. The Statistics Authority and other watchdogs need to guard jealously their reputation for truthfulness; the politicians they contradict often have no such reputation to worry about.

Researchers have been studying the problem for years, after noting how easily charlatans could debase the discussion of smoking, vaccination and climate change. A good starting point is The Debunking Handbook by John Cook and Stephan Lewandowsky, which summarises a dispiriting set of discoveries.

One problem that fact-checkers face is the “familiarity effect”: the trouble with the endless arguments over the £350m-a-week lie (or Barack Obama’s birthplace, or the number of New Jersey residents who celebrated the destruction of the World Trade Center) is that the very process of rebutting the falsehood ensures that it is repeated over and over again. Even someone who accepts that the lie is a lie would find it much easier to remember than the truth.

A second obstacle is the “backfire effect”. My son is due to get a flu vaccine this week, and some parents at his school are concerned that the flu vaccine may cause flu. It doesn’t. But in explaining that, I risk triggering other concerns: who can trust Big Pharma these days? Shouldn’t kids be a bit older before being exposed to these strange chemicals? Some (not all) studies suggest that the process of refuting the narrow concern can actually harden the broader worldview behind it.

Dan Kahan, professor of law and psychology at Yale, points out that issues such as vaccination or climate change — or for that matter, the independence of the UK Statistics Authority — do not become politicised by accident. They are dragged into the realm of polarised politics because it suits some political entrepreneur to do so. For a fleeting partisan advantage, Donald Trump has falsely claimed that vaccines cause autism. Children will die as a result. And once the intellectual environment has become polluted and polarised in this way, it’s extraordinarily difficult to draw the poison out again.

This is a damaging game indeed. All of us tend to think tribally about politics: we absorb the opinions of those around us. But tribal thinking pushes us to be not just a Republican, but a Republican and a vaccine sceptic. One cannot be just for Brexit; one must be for Brexit and against the UK Statistics Authority. Of course it is possible to resist such all-encompassing polarisation, and many people do. But the pull of tribal thinking on all of us is strong.

There are defences against the dead cat strategy. With skill, a fact-check may debunk a false claim without accidentally reinforcing it. But the strongest defence is an electorate that cares, that has more curiosity about the way the world really works than about cartoonish populists. If we let politicians drag facts into their swamp, we are letting them tug at democracy’s foundations.
Written for and first published in the Financial Times on 23 September 2017.


Undercover Economist

Echoes of a bygone age show Britain losing its sense of direction

“It’s that 1970s vibe again,” a senior colleague tells me. This being the Financial Times I presume he is picking up echoes of a bygone economic and political milieu, rather than gleefully anticipating the re-emergence of flares or X-rated movie theatres. Either way, it is hard to venture a firm opinion on the matter: as late as the 1990s, I was still at school. My recollection of James Callaghan is pretty hazy, and I know Edward Heath only through a charming book of Christmas carols that he compiled after leaving office. (Millennials and foreigners confused by the direction this column is taking may be interested to know that both men were UK prime ministers.)

There certainly are parallels: now, as then, politics is dominated by the two big parties; the nation is led by a weak minority government; and Jeremy Corbyn’s views seem politically relevant. There is even an economic echo: the unemployment rate, at 4.3 per cent, is back down to the levels last seen in 1975, when I was in nappies.

But in other ways it feels absurd to compare today’s economy with that of 40 years ago. The uptick in inflation that has attracted some attention this week — to 2.9 per cent on the consumer price index measure — is a molehill compared with the Himalayan peaks of yesteryear, with retail price index inflation rarely slipping below 10 per cent per year and sometimes exceeding 25 per cent. With inflation at 25 per cent, prices double every three years; with inflation at 2.9 per cent the doubling would take a generation. Bank of England base rates then shuttled breathlessly between 5 and 15 per cent — whereas they sit today, as they have done since 2009, at record lows. The price of oil remains of interest not because it has spiked but because it has halved.
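
As a quick sanity check on those doubling times (a back-of-the-envelope sketch of mine, not something from the original column), the number of years it takes prices to double at a constant annual inflation rate r is ln(2)/ln(1+r):

    # Back-of-the-envelope check (not from the column): years for prices
    # to double at a constant annual inflation rate.
    import math

    def doubling_time(annual_rate: float) -> float:
        """Years for the price level to double at a constant annual rate."""
        return math.log(2) / math.log(1 + annual_rate)

    print(round(doubling_time(0.25), 1))    # ~3.1 years at 25% inflation
    print(round(doubling_time(0.029), 1))   # ~24.2 years at 2.9% (roughly a generation)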

And rather than joining the European Economic Community in a desperate attempt to save the British economy, we are now leaving in a desperate attempt to . . . well, I am still trying to figure that one out.

But those are the dry numbers. What of the zeitgeist, the more ineffable spirit of the times? That is a curious question. Dominic Sandbrook, a leading British historian of the 1970s, reminds us of the words of Callaghan to his Labour party colleagues in 1974: “Our place in the world is shrinking: our economic comparisons grow worse, long-term political influence depends on economic strength — and that is running out . . . If I were a young man, I should emigrate.”

Callaghan’s mournful diagnosis cuts deep today. Much of the country knows how he felt. But the curious thing is that half of them believe that the UK was doing just fine until we voted for a once-in-a-generation act of self-harm last June. The other half were as gloomy as Callaghan until the Brexit vote gave them hope. Say what you like about the 1970s, at least their grimness is a fact that we can agree on.

Then, national humiliation was inflicted by the need to approach the International Monetary Fund for help — and everyone could agree that this was not an encouraging development. Now, national humiliation is in the eye of the beholder and we have either broken free of decades of subjugation to Brussels — or voted to make ourselves a laughing stock. I hope the rest of the world is enjoying the joke, at least. Our foreign secretary is Boris Johnson, our prime minister is “strong and stable”, our foreign policy is built on the steadfastness of President Donald Trump, and our back-up plans include Mr Corbyn and the Conservative member of parliament Jacob Rees-Mogg.

Economically, our 2017-era service industries and just-in-time supply chains are highly unlikely to survive a hard Brexit unscathed, despite the gung-ho cheerleading of a few economists who seem to think nothing much has changed in international economics since David Ricardo outlined the principle of comparative advantage in 1817.

Jill Lepore, a Harvard history professor, commented not long ago that she was wary of glib historical comparisons: “Trump is like Andrew Jackson”; “Cryptocurrencies are like the tulip bubble”. Rather than squashing together the past and present like an accordion, she advocates expanding the instrument: “stretch it open as far as you can, so you can see the distance”.

So if we stretch the accordion out, what do we see? A country that becomes more open, liberal, tolerant, wealthy and confident but also more economically unequal. The rise in inequality largely took place in the 1980s, but only became politically salient after the banking crisis of 2007. But also, perhaps, a country that now, as then, has lost a sense of direction. Whatever you think of the journey, we travelled a long way under Margaret Thatcher and Tony Blair. But we have been becalmed now for a decade. Where exactly are we going? Ponder again this week’s unemployment and inflation numbers, which reinforce the picture of the UK economy that has become familiar: plenty of jobs, but not a lot of money.
The nation, like its government, is working flat out and going nowhere.

Written for and first published in the Financial Times on 15 September 2017.


Undercover Economist

True diversity means looking for the knife in a drawer of spoons

Forget the new year’s resolution: September, not January, is the time for new starts. College freshers are preparing to leave home; graduates are ironing shirts and blouses and dressing up for their first day in the office. Recruiters and admissions tutors are hoping they made the right choices.

So how do we select the best people for a course or a job? It seems like a sensible question, yet it contains a trap. In selecting the best person we might set a test — in a restaurant kitchen we might ask them to whip up some meals; in a software company we might set some coding problems. And then the trap is sprung.

By setting the same task for every applicant we recruit people who are carbon copies of each other. They will have the same skills and think in the same way. Allowing recruiters some subjective discretion might loosen this trap a little, but it might equally make it worse: we all tend to see merit in applicants who look, speak, and dress much like we do. Opposites do not attract, especially when it comes to corporate hiring.

This is unfair, of course. But it is also — for many but not all tasks — very unwise. Scott Page, a complexity scientist and author of a new book, The Diversity Bonus, invites us to think of people as possessing a kind of cognitive toolbox. The tools might be anything from fluent Mandarin to knowing how to dress a turkey to a command of Excel keyboard shortcuts. If the range of skills — the size of the toolkit — matters, then a diverse team will boast more cognitive skills than a homogeneous team, even one full of top performers.
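
As a toy illustration of Page’s toolbox idea (my own sketch, with invented skills, rather than Page’s formalism), think of a team’s reach as the union of its members’ toolkits; three clones of the strongest candidate cover no more ground than one of them alone:

    # Toy sketch of the "cognitive toolbox" idea (skills invented for
    # illustration): a team's reach is the union of its members' tools,
    # so hiring clones of the best candidate adds nothing new.
    star = {"excel", "mandarin", "statistics", "negotiation"}

    clone_team = [set(star), set(star), set(star)]
    mixed_team = [set(star),
                  {"python", "negotiation", "graphic design"},
                  {"statistics", "plumbing", "public speaking"}]

    def team_toolkit(team):
        """All distinct tools the team can bring to a problem."""
        tools = set()
        for member in team:
            tools |= member
        return tools

    print(len(team_toolkit(clone_team)))  # 4 distinct tools
    print(len(team_toolkit(mixed_team)))  # 8 distinct tools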

The logic of this toolbox model is obvious in certain contexts: any good heist movie will have the bruiser, the charmer, the hacker, the explosives expert, the strategist and the cat burglar. It is clear why such a diverse range of skills is needed and it is obvious that no single test could recruit such a team: it has to be constructed with diversity in mind from the start.

But within a corporate environment the same logic is often forgotten. Everyone has been recruited using the same cookie-cutter template; everyone is proficient at a similar set of tasks, and the range of thinking skills suffers. The IMF is full of economists. Congress is full of lawyers. Football management is full of ex-footballers. If someone does happen to have hidden talents, that will be by accident, not by design.

This homogeneity may not be disastrous. If you want to recruit 10 truck drivers you probably just need the 10 safest, most reliable drivers you can find, because the drivers will be working as individuals, not sparking off each other. But in any situation where a range of problems have to be solved together as a team, diversity can help.

Scott Page’s model of diversity — less a glorious rainbow of superficial attributes, more a toolkit crammed with different skills and perspectives — is a powerful way to appreciate the problem with homogeneity. If recruiters keep looking for the same skillset then an organisation risks, in the words of philosopher-queen Alanis Morissette, having 10,000 spoons when all it needs is a knife.

The standard model of graduate recruitment is almost helpless when faced with this problem. Yes, one can have diversity coaching, checking that certain demographic groups aren’t being discriminated against. But when one candidate at a time is being recruited, it is hard to do much about diversity because diversity is not a property of individuals, it is a property of groups.

So how to solve the problem? Rory Sutherland of Ogilvy, an adman with a keen interest in behavioural science, has suggested recruiting people in groups. If an organisation recruits five people at a time then a couple of vacancies can be reserved for wild-cards — people who don’t fit the mould but have interesting talents. But the definition of “interesting” is itself a tricky one, and not every organisation has the luxury of recruiting in bulk.

What makes matters worse is that we often do not appreciate the value of diversity when we see it. One study of problem-solving (by Katherine Phillips, Katie Liljenquist and Margaret Neale) found that groups containing an outsider were far more proficient at solving murder-mystery puzzles than groups made up entirely of friends.

The striking thing about this study, though, was that the successful groups with an outsider didn’t realise they were being successful, while the cosy underperforming groups of friends were complacent, not realising how badly they were doing. Having the outsider around helps us solve problems, but don’t expect us to be grateful, or even to notice anything other than social discomfort.

The hard truth is that to find new solutions to old problems we must often work with people we don’t really understand. I won’t pretend this is easy, but I cannot wait to pull off a heist with Scott Page, Rory Sutherland and Alanis Morissette.

Written for and first published in the Financial Times on 8 September 2017.


Undercover Economist

When doing nothing is the best option

The leaders of the free world are returning from their holidays. Must they? Surely no good can come of this.

While on vacation Donald Trump managed to eject most of his advisers, threaten a nuclear war with an unabashed North Korea, and display an unnerving willingness to see things from the Nazi point of view. Goodness knows what he will do now he’s fully back on the job. Theresa May returned from her Easter holiday with the splendid idea of calling a snap general election, so I can hardly contain my excitement as I wait for her latest brainstorm.

A flawed leader leaves us grateful for the quiet days, and one of the saving graces of Mr Trump’s administration is that, while he has many bad ideas, he is not always committed to them. Promising to build a wall, rip up Nafta and discriminate against Muslims and transgender people is damaging enough, but at least the follow-through has been patchy. It is a fragile mercy, but Mr Trump seems to prefer complaining about the US government to leading it.

Mrs May’s lack of leadership is more valuable. The British people have dealt the British establishment an unplayable hand: a parliament strung out between several lunatic fringes, and a referendum result that is hard to interpret and even harder to deliver. With the prime minister powerless, her ministers are showing signs of quiet realism. Yes, the country is chugging towards a train-crash Brexit, but at least our politicians are tying fewer hostages to the tracks.

Since I disagree with most of what Mrs May and Mr Trump are trying to do I might be expected to celebrate every day on which they do not do it. But there may be a deeper principle here: in many areas of life we demand action when inaction would serve us better.

The most obvious example is in finance, where too many retail investors trade far too often. One study, by Brad Barber and Terrance Odean, found that the more retail investors traded, the further behind the market they lagged: active traders underperformed by more than 6 percentage points (a third of total returns) while the laziest investors enjoyed the best performance.

This is because dormant investors not only save on trading costs but avoid ill-timed moves. Another study, by Ilia Dichev, noted a distinct tendency for retail investors to pile in when stocks were riding high and to sell out at low points.

It would be nice to recommend laziness as a universal principle, but alas many companies have turned consumer inertia into a revenue stream. Sometimes we must rouse ourselves to cancel a gym membership or find a cheaper insurance policy. Still, there are many situations where doing nothing is a sound tactic.

The same can be said of medicine. It is a little unfair on doctors to point out that when they go on strike, the death rate falls. Nevertheless it is true. It is also true that we often encourage doctors to act when they should not. In the US, doctors tend to be financially rewarded for hyperactivity; everywhere, pressure comes from anxious patients. Wiser doctors resist the temptation to intervene when there is little to be gained from doing so — but it would be better if the temptation was not there.

Some politicians expertly dodge demands for action. Tony Blair was often accused of recycling announcements, turning a single policy into a dozen press releases. But better one decent policy announced a dozen times than a dozen half-baked policies each announced once.

The argument for passivity has been strengthened by the rise of computers, which are now better than us at making all sorts of decisions. We have been resisting this conclusion for 63 years, since the psychologist Paul Meehl published Clinical vs. Statistical Prediction. Meehl later dubbed it “my disturbing little book”: it was an investigation of whether the informal judgments of experts could outperform straightforward statistical predictions on matters such as whether a felon would violate parole.

The experts almost always lost, and the algorithms are a lot cleverer these days than in 1954. It is unnerving how often we are better off without humans in charge. (Cue the old joke about the ideal co-pilot: a dog whose job is to bite the pilot if he touches the controls.)

Perhaps it is no coincidence that many august institutions are designed not to support wise action but to prevent foolishness. Supreme courts, independent central banks and the EU are often at their best when applying the brakes. No wonder so many of the deepest Eurosceptics — from Jeremy Corbyn to Marine Le Pen — are the politicians with the longest list of self-harming policies.

It is human nature to believe something must always be done. Yet we overrate our ability to do it, and it is awfully hard to make the case for passivity. The task is not made easier by campaigners wanting a policy, newspapers wanting a story or the patient wanting a pill. Who dares to offer them nothing?

Written for and first published in the Financial Times on 1 Sep 2017.


Undercover Economist

Trump, Bannon, and the terrible lure of zero-sum thinking

As visual metaphors go, it wasn’t bad: Donald Trump ignoring expert advice and risking calamity by staring up at the sun as the moon’s shadow passed across America. Self-destructiveness has become a habit for this president — and for his advisers. A recent example: former White House chief strategist Steve Bannon called Robert Kuttner, a prominent progressive journalist, to declare that his internal foes in the administration were “wetting themselves”. Shortly after Mr Kuttner wrote about the conversation, Mr Bannon was out.

But the truly harmful temptation here is not eclipse-gazing or indiscreet interviews. It’s another idea that Mr Bannon proposed to Mr Kuttner: that the US was in “an economic war with China”. It seems intuitive; many ordinary Americans feel that they cannot win unless China loses. But the world economy is not like a game of football. Everyone can win, at least in principle. Or everyone can lose. Falling for Mr Bannon’s idea of economic war makes the grimmer outcome far more likely.

Like many dangerous ideas there is some truth in it. The American middle class has been suffering while China has been booming. Branko Milanovic, author of Global Inequality, has produced a striking elephant-shaped graph showing how, since the late 1980s, the rich have been doing well, as have many other groups, including the Asian middle class. But earnings near, though not at, the top of the global income ladder have stagnated. That does not demonstrate harm from China: there is the fall of the Soviet Union to consider, and the struggles of Japan.

However, another study, from David Autor, David Dorn and Gordon Hanson, has shown the lasting impact of the “China shock”. It was no surprise that competition from China put some Americans out of work, but Mr Autor and his colleagues showed that the effects were more locally concentrated, deeper and more enduring than expected. These are important and worrying findings.

But Mr Bannon’s “economic war” is a cure far worse than the disease, and a misdiagnosis of how the world economy works. America has still benefited from trade with Asia, and attacking China — even metaphorically — will do nothing for the American middle class. This is because it is surprisingly hard to find a zero-sum game in the real world.

Most commercial transactions offer benefits to both sides, otherwise why would they take place at all? A trip to a restaurant provides good food and a pleasant evening for me, gainful employment for the waiting staff and the chef, and a lively environment for the neighbourhood. Everyone can gain. There are zero-sum elements to the affair: every penny I hand over is a loss to me and a gain to the restaurant staff or owner. But it is best all round not to obsess too much over such matters.

Zero-sum thinking apparently makes for good politics but bad policy. The UK government has shown an unnerving tendency to treat its EU negotiations as a zero-sum affair, in which the Europeans can “go whistle”, in the words of foreign secretary Boris Johnson.

In the Brexit referendum the Vote Leave campaign turned on a zero-sum claim: we send money to the EU, we should spend it on ourselves. The form of the argument was as powerfully misleading as the details: the focus on membership fees pulled the attention of voters away from the idea of the EU as a club of co-operating nations.

Populists of all stripes focus on zero-sum arguments because they’re easy to explain and emotionally appealing. Any toddler understands the idea of grabbing what someone else has; most adults prefer a situation where everyone gains.

The theory of zero-sum games was developed by the mathematician John von Neumann and the economist Oskar Morgenstern in their famous 1944 book, Theory of Games and Economic Behavior. It works fine for analysing chess and poker, but by itself zero-sum thinking is not much use to an economist who analyses a world full of win-win situations, of gains from trade.
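
To make the distinction concrete (a sketch of mine, not the column’s), a zero-sum game is one in which the payoffs cancel out, whereas an ordinary trade leaves both sides better off:

    # Illustrative sketch (mine, not the column's): in a zero-sum game the
    # players' payoffs sum to zero; in a mutually beneficial trade they do not.
    poker_hand = {"winner": +10, "loser": -10}   # zero-sum: one player's gain is the other's loss
    restaurant = {"diner": +15, "owner": +8}     # win-win: gains from trade on both sides

    def is_zero_sum(payoffs: dict) -> bool:
        return sum(payoffs.values()) == 0

    print(is_zero_sum(poker_hand))   # True
    print(is_zero_sum(restaurant))   # False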

Zero-sum thinking is not even that helpful to a military strategist. Von Neumann was a cold war hawk: “If you say bomb the Soviets tomorrow, I say why not today?”, Life magazine quoted him as saying. “If you say bomb them at five o’clock, I say why not one o’clock?” He was a genius, but it does not take a genius to see the blind spot in his thinking.

The populists may lack the genius but they have the same blind spot. Not coincidentally, the focus on zero-sum rhetoric has drawn attention away from more plausible solutions, many of which are purely domestic: higher quality education, publicly funded infrastructure investment, antitrust action to keep markets functioning competitively, and a more constructive welfare state which supports and encourages work rather than stigmatises and punishes idleness.

The biggest risk is that zero-sum thinking becomes self-fulfilling. Given oxygen by years of slow growth, it will lower growth further. By emphasising conflict it will intensify it. The US is not in an economic war with China, but could start one. That might help Mr Trump. It would not help those he claims to defend.

 
Written for and first published in the Financial Times on 25 August 2017.


Undercover Economist

The psychological biases that leave us unprepared for disaster

This column was written and first published a week before Hurricane Harvey struck the US coast. – TH

Who saw the global financial crisis coming, who didn’t and who deserved blame for the forecasting failure? After a decade of debating these questions, I wonder whether we shouldn’t be asking a different one: even if we had clearly seen the crisis coming, would it have made a difference? Perhaps — but perhaps not.

Consider New Orleans in 2004. With a terrible hurricane bearing down on the city, officials realised that the situation was grim. The levees were in disrepair and a storm surge could flood the low-lying city. A hundred thousand residents would be unable to evacuate without help, and not enough help was available. A plan was hatched to evacuate families to the Superdome, a sports stadium, but managers there warned that it simply could not house so many. If only there had been more warning of disaster.

Some readers will recall, though, that the catastrophe of Hurricane Katrina took place in 2005. The storm of 2004 was Hurricane Ivan, which, after lashing the Caribbean, weakened and turned aside from New Orleans. The city had been given almost a full year’s warning of the gaps in its defences.

The near miss led to much discussion but little action. When Hurricane Katrina hit the city, evacuation proved as impractical and the Superdome as inadequate as had been expected. The levees broke in more than 50 places, and about 1,500 people died. New Orleans was gutted. It was an awful failure but surely not a failure of forecasting.

Robert Meyer and Howard Kunreuther, in The Ostrich Paradox, argue that it is common for institutions and ordinary citizens to make poor decisions in the face of foreseeable natural disasters, sometimes with tragic results. There are many reasons for this, including corruption, perverse incentives or political expediency. But the authors focus on psychological explanations. They identify cognitive rules of thumb that normally work well but serve us poorly in preparing for extreme events.

One such mental shortcut is what the authors term the “amnesia bias”, a tendency to focus on recent experience. We remember more distant catastrophes but we do not feel them viscerally. For example, many people bought flood insurance after watching the tragedy of Hurricane Katrina unfold, but within three years demand for flood insurance had fallen back to pre-Katrina levels.

We cut the same cognitive corners in finance. There are many historical examples of manias and panics but, while most of us know something about the great crash of 1929, or the tulip mania of 1637, those events have no emotional heft. Even the dotcom bubble of 1999-2001, which should at least have reminded everyone that financial markets do not always give sensible price signals, failed to make much impact on how regulators and market participants behaved. Six years was long enough for the lesson to lose its sting.

Another rule of thumb is “optimism bias”. We are often too optimistic, at least about our personal situation, even in the midst of a more generalised pessimism. In 1980, the psychologist Neil Weinstein published a study showing that people did not dwell on risks such as cancer or divorce. Yes, these things happen, Professor Weinstein’s subjects told him: they just won’t happen to me.

The same tendency was on display as Hurricane Sandy closed in on New Jersey in 2012. Robert Meyer found that residents of Atlantic City reckoned that the chance of being hit was more than 80 per cent. That was too gloomy: the National Hurricane Center put it at 32 per cent. Yet few people had plans to evacuate, and even those who had storm shutters often had no intention of installing them.

Surely even an optimist should have taken the precaution of installing the storm shutters? Why buy storm shutters if you do not erect them when a storm is coming? Messrs Meyer and Kunreuther point to “single action bias”: confronted with a worrying situation, taking one or two positive steps often feels enough. If you have already bought extra groceries and refuelled the family car, surely putting up cumbersome storm shutters is unnecessary?

Reading the psychological literature on heuristics and bias sometimes makes one feel too pessimistic. We do not always blunder. Individuals can make smart decisions, whether confronted with a hurricane or a retirement savings account. Financial markets do not often lose their minds. If they did, active investment managers might find it a little easier to outperform the tracker funds. Governments, too, can learn lessons and erect barriers against future trouble.

Still, because things often do work well, we forget. The old hands retire; bad memories lose their jolt; we grow cynical about false alarms. Yesterday’s prudence is today’s health-and-safety-gone-mad. Small wonder that, 10 years on, senior Federal Reserve official Stanley Fischer is having to warn against “extremely dangerous and extremely short-sighted” efforts to dismantle financial regulations. All of us, from time to time, prefer to stick our heads in the sand.

Written for and first published in the Financial Times on 18 August 2017.


Undercover Economist

Challenge is all too easily ducked by today’s knowledge workers

“The man whose whole life is spent in performing a few simple operations, of which the effects, too, are perhaps always the same . . . generally becomes as stupid and ignorant as it is possible for a human creature to become.” This anxiety about the stupefying effects of cog-in-a-machine manufacturing sounds like a line from Karl Marx. It is, in fact, from Adam Smith’s The Wealth of Nations.

As the anniversary of Smith’s death was this week, it seemed like a good moment to reflect on the Scottish philosopher’s warning about the deadening effect of repetitive work. Smith knew that specialisation and the division of labour weren’t about to disappear, so he advocated publicly funded schools as a path to more fulfilling work and leisure.

The emergence of mass production lines made Smith’s words seem prophetic; but many repetitive jobs have since been taken by machines. So, has his warning about stultifying work been rendered obsolete?

The Wealth of Nations is almost a quarter of a millennium old, and we should not expect every word to ring true today. But correctly read, Smith’s anxiety continues to resonate — not just for people with repetitive jobs, but for knowledge workers too.

The modern knowledge worker — a programmer, a lawyer, a newspaper columnist — might appear inoculated from Smith’s concern. We face not monotony but the temptations of endless variety, with the entire internet just a click away. All too easily, though, we can be pulled into the soothing cycle of what slot-machine designers call a “ludic loop”, repeating the same actions again and again. Check email. Check Facebook. Check Instagram. Check Twitter. Check email. Repeat.

Smith would not have dreamt of a smartphone, but what is a ludic loop but “performing a few simple operations, of which the effects, too, are perhaps always the same”?

Smith was concerned about jobs that provided no mental challenge: if problems or surprises never arose, then a worker “has no occasion to exert his understanding, or to exercise his invention, in . . . removing difficulties which never occur.”

For the modern knowledge worker, the problem is not that the work lacks challenge, but that the challenge is easily ducked. This point is powerfully made by computer scientist Cal Newport in his book Deep Work. Work that matters is often difficult. It can be absorbing in mid-flow and satisfying in retrospect, but it is intimidating and headache-inducing and full of false starts.

Email is easier. And reading Newport’s book I realised that email posed a double temptation: not only is it an instant release from a hard task, but it even seems like work. Being an email ninja looks professional and seems professional — but all too often, it is displacement activity for the work that really matters.

A curious echo of Smith’s warning comes in Robert Twigger’s new book Micromastery. Mr Twigger sings the praises of mastering one small skill at a time: not how to cook, but how to make the perfect omelette; not how to build a log cabin, but how to chop a log. There is much to be said for this. We go deep — as Newport demands — but these sharp spikes of skill are satisfying, not too hard to acquire and a path to true expertise.

They also provide variety. “Simply growing up in the premodern period guaranteed a polymathic background,” writes Twigger. To prosper in the premodern era required many different skills; a smart person would be able to see a problem from many angles. A craft-based, practical upbringing means creative thinking comes naturally. “It is only as we surge towards greater specialisation and mechanisation that we begin to talk about creativity and innovation.”

I draw three lessons from all this. The first is that learning matters. Smith wanted schooling for all; Twigger urges us to keep schooling ourselves. Both are right.

The second is that serious work requires real effort, and it can be tempting to duck that effort. Having the freedom to avoid strenuous thinking is a privilege I am glad to have — but I am happier when I don’t abuse that freedom.

The third lesson is that old-fashioned craft offered us something special. To Smith it was the challenge that came from solving unpredictable problems. To Twigger it is the variety of having to do many small things well. To Newport, it is the flow that comes from deep immersion in a skill that requires mastery. Perhaps all three mean the same thing.

Smith realised that the coming industrial age threatened these special joys of work. The post-industrial age threatens them too, in a rather different way. Fortunately, we have choices.

“The understandings of the greater part of men are necessarily formed by their ordinary employments,” wrote Smith. So whether at work or at play, let us take care that we employ ourselves wisely.

 

Written for and first published in the Financial Times on 21 July 2017.


Undercover Economist

Think like a supermodel if you want to win from the gig economy

Are we misunderstanding the endgame of the annoyingly named “gig economy”? At the behest of the UK government, Matthew Taylor’s review of modern working practices was published this week. The title could easily have graced a report from the 1930s, and the review is in many ways a conservative document, seeking to be “up to date” while preserving “enduring principles of fairness”.

Mr Taylor, chief executive of the RSA and a former policy adviser to the Blair government, wants to tweak the system. One proposal is to sharpen up the status of people who are neither employees nor freelancers, calling them “dependent contractors” and giving them some employment rights. In the US, economists such as Alan Krueger — formerly the chairman of Barack Obama’s Council of Economic Advisers — proposed similar reforms.

There is nothing wrong with this; incremental reform is often wise. Quaint ideas such as the employer-employee relationship are not yet obsolete. But they might yet become so, at least in some industries. If they do, I am not sure we will be ready.

The obsolescence I have in mind was anticipated by Silicon Valley’s favourite economist, Ronald Coase. Back in 1937, a young Coase wrote “The Nature of the Firm”, calling attention to something strange: while corporations competed within a competitive marketplace, corporations themselves were not markets. They were hierarchies. If you work for a company, you don’t allocate your time to the highest bidder. You do what your boss tells you; she does what her boss tells her. A few companies dabble with internal marketplaces, but mostly they are islands of command-and-control surrounded by a sea of market transactions.

Coase pointed out that the border between hierarchy and market is a choice. Corporations could extend their hierarchy by merging with a supplier. Or they could rely more on markets, spinning off subsidiaries or outsourcing functions from cleaning and catering to IT and human resources. Different companies make different choices and the ones that choose efficiently will survive.

So what is the efficient choice? That depends on the nature of the job to be done. A carmaker may well want to have the engine manufacturer in-house, but will happily buy bulbs for the headlights from the cheapest bidder.

But the choice between hierarchy and market also depends on the technology deployed to co-ordinate activity. Different technologies favour different ways of doing things. The bar code made life easier for big-box retailers, while eBay favoured the little guy, connecting buyers and sellers of niche products.

Smartphones have allowed companies such as Uber and Deliveroo to take critical middle-management functions — motivating staff, evaluating and rewarding performance, scheduling and co-ordination — and replace them with an algorithm. But gig workers could install their own software, telling it where they like to work, what they like to do, when they’re available, unavailable, or open to persuasion. My app — call it GigBot — could talk to the Lyft app and the TaskRabbit app and the Deliveroo app, and interrupt me only when an offer deserves attention.

Not every job can be broken down into microtasks that can be rented out by the minute, but we might be surprised at how many can. Remember that old line from supermodel Linda Evangelista, “We don’t wake up for less than $10,000 a day”? GigBot will talk to your alarm clock; $10 or $10,000, just name the price that would tempt you from your lie-in.
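
GigBot is, of course, imaginary; the sketch below is purely hypothetical, with platform names, time slots and an offer format invented for illustration. It simply filters out any offer that does not clear the price you have set for that slot:

    # Purely hypothetical sketch of the column's imaginary "GigBot":
    # wake the worker only for offers that clear the price they set for
    # that time slot. Platform names and the offer format are invented.
    from dataclasses import dataclass

    @dataclass
    class Offer:
        platform: str   # e.g. a ride-hailing or delivery app
        slot: str       # e.g. "weekday_evening", "sunday_lie_in"
        pay: float      # offered pay for the slot, in dollars

    RESERVATION_PRICE = {
        "weekday_evening": 40.0,
        "sunday_lie_in": 10_000.0,   # "we don't wake up for less"
    }

    def worth_interrupting(offer: Offer) -> bool:
        """Only disturb the worker if the offer clears their stated price."""
        return offer.pay >= RESERVATION_PRICE.get(offer.slot, float("inf"))

    offers = [
        Offer("delivery", "weekday_evening", 55.0),
        Offer("ride-hailing", "sunday_lie_in", 120.0),
    ]
    print([o for o in offers if worth_interrupting(o)])  # keeps only the $55 evening gig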

It is easy to imagine a dystopian scenario in which a few companies hook us in like slot-machine addicts, grind us in circles like cogs, and pimp us around for pennies. But it is not too hard to imagine a world in which skilled workers wrest back control using open-source software agents, join electronic guilds or unions and enjoy a serious income alongside unprecedented autonomy.

Nothing empowers a worker like the ability to walk out and take a better offer; in principle the gig economy offers exactly that. Indeed both scenarios may come true simultaneously, with one type of gig for the lucky ones, and another for ordinary folk.

If we are to take the best advantage of a true gig economy, we need to prepare for more radical change. Governments have been content to use corporations as delivery mechanisms for benefits that include pensions, parental leave, sick leave, holidays and sometimes healthcare — not to mention the minimum wage. This isn’t unreasonable; even a well-paid freelancer may be unable to buy decent private insurance or healthcare. Many of us struggle to save for a pension. But if freelancers really do start to dominate economic activity — if — the idea of providing benefits mostly through employers will break down.

We will need governments to provide essential benefits, perhaps minimalist, perhaps generous, to all citizens. Above that safety net, we need portable benefits — mentioned warmly but briefly by Mr Taylor — so that even a 10-minute gig helps to fill a pension pot or earn time towards a holiday. Traditional corporate jobs have been socially useful, but if you push any model too far from reality, it will snap.

 
Written for and first published in the Financial Times on 14 July 2017.


Undercover Economist

Fantasy gaming can be better than reality

“The only thing that can make me happy is computer games!” So declares my five-year-old son, tears streaming down his cheeks, with a vehement desperation that merely encourages me to ration this potent experience. I don’t recall my own parents restricting the time I spent gaming, but then I didn’t have access to computers until I was nearly 10, with the arrival of an Oric-1, with 8 glorious colours and a magnificent 48K of memory.

Maybe I didn’t spend much time on the computer anyway — my ability to play games would have been limited by the fact that my mother, something of a hacker, would be hogging it. Or perhaps I found the games less addictive than my son does, although I seem to remember many hours as a teenager playing the magnificent interstellar odyssey Elite. It cannot be denied that games are getting better: varied, beautiful, narratively engaging, and often social, too, with millions logging into online worlds, forging alliances and waging battles, all in character and alongside friends. Some games are dreadful clickfests, little better than slot machines. But that should not discredit all games any more than Fifty Shades of Grey discredits Finnegans Wake.

I stopped playing computer games in 1999, at the age of 25. Increasingly realistic games on ever-larger screens gave me motion sickness — few disincentives are quite as visceral as nausea. Then, in 2004, I met Edward Castronova. He is an economist who had enjoyed some media attention after calculating the gross domestic product per capita of Norrath, the entirely imaginary setting of an online game called EverQuest. Mr Castronova pointed out that players who slogged away at the mundane parts of the game could earn real money. In-game achievements — stronger characters, magical abilities — could be sold to other players who wanted a short-cut. The wage was about $3.50 an hour, not a lot for a New Yorker, but serious money if you lived in Dar es Salaam.

The surprising idea that people could earn a living slaying orcs attracted some attention, but Mr Castronova pointed out something else: some gamers were spending serious amounts of time online. They played not for money but for fun, devoting many hours a week to engaging, challenging and persistent online roles. And if games are really such fun, who needs reality? Why work in Starbucks when you could command a starship? Mr Castronova published a book, Exodus to the Virtual World, in 2007, describing people turning their back on the physical world and spending more time in virtual ones. It seemed highly speculative at the time. But data from the US labour market increasingly suggest that Mr Castronova was on to something.

Four economists — Mark Aguiar, Mark Bils, Kerwin Kofi Charles and Erik Hurst — have published their latest research paper studying the impact of awesome computer games on the US job market. The basic observation is this: the unemployment rate in America is at its lowest level for 16 years; if it drops a little further it will be at its lowest level since 1969. Yet some people — young men in particular — are completely disengaged from the labour market. (They don’t count as unemployed because they’re not looking for work.) In 2016 — excluding full-time students — 15 per cent of men in their twenties did not work a single week in the entire year. In the year 2000, the last time unemployment was this low, the comparable figure was 8 per cent.

So at a time when most of the people looking for jobs find them, why are so many young men not even looking? One explanation is that they think there is no hope, but another explanation is that they would rather be playing a game. Food is cheap; living with your parents is cheap; computer games are cheap. Why work? Distinguishing the two hypotheses is not easy, but Prof Aguiar and his colleagues make a good case that the pull of video games is an important part of the story. Women and older men — who spend less time playing games — are more engaged with the labour market.

This is an alarming trend: if basement-dwelling videogamers are turning their backs on reality, they are missing a vital opportunity to pick up the skills, experience and contacts they will need if they’re ever to earn a proper living. The long-term prognosis is worrying.

Then again, good games do bring happiness. Joblessness is usually a reliable predictor of misery, yet men under 30 are far less likely to be unhappy than in the early 2000s. The proportion saying they’re “very happy” or “pretty happy” has risen from 81 to 89 per cent, almost halving the rate of unhappiness. The reverse is true for men over 30.

Without exception, the longest and firmest friendships in my life are with other gamers. I favour face-to-face games with dice and pencils, but those games still involve me and my friends stepping into a fantasy world. I am worried that so many young men are disconnected from the job market. But perhaps we should stop blaming the games, and see if we can get reality to pull its socks up.

 
Written for and first published in the Financial Times on 7 July 2017.


Undercover Economist

We are still waiting for the robot revolution

The cash machine turned 50 this week — old enough, I think, to teach us a few lessons about the dawning of a new machine age. It seems a good advertisement for practical innovation that makes life a little easier. But with its very name a promise to replace a human being, the “automated teller machine” seems a harbinger of mass technological unemployment.

The story of the robot takeover has become familiar: robots came first for the bank tellers, and I did not speak out, for I was not a bank teller. Then overnight the robots were driving trucks, performing legal research and interpreting mammographic X-rays. The only jobs remaining were those writing books with titles such as Race Against The Machine and Rise of the Robots.

The difficulty with these visions of technological joblessness is that there are plenty of jobs around at the moment. In the UK, the employment rate is nearly 75 per cent; it hasn’t been higher since records began in 1971. The job situation is not quite so rosy in the US, where participation in the labour market is down since the Clinton and Bush years. Still, the unemployment rate is at a 16-year low. A job apocalypse this is not.

The ATM may help to explain this apparent puzzle. James Bessen of Boston University points out that the ATM did not, in fact, replace bank tellers — there are more bank teller jobs in the US now than when the ATM was introduced.

This should not entirely be a surprise: the original story of the cash machine is that its inventor John Shepherd-Barron had the door of his local bank slammed in his face on a Saturday lunchtime, and was frustrated that there was no way to get his money until Monday morning. Mr Shepherd-Barron didn’t invent a replacement for human tellers so much as a way to get cash at any time of the day or night. Banks opened more branches and employed humans to cross-sell loans, mortgages and credit cards instead. The automated teller worked alongside more human tellers than ever.

The ATM is no outlier here. Mr Bessen found that in the 19th century, 98 per cent of the labour required to weave cloth was automated — yet employment in the weaving industry increased as our demand for clothes more than offset the labour-saving automation. The same process seems to be at work today in the legal sector: artificial intelligence is increasingly being deployed to do tasks once done by legal clerks, but employment of those clerks is up, not down. Mr Bessen has found only one clear example of automation entirely eliminating a human job: elevator operators. There are other jobs that haven’t been taken by robots, but nevertheless have disappeared — Hansom cab driving, or operating a telegraph. Banks are not being replaced by cash dispensers so much as bypassed entirely by contactless payments and online accounts.

Overall, though, machines have been tools that have enhanced human productivity. They automate some routine tasks. This expands output and it also boosts demand for humans to perform complementary, non-routine tasks. This leads to better pay, more interesting work and as many jobs as ever overall.

Or at least — it should. Our chief economic problem right now isn’t that the robots are taking our jobs, it’s that the robots are slacking off. We suffer from slow productivity growth; the symptoms are not lay-offs but slow-growing economies and stagnant wages. In advanced economies, total factor productivity growth — a measure of how efficiently labour and capital are being used to produce goods and services — was around 2 per cent a year in the 1960s, when the ATM was introduced. Since then, it has averaged closer to 1 per cent a year; since the financial crisis it has been closer to zero. Labour productivity, too, has been low.
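
For reference, and as the standard textbook definition rather than anything spelled out in the column, total factor productivity growth is measured as a Solow residual: output growth minus the share-weighted growth of capital and labour inputs,

    \Delta \ln A_t = \Delta \ln Y_t - \alpha \, \Delta \ln K_t - (1 - \alpha) \, \Delta \ln L_t

where A is total factor productivity, Y output, K the capital stock, L labour input and α the share of capital in national income.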

Plenty of jobs, but lousy productivity: imagine an economy that was the exact opposite of one where the robots took over, and it would look very much like ours. Why? Tempting as it may be to blame the banks, a recent working paper by John Fernald, Robert Hall and others argues that productivity growth stalled before the financial crisis, not afterwards: the promised benefits of the IT revolution petered out by around 2006. Perhaps the technology just isn’t good enough; perhaps we haven’t figured out how to use it. In any case, results have been disappointing.

There is always room for the view that the productivity boom is imminent. A new policy paper from business economists Michael Mandel and Bret Swanson argues that we are starting to find digitally driven efficiencies in physical industries such as energy, construction, transport, and retail. If this happens, Silicon Valley-style innovation will ripple through the physical economy. If.

It is in the nature of exponential growth that the near future can easily outweigh the recent past. But we are still waiting. For now, the machine has stalled and the error message reads: “Sorry: this robot takeover could not be completed at present.”

Written for and first published in the Financial Times on 30 June 2017.

