Tim Harford The Undercover Economist

Undercover Economist

My weekly column in the FT Magazine on Saturdays, explaining the economic ideas around us every day. This column was inspired by my book and began in 2005.

Undercover Economist

Delusions of objectivity

‘“Naive realism” is the seductive sense that we’re seeing the world as it truly is, without bias or error’

“Have you ever noticed when you’re driving,” the comedian George Carlin commented, “that anybody driving slower than you is an idiot, and anyone going faster than you is a maniac?”

True enough. But when you think for a moment about Carlin’s quip, how could it be otherwise? You’ve made a decision about the appropriate speed for the driving conditions, so by definition everybody else is driving at a speed that you regard as inappropriate.

If I am driving at 70 and pass a car doing 60, perhaps my view should be, “Hmm, the average opinion on this road is that the right speed is 65.” Almost nobody actually thinks like this, however. Why not?

Lee Ross, a psychologist at Stanford University and co-author of a new book, The Wisest One in the Room, describes the problem as “naive realism”. By this he means the seductive sense that we’re seeing the world as it truly is, without bias or error. This is such a powerful illusion that whenever we meet someone whose views conflict with our own, we instinctively believe we’ve met someone who is deluded, rather than realising that perhaps we’re the ones who could learn something.

The truth is that we all have biases that shape what we see. One early demonstration of this was a 1954 study of the way people perceived a college-football game between Dartmouth and Princeton. The researchers, Albert Hastorf and Hadley Cantril, showed a recording of the game to Dartmouth students and to Princeton students, and found that their perceptions of it varied so wildly that it is hard to believe they actually saw the same footage: the Princeton students, for example, counted twice as many fouls by Dartmouth as the Dartmouth students did.

A more recent investigation by a team including Dan Kahan of Yale showed students footage of a demonstration and spun a yarn about what it was about. Some students were told it was an anti-abortion protest in front of an abortion clinic; others were told it was a protest outside an army recruitment office against the military’s (then) policy of “don’t ask, don’t tell”.

Despite looking at exactly the same footage, the experimental subjects drew sharply different conclusions about how aggressive the protesters were being. Liberal students were relaxed about the behaviour of people they thought were gay-rights protesters but worried about what the pro-life protesters were doing; conservative students took the opposite view. This was despite the fact that the researchers were asking not about the general acceptability of the protest but about specifics: did the protesters scream at bystanders? Did they block access to the building?

We see what we want to see. We also tend to think the worst of the “idiots” and “maniacs” who think or act differently. One study by Emily Pronin and others asked people to fill in a survey about various political issues. The researchers then redistributed the surveys, so that each participant was shown the survey responses of someone else. Then the participants were asked to describe their own reasoning and speculate about the reasoning of the other person.

People tended to say that they were influenced by rational reasons such as “attention to fact”, and that people who agreed with them had similar motives. Those who disagreed were thought to be seeking “peer approval”, or acting out of “political correctness”. I pay attention to facts but you’re a slave to the approval of your peers. I weigh up the pros and cons but you’re in the pocket of the lobbyists.

Even when we take a tolerant view of those who disagree with us, our empathy only goes so far. For example, we might allow that someone takes a different view because of their cultural upbringing — but we would tend to feel that they might learn the error of their ways, rather than that we will learn the error of ours.

Pity the BBC’s attempts to deliver objective and neutral coverage of a politicised issue such as the British referendum on leaving the EU. Eurosceptics will perceive a pro-Brussels slant, Europhiles will see the opposite. Both sides will assume corruption, knavery or stupidity is at play. That is always possible, of course, but it is also possible that passionate advocates simply don’t recognise objectivity when they see it.

But then how could the situation be otherwise? If any media outlet criticises a political position that you personally admire, there is a contradiction to be resolved, and an easy way to explain the disagreement is to conclude either that the media are biased, or that you are. You can guess which choice people instinctively make. Small wonder that careful studies of media bias in the US show that most newspapers and radio or TV stations don’t try to persuade their readers and viewers; instead, they pander to the biases of their audience.

It is hard to combat naive realism because the illusion that we see the world objectively is such a powerful one. At least I’ve not had to worry about it too much myself. Fortunately, my own perspective is based on a careful analysis of the facts, and my political views reflect a cool assessment of reality rather than self-interest, groupthink or cultural bias. Of course, there are people to the left of my position. They’re idiots. And the people on my right? Maniacs.

Written for and first published at ft.com.


Undercover Economist

Trump, trade and ‘the China shock’

‘Freer trade has inflicted a more grievous toll than economists, myself included, had expected’

It hasn’t escaped the notice of pundits that the political iconoclasts Bernie Sanders and Donald Trump have something in common: they’re sceptical about trade. Trump, for example, has riffed expansively: “We don’t win any more. We don’t beat China in trade. We don’t beat Japan . . . We can’t beat Mexico, at the border or in trade.” Sanders expressed his concerns with a little more precision: “While bad trade agreements are not the only reason why manufacturing jobs in the US have declined, they are an important factor.”

Both men have vastly outperformed expectations in the primary campaigns. There are many reasons for that but perhaps the simplest explanation is that freer trade has inflicted a more grievous toll than economists, myself included, had expected.

Fifteen years ago, the conventional economic wisdom was that free trade was almost unambiguously a good idea. Here’s the basic logic. There are two ways for the British to get hold of wine. We can grow and press our own grapes, or we can make something that the French want and trade with them. If we’re good at making, say, computer games and the French are good at making wine, then trading is the better way to get what we want.

The idea that we might, Trumpishly, “beat the French in trade” sounds appealing but is incoherent. And while a British Sanders might point to the loss of jobs in the UK wine industry, that would miss the gains in the software industry. There is little economic difference between a tariff on the import of French wine and a tariff on the export of British software.

Here’s a parable beloved of economists. An entrepreneur announces a technological breakthrough: he has a machine that can disintegrate computer game discs and reconstitute the atoms into fine wine. He sets up a factory on the coast of Kent with the machine inside. Computer games go in, and cases of wine emerge. But then an investigative reporter from the Financial Times gains access to the factory and finds that there is no machine — just a dock where a forklift truck operator busily unloads French wine from a boat, replacing it with computer games for export to the French market. Should we care? From the point of view of the British, isn’t France merely a technology for converting computer games into wine?

With formal models to back up this sort of story, most economists took the view that when countries lower their trade barriers, even unilaterally, they prosper. What the British wine industry loses, the UK computer games industry gains. Meanwhile, consumers get better and cheaper wine into the bargain.

It was always clear that, despite the win-win nature of trade at the national level, freer trade could create losers — such as British vineyards and French computer game studios. But the conventional wisdom was that these losses were both small and fixable with the right policies of retraining or redistribution. Most importantly, people who lost their jobs could find new ones in booming export industries.

Admittedly, it was evident even 20 years ago that median household incomes were stagnating in the US, inequality was rising in anglophone countries, and manufacturing employment was steadily falling. But these trends seemed to owe more to technological change than to globalisation.

I’ve been phrasing all this “conventional wisdom” in the past tense but, for the most part, it stands up. However, it is acquiring an important and depressing footnote. A new research paper, “The China Shock”, from David Autor, David Dorn and Gordon Hanson, is part of a rethink under way in the economics profession.

Autor and his colleagues try to zoom in on the impact of China’s emergence as a trading power. China’s rise has been dramatic, driven almost entirely by policy changes inside China, and it has affected different regions and industries differently. For example, Tennessee and Alabama are both US manufacturing centres exposed to global competition. But Tennessee’s furniture manufacturing industry is much more exposed to China in particular than are Alabama’s heavier manufacturing industries. This variation helps the researchers to establish with more confidence what the impact of the China shock has been.

Autor, Dorn and Hanson conclude that the American workers who have been hurt by competition with China have been hurt more deeply, and for a longer period, than many economists predicted. Employment has fallen in industries exposed to trade competition, as expected. But it has not shown much sign of rising in export-oriented sectors.

The US labour market is less flexible than we thought, it seems. In a simplified economic model, workers move smoothly to a new home, a new industry, even a new level of education. In practice, Autor and his colleagues find that communities hit by Chinese competition often do not adapt; they wither. It may take a generation or two, rather than a few years, to adjust.

In the long run, of course, that adjustment will happen — just as we have adjusted to the decline of agricultural labour or the need for typewriter repairs. But the long run is longer than many economists feared. It is easy to see why supporters of Trump and Sanders have run out of patience.

Written for and first published at ft.com.


Undercover Economist

Capital ideas in a time of inequality

‘The wealthy do not simply wallow in bank vaults like Scrooge McDuck. They spend their money’

In January 1963, Warren Buffett included the following impish observation in his letter to his investment partners. “I have it from unreliable sources that the cost of the voyage Isabella underwrote for Columbus was approximately $30,000.”

Unreliable indeed; there was no dollar in 1492. But we get the gist. Buffett goes on to observe that while the voyage could be considered “at least a moderately successful utilisation of venture capital”, if Queen Isabella had instead invested the $30,000 in something yielding 4 per cent compound interest, the invested sum would have risen to $2tn by 1962. For her inheritors’ sake, perhaps Isabella should have said no to Columbus and simply found the 15th-century equivalent of a passive index fund instead.

Buffett’s thought experiment returned to me as I browsed through the latest list of billionaires from Forbes. None of the leading players had achieved their position by the simple accumulation of family wealth over generations. The top five — Bill Gates, Amancio Ortega, Buffett, Carlos Slim and Jeff Bezos — are all entrepreneurs of one form or another. According to economists Caroline Freund and Sarah Oliver, the proportion of billionaires who inherited their fortunes has fallen from 55 per cent two decades ago to 30 per cent today.

Is this absence of old-money trillionaires because Buffett’s 4 per cent compound interest was unavailable to the wealthy and powerful of pre-industrial Europe? Hardly. If anything, 4 per cent is conservative. According to Thomas Piketty’s bestselling book Capital in the Twenty-First Century (2013), the real rate of return on capital, after taxes and capital losses, was 4.5 per cent in the 16th and 17th centuries, then 5 per cent until 1913. Although it fell sharply between the wars, the effective average rate of return was very nearly 4.3 per cent across the five centuries. At that rate, $30,000 invested in 1492 would be worth $110tn today.

Not to get too technical, but $110tn is a lot of money. It’s more than 1,000 times the wealth of the richest man in the world, Bill Gates. It’s 17 times the total wealth of the 1,810 billionaires on the Forbes list — or, alternatively, nearly half the household wealth of every citizen on the planet. (According to Credit Suisse’s Global Wealth Report, total global household wealth is $250tn.) Queen Isabella’s investment advisers apparently let her down. Patient, conservative investments would have left her heir today with a fortune to tower over every modern plutocrat.
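The compounding is easy to check. Here is a minimal Python sketch, assuming annual compounding at Piketty’s 4.3 per cent average real return from 1492 to 2016; the exact answer is very sensitive to the assumed rate and end date, so treat it as an order-of-magnitude exercise:

```python
# Compound growth of Isabella's hypothetical $30,000 at a 4.3% real
# return, compounding annually from 1492 to 2016. Both the rate and
# the end date are rough assumptions taken from the column.
principal = 30_000
rate = 0.043
years = 2016 - 1492  # 524 years

value = principal * (1 + rate) ** years
print(f"${value:.2e}")  # on the order of 10^14 dollars, i.e. roughly $110tn
```

Shaving even half a percentage point off the rate cuts the final sum by more than an order of magnitude, which is why Buffett’s 4 per cent gives “only” trillions rather than hundreds of trillions.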

All this brought to mind Piketty’s “r>g”, a mathematical expression so celebrated that people started putting it on T-shirts. It describes a situation where “r” (the rate of return on capital) exceeds “g” (the growth rate of the economy as a whole). That was the situation for most of human history, but notably not in the 20th century, when growth rates soared while capital tended to be nationalised, confiscated or reduced to rubble.

“r>g” is significant because if capital is reinvested and grows faster than the economy, it will tend to loom larger in economic activity. And since capital is more unequally distributed than labour income, “r>g” may describe a society of increasingly entrenched privilege, where wealth and power steadily accrue in the hands of heirs.

This is a fascinating, and worrying, possibility. But it is a poor description of the modern world. For one thing, when billionaires divide their inheritance, mere procreation can be a social equaliser. Historically, the great houses of Europe intermarried and concentrated wealth in the hands of a single heir. (No wonder: one of Queen Isabella’s grandsons, Ferdinand I, had 15 children.) But these days, disinheriting daughters and second sons is out of fashion. (That said, “assortative mating”, the tendency of educated people to marry each other, is back and may explain more about rising income inequality than we tend to realise.)

Another thing: the rich do not simply wallow in money vaults like Scrooge McDuck. They spend. According to Harvard economist Greg Mankiw: “A plausible estimate of the marginal propensity to consume out of wealth, based on both theory and empirical evidence, is about 3 per cent.” Instead of 4.3 per cent, then, wealth compounds at 1.3 per cent after allowing for this spending. Five centuries of compound interest at 1.3 per cent turns $30,000 into about $25m, a fine inheritance indeed but not the kind of money that will get you near the Forbes list.
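Mankiw’s adjustment is a one-line change to the same compound-interest arithmetic: subtract the 3 per cent consumed each year from the 4.3 per cent return. A sketch, again using 1492 to 2016 as an assumed illustrative span rather than Mankiw’s own calculation:

```python
# Net compounding once spending is allowed for: a 4.3% gross return
# minus a 3% marginal propensity to consume out of wealth leaves 1.3%.
# The dates and rates are rough assumptions for illustration.
principal = 30_000
net_rate = 0.043 - 0.030
years = 2016 - 1492

value = principal * (1 + net_rate) ** years
print(f"${value:,.0f}")  # roughly $26m: a fine inheritance, not a Forbes-list one
```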

Of course, inherited privilege shapes our societies not only among the plutocracy but down in the rolling foothills of English middle-class wealth. There, economic destiny is increasingly governed by whether your parents bought a house in the right place at the right time — and by the UK government’s astonishing abolition of inheritance tax on family homes.

But whether mega-wealth in the 21st century will be driven by the patient accumulation of rents on capital, rather than the disruptive entrepreneurship of the late 20th century, remains to be seen. After all, long-term real interest rates in advanced economies have fallen fairly steadily from 4 to 5 per cent three decades ago to nothing at all today. You don’t need to be Warren Buffett to figure out that if you want to get rich by accumulating compound interest of zero, you’ll be waiting a long time.

Written for and first published at ft.com.


Undercover Economist

These are the sins we should be taxing

‘The UK already relies more than most rich countries on fuel, alcohol and tobacco duties’

Is it time to rethink the way we tax sin? The UK has long levied special taxes — “duties” — on products that pollute the environment, the lungs, the liver or the pocketbook: driving, flying, tobacco, alcohol and gambling. There are good reasons for these taxes. The government must raise revenue somehow, so there is much to be said for taxing products that are price-insensitive, socially harmful or, at the very least, unhealthy temptations.

But the way sin taxes are levied in practice is an incoherent muddle. Plenty of products that are bad for us (bacon, butter, sugar) get favourable tax treatment, attracting no value-added tax, although the standard VAT rate is 20 per cent. Heating and lighting our homes also attracts a concessional rate of tax, although a kilogram of carbon dioxide emitted from a power station or a gas boiler contributes to climate change just as much as a kilogram emitted from a car. Vehicle excise duty is a tax not on driving but on owning a car. And the rate of duty on alcohol varies depending on how we drink it.

It is easy to see how successive chancellors bodged their way to this point. Taxes on the pint or at the pump are eye-catching; raising them seems morally serious but cutting them is a crowd-pleaser. And so they bounce around like the political football they are.

George Osborne has an opportunity to fix the situation this Wednesday during his Budget speech. Here’s what he should do.

First, similar harms should attract similar taxes. The UK duty on 10ml of pure alcohol, roughly the amount in a shot of vodka or half a pint of beer, varies wildly. It is about 7p in strong cider, 18p in strong beer, or 28p in whisky and wine. A consistent price per millilitre would make more sense.

Second, he should broaden the sin tax base. UK duties are concentrated on tobacco, motor fuel and alcohol. As the Institute for Fiscal Studies showed in its “green budget” in February, revenue from duties has been falling for decades, from 4.1 per cent of national income in the early 1980s to 2.6 per cent last year. This drop is the net result of falling duties on fuel (back to the levels of 20 years ago in real terms), declining duties on alcohol and lower consumption of both tobacco and alcohol.

Should the chancellor, then, raise duties? Perhaps, but there are limits. The UK already relies more than most rich countries on fuel, alcohol and tobacco duties. Above a certain level, the smugglers and bootleggers take over.

A wiser approach is to tax sins that have thus far escaped attention. The most obvious is congestion: fuel taxes do not distinguish between driving along an uncongested country road and driving in rush-hour in a built-up area, which causes vastly more social harm. Congestion charges, which are now technically feasible, are fair and efficient, if the political case can be made.

Another obvious sin is sugar. While one can be too puritanical about nudging people to take care of their health and waistline, it seems strange that perfectly reasonable activities such as buying a T-shirt or earning a living attract tax, while sugar is tax-free. A sugar tax of a half-penny a gram would add about 18p to the cost of a can of Coke, more than that to a family pack of Bran Flakes, 25p to a 200ml bottle of ketchup and 45p to the price of a packet of chocolate digestives.

Third, Osborne should avoid arbitrary cut-offs where possible. In a bygone age it must have been simpler to slap a tax on an item in a particular category but this has led to the infamous “Jaffa Cake” problem. Are Jaffa Cakes — sponge discs with an orange jelly topping, partly coated in chocolate — cakes (zero VAT) or biscuits (VAT at 20 per cent)? A tribunal in 1991 mused that Jaffa Cakes are packaged much like biscuits, are sold next to biscuits, are the same size and shape as biscuits and, like biscuits, are eaten without a fork. However, it also noted that they are made of a cake-like dough, are soft and, like a cake, they go harder when stale. The tribunal eventually concluded that Jaffa Cakes are cakes, and thus they remain tax-advantaged, even as they nestle on the supermarket shelves next to their biscuitish rivals.

All of this is nonsense from any angle. In discouraging unhealthy eating, the relevant issue should not be whether food is circular or requires a fork but how much sugar, salt and saturated fat it contains. Sugar can be measured and taxed by the gram, whether it comes dissolved in oft-demonised soft drinks or added to bread, cereal, ready-meals, chocolate bars or anything else.

Finally, we should not worry too much about the distributional consequences of sin taxes. This isn’t because distribution doesn’t matter — it does. But, by some measures at least, alcohol and fuel duties hit the middle classes harder than they hit the poor. In any case, there are better ways to deal with inequality than by cutting sin taxes. People on low incomes need support but that help is better provided through tax credits, child benefit or good public services rather than cheap booze, sweets and tobacco. We are all free to buy vodka and cigarettes. Yet trying to make them cheaper would be a strange way to address social injustice.

Written for and first published at ft.com.


Undercover Economist

The lost leisure time of our lives

‘Keynes was right to predict that we would be working less but overestimated for how long that trend would continue’

“Three hours a day is quite enough,” wrote John Maynard Keynes in his 1930 essay Economic Possibilities for our Grandchildren. The essay continues to tantalise its readers today, thanks in part to a forecast that is looking magnificently right — that in advanced economies people could be up to eight times better off in 2030 than in 1930 — coupled with a forecast that is looking spectacularly wrong, that we would be working 15-hour weeks.

In 2008, economists Lorenzo Pecchi and Gustavo Piga edited a book in which celebrated economists pondered Keynes’s essay. One contributor, Benjamin Friedman of Harvard University, has recently revisited the question of what Keynes got wrong, and produced a thought-provoking answer.

First, it is worth teasing out the nature and extent of Keynes’s error. He was right to predict that we would be working less. We enter the workforce later, after long and not-always-arduous courses of study. We enjoy longer retirements. The work week itself is getting shorter. In non-agricultural employment in the US, the week was 69 hours in 1830 — the equivalent of working 11 hours a day but only three hours on Sundays. By 1930, a full-time work week was 47 hours; each decade, American workers were working two hours less every week.

But Keynes overestimated how rapidly and for how long that trend would continue. By 1970 the work week was down to 39 hours. If the work week had continued to shrink, we would be working 30-hour weeks by now, and perhaps 25-hour weeks by 2030. But by around 1970, the slacking-off stopped. Why?
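The extrapolation here is just a straight line. A sketch of the pre-1970 trend and where it would have led, using the column’s figures (a 47-hour week in 1930, shrinking by two hours per decade):

```python
# Linear extrapolation of the pre-1970 trend in the US full-time
# work week: 47 hours in 1930, falling ~2 hours per decade.
def projected_week(year, base_year=1930, base_hours=47, per_decade=2):
    return base_hours - per_decade * (year - base_year) / 10

print(projected_week(1970))  # 39.0 -- matches the observed figure
print(projected_week(2016))  # ~29.8 -- the 30-hour week we never got
print(projected_week(2030))  # 27.0 -- roughly the trajectory Keynes imagined
```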

One natural response is that people are never satisfied: perhaps their desire to consume can be inflamed by advertisers; perhaps it is just that one must always have a better car, a sharper suit, and a more tasteful kitchen than the neighbours. Since the neighbours are also getting richer, nothing about this process allows anyone to take time off.

No doubt there is much in this. But Friedman takes a different angle. Rather than asking how Keynes could have been so right about income but so wrong about leisure, Friedman points out that Keynes might not have been quite so on the mark about income as we usually assume. For while the US economy grew briskly until the crisis of 2007, median household incomes started stagnating long before then — around 1970, in fact.

The gap between the growth of the economy and the growth of median household incomes is explained by a patchwork of factors, including a change in the nature of households themselves, with more income being diverted to healthcare costs, and an increasing share of income accruing to the highest earners. In short, perhaps progress towards the 15-hour work week has stalled because the typical US household’s income has stalled too. Household incomes started to stagnate at the same time as the work week stopped shrinking.

This idea makes good sense but it does not explain what is happening to higher earners. Since their incomes have not stagnated — far from it — one might expect them to be taking some of the benefits of very high hourly earnings in the form of shorter days and longer weekends. Not so. According to research published by economists Mark Aguiar and Erik Hurst in 2006 — a nice snapshot of life before the great recession — higher earners were enjoying less leisure.

So the puzzle has taken a different shape. Ordinary people have been enjoying some measure of both the income gains and the leisure gains that Keynes predicted — but rather less of both than we might have hoped.

The economic elites, meanwhile, continue to embody a paradox: all the income gains that Keynes expected and more, but limited leisure.

The likely reason for that is that, in many careers, it’s hard to break through to the top echelons without putting in long hours. It is not easy to make it to the C-suite on a 20-hour week, no matter how talented one is. And because the income distribution is highly skewed, the stakes are high: working 70 hours a week like it’s 1830 all over again may put you on track for a six-figure bonus, while working 35 hours a week may put you on track for the scrapheap.

The consequences of all this can emerge in unexpected places. As a recent research paper by economists Lena Edlund, Cecilia Machado and Maria Micaela Sviatschi points out, urban centres in the US were undesirable places to live in the late 1970s and early 1980s. People paid a premium to live in the suburbs and commuted in to the city centres to work. The situation is now reversed. Why? The answer, suggest Edlund and her colleagues, is that affluent people don’t have time to commute any more. They’ll pay more for cramped city-centre apartments if by doing so they can save time.

If there is a limited supply of city-centre apartments, and your affluent colleagues are snapping them up, what on earth can you do? Work harder. Homes such as Keynes’s elegant town house in Bloomsbury now cost millions of pounds. Three hours a day is not remotely enough.

Written for and first published at ft.com.


Undercover Economist

How to make good guesses

‘Would you say that someone reading the FT is more likely to have a PhD or to have no college degree at all?’

What’s the likelihood that the British economy will fall into recession this year? Well, I’ve no idea — but I have a new way to guess.

Before I reveal what this is, here’s a totally different question. Imagine that you see someone reading the Financial Times. Would you say that this individual, clearly a person of discernment, is more likely to have a PhD or to have no college degree at all?

The obvious response is that the FT reader has a PhD. Surely people with PhDs better exemplify the FT reader than people with no degree at all, at least on average — they tend to read more and to be more prosperous.

But the obvious response is too hasty. First, we should ask: how many people have PhDs, and how many have no college degree at all? In the UK, more than 75 per cent of adults have no degree, but the chance that a randomly chosen person has a PhD is probably less than 1 per cent.

It takes only a small proportion of non-graduates reading the FT for them to outnumber the PhD-holding readers. This fact should loom large in our guess, but it does not.

Logically, one should combine the two pieces of information, the fact that PhDs are rare with the fact that FT readers tend to be well educated. There is a mathematical rule for doing this perfectly (it’s called Bayes’ rule) but numerous psychological experiments suggest that it never occurs to most of us to try. It’s not that we combine the two pieces of information imperfectly; it’s that we ignore one of them completely.

The number that gets ignored (in this example, the rarity of PhDs) is called the “base rate”, and the fallacy I’ve described, base rate neglect, has been known to psychologists since the 1950s.

Why does it happen? The fathers of behavioural economics, Daniel Kahneman and Amos Tversky, argued that people judge such questions by their representativeness: the FT reader seems more representative of PhDs than of non-graduates. Tversky’s student, Maya Bar-Hillel, hypothesised that people seize on the most relevant piece of information: the sighting of the FT seems relevant, the base rate does not. Social psychologists Richard Nisbett and Eugene Borgida have suggested that the base rate seems “pallid and abstract”, and is discarded in favour of the vivid image of a person reading the pink ’un. But whether the explanation is representativeness, relevance, vividness or something else, we often ignore base rates, and we shouldn’t.

At a recent Financial Times event, psychologist and forecasting expert Philip Tetlock explained that good forecasters pay close attention to base rates. Whether one is forecasting whether a marriage will last, or a dictator will be toppled, or a company will go bankrupt, Tetlock argues that it’s a good idea to start with the base rate. How many marriages last? How many dictators are toppled? How many companies go bankrupt? Of course, one may have excellent reasons to depart from the base rate as a forecast but the base rate should be the beginning of the journey.

On this basis, my guess is that there is a 10 per cent chance that the UK will begin a recession in 2016. How so? Simple: in the past 70 years there have been seven recessions, so the base rate is 10 per cent.

Base rates are not just a forecasting aid. They’re vital in clearly understanding and communicating all manner of risks. We routinely hear claims of the form that eating two rashers of bacon a day raises the risk of bowel cancer by 18 per cent. But without a base rate (how common is bowel cancer?) this information is not very useful. As it happens, in the UK, bowel cancer affects six out of 100 people; a bacon-rich diet would cause one additional case of bowel cancer per 100 people.

Thinking about base rates is particularly important when we’re considering screening programmes or other diagnostic tests, including DNA tests for criminal cases.

Imagine a blood test for a dangerous disease that is 75 per cent accurate: if an infected person takes the test, it will detect the infection 75 per cent of the time but it will also give a false positive 25 per cent of the time for an uninfected person. Now, let’s say that a random person takes the test and seems to be infected. What is the chance that he really does have the disease? The intuitive answer is 75 per cent. But the correct answer is: we don’t know, because we don’t know the base rate.

Once we know the base rate we can express the problem intuitively and solve it. Let’s say 100 people are tested and four of them are actually infected. Then three will have a (correct) positive test, but of the 96 uninfected people, 24 (25 per cent) will have a false positive test. Most of the positive test results, then, are false.
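The worked example above can be written out as a calculation, which makes it easy to see how sensitive the answer is to the base rate:

```python
# The column's example: a 75%-accurate test applied to 100 people,
# of whom 4 are actually infected (a 4% base rate).
population = 100
infected = 4
sensitivity = 0.75           # detects a real infection 75% of the time
false_positive_rate = 0.25   # wrongly flags an uninfected person 25% of the time

true_positives = infected * sensitivity                          # 3 people
false_positives = (population - infected) * false_positive_rate  # 24 people

share_real = true_positives / (true_positives + false_positives)
print(f"Chance a positive result reflects a real infection: {share_real:.0%}")
```

Only 3 of the 27 positive results are genuine, so the intuitive answer of 75 per cent is out by almost a factor of seven. Raise the base rate and the test's positives become far more trustworthy; lower it and they become nearly worthless.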

It’s easy to leap to conclusions about probability, but we should all form the habit of taking a step back instead. We should try to find out the base rate, or at least to guess what it might be. Without it, we’re building our analysis on empty foundations.

Written for and first published at ft.com.

Free email updates

(You can unsubscribe at any time)

Undercover Economist

The consequences of cheap oil

‘When oil prices are high, people may get out of their cars and walk, cycle or get public transport’

After years in which $100 oil was the norm, the price of Brent crude is now around a third of that. Assume for a moment that Russia and Saudi Arabia fail in their efforts to get the price back up. Will $30 oil change the world? The answer is yes, of course. Everything is connected to everything else in economics, and that is particularly true when it comes to oil. For all the talk of the weightless economy, we’re not quite so post-industrial as to be able to ignore the cost of energy. Because oil is versatile and easy to transport, it remains the lubricant for the world’s energy system.

The rule of thumb has always been that while low oil prices are bad for the planet, they're good for the economy. Last year a report from PwC estimated that a permanent $50 fall in the price of oil would boost the size of the UK economy by about 1 per cent over five years, since the benefits — to most sectors but particularly to heavy industry, agriculture and air travel — would outweigh the costs to the oil production industry itself.

That represents the conventional wisdom, as well as historical experience. Oil was cheap throughout America’s halcyon years of the 1950s and 1960s; the oil shocks of the 1970s came alongside serious economic pain. The boom of the 1990s was usually credited to the world wide web but oil prices were very low and they soared to record levels in the run-up to the great recession. We can debate how important the oil price fluctuations were but the link between good times and cheap oil is not a coincidence.

Here’s a piece of back-of-the-envelope economics. The world consumes nearly 100 million barrels a day of oil, which is $10bn a day — or $3.5tn a year — at the $100 price to which we’ve become accustomed. A sustained collapse in the oil price would slice more than $2tn off that bill — set against a world economic output of around $80tn, that’s far from trivial. It is a huge transfer from the wallets of oil producers to those of oil consumers.
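The back-of-the-envelope sum above is easy to reproduce; the figures below are the column's own round numbers, not precise market data:

```python
# Back-of-the-envelope world oil bill, using the column's round numbers.
barrels_per_day = 100e6          # "nearly 100 million barrels a day"
old_price = 100                  # dollars per barrel, the old norm
new_price = 30                   # dollars per barrel, roughly the new price

annual_bill_at_100 = barrels_per_day * old_price * 365   # about $3.65tn
annual_saving = barrels_per_day * (old_price - new_price) * 365

print(f"Annual bill at $100 oil: ${annual_bill_at_100 / 1e12:.2f}tn")
print(f"Annual transfer to consumers at $30 oil: ${annual_saving / 1e12:.2f}tn")
```

Set against world output of around $80tn, the saving is a few per cent of global GDP, which is why the column calls it far from trivial.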

Such large swings in purchasing power always used to boost economic growth, because while producers were saving the profits from high prices, consumers tended to spend the windfall from low ones. One of the concerns about today's low prices is that the positions may be reversing: the big winners, American consumers, are using the spare cash to pay off debts; meanwhile, losers such as Russia and Saudi Arabia are cutting back sharply on investment and public spending. If carried to extremes, that would mean a good old-fashioned Keynesian slowdown, with the world economy trying to spend less and save more. The more likely result is simply that lower oil prices fail to give us the boost we hope for.

It is intriguing to contemplate some of the less obvious effects. Charles Courtemanche, a health economist at Georgia State University, has found a correlation between low gasoline prices and high obesity rates in the United States. That is partly because, when oil prices are high, people may get out of their cars and walk, cycle or get public transport. Cheap gasoline, on the other hand, puts disposable income into the pockets of families who are likely to spend it on eating out. Low oil prices may make us fat.

Another depressing possibility is that low oil prices will slow down the rate of innovation in the clean energy sector. The cheaper the oil, the less incentive there is to invent ways of saving it. There is clear evidence for this over the very long run. As recently as the late 1700s, British potters were using wasteful Bronze Age technology for their kilns. The reason? Energy was cheap. Wages, in contrast, were expensive — which is why the industrial revolution was all about saving labour, not saving energy.

More recently, David Popp, an economist at Syracuse University, looked at the impact of the oil price shocks of the 1970s. He found that inventors emerged from the woodwork to file oil-saving patents in fields from heat pumps to solar panels.

It is always possible that the oil price collapse will do little to affect some of the big technological shifts in the energy market. The scale of oil production from hydraulic fracturing (fracking) in the US may be curtailed but a huge technological leap has already happened. As the chief economist of BP, Spencer Dale, recently commented, fracking is starting to look less like the huge, long-term oil-drilling projects of the past, and more like manufacturing: cheap, lean, replicable and scalable. Low oil prices cannot undo that and the efficiencies may well continue. We can hope for ever-cheaper solar power too: photovoltaic cells do not compete closely with oil, and we may continue to see more and more installations and lower and lower prices.

That said, when fossil fuels are cheap, people will find ways to burn them, and that’s gloomy news for our prospects of curtailing climate change. We can’t rely on high oil and coal prices to discourage consumption: the world needs — as it has needed for decades — a credible, internationally co-ordinated tax on carbon.

Written for and first published at ft.com.


Undercover Economist

Online dating? Swipe left

‘It is crazy to believe someone’s eye colour, height, hobbies and musical tastes are a basis for a lasting relationship’

Online dating promised so much. “This is one of the biggest problems that humans face and one of the first times in human history there was some innovation,” says Michael Norton, a psychologist at Harvard Business School.

Finding the right partner, whether for life or for Saturday night, is so important to so many people that you would think we might have cracked it by now. By assembling a vast array of date-worthy people in a searchable format, online dating seems like it should be a huge improvement on the old-fashioned methods of meeting people at work, through friends, or in bars and nightclubs. But it’s not clear that the innovation of online dating is helping very much.

A simple survey that Norton conducted with two other behavioural scientists, Jeana Frost and Dan Ariely, revealed that people were unhappy with their online dating experience in three obvious ways. The first was that the “online” bit of the dating was about as much fun as booking a dentist’s appointment. The second was that it took for ever — the typical survey respondent spent 12 hours a week browsing through profiles and sending and receiving messages, yielding less than two hours of offline interaction. Now, 106 minutes are plenty for certain kinds of offline interaction but, however people were spending their time together, they didn’t seem satisfied. This was the third problem: people tended to have high expectations before the dates they had arranged online but felt disenchanted afterwards. To adapt a Woody Allen joke: not only are the dates terrible but there are so few of them.

Given that online dating tends to be tedious, time-consuming and fruitless, it is no surprise that we seem hungry for a better way. Most approaches to online dating have tried to exploit one of the two obvious advantages of computers: speed and data-processing power. Apps such as Grindr and Tinder allow people to skim quickly through profiles based on some very simple criteria. (Are they hot? Are they available right now?) That is, of course, fine for a one-night stand but less promising for a more committed relationship.

The alternative, embraced by more traditional matchmaking sites such as Match.com and OkCupid, is to use the power of data to find the perfect partner. We badly want to believe that after giving a website a list of our preferences, hobbies and answers to questions such as, “Do you prefer the people in your life to be simple or complex?”, a clever algorithm will produce a pleasing result.

Because these pleasing results seem elusive, wishful thinking has gone into overdrive. We hold out hope that if only we could be cleverer, the algorithms would deliver the desired effect. For example, Amy Webb’s TED talk “How I Hacked Online Dating” has been watched more than four million times since it was posted in 2013.

In a similar vein, Wired magazine introduced us to Chris McKinlay, “the math genius who hacked OkCupid” and managed to meet the woman of his dreams after cleverly reverse-engineering the website’s algorithms. The brilliance of McKinlay’s achievement is somewhat diminished by the revelation that he had to work his way through unsuccessful dates with 87 women before his “genius” paid dividends.

This should hardly be a surprise. Imagine looking at the anonymised dating profiles of 10 close friends and comparing them with the profiles of 10 mere acquaintances. Using the profile descriptions alone, could you pick out the people you really like? The answer, says Dan Ariely, is no. “It’s terrible. It’s basically random.”

It is crazy to believe that someone’s eye colour and height, or even hobbies and musical tastes, are a basis for a lasting relationship. But that is the belief that algorithmic matching encourages. Online dating is built on a Google-esque trawl through a database because that’s the obvious and easy way to make it work.

Is there a better way? Perhaps. Jeana Frost’s PhD research explored an alternative approach to online dating. Why not, she asked, make online dating a bit less like searching and a bit more like an actual date? She created a virtual image gallery in which people had a virtual date, represented by simple geometric avatars with speech bubbles. The images — from Lisa and Jessica Simpson to George Bush and John Kerry — were conversation starters. People enjoyed these virtual dates and, when they later met in person, the virtual date seems to have worked well as an icebreaker.

Virtual dating has not taken off commercially, says Norton, in part because companies have tried too hard to make it realistic, and have fallen into the “uncanny valley” of the not-quite-human. I suspect, but cannot prove, that virtual spaces such as World of Warcraft are perfectly good places to meet a soulmate, assuming your soulmate happens to like orc-bashing. Perhaps mainstream virtual dating is just waiting for the right design to emerge.

Or perhaps the problem is deeper: online dating services prosper if they keep us coming back for more. Setting someone up with a romantic partner for life is no way to win a repeat customer.

Written for and first published at ft.com.


Undercover Economist

How to keep your gym habit

‘Might a commitment strategy allow you to pay yourself to go to the gym?’

How are those resolutions going? Still going to the gym? If not, you’re not alone.

Let’s think about incentives. If some benevolent patron had paid you a modest sum — a few pounds a day, perhaps — for keeping your resolution throughout January, would that have helped you keep fit now that January is behind us?

The answer is far from clear. An optimistic view is that by paying you to look after yourself in January, your mysterious patron would have encouraged you to form good habits for the rest of the year. The most obvious case would be if you were trying to give up cigarettes; paying you to get through the worst of the withdrawal period might help a lot. Perhaps diet and exercise would be similarly habit-forming.

Yet some psychologists would argue that the payment is worse than useless, because payments can chip away at our intrinsic motivation to exercise. Once we start paying people to go to the gym or to lose weight, the theory goes, their inbuilt desire to do such things will be corroded. When the payments stop, things will be worse than if they had never started.

The idea that external rewards might crowd out intrinsic motivation is called overjustification. In a celebrated study in 1973 conducted by Mark Lepper, David Greene and Richard Nisbett, some pre-school children were promised sparkly certificates as a reward for drawing with special felt-tip pens. Others were given no such promise. When the special pens were reintroduced to the nursery classrooms a week or so later, without any reward on offer, the researchers found that the children who had previously been promised certificates for their earlier drawing now spent half as much time with the pens as their peers. Only suckers draw for free.

There’s a big difference between exercising and colouring, however: while many children like felt-tips, many adults do not like exercising. A payment can hardly crowd out your intrinsic motivation if you don’t have any intrinsic motivation in the first place. Systematic reviews of the overjustification effect suggest that incentives do no harm for activities that people find unappealing anyway.

So perhaps the idea of paying people to exercise is worth thinking about after all. In 2009, two behavioural economists, Gary Charness and Uri Gneezy, published the results of a pair of experiments in which they tried it. Some of their experimental subjects were paid $100 to go to the gym eight times in a month, while those in two alternative treatment groups were either paid $25 for going just once, or weren’t asked to go to the gym at all.

The results were a triumph for the habit-formation view. The payments worked even after they had stopped. In one study, seven weeks after the bonus payments ended, the subjects were exercising twice as often as they had before the study began; in the other, the increase was threefold 13 weeks after payments had stopped. People who were already regular gym-goers didn't change their behaviour — so there was no crowding-out — but there was a surge in exercise from people who hadn't previously done much. A later study by Dan Acland and Matthew Levy found a similar habit-forming effect among students, although, alas, the good habits often failed to survive the winter vacation. In other experiments, incentive payments have been shown to be modestly successful at helping smokers to give up.

There is much to be said for a benign patron who pays you to stay healthy while you form good habits. But where might such a person be found? Take a look in the mirror — your patron might be you.

Inspired by the ideas of Nobel laureate Thomas Schelling, economists have become fascinated by the idea of commitment strategies, where your virtuous self takes steps to outmanoeuvre your weaker self before temptation strikes. A simple commitment strategy is to hand £500 to a trusted friend, with instructions that they are only to return the cash if you keep your resolution.

Might a commitment strategy allow you to pay yourself to go to the gym? It might indeed. Economists Heather Royer, Mark Stehr and Justin Sydnor recently published the results of a long-term experiment conducted with 1,000 employees of a Fortune 500 company. In this experiment, some employees were initially paid $10 for each visit to the company gym over a month. Some of them were then offered the opportunity to put money into a commitment savings account: if they kept exercising, the money would be returned; otherwise it would go to charity. The approach was no panacea: most people did not take up the option, and not everyone who did managed to stick to their goals. But even three years later, those who had been offered commitment accounts were 20 per cent more likely to be exercising than the control group.

That chimes with my experience. I once wrote a column about sending $1,000 to a company called Stickk, which promised to give it away if I didn’t exercise regularly. The contract was for a mere three months — and I succeeded. Eight years after my money was returned, I’m still sticking to the habit.

Written for and first published at ft.com.


Undercover Economist

Hidden truths behind China’s smokescreen

‘When countries become richer, do they pollute their environment more or less?’

The pictures from Beijing tell their own story: pollution there is catastrophic. Bad news for residents, and awkward for me too. Just over a decade ago, I wrote a book, The Undercover Economist, which among many other things cheerfully asserted that particulate air pollution in urban China was sharply falling as the country grew richer. It’s a claim I believed at the time (based on well-regarded research in the 2002 Journal of Economic Perspectives) but with each new report of smog over China, I felt a nagging sense that I had led readers astray. I figured it was time to do some more research and to set the record straight.

There is a broader question here. When countries become richer, do they pollute their environment more or less? For a while it seemed obvious that pollution and riches went hand in hand: industrialised nations spewed out more of everything.

But then the leading countries began to crack down on pollution. London no longer suffers from smog. The European Union reduced sulphur dioxide emissions by more than 80 per cent between 1990 and 2011. At the same time, the United States has reduced atmospheric lead by 98 per cent.

In the early 1990s, Princeton economists Gene Grossman and Alan Krueger coined the phrase “environmental Kuznets curve” to stand for the idea that as countries become richer, their emissions first rise but then fall, as richer citizens demand cleaner air from the governments they elect and the companies from whom they buy. There’s some evidence that this is true but it’s hard to interpret that evidence. An optimistic view is that countries reduce pollution with or without economic growth because they can use clean technologies developed elsewhere. If true, China may be able to clean up its air faster than we’d expect.

A grimmer possibility is that the richer countries aren’t really reducing pollution — they are exporting it, by banning dirty factories at home while happily buying from dirty factories abroad. On this view, China is unlikely to be able to clean its air any time soon.

How serious a problem is offshoring pollution? It’s not trivial. In 2007, Joseph Aldy of Harvard’s Kennedy School published research showing clear evidence of this pollution-export effect within the US. Richer states seemed to be emitting less carbon dioxide per person as their economies grew. Alas, Aldy concluded that the effect could be explained entirely by the rich having bought their electricity from poorer states rather than generating it at home. A more recent study (Peters, Minx, Weber and Edenhofer 2011) estimated that by 2008, developed countries were net importers from developing countries of goods whose production represented about 1.6 billion tonnes of carbon dioxide emissions, roughly 5 per cent of the global emissions total. No prizes for guessing that much of this energy-intensive manufacturing is taking place in China, alongside the production of steel, cement and coal-fired electricity for domestic use.

The dreadful air quality in Beijing, then, is no mystery. First, China is not yet rich, so it may be on the wrong side of the environmental Kuznets curve anyway, the side where pollution has not yet begun to fall. Second, China is not a democracy, which partially dampens the power of its citizens to demand cleaner air. Third, China is a major exporter of manufactured goods.

But as I stood ready to pen my correction, I realised something: I didn’t actually have a time series for air pollution in urban China. I could see that things were bad but not what the trend was.

“The challenge with particulates is that we keep changing what we want to measure and regulate,” Aldy told me. Researchers now track PM2.5, very fine particles thought to be particularly hazardous to health; but, in 1985, when my original data series began, nobody was collecting PM2.5 data.

So how much worse have things got in China? I called Jostein Nygard of the World Bank, who has been working on Chinese air pollution issues for more than two decades, and I was surprised at his response: in many ways, China’s urban air quality has improved.

Sulphur dioxide is down and coarser particulate matter is also down since good records began in 2000 — a fact that is explained by Chinese efforts to install sulphur scrubbers and to move large pollution sources away from the cities. “You could see the air quality improving through the 1980s and 1990s and to the 2000s,” says Nygard. PM2.5 is very bad, he says — but not necessarily worse than 10 years ago, and serious efforts are now under way to track it and reduce it.

To my surprise, not quite a correction at all, then. But if local air pollution in China is actually on an improving track, how come we see so many stories about pollution in China? One reason, of course, is that the situation remains serious. Another is that the Chinese government itself seems to be using smog alerts as a way to send a message to local power brokers that clean air is a priority. But there is also the question of what counts as news: sudden outbreaks of smog are newsworthy. Slow, steady progress is not.

Written for and first published at ft.com.
