‘In the recent UK election campaign, a diet of numbers was stuffed into voters like feed into French ducks’
The American Economic Review isn’t usually the place for trash talk, but a brief new article by Paul Romer is the closest academic economics is likely to come to a spat between boxers at a pre-fight weigh-in. Romer, a professor at New York University, is worried that a new trend in economics — “mathiness” — is polluting the discipline. And he names names — including Robert Lucas and Edward Prescott, both Nobel laureates, and inequality guru Thomas Piketty.
In a follow-up comment, “Protecting the Norms of Science in Economics”, Romer says: “I point to specific papers that deserve careful scrutiny because I think they provide objective, verifiable evidence that the authors are not committed to the norms of science.”
Romer adds that if his suspicions are confirmed, such people should be ostracised — suggesting that Nobel Prize winners should be ejected from academic discussion because of their intellectual bad faith. This is strong stuff.
Romer, though, has rarely stuck to the academic script. In the late 1980s he developed a new approach to thinking about economic growth that mathematically modelled the development and spread of ideas, an achievement that many regard as worthy of the Nobel memorial prize in economics. But Romer then drifted away from academia, first founding an online learning platform called Aplia, and then campaigning for a radical development idea, “charter cities”.
Does economics have a mathiness problem? Many casual observers would say, “of course”. Economics has a reputation for producing rigorous nonsense.
But Romer’s attack is much more focused. He doesn’t mean that economics uses too much mathematics but that some economic theorists are pushing an ideological agenda and using fancy mathematics to disguise their intentions. They can redefine familiar words to mean unfamiliar things. They can make unrealistic assumptions. They can take hypothetical conclusions and suggest they have practical significance. And they can do all these things with little fear of detection, behind a smokescreen of equations. If Romer is right, some economics papers are Orwellian Newspeak dressed up as calculus.
In his short essay “Politics and the English Language”, Orwell argued that there was a “special connection between politics and the debasement of language”. While some people wish to communicate clearly, the political writer prefers a rhetorical fog. And the fog can spread. Writers who should know better imitate sloppy writing habits. Readers become jaded and stop hoping that anyone will tell them the truth.
Romer fears a similar rot at the heart of economics. As some academics hide nonsense amid the maths, others will conclude that there is little reward in taking any of the mathematics seriously. It is hard work, after all, to understand a formal economic model. If the model turns out to be more of a party trick than a good-faith effort to clarify thought, then why bother?
Romer focuses his criticism on a small corner of academic economics, and professional economists differ over whether his targets truly deserve such scorn. Regardless, I am convinced that the malaise Romer and Orwell describe is infecting the way we use statistics in politics and public life.
With more statistics in circulation than ever, it has never been easier to make a statistical claim in service of a political argument.
In the recent election campaign in the UK, a diet of numbers was stuffed into voters like feed into French ducks. A fact-checking industry sprang up to scrutinise these numbers — I was part of it — but the truth is that most of the numbers were not false, merely unhelpful. Instead of simply verifying or debunking the latest number, fact checkers found themselves spending much effort attempting to purify muddied waters.
This is infuriating — for the public, for the fact checkers, and for the scientists and statisticians who take such pains to gather evidence. Imagine their dismay when the politicians seize that evidence and hold it up for protection like a human shield. Good statistics matter; without them it is almost impossible to understand the modern world. Yet when statistics are dragged into political arguments, it is not the reputation of politics that suffers but the reputation of statistics. The endgame isn’t pretty: it becomes too much trouble to check statistical claims, and so they are by default assumed to be empty, misleading or false.
Just as the antidote to Newspeak isn’t to stop using language, the antidote to mathiness isn’t to stop using mathematics. It is to use better maths. Orwell wanted language to be short, simple, active and direct. Romer wants economists to use maths with “clarity, precision and rigour”. Statistical claims should be robust, match everyday language as much as possible, and be transparent about methods.
Some critics believe that economics should conduct itself in plain English at all times. This is, I think, unreasonable. Mathematics offers precision that English cannot. But it also offers a cloak for the muddle-headed and the unscrupulous. There is a profound difference between good maths and bad maths, between careful statistics and junk statistics. Alas, on the surface, the good and the bad can look very much the same.
Written for and first published at ft.com.