I intend to break a record with this column, and publish what must surely be the FT’s longest ever correction. It’s not that I enjoy self-flagellation in public. But the manner of my error offers a lesson about economics as well as journalism.
First, the error. In the “Dear Economist” column, I recently described the research of Harold Hotelling. But I attributed to Hotelling an anecdote that is widely repeated but was probably never used by Hotelling himself.
Ron Johnston, a geography professor at the University of Bristol, was kind enough to point out my mistake. “You are, I fear, by no means the first person to incorrectly cite Hotelling’s classic 1929 paper as using the example of two or more ice-cream sellers locating on a beach. [It] did not: his much more prosaic example was of two ‘places of business’ serving ‘buyers … supposed uniformly distributed along a line of length 1, which may be Main Street in a town or a transcontinental railroad’.”
So why did I slip up? I know Hotelling’s model fairly well, but I wrote my answer while travelling and unable to access the Economic Journal of 1929. Wondering whether the ice-cream anecdote really was Hotelling’s, I checked accounts of his model on the internet and found the anecdote attributed to him again and again. (As Johnston correctly pointed out, I am by no means the first to make the mistake.) The stakes in this case being rather low, I drafted my column and then forgot about my doubts.
I was a victim of what economists call “informational herding”. Imagine a series of perfectly rational but somewhat lazy economics professors posting accounts of the Hotelling model on the internet. Each professor is unsure whether the ice-cream story is true, so each makes his best guess rather than taking the time to make that trip to the library.
If the first two professors are wrong, and include the ice-cream anecdote in their course notes, they set a precedent that it is irrational to ignore. The third professor may recall that Hotelling did not mention ice cream, but rationally doubts his memory when he sees what his colleagues have written. Since he is unsure and reckons that two heads are better than one – and also prefers not to trek to the library – he also includes the anecdote.
Despite the errors and the laziness, everybody involved has behaved rationally. The third professor is correct to believe that his two colleagues are more likely to be right together than he is to be right alone. But the curious thing is that any number of professors may now follow suit, suspecting that the anecdote is bogus but deferring to the collective wisdom of their predecessors.
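For readers who like to see the mechanism laid bare, the herding logic above can be sketched as a toy simulation. This is an illustration of the standard information-cascade story, not anything from Hotelling’s paper; the function name, the signal accuracy `p`, and the tie-breaking rule are my own assumptions. Each professor gets a private signal that is right with probability `p`, follows his own signal while the evidence is balanced, but rationally falls in line once the inferred lead reaches two:

```python
import random

def cascade(true_state, p, n, seed=0):
    """Toy information cascade.

    Each of n agents receives a private binary signal that matches
    true_state with probability p (> 0.5 for informative signals).
    Agents act in sequence. While the lead of inferred signals is
    less than two, an agent follows his own signal (which his choice
    then reveals to later agents). Once the lead reaches two, every
    later agent rationally ignores his own signal and follows the
    herd -- and his choice reveals nothing new.
    """
    rng = random.Random(seed)
    choices = []
    lead = 0  # inferred signals for option 1 minus those for option 0
    for _ in range(n):
        signal = true_state if rng.random() < p else 1 - true_state
        if lead >= 2:
            choice = 1          # cascade on 1: own signal ignored
        elif lead <= -2:
            choice = 0          # cascade on 0: own signal ignored
        else:
            choice = signal     # choice is informative here
            lead += 1 if signal == 1 else -1
        choices.append(choice)
    return choices
```

Run it a few hundred times with, say, `p = 0.6` and you see the column’s point: in a sizeable minority of runs the first two signals happen to be wrong, and every subsequent agent, behaving perfectly rationally, locks onto the wrong answer.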
The same reasoning could apply to people responding to a fire alarm in a large office. Seeing that nobody else is moving, each person is likely to suppress their doubts and stay put. Once the exodus does begin, everyone may pour towards the same exit.
Or consider the success of books such as the Harry Potter series. Millions of people buy such books because millions of other people have bought them. The assumption is that all these other readers can’t be wrong; but the theory of rational herds suggests that only the first few of those readers made an informed decision. Everybody else was relying on the early adopters.
Yet rational herds can quickly change their minds. Everybody knows that the early movers effectively decided the behaviour of the entire herd, because nobody who came later felt confident enough to depart from their decision. That means that if a single commentator did his homework and felt confident enough to dissent, all future professors, book-buyers or office workers would feel similarly confident in following that lead. Just one FT columnist could correct a long-standing myth – if he did his homework properly. Sorry.

First published at ft.com.