Turn up the lights, and the workers work harder. Turn them down again, and they work harder still. The “Hawthorne Effect” is named after Western Electric’s titanic Hawthorne Works in Cicero, near Chicago, where a series of productivity trials was carried out between 1924 and 1932. Led by Elton Mayo, a professor at Harvard Business School, they are among the most famous experiments in social science. Not every social scientist is impressed.
Richard Nisbett, a social psychologist at the University of Michigan, complained to The New York Times a decade ago about the study’s fame, calling it a “glorified anecdote”. He had a point. Among managers, the study is generally held to demonstrate that people respond to change: whatever you do, output rises for a while, as long as you do something. Inside academia, “the Hawthorne Effect” refers to the idea that people work hard once you start experimenting on them. Both beliefs are surprising enough to be interesting, while nicely confirming the prejudices of those who hold them.
Psychologists who have looked closely at the study have long known that all was not well with it. The experimental room was smaller and quieter than the factory floor. Two workers were sacked and replaced during the study for talking and idling. Experimental conditions were changed on Sundays, so each surge in productivity coincided with a Monday. These were not controlled conditions in the modern sense.
Yet the story survives. As Nisbett complained, “Once you’ve got the anecdote, you can throw away the data.” And for decades that is exactly what seemed to have happened: the data from the most famous Hawthorne experiment – the illumination study – were thought to have been lost, and perhaps deliberately destroyed.
Now two Chicago-based economists, Steven Levitt (best known as the co-author of Freakonomics) and John List, have unearthed the original data in libraries in Boston and Milwaukee, following clues buried in an appendix to an old article in the American Sociological Review.
Levitt and List are fond of experimental studies, but think the effect of being scrutinised sometimes contaminates such experiments. “We believe that there is a Hawthorne Effect,” says List, referring to the idea that people behave differently when studied, “but there is little evidence of it in the actual Hawthorne data.” As for the idea that turning the lights up and down makes a big difference, Levitt and List conclude that “existing descriptions of supposedly remarkable data patterns prove to be entirely fictional.”
It is not the only time that an experiment’s reputation has far outrun what was actually discovered. In 1967, the psychologist Stanley Milgram asked 160 people in Nebraska to get a letter to a stockbroker in Boston, passing it only to someone with whom they were on first-name terms. The popular account says that the letters arrived after six steps – and that we are all just six handshakes away from anyone on the planet. The reality, as the psychologist Judith Kleinfeld found, was that more than 80 per cent of the letters never arrived, and follow-up experiments concur. A recent BBC documentary “recreated” Milgram’s experiment, showing a Boston-based scientist receiving parcels from all over the world via only six connections – but 37 of the 40 parcels never arrived at all.
In some ways, the Hawthorne and “six degrees” experiments are the least troubling examples: they are famous enough to have been challenged. Less celebrated “findings” circulate in academia, exerting plenty of influence without ever being put to the test, because trying to replicate old results is rarely regarded as social science worth publishing in the top journals. That must change.
Also published at ft.com.