Complexity is the mother of invention
A brief history of innovation: In the beginning, there were lone inventors who changed the world. John Harrison was one of the most prominent – the clock-maker became famous and (eventually) rich in the 18th century by building a clock so accurate and so resilient in the face of changing temperatures and constant rocking that it could be taken on board a ship and used to calculate the ship’s longitude. In doing so, Harrison pitted himself against the might of the Royal Observatory, which had been established in 1675 by King Charles II in order to solve the longitude problem with an astronomical method. The loner got there first.
As science and technology progressed, innovation became more and more industrialised. Thomas Edison set up perhaps the world’s first industrial research laboratory at Menlo Park, New Jersey, in 1876. Edison set the tone for the 20th century, with expensive research projects carried out on a colossal scale. Among the most famous were government efforts such as the Manhattan Project, to create the first atomic bomb, and the Apollo moon landings.
And then, towards the end of the last century, the tide seemed to turn in favour of the innovation minnows once again. Companies such as Microsoft and Google were set up in spare rooms and garages. Large companies seemed to be abandoning in-house research and buying start-ups. Powerful computers became cheap enough for most pockets.
The culmination of this process is the likes of Facebook, whipped up in a few days by a Harvard student. Soon after Facebook’s launch, Mark Zuckerberg said: “I think it’s kind of silly that it would take the university a couple of years to get around to it. I can do it better than they can, and I can do it in a week.”
But are Facebook and Google symbolic of a new trend towards micro-innovation by individuals or small teams? Or are they exceptions to the inexorable march of technology towards ever larger and more expensive research efforts, requiring multi-billion-dollar tools such as the Large Hadron Collider?
An economist at the Kellogg School of Management, Benjamin F. Jones, has been trying to look beyond the eye-catching denizens of Silicon Valley to test this question with some meaningful numbers, based on patent citations. Jones is worried about what he calls “the burden of knowledge”. Facebook may have been easy for a young, talented creator to produce, but Jones fears the general trend is in the other direction. If he is right, scientists will have to master an ever greater body of knowledge before they can make a contribution – or specialise earlier and join teams of other specialists. Technological progress will become ever harder.
The evidence suggests that Jones is right to be concerned. The trend away from the lone inventor has continued, with the size of teams listed on patents increasing steadily since Jones’s records began in 1975. The age at which inventors first produce a patent has also been rising, and specialisation seems sharper; lone inventors have become less likely to produce multiple patents in different technical fields. “Deeper” fields of knowledge, whose patents cite many other patents, attract larger teams. Compare a modern patent with one from the 1970s and you’ll find a larger team filled with older and more specialised researchers.
All this suggests that innovation is, broadly, a more complex and expensive process than it used to be. Isaac Newton once told his rival Robert Hooke: “If I have seen further it is by standing on the shoulders of giants.” The climb up the giants’ backs appears to be becoming more and more arduous.
Also published at ft.com.