The Myth of Linear Science

Matt Ridley, author of "The Rational Optimist: How Prosperity Evolves", has caused a small stir with a Wall Street Journal article arguing, among other things, that scientific research does not drive technological innovation and should therefore receive less public funding. Not surprisingly, basic scientists are unimpressed. Here is a rebuttal by theoretical physicist and blogger Sabine Hossenfelder.

There is some editorial click-baiting at work here. Ridley's article bears the headline "The Myth of Basic Science", but his main point is about innovation. He argues it is a spontaneous and largely autonomous process in which lots of nameless people drive technology forward every day by solving real-world problems thrown up by the existing technological world. In this view, lone geniuses are not very important, but neither is there a predictable linear process by which scientific results get turned into new technologies.

But Ridley's dig at fundamental science is real. I think his argument is that:

  1. Technology moves forward spontaneously by real-world tinkering.

  2. In the process, it throws up scientific insights: e.g. 18th- and 19th-century physicists had to learn from, but did not teach, steam engineers about thermodynamics.

  3. When it needs to, industry will fund whatever fundamental research is actually useful.

  4. By contrast, governments will malinvest research funds by trying to fuel a misconceived linear process of innovation.

  5. Therefore, if public funding for science falls, we will get more technological growth and just as much science as we need.

(Ridley does not address the value of scientific knowledge for its own sake, except perhaps implicitly via point #2.)

Points #1, #2 and #4 are mostly right. But real-world tinkering is more dependent on fundamental science than Ridley imagines. As a result, #3 is mostly wrong. Thus #5 doesn't follow logically.

I am an engineer. Engineering is indeed all about tinkering with real-world systems. But to do this well, I must call upon a whole world of knowledge: an unpickable tangle of facts, some obviously pragmatic and some abstruse. And even the abstruse stuff is needed to solve problems.

And only a tiny bit of that knowledge is in my head; most of it is encoded in my tools. If I write a function prototype in C++, the programming language takes care of many concerns, from nitty-gritty bookkeeping to the complexities of abstract logic and type theory. Such theory was kicked off by people like Bertrand Russell when they wanted to understand the Nature of Ultimate Truth.
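
To make that concrete, here is a minimal sketch (my own illustration, not anything from Ridley or Hossenfelder) of how much theory a single C++ prototype quietly carries. The function name and constraint are hypothetical; the point is that the compiler enforces rules descended from formal type theory:

```cpp
// A hypothetical templated prototype. The "requires" clause is a
// machine-checked logical predicate on types: formal-logic machinery,
// compressed into one line of everyday engineering.
#include <concepts>
#include <iostream>

template <typename T>
    requires std::integral<T> || std::floating_point<T>
T sum3(T a, T b, T c) {
    return a + b + c;
}

int main() {
    std::cout << sum3(1, 2, 3) << '\n';       // fine: int satisfies the constraint
    std::cout << sum3(1.5, 2.5, 3.0) << '\n'; // fine: so does double
    // sum3("a", "b", "c");  // rejected at compile time: the type system says no
    return 0;
}
```

I never have to think about why the disjunction of two type predicates is itself a valid constraint; Russell's intellectual descendants did, and their answer is baked into the compiler.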

The same goes for intellectual tools. Take voltage. Every day, technicians use that concept to solve problems in ways that often need little theoretical understanding. But the concept of voltage packages up lots of subtle considerations about the electromagnetic field. We didn't even have the concept until 19th-century pure scientists worked it out.
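
For a sense of what the single word compresses (my gloss, not part of the original argument): voltage between two points is a line integral of the electric field, and treating it as one path-independent number, as every technician silently does, is only legitimate when Faraday's law lets the loop integral vanish:

```latex
V_{ab} = -\int_{a}^{b} \mathbf{E}\cdot d\boldsymbol{\ell},
\qquad
\oint \mathbf{E}\cdot d\boldsymbol{\ell} = -\frac{d\Phi_B}{dt} \approx 0
\quad \text{(so the integral is path-independent).}
```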

None of this means that electrical & computer engineering are just the result of science getting applied via Ridley's dreaded linear process. But it does mean that basic science is a crucial part of our toolkit for building technology.

Perhaps Ridley would respond here that I am proving his point for him, by showing that industry has a strong motive to do basic science. But that only works for applied science. A company that makes metal-bending tools might spend lots of money to learn unbelievably abstruse things about the physics of plastic deformation and its metallurgical side effects. By contrast, no one who made candles or other flame-age lighting would have even imagined trying to discover electricity just on the off-chance it could power a light-bulb.

Such a research programme would be just the kind of goal-directed "linear" process that Ridley rightly derides, and it is even less likely to work in the private sector than in the public. And the irony is that ivory-tower science is exactly the kind of spontaneous, undirected alternative that he prefers.

True, in the short term, science is influenced by academic curiosity and the plans of funding agencies. But in the bigger picture, it is shamelessly opportunistic, advancing in whichever direction progress is easiest. Academic fads go wherever nature hints that it might reveal some secret, and if that proves more than a hint, a new field is born. The result is that knowledge grows where there is knowledge to be found. This system grabs much more knowledge than could be gained by butting heads against walls in pursuit of some pre-conceived notion of what will be useful.[1]

Now, all I've shown here is that, contra Ridley, technological growth requires spending on basic, curiosity-driven, industrially unprofitable science. I've shown nothing about how much governments should contribute. Maybe a better job can be done by old-style philanthropy, new-fangled crowdsourcing or some other mechanism we haven't even imagined yet. But all that is wild speculation. Anyone with the healthy respect Ridley has for evolution should be wary of throwing out a tried and tested system in favour of some radical theory.


  [1] This amounts to a utilitarian argument for doing science for its own sake. I didn't think such an argument existed until I wrote the above paragraph. It saves me from having to write a hedged justification in terms of democratic legitimacy. Hossenfelder sort of makes a start on such an argument in her post.
