In 2010, the British science writer Matt Ridley debuted as a classical liberal with his book The Rational Optimist. Ridley’s “coming out” was eventful and exciting for libertarians all over the world. A former staff writer and head of the Washington bureau of The Economist, a successful science author and, more importantly, a gifted narrator, Ridley condensed much research and wisdom into his thick book. The financial crisis appeared to many to have dispensed with free market ideas once and for all. Ridley pointed out that, to the contrary, free markets were actually producing prosperity, food, and cleanliness all over the world—particularly for the world’s poor.
The Rational Optimist was chiefly concerned with data and facts, of the kind often gleaned from “standard” history. Ridley’s remarkable new book, The Evolution of Everything: How New Ideas Emerge, is more focused on human interpretations of the world we live in—that is to say, on ideas. Ridley draws on an impressive and heterogeneous literature, from Richard Dawkins’ The Selfish Gene (1976) to George Selgin’s Good Money: Birmingham Button Makers, the Royal Mint, and the Beginnings of Modern Coinage (2008), to show that evolution doesn’t work only in biology.
In arguing that evolution can explain the universe and morality, life and culture, personality and the Internet, Ridley is not trying to make a passe-partout word out of it, as had been done with the term “dialectic” a few decades ago, but to outline two fundamentally different ways of looking at things. “The history of Western thought,” he writes, “is dominated by skyhooks, by devices for explaining the world as the outcome of design and planning.” Look about you in search of intentionality and design; that’s a possibility. But there is another way of seeing things: to accept that the world goes on largely on its own, that “great men” are more instrumental than necessary, and that all changes tend to be “incremental.”
F.A. Hayek distinguished between a “French” and a “British” individualism. Hayek wrote about thinkers, but the geographical references to France and England had more to do with a vignette of “national character” than with polar stars in the history of thought. (For consider that Montesquieu and Tocqueville happened to be “British” individualists, and Bentham and Mill “French” ones.) The gist of Hayek’s argument was that you can either see social reality as something a single human mind can design and rebuild, or as a product of the interactions of so many individuals that no one of them can fully master that reality.
Hayek’s “French individualism” is not merely a political theory. Faith in interventionism and faith in mysticism answer the same elemental human need: the search for a “visible hand,” to trust or to blame. As Ridley puts it:
The reluctance to accept coincidence lies at the heart of telepathy, spiritualism, ghosts and other manifestations of the supernatural. The mystical mentality insists that something caused a coincidence; something made things go bump in the night.
The reluctance to accept coincidence lies also at the heart of a faith in government action. In fact Ridley compares social planning to “creationism.” In Human Action (1949), Ludwig von Mises remarked that “The historical role of the theory of the division of labor as elaborated by British political economy from Hume to Ricardo consisted in the complete demolition of all metaphysical doctrines concerning the origin and the operation of social cooperation.” Ridley wholeheartedly agrees.
In a way, this book can be read as a conversation between Ridley and his scientist friends. They fall prey to an inherent contradiction. Understanding evolution in nature, they have deep admiration for the beauty of self-organized ecosystems. But concerning the human things, they often take a top-down view, under which the human self-organizing ecosystems call for corrections from the bright and the bold.
Can this be done without paying a price in slower economic growth and thwarted innovation? Ridley is skeptical. In the same spirit, he reveres Charles Darwin but vigorously denounces eugenics as an attempt to forge a new man. Evolutionary thought does not entail forging anything, but rather pausing to appreciate the achievements that have come about even though they aren’t under our control.
This is true for Darwin’s finches as well as for iPads and wristwatches. Writes Ridley:
Economic development is more than just a growth of income—it is the appearance of a whole system of collaborative engagement among people to drive innovation that cuts the time it takes people to fill needs.
Innovation is a product “of human action and not of human design,” and efforts to explain it by other reasons than increased division of labor are unsuccessful.
Building on the work of Alison Wolf, Ridley explains that “the countries with the most education simply do not show greater productivity growth than the ones with less.” And building on the work of Terence Kealey, he shows that technology isn’t the daughter of science. Perhaps the opposite is true: “Technology comes from technology far more often than from science. And science comes from technology too.” The steam engine “owed almost nothing to the science of thermodynamics, but the science of thermodynamics owed almost everything to the steam engine.”
In saying so, Ridley is once more at odds with his scientist friends. The latter, taking the top-down view, tend to emphasize their own role and imagine a linear transition from the world of ideas to the world of things: first comes the discovery, then the prototype. Ridley instead suggests that discovery is a less tidy process. It may happen by the hand of tinkerers, by trial and error. Yes, innovators are key, but sometimes innovations are the low-hanging fruit of a long series of incremental steps in a certain direction.
Thus social engineers are to him like creationists, basically unable to marvel at the accomplishments of self-propelled spontaneous orders.
When he turns from economics to the evolution of culture, personality, even the mind, the libertarian himself seems to fall into contradiction. In this sphere Ridley inclines toward a deterministic approach. The self, he writes, “is a construct, a story told after the fact to bring unity to what is actually a diverse experience.” We humans are “nothing but the neural signals of our brain.” Advances in the neurosciences will undermine the notion of deliberate behavior in many respects.
Will this fact weaken the case for advocating individual liberty? On the one hand, realizing that we are “nothing but the neural signals of our brain” does not necessarily rule out free, purposeful actions. The fact that a car needs gasoline to move doesn’t mean that the car doesn’t move. On the other hand,
the history of the Western world shows that, as we have gradually embraced bottom-up explanations, we have stopped blaming people for things that were not their fault. We once blamed ill people for the wickedness that led them into illness; or accident victims for the sins that they were being divinely punished for. As late as the 1960s . . . we blamed and punished homosexuals for their inclinations, refusing to believe that they were a product of their internal influences—genetic or developmental.
Ridley tries to draw the line in the right place when he says that the drive to understand genetic or developmental inclinations may aid and abet “people who make spurious claims to diminished responsibility to escape harsh sentences.” But then he loses the thread, concluding that society should not hang onto superseded visions of personal responsibility, but should instead regard “criminal punishment” as something we inflict for society’s security rather than for moral blameworthiness. Ridley’s attempt to reconnect the social sciences with biology and evolutionary psychology is what makes The Evolution of Everything a most challenging book—and yet, in some regards, The Evolution of Everything may be trying to prove too much.
The review of it in the Guardian by the British philosopher John Gray acerbically makes this point. Gray and Ridley have crossed swords before, on The Rational Optimist. This time out, Gray argues in a vein that many will find persuasive: that this wonderful evolutionary tale doesn’t take into account that while the 20th century was indeed about scientific progress and technological development, it also saw genocide, war, and death on a larger scale than ever. Warfare got “industrialized,” too.
Not by chance, Gray compares Ridley to Herbert Spencer. As a young man, Spencer thought he could be rationally optimistic: the Corn Laws were abolished, free trade was proclaimed, industry was advancing fast. “Industrial” societies seemed to him to be taking the place of “militant” societies. But saying that Spencer not only desired but expected the arrival of limited government trivializes his thought. Spencer desired, and expected, a continuous decrease of coercion in human societies. That didn’t happen, Gray reminds us, and when Spencer realized it, he spent his last years “in baffled gloom.” That was well before Auschwitz and Hiroshima.
Shouldn’t a thoughtful libertarian be rationally pessimistic, then?
Ridley can certainly defend himself on this score, but it seems to me that there is a misunderstanding here. He is far from pretending that evolution always overcomes the—indeed powerful—forces that oppose it. Allegiance to the “great man” theory of history, or to social engineering of various sorts, has consequences. Ideas have consequences indeed. People can’t see “the evolution of everything” at work. They search for skyhooks everywhere. Ridley doesn’t deny that, but points to other, evolving tendencies in human societies. This book does not claim that history ended. It is built on the premise that many would like to stop or slow evolutionary, spontaneous processes. If this weren’t the case, after all, why bother to write a book?
Libertarians can adopt either a pessimistic or an optimistic attitude toward modern times. The pessimistic narrative is grounded in the ever-growing involvement of government in human affairs. The nation-states of the 19th century may have regulated professions and distributed patronage, but they seldom commanded more than 5 percent of the gross national product. They indebted themselves to wage wars, but balanced the budget every 10 years or so. The most interventionist of politicians back then held a more modest vision of government than do even self-styled conservatives today. After two world wars and decades of Keynesianism, spending skyrocketed and huge public debt became the rule rather than the exception. Liberty, in the concrete sense of people deciding what to do with their own money, seems gone—or at least this is what five minutes with your tax adviser may show you.
The optimistic narrative points to economic growth, for over-regulated capitalism is capitalism nonetheless. In spite of many obstacles getting in the way, the profit motive has been raising people out of destitution as never before in human history. Higher taxes in the West seem bearable (at least to the extent of not provoking social uprisings) because of increased average incomes. On a global scale, living standards went up even as the planet’s population did. If anything deserves the name “progress,” it is feeding more people, and feeding them better, than ever before.
Ridley unabashedly champions the optimistic narrative. He knows as well as anyone else that government has gotten bigger and bigger, but he thinks the benefits of worldwide spontaneous evolutionary cooperation have much outweighed the costs of top-down, mostly failing, governmental action. The body is stronger than its parasite. His optimism is based upon an astonishing fact: If purposeful central planning tends to produce misery, producing wealth does not require much more than allowing people to live their lives as they wish. We shall be optimists not because we have a “vision” of the future to implement—but because we don’t really need one.