
Thinking Hard

Anthony Gottlieb is executive editor of The Economist and the author of "The Dream of Reason: A History of Western Philosophy From the Greeks to the Renaissance."

Stephen Toulmin has been thinking and writing about rationality for half a century. First trained as a scientist, he returned from World War II to study philosophy under Wittgenstein in Cambridge. His doctoral thesis (published in 1950 as “An Examination of the Place of Reason in Ethics”) looked at the similarities and differences between rational argument in science and in morals. His next book sought to establish that the standards of reasoning used in everyday life were not those suited to science. And his “The Uses of Argument,” published in 1958, studied informal patterns of reasoning rather than the rigorous formal deductions that the philosophers of the day preferred. Academics in his own subject largely ignored it, but the book was influential in departments of rhetoric and communications. Much of his work since then has been in the history of ideas--including the popular and fascinating “Wittgenstein’s Vienna,” co-written with Allan Janik in 1973--and the concept of rationality is never far from the center of Toulmin’s concerns. In one sense, then, “Return to Reason” is a misnomer. He has never left it.

Yet there is now a “loss of confidence”--he does not say “crisis”--in our traditional ideas about rationality, according to Toulmin. Especially among those in the humanities, he argues, the claims of rationality have been progressively challenged over the last 20 or 30 years, to the point of being sidelined. This is a common complaint and not exactly news, but Toulmin does not merely bemoan and rant, as many others have done. He offers a diagnosis and a solution. Rationality has come under threat, he believes, because of the undue influence of classical mechanics and abstract mathematical methods on our idea of what intelligent problem-solving should be. Deduction in the style of Euclid’s geometry, mechanically predictable and rigorous law in the style of Galileo and Newton, indubitable certainty in the style of Descartes’ “I think, therefore I am”--all exert a malign influence, insofar as they overshadow a looser, more pragmatic and less abstract concept of “reasonableness.” What we need is more open-minded, informal reasonableness and less inappropriately mathematical rationality. Only then, Toulmin argues, can the idea of reason regain its rightful good name.

The rot set in, according to this way of thinking, when Newton’s dynamics came to be regarded as the ideal model to which all disciplines and forms of rational thinking should aspire. It was the apparent predictability of the physical universe, especially as explained by the Marquis de Laplace at the start of the 19th century, which really did the damage. According to Laplace, the universe was strictly deterministic: a superhuman intelligence that knew the position of every particle in the universe and every force acting on it would be able to predict the complete state of the universe at all times in the future. This promise of predictability made every scientist (including social scientists) want to become the Newton of his subject. But, as Toulmin points out, this Laplacean ideal is the physics that never was. Newton’s “Principia” did not establish determinism at all, and so those who were seeking to emulate him were largely barking up the wrong apple tree. It is unfortunate that Toulmin does not pursue this important point in more detail, as it is widely believed that classical physics is deterministic in the Laplacean sense, which it isn’t.
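Laplace’s claim can be stated precisely in the standard notation of classical mechanics. The display below is a textbook-style sketch supplied for illustration (it does not appear in Toulmin’s book): determinism is the assertion that the initial positions and velocities, together with the force law, fix every particle’s trajectory thereafter.

\[
m_i\,\ddot{\mathbf{x}}_i(t) = \mathbf{F}_i\bigl(\mathbf{x}_1(t),\dots,\mathbf{x}_N(t)\bigr), \qquad i = 1,\dots,N,
\]

so that, given \(\mathbf{x}_i(0)\) and \(\dot{\mathbf{x}}_i(0)\) for all particles, the state at every later time is supposed to follow uniquely. The catch, and one reason the Laplacean reading of Newton overstates the case, is that uniqueness of solutions holds only for suitably well-behaved force laws; it is not guaranteed by the equations themselves.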


The first discipline attacked by Toulmin for ill-advisedly modeling itself on mathematical physics is classical economics. This ought to be a fairly easy target, as the old joke about the two economists and a $50 bill does often seem all too true (“Isn’t that a $50 bill lying in the street?” “No, it can’t be. If it were, somebody would have picked it up.”). Toulmin’s case against the subject, and several others, is that there is too much focus on what ought in theory to work and not enough attention paid to what does in fact work: an emphasis on doing the sums right, rather than on doing the right sums. But instead of convincing case studies of the right and wrong ways to go about economics, we get merely the sketchiest details of two examples. He calls them “vignettes,” but they are really just anecdotes, lacking the depth of analysis to convince us that he has spotted a systemic problem rather than a couple of cases of sloppy practice.

It is, unfortunately, a repeated fault of the book that it fails to connect with its targets in sufficient detail, preferring to range widely over subjects and centuries, to meander and even sometimes to swerve across the line into pure waffle. This is ironic, as the main message of the book--surely a laudable one--is that abstract principles are no substitute for concrete, practical-minded engagement with the subject under investigation. Although it would be hard to disagree with most of what he has to say, the lack of detail about his opponents’ views makes one wonder whether some straw men are being dispatched here. Much of his critique of medicine, for example, apparently boils down to an injunction to treat the person rather than the disease, an idea that is surely now widely accepted, if imperfectly practiced.

In his final chapters, Toulmin perks up as he ruminates on the good question of where his old teacher, Wittgenstein, would stand in all this. Like a few other commentators, Toulmin now sees parallels between Wittgenstein and the skeptics of ancient Greece, who claimed to suspend judgment on the truth of all theories and to prefer practical remedies for practical problems. It is a striking testament to Wittgenstein’s charisma to see a thinker who has sat at the master’s feet still struggling to interpret and come to terms with what he heard in a bare room in Trinity College, Cambridge, half a century ago.
