Evolutionary Idea: What If Artificial Intelligence Isn’t?

The broad thesis of George B. Dyson’s fascinating new book, “Darwin Among the Machines,” is that the networked computers that now surround us by the billions constitute, collectively, a form of intelligent life--one that’s evolving in ways we may never understand.

It’s a provocative idea, certainly--it not only challenges much conventional thinking about artificial intelligence but ultimately has implications for everything from the nature of God to the development of environmental policy.

Yet Dyson’s real achievement lies less in the conception of the idea itself than in the route he takes to get there. The book is an original, creative work of intellectual history--one that reinterprets some great thinkers, rediscovers many obscure ones and exposes the centuries-old foundations of seemingly modern innovations. And it sprang from a personal history intricately entwined with many of the central ideas.

Dyson was blessed, and cursed, with extraordinary family heritage. His father, Freeman Dyson, is a renowned theoretical physicist. His mother, Verena Huber-Dyson, is an accomplished logician. His sister, Esther Dyson, is a well-known computer industry pundit and thinker who was among the first to recognize the cosmic implications of the PC revolution.

But George Dyson, raised amid the intellectual ferment of the Institute for Advanced Study in Princeton, spent most of his adult life living in the forests of British Columbia, even in a treehouse for a time. He never finished high school, and eventually carved out a métier for himself as a kayak designer.

He has, though, come to write a book that draws on the thinking of his father, mother and sister, and even leads him to reconstruct seminal events that he all but witnessed as a little boy. Although he injects this personal dimension judiciously, his familial connection to the ideas is never far in the background.

Refreshingly, at a time when the high-tech cognoscenti are obsessively gazing into the future, Dyson’s premise is that it’s more fruitful to examine the past. “If you’re going hiking and you don’t want to get lost, you have to look behind you,” he says.

And so, the book opens with a discussion of Thomas Hobbes’ “Leviathan.” Although the great English theorist is well-known to generations of poli-sci freshmen, Dyson focuses on a different dimension of his work. The Leviathan--the collective human entity, as organized in a society or a polity--is itself an intelligent being, not simply a collection of intelligent beings.

Dyson follows this idea through the development of the theory of evolution, notably in the thinking of Erasmus Darwin, Charles Darwin’s grandfather, and Samuel Butler, a contemporary of Charles Darwin and one of his great antagonists. Butler pushed the idea of a species-level intelligence whose development must be taken into account in any complete theory of evolution.

In the essay that gave Dyson’s book its name, Butler asserted that such intelligence wasn’t limited to people: “We find ourselves awe-struck at the vast development of the mechanical world, at the gigantic strides with which it has advanced in comparison with the slow progress of the animal and vegetable kingdom. We shall find it impossible to refrain from asking ourselves what the end of this mighty movement might be. . . . The machines are gaining ground upon us. Day by day we are becoming more subservient to them . . . more men are daily devoting the energies of their whole lives to the development of mechanical life.”

Much of Dyson’s book is devoted to discerning how “mechanical life,” and our understanding of it, has evolved. He traces the development of the field of logic from Leibniz and George Boole on through Kurt Gödel, Alan Turing and John von Neumann, and simultaneously describes how centuries of thinkers and tinkerers combined to create modern computers and communications systems.

A lot of this makes for difficult reading. Even amateur philosophers are likely to feel overwhelmed by the finer points of how Turing and his wartime colleagues broke the Nazi codes. The discussion of symbiogenesis and the relationship between mathematical logic and biological evolution is often impenetrable. In his rigorous effort to give credit where due, Dyson introduces so many characters and analyzes so many ideas that the narrative sometimes loses its thrust.

He makes up for this, in part, by unearthing long-buried jewels from the history of technology that show just how much of the contemporary high-tech world consists not of invention but of reinvention. Data networking? Check out the optical telegraph systems developed in the late 18th century, themselves based on 17th century theories.

Or consider Leibniz’s description, circa 1679, of an imaginary computer: “A container shall be provided with holes in such a way that they can be opened and closed. They are to be open at those places that correspond to a 1 and remain closed at those that correspond to a 0. Through the opened gates small cubes or marbles are to fall onto tracks, through the others nothing.” Substitute electrical charges for marbles, and that’s basically how a microprocessor works.
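Rendered as a short program, the analogy is literal: a number is nothing more than a pattern of open and closed gates. The sketch below is purely illustrative--the function names (gates_for, number_from) and the eight-gate width are invented for this example, not drawn from the book or from Leibniz--but it encodes a number as a row of gates and then lets the “marbles” fall back through them to recover it.

```python
def gates_for(number: int, width: int = 8) -> list[int]:
    """Return Leibniz's open/closed gate pattern, most significant gate first."""
    return [(number >> position) & 1 for position in reversed(range(width))]


def number_from(gates: list[int]) -> int:
    """Let a marble fall through each open gate and total the place values."""
    total = 0
    for gate in gates:
        total = (total << 1) | gate  # shift the running total, then add this gate's marble
    return total


if __name__ == "__main__":
    pattern = gates_for(42)
    print(pattern)               # [0, 0, 1, 0, 1, 0, 1, 0] -- the open gates are the 1s
    print(number_from(pattern))  # 42
```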

Dyson also does a service in recounting the development of the modern computer, especially at the Institute for Advanced Study and at Rand Corp. The dominance of military priorities in choosing how to proceed is not exactly news, but it’s still striking today: Von Neumann, who perhaps more than anyone defined the computer as we know it, was a fervent cold warrior motivated by the desire to build bigger and better nuclear weapons.

Von Neumann died prematurely of cancer, and Dyson speculates that at the time of his death he was thinking well beyond the limitations of the serial, one-instruction-at-a-time architecture he had helped invent, and was considering how a much more complex network design might come closer to simulating what we know as the mind.

Plenty of people have taken up where he left off, and this book offers only a narrow selection. Dyson did all his research in a small library in Washington state--relying not on the Internet but on inter-library loan--and didn’t do any interviews. In focusing on historical origins, he largely ignores the many contemporary debates among philosophers, computer scientists and cognitive psychologists about the nature of consciousness, intelligence and the mind.

That was probably a wise choice; synthesizing all the current thinking on these immense subjects would have been unmanageable for writer and reader alike.

Whether you buy the book’s overarching argument probably depends as much on personal turn of mind as on the evidence Dyson amasses.

But I was continually struck by a paradox: Here is a book that, among other things, advances a new interpretation of evolution and a new theory of what it might mean to be intelligent, and it’s written by a man who’s a living embodiment of a much more traditional understanding of those terms.

I don’t know if machines can think, but George Dyson sure can. And I don’t know what gives rise to intelligence in machines, but in Dyson’s case, his family and his childhood environment obviously had a lot to do with it.

“My father always thought a lot about extraterrestrial intelligence,” Dyson says. “It always seemed to me that there was no reason alien intelligence had to be on another planet.”

Instead, he believes, it’s among us as a mass of communicating computers, part of our world, part of nature. And those who fear that computers are a threat to humans’ primacy as thinking beings--see the Deep Blue chess computer scare of 1997--are simply missing the point.

*

Jonathan Weber (jonathan.weber@latimes.com) is editor of The Cutting Edge.
