Cosmic data

Margaret Wertheim, author of "The Pearly Gates of Cyberspace: A History of Space From Dante to the Internet," is at work on a book about the role of imagination in theoretical physics.

It is a truism of the 21st century that we swim in a sea of information. It gushes forth from the morning paper, it croons from the racks of magazine stands and the million-plus volumes on Amazon.com. It floods down the pipelines of the Internet into our laptops, iPods, PDAs and cellphones. Life is increasingly structured around the delivery and reception of information -- ever more, ever faster and ever timelier data.

Of course information has always been a valuable commodity, but until recently it largely served as a means to other ends, say, as a tool to gain military advantage or a competitive edge in the marketplace. In “Programming the Universe,” Seth Lloyd sets out to convince us that information is the foundation of reality itself and that the cosmos may be seen as a vast computer for processing and transforming data.

Lloyd is a professor of quantum mechanical engineering at MIT and a pioneer in the field of quantum computing. He was the first person to propose how a quantum computer could be built, a task that is now proceeding apace in research labs around the world. Along with a growing number of physicists, he believes that computation and information theory offer a new paradigm for understanding the physical world. His aim in this dense but utterly charming book is to trace the history of computational manipulation of information along with the rise of “information science,” and simultaneously to tell the story of our universe as an unfolding sequence of “information revolutions.”

Humans have been computing almost since we became human. “Like the first tools, the first computers were rocks,” Lloyd writes. “Calculus” is the Latin word for pebble, and the first calculations were made by rearranging pebbles. “Rock computers didn’t have to be small,” Lloyd tells us. “Stonehenge may well have been a big rock computer for calculating the relations between the calendar and the arrangement of the planets.” Rocks eventually led to the abacus, which remains one of the most successful computing devices of all time. Later revolutions were made possible by the slide rule in the 17th century; geared cogs in the 19th century (these formed the basis of mechanical calculators); vacuum tubes in the 1940s; and transistors in the 1960s.

Over the last half-century, the computational power available to us has roughly doubled every 18 months, a fact first articulated by Intel co-founder Gordon E. Moore and formally known as Moore’s Law. But we are approaching the limits of what silicon can do, and it is clear that if we are to keep up the blinding pace of computational enhancement a new technology is needed. Many scientists are hoping that quantum mechanics holds the key and that quantum computers will take us as far past semiconductors as semiconductors took us from pebbles. Lloyd and his colleagues are leading that charge.

As humans have learned to process information on ever grander scales, so too Lloyd sees the history of the universe as a series of information-processing revolutions. The first was the Big Bang, which brought into being from nothingness the cosmic seed of space and time. In the beginning, he tells us, there was very little information in the universe, but as the nascent bubble of spacetime fluoresced into being, its data content exploded. “The Big Bang was also a Bit Bang,” he cutely surmises. Soon the primal bits were coalescing into the structures we know as particles (protons, electrons and so on), which later congregated to form atoms, which in turn clumped together to make stars and galaxies. “Every time a new ingredient of the soup condensed out ... new information was written in the cosmic cookbook.”

Planetary systems formed and more complicated molecules came into being, including eventually the life-encoding information structure known as DNA. The evolution of higher organisms required a revolution in intercellular communication -- information processing between cells -- as did the development of immune systems. Living things in the form of hairless apes soon began to orchestrate their own informatics revolutions. “Life, language, human beings, society, culture -- all owe their existence to the intrinsic ability of matter and energy to process information,” Lloyd writes.

This view of reality has been gaining ground in scientific circles for several decades. But though it has come to the fore in the age of computers, its roots lie in the attempts of 19th century physicists to understand steam engines. That effort led to the laws of thermodynamics and the articulation of the concept of “entropy,” a mysterious quantity that always increases in any closed system. It turns out that entropy is a measure of the information content of a system, and as time goes on, both entropy and information in the universe increase.

Though the thermodynamicists of the 19th century did not understand it at the time, they were casting information as a fundamental quantity of nature, along with that other great pillar known as energy. In the informatic view of reality, energy and information are the two complementary forces acting upon matter. As Lloyd puts it: “Energy makes physical systems do things. Information tells them what to do.”

The idea that the entire universe may be an information-processing system -- some sort of giant computer -- was first put forward in the 1960s by Edward Fredkin (then at MIT) and Konrad Zuse, who had built some of the first programmable computers in Germany in the 1940s. Both Fredkin and Zuse proposed that our world might be a digital computer, but since the universe is built on a quantum mechanical foundation, Lloyd tells us that a digital computer could never capture its immensity. Only a quantum computer would be sufficiently complex to do that. In theory, Lloyd says, we could make a quantum computer that would model the complexity of the entire universe, a simulation that would be effectively indistinguishable from the actual universe.

Since the universe itself is built on quantum systems such as atoms, and since all quantum systems are continually exchanging information, Lloyd concludes that the universe must be a giant quantum computer. And what does this machine compute? “It computes itself. The universe computes its own behavior.”

In his final chapters, Lloyd argues that understanding quantum computation will lead us to an understanding of how the universe truly works and what it is truly doing. Specifically, he believes this line of research will resolve one of the major questions in science today -- how increasingly complex systems, such as molecules and people, have emerged from simple ones, such as electrons and protons. That is no small claim. But can quantum computing shoulder such a heavy epistemological burden?

Throughout history, humans have interpreted the world in terms of things they know. The ancient creator gods behaved like super-humans, coupling and breeding and giving birth to the cosmos, or fashioning its elements from familiar technologies such as weaving or molding clay. Modern scientific accounts also have drawn heavily on familiar contemporary tropes: In the 17th century, the universe was seen as a vast clockwork system. By the 19th, when the study of magnetic and electrical phenomena was hot, it was reconceived as a network of invisible force fields. At the dawn of the age of digital computers, scientists speculated that it was one of these machines.

Inevitably, we see the whole through the lens of the particular. Though I do not doubt that the quantum computational view will yield new insights, history would suggest that a final metaphor is illusory. To reify that metaphor and insist the universe is a quantum computer seems almost as philosophically quaint as the Norse myth Lloyd describes in which “the universe begins when a giant cow licks the gods out of the salty lip of a primordial pit.” Although I question his conclusion, the journey Lloyd takes in this rich and complex book is genuinely thrilling. For those willing to work through some difficult examples, it is a clear account of quantum computing and an insightful rumination on both the physical and computational sciences.
