SCIENCE / MEDICINE : Chasing the Ever-Elusive Quark : Physics: The particle is the most pervasive entity in the universe. But tracking it down requires the most powerful computers ever built.

<i>Zorpette is associate editor of Spectrum magazine in New York City.</i>

Aided by tremendously powerful new computers, physicists around the world are making an ambitious attempt to come to terms with the quark, which may rank as the most pervasive yet elusive entity in the universe.

In groups of three, quarks are believed to make up protons and neutrons, the basic particles at the core of all atoms, from tiny hydrogen to giant uranium nuclei. But the same theories that describe how quarks bind to form those particles, a branch of physics called quantum chromodynamics (QCD), also predict that quarks will never be torn free from one another and observed individually.

That is bad news for physicists, whose standard procedure for studying particles involves producing them in accelerators, often through collisions with other particles, and then observing how they interact with other particles. Denied that approach for quarks and faced with tremendous computational requirements, physicists have turned to computers. So far, they have built one of the world’s most powerful computers and have begun planning vastly more powerful machines.

“In the last five or six years, QCD theory has (become) one of the major uses of supercomputers,” said Anthony D. Kennedy, a research scientist at the Supercomputer Computations Research Institute at Florida State University in Tallahassee.

“It’s not that this problem is so well suited to computers,” he added. “It’s just that it seems to be immune to any other technique.”

Physicists believe that a better understanding of quarks could contribute to a more complete understanding of the early universe--and quite possibly of today’s universe as well. Before it was even a thousandth of a second old, the universe is believed to have existed as a high-energy primordial soup, or plasma, of quarks and gluons, the particles that quickly began to “glue” quarks together into various atomic building blocks as the universe cooled and coalesced.

Powerful computers and programs might also help physicists understand or possibly even predict data from giant particle accelerators in the United States and Europe, where matter is studied in its most fundamental forms and new physical theories are routinely put to the test. “Most of what happens (in the giant accelerators) is governed by QCD,” said Don Weingarten, a research staff member at IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y.

But the computers that will help interpret accelerator data are years away and will be many times more powerful than today’s machines. Kennedy, for example, is using commercially available supercomputers to try to simulate the complex behavior of quarks and gluons in nuclear particles. He calls those supercomputers, the best of which are capable of performing 1 billion or 2 billion mathematical operations every second, too slow to yield reasonably accurate simulations.

So researchers are building and operating specially designed massively parallel computers, so named because they have hundreds of internal processors that operate in parallel--rather than performing tasks one after another serially, as in more traditional computers--providing extremely high computational rates. Physicists at Columbia University in New York City began using such a machine last fall. Elsewhere in the United States, similar machines are nearing completion at IBM’s Watson Center and at the Fermi National Accelerator Laboratory in Batavia, Ill. And at least two projects of note are under way outside the United States, including one at the University of Tsukuba in Japan. The other, at the University of Rome in Italy, has been dubbed “bee,” a whimsical allusion to a busy beehive.

Fittingly enough, the Columbia computer is in the Pupin Physics Center, where, over coffee with colleagues one winter day early in 1963, Caltech physics professor Murray Gell-Mann first used the word quark to describe the particles. Gell-Mann and a former student of his, George Zweig, independently came up with the idea of a new particle to explain perplexing findings about subatomic particles.

Zweig called the new particles “aces,” but Gell-Mann plucked the word quark, an archaic term for the cry of sea gulls, from “Finnegans Wake,” the last novel by James Joyce. For his work leading up to the quark proposal, Gell-Mann won the 1969 Nobel Prize for physics.

The advantages of the parallel computers are linked to the simulations they perform. Researchers are trying to simulate a volume of space surrounding a proton, neutron or some other particle made up of quarks and gluons, according to Norman H. Christ, a physics professor at Columbia and leader of the physics department’s computer project.

With a parallel machine, he noted, the volume of space surrounding the particle can be divided into still smaller subregions that interact with one another and that are then distributed among the machine’s many internal processors. In the case of the Columbia computer, there are 256 processors connected in a flat 16-by-16 array, like an oversized checkerboard with 16 squares on a side. Each of the squares (computer scientists call them nodes) represents a specific subregion within the simulated space.

Like most physical phenomena, the quark-gluon interactions tend to be localized--things happening in one subregion tend to affect only adjacent subregions. That makes the simulation job easier because adjacent subregions can be simulated by adjacent nodes. That way, the only nodes that need to communicate or exchange information are ones next to each other, which greatly simplifies the overall design of the machine.
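That nearest-neighbor layout can be sketched in a few lines of Python. This is a toy illustration only, not the Columbia machine’s actual QCD code: each “node” in a 16-by-16 grid holds a single value, and one update step uses only the four adjacent nodes, with wrap-around boundaries.

```python
# Toy illustration of nearest-neighbor communication on a 16x16 grid of
# "nodes" (a simplified sketch, not the machine's actual QCD code).
# Each node holds one value; an update step uses only the four adjacent
# nodes, with periodic (wrap-around) boundaries.

N = 16  # 16 x 16 = 256 nodes, like the Columbia checkerboard

def step(grid):
    """One relaxation sweep: each site averages itself with its neighbors."""
    new = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            up, down = grid[(i - 1) % N][j], grid[(i + 1) % N][j]
            left, right = grid[i][(j - 1) % N], grid[i][(j + 1) % N]
            new[i][j] = (grid[i][j] + up + down + left + right) / 5.0
    return new

# A single disturbance spreads only to adjacent sites on each step,
# which is why only neighboring nodes ever need to exchange data.
grid = [[0.0] * N for _ in range(N)]
grid[8][8] = 1.0
grid = step(grid)
print(grid[8][8], grid[8][9], grid[8][10])  # prints 0.2 0.2 0.0
```

Because each update touches only adjacent sites, a real machine built this way needs wires only between neighboring processors, which is what keeps the hardware design simple.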

In the Columbia computer, each of the processor nodes can perform up to 64 million mathematical operations a second, so together they could theoretically yield a rate of 16 billion a second. Such ideal performance is never actually achieved. For QCD calculations, the machine sustains a rate of about 6 billion operations a second, Christ said. But that rate is still three times the speed achieved with the same programs running on Cray Research Inc.’s Y-MP/832, a $20-million general-purpose supercomputer.
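The peak-rate arithmetic above works out as follows, using only the figures quoted in the text:

```python
# Peak vs. sustained throughput for the Columbia machine
# (all figures are the ones quoted in the article).
nodes = 256
per_node_ops = 64_000_000        # 64 million operations per second per node
peak = nodes * per_node_ops      # theoretical aggregate rate
sustained = 6_000_000_000        # measured rate on QCD programs

print(peak)                      # 16384000000, i.e. about 16 billion
print(round(sustained / peak, 2))  # fraction of peak actually achieved, 0.37
```

The gap between peak and sustained rates is typical of parallel machines: communication and bookkeeping keep the processors from computing every cycle.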

The Columbia machine cost about $1.5 million to build and was funded mainly by the U.S. Department of Energy’s high energy physics division, with additional donations from Intel Corp. in Santa Clara.

But even such dazzling processing rates are not really up to the problem at hand, researchers agree. The programs that physicists are working with begin by simulating complex fields associated with quarks and gluons. Thousands of different configurations of those fields are computed, each according to the mathematical probability that the field will occur. Each of the configurations, in turn, is specified by millions of numbers. Finally, the physical quantity of interest--the energy of an isolated quark or of a quark-gluon plasma, for example--is computed by taking an average over the myriad configurations.
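The sample-and-average procedure just described can be mimicked in miniature. The sketch below is an illustration only: real QCD configurations involve millions of numbers each, while here a “configuration” is a single random value drawn according to its probability, and the “physical quantity” is averaged over the samples.

```python
# Toy Monte Carlo average, mimicking the QCD procedure in miniature
# (an illustration only -- real field configurations involve millions
# of numbers, and the observable would be something like an energy).
import random

random.seed(0)

def sample_configuration():
    """Stand-in for one field configuration, drawn with its probability weight."""
    return random.gauss(0.0, 1.0)

def observable(config):
    """Stand-in physical quantity of interest: here just the square."""
    return config ** 2

n_configs = 10_000  # "thousands of different configurations"
estimate = sum(observable(sample_configuration()) for _ in range(n_configs)) / n_configs
print(estimate)  # close to the exact answer, 1.0
```

The accuracy of the final average improves only slowly as more configurations are added, which is one reason the method devours so much computing power.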

The approach is straightforward but sprawling. Gell-Mann said, “It is not at all obvious that it’s the most efficient way. I regard this as a brute-force way of using computers. It is my intuition that there exist better ways.” For the moment, however, no one knows how to find them.

Part of the problem lies in the complex, completely interdependent behavior of quarks and gluons. While it may make sense sometimes to study an atom’s electrons and nucleus independently, for example, such a technique is fairly useless for quarks and gluons because the theories change radically in the absence of one particle or the other.

Also, physicists believe that quarks exist in six different types, called “flavors,” although only five have been found so far. Quarks of all flavors are subject to three kinds of binding charges, which are called “colors.” There are eight kinds of gluons, all of which are “colored,” so they both exert color charges and are subject to them.
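The taxonomy above can be laid out as data. The flavor and color names below are standard physics terminology, not given in the article; note that at the time of writing the sixth (top) quark had not yet been observed.

```python
# The quark taxonomy as data (flavor and color names are standard
# physics terminology, not taken from the article; the top quark was
# still undiscovered when this was written).
FLAVORS = ["up", "down", "strange", "charm", "bottom", "top"]
COLORS = ["red", "green", "blue"]   # the three kinds of binding charge
GLUON_TYPES = 8                     # gluons are themselves color-charged

# Every flavor can carry any of the three color charges:
quark_states = [(f, c) for f in FLAVORS for c in COLORS]
print(len(quark_states))  # prints 18
```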

As Kennedy put it, “Quarks and gluons are rather strange beasts.”

Given such difficulties, most physicists involved in computational QCD are reluctant to even guess how powerful a computer they may need to simulate the particles convincingly. One figure sometimes mentioned is 1 trillion mathematical operations per second, a performance goal that would have seemed absurd only a few years ago. But in mid-January Kennedy and Christ hosted a meeting in Tallahassee where 42 physicists from around the country agreed to draw up a detailed proposal for such a machine.

How will physicists know when they have built a powerful enough machine? One test would be computing the mass of a proton, which should be possible if the theoretical equations of QCD are correct and a powerful enough computer is available to churn through them. The proton’s mass has already been determined through experiments and could be checked against the computed result.

What happens after that is mostly a matter of conjecture. Frank R. Brown, an assistant professor at Columbia, looks forward to more accurate simulations of quark-gluon plasmas, infinitesimal replications of the universe at its earliest instant. The simulations might even be verified by plasmas created in future particle accelerators, in which uranium atoms, stripped of their electrons, are smashed together at tremendous speeds.

At IBM, research staff member Weingarten hopes that future computers and programs might shed light on various proposals to unify the four fundamental forces discovered in nature so far--one of the most significant challenges in all of science.

“It’s conceivable that attempts to make sense of some of these proposals may benefit from algorithms and hardware developed for QCD,” Weingarten said in an interview. “Numerically, mathematically, it’s more or less the same kind of problem as QCD.”

Using Supercomputers to Simulate Quarks

Scientists are using a specially designed supercomputer with hundreds of internal processors--arranged in a 16-by-16 grid, like an oversized checkerboard--to simulate the three quarks that exist within nuclear particles, such as protons. Programs that run on these supercomputers begin by dividing the particle’s volume into hundreds of smaller volumes, each of which is then simulated on an individual processor.