Supercomputers Central to Nuke Test Ban’s Failure

If the 20th century is remembered for anything, it may be for the invention of two technologies: computers and nuclear weapons.

In the minds of most people, these technologies are distinctly different--one ubiquitous and “personal,” the other remote and frightening. But computers and nuclear weapons have had a symbiotic relationship for more than 50 years, beginning with the world’s first digital computer, ENIAC, which was used to perform calculations necessary for development of the first hydrogen bomb.

The relationship between computing and nuclear weapons has been brought to the forefront in the last few weeks by the debate over ratification of the Comprehensive Test Ban Treaty, which the U.S. Senate rejected Oct. 13. That vote was a catastrophe for nuclear arms control, and it was condemned around the world.

The argument of the Clinton administration, which supported ratification, was that advanced supercomputers will allow “virtual testing” of nuclear warheads through computer simulations, replacing the need to conduct explosive underground nuclear tests. The U.S. Department of Energy is currently implementing the largest computer research and development program in the world, the Accelerated Strategic Computing Initiative (ASCI), to develop supercomputers and software programs that can conduct such virtual testing.

This argument was a “huge mistake,” said Dr. Chris Payne, a senior researcher at the Natural Resources Defense Council in Washington and a former nuclear weapons expert for the Senate. Payne called the administration’s strategy for defending the treaty “a shot in the foot, a self-inflicted wound.”

“The ASCI program is not necessary for protecting the reliability and safety of the U.S. nuclear arsenal,” said Payne, who wrote an article on virtual testing of nuclear weapons for the September issue of Scientific American. Payne believes we already have engineering techniques that can verify the reliability of current nuclear weapons without great advances in computer simulation.

Payne said President Clinton was convinced by advisors from the Department of Energy that advanced computer simulations must replace underground explosive testing to guarantee the reliability and safety of U.S. nuclear weapons.

The problem is that the Department of Energy’s radically ambitious ASCI program could take 10 to 15 years to complete. When DOE officials told senators about this timetable during the treaty ratification hearings, even the Republican moderates backed off supporting the treaty, Payne said.

“We’re into a new world here,” he said. “We have met the enemy, and he is us. The national weapons laboratories are so detached from the needs of international security that our policy becomes self-defeating.”

The U.S. nuclear weapons complex, which involves the three national weapons laboratories of the Department of Energy--Sandia and Los Alamos in New Mexico and Lawrence Livermore in Northern California--as well as several universities, especially the University of California, has long planned for a world without underground nuclear weapons explosions.

Nuclear weapons engineers rely on software codes for simulating the behavior of nuclear weapons, and each of these code packages typically runs to millions of lines of software programming. Weapons designers and “stewards” attempt to model and simulate the explosion of a specific nuclear warhead, an event that lasts about a millionth of a second. The U.S. codes, the most advanced in the world, are highly classified and the product of decades of experience with nuclear explosions, both real and virtual.
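The weapons codes themselves are classified, but the general shape of this kind of calculation is not: a physical quantity is advanced through an enormous number of tiny timesteps to cover an event lasting about a millionth of a second. The sketch below is only an illustration of that shape--a simple one-dimensional diffusion calculation with made-up numbers--and bears no resemblance to an actual weapons code.

```python
# Illustrative sketch only: arbitrary numbers, toy physics. It shows the
# general structure of explicit time-stepping -- advancing a field through
# many tiny timesteps to cover roughly a microsecond of physical time.
import numpy as np

N_CELLS = 1_000          # 1-D grid; production codes use millions of 3-D cells
DX = 1e-4                # cell width (arbitrary units)
DT = 1e-11               # 10 picoseconds per step (assumed for illustration)
T_END = 1e-6             # one microsecond of physical time
DIFFUSIVITY = 1e-3       # arbitrary material constant

temperature = np.zeros(N_CELLS)
temperature[N_CELLS // 2] = 1.0       # deposit a pulse of energy in the center cell

steps = int(T_END / DT)               # 100,000 timesteps to cover one microsecond
coeff = DIFFUSIVITY * DT / DX**2      # explicit-scheme factor (kept small for stability)

for step in range(steps):
    # Each cell is updated from its immediate neighbors, over and over.
    laplacian = np.roll(temperature, 1) - 2 * temperature + np.roll(temperature, -1)
    temperature += coeff * laplacian

print(f"Covered {steps * DT:.1e} seconds of physical time in {steps:,} timesteps")
```

Even this toy problem needs 100,000 timesteps to span a single microsecond; the real codes do far more physics per step, across vastly larger three-dimensional grids.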

As computer processing power has increased, the need for actual nuclear explosions for testing has receded. Scientists have acquired more data to refine the bomb codes, progressing from one-dimensional studies to two-dimensional simulations. Even those require days of intensive computation on the world’s fastest computers.

The current goal of the weapons laboratories is to develop three-dimensional modeling capabilities, which will require supercomputers far faster than any available today--on the order of 100 teraflops, or 100 trillion floating-point operations per second. Right now, the fastest machines--IBM’s “Blue Pacific” supercomputer at Livermore and Silicon Graphics’ “Blue Mountain” supercomputer at Los Alamos--perform in the range of 3 to 4 teraflops.
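Some rough arithmetic--using assumed numbers rather than the labs’ actual requirements--shows why the jump to three dimensions is so demanding:

```python
# Back-of-the-envelope arithmetic, not the labs' actual figures: why moving
# from 2-D to 3-D simulation demands a vastly faster machine.

cells_per_axis = 1_000                    # assumed resolution per dimension
cells_2d = cells_per_axis ** 2            # 1 million zones in a 2-D run
cells_3d = cells_per_axis ** 3            # 1 billion zones in a 3-D run

current_tflops = 3.5                      # roughly Blue Pacific / Blue Mountain class
target_tflops = 100.0                     # the labs' stated goal for 3-D work

work_ratio = cells_3d / cells_2d          # ~1,000x more zones to update per timestep
speedup = target_tflops / current_tflops  # ~29x more machine

print(f"A 3-D run has {work_ratio:.0f}x the zones of a 2-D run at equal resolution,")
print(f"while a 100-teraflop machine is only about {speedup:.0f}x faster than today's.")
```

If two-dimensional runs already take days on today’s machines, the arithmetic suggests why the laboratories consider 100 teraflops a floor rather than a luxury.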

Both facilities expect to have 100-teraflop machines by 2003, an extraordinarily ambitious benchmark.

“This program is equivalent to the Manhattan Project or the Apollo Project, in terms of its ambitions and its financial cost,” Payne said.

On Oct. 8, the University of California, the prime contractor for Los Alamos National Laboratory, announced a contract to build the world’s largest and most powerful computing facility at Los Alamos, a three-story laboratory that will house a 30-teraflop supercomputer and more than 300 nuclear weapons engineers.

The issue is whether any of this is necessary to guarantee the safety and reliability of our current nuclear arsenal.

Payne and other scientific critics think the primary purpose of the ASCI program is not to maintain the U.S. arsenal--he believes that can be done with what we know today--but to develop new nuclear weapons.

The Department of Energy insists that no new nuclear weapons are under development today. But the ASCI program might give us that capability even in the absence of underground nuclear testing.

The jury is still out on whether computer simulations alone can be used to design and test new nuclear weapons, Payne said.

“We know that the weapons labs are modifying, improving and certifying nuclear warheads using only computer simulations,” he said.

The shift to virtual design and testing of nuclear warheads also has grave implications for nuclear proliferation. The massively parallel supercomputers now used by the weapons labs are essentially large collections of commercially available computers networked to perform parallel, simultaneous computation. Nuclear weapons codes are being rewritten to run on these machines. That makes national security concerns about exports of high-performance business computers even more acute.
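As an analogy for how such machines work, the sketch below splits a large grid of values across several worker processes on a single computer and then combines their partial results. Real ASCI machines coordinate thousands of networked commodity processors with message passing; this shows only the divide-and-combine idea in miniature.

```python
# Single-machine analogy for massively parallel computation: divide a big
# "grid" among worker processes, let each compute its piece, combine results.
from multiprocessing import Pool

import numpy as np

def process_chunk(chunk: np.ndarray) -> float:
    # Stand-in for the physics each processor would compute on its share
    # of the grid (here, just summing an arbitrary quantity).
    return float(np.sum(chunk * chunk))

if __name__ == "__main__":
    grid = np.random.rand(1_000_000)            # a large grid of cell values
    n_workers = 4
    chunks = np.array_split(grid, n_workers)    # divide the grid among workers

    with Pool(n_workers) as pool:
        partial_results = pool.map(process_chunk, chunks)

    total = sum(partial_results)                # combine the partial answers
    print(f"{n_workers} workers, combined result: {total:.3f}")
```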

A January 1998 NRDC report on supercomputers and nuclear weapons says the link between computers and bombs implies “that an equivalent curb on the arms race”--equivalent to a test ban--”could have been achieved, and in theory could still be achieved, by placing limitations on weapons computing.”

That’s not on the table yet, but perhaps it should be. How long will we have to “run in place” to cope with weapons that should never be used?

*

Gary Chapman is director of the 21st Century Project at the University of Texas at Austin. He can be reached at gary.chapman@mail.utexas.edu.
