Supercomputer Network Gives U.S. a Chance to Catch Up

Times Staff Writer

Astrophysicist Larry Smarr needed supercomputers a decade ago to handle the complex equations of his work on the black holes of space. But no American universities had such advanced machines for basic academic research, so the University of Illinois professor had to beg computer time from friends at government military labs and in Germany, where non-military use of supercomputers was more common.

Smarr’s 10 years of frustrations finally ended Monday when the National Science Foundation announced that it will spend $200 million over five years to establish and operate four supercomputer centers that scientists nationwide will be able to tap for their research.

Smarr and a small cadre of colleagues committed to maintaining American supremacy on the frontiers of science played crucial roles in drawing up reports and lobbying Congress for money to create the national supercomputer network.

They believe the new centers--at UC San Diego, the University of Illinois, Princeton University and Cornell University--will arrest more than a decade of inattention by national leaders to the future computer needs of American science. Increasingly, scientists in fields ranging from astronomy to molecular biology require supercomputers, which have the capability to handle 1 billion calculations per second. Such machines already have become everyday tools in the aerospace and oil exploration industries--as well as in military research--but have not been generally available to researchers at universities because of their prohibitive cost.

“The United States was asleep (technologically) during the 1970s, and Germany and Japan weren’t,” Smarr, who will direct the Illinois center, said in a telephone interview this week. “What Germany and Japan did in putting supercomputers into the hands of universities, and especially graduate students, was logical. By not doing that, we were peculiar.

“The announcement Monday was like waking up from a bad dream. But we still have a ways to move for going such a long time in not training students and scientists on the most advanced computers.”

The scientists dovetailed their case for equipping university researchers with the latest machines with congressional concern that the United States could fall behind other nations technologically.

Smarr and his colleagues found representatives surprisingly amenable to their concerns. Smarr spent much of 1984 on Capitol Hill, talking about the centers.

“A key (in Congress) was the fact that the Japanese now produce supercomputers comparable to ours,” said Kenneth G. Wilson, a Nobel laureate in physics and director of the new Cornell center. “That attracted both the media and congressional interest. And there is no question that in the political forum, the effects of Japanese computer competition were crucial.”

“We had a lot of help from Dr. Wilson and others,” said Rep. Don Fuqua (D-Fla.), chairman of the House Science and Technology Committee. “But my colleagues are keenly interested in seeing that our country does not miss the boat in training our graduate students at the university level in the latest technology. There’s strong support in Congress for this program.”

But that support was far in the future for Wilson, Smarr and a few other visionaries in the late 1970s.

Smarr had discovered that university scientists could obtain time on industry or government supercomputers only through personal contacts.

“It was dependent on whom you knew and on whom you could convince to sponsor you,” Smarr said. Purchase of commercial time could cost as much as $2,000 an hour, Smarr said, a prohibitive amount when a generous federal scientific grant for computer time might run $10,000 for an entire year. In addition, since university professors would have to travel to the computer location, the work would be concentrated during summer vacation periods.

Smarr and Wilson served on a scientific panel that produced a 1981 report pointing out the lack of available computers for theoretical physics, a field where basic research often produces benefits across a wide spectrum of science. Their work was also part of a 1982 study involving the U.S. Departments of Defense and Energy as well as the National Science Foundation, which concluded that American scientists were facing a computer crisis.

A subsequent NSF report concluded that $100 million or more would be needed to solve the problems identified in the 1982 study.

“We then began by asking Congress for $6 million (in 1983) to buy commercial time for scientists on supercomputers,” John Connolly of NSF said.

In the meantime, Smarr, based on his years of frustration, had put together his own proposal for a computing center at Illinois, with the help of 65 colleagues ranging from accounting professors to biologists. He sent the proposal to NSF, and it was soon followed by another unsolicited application from GA Technologies, the La Jolla atomic research firm that will run the UC San Diego center.

“The GA proposal was important in showing (people in Washington) that Illinois was by no means alone in its thinking about national centers,” Smarr said.

In 1983 Sidney Karin, the GA computer chief who will run the UC San Diego center, came away from discussions with federal energy officials convinced of the need for national centers. GA has been the lead participant in a smaller computer center set up several years ago for companies specializing in atomic energy research.

After getting the proposals, NSF decided to set up a nationwide competition for applicants to run one supercomputer center and applied to Congress for $20 million to pay for it. However, the combination of intense congressional interest and strong scientific lobbying resulted in a doubling to $40 million for this year alone and the Monday announcement that four centers will be set up.

Even with the consensus in Congress to trim federal spending and cut the budget deficit, the 1984 Reagan budget calls for $46 million for the national centers. Fuqua’s committee has not yet voted on fiscal 1986 authorizations, but a staff aide said that committee members hold the centers as a high priority.

Cornell’s Wilson said that initially he had little hope that the government would act. “But now everyone has high expectations over both the equipment and capability to be used at the centers,” he said. “I am taken aback by how effective (the lobbying) proved.”

With the establishment of supercomputer centers, Karin said, fields such as molecular biology, which need faster and more powerful computers, will be able to advance rapidly on several fronts at the university level. The supercomputers can be used, for example, to help design new drugs.

“With these centers, the public sector is now addressing our needs so in the future we won’t have to go to Japan or Germany, (won’t have) to leave the U.S. scientific community,” said John Hawkins, an astrophysicist at California Institute of Technology.