U.S. Slow to Compute on ‘Super’ Scale

Times Staff Writer

A highlight for many visitors to Apple Computer’s headquarters in Cupertino, Calif., is a tour of the company’s supercomputer center. There, coiled in a semicircle and encased in glass at the end of a hall checkerboarded in black and white tiles, stands Apple’s bright purple Cray supercomputer.

Given its hue and sleek Art Deco surroundings, it’s no wonder critics once questioned whether Apple bought the $14.5-million machine to attract attention or to do serious work. “In the beginning, people didn’t think we needed that much computing power to design personal computers,” recalls Kent Koeninger, who helps Apple engineers use the Cray machine. “But you don’t design Apple and Macintosh computers on paper in garages anymore.”

Not Fully Embraced

Indeed, supercomputers can be used to design everything from airplane wings to new drugs to personal computer circuitry. Capable of processing billions of mathematical operations per second, supercomputers allow scientists and engineers to test their theories in a fraction of the time it would take using conventional methods. Furthermore, scientists can simulate feats, such as exploding bombs, crashing cars and flying airplanes through vicious storms, that they wouldn’t dare attempt in real life.

But despite the power of supercomputers, government and industry analysts contend that the machines have not been fully embraced by American industry. And in the view of many experts, that is a key reason the United States’ pioneering, world-leading supercomputer industry could soon fall behind Japan’s, as so many other industries that started here already have.

The potential result, these analysts say, could be devastating. The United States could lose its lead not only in the supercomputer industry but in other industries as well, because businesses are not taking advantage of the best tools available.

For now, though, the spotlight is on the U.S. supercomputer industry. In the past two months alone, this $1 billion-a-year industry has fallen into unprecedented disarray. Control Data Corp., the industry pioneer and one of the nation’s two active supercomputer makers, said in April it would close its supercomputer operations after losing more than $350 million. Weeks later, Seymour Cray, founder and chief scientist of the remaining manufacturer, Cray Research, left the company to start anew with an untried supercomputer technology. For its part, Cray Research has said it expects sales to rise just 10% this year, and analysts are estimating a profit decline of about 10%.

Meanwhile, one of the three giant Japanese electronics companies in the field, NEC, has unveiled plans to manufacture what it claims is the fastest-ever supercomputer.

With the cutting edge of the most advanced technology known to man at stake, there has been no small amount of attention focused on the current problems. Efforts to diagnose and treat the industry’s ills have focused primarily on Japan’s superior chip-making technology and its long-recognized patience for nurturing nascent industries.

Government Uses

But increasingly, analysts and engineers are looking at other factors, including the oft-ignored supercomputer buyer. The result: a dawning realization that, unlike the Japanese, U.S. industry has, so far, been reluctant to make the huge investments of cash and manpower necessary to support the supercomputer industry.

“In the U.S., these machines are used for national security matters, defense and other government oriented work,” says Stan Mantel, an analyst with the Super Performance Computing Service in Palo Alto. “In a pure commercial sense, Japan has the leg up. You can argue that bigger and better machines give you bigger and better answers, and, in some cases, answers that you would never get without these machines.”

Some engineers argue that at least a portion of the problems plaguing America’s supercomputer industry--as well as the global competitiveness of other key businesses--could be solved if U.S. companies would take advantage of the latest and best technology available. Many point out that their Japanese counterparts already have done just that.

And, certainly, the U.S. government has. Since the advent of the supercomputer more than a decade ago, the U.S. government has been, by far, its largest user, accounting for 101 of the 199 supercomputers installed nationwide as of the end of 1988. Department of Energy officials estimate that there are about two dozen machines at nuclear weapons research laboratories alone.

According to a year-end 1988 survey by the Super Performance Computing Service, only 74, or 37%, of the nation’s supercomputers were installed in private businesses. (The survey does not include International Business Machines’ high-performance machines, whose status as supercomputers is still hotly debated.)

Japan Differs

The situation is far different in Japan. According to the same survey, the Japanese government has 17 of the 104 installations, universities have 27 and private industry has 60 machines, or 58% of the total.

Also telling is the industry-by-industry breakdown of where these machines are used.

For example, 21 of the 60 industrial installations in Japan are in the electronics industry. In the United States, Apple Computer, which uses its machine to simulate circuits and for such advanced technology research as speech recognition, is one of just five electronics makers to own a supercomputer, and the only one in the Silicon Valley, the site of the nation’s largest concentration of high-technology operations.

Auto makers in Japan account for 11 machines; in the United States, the total is three, one for each of the major manufacturers.

However, 29, or nearly 40%, of the supercomputers used in private enterprise in the U.S. are in aerospace companies. In Japan, aerospace companies account for just five installations.

Some find the differences appalling.

“We have treated these machines primarily as defense systems, not as general scientific discovery, design and productivity tools,” argues Larry Smarr, director of the National Center for Supercomputing Applications at the University of Illinois in Champaign-Urbana, one of five such university research facilities throughout the nation.

“The Japanese, on the other hand, realized that they were a key technology to important business sectors for the next several decades and proceeded to so position them.”

Need Questioned

Adds Charles Ferguson, a research associate at the Center for Technology, Policy and Industrial Development at MIT: “Both the Japanese and U.S. governments have supported their supercomputer industries over the years. But the goal of the Japanese government has been to make their industries ready to compete in the world economy. Our goal has been national security.”

To be sure, many U.S. corporations would argue that they already have as much supercomputing power as they need. Dozens of companies lease the power through time-sharing networks operated by supercomputer owners. Others are corporate partners at one of the five university research facilities. And many more companies own machines known as “mini-supers,” which offer some of the capabilities and power of a full-fledged supercomputer.

Many corporations argue that business problems don’t necessarily require the enormous power of a supercomputer, a $20-million piece of hardware steadily heading toward obsolescence. Furthermore, companies say they are particularly reluctant to purchase the hardware when they may lack two other key ingredients: adequately trained engineers to use the machines and a range of software to take full advantage of their power.

Consider the experience of Kodak. For nearly three years the camera and film maker has been an industrial partner at the University of Illinois supercomputer center, spending just over $1 million annually for all the supercomputing its U.S. engineers need.

The partnership, says Lawrence A. Ray, a senior research scientist who is Kodak’s man on campus, is supposed to help the company decide if it wants to buy its own machine. So far, there’s been no rush to make the buy.

“The best solution is to belong to a center like this so you don’t have the facilities, staffing and all the rest to maintain,” Ray says.

But even more importantly, he argues, many engineers just don’t know how to use a supercomputer effectively. And many, already accustomed to the desktop computer revolution that brought powerful personal workstations, are reluctant to work again on a machine they must share with scores of other engineers.

“Buying the Cray is the easy part,” Ray says. “Once you have it, you have to use it and that’s the hard part.”

Ray says engineers typically develop projects based on the size of the tools--the computers--they have available. With the increased power of a supercomputer, engineers must learn how to increase the scope of the problems they can tackle.

Training Required

According to some estimates, it typically takes engineers, even highly educated and skilled ones, from 18 months to three years to learn to use a supercomputer well.

Such arguments are the major reason why the supercomputer centers, such as the one in Illinois, were established in 1986. The theory is that as engineers are exposed to these machines, they will begin to think in the large-scale terms necessary to make best use of supercomputers.

“You need trained drivers for these machines and people are not being trained to use them in industry. That’s the real root of our problem,” says C. Gordon Bell, an electronics industry pioneer, former National Science Foundation computing director and now the chairman of Ardent Corp., a mini-supercomputer maker in the Silicon Valley. “There would be a higher demand for these machines in industry if more people knew how to use them effectively.”

Efforts to train more non-defense-oriented users, now largely confined to the five supercomputer centers established in 1986, may soon get a big boost.

Last month U.S. Senator Albert Gore (D-Tenn.) introduced a bill to spend more than $1.3 billion over the next five years to install supercomputers in university and other research centers. The bill would also create a data network to connect the nation’s supercomputers. Although similar proposals have died in prior years, recent events in the supercomputer industry have increased the chances of passage for the current bill.

Says an aide to Gore: “U.S. industry hasn’t gotten the message as quickly as businesses in Japan. That’s why these university and research centers are so important.”

Adds Lauren Kelley, the supercomputer analyst for the U.S. Department of Commerce: “The writing is on the wall for the U.S. supercomputer industry if there are no changes in how we approach it.”

IBM Enters Market

In the private sector, too, signs of change are emerging.

After years of ignoring the supercomputer market, IBM over the past year has taken major steps toward becoming a significant participant. It has introduced special accessories for its more powerful mainframe computers that permit “vector-style” scientific computing. Analysts estimate that there already are 500 of these machines installed throughout the world, about 200 more than all other supercomputers combined.

In addition to its own research, and perhaps even more important, IBM last year invested in Supercomputer Systems Inc., a company founded 18 months ago by Steve Chen, a Cray Research defector considered among the most brilliant computer designers in the world.

Analysts speculate that IBM may have an easier time selling its supercomputers to U.S. corporations than have Cray and Control Data.

“There are a lot of ‘True Blue’ customers out there who will follow IBM anywhere,” says Michael Murphy, editor of the California Technology Stock newsletter. “When IBM blesses something, especially something with scientific aspects, a lot of data processing managers listen.”

Analysts expect that whatever supercomputer IBM finally introduces, it will be able to share data with the company’s mainframes, a factor that will allow data processing managers to integrate all of their IBM equipment.

And as the supercomputer becomes more compatible with existing computer systems, analysts say, industry use should increase. Outside of scientific fields, potential applications for the machines include investment banking, investment portfolio modeling, construction and the entire health care field.

“The real question is not whether we have the best supercomputers, but whether we build the best cars, electronic circuits, airplanes and the like,” says Chris Willard, an analyst with Dataquest, a Silicon Valley market research firm.

Adds analyst Stan Mantel: “The country that wins is not necessarily the country that produces these machines in the greatest quantity. It is the country that uses them the most effectively.”
