Does it matter if it’s made in America? The question has been debated for years, and now it is being asked again following the Defense Department’s recent announcement that it will spend $580 million in the next five years to support U.S. development of flat-panel display screens.
Flat-panel displays are the computer screens on electronic weapons, which are the wave of the future in warfare. They are also the screens on commercial computers and a key component of digital interactive television and personal communications devices--the future offspring of the TV and telephone that are expected to become everyday products within this decade.
The Pentagon says it is spending the money because it will need flat-panel displays for weapons. But the $580 million is really an attempt to recover from costly myopia--and to retain in the United States the ability to develop and produce the screens.
The bitter irony is that these wonder products--on which liquid crystals control light and millions of transistors retain and manipulate information--were invented in America. They were neglected by shortsighted corporate executives but were taken up by companies in--you guessed it--Japan, which now dominate a $5-billion global industry that is destined to become a $15-billion industry in a few years.
“That $5 billion is only the value of the screens,” says David Mentley, head of display research for Stanford Resources, a San Jose consulting firm. “The displays are key components of systems worth five times that amount.”
Suffice it to say that displays are key products of the Information Age. And the story of why taxpayer money is being spent to retain them illustrates the complexity of world industry and what this country must do to provide opportunity for future generations.
The principles of liquid crystal displays were first discovered in the 1960s at Sarnoff Research Center, then part of RCA, and at a Westinghouse lab in Pittsburgh--and in Switzerland, at Hoffmann-LaRoche, the drug firm.
But where Americans and Europeans didn’t follow up with investment and development, Japanese companies did. Sharp, Seiko and Casio used liquid crystal displays in calculators and digital watches.
That gave them knowledge of the technology so that later, when personal computers developed needs for sophisticated graphics and color pictures, Japanese manufacturers could deliver increasingly capable screens.
It wasn’t easy. Like everything in electronics, the prices of screens came down while the cost of development rose. It costs more than $400 million to build a manufacturing plant for flat-panel displays.
Such numbers made big U.S. firms balk. IBM, AT&T and General Electric--which acquired RCA in 1986 and jettisoned the Sarnoff Center--cut R&D on displays.
Meanwhile, Japan in the 1980s was still in its era of cheap money, and its industry poured billions into displays. The result is that Sharp, Seiko, Hitachi, Toshiba, Hoshiden and others have 95% of the world market; Korean and U.S. firms may have 5%.
But here the story takes a twist. Japanese companies are still investing in displays--but finding it harder to do so in lean times. And the Korean firms Samsung and Goldstar are preparing to increase production of complex screens. The upshot could be price competition on world markets and opportunities for U.S. firms.
Meanwhile, the technology could be moving on. The type of display being made today, called thin-film transistor active matrix, is difficult and costly to produce. Some U.S. firms are working on simpler alternatives. Motorola is backing a venture called Motif that hopes to bring out a less expensive but high-quality display; SI Diamond, a Houston company, is working with help from Lawrence Livermore Laboratory on a screen employing millions of microscopic TV tubes.
Big companies--Xerox’s Palo Alto Research Center in a venture with AT&T, IBM in a venture with Toshiba--are getting back into displays. So “the wheel’s still in spin,” as Bob Dylan put it.
Getting back in will be difficult. But it must be done, because the flat-panel display is emerging as a product combining hardware and software that will be key to future interactive video computers--key to the information highway.
Many Americans saw that coming. Company engineers told U.S. executives in the mid-'80s of the technological importance of displays. But the corporate brass decided that funding such product development would reduce current earnings. So they cut back.
Such decisions were shortsighted in two ways. First, they left U.S. companies reliant on competitors for essential components. Now many U.S. firms lamely accuse Japanese companies of being slow to sell them the latest and best screens.
More important, they passed up opportunities for job-creating product development. By contrast, consider Sharp Corp., the world leader in displays. Its camcorder product was doing poorly. So Sharp reached into its display know-how and came up with a flat-panel viewfinder for its camcorder. The product took off, and now Sony and other competitors are adding such viewfinders as well. When you’ve invested in the know-how, you make your own opportunities.
Yet sometimes lessons are learned. Pentagon support for flat panels will have a larger dimension if it marks a turning point for funding research, says Paul Saffo of the Institute for the Future in Menlo Park, Calif. As research budgets have declined in government and business, we have been living off technologies paid for in past defense budgets and the space program--such as lasers and microprocessors, the foundation of U.S. leadership in the information industry.
So far, there has been government rhetoric but no money for the information highway. That contrasts sharply with the decision 40 years ago to float bonds and levy a tax on gasoline to build the interstate highway system--origin of the information highway metaphor. Then, we sacrificed present gratification to provide for future generations.
Maybe with flat-panel displays we are saying again that we must mount collective efforts to see that technology is made in America--that, yes, it matters a great deal.