
Heavy R&D Spending No Longer the Magic Route to Profitability

Michael Schrage is a writer, consultant and research associate at the Massachusetts Institute of Technology. He writes this column independently for The Times. He can be reached by electronic mail at schrage@media-lab.media.mit.edu on the Internet.

To the ghoulish delight of Wall Street, Louis V. Gerstner’s IBM announced that it would eliminate 35,000 jobs and bite an $8.9-billion write-down bullet. So the stock immediately jumped.

If Gerstner had a perverse sense of humor, there’s a simple way he could talk the stock right back down: All he would have to do is announce “bold, new spending” for IBM research and development.

Similarly, if Apple Computer CEO Michael Spindler, John Sculley’s operations-minded successor, wants to terrify his investors, all he has to do is announce that the $7-billion-a-year company will maintain its $2-million-plus-per-day R&D spending habit. The fact that, barely two years ago, Apple derived fully 80% of its revenue from products introduced that same year now belongs to the mists of computer antiquity. The question now is: What have you done for me lately?


A fundamental discontinuity in the computer industry has emerged: Research & development has been emasculated. Where, once upon a time, intensive R&D investing was seen as essential to competitiveness, the market has now effectively divorced R&D from the way it determines a company’s value. R&D has been transmuted from an engine of innovation to a costly burden of overhead.

That’s not quite as ominous as it sounds. Too often, computer company “innovations” proved to be too little, too late, or on time but too pricey. Over the past three years, price wars have shifted the battleground from added value to lower cost. Most R&D in the computer industry has been designed around adding value. Drastic R&D cuts were thus inevitable.

Does anyone really believe that insufficient R&D spending is at the heart of IBM’s pitiful decline? The company, which once spent more than $5 billion a year on high-tech R&D, spends perhaps $4 billion now, but that’s still more than the annual revenue of most companies in the industry. The simple reality is that IBM’s recent multibillion-dollar R&D investments have not consistently translated into either market share or profit: They generated more costs than benefits.

Similarly, Apple Computer performed much of the most innovative research and development in personal computer interface design and implementation. The company spent proportionately twice as much on R&D as its low-cost PC competitors, such as Dell Computer and AST. As price wars slashed margins, Apple’s R&D burden remained high, and its innovation edge failed to translate into dramatically increased market share. The result? Apple literally cannot afford to maintain its present levels of R&D investment.

By contrast, let’s look at some of the winners in today’s ultracompetitive information technology market. Microsoft is extraordinarily successful and Bill Gates is one heck of an entrepreneur, but no one in the industry considers Microsoft to be a particularly brilliant technological innovator. Indeed, Microsoft is more imitative than innovative in its software successes: it sees what it likes and ruthlessly emulates.

Microsoft claims to spend a lot on R&D--almost $400 million a year just to do operating systems R&D (the core of its business). But is that investment going into “research and development,” or is the focus overwhelmingly on mere software enhancements? In other words, Microsoft’s concept of R&D innovation is married more to incrementalism than to any bold transformation of software’s potential.


And why not? Microsoft didn’t grow rich because of brilliant technical breakthroughs; it grew rich because it does a better job of matching its software developments to its customers’ perceived software needs. Marginal improvements for marginal costs.

The same holds true of companies such as Oracle, Sybase and Novell. They have succeeded in the marketplace not by dint of superior R&D but because they have been able to consistently blend incremental technical improvements with aggressive marketing at comparatively low prices.

Indeed, consider the rapid and profitable growth of systems integrators like EDS and Andersen Consulting. Of course, they toss a few million at R&D now--but historically these companies did virtually no R&D at all. In fact, they did their essential R&D at the customer sites, learning how to piece all that disparate technology together into working systems.

To be sure, there are singular exceptions like Intel, the semiconductor company that sells the dominant microprocessor of the personal computer age. Intel--which does comparatively little pure research but an awful lot of expensive development--has done an excellent job of converting its capital-intensive innovation investments into premium-priced microprocessors and semi-custom logic chips. There seems to be a clear and unambiguous link between the quantity of Intel’s investment and the quality of its semiconductor innovations.

In its purest form, the R&D issue revolves less around expenditures than around credibility. For the overwhelming majority of information technology companies today, the size of the R&D expenditure bears no relationship to the market success of the innovations. As companies have grown larger and markets have grown more competitive, R&D has simply lost credibility in terms of its cost-effectiveness.

The result is that the next round of breakthrough innovations will be dictated not by the established computer companies but by a new generation of techno-entrepreneurs.
