The new economic certainty: Nothing is certain

Peter G. Gosselin is a Times national economics correspondent based in Washington, D.C.

At any given time, there is a dominant line of argument about the great issues that face America. When it comes to the economy, that line is most often delivered by Federal Reserve Chairman Alan Greenspan.

By last summer, with the country nearly four years into its post-bubble experience, Greenspan had settled on his latest and still-current line, one that seems suffused with modesty about our powers to predict the future and pregnant with the suggestion that the nation might best avoid sweeping conclusions (or perhaps any conclusions at all) about our immediate past.

Asked by a congressional committee how working Americans will fare amid the continuing rush of technological and global change, the Fed chairman offered this Delphic assessment: “Innovation, by its nature, is unforecastable.” But, he added helpfully, history suggests that if we remain “sufficiently flexible” and keep our economic institutions in order, “it will happen ... jobs will be created.”

Such a position is maddening because it overlooks the fact that, only a few years ago, Greenspan regularly engaged in the most sweeping of pronouncements as Pied Piper of the productivity revolution. It seems a little self-serving, given that a critical look-back might implicate the Fed chairman in the ballooning of the stock bubble. And it could be dead wrong. Economic and technological changes may not always work out for the best. Good things such as job creation -- or only good things -- may not, after all, simply “happen.”

So what can be drawn from the events of the last dozen years? What is the punch line of the “Fabulous Decade,” the “Roaring Nineties”?

Taken together, the eight books under review offer a tour d’horizon of the era. Among other things, they suggest that our decade-long love of stocks went unrequited, our hope that economic growth would lift all boats was probably overdone and our wild enthusiasm for all things technological may have been misplaced. Instead, the books hint at a darker, more ambiguous future.

Any assessment of the 1990s must start with the stock market. Over the decade, the market slipped the bounds of mere finance to become something much grander -- a reality show more engrossing than any yet devised by the networks, the embodiment of our wildest hopes for a new, democratized (and fabulously wealthy) future.

To pick just one of the thousands of anecdotes that fill these books, consider this story -- recounted in former Barron’s writer Maggie Mahar’s “Bull!” -- about San Jose resident Kathy Rubino. Rubino learned how deeply the market’s influence had penetrated American society when she awoke one morning, popped on CNBC and received an obscene phone call. As the caller panted lurid remarks, the network’s Joe Kernen warned of falling stock prices. “Suddenly,” Mahar writes, “the obscene caller interrupted himself: ‘Is that Kernen ... ?’ he asked.

“Rubino, stunned, off balance -- and still groggy -- found herself answering him: ‘The market has taken a plunge....’ ‘Jesus,’ the anonymous caller responded, ‘any word on Cisco...?’ ”

What was behind a tale like this, what allowed the stock market to so rivet American attention during the decade, was more than the drama of moneymaking. It was an unusual consensus that reached full flower in the 1990s and swept the field clear of objectors. On one side stood the claims of economic theory, conservative ideology and Wall Street self-interest that the stock (and bond and derivatives) market is the most -- indeed, the only -- efficient means of steering the nation’s savings to their best use. On the other were social theorists and public policymakers. Since at least the Great Depression, these people had distrusted the market as prone to unfairness and self-destruction. But as the century came to a close, they allowed themselves to be convinced otherwise. They came to view the market as America’s best hope of escaping a series of excruciating economic binds, especially the anticipated baby boom retirement crunch of the 21st century.

Mahar’s book and Roger Lowenstein’s “Origins of the Crash” can be read as briefs against the first strut of the 1990s consensus. In piling one financial perversion upon another, from Sunbeam to WorldCom to Enron, they ask, in effect, “What efficiency?” But their corruption case is not the strongest one that can be made against the stock mania of the decade. Instead, it’s the mounting evidence against the other strut of the 1990s consensus: the assumption that, left to their own devices, most Americans can invest their way to security, especially retirement security.

Mahar, quoting a 1992 Wall Street Journal story, puts the problem this way: “By the year 2000, employees will be managing $1 trillion of their own money in 401(k) retirement plans. What if they goof up?” In “Coming Up Short,” Boston College economists Alicia H. Munnell and Annika Sunden show that so many people have “goofed up” so much of the time that the shimmering promise of a private, voluntary, 401(k)-based retirement system, which so enticed policymakers and ordinary Americans, is essentially a chimera.

In a mere 20 years, 401(k)s and similar individual retirement accounts have mushroomed into the centerpiece of the nation’s private retirement system and are almost as important a vessel of the American dream as people’s homes. Curiously, most efforts to evaluate 401(k)s seem to have gone off on tangents. Economists have devoted immense energy to running simulations showing that if people started contributing with their first jobs and adjusted their investments along the way, they could end up with a load of money. But of course this raises the question of what people actually do. This is where Munnell and Sunden come in. They’ve organized their book around the sequence of decisions Americans must make about 401(k)s -- whether to participate, how much to contribute and so forth -- and their findings are disconcerting: “The evidence ... suggests that, although, in theory, workers could do very well under 401(k) plans, in practice they do not.”
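
The simulations’ optimistic arithmetic is easy to reproduce. Here is a minimal sketch of that kind of lifetime calculation; the salary, contribution rate, return and horizon below are illustrative assumptions, not figures from Munnell and Sunden:

```python
# A minimal sketch of the lifetime-contribution simulations economists run.
# All parameters (salary, contribution rate, annual return, horizon) are
# illustrative assumptions, not numbers from the book.

def balance_at_retirement(salary=40_000, contribution_rate=0.06,
                          annual_return=0.07, years=40):
    """Accumulate a steady 401(k) contribution with compound returns."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + salary * contribution_rate
    return balance

# Six percent of a $40,000 salary, saved for 40 years at a 7% return,
# compounds to roughly $479,000 -- the "load of money" the simulations promise.
print(f"${balance_at_retirement():,.0f}")
```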

Instead, large numbers stumble at every step in the process. More than one-quarter of those eligible to join a plan don’t. Only 10% of those who do join contribute the maximum allowed by law (which makes one wonder why President Bush called for higher contribution limits). More than half don’t follow the first rule of investment and diversify. Munnell and Sunden do not propose to toss out 401(k)s but to improve them. Still, their bottom line is so at odds with what most people expect that it throws into question all of the attention, affection and support that Americans lavished on stocks in the 1990s: The median combined balance of 401(k)s and IRAs for 45- to 54-year-old workers -- the first generation that will have to rely almost entirely on these accounts to supplement Social Security -- is $37,000, or about an extra $200 a month in retirement benefits.
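
The $37,000-to-$200-a-month conversion is ordinary annuity arithmetic. As a rough check (the payout horizon and interest rate here are illustrative assumptions, since the authors’ actuarial assumptions aren’t reproduced above), spreading $37,000 over 25 years at 4% annual interest gives:

```latex
% Level-annuity payout: balance B, monthly rate r, n monthly payments.
% The 25-year horizon and 4% rate are illustrative assumptions.
P = B\,\frac{r}{1-(1+r)^{-n}}
  = 37{,}000 \times \frac{0.04/12}{1-(1+0.04/12)^{-300}}
  \approx \$195 \text{ a month}
```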

If the stock market has failed to deliver on its promises of retirement security, what about the real economy? By most measures, the U.S. economy did very well in the 1990s, especially during the second half, when the gross domestic product grew in real, after-inflation terms at a 4% annual clip. The performance was so good, in fact, that a bevy of economists has written books designed to identify the key causes of the growth in order to reproduce or improve upon it.

Chief among these are Joseph E. Stiglitz’s “The Roaring Nineties” and Alan S. Blinder and Janet L. Yellen’s “The Fabulous Decade.” The authors have a lot in common. All three are respected economists who held policymaking positions in the Clinton administration and/or at the Federal Reserve during the 1990s. Their books have several things in common too -- things that hint at a deep ambivalence among mainstream economists about the ’90s and suggest that the period’s distinctiveness may lie elsewhere than in the simple fact of growth. Both make sweeping claims about the 1990s -- “The World’s Most Prosperous Decade,” screams Stiglitz’s subtitle -- that don’t bear up well under scrutiny. And both advance seemingly contradictory claims about their subject.

Stiglitz, who won the Nobel Prize for economics in 2001, claims that the Clinton administration’s accomplishments were considerable and that many contained the seeds of their own destruction. There is a not altogether appealing undertow of “If they’d only listened to me.” Blinder and Yellen maintain that Clinton’s 1993 deficit deal was the decade’s “fiscal turning point” and that the economy’s strong performance was the product of a series of unexpected developments, such as a decline in health-care costs.

One can almost feel the frustration in these arguments. For the first time since John F. Kennedy was president, some great gear had turned in the U.S. economy; some huge -- and, at least initially, positive -- force had been unleashed. But economic policymakers had almost nothing to do with it and economists could not even discern its shape.

What was it? To tackle the question, we turn to the New Economy.

By now, the New Economy has clogged bankruptcy courts. The techies who populated its far frontiers have gone off to law school or are looking into dry-cleaning franchises. Further criticism could be considered ungracious. But last year’s 50% rise in the Nasdaq composite index shows that not all of the wind has gone out of the New Economy dream. And several books -- even hedge fund manager Roger Alcaly’s strained defense in “The New Economy” -- demonstrate that there is plenty still to criticize. Indeed, a close reading of Alcaly, along with Doug Henwood’s “After the New Economy” and Century Foundation analyst Simon Head’s “The New Ruthless Economy,” suggests that a very different and more disturbing version of the New Economy may be the chief legacy of the 1990s to this decade and those that follow. As summarized by Henwood, the canonical New Economy account of the 1990s runs something like this:

“Finally, after a long wait, the computer revolution is paying off economically.... It took some time for people and organizations to learn how to use computers ... but now they’ve finally learned. All that [equipment], along with a political regime of smaller government and lighter regulation, has unleashed forces of innovation and wealth creation like the world has never known before.”

On this foundation, New Economy advocates erected a Magic Kingdom of spiritual claims, libertarian politics and market enthusiasms. Intentionally or not, the three books puncture most -- but not all -- of the advocates’ balloons. Henwood is particularly withering on the spiritual claims. Consider the short work he makes of Greenspan’s recurring fascination with the physical weight of U.S. GDP, a subject dear to New Economy cheerleaders. He starts with a 1988 article in which the Fed chairman argued that “if all the tons of grain, cotton, ore, coal, steel, cement ... that Americans produce were combined, their aggregate volume would not be much greater on a per capita basis than it was 50 or 75 years ago.” Henwood notes that the Fed’s own industrial production figures contradict this claim. Henwood moves on to 1998 congressional testimony in which Greenspan asserted that as a result of recent technological advances, “the physical weight of our GDP is growing only very gradually.” Here, he helpfully inserts the Web address of the New York City sanitation department, where one can learn that the city generates 26,000 tons of garbage a day.

Head is similarly tough on another New Economy conceit that underpins advocates’ libertarian politics: the notion that high technology is democratizing the American workplace by flattening corporate hierarchies and boosting the knowledge content of jobs. Head’s argument takes off from the simple but powerful observation that conditions in many tech-heavy workplaces look a lot more like those on the factory assembly lines of the 19th and early 20th centuries than like the sunny 21st century think tanks of the New Economy. Head asserts that corporate America’s ambition to use technology to expand factory floor-like conditions extends well beyond the computer software mills and telephone call centers to the highest reaches of white-collar employment, including health care.

By contrast, Alcaly desperately wants to save the original New Economy agenda but does so in a way that reveals some of that agenda’s more terrifying aspects and suggests what actually may be underway in the United States. He insists that we take the long view in examining the New Economy, and he looks back nearly two centuries for its roots. In the process, he comes dangerously close to losing perspective, at one point writing that while living standards improved over the last century, “there were also many bumps in the road, particularly the Great Depression....” Some bump.

Since the bust of 2000, some Americans have been looking for signs that their retirement accounts will recover. Others have focused on the unemployment rate, hoping their chances of landing a job will improve. But mainstream economists have fixed on another, more arcane number: productivity, or output per hour worked. What they’ve seen has caused them to launch new studies and to revise old views, especially about the New Economy. The public was generally aware of productivity in the 1990s, when there was much hand-waving about how lightning-fast computers and globe-girdling telecommunications were improving it. But with the bust and Sept. 11, the idea lost much of its charm.

What the public is not aware of is that mainstream economists have generally moved in the opposite direction. Leading economists were deeply skeptical that 1990s productivity improvements reached beyond the computer and telecom industries. But they have since become very impressed. In part, that’s because most economists bought the Blinder-Yellen theory that growth during the decade was the result of lucky breaks and expected that, as the breaks disappeared, gains would trail off. Instead, productivity has kept right on rising.

In part, it’s because economists have finally begun to figure out precisely how computers and related gear are affecting productivity. The picture that’s emerging is nowhere near as dazzling as the one painted by New Economy enthusiasts. And its probable effect on working Americans is no longer unceasingly bright.

Big case studies, like the McKinsey Global Institute’s 2001 volume, “U.S. Productivity Growth 1995-2000,” show that mainstream economists were wrong in the last decade when they concluded that productivity gains were confined to high-tech industries. Instead, some decidedly run-of-the-mill businesses, like retailing and warehousing, were among the biggest gainers. On the other side, the McKinsey study shows that New Economy advocates were wrong in suggesting that the process is as simple as equating a new computer with greater productivity. The study argues that it is not the machines themselves so much as how companies bend their organizations -- and their employees -- around them that determines productivity.

This raises the question of what all these changes mean for working Americans. One way to appreciate what is at stake is to look at the problem through the eyes of theorist Ronald H. Coase, a Nobel Prize winner and conservative economist who wrote a seminal paper in the 1930s in which he posed the seemingly innocent questions: Why are there firms? Why doesn’t everybody simply buy what they need to do their jobs in the market, then sell the product to the highest bidder? Coase’s answer is that they would, if it weren’t for the fact that gathering the information needed for all these purchases and sales is costly. So, he reasoned, the cost of information determines the size and shape of firms.
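
Coase’s decision rule is simple enough to put in a few lines of code. The sketch below is purely illustrative -- the cost figures and the single information-cost parameter are invented for this example, not drawn from Coase -- but it shows how firms swell when information is dear and shrink when it is cheap:

```python
# A toy version of Coase's make-or-buy rule. A firm internalizes a
# transaction only while internal coordination costs less than buying
# in the market, where every purchase also incurs an information cost
# (search, negotiation, enforcement). All numbers are invented.

def firm_boundary(transactions, information_cost):
    """Return the names of transactions the firm keeps in-house."""
    return [name
            for name, internal_cost, market_price in transactions
            if internal_cost < market_price + information_cost]

transactions = [
    ("payroll", 10.0, 9.0),    # (name, internal cost, market price)
    ("design", 12.0, 11.0),
    ("assembly", 8.0, 8.5),
]

# Expensive information: the firm does everything itself.
print(firm_boundary(transactions, information_cost=2.0))  # all three
# Computers drive information costs down: the firm hollows out.
print(firm_boundary(transactions, information_cost=0.1))  # ['assembly']
```

The point of the toy is that nothing about the transactions changes, only the cost of information -- and the firm’s boundary moves.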

The implication for the present is that if, as more economists believe, computers really are bringing down the cost of information, then they have thrown open the barn door on virtually everything about what a corporation is and how it should operate. Such a conclusion has some surprising -- and disconcerting -- implications.

Take the disastrous January 2000 merger of America Online and Time Warner, which unwound within two years at a cost of $100 billion. If computers and productivity really are changing the rules on corporations, then the executives involved didn’t have the faintest idea what they were doing. They were simply trying out a new corporate shape to see if it fit. Or consider the stock market, where share prices will almost certainly swing wildly in coming years as investors try to gauge not just how companies do but which will survive.

Or, most important of all, think about working Americans. One of the great mysteries of the 1990s was why working people did not demand higher compensation as firms clamored for new employees. Greenspan often put it down to “worker insecurity” but left the sources largely unexplored.

But what if Americans know in their bones what the nub of the New Economy story really is: that all the sticks are in the air about what a job is, how long it lasts and what happens next, and that, as Alcaly would agree, it’s going to be this way for a very long time?
