Awareness Is Helping Cool the Y2K Fever

TIMES STAFF WRITER

After two years of doom-and-gloom pronouncements about the year 2000 problem, a growing cadre of Y2K experts has begun to recast its predictions of potential calamity into a tamer vision of the millennium bug.

This month, Canadian speaker and computer consultant Peter de Jager, one of the earliest and most vocal Y2K pessimists, published an article on his Web site titled “Doomsday Avoided”--a play on his first article on the topic, written six years ago and titled “Doomsday 2000.”

“We’ve finally broken the back of the Y2K problem,” de Jager wrote. “Most if not all companies are working on this issue. They are fixing, or have fixed, their systems. They have examined, or are examining, their embedded system problems. We are, for the most part, no longer ignoring Y2K.”

Edward Yardeni, chief economist for the investment banking firm Deutsche Bank Securities Inc. and one of the most persistent drumbeaters on the Y2K issue, recently lowered his estimate of the chance of a long global recession caused by the glitch from 70% to 45%.

“I’ve toned down the message partly because progress has been made,” Yardeni said. “I would be happy to back off entirely.”

Although it is unlikely that Jan. 1, 2000, will bring an Information Age blackout, it is still entirely possible that serious problems are looming.

But Yardeni and de Jager are among the most prominent examples of what has become a discernible turn in the mood surrounding the Y2K problem. The alarms, at least in the United States, have begun to subside, replaced with a parade of repair statistics and completion percentages.

Even though the latest cost estimates are skyrocketing into the trillion-dollar range and reports from overseas paint an ever gloomier picture of inattention, there is a much stronger sense, at least in the United States, that the problem is being taken seriously.

Many experts say the changing attitude is partly due to “spin control” from companies and government agencies that have realized the panic and bad publicity that come with poor repair reports.

The result is an often-confusing release of information that seems to be both good and bad at the same time.

Last month, California’s Department of Information Technology reported that 75% of its critical systems had been cleansed of the Y2K problem. But just three days later, the auditor general’s office released a report that said two-thirds of the systems were not ready.

Even as federal Y2K czar John Koskinen was reassuring the public of the government’s progress, a congressional report was giving a failing grade to such critical agencies as the Federal Aviation Administration and the State Department. Both agencies promise to be ready well before the new year.

“Some of the stuff people are saying is just a load of hogwash,” said Kazim Isfahani, a Y2K analyst for Cambridge, Mass.-based Giga Information Group Inc., an information technology research and consulting firm. “A lot of companies, associations and agencies have realized that putting a positive spin on the year 2000 is vital.”

But amid all the hype, there is a sense, even among pessimists, that things are actually getting better and that progress is being made.

Out of the panic that seemed to infuse the issue over the last two years has come a calmer interpretation, driven by the sense that most businesses and government agencies are at least aware of the problem and are working to repair it.

The year 2000 problem is a simple technical issue that stems from the long tradition in computer programming of abbreviating years to two digits. In 2000, computers could become confused because of the ambiguity of two-digit years. For example, “00” could mean either “2000” or “1900.”
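
The arithmetic failure is easy to demonstrate. Below is a minimal sketch in Python (most affected legacy systems were written in languages such as COBOL; the function names and the pivot value here are illustrative assumptions, not taken from any particular system) showing the naive two-digit subtraction and the “windowing” repair many remediation teams applied:

```python
# A minimal sketch of the two-digit-year bug. Function names and the
# pivot value are illustrative; real systems of the era were mostly
# written in COBOL and similar languages, not Python.

def years_elapsed(start_yy, end_yy):
    """Naive subtraction of two-digit years, as many legacy programs did."""
    return end_yy - start_yy

print(years_elapsed(99, 0))   # -99, not the expected 1 (1999 -> 2000)

def expand_year(yy, pivot=30):
    """'Windowing' repair: 00-29 read as 2000-2029, 30-99 as 1930-1999."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(99))                    # 1999
print(expand_year(0))                     # 2000
print(expand_year(0) - expand_year(99))   # 1, as expected
```

Windowing was only a stopgap: it pushes the ambiguity out to the pivot year rather than eliminating it, which is why full four-digit expansion was preferred wherever code and stored data could be changed.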

Some parts of the Y2K problem, such as its effect on embedded microprocessors--critical components that control automated processes in power plants and hospital equipment--have turned out to be far less pervasive than previously believed.

Giga Information Group was one of many companies that sounded the alarm, warning that embedded microprocessors could be the most serious aspect of the Y2K problem because of their widespread use in critical industries.

But in January, the company released a new report, titled “It May Rain, but the Sky Won’t Fall,” stating that problems with embedded microprocessors “will not have the crippling effect as originally thought.”

Alistair Stewart, a senior adviser for Giga on embedded systems, said that only about 3% of chips have been found to have minor problems, typically requiring resetting the date or restarting a device. The percentage of chips that experience outright failures is “so small as to be statistically insignificant,” he said.

“There won’t be a systemic shutdown,” Stewart said. “You will have some localized inconveniences with some localized failures.”

Phone Companies Are Encouraged

Consumer electronics firms have turned up only a tiny number of defective products after testing tens of thousands of devices.

The Telco Year 2000 Forum, an organization made up of eight of the nation’s leading phone companies, released a report this month saying it found no serious problems in nearly 2,000 tests.

Even the federal government--the perennial sick man of the Y2K movement--has had its share of good news. The same congressional report that flunked the FAA also gave top grades to such important agencies as the Nuclear Regulatory Commission, Social Security Administration, Environmental Protection Agency and Department of Housing and Urban Development.

Isfahani said much of the mood change is due to real improvements, particularly in the flow of information that has partly taken the Y2K problem out of the realm of hypothetical dangers and imagined catastrophe.

“We’re at the point where we can point to specific events that indicate progress,” he said. “A year ago, we didn’t have that, and a lot of what people were saying was because there was no information.”

Lou Marcoccio, Y2K research director for the Stamford, Conn.-based information technology consulting firm GartnerGroup Inc., said the millennium bug has become a relatively well-understood phenomenon over the last two years.

“Two years ago, we didn’t know the extreme detail we know now,” he said. “There was a lot more guesswork. But we don’t have to guess much now. We are capturing realistic information on what is really being done on a frequent basis.”

Marcoccio said most of the research points to little impact on most consumers and businesses, although small businesses and those with overseas connections continue to be vulnerable.

He added that the one revision from GartnerGroup is a substantial increase in its estimate of total Y2K costs. The company originally estimated that repair costs would total $300 billion to $600 billion.

Marcoccio said the figure has now been raised to $2 trillion to account for the amount companies are spending on risk assessment and contingency plans.

The possibility of a flood of lawsuits stemming from Y2K problems also remains an issue that could add billions to the already enormous tab.

The progress in repairs and the increased flow of information have not turned the year 2000 into a love fest. There are still glaring weak spots in the repair effort.

A report by a special Senate committee last month singled out the health-care industry as significantly lagging other sectors. “Because of limited resources and lack of awareness, rural and inner-city hospitals have particularly high Y2K risk exposure,” the report said.

Edward Yourdon, a respected software engineering consultant who has written 25 books on software design and management, said he remains pessimistic about the situation because of the short time left and the unrealistic projections of project managers.

“If anything, I’ve grown more pessimistic,” he said.

Yourdon said Y2K repairs have followed the pattern of other large programming projects. Typically, 15% of those projects do not finish on time. Repairs and new programming usually introduce about one new error for every 1,000 lines of software code, he said.

“This is what happens in normal projects,” he said. “Clearly, we have a situation where people desperately want to believe that everything is on schedule.”

It’s a Different Story Overseas

Yardeni, of Deutsche Bank Securities, said that even though he has reduced his estimate of the probability of a global recession, information about slow repairs by small companies and dismal efforts overseas has partially offset that progress.

“A year ago, I would have said the main problem was the federal government,” Yardeni said. “Now we’re getting more reports about the situation overseas, and the reports are not good.”

The World Bank, the U.S. government and other entities have focused on the slow pace of repairs overseas as a weak link.

The State Department announced this month that it is considering issuing travel warnings and preparing evacuation plans for Americans living in countries that experience serious Y2K problems.

But even the pessimists concede that despite the continuing flow of grim news, there has been a marked change in how the millennium bug is discussed these days.

After two years of warnings and alarms, few people still debate whether a problem exists. It is accepted as a valid and serious problem and, to many of the pundits and consultants, the debate now is largely about getting the job done.

“We’ve overcome the largest Y2K hurdle,” de Jager wrote in “Doomsday Avoided.”

“The Y2K problem was never the actual act of fixing code; it was the inaction and denial regarding a problem so easily demonstrated as real and pressing.”
