Anticipated Computer Chaos Is a No-Show

TIMES STAFF WRITER

The world did not come to an end.

Planes did not fall out of the sky. Nuclear reactors did not melt down. ICBMs stayed in their silos, ready to unleash their own form of technological Armageddon another day.

After years of anxiety, the peculiar technological glitch known as Y2K washed across the globe at midnight, leaving only the faintest traces of mischief.

“We’re finding no significant incidents and the anecdotes we’re hearing are all laughable,” said Bruce McConnell, director of the United Nations-supported International Year 2000 Coordination Council. “We’ve been predicting very few problems, but this is better than I expected.”

Of the few problems reported around the world, most were minor. In Japan, a computer linked to a nuclear plant monitoring device failed, but it wasn’t considered serious enough to shut the plant.

Ticketing machines on some buses in the Australian cities of Adelaide and Hobart briefly jammed, and a provincial court in South Korea reported that it had issued automated summonses to 170 people to appear for trial on Jan. 4, 1900, instead of Jan. 4, 2000.

In the grand scheme of modern life, where technological failures like car breakdowns and computer malfunctions are routine irritants, the anticipated New Year’s Day showdown with year 2000 computer problems turned out to be a grand anticlimax.

Perhaps the biggest reason for the tranquility on New Year’s Day was the massive world effort to fix the problem--a task that ultimately cost hundreds of billions of dollars.

The effort was helped by the rapid pace of technological advances in the past few years in which older computer systems were replaced.

The Y2K problem was caused by a simple shortcut that had been used by programmers since the earliest days of computing. To conserve memory and disk space, programmers used only two digits to represent years. For example, 1943 would be recorded as simply 43.

The system worked fine as long as all the dates were in the same century, but in 2000, they became ambiguous to some computers. For example, the year 00 could stand for either 1900 or 2000, resulting in a host of possible malfunctions and miscalculations.
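How a two-digit year trips up date arithmetic can be shown in a few lines of code. The sketch below is purely illustrative (a hypothetical example, not code from any actual Y2K-era system); the “windowing” function at the end shows one common repair technique, which reinterprets two-digit years around a pivot instead of rewriting every stored date.

```python
# A minimal, hypothetical sketch of the two-digit shortcut and its failure.

def years_elapsed(birth_yy: int, current_yy: int) -> int:
    """Compute elapsed years the way a legacy system with two-digit
    year fields would: by naive subtraction."""
    return current_yy - birth_yy

# Within one century the shortcut works: born in '43, checked in '99.
print(years_elapsed(43, 99))  # 56 -- correct

# Across the century boundary it breaks: "00" is ambiguous, and the
# subtraction silently treats 2000 as 1900.
print(years_elapsed(43, 0))   # -43 -- the classic Y2K miscalculation

def expand_year(yy: int, pivot: int = 30) -> int:
    """One common repair, "windowing": interpret two-digit years
    relative to a pivot instead of rewriting every stored date.
    Years below the pivot are read as 20xx, the rest as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(43))  # 1943
print(expand_year(0))   # 2000
print(expand_year(0) - expand_year(43))  # 57 -- correct once expanded
```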

The use of two-digit dates eventually found its way into microprocessors that controlled factory machines, programs for personal computers, and some appliances.

‘You Could Say We Have Won the War’

Many programmers realized the use of two digits would eventually cause problems, but few imagined that either the machines or the programs would last so long.

Experts cautioned, however, that the coming months are certain to bring more malfunctions as computer systems stumble over “00,” the two-digit abbreviation for the year 2000 that lies at the heart of the glitch.

The Gartner Group, a technology research and consulting firm based in Stamford, Conn., has estimated that more than half of all Y2K failures will take place in the coming year.

One of the first key dates will be Monday, the first regular business day of the year, said Lou Marcoccio, year 2000 research director for the Gartner Group.

But he said getting through Jan. 1 was a critical victory because it showed that basic industries, such as electrical and water utilities and telephone companies, would enter 2000 in good shape.

“You could say we have won the war, as far as basic infrastructure issues are concerned,” Marcoccio said. “We didn’t expect many problems in this area, and it looks like things are going pretty well.”

The year 2000 problem has arguably been one of the grandest technological blunders in history.

The estimates for repairing the problem and dealing with the potential wave of litigation over malfunctions have ranged up to $1 trillion worldwide.

The United States spent more than any other country to repair the glitch, investing an estimated $150 billion to $225 billion, according to the Gartner Group.

The U.S. government spent $8.4 billion, a princely sum that the president’s Y2K czar, John A. Koskinen, defended as necessary.

“I don’t think we should underestimate the problem that was originally there,” he told Reuters.

The global cost of fixing the year 2000 problem put it in the same league as some of the most costly events and disasters of our time--the Vietnam War (U.S. cost of $755 billion, from 1959 to 1975), the Korean War (U.S. cost of $352.2 billion, from 1950 to 1953) and Hurricane Andrew in 1992 ($40 billion)--according to various estimates.

Bill Schoen, a mainframe computer programmer in Detroit, was perhaps the first person to sound the alarm over the year 2000 problem, forming a company in 1983 to help large businesses rid themselves of the millennium bug.

He was ridiculed in the industry, not because he was wrong, but because people thought he was an idiot for suggesting that the problem had to be fixed with so many years to go before 2000.

“They basically told me to buzz off,” said Schoen, who still works as a programmer in the Detroit area. “The people whose job it was to stop this thing did nothing.”

The Problem Grew to Mammoth Size

What engineers failed to realize was how stubbornly technology clings to life. As mainframe computer programs grew larger and more capable, they became almost too complex to be easily or cheaply replaced.

By the 1990s, when many large companies began waking up to the problem, fixing the glitch had become a mammoth undertaking.

Bank of America, for example, spent more than a half-billion dollars repairing the problem.

The state of California had to deal with more than 125,000 desktop and mainframe computers running dozens of computer languages, all tied into about 1,800 computer networks.

The problem for the state, as with other large operations, was not just the size of its networks, but also the fact that many of the systems were interconnected. So just getting one system repaired would not work; everything had to be fixed together.

Elias Cortez, the state’s chief information officer, said that what worked to the state’s advantage--and that of other large enterprises--was that they were large enough to have the resources and expertise to tackle the problem.

Most companies, and even some governments, faced much smaller Y2K problems than California or Bank of America.

Devin Barber, information systems manager for Yuba City, said the year 2000 was barely a problem because the city had planned ahead and had relatively few computers to repair.

Yuba City had run all of its computer services off an aging NCR computer. In 1996, it began planning to replace the machine with a new system at a cost of about $690,000.

The new system was largely Y2K-ready, and so were almost all the new programs the city would use. Along with updating its main computer, the city also upgraded some of its 100 or so personal computers.

“Y2K wasn’t a big deal for us,” Barber said. “It was all about timing. I don’t think it was very painful for most small businesses. For big companies, it was probably very painful.”

Yet high-tech companies generally had an even easier time of things because most of their systems are relatively new and needed few adjustments. Amazon.com spent only $1 million to repair its systems despite its reliance on complex computer systems.

In Every Nook and Cranny of Modern Life

The world’s biggest software maker, Microsoft Corp., reported Friday that it was receiving fewer phone calls from customers than it expected. Most were questions about how to install the software upgrades the company had released to repair its programs.

“This has been so smooth,” said Microsoft spokesman Dan Leach. “Even if the average homeowner does nothing, they’re going to be pleasantly surprised that their computers will be fine.”

One major area of concern in the industry was the billions of tiny microprocessors that had found their way into every nook and cranny of modern life. These so-called embedded devices are used to control everything from traffic signals to elevators and factory machinery.

This part of the Y2K problem was viewed as hopeless at first. There were simply too many of these devices to deal with.

But as it turned out, the number of embedded chips that actually used the date in some way was minuscule. The chips that were bad, however, tended to be used so frequently that they were easy to identify and replace.

At the San Onofre nuclear power plant, engineers identified 190,000 embedded chip devices. But of those, only 35 types of devices had to be repaired.

“The numbers were tiny,” said Robert Haverkamp, year 2000 project manager for San Onofre.

The Human Element Was Biggest Unknown

For all the angst over the possibility of a technological meltdown, the biggest unknown turned out to be the human element.

Fears of widespread breakdowns reached a high point at the beginning of 1999, with thousands of people rushing to stock up on survival gear and supplies.

Stephen Quayle, chief executive of Safe-Trek Outfitters in Bozeman, Mont., a supplier of survival goods, said his business tripled in 1998 and doubled again in 1999.

But by last summer, the orders for freeze-dried food and other supplies began to slow as more companies and government agencies announced the completion of their Y2K repairs.

“The pendulum totally swung in the other direction,” Quayle said. “That’s great. I can’t wait for things to get back to normal, but it’s been going on so long, I don’t remember what normal is.”
