No ‘Silver Bullet’ for Software’s Growing Complexity

Software complexity and its effects on society and the economy are likely to be among the more significant problems we’ll face in the next few decades, and the situation appears to be getting worse rather than better.

Recently I worked long into the night setting up a digital subscriber line Internet connection and an office computer network for a small nonprofit organization. After wading through a thicket of acronyms such as DHCP and NAT and staring at uncooperative screens for a long time, I called technical support for the Macintosh operating system. Four hours later, we discovered an obscure check-box option that should have been checked but wasn’t. After that, everything worked fine--until the next software glitch a few days later.

Such experiences are increasingly familiar to computer users, and they are having broad and costly effects on the economy.

Last month at the Department of Commerce’s Technology Opportunities Program convention in Arlington, Va., keynote speaker Mario Marino, a venture capitalist and philanthropist, estimated that 90% of nonprofit organizations in the U.S. cannot afford to have more than 15 networked computers. The technical-support requirements for more machines are just too expensive. There was no disagreement among the nonprofit managers in the audience.

In Austin, Texas, where I live, the local school district has roughly 2,500 computers for every tech-support person. At that ratio, a majority of the district’s computers get no attention at all. One local high school has computers still sitting in boxes, months after their purchase, because no one is available who knows how to set them up.

Andrew Grove, a co-founder and the chairman of Intel Corp., once joked that if current growth in the tech-support field continues, early in the new century every person on the planet will have to be a tech-support specialist. Intel itself employs more than 5,000 technical-support workers.

Much of this is due to software complexity, program bugs and poor quality in software development, which together add up to an immense burden on the economy.

Capers Jones, chief scientist for the consulting firm Software Productivity Research in Burlington, Mass., has estimated that “about 60% of the U.S. software work force are engaged in fixing errors which might have been avoided.” Moreover, he writes, for software engineers, “only about 47 working days in a full calendar year are available for actually developing or enhancing software applications.” The rest of their time, about 150 days, is spent on testing, fixing bugs and working on projects that are later canceled.

Jones concludes in his published paper, “There would probably be no software labor shortage if software quality could be brought under full control.”

Jones writes, “Much of the work of software engineering is basically ‘wasted’ because it concerns either working on projects that will not be completed, or working on repairing defects that should not be present at all.”

Jones’ figures do not even address the vast overhead of tech support and maintenance required within organizations that use even modestly sophisticated computer software and networks.

In 1986, computer scientist Frederick P. Brooks--this year’s winner of the Turing Award, something akin to a Nobel Prize in computer science--wrote an influential essay titled “No Silver Bullet.” Brooks compared large software projects to werewolves “because they transform unexpectedly from the familiar into horrors.” In legend, Brooks noted, werewolves can be killed only with silver bullets, and so people have searched for a “silver bullet” to end software’s complexities and difficulties. There is none, he argued, and there will be none.

One problem is that software projects begin with a nearly limitless number of possible approaches. On top of that, software has to work with many different and uncontrollable variables: other software, hardware of enormous variety and users with widely varying skills.

On the user end, repeated experiences with software glitches tend to narrow one’s use of computers to the familiar and routine. Studies have shown that most users rely on less than 10% of the features of common programs such as Microsoft Word or Netscape Communicator. It takes a high tolerance for frustration and failure to explore beyond the boundaries of one’s own comfort level. This adds to the exasperation of tech-support personnel, who often don’t understand why users are reluctant to venture into the unfamiliar features of a program. It also calls into question how much money and energy we spend on new software features that most people don’t use or even know about.

“This is just a national scandal, this problem with software complexity and unreliability,” says Leon Kappelman, director of the Information Systems Research Center at the University of North Texas in Denton. “No one should have to put up with computers being so unreliable or so difficult. We don’t put up with this with any other product we use.”

Across the country these days, community and national leaders are talking about issues such as the “digital divide,” the severe shortage of technically skilled workers, massive investments in education to raise the skills of young people and of workers willing to be retrained, and the way high salaries in tech fields are transforming neighborhoods.

But few people are talking about how to make technology easier to use. There’s a universal assumption that people will have to adjust to the rampant, irrational and escalating complexity of a hyper-technologized society--or fall into the ranks of the losers and the ignorant. This split is likely to characterize modern life in the 21st century.

*

Gary Chapman is director of the 21st Century Project at the University of Texas at Austin. He can be reached at gary.chapman@mail.utexas.edu.
