Be Thankful Pythagoras Didn’t Have a PC
I knew that I would never be a prodigy after reading a childhood story about the great mathematician Carl Friedrich Gauss.
It seems that his teacher, wanting to buy some free time, told his charges to add up all the numbers from 1 to 100. All the obedient little German schoolchildren started scratching away on their slates. The young Gauss closed his eyes, thought for a brief moment and scribbled a number on his slate: 5,050. He then tossed it onto the teacher’s desk, saying, “Ligget se”--there it is.
How did he do it? I figured it was either some obscure “mathemagical” trick or the kid had a Hewlett-Packard calculator implanted in his skull. I was wrong. Instead of adding up a hundred numbers, Gauss spotted a pattern that even a 10-year-old dolt could have seen had he been clever enough to look for it: 1 plus 100 totals 101; 2 plus 99 totals 101; 3 plus 98 totals 101, and so on. There are 50 such pairs, so the answer is simply 50 times 101: 5,050. Ligget se!
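Gauss’s pairing trick generalizes to the closed-form identity 1 + 2 + … + n = n(n + 1)/2. A minimal sketch (the function names are my own) setting the insight beside the slate-scratching approach:

```python
def gauss_sum(n):
    """Sum 1..n by Gauss's pairing trick: n/2 pairs, each totaling n + 1."""
    return n * (n + 1) // 2

def brute_force_sum(n):
    """What the other schoolchildren were doing on their slates."""
    return sum(range(1, n + 1))

print(gauss_sum(100))  # 5050 -- Ligget se!
```

The pattern does in one multiplication what the drudge-work approach does in a hundred additions.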
The simple elegance of Gauss’s solution blew me away. It still impresses me. How many of us spend the day drearily adding up all the numbers instead of recognizing the obvious (if you think about them) patterns that can swiftly solve the problem? How many problems do we face daily that would vanish if we could just get some glimmer of insight into the underlying patterns? Wouldn’t it be great if there were some tool or technology that could help open our eyes to these patterns?

Unfortunately, that’s exactly where our technologies fail us. It’s nothing to an IBM PC to add up all the numbers from 1 to 1,000 and back again. If Gauss (1777-1855) had had an HP-35 calculator, he might not have bothered to close his eyes and think. For all intents and purposes, doing spreadsheets on a Macintosh, editing on an Epson or modeling on a Sun is all the same software shtick. The dominant design theme is that a computer and its software are systems built to automate thought. The technology peels away the drudgery of repetition and routine from intellectual labor. Call it the Information Age, the Personal Computer Revolution, whatever. But it’s basically white-collar robotics.
Want another example? Consider “expert systems,” artificial intelligence programs designed to codify human decision-making expertise in software. While that’s a link up the value chain from most personal computer software, the goal remains white-collar automation: let’s replace experts with expert systems. For obvious reasons, this design philosophy has sparked a lot of wariness and resentment from experts whose brains are being tapped.
The problem is that virtually all these computing technologies are designed to substitute for human thought instead of complement it. This is ultimately both foolish and dangerous. The issue shouldn’t be automating intellect; it should be augmenting it.
This isn’t a distinction without a difference; it represents a radically different approach from the kind of software that now dominates the market. Automation eliminates thought; augmentation incites it. Microsoft Word and Lotus 1-2-3 may be terrific programs that make life a little easier in writing and finance--but they don’t make people better writers or better financiers. They’re as evocative as a blank sheet of paper.
What we need is software that can prompt and cajole and point users into looking for the best patterns of solutions. Computer pioneer and electrical engineer Vannevar Bush proposed the idea of a “memex” back in 1945--sort of a software “agent” that would be smart enough to track down relevant information in computerized libraries.
Mathematician John von Neumann, widely credited with conceiving the idea of the digital computer back in the 1940s, said that software should be like “an intelligent graduate student"--just tell it what you need to know and it will come back with lots of interesting information and ideas.
Sponsored by the Pentagon’s Advanced Research Projects Agency, Douglas Engelbart built prototypes of computer tools designed explicitly to augment how people solved technical and conceptual problems. (Engelbart now explores these questions as head of the Bootstrap Foundation at Stanford University.) Apple Computer scientist Alan Kay has long championed the idea that software agents are the destiny of personal computing--although Apple has hardly been a pioneer in this area.
The fact is, there has been remarkably little software designed to augment thought. One product--IdeaFisher from Fisher Idea Systems in Irvine--is a clever but ultimately flawed effort to get people to brainstorm new ideas. The program relies on getting you to specify desirable attributes of the idea and then prompts you with questions and related key words intended to spark further thought. This is sort of a software firecracker approach, more noise than illumination, but it does get the user’s attention.
IdeaFisher is one of the first and few examples of what must eventually become a new class of software. Increasingly, the software that we use is going to be spiced with subroutines and expertise designed to augment the way we work with information. Word processing programs that contain a spell checker and thesaurus will soon contain things like Bartlett’s Quotations and Strunk & White’s “Elements of Style.” The program will generate a host of relevant quotations based on the frequency of key words used in the document. Those quotes may inspire further thought or be woven into the text. Similarly, awkward phrasing might trigger the appropriate passage from “Elements of Style” for a bit of pungent editorial advice.
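The quotation feature imagined above could work roughly like this sketch, assuming a keyword-indexed quotation table; the miniature index, the quote strings and the function name are all invented for illustration:

```python
from collections import Counter

# Hypothetical miniature stand-in for a keyword-indexed Bartlett's.
QUOTE_INDEX = {
    "pattern": "Art is the imposition of a pattern on experience.",
    "thought": "A great many people think they are thinking.",
    "time": "Time is money.",
}

def suggest_quotes(document, top_n=2):
    """Rank the document's words by frequency, then surface quotations
    indexed under the most frequently used keywords."""
    words = Counter(w.strip(".,;:!?").lower() for w in document.split())
    ranked = [w for w, _ in words.most_common() if w in QUOTE_INDEX]
    return [QUOTE_INDEX[w] for w in ranked[:top_n]]
```

A real product would index thousands of quotations and weigh context, not just raw frequency, but the frequency-driven lookup is the core of the idea.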
Spreadsheets will be imbued with comparable pattern-recognition capabilities. The software will be capable of questioning the assumptions used in the forecasting model--and pointing out logical inconsistencies. It’s not out of the question that expert systems can be grafted into the spreadsheet to encourage users to explore alternate modeling assumptions and suggest other ways to achieve budget goals.
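One simple form such assumption-checking could take, sketched here with an invented rule (flag any line item projected to grow faster than revenue--a pattern easy to miss in a large sheet):

```python
def flag_inconsistencies(model):
    """model maps line-item name -> assumed annual growth rate.
    Flag expense items assumed to outgrow revenue, so the user is
    prompted to rethink the assumption rather than just total it up."""
    revenue_growth = model["revenue"]
    return [item for item, growth in model.items()
            if item != "revenue" and growth > revenue_growth]

forecast = {"revenue": 0.08, "salaries": 0.12, "rent": 0.03, "marketing": 0.15}
```

Running the check on this forecast would flag salaries and marketing--not to overrule the forecaster, but to incite a second thought, which is the whole augmentation point.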
Some software designers argue that the reason we haven’t seen more augmentation-oriented software is that it’s too difficult to design and, besides, the market doesn’t want it. That’s rubbish. There hasn’t been enough good augmentation-oriented software out there for the market to decide anything. And of course it’s hard to design; most designers at Lotus, Microsoft, Ashton-Tate, Borland and the rest are slaves to the automation ethic. They don’t think in augmentation terms. If you’ll excuse the term, they need to augment their approach to software design.
Actually, I’m fairly confident we’re going to see augmentation-oriented software sooner rather than later. The reason is simple: so much of today’s personal software is degenerating into commodity functionality that companies are going to grab onto anything to differentiate their products.
Yes, most companies will opt to make their software faster and cheaper. The smarter firms will recognize that people want more value for their money--and imbuing software with an augmentation-oriented design ethic could be the best way to do that. A Gauss may only come along once or twice a century; the tools that help us think a little more clearly, crisply and creatively should appear with much greater frequency.