Raging Against the Machine

Theodore Roszak is professor emeritus of history at California State University, Hayward. His books include "The Making of a Counter Culture," "The Cult of Information" and most recently the novel "The Devil and Daniel Silverman" (Leapfrog Press, 2003).

Who played in the Super Bowl in 1984? Not many people can remember -- fewer, I’ll bet, than remember the woman who came sprinting across the television screen at halftime to toss a great big hammer at a glowering Big Brother. Talk about coming on strong. That was how Apple Computer announced the first Macintosh: a 60-second Orwellian mini-drama directed by Ridley Scott that was destined to become perhaps the most famous commercial ever made.

It’s 20 years later, and Apple has “repurposed” the ad to help sell iPods as Super Bowl XXXVIII rolls around. Once again you can see an insurgent little company advertising itself as the hope of the human race. Brash as it was, that commercial embodied the utopian future so many people saw in the computer just two decades ago.

Of course, Apple got a lot of things wrong.

First, the casting. In 1984, the cognoscenti saw Big Brother as IBM, which dominated the PC market at the time. But the future of the computer industry didn’t belong to IBM’s machines; it belonged to the disk operating system IBM had licensed to run its machines. That was a program called DOS, supplied by a little-known firm called Microsoft. Apple never saw it coming, but Bill Gates would become the Big Brother of modern computing. Before the end of the decade, he would, shall we say, “borrow” the Macintosh graphical interface, call it Windows, and capture the industry.

Nothing did more to ruin the high hopes represented by Apple’s hammer-tossing woman than the dominance of Microsoft, soon to become the most ruthless monopoly since Standard Oil. The result has been inferior technology cleverly contrived to keep the public buying one mediocre and buggy program after another. But then, what would you expect from a company that seems to make as much money from litigation as from invention?

Given the commercial opportunism with which Microsoft has contaminated the industry, it’s difficult now to recapture the ebullience that originally greeted the personal computer. This was not simply a machine, it was a dream, a cause, an ideal. The hackers who tinkered the first computers into existence were driven by high social expectations. They were bringing humankind the great gift of information -- endless amounts of free information.

Even when the Internet was nothing more than a restricted military messaging system, enthusiasts envisioned a day when politically restive millions would network their aspirations and talents via computer. All they had were funky little CPUs that scrolled sickly green letters and numbers at a snail’s pace across a 6-inch screen, but that was enough, they said, to build the New Jerusalem.

The PC was considered a people’s technology, a guerrilla technology, one of the last gasps of countercultural rebellion. In a larger sense, Big Brother in Apple’s “1984” TV spot was not just IBM but the elephantine military-industrial complex. It was everything big and domineering and slick, the whole corporate world of men in suits.

Apple’s idealism was marvelous, but how sadly misplaced. Perhaps we can see that now in the wake of the dot-com bust. We have watched high tech become the next wave in big-bucks global industrialism, the property of the crass and the cunning, who are no more interested in empowering the people than General Motors was.

The computer has brought us convenience and amusement, but, like all technology, it’s a mixed blessing. Far from smashing Big Brother, computers have given him more control over our lives. They have been a blessing for snoops, con artists and market manipulators. They have turned global communications into glitchy, virus-plagued networks. Along with some highly valuable resources, the World Wide Web has brought a time-wasting flood of trivia, trash, pornography and spam. We have burdened our children with the distractions of becoming computer literate before they are just plain old literate.

Some would say that it’s the sign of a mature technology to generate as many problems as it solves. But in the case of the computer, there has been one peculiarly pernicious result. We have equated a machine with the mind. We believe computers are “smart,” so smart that we cast ourselves as “dummies” in their presence. Thanks to the computer, we have begun to believe that the mind, the defining feature of human nature, is a somewhat inferior information-processing machine.

And, of course, the computer-makers agree. Microsoft is now peddling “E-house” systems that will run our homes better than we can. No doubt there will soon be books to help us out: “E-house Living for Dummies.” Will we ever again be able to see information for what it really is? Minor, sometimes useful pieces of mental furniture beyond which lie the higher, never-to-be-computerized powers of the mind: imagination, revelation, insight, intuition, wisdom?

Sound judgment, good citizenship and being “smart” in the best sense of the word have never had anything to do with information. The real irony in Apple’s charmingly defiant “1984” commercial is that it failed to understand that the Macintosh too represented Big Brother. Deifying the computer and downgrading the human mind are the first steps toward enslaving ourselves to our own technology.

There will never be a computer program that can effectively respond to the command, “Tell me everything I need to know that is true, wise and relevant.” When we search for that, we will always have to fall back on our own hard-won ability to make graceful use of ideas we inherit from those who needed no machines to think with, but only the resources of their own naked minds, a quiet place to gather their thoughts -- and perhaps a stick with which to scratch those thoughts in the sand.