From the Archives: Why Robert Taylor is one of the most important figures in the history of computer science

UCLA professor Leonard Kleinrock discusses the Interface Message Processor in Korn Convocation Hall in 1999.
(Perry C. Riddle / Los Angeles Times)

Since it’s so fashionable these days to question whether government can do anything right – whether it’s regulating banks, bolstering the economy or overseeing healthcare – it’s worth noting that we’re about to celebrate the 40th anniversary of one of the most important federal initiatives of our time.

The event was the launch of the Internet, which we date from Oct. 29, 1969, when a refrigerator-sized special-purpose computer in Leonard Kleinrock’s engineering lab at UCLA transmitted its first message to a twin machine in Menlo Park, Calif. (The message was the first two letters of the command “Login.”)

That was the first exchange over what was then known as the ARPAnet, which evolved, after many intermediate steps, into what we know today as the Internet.


The ARPAnet had been hatched many years earlier in the mind of a Pentagon research official named Robert W. Taylor. We should begin the story with him, because his role reminds us that private enterprise isn’t always up to the task of advancing technological progress, and sometimes even gets in the way. Then it’s crucial for the government to step in.

Taylor, now 77, isn’t known to the public. But his name is a byword in computer science and networking, where he’s regarded as one of the most important figures in the field’s history.

That’s not only because of his role in creating the Internet but because of what he did after leaving the Pentagon: He moved to Xerox’s Palo Alto Research Center, the legendary PARC, where he oversaw the engineering team responsible for such inventions as the personal computer, Ethernet (a local networking system, for you non-geeks out there), and the visual computer display.

I first met Taylor 10 years ago, when he became the central figure in a book I was writing about PARC. He was outspoken, uncompromising and visionary then, and he still is, as he showed an audience honoring him last week at the University of Texas, his alma mater.

As director of the Information Processing Techniques Office at the Defense Department’s Advanced Research Projects Agency (ARPA) in 1966, Taylor demanded that the computer research projects he was funding around the country learn to talk to one another.

Taylor was deeply frustrated that while his researchers were in constant communication with one another coast to coast, their computers labored in mutual unintelligibility. Terminals cluttered his own office – one to interact with his government-funded computer project at Berkeley, another to speak to MIT, and so on. By the end of the year, he had secured a $1-million appropriation for the design and construction of a network that would seamlessly interconnect MIT, Berkeley and other university research computers nationwide.


Taylor foresaw that the network he ordered up would evolve beyond an administrative convenience. In a 1968 paper entitled “The Computer as a Communication Device,” co-written with his ARPA mentor, a transplanted MIT scientist named J.C.R. Licklider, he predicted its development into a public utility.

Forty years on, that remarkable paper reads like a work of clairvoyance. “In a few years,” it began, “men will be able to communicate more effectively through a machine than face to face.” It forecast that the network would provide some services for which you’d “subscribe on a regular basis,” like investment advice, and others that you would “call for when you need them,” like dictionaries and encyclopedias. Communicating online, it concluded, “will be as natural an extension of individual work as face-to-face communication is now.” Sound familiar?

Taylor tried to interest private industry in his project, but the companies he approached dismissed the idea. IBM told him its computers already talked to one another, completely missing his point that their computers should talk to everyone else’s.

AT&T, then the monopoly proprietor of the phone system over which the network would operate, fought Taylor’s project tooth and nail, contending that the network’s “packet switching” technology (a method of transmitting data in discrete blocks) wouldn’t work on its phone lines and might even damage them. Packet switching remains the Internet’s governing technology to this day.
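The core idea AT&T resisted can be sketched in a few lines. This is a toy illustration only, not the actual ARPAnet protocol: a message is split into small blocks, each tagged with a sequence number, so the receiver can reassemble them correctly even when they arrive out of order after traveling different routes.

```python
# Toy sketch of packet switching (illustrative; the real ARPAnet
# protocols handled routing, acknowledgment and retransmission too).
import random

def packetize(message: str, size: int = 8):
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Sort by sequence number and rejoin the payloads."""
    return "".join(payload for _, payload in sorted(packets))

msg = "Login to the remote host"
packets = packetize(msg)
random.shuffle(packets)  # packets may arrive in any order
assert reassemble(packets) == msg
```

Because each packet carries enough information to stand on its own, the network never needs a dedicated end-to-end circuit for a conversation, which is precisely what made the phone company’s circuit-switched model look obsolete.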

His experience underscores the importance of a government role in fields like basic research, which profit-seeking enterprises tend to shun.

“Industry generally avoids long-term research because it entails risk,” the veteran computer scientist Ed Lazowska told Congress a few years ago. Why? Because it’s hard to predict the results of such research, and since it has to be published and publicly validated, corporations can’t capitalize on their investments in isolation.


Yet once the research reaches a certain point, private industry piles in. Lazowska cited a National Research Council list of 19 multibillion-dollar industries – including the Internet, Web browsers and cellphones – that were incubated with federal funding, generally via university grants, before becoming commercially viable. Taylor’s ARPAnet was eventually turned over to the National Science Foundation, which in 1991 opened what was then known as NSFnet to commercial exploitation. Four years later, the dot-com boom was underway.

The real world brims with other examples. No private company would have made the investment to build a toll-free interstate highway system, yet who can deny that the cheap large-scale movement of goods coast-to-coast on capacious roadways is a crucial lubricant of our economy?

Healthcare? Private insurance companies have demonstrated over the last decade that they can’t be expected to cover the entire community at an affordable price – the realities of the free market mandate that each one cherry-pick the healthiest (read most profitable) risks, and hope its rivals get stuck with the higher-priced clientele. Take the government out of the equation, whether as regulator or competitor, and they will continue to pursue their own interests, not yours and mine.

Taylor also believes that, despite the Web’s successful commercialization, it may be time for the government to play a stronger role. The corporations making billions of dollars from the Web haven’t done their part to build up its capacity, so a shortage looms as customers increasingly use the network for bandwidth-hogging tasks like downloading movies. Instead, service providers are plotting to profiteer from the bandwidth scarcity by hiking user fees.

“The telecommunications industry has promised us for years that if we only let them raise prices and do mergers, they’d increase bandwidth,” Taylor said this week. “They haven’t kept their promise.”

He thinks the proper model for the Internet, given its critical role in our lives today, is as a taxpayer-supported service available to everybody, rich or poor, at no charge.


Both notions spring from his experience witnessing the interaction of government and private enterprise. What he learned then he still believes.

“The idea that private industry can always do something better than the government is false and sad and divisive,” Taylor observed at the University of Texas event. “People should know better.”
