In 1968, much of the world was in turmoil. Thousands died in the bloody Tet offensive in Vietnam. Russian tanks rumbled into Czechoslovakia. Robert F. Kennedy and Martin Luther King Jr. fell to assassins’ bullets. Police clashed with demonstrators in Chicago.
But Gilbert P. Hyatt distanced himself from the turbulence. He was not interested in politics or protests. The 30-year-old electrical engineer was absorbed with matters both big and small: designing a computer to fit on a silicon microchip no bigger than a fingernail.
Risking his family’s savings and security, Hyatt quit his well-paying job as a research scientist at Teledyne Inc. and retreated to his Northridge home. In the family room, alone, seven days a week, Hyatt tinkered with homemade electronics hardware.
An idea emerged. And though the design was never specifically used to produce a chip that powered an electronic gadget, Hyatt is now asking the world to recognize him two decades later as the creative inventor who made the computer revolution possible.
In July, after a 20-year legal fight that generated an estimated 10,000 pages of paperwork, the U.S. Patent and Trademark Office gave Hyatt patent No. 4,942,516 for a “Single Chip Integrated Circuit Computer Architecture.” The patent shocked the electronics world.
If it withstands legal challenges, the patent could establish Hyatt as the father of the “computer on a chip,” or microprocessor, the technology that spawned modern wonders of electronics from pocket calculators to microwave ovens.
“I didn’t invent the computer, but I came up with a very good improvement,” said Hyatt, a 52-year-old workaholic who prefers to work alone in a modest home on a cul-de-sac in La Palma. “My work in those days led to the PCs (personal computers) of today.”
The patent has brought Hyatt’s work under the scrutiny of an army of attorneys for computer-chip makers who are trying to determine its scope and validity. If upheld, he would become wealthy, possibly earning millions of dollars in royalties a year.
Depending on the viewpoint, Hyatt is either an underdog inventor who was undermined by greedy investors or a frustrated scientist who exploited an idea in the patent system instead of the marketplace.
Among those who begrudge Hyatt a place in history are former Intel Corp. researchers Marcian E. (Ted) Hoff and Federico Faggin, who along with engineer Stan Mazor have been credited with inventing the first commercial microprocessor between 1969 and 1971.
Another inventor, Gary W. Boone, an ex-Texas Instruments engineer, also has been given recognition because he was awarded the first microprocessor-related patent in 1973--three years after Hyatt filed for his patent.
Boone, now a 45-year-old researcher in Colorado Springs, Colo., built TI’s first microprocessor, the TMS 0100, starting in January, 1970. He said in a statement that several firms contributed to the technology, including TI, Intel and a common customer, Computer Terminals Corp., now known as Datapoint Corp., of San Antonio.
But Hyatt contends that he is the true inventor of the microprocessor and that the technology “leaked out” to the industry through investors. He also says that patents awarded to Intel and TI cover only limited microprocessor improvements that were product specific.
“The patent office did a very thorough job before they issued the patent,” he said. “That is why it took 20 years.”
This is the story of Hyatt’s efforts two decades ago to change the world, and the difficult and complex legal battle to get the world to formally recognize those efforts.
The competitive atmosphere in the microelectronics industry in the mid- to late 1960s was similar to the race between the United States and the Soviet Union to send a manned spacecraft to the moon. Electronics companies were going all-out to build chips with consumer uses.
The semiconductor chip had been invented a decade earlier by engineers Jack S. Kilby at TI in Dallas and Robert N. Noyce, then at Fairchild Semiconductor in the Silicon Valley. With the chip, the components of a complete electronic circuit were built on a single slice of silicon.
Still, the computers built from the chips stood several feet high, mainly because of their bulky magnetic-core memories and reel-to-reel tape storage. They could control missiles but were too big to drive consumer items such as watches. The industry needed something smaller.
“The idea of a computer on a chip was the next frontier of what we were doing,” said former Intel researcher Faggin, 48, now president of Synaptics Inc., a San Jose company developing artificial-intelligence products. “A lot of people were playing with it.”
Like the technology, the electronics industry itself was in flux. Noyce and Gordon E. Moore left Fairchild Semiconductor and formed Intel in July, 1968. Other researchers left companies such as TI or Motorola Inc. to be entrepreneurs.
Hyatt also decided in 1968 to form his own company, but he followed a different path. He avoided the professional circles of researchers in the emerging Silicon Valley in Northern California.
“They all knew each other, but they didn’t know me,” he said. “It was essential I get away from them and do my own original thinking.”
Those who knew him at the time describe Hyatt as slender, mild-mannered and a bit naive for his age. He could be impatient with people who didn’t understand his ideas. The son of a civil engineer, Hyatt knew from an early age that he wanted to be an engineer. His father helped him with math and science. When he was 16, the family moved from New York to Southern California.
Hyatt describes himself as a late bloomer who did not stand out as an undergraduate at UC Berkeley. He graduated in June, 1959, was married that same year, and went to work for Hughes Aircraft Co.
In 1965, he decided to return to school at the University of Southern California to get a master’s degree in electrical engineering. By that time, he had a family with three young children to feed and, he says, that motivated him to study. He took a job as a research scientist at Teledyne in 1966, but his ambitions lay elsewhere.
The idea for a computer on a chip didn’t come to Hyatt as a sudden inspiration but evolved over many months after he began tinkering with different designs in his spare time in late 1967. He quit his job at Teledyne in early 1968 to devote himself to the project.
Using technology mostly available on the market, Hyatt believes he was the first to put all the pieces together with a design for a small, single-chip computer that bucked the industry trend of building ever-bigger computers with high-capacity memories.
“All products are built on technologies of the past,” Hyatt said. “Only God can create from nothing. Man must work from older elements.”
In concept, Hyatt’s architectural solution seems simple enough, but it involved innumerable calculations and intricate circuit designs as complex as a detailed city street map.
Instead of relying on core memory, Hyatt decided computers could perform relatively simple tasks with far less memory. He chose a compact component known as read-only memory, which could be programmed once with the basic instructions telling the computer what calculations to make. It would work together with another memory component called an alterable scratch-pad memory, which could store the data that was subject to change with each computation. Both were small enough to fit on a chip.
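In loose modern terms, that division of labor can be sketched in a few lines of Python. This is an illustration only, not Hyatt’s patented circuitry: the instruction names and the two-cell scratch pad are invented for the example. The point is simply that the program lives in a fixed, read-only store while only a tiny alterable memory holds changing data.

```python
# Illustrative sketch (not Hyatt's actual design): a ROM holds the fixed
# program, while a small "scratch pad" memory holds the data that changes.
ROM = [              # programmed once; read-only thereafter
    ("LOAD", 5),     # put the constant 5 in the accumulator
    ("ADD", 0),      # add scratch-pad cell 0 to the accumulator
    ("STORE", 1),    # write the result to scratch-pad cell 1
    ("HALT", None),
]

def run(scratch_pad):
    """Execute the ROM program against a mutable scratch-pad memory."""
    acc = 0
    for op, arg in ROM:
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += scratch_pad[arg]
        elif op == "STORE":
            scratch_pad[arg] = acc
        elif op == "HALT":
            break
    return scratch_pad

print(run([3, 0]))  # -> [3, 8]
```

Because the instructions never change, the ROM can be tiny and dense; only the scratch pad needs to be writable, which is what let both fit on one chip.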
To complete the tiny computer design, Hyatt also proposed using an efficient processor that would require only about 2,000 transistor components and still be able to process large, 16-bit chunks of data at a time. Packing so many transistors onto a chip might have required too many bulky interconnections known as pins, but Hyatt devised an architecture in which data could be transferred to a small number of pins on a time-sharing basis, one after another.
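The time-sharing trick for the pins can be sketched the same way. The parameters below are assumptions for the example, not figures from the patent: a 16-bit word is broken into 4-bit slices that take turns crossing the same four pins, one slice per clock tick, and are reassembled on the far side.

```python
# Illustrative sketch (assumed parameters, not the patented circuit): a
# 16-bit word crosses just 4 pins by taking turns, 4 bits per clock tick.
PINS = 4
WORD_BITS = 16

def transmit(word16):
    """Split a 16-bit value into 4-bit slices, least significant first."""
    return [(word16 >> (PINS * t)) & 0xF for t in range(WORD_BITS // PINS)]

def receive(slices):
    """Reassemble the original word from the time-shared slices."""
    word = 0
    for t, nibble in enumerate(slices):
        word |= nibble << (PINS * t)
    return word

word = 0xBEEF
assert receive(transmit(word)) == word  # nothing lost in the round trip
```

The cost is time — four ticks instead of one — but the payoff is a package small enough to fit the chip’s dimensions, which is exactly the trade Hyatt needed.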
As a general-purpose device, the microprocessor could be programmed for each different application rather than painstakingly redesigned for each customer. This would enable the chip to be mass produced, driving down its cost and expanding its potential applications.
Hyatt’s next step was to put his ideas into practice. In July, 1968, he trademarked the term “Micro Computer” for the invention. In November, he created a working “breadboard,” or an oversized prototype that demonstrated the chip’s circuitry.
He painstakingly tested each of the thousands of circuits with a primitive oscilloscope. Once the hand-wired circuitry worked, Hyatt said, all that he needed was to build the chip for a specific application.
John Salzer, a Santa Monica management consultant, recalled that he was so impressed with Hyatt’s chip design that he invested $5,000 in the company. One potential investor contacted by Salzer, Richard Petritz, a former chief technology officer at TI who launched a venture capital firm called New Business Resources, also liked what he saw.
“The principal innovation in the Micro Computer is the use of large-scale integration semiconductor technology for memory, logic and control circuitry, eliminating the need for relatively costly core memories,” Petritz wrote in a March 3, 1969, evaluation. “New Business Resources is confident that a general-purpose computer selling for $1,000 to $2,000 will open many new applications.”
But Petritz could not work out a deal and decided not to invest in Hyatt’s firm. In March, 1969, attorney Stuart Lubitz agreed to find more investors for Hyatt and filed incorporation papers for Micro Computer Inc. a month later.
Irving Hirsch, a friend and veteran manager in several small technology firms, joined Hyatt as a partner and invested $60,000 in seed money in Micro Computer. Hirsch became president of the company, which moved to an office in Reseda, and began hiring employees.
“Hyatt was a very hard worker and one of the most brilliant guys I’ve run into,” said Hirsch, 68, now owner of a lighting-distribution firm in Inglewood. “We operated seven days and nights and he was there constantly. We used to work at my house in Woodland Hills and take walks at 1 a.m. or 2 a.m. and talk about the technology.”
Hirsch and Hyatt decided that the best use of the chip technology would be to build small computers to control machine tools in factories. From there, they reasoned, other markets would evolve.
The big plans included developing a business computer for small businesses.
Hirsch and Hyatt knew the firm would need about $1.5 million, so they asked Lubitz to search for more investors. Lubitz rounded up a group that included Intel founders Noyce and Moore. Hyatt hoped he could commission Intel to manufacture the chip.
Other investors included Hale Bros. Associates Inc. in San Francisco, a now-liquidated firm started by a founder of the Carter Hawley Hale retail chain, and the San Francisco investment firm of Hambrecht & Quist. They pledged to invest $500,000 for 20% ownership. Lubitz served on the board of directors, as did Hirsch, Hyatt, Joseph Chulick Jr. of Hambrecht & Quist, and a Hale representative. Because Lubitz acted as the go-between for the company and the investors, Hirsch said he talked with Noyce and Moore only rarely.
With the venture-capital backing, the company moved to offices in Van Nuys in 1970 and hired 25 employees. Hirsch said the firm concentrated on applications in the machine-tool industry.
But a battle was brewing. Hyatt filed a patent application for the computer on a chip on Dec. 28, 1970. Working through Lubitz, the investors tried to persuade Hyatt to give up rights to the technology if the company closed or was sold, Hyatt and Hirsch recalled.
On July 17, 1971, the investors staged a showdown at a board meeting, Hyatt said. He said both Lubitz and Chulick tried to badger him into giving up his rights to the technology. When Hyatt refused, the investors withheld their funding.
“I think the investors decided the technology was too good for Hyatt,” Hyatt recalled. “They tried to squeeze us by cutting us off. Their motive was to sell the company and take the technology.”
Lubitz, who later worked as a patent attorney for Intel and is now a Los Angeles patent attorney, said Hyatt was a stubborn, impractical researcher without a viable technology who was obsessed with “paper rights.” He believes Hyatt’s patent has stolen credit from Intel.
“It was a question of whether investors wanted the patents developed with company funds to be assigned to the company,” Lubitz said. “Mr. Hyatt refused, and that was the end of it. There was never any mention of a computer on a chip.”
The firm went out of business in September, 1971. Hyatt contends that the investors then leaked details of the chip to the industry, although he will not elaborate on the evidence to back up that claim.
As Hyatt’s dream was unraveling, his rivals were not sitting still.
TI’s product was built by a team led by Boone between January, 1970, and April of 1971, with input from Victor Poor at TI’s customer, Computer Terminals Corp., and with indirect help from Intel, which was also a competing supplier to CTC, Boone said. The chip was delivered in prototype form to CTC in March or April of 1971, and an improved chip, the TMS 0100, hit the market in July, 1971. It was a runaway success.
Between 1969 and 1971, Hoff, Faggin and Mazor at Intel developed a series of four chips that could be sold as a package, including the Intel 4004, a central processing unit on a single chip that was generally recognized as the first microprocessor.
On Nov. 15, 1971, the company ran an advertisement in Electronic News, touting the new product as beginning a “new era of integrated electronics.” Later, Intel produced the 8080, an improved microprocessor that found its way into digital watches and video games.
Intel and TI went on to become multibillion-dollar electronics industry giants while Hyatt’s Micro Computer produced only one prototype for one customer in an application that did not even use his advanced computer design.
“It was frustrating watching people exploit my technology and get rich while I was on the sidelines,” Hyatt said.
Unable to find funding elsewhere, Hyatt went to work as an aerospace consultant and tinkered with his own inventions at home, eventually collecting more than 50 patents.
But he didn’t give up on the patent application. Hyatt and his attorney, Gregory L. Roth, were aware that, under patent law, they did not have to build a working microchip but only prove that the chip could be commercially built from the design.
They had the benefit of the earliest filing date. But they still had to prove the case to patent examiners. During the next two decades, Hyatt and the examiners climbed up and down the appeals ladder several times, going through 16 different legal reviews.
In June, 1988, the U.S. Court of Appeals said that Hyatt had not proven that any chip fabricator could have built a working chip based on Hyatt’s patent application. However, the court broke the stalemate between Hyatt and the patent examiners when it acknowledged that he had solved the technological problems that earlier designers could not.
For the next two years, Hyatt built a case based on published industry literature that the technology existed to create his chip in the 1970s. He found evidence that TI, Intel and other firms could have fabricated the chip if he had directed them to do so.
The patent examiner finally issued the patent on July 17. Gary Hecker, a Los Angeles patent attorney who examined the patent, said its claims are broad, apparently covering most microprocessors and other computers contained in a wide range of electronic products.
Enforcing the patent could prove time-consuming and expensive. Hyatt has agreed to testify about his invention in a patent-infringement suit between Zenith Data Systems Inc. and TI, which is attempting to enforce patents related to Boone’s microprocessor work. Tandy Corp., a computer maker in Fort Worth, is attacking the validity of Boone’s patents in another suit.
“If this suit goes to trial, I think we will get to the bottom of who was the first,” said Albert J. Hillman, an attorney for Tandy in San Francisco.
Hoff and Faggin, the former Intel researchers, feel cheated by the Hyatt patent. Boone declines to comment, except to emphasize that a number of companies deserve credit.
“I feel we at Intel made a very small, very economical processor that was orders of magnitude cheaper than the minicomputers at the time,” said Hoff, a legal consultant in Mountain View. “When you do that, that’s when you really achieve a breakthrough.”
In rebuttal, Hyatt said his chip would have been marketable if only his investors had backed him.
T.R. Reid, a Washington Post reporter and author of the 1984 book “The Chip,” said that historians may solve Hyatt’s case with the same compromise that ended a 10-year dispute between Intel’s Noyce and TI’s Kilby over the invention of the chip in 1958.
“Kilby got the idea first, but Noyce made it practical,” Reid said. “The legal ruling finally favored Noyce, but they are considered co-inventors. The same could happen here.”
THE INTEL 4004

Widely regarded as the first microprocessor, the 4004 was designed by Intel engineer Ted Hoff, left. Hoff’s design employed four separate chips to perform all the functions of a computer.

A Processor for All Reasons: The Intel 4004 was a general-purpose computer designed to be easily reprogrammed for a variety of tasks. However, each member of the four-chip set was a specialist: a CPU chip performed all the calculating chores; program instructions were stored on an unerasable Read-Only Memory, or ROM, chip; data was stored on an erasable Random Access Memory, or RAM, chip; and a fourth chip served as the 4004’s primary link to the outside world.

HYATT’S COMPUTER ON A CHIP

La Palma engineer Gilbert Hyatt, right, was recently awarded a patent for a design that combines all the elements of a computer on one chip. Hyatt’s patent filing preceded Intel’s.

A Space-Efficient Specialist: Hyatt designed his chip specifically for the job of controlling an industrial milling machine. This simple task required a limited number of program instructions, which could be stored in a small ROM memory array on the chip itself. Similarly, a RAM “scratch pad” could be used for the small amount of data involved. By squeezing all these elements onto a single chip, Hyatt’s design avoided the need for redundant control units and input/output channels.

DANCING ON THE HEAD OF A PIN

Electrical signals travel from the chip’s microscopic circuitry across a bridge of thin gold wire to metal pins that line the chip’s airtight plastic package. The pins connect the chip to the rest of the machine. Hyatt’s design required more pin connections for input/output than would fit in the chip’s quarter-inch-square dimensions. Hyatt needed to prove he could overcome “the pin problem” to get his patent approved.
The solution Hyatt devised was a time-sharing scheme that would allow several data streams to share the same input/output channels by taking turns, like gandy dancers pounding a railroad spike.