A Single-Minded Focus on Multiple Threads
Bucking tradition is nothing new for Deborah T. Marr. The Cornell graduate is one of a handful of female computer chip designers in a field that is 95% male.
So when Intel Corp., the world’s largest computer chip maker, embraced a radical new design for its microprocessors, it turned to the tenacious 36-year-old mother of two to push the technology out of the lab and into PCs.
The product of Marr’s three-year effort is expected to debut next week in the form of a 3-gigahertz Pentium 4 chip that the Santa Clara-based company boasts is the fastest commercial microprocessor for personal computers.
The phenomenal speed of the processor is less significant than the way it works.
It relies on a technology Intel calls Hyper-Threading -- a radical concept that makes a single computer chip act like two. That notion, however simple, defies the entire architecture of computers, which excel at executing instructions one at a time, millions of times a second.
“Hyper-Threading is like giving a cook two pans to work with instead of one,” said Brian Fravel, Intel’s marketing manager for desktop chips. “It’s not as fast as having two cooks, each with his own pan. But it’s also not as expensive because you’re only paying one cook.”
Like chips, the people who make them generally tackle work in sequence, one task after another. Building a new kind of chip required a new kind of thinking. For order-loving geeks, it was a tough sell.
Enter Marr, who joined Intel in 1988 and whom a colleague describes as the “conscience of Hyper-Threading.”
Her journey demonstrates how Intel was able to introduce a technology that defied its decades of monomaniacal focus on building smaller, faster chip transistors, an endless cycle predicted by company co-founder Gordon Moore and later referred to as Moore’s Law. Hyper-Threading turns Moore’s Law on its head, making existing chips work more efficiently rather than simply packing in more transistors.
Another challenge with Hyper-Threading is its unproven nature. When the concept was floated in the early 1990s, no one knew exactly how it would perform in the real world, and many still are uncertain how much benefit it will provide. Intel claims the technology boosts a computer’s performance by 25%. Independent tests meant to simulate real-world computer use yielded mixed results.
That uncertainty, combined with a natural reluctance to change, meant Marr had her work cut out for her when she joined the Pentium 4 team as an architect in 1996. By then, the corporate decision already had been made to incorporate Hyper-Threading into the Pentium 4.
The idea began with Glenn Hinton, 45, one of Intel’s top engineers in charge of designing chips at the company’s campus in Hillsboro, Ore.
Hinton had been thinking since the late 1980s of ways to give a chip the ability to tackle more than one task by splitting its resources. When he worked on a project to make a smaller version of a chip in 1992, he noticed that slicing the chip in half slowed things only 25% to 30%.
“Then it hit me: If I could get two threads executing at the same time, each thread getting half the chip’s resources, I could improve performance 1 1/2 times,” he said.
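Hinton’s back-of-the-envelope arithmetic can be sketched in a few lines of Python. The 70% figure below is illustrative, taken from the 25% to 30% slowdown he observed when the chip was halved:

```python
# Illustrative numbers only: halving the chip's resources slowed it
# by 25%-30%, so one thread on half the chip runs at roughly 70%
# of full speed.
half_chip_speed = 0.70

# Two such threads running at once, each on half the chip's resources:
two_threads = 2 * half_chip_speed
print(two_threads)  # roughly 1.4x the throughput of one full-width thread
```

With the slowdown at the optimistic end of the range, 25%, the same arithmetic gives the 1 1/2 times Hinton quotes.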
Traditional chips crunch only one program at a time. Computers give the illusion of running more than one program by rapidly switching from one to another. But because the chip is working on only a single program at any instant, just about a third of its transistors, on average, are engaged. The rest sit idle.
Hyper-Threading lets chips work on two programs at once and make better use of all of a chip’s transistors. Although not as fast as two chips, it’s also not as expensive.
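Seen from software, the distinction looks like this: a program hands the chip two independent streams of work, and a conventional processor interleaves them while a Hyper-Threading chip can advance both at once. A minimal Python sketch (the labels and busy loops are illustrative):

```python
import threading

results = {}

def spin(label, n):
    # A busy loop standing in for any compute-heavy task.
    total = 0
    for i in range(n):
        total += i
    results[label] = total

# Two independent instruction streams. On a traditional chip the
# operating system rapidly alternates between them, creating the
# illusion of simultaneity; a chip with Hyper-Threading can advance
# both streams in the same clock cycle.
a = threading.Thread(target=spin, args=("first", 1_000_000))
b = threading.Thread(target=spin, args=("second", 1_000_000))
a.start(); b.start()
a.join(); b.join()
```

The program is written the same either way; the difference is whether the hardware can truly overlap the two loops.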
A chip with Hyper-Threading is about 5% larger than one without. And in the world of semiconductors, size relates directly to cost: larger chips are more expensive to produce, and they cost more to run because they draw more power.
Hinton began to champion the idea when discussions about the Pentium 4 took place in 1993. People were starting to use their computers to do several things at once.
Operating systems, especially Microsoft Corp.'s Windows, were finally capable of exploiting dual-chip PCs. Software engineers were beginning to write programs in ways that would take advantage of multiple processors.
Hinton also was helped by two papers published in 1995 and 1996 by University of Washington researcher Dean Tullsen, who first simulated a chip capable of processing multiple streams of instructions simultaneously.
In 1996, the company gave Hinton the green light. He paired up with Marr, whom he had met on an earlier chip project.
Hinton and Marr were a study in contrasts.
At 6 foot 2 and loquacious, Hinton was constantly throwing off ideas, cornering engineers in the hallway to talk about a new thought he just had. Intel workers who showed up late for meetings would say they had run into Hinton and would get understanding nods.
Marr, a 5-foot-5, 105-pound daughter of Chinese immigrants from Shanghai, was quiet, focused and methodical. She wasn’t shy in open debate, but Marr preferred more strategic routes, recruiting her allies in advance of crucial meetings. She hated surprises, and was the type to know exactly how a meeting would turn out even before it began.
Marr came to believe in Hinton’s ideas, and Hinton respected Marr’s ability to get things done.
To create a chip like the Pentium 4 takes thousands of engineers. For Hyper-Threading to work, every person at every step of the way had to buy into the technology. Implementing it required retooling every aspect of the chip to make it behave as if it were two.
Before design work could begin, though, Marr had to plow through thousands of pages of technical documents covering all the functions of the Pentium 4. Then she had to rework each of them so the new chip would know how to handle two jobs instead of one.
Once the new rules were in place, Marr had to make sure all the engineers built them into their designs.
“Let’s say I’m an engineer, working on my part of the chip,” Hinton said. “I’ve got hundreds of things to do to meet my milestone. The camel’s back is about to break. The tendency is to work on the things I know have to get done anyway and put off things I think are optional -- like Hyper-Threading -- until tomorrow or next week.”
Marr stayed on top of the engineers, wheedling, lobbying and guilt-tripping them into moving Hyper-Threading tasks higher on their priority list day after day.
She marched groups of engineers through intensive weeklong seminars on Hyper-Threading to get them focused.
“It became an obsession for me,” Marr recalled. “People would tell me it’s just a job, but that only made me more stubborn.”
Working against her was the concept itself, which cut against the grain of how engineers like to think -- that is, in a single sequence.
“It can be a fundamental shift in mind-set,” Marr said. “A lot of the rules you had all of a sudden don’t work anymore. A very large number of algorithms had to be rethought.”
Intel executives acknowledge that the concept will be as much of a challenge to sell to consumers as it was for Marr to push to engineers. Consumers have been trained to pay attention to the speed of a chip, measured in megahertz.
Hyper-Threading has nothing to do with megahertz. Instead, it has to do with getting the same number of megahertz to perform more work, even as chip speeds continue to power forward.
“The vast majority of consumers don’t want or need a 2-gigahertz processor, much less a 3-gigahertz chip with Hyper-Threading,” said Kevin Krewell, senior analyst of Microprocessor Report, an industry newsletter that nevertheless gave Hyper-Threading an award last year for best new technology.
In the end, it’s unclear how much benefit Hyper-Threading will actually deliver.
In a small number of cases, it may even slow overall performance. Because one thread can sometimes monopolize a chip, there are times when the second thread is stalled while it waits for its turn, causing the computer to perform more poorly with Hyper-Threading than without.
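The stall the article describes has a familiar analogue in software: when two threads contend for one shared resource, one of them simply waits. A minimal Python sketch (the counter and iteration counts are illustrative):

```python
import threading

lock = threading.Lock()
counter = 0

def bump(n):
    global counter
    for _ in range(n):
        # Both threads compete for the same lock on every iteration,
        # so each spends much of its time stalled, waiting its turn --
        # a software analogue of one thread monopolizing the chip.
        with lock:
            counter += 1

t1 = threading.Thread(target=bump, args=(100_000,))
t2 = threading.Thread(target=bump, args=(100_000,))
t1.start(); t2.start()
t1.join(); t2.join()
print(counter)  # 200000, but reached more slowly than one plain loop would manage
```

The answer comes out right, but the constant waiting can make the two-thread version slower than doing the work in a single sequence.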
Intel is working with companies such as Microsoft and Adobe Systems Inc., the maker of Photoshop image-editing software, to anticipate and circumvent such bottlenecks.
Marr, who gave birth to her second child this week, is characteristically undeterred.
“I really believe that this technology is revolutionary and that it will greatly improve how computers serve us,” she said.