Movies: How They Did That: With ‘T2,’ computer-generated graphics made a huge leap into the spectacular; today actors can be transformed, and one day realistic human figures will be created

Charles Solomon is a frequent contributor to Calendar.

Any sufficiently advanced technology is indistinguishable from magic. --Arthur C. Clarke

After more than a decade of inflated promises and disappointing performances, computer graphics are finally assuming a significant role in mainstream filmmaking.

Audiences and critics agree that James Cameron’s “Terminator 2: Judgment Day,” this summer’s one bona fide blockbuster, owes a large measure of its success to the spectacular special effects--which include some of the most sophisticated computer-generated imagery (CGI) ever rendered. Viewers had never seen a character flow through the barred entrance of a prison or rise out of a tessellated linoleum floor, or watched hundreds of shattered fragments melt and flow together to form a metallic robot like the T-1000. It had never been possible to create these effects.

Six-time Oscar winner Dennis Muren, senior visual effects supervisor at the Industrial Light & Magic (ILM) division of LucasArts, who oversaw the more than 40 CGI shots in “Terminator 2,” assesses the impact of this advanced technology:

“I think you could have gotten some of these shots on the screen for less money with traditional methods, but they wouldn’t have looked as good. So the director has a choice: Does he want something in the film that’s going to look artificial and break the progression of the story for the audience, or does he want something that looks real?

“We can now do stuff that couldn’t be done before, and I don’t think you can put a price on that,” he continues. “Ten years ago, you could have spent $10 million on a shot of a guy’s face being pushed through a grid, and I don’t know what you would have gotten for that $10 million. Almost every audience member has responded to these effects--we did when we saw the dailies. We’d never seen anything like it before, and it’s been really neat to know the audience hasn’t either.”

“Freddy’s Dead: The Final Nightmare,” which opened recently, includes a computer-animated sequence by Pacific Data Images (PDI) of the “dream demons” that haunt Freddy Krueger’s twisted mind. The film takes the viewer inside Freddy’s brain, where the monstrous floating skulls that inspire his murders swim through the caverns of his cerebrum.

In addition, computer effects are in the works for the action-adventure features “Star Trek VI,” “Sleepwalker,” “RoboCop III” and Steven Spielberg’s “Hook,” as well as for the comedy “Memoirs of an Invisible Man” starring Chevy Chase.

James Cameron hadn’t planned a sequel when he made “The Terminator” in 1984, but the film’s success mandated a reprise. After more than five years of legal battles over rights, Cameron began formulating a story that would bring back the murderous cyborg that made Arnold Schwarzenegger a superstar.

But as he and co-writer William Wisher worked on the script, they realized that a revolution in special-effects technology would be required to bring their “poly-alloy” liquid metal assassin, the T-1000, to the screen. The scenes of the character oozing through the prison bars or flowing into a police helicopter and assuming human features couldn’t be done convincingly with traditional techniques.

“When Jim wrote the script, we knew it was theoretically possible to achieve all these effects, but he didn’t know if it was physically possible,” explains Larry Kasanoff, Cameron’s partner at Lightstorm. “If Carolco could have had a guarantee that what was written on the page would appear on the screen, they’d have been pretty safe, because it was a great script. But the stuff on the page had never been done before, and it had to be done within a certain time. If the effects hadn’t worked, the T-1000 wouldn’t have been a threat and the movie wouldn’t have worked. So their real risk was betting that these effects could be pulled off in a certain time. We realized it would take a complete advancement of the technology, but we also knew it was all theoretically possible--so we went for it.”

Although experiments with computer graphics had been conducted in laboratories since the early 1950s, the real push to use the medium for film effects came almost three decades later, when Disney released “Tron” (1982). Steve Lisberger’s sci-fi adventure about an alternate electronic reality inside a computer featured about 20 minutes of computer-generated visuals.

“Tron” received an enormous amount of advance publicity, and articles in Time, Newsweek, Rolling Stone and other journals announced the advent of a new era in film technology, in which computer images would replace models, miniatures, puppets, mattes and even sets and stunt men. Some artists boasted that visual automatons would soon replace human actors, and promised to bring back Marilyn Monroe and Humphrey Bogart in new roles.

The promised effects far surpassed anything the artists and machines could (and can) deliver, and the promises were tactfully forgotten after “Tron” failed at the box office. A similar flurry of promises and predictions greeted Lorimar’s “The Last Starfighter” (1984), which featured computer-generated battles between spaceships. Once again, the film failed to attract viewers; once again, the extravagant promises weren’t fulfilled. (Ironically, the only “computerized” character to attract widespread attention was television’s Max Headroom, whose unusual look was produced by doctoring live-action film--no computer graphics were involved.)

“I remember when ‘Tron’ and ‘Starfighter’ were being made, people promised a hell of a lot--we’ve learned from that and stopped promising everything,” comments Jamie Dixon, digital effects supervisor at PDI, which supplied additional effects for “T2.” “In the last couple of years, we’ve started using computers not as a way of replacing existing techniques that work fine, but to augment them and create really new things. Probably the first example in a feature film was the Water Weenie that ILM did for ‘The Abyss’: That effect would have been impossible to do using traditional techniques. It made people begin to think differently about how to use computers.”

To demonstrate that they were friendly and wanted to communicate with the humans, the aliens in “The Abyss” extended a pseudopod of what appeared to be seawater into the disabled underwater oil rig. The shimmering, transparent column adopted the features of Mary Elizabeth Mastrantonio when she tried to speak to it.

PDI President Carl Rosendahl agrees that inflated claims have damaged the credibility of computer graphics. He and the other artists involved in “Terminator 2” emphasize that for this film, the effects grew out of the story, rather than vice versa.

“I think the reason ‘Tron’ and ‘Starfighter’ did poorly was that the computer graphics were the star,” ILM computer graphics supervisor Jay Riddle adds. “You can’t help thinking there’s a reason for that, and that the stories weren’t quite as important as the technique. Today, we’re finding people who want computer graphics in their movies to help them tell the stories.”

The basis of this often fantastic imagery lies in the ability of the computer to assign values for brightness and color (a proportion of red, green and blue) to each point or pixel on a video monitor. The high-resolution monitors used for special effects have at least 2,000 lines running in each direction, for a total of 4 million pixels. (A standard TV set has 525 lines and 275,625 pixels.) Software programs instruct the computer to alter each image in a sequence to simulate motion; the calculations needed to create each frame may take a few minutes--or several hours.
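As a rough sketch of the idea above--the names here are invented for illustration, and real rendering systems are vastly more involved--a frame can be modeled as a grid whose every pixel holds a red-green-blue brightness value:

```python
def pixel_count(lines_across, lines_down):
    """Total pixels on a display with the given line counts."""
    return lines_across * lines_down

print(pixel_count(2000, 2000))   # 4000000 -- the effects monitor
print(pixel_count(525, 525))     # 275625  -- the standard TV figure

# A frame is conceptually a grid of (red, green, blue) brightness triples.
width, height = 4, 3
frame = [[(0, 0, 0)] * width for _ in range(height)]   # all black
frame[1][2] = (255, 128, 0)      # assign one pixel an orange value
print(frame[1][2])               # (255, 128, 0)
```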

The transformations of the T-1000 involved a program called “morphing” (from metamorphosing). ILM computer graphics artist Doug Smythe devised the software to help the model makers create the distorted beasts in the transformation scene in “Willow.” Morphing enables an artist to transform one shape into another by having the computer devise a series of intermediary steps.

“The program uses a grid of control points to tell the computer where certain features are located on the object you’re interested in transforming,” Smythe explains. “So you put a horizontal line of dots, say, along the top of the head, and vertical lines along the edges of the body. You put other lines of dots along the top and sides of the second shape. With that information, the computer can calculate a series of interpolations between the two.

“The grid system also allows us to transform different parts of the character at different times or different speeds,” he continues. “A good example is the shot where the T-1000 has changed into Sarah Connor, then turns into the cop as he spins around. If you were to look at single frames of it, there are places where the T-1000 has Sarah’s hair and the cop’s uniform. I also had his chest stay out in the form of the woman’s breasts, then deflate into the shape of the cop’s chest.”
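The interpolation Smythe describes can be sketched in a few lines. This is only a toy linear version with invented names, not ILM's actual software; the per-point delay mimics his example of different features morphing at different times:

```python
def morph_step(src_pts, dst_pts, t, delays=None):
    """Interpolate each control point from src toward dst at time t (0..1).

    delays[i] shifts when point i starts moving, so different features
    can morph at different times, as in the Sarah-to-cop shot."""
    delays = delays or [0.0] * len(src_pts)
    out = []
    for (x0, y0), (x1, y1), d in zip(src_pts, dst_pts, delays):
        u = min(1.0, max(0.0, t - d))   # this point's local progress
        out.append((x0 + (x1 - x0) * u, y0 + (y1 - y0) * u))
    return out

# A "head" control point reaches halfway; a delayed "chest" point lags.
src = [(0.0, 10.0), (0.0, 5.0)]
dst = [(2.0, 10.0), (2.0, 5.0)]
print(morph_step(src, dst, 0.5, delays=[0.0, 0.25]))
# [(1.0, 10.0), (0.5, 5.0)]
```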

In order to accurately reproduce the motions of Robert Patrick, who portrays the T-1000 in its human form, the crew at ILM put the actor in a skull cap and briefs and drew grid lines all over his body. They shot reference footage of him walking and running through the studio parking lot in San Rafael--to the bemusement of passing drivers.

“The audience would see Patrick walking and running throughout the film, so we wanted to match the movements of the chrome version to the live actor’s as closely as possible,” Smythe says. “We took some very exacting measurements of him, and often the easiest way to see how his muscles moved and flexed was to use those grids; we just watched how the grid lines moved. It’s a lot easier when you have those fixed reference points.”

“When you talk about doing naturalistic effects involving human characters, the whole array of motor skills involved in locomotion comes into play,” Riddle adds. “When someone stands up, how exactly does he move? How do you get the flesh to stretch realistically? All those physical properties are very tough to iron out.”

The ILM crew also faced the problems that have bedeviled traditional animators at Disney and other studios for decades. Difficult as it is, making a drawn or a computer-generated figure move realistically isn’t enough: The character has to move with a style that expresses his individual personality.

“Various shots presented different problems. For example, the head splitting open and coming back together: How do you do something like that? How is it supposed to look?” Muren says. “In addition to getting a CGI character to walk more like a person than had ever been done before, there were all the problems of character animation: How does this character move? He’s very focused; he’s got a job to do and he isn’t easily distracted. That has to be manifested in his walk and the way he turns his head and all his body language.”

Once they had created suitable images of the T-1000, the special effects crew faced another traditional challenge: integrating their visuals into the live action. Computers produce essentially flawless images, and the artists often had to add grain and blur to match the images to their surroundings. The contrast between the razor-sharp CGI scenes and the ever-so-slightly blurred live action created a jarring discontinuity in “The Last Starfighter.”
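The "dirtying up" described above can be illustrated with a toy one-dimensional example--invented helper names, brightness values on a single row rather than a full frame: grain jitters each value slightly, and a blur softens the unnaturally crisp CGI edge.

```python
import random

def add_grain(row, amount, seed=0):
    """Add film-grain-like noise: jitter each brightness value slightly."""
    rng = random.Random(seed)
    return [min(255, max(0, v + rng.randint(-amount, amount))) for v in row]

def blur(row):
    """Soften a razor-sharp edge with a simple 3-tap average."""
    n = len(row)
    return [round((row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3)
            for i in range(n)]

clean_edge = [0, 0, 255, 255, 0, 0]    # an unnaturally crisp CGI edge
print(blur(clean_edge))                # [0, 85, 170, 170, 85, 0]
print(add_grain(blur(clean_edge), 5))  # the same ramp, slightly jittered
```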

“The little details that people don’t ordinarily notice are incredibly helpful in selling the reality of a situation,” Kasanoff says. “For example, when the T-1000 rises out of the floor, two things make it look convincing. One is the reflections of the surroundings on the surface of the cyborg; the other is the tiny reflection his foot makes on the polished floor. Getting those kinds of things to look right and move right was incredibly difficult, but they make it seem real. When he saw the tests, Jim kept saying, ‘Why does that foot look like it’s not on the floor?’ and the answer was because it needed a quarter of an inch of drop shadow around half of it--then it looked like it was on the floor.”

Improvements in computer hardware and software facilitated the production of convincingly realistic images for “Terminator 2.” The battling spaceships in “Starfighter” were rendered on a multimillion-dollar Cray X-MP supercomputer. Today, the artists at ILM, PDI and other studios use a variety of smaller, cheaper computers and continue to refine their software.

“When we were doing ‘Star Trek IV,’ we got the complexity of our images to a point where it took an hour or two per frame to render them,” Riddle says. “We’re still rendering images that take an hour or two per frame, but the machines are 10 to 20 times faster than they used to be. Complexity goes up with what your equipment can do, and the limit becomes time, rather than anything else.

“The generations of software and hardware have been moving at a very uneven pace,” he continues. “The machines get a lot faster, but the software that’s available to run them doesn’t take full advantage of it. Then the software catches up and the machines jump ahead again. The computers we have right now are beyond what our software is capable of doing, but as soon as we develop software that’ll run great on those machines, along will come another machine that’s faster.”

“Economics are finally working for CGI,” Rosendahl adds. “The machines are cheap enough and fast enough that you can get the amount of work through for a price the industry’s willing to pay. You have to give a client either a lot more than he could get any other way or something completely different that he couldn’t get any other way, but at a price that’s competitive with what the rest of the industry would charge--if they could do it.”

Although most of the attention has been focused on the flashy effects in “T2,” crews used the same technology to manipulate some of the other visuals so subtly that audiences don’t even realize the images have been altered. (Rosendahl calls them “invisible effects--if you can’t tell we touched the film, we did a really good job.”) These undetectable effects, which allow the director to make minute changes in a scene after it has been shot, may ultimately have a greater impact on mainstream filmmaking than dazzling footage of the T-1000.

During an early chase sequence, the T-1000 hijacks a big-rig truck and chases a motorcycle carrying the Terminator and young John Connor through the drainage channels connected to the Los Angeles River. Cameron felt that one scene would cut into the chase more effectively if the truck were coming from the opposite direction, but a large street sign appeared prominently in the shot. If the film were simply flopped, the sign would read backwards. The PDI staff digitally reversed the lettering on the sign and corrected the perspective for the altered point of view.
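The sign fix can be sketched in miniature--function names are invented, each frame is reduced to rows of characters, and the perspective correction is omitted: flop (mirror) the whole frame, then re-mirror just the region holding the sign so its lettering reads correctly again.

```python
def flop_frame(frame):
    """Mirror every row, reversing the whole shot left-to-right."""
    return [row[::-1] for row in frame]

def unflip_region(frame, x0, x1):
    """Re-mirror columns x0..x1 of every row, restoring readable text."""
    return [row[:x0] + row[x0:x1][::-1] + row[x1:] for row in frame]

frame = [list("CAR>SIGN")]             # a toy one-row "shot"
flopped = flop_frame(frame)            # the sign now reads backwards
fixed = unflip_region(flopped, 0, 4)   # the sign occupies columns 0-3
print("".join(fixed[0]))               # SIGN>RAC
```

The rest of the frame stays mirrored, as the director wanted; only the lettering is restored.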

“On another shot for ‘T2,’ we removed a camera scratch,” Dixon adds. “It was an emergency, and we obviously didn’t look on it as any great artistic challenge. But we were in a position to use the technology to solve that problem for them: Without that intervention, they would have been forced to choose another take, re-shoot the scene or have a big scratch in the middle of the movie.”

This technique, called “digital compositing,” offers filmmakers more complete control over every element in the frame than has ever been possible before. Kasanoff describes it as “incredibly important.”

“You can suspend someone on a wire, then just take the wire out,” he says enthusiastically. “If there’s lint on someone’s shirt and it wasn’t in the previous shot, you can take it out. When Arnold jumps the canal on the motorcycle, it’s really a stunt man on a motorcycle rig, with a wire on each side of the cycle going up to cross wires held by cranes on either side of the canal. We digitally composited out all of those wires, making it look like someone drove a motorcycle off a 20-foot embankment. That to me is incredible: I believe this effect is going to become pervasive in filmmaking.”

These changes can be made so seamlessly that even an expert can’t tell from a print if the film has been altered. One animator irreverently suggested transposing old footage of the cast onto the faces of the crew in “Star Trek VI” to make them look younger. He later conceded that such extensive changes on human faces wouldn’t be entirely successful--yet. But as the technology advances, it becomes increasingly simple to doctor film--offering the Orwellian possibility of creating fake “news” footage and fraudulent evidence for trials.

“On a Schick commercial we recently did, we liked the first part of one take and the second part of another,” Dixon says. “They were each perfect, but the actor couldn’t do them at the same time. So using the morphing technique, we were able to combine the two takes seamlessly and get exactly what we want. That process has some potentially scary implications.”

Some of the effects in “T2” set new standards for realism in CGI. But all the artists involved in the film believe the medium still has limits that make it less effective for some images than the traditional techniques of model-building, matte painting, makeup and puppetry.

“It’s still difficult to do character animation that’s really high quality,” says Muren thoughtfully. “I don’t think CGI is really good at creating natural phenomena that move: We can’t do oceans . . . I mean, we can sort of do them and we can fudge them by mixing media, but it’s not quite there yet. Somewhere down the line, I know all that stuff is going to be done with the turn of a switch.”

“As far as the predictions go about doing realistic human figures, it’ll happen--someday,” Riddle adds. “But consider, say, the texture of skin. Being able to create realistic skin and hair that stands up off that skin is a really tough project. No one’s done much work toward developing techniques for doing fur and hair. And even if you could create something that looked like Marilyn Monroe for a single frame, how would you get it to move like a real human?

“In the near future, I think a lot of the other arrows in the filmmaker’s quiver are going to continue to be important,” he says. “Building a model you can touch has a lot of advantage, and a human can sculpt very high levels of detail that you just can’t get right now with a computer. You can also do a great deal more in terms of lighting--lighting is kind of a weak point right now in computer graphics.”

Until recently, CGI had been used primarily in television logos and commercials, rather than feature films, and viewers around the world can expect to continue seeing elaborate, computer-generated ads. ILM recently completed a Toyota commercial for Japan using the morphing technique: A pair of silvery metallic lips transform themselves into a new car that drives away.

“The effects from ‘Terminator’ are going to trickle down into the commercial market. I can imagine a floor wax commercial with something rising out of the floor to personify the wax,” Smythe predicts. “A few months after ‘The Abyss’ came out, a bubble bath commercial in Britain had a pseudopod-type thing coming out of the bathtub to hand a woman her bath oil. It’s interesting to see how your work gets translated into that market.”

But the success of “Terminator 2” seems to have assured CGI an increasingly important place in feature filmmaking.

“It used to be that if you wanted to see cool stuff, you’d go to the movies, because that’s where it happened, but video has completely eclipsed that aspect of filmmaking in the last few years,” Dixon says. “You see far more interesting stuff on TV than you do in films: MTV, ‘Liquid TV’ or even the new ‘Star Trek’ series--it has zillions of effects. But I think that’s going to change now, and feature films are going to become the exciting area of entertainment.”

In upcoming films, Muren says, “we’re reusing some of the techniques we developed for ‘Terminator 2,’ but we’re applying them in entirely different ways. Audiences won’t realize it’s the same software, but we couldn’t do these films without doing ‘T2’ first.”

The artists at PDI recently completed their first film using a technique they call “performance animation.” A human actor wears an armature (called a “Waldo,” after the mechanical arms in a Robert Heinlein story) that fits over his head, arms and upper body. The armature is attached to a computer that records his body positions and mirrors them on a video screen with a simple computer-generated figure in real time. The actor works out the character’s movements, much the way a puppeteer does. The computer stores that information and can later reproduce the motions with a fully rendered character.
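The record-and-replay core of the idea can be sketched as below--the class, the per-frame dictionary of joint angles, and all names are invented for illustration, not PDI's actual system: each frame of armature readings is stored, and the same motion can later be fed to a fully rendered character.

```python
class PerformanceRecorder:
    """Toy store-and-replay buffer for a performance-animation session."""

    def __init__(self):
        self.frames = []            # one dict of joint angles per frame

    def capture(self, joint_angles):
        """Store a copy of this frame's armature readings."""
        self.frames.append(dict(joint_angles))

    def replay(self):
        """Yield the recorded frames in order, e.g. to a renderer."""
        yield from self.frames

rec = PerformanceRecorder()
rec.capture({"head": 0.0, "elbow": 45.0})   # live readings, frame 1
rec.capture({"head": 5.0, "elbow": 50.0})   # live readings, frame 2
for frame in rec.replay():
    print(frame["elbow"])                   # 45.0 then 50.0
```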

The new technique is showcased in “MuppetVision 3-D,” a 70-millimeter, 3-D film that premiered recently at the Disney/MGM Studio Tour in Orlando, Florida. Waldo C. Graphic, the computer character the PDI staff designed and animated for “The Jim Henson Hour” (1989), appears to fly out of the screen and interact with the audience.

“A lot of the gags in the film involve Waldo because he’s the only character who can come out past the screen,” Rosendahl said. “The rest of the Muppets can only lean forward, because they’re bound below the edge of the screen.”

All the artists involved in top-of-the-line computer graphics seem excited at the prospect of devising new visuals for features, rather than repeating what can be done by conventional methods.

“The bottom line is that something you can do in traditional media isn’t of interest to us,” Rosendahl says. “We’re interested in doing things you can’t do any other way and really using the technology to create new images and new effects, and to express ideas you couldn’t express before.”

“After ‘T2,’ a lot of people have come to us and said, ‘Boy, if we get a shape-changing movie, you’re the ones to do it,’ ” Muren said. “But we don’t want to do shape changing again. We have tools that allow us to do shape changing, but that’s not all these tools can be used for: You can build different types of houses with one hammer.”
