A Leap in Technology for a Giant Step Back in Time

For four years, about 350 people labored to make a movie using technology that often didn’t exist until they invented it. They seemed to play God with filmmaking, creating a digital world that frequently combined multiple live-action backgrounds with computer animation in a single scene. “They just look into their computer screens, and they can have anything they want,” says Pam Marsden, who produced “Dinosaur.” “The frontier is limitless.” Four of those pioneers spoke with Times staff writer Valerie J. Nelson about how they brought a new dinosaur world to the screen.

Dick Zondag

Supervising Animator, Bruton

Zondag was working in story development for “Dinosaur” when he was given the chance to animate Bruton, an aging iguanodon who’s the lieutenant for the film’s bad-guy dinosaur. Like many artists who worked on the film, his previous experience was in drawn, not computer, animation. His brother, Ralph Zondag, was one of the film’s two directors.

*

Bruton is the largest of all of the iguanodons, but he’s overweight and he’s definitely in middle age. I needed to be able to show that. We played him a little like a big Cadillac, where you’d hit the brakes but the car would keep moving.

To place Bruton in a scene, I am handed a folder with a sheet of paper that has a frame-by-frame accounting of the scene. The dialogue is written on the sheet, with every letter of every word broken down.

That packet also has a storyboard that gives me a brief visual reference of what’s expected of my character. Then I have a little conversation with the director, in this case Eric Leighton. I would pitch him any ideas I had about what the character might do. He goes away, and I go to work. We used two different computer programs. One did the body animation, the other the facial detail. I’d type in a code that would bring up the exact scene, with the virtual background already in place, prepared for me in advance by the layout department.

If there’s a line of dialogue, I have another tool that replays the dialogue in concert with the animation. It’s like turning on the video as I work. Once I’m happy with the animation, I go back and do quite a few renderings. I’m constantly tweaking the animation. Once I feel I’m in the ballpark, I show it to Eric.

At that point, we would probably buy off on the scene in its rough form. I’d save it, then import it into the other program to animate the actual face. I always tend to animate the eyes first. I didn’t allow Bruton to blink too much.

After the eyes, I’d do the rest of the features. The mouth and dialogue came after the fact. Then I’d add the last little bits of detail, such as jaw clenches or muscle spurts. Once I got it, I’d show it to Eric. If he liked it, it would move on to the next department, skin and muscles, and we would go back and forth on it. We played down the amount of muscle inertia on his arms and legs and played up the skin inertia. His skin was looser than his muscles.

Bruton didn’t enter the film until midway through Act 2, and he was more of a side character with less footage. But one person can only ever go so fast. My crew wasn’t so big. I had one other animator and a couple of assistants.

Sean Phillips

Model Development Supervisor

In “skin and muscles,” as model development became known, characters were fleshed out as realistically as possible. Phillips, who has a degree in engineering, supervised secondary animation that made muscles bulge and skin wrinkle. The filmmakers consider the work a technological breakthrough.

*

We wanted to build a full muscle-simulation system that would drive the exterior skin. In the past, skin was wrapped around bones and any musculature was animated by hand. We spent a lot of time looking at elephants. They were about the largest creatures we could find that came close to the size of the dinosaurs, although they were still smaller.

Unlike other films, we didn’t have human characters for a sense of scale. We needed the characters themselves to give a sense of their massiveness. Without the moving skin and bulging, they don’t seem as big.

First, we built the exterior skin of the character, in a neutral pose, that was a shell of what it would look like. From there, we fit the bones back into the character. Those would serve as the animators’ controls. We’d set the point where the hip or knee would rotate inside the interior skin.

We fit muscles between the skin and bones, then had animators do a range of motion study--sort of calisthenics for dinosaurs. It gave us a starting point to adjust the parameters. We would ask, “How much should it bulge when it stops quickly?”

Then we worked on adjusting the muscles to behave in a way that seemed reasonable with skin attached to them. Each point in the skin was made to stick to the closest point in the underlying muscle. That made for skin that looked crinkly where limbs joined the body.
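
A rough sketch of that closest-point binding idea, written here in Python with invented names and a simplified data layout (not the studio's actual code):

    import numpy as np

    def bind_skin_to_muscle(skin_rest, muscle_rest):
        # In the neutral pose, each skin vertex remembers the index of the
        # closest muscle point and its offset from that point.
        distances = np.linalg.norm(
            skin_rest[:, None, :] - muscle_rest[None, :, :], axis=2)
        closest = distances.argmin(axis=1)
        offsets = skin_rest - muscle_rest[closest]
        return closest, offsets

    def deform_skin(muscle_posed, closest, offsets):
        # Each skin vertex simply follows its bound muscle point, which is
        # what produces the crinkly look where limbs meet the body.
        return muscle_posed[closest] + offsets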

To refine the skin, we turned it into a spring-mesh system that allows each point in the skin to push and pull. The initial skin is analogous to a cotton shirt that wrinkles around your elbows and arms as you bend, while the spring mesh applied on top makes the skin act like Lycra, stretched tight yet showing the push and pull beneath it. But it smoothed out the ribs and other bones. So we developed an attribute map that controlled how much Lycra-like material you’d apply in a given area.
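
A minimal sketch of the spring-mesh smoothing pass and the painted attribute map, assuming a simple per-vertex stiffness value in place of the production tools:

    import numpy as np

    def relax_skin(bound_pos, neighbors, stiffness, iterations=10):
        # bound_pos: crinkly positions from the muscle binding
        # neighbors: list of neighbor indices for each skin vertex
        # stiffness: painted attribute map, 0 = cotton-shirt wrinkles kept,
        #            1 = fully Lycra-like, stretched smooth over the bulges
        pos = bound_pos.copy()
        for _ in range(iterations):
            for i, nbrs in enumerate(neighbors):
                if not nbrs:
                    continue
                average = pos[nbrs].mean(axis=0)
                pos[i] = (1.0 - stiffness[i]) * bound_pos[i] + stiffness[i] * average
        return pos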

That was the pre-production that got us ready for actual shots from the film. In production, we would get motion from animation, run it through the system, critique the final skin, then go back in. We would want a little more bulging in a front arm or more bounce if a dinosaur was running really fast.

We did about 30 scenes a week for two years, which was a pretty nutty pace. At the end of the day, it was about getting it right within reason. Once they decided the dinosaurs would be talking, we joked, “How real do they have to be?”

Charles Colladay

Look Development Technical Director

In the credits, Colladay is listed as “fur stylist,” but he’s actually a member of the digital effects team that invented a tool to create realistic-looking animated fur. The tool has already been utilized in another upcoming Disney film, “102 Dalmatians.”

*

Our department is where the rubber meets the road. We would get all this art direction, then have to translate it into some deliverable product. We took models that didn’t have texture and skin and made them look kind of appealing by putting on skin and fur. Over the course of 3 1/2 years, I helped develop a fur tool that enabled us to accomplish our objectives.

The fur tool could paint the colors of the hair and the skin, and control which direction the hair would go--the grooming. We could put rim lighting on the fur, so the light would show through the edges. Each hair has an individual serial number, and we could play with that number to give the hair more visual interest.
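
One way a per-hair serial number can be used, sketched in Python; the specific attributes and ranges here are invented for illustration:

    import random

    def hair_variation(serial, base_color, base_length):
        # Seeding with the serial number makes the variation repeatable,
        # so each hair keeps the same look from frame to frame.
        rng = random.Random(serial)
        jitter = lambda value, amount: value * (1.0 + rng.uniform(-amount, amount))
        color = tuple(min(1.0, max(0.0, jitter(c, 0.08))) for c in base_color)
        length = jitter(base_length, 0.15)
        lean = rng.uniform(-5.0, 5.0)   # degrees off the groomed direction
        return color, length, lean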

The fur had to be wet, it had to get dusty, and it had to blow in the wind. We developed an inertial guidance system. The wet hair has its own weight, different from the surrounding dry hair, which makes it move slightly differently, as if it’s on a spring. We also came up with a hairnet system that keeps hairs from crossing over into other hairs. It’s something you would never see, but if it didn’t exist, you’d definitely notice it.
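
The extra weight on wet hair behaves roughly like a heavier damped spring. A simplified one-dimensional sketch, with made-up spring constants:

    def hair_spring_step(offset, velocity, target, wetness, dt=1.0 / 24.0):
        # Wet hair carries extra mass, so it lags and settles more slowly
        # than the surrounding dry hair.
        mass = 1.0 + 2.0 * wetness            # wetness in [0, 1]
        stiffness, damping = 40.0, 4.0
        accel = (stiffness * (target - offset) - damping * velocity) / mass
        velocity += accel * dt
        offset += velocity * dt
        return offset, velocity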

We developed a way to have the fur and grass react to force objects. One lemur would lay a hand on another lemur, and the hairs would have to be pushed out of the way. The same approach was useful on the dinosaur’s feet. We put the force objects on the feet to push the grass down, combined with a hold effect that keeps the grass down after the dinosaur has lifted its foot from where it stepped.
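
A bare-bones sketch of a spherical force object on a foot plus the hold effect, with invented parameters:

    import numpy as np

    def bend_from_foot(blade_pos, foot_pos, foot_radius, prev_bend, hold=0.98):
        # A blade inside the foot's radius is pushed flat; the hold factor
        # keeps most of the bend after the foot has lifted away.
        dist = np.linalg.norm(blade_pos - foot_pos)
        pushed = max(0.0, 1.0 - dist / foot_radius)   # 1.0 = fully flattened
        return max(pushed, prev_bend * hold)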

The grass is basically green fur with a tiny little texture map painted on each blade. I painted 12 maps, or blades, for the grass, which are randomly selected when a field of grass is created. Another program creates a vortex for the wind. It’s a nice swirly effect that you can see moving through the field of grass. With normal grass, you can’t change the wind speed or the direction of the light falling on it.
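
A sketch of the random blade selection and the wind vortex, with hypothetical file names and numbers:

    import math
    import random

    BLADE_MAPS = ["grass_blade_%02d.tif" % i for i in range(12)]   # hypothetical names

    def plant_blade(x, z, seed):
        # Each blade of "green fur" picks one of the 12 painted maps at random.
        rng = random.Random(seed)
        return {"map": rng.choice(BLADE_MAPS), "pos": (x, z),
                "phase": rng.uniform(0.0, 2.0 * math.pi)}

    def vortex_sway(blade, center, time, strength=0.3):
        # Blades lean perpendicular to the line from the vortex center,
        # fading with distance, which gives the swirl moving through the field.
        dx, dz = blade["pos"][0] - center[0], blade["pos"][1] - center[1]
        dist = math.hypot(dx, dz) + 1e-6
        falloff = strength / (1.0 + dist) * math.sin(time + blade["phase"])
        return (-dz / dist * falloff, dx / dist * falloff)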

There are very powerful things about doing a whole movie in 3-D. You have a whole lot of choices, sometimes too many, and at some point you stop. We got to the point where it looked pretty good. Everything feels like it fits in the same world. They use this term, “photo-real.” I’m not sure it’s photo-real, but it’s definitely cohesive.

Neil Eskuri

Digital Effects Supervisor

Special effects are present in all but two of the 1,245 shots in “Dinosaur.” Eskuri’s practiced hand reached into every department “to create the vision on screen that the directors have in their minds.”

*

In terms of difficulty, I like to talk about the last two scenes in the movie. Each scene has more than 500 characters. Those two scenes last about a minute total, yet they each took six months to do.

In the first of those scenes, we had to create the entire foreground. Throughout the course of production, the ending changed, and we couldn’t send the crew back to Hawaii to shoot new background plates. We developed a technique of image projection that allowed us to paint what we wanted in the foreground. Twenty-three paintings are projected onto the foreground, matched to the camera’s position as it pulls back to reveal the nesting ground.
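
The image-projection idea amounts to sticking a painting onto 3-D geometry from a fixed camera. A simplified pinhole-camera sketch, with assumed inputs:

    import numpy as np

    def project_painting_uv(point_world, cam_pos, cam_rot, focal, width, height):
        # Transform the foreground point into the projecting camera's space,
        # then find where it lands in the painting so that pixel can be
        # glued to the geometry as the shot camera pulls back.
        p_cam = cam_rot.T @ (point_world - cam_pos)
        if p_cam[2] <= 0:
            return None                      # behind the projecting camera
        u = focal * p_cam[0] / p_cam[2] + width * 0.5
        v = focal * p_cam[1] / p_cam[2] + height * 0.5
        return u, v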

The other scene was going to be a helicopter shot that followed the bird and pulled back. Again, the story changed, and we couldn’t do that anymore. We liked the beginning of the shot, so we used the projection technique again to marry the extension of the shot to the end of the helicopter shot.

To do that, we replaced the entire water plane and digitally extended the background so it matched the helicopter camera movement. We also had to create the far background of the mountains, which didn’t really exist. About halfway through the scene, we transition from live-action helicopter to complete CG [computer graphic] camera. It’s very hard to see where it happens. The characters are walking along live-action ground that becomes CG ground.

The idea of effects on the show was to utilize every technique available. So we used whatever worked for a particular scene, whether it was practical effects shot on location or inserted on stage, a computer graphic effect, or traditional effects, meaning they were drawn frame by frame. Whenever we could get the effect in camera, that’s what we would use, because it just looks better.

For digital effects, we created effects engines that would allow artists to generate the same type of effect whenever it was required, so we wouldn’t have to reinvent the wheel every time. This gave us tremendous flexibility and efficiency to do a lot of interactive effects--splashes, dust puffs, explosions, moving vines, comets, fireballs.
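
In spirit, an effects engine is a reusable, parameterized routine. A toy dust-puff example in Python, with all names and numbers invented:

    import random

    def dust_puff(origin, count=200, spread=0.5, lift=0.2, life=30, seed=0):
        # The same routine makes a small scuff or a big impact cloud;
        # only the parameters change from shot to shot.
        rng = random.Random(seed)
        particles = []
        for _ in range(count):
            velocity = (rng.uniform(-spread, spread),
                        rng.uniform(0.0, lift),
                        rng.uniform(-spread, spread))
            particles.append({"pos": list(origin), "vel": velocity,
                              "life": rng.randint(life // 2, life)})
        return particles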

It’s similar to other films in terms of techniques used, but the sheer magnitude of shots is really only rivaled by “Phantom Menace.” It’s different in that we’re looking for a consistent style, where we are creating characters and environments that had to be believable all the way across.

My philosophy is that the possible is easy; the impossible just takes a little more time and money. At some point we had to stop. We were always asking ourselves, “When do we freeze the code for the movie?” We didn’t freeze the code until about 10 months before the end of production. We said, “This is it, we have to sprint to the finish line.”
