Visual effects nominees, in the mix from the get-go, are eroding cinema’s limits
The five nominees for the 2023 Oscar for achievement in visual effects didn’t just wait at their computers for the postproduction process to begin. Each was intimately involved before, during and after production, with boots on the ground and capes in the air.
“All Quiet on the Western Front”
To help achieve the most organic version of “All Quiet on the Western Front” possible, visual effects supervisor Frank Petzold got down in the trenches. He was on set to experience the sights, sounds, feels and smells of those brutal battles.
“All the snow, the freezing, the being in water knee-high; we were there too. You’re part of the filmmaking process rather than being presented with a plate at the beginning of postproduction. You throw ideas around. You’re attaching your heart and your soul to every effects shot.
“I like to improvise. We did 3-D scans of the whole battlefield and everything. But on set, ‘You guys are doing dialogue; let me grab a few cameras, grab some stunt guys and ... blow up some stuff with the [practical effects] guys.’”
What most viewers will likely take away from the movie is the visceral insanity of its battle scenes.
“I shot stuff with six different cameras: a lot of soldiers running — different camera speeds, just collecting elements. On another project, you’d go, ‘Let’s simulate this in the computer.’ I didn’t want that. We added fog, the airplanes we didn’t have. We had one tank that could go a few yards. At some point you have to get out the hot-glue gun and do stuff.”
“Avatar: The Way of Water”
James Cameron’s “Avatar” sequel isn’t so much a marriage of concept, production design, cinematography, sound, costumes, acting and visual effects as a new creature with the DNA of each.
“We get involved pretty early because we’re helping with character design, we’re helping build out the world,” says visual effects supervisor Joe Letteri. Fantastic environments and action aside, the film’s ultimate success is determined by whether it can make viewers forget they’re watching nine-foot-tall, blue aliens and see the actors’ performances instead.
“This new facial system is the key,” Letteri says. “It allows us to look into what the face does [inside], so we understand better what the actors do. We built a neural network that can analyze the performances. That was a real breakthrough. If you move one muscle on your face, muscles on the other side of your face move; they’re all connected [underneath].”
They scanned the actors through a range of facial expressions and speech exercises. Cameron sat with actors and went through their entire performances with eight cameras trained on them. The resultant data — “tens of thousands of frames we fed into the system,” says Letteri — powered the engine that translated human performers into Na’vi characters.
“There’s a shot of Kiri when she’s been captured in the forest in the rain — it’s Sigourney Weaver playing a 14-year-old — it shows how this new system can express an actor’s performance through any character they want, even if it’s not how they look now. We live and die in the closeups.”
“The Batman”
Visual effects supervisor Dan Lemmon says Matt Reeves and company wanted to make “The Batman” “not so much a superhero movie, but a gritty, noir detective story, grounded in reality.”
To make an unreal place — Gotham City — and unlikely abilities — like Batman gliding to safety after jumping off a skyscraper — seem viable, the team used LED volumes — soundstages with massive video screens. They shot the actors against virtual Gotham cityscapes projected onto the screens, rather than green screens. Thus, instead of regular movie lighting, the actors were illuminated by the virtual sunset and city ambience.
Lemmon says, “You can see the sunset reflecting off Batman’s cowl, Selina Kyle’s outfit. Sunsets have a lot of different colors; you see that on their costumes — oranges and blues reflecting off everything, off the puddles on the ground.”
An LED volume also helped Batman escape from police headquarters.
“Matt’s mandate in the wing-suit sequence was, ‘I want people to think Robert Pattinson actually jumped off a building and landed without a parachute.’
“We watched action-sports videos, YouTube-Red Bull stuff. He wanted to emulate that style of photography — they’d have cameras mounted to their bodies. He felt that gave a sense of a real action video rather than a contrived piece of cinema,” Lemmon says.
“We built a wind tunnel out of LED panels. We hung both the professional wing-suit performer and Rob Pattinson on safety cables to keep them suspended in the wind tunnel and forced a lot of air across the suit. That gave it more of a sense of realism.”
“Black Panther: Wakanda Forever”
Visual effects supervisor Geoffrey Baumann says the most challenging part of “Black Panther: Wakanda Forever” was the underwater world of Talokan. However, “The most frightening part was making Namor [Talokan’s ruler, played by Tenoch Huerta Mejía] a character people would like. Namor had been reinvented so many times, but generally, he was just kind of an ass—.”
There was also the fact that, as in the comics, Namor’s power of flight depended on little wings on his ankles.
“How do you not make that silly?” Baumann asks.
“Wakanda” production departments ran many underwater tests. They discovered one character’s grass skirt played floating peekaboo in the water. They learned hair and clothing often didn’t behave, so some performers wore skull caps and had their lovely locks added in post, while some wore partial costumes, Baumann’s team adding the top layers digitally. Finally, they shot most underwater footage dry-for-wet (meaning actually out of the water), often reshooting scenes they’d already captured.
“The challenge was to be able to seamlessly intercut,” he says. The effort was helped by what the camera team and actors learned from shooting it wet initially.
For Namor’s movements, they studied athletes including triple jumpers and all-time great running back Barry Sanders. But when it came to those winged ankles ...
“The first step was, we animated,” Baumann says, laughing, “these mini CG helicopters on the ankle of a CG character to see how they would move.”
Somehow it all worked out — Namor’s flight became majestic and Huerta Mejía’s take on the character became downright ... likable.
“Top Gun: Maverick”
“Top Gun: Maverick” flies so high in large part because of how seamlessly it creates the illusion it’s really the characters performing those insane feats in the sky.
“We came in where things were just too dangerous to accomplish practically, because so much was practical, so much was real,” says Ryan Tudhope, visual effects supervisor.
Tudhope’s team would add or remove jets when necessary, change the skins on some planes to turn them into others and craft environments, such as expanding the enemy base.
Then there’s the climactic fight, pitting an F-14 jet from the era of the first movie against new “fifth generation” fighters. Much has been said about the camera rigs the filmmakers developed with the Navy to fit inside the cockpit of today’s F/A-18, the plane featured most in the movie.
The F-14 cockpit is smaller. Those cameras wouldn’t fit.
To maintain the film’s visual language and keep using those cameras, “it didn’t make sense to do it any other way than have [the actors] be in an F/A-18 ... then digitally replace it to look like [an F-14] Tomcat.
“There’s moments when you look over the top of Maverick onto the cockpit. That was an F/A-18, but we completely replaced it so you have the Tomcat cockpit. Even the little tracking screen is running at eight frames per second or something.
“That puts visual effects in a supporting role, which is where we wanted them to be on this film.”