Visual effects nominees, in the mix from the get-go, are eroding cinema’s limits

The five nominees for the 2023 Oscar for visual effects each show, in their own unique ways, how their art and craft are expanding what’s possible in film. Clockwise from top left: “Black Panther: Wakanda Forever”; “The Batman”; “All Quiet on the Western Front”; “Top Gun: Maverick” and Sigourney Weaver as Kiri in “Avatar: The Way of Water.”
(Photos from Marvel / Disney; DC / Warner Bros.; Netflix; Paramount Pictures; 20th Century Studios)

The five nominees for the 2023 Oscar for achievement in visual effects didn’t just wait at their computers for the postproduction process to begin. Each was intimately involved before, during and after production, with boots on the ground and capes in the air.

Paul (Felix Kammerer) faces down a mysterious enemy before Oscar-nominated VFX supervisor Frank Petzold works his magic in “All Quiet on the Western Front.”
(Frank Petzold/Netflix)
In the completed VFX shot, Paul (Felix Kammerer) shoots at a tank inserted digitally — along with smoke, debris and other subtle bits — by Oscar nominee Frank Petzold’s team in “All Quiet on the Western Front.”
(Frank Petzold/Netflix)

“All Quiet on the Western Front”

To help achieve the most organic version of “All Quiet on the Western Front” possible, visual effects supervisor Frank Petzold got down in the trenches. He was on set to experience the sights, sounds, feels and smells of those brutal battles.

“All the snow, the freezing, the being in water knee-high; we were there too. You’re part of the filmmaking process rather than being presented with a plate at the beginning of postproduction. You throw ideas around. You’re attaching your heart and your soul to every effects shot.

“I like to improvise. We did 3-D scans of the whole battlefield and everything. But on set, ‘You guys are doing dialogue; let me grab a few cameras, grab some stunt guys and ... blow up some stuff with the [practical effects] guys.’”

What most viewers will likely take away from the movie is the visceral insanity of its battle scenes.

“I shot stuff with six different cameras: a lot of soldiers running — different camera speeds, just collecting elements. On another project, you’d go, ‘Let’s simulate this in the computer.’ I didn’t want that. We added fog, the airplanes we didn’t have. We had one tank that could go a few yards. At some point you have to get out the hot-glue gun and do stuff.”

Sigourney Weaver’s performance is captured, and then used to generate a facial animation puppet from which the fully rendered character is created.
(©2022 20th Century Studios)

“Avatar: The Way of Water”

James Cameron’s “Avatar” sequel isn’t so much a marriage of concept, production design, cinematography, sound, costumes, acting and visual effects as a new creature with the DNA of each.

“We get involved pretty early because we’re helping with character design, we’re helping build out the world,” says visual effects supervisor Joe Letteri. Fantastic environments and action aside, the film’s ultimate success is determined by whether it can make viewers forget they’re watching nine-foot-tall blue aliens and see the actors’ performances instead.

“This new facial system is the key,” Letteri says. “It allows us to look into what the face does [inside], so we understand better what the actors do. We built a neural network that can analyze the performances. That was a real breakthrough. If you move one muscle on your face, muscles on the other side of your face move; they’re all connected [underneath].”

They scanned the actors through a range of facial expressions and speech exercises. Cameron sat with actors and went through their entire performances with eight cameras trained on them. The resultant data — “tens of thousands of frames we fed into the system,” says Letteri — powered the engine that translated human performers into Na’vi characters.

“There’s a shot of Kiri when she’s been captured in the forest in the rain — it’s Sigourney Weaver playing a 14-year-old — it shows how this new system can express an actor’s performance through any character they want, even if it’s not how they look now. We live and die in the closeups.”


One of the most important uses of visual effects in “The Batman” was the creation of Gotham City.
(Warner Bros. Pictures & DC Comics)
The Oscar-nominated visual effects team of “The Batman” helped achieve this shot by creating a digital Gotham and projecting it on an LED volume — a massive wall of giant LED screens.
(Warner Bros. Pictures & DC Comics)

“The Batman”

Visual effects supervisor Dan Lemmon says Matt Reeves and company wanted to make “The Batman” “not so much a superhero movie, but a gritty, noir detective story, grounded in reality.”

To make an unreal place (Gotham City) and unlikely abilities (like Batman gliding to safety after jumping off a skyscraper) seem viable, the team used LED volumes: soundstages wrapped in massive video screens. They shot the actors against virtual Gotham cityscapes projected onto the screens rather than green screens. Thus, instead of conventional movie lighting, the actors were illuminated by the virtual sunset and city ambience.

Lemmon says, “You can see the sunset reflecting off Batman’s cowl, Selina Kyle’s outfit. Sunsets have a lot of different colors; you see that on their costumes — oranges and blues reflecting off everything, off the puddles on the ground.”


An LED volume also helped Batman escape from police headquarters.

“Matt’s mandate in the wing-suit sequence was, ‘I want people to think Robert Pattinson actually jumped off a building and landed without a parachute.’

“We watched action-sports videos, YouTube-Red Bull stuff. He wanted to emulate that style of photography — they’d have cameras mounted to their bodies. He felt that gave a sense of a real action video rather than a contrived piece of cinema,” Lemmon says.

“We built a wind tunnel out of LED panels. We hung both the professional wing-suit performer and Rob Pattinson on safety cables to keep them suspended in the wind tunnel and forced a lot of air across the suit. That gave it more of a sense of realism.”

Namor (Tenoch Huerta Mejía) sits on his underwater throne in “Black Panther: Wakanda Forever.” Many scenes, including this one, were shot dry-for-wet (not underwater) as well as underwater; it fell to the Oscar-nominated visual-effects team to make them match and appear as if they all occurred under the sea.
(Annette Brown/Marvel Studios)
The finished shot places Namor underwater.
(Marvel Studios)

“Black Panther: Wakanda Forever”


Visual effects supervisor Geoffrey Baumann says the most challenging part of “Black Panther: Wakanda Forever” was the underwater world of Talokan. However, “The most frightening part was making Namor [Talokan’s ruler, played by Tenoch Huerta Mejía] a character people would like. Namor had been reinvented so many times, but generally, he was just kind of an ass—.”

There was also the fact that, as in the comics, Namor’s power of flight depended on little wings on his ankles.

“How do you not make that silly?” Baumann asks.

“Wakanda” production departments ran many underwater tests. They discovered one character’s grass skirt played floating peekaboo in the water. They learned hair and clothing often didn’t behave, so some performers wore skull caps and had their lovely locks added in post, while others wore partial costumes, with Baumann’s team adding the top layers digitally. Finally, they shot most underwater footage dry-for-wet (meaning actually out of the water), often reshooting scenes they’d already captured.

“The challenge was to be able to seamlessly intercut,” he says. The effort was helped by what the camera team and actors learned from shooting it wet initially.

For Namor’s movements, they studied athletes including triple jumpers and all-time great running back Barry Sanders. But when it came to those winged ankles ...

“The first step was, we animated,” Baumann says, laughing, “these mini CG helicopters on the ankle of a CG character to see how they would move.”

Somehow it all worked out — Namor’s flight became majestic and Huerta Mejía’s take on the character became downright ... likable.


The “before” picture (top) shows the F/A-18 jets as they were photographed for the climactic battle in “Top Gun: Maverick.” The “after” picture (bottom) shows the jets after the visual-effects team changed them into a fifth-generation enemy plane (rear) and a much-older F-14 Tomcat fighter (foreground).
(Framestore; Paramount Pictures)

“Top Gun: Maverick”

“Top Gun: Maverick” flies so high in large part because of how seamlessly it creates the illusion it’s really the characters performing those insane feats in the sky.

“We came in where things were just too dangerous to accomplish practically, because so much was practical, so much was real,” says Ryan Tudhope, visual effects supervisor.

Tudhope’s team would add or remove jets when necessary, change the skins on some planes to turn them into others and craft environments, such as expanding the enemy base.

Then there’s the climactic fight, pitting an F-14 jet from the era of the first movie against new “fifth generation” fighters. Much has been said about the camera rigs the filmmakers developed with the Navy to fit inside the cockpit of today’s F/A-18, the plane featured most in the movie.


The F-14 cockpit is smaller. Those cameras wouldn’t fit.

To maintain the film’s visual language and keep using those cameras, “it didn’t make sense to do it any other way than have [the actors] be in an F/A-18 ... then digitally replace it to look like [an F-14] Tomcat.

“There’s moments when you look over the top of Maverick onto the cockpit. That was an F/A-18, but we completely replaced it so you have the Tomcat cockpit. Even the little tracking screen is running at eight frames per second or something.

“That puts visual effects in a supporting role, which is where we wanted them to be on this film.”