
The Mandalorian's Breakthrough VFX Explained

With its simple narrative and retro-futuristic western aesthetic, The Mandalorian represents a bright spot in the modern Star Wars landscape, especially following the uneven response to The Rise of Skywalker. What you may not know is that the flagship Disney+ series also sits at the bleeding edge of production technology.

The first season of The Mandalorian features a lot of long, lingering shots of outdoor alien landscapes — and normally, one would assume these were generated in post-production, well after filming wrapped and the cast went home. The Mandalorian is different because it uses a new kind of digital matte technology: LED projection panels driven by software called StageCraft, which means the digital sets exist while filming is underway. The backgrounds can be wholly generated by digital artists, or real locations can be captured through photogrammetry and projected onto the screens.

Since so much about The Mandalorian was kept secret before it premiered at the launch of Disney+ in November 2019, many of the novelties of its behind-the-scenes process are only now coming to light. ICG Magazine, a premier industry publication for cinematography, recently published an in-depth article about the new technology used on The Mandalorian. Since it's a professional magazine, the information is highly technical, so let's take a slightly simpler look at why this new production method is such a big deal, and why it has the potential to revolutionize the industry going forward.

Melding the old method with the new for The Mandalorian

Everybody is familiar with the tried-and-true method of matte painting in film — creating large, intricately painted backgrounds to blend the physical set with a simulated environment during shooting. When the entertainment world shifted to digital post-production, the process effectively reversed: backgrounds were added after filming, sometimes so completely that no physical sets were constructed at all. Everything is draped in that characteristic green, and actors talk to tennis balls hung on a string. We take it for granted now, but the bad texturing, mismatched perspective, and wooden camera work of early-2000s post-production effects isn't that far behind us.

This new technology marries both eras of set production: pre-rendering entire environments, then projecting them onto LED screens that update in real time to follow the cinematographer's camera, which is tracked with IR readers. One of The Mandalorian's two directors of photography, Greig Fraser, welcomes the technology in his interview with ICG Magazine: "It's phenomenal because it gives so much power back to the cinematographer on set, as opposed to shooting in a green screen environment where things can get changed drastically in post."

With StageCraft technology, the static two dimensions of traditional matte painting become dynamic. The LED screens onto which the background is projected are curved into a semicircle. The system doesn't just simulate an entire environment as if the camera exists within it; it can also be changed on the fly as the cinematographer wants or needs. The digital sun can stay at dusk all day, or jump to high noon if so desired, and perspective shifts as the camera moves. A practical set can still be built wherever that plays to the strengths of each method. For example, part of the Mandalorian's (Pedro Pascal) ship, the Razor Crest, was physically built so the actors could walk inside it, but the rendered environment meant the team didn't have to build anything beyond what was needed for that tactile interaction.
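
To make that idea concrete, here's a minimal, purely illustrative Python sketch of the kind of loop such a system runs: read the tracked camera pose each frame, redraw the wall from that viewpoint, and accept operator changes (like moving the digital sun) between frames. Every function and field name below is a placeholder for the concept; none of this is StageCraft or Unreal code.

```python
# Conceptual sketch of an LED-volume render loop: the wall shows the
# pre-built environment from wherever the tracked camera happens to be,
# and an operator can retune scene parameters (like the sun) mid-take.
# All names here are hypothetical stand-ins, not real production code.

import math
import time
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float        # position on the stage, meters
    y: float
    z: float
    pan: float      # orientation, degrees
    tilt: float

@dataclass
class SceneState:
    sun_elevation_deg: float = 8.0    # "stay at dusk all day"
    sun_azimuth_deg: float = 250.0

def read_tracked_pose(t: float) -> CameraPose:
    """Stand-in for the IR tracking system: here we just sweep the camera."""
    return CameraPose(x=2.0 * math.sin(t), y=1.7, z=4.0,
                      pan=10.0 * math.sin(t), tilt=-2.0)

def render_background(pose: CameraPose, scene: SceneState) -> str:
    """Stand-in for the real-time renderer feeding the LED wall. A real
    system re-projects the environment so parallax and perspective stay
    correct from the camera's point of view."""
    return (f"view from ({pose.x:+.2f}, {pose.y:.2f}, {pose.z:.2f}), "
            f"pan {pose.pan:+.1f} deg, sun at {scene.sun_elevation_deg:.0f} deg")

def main() -> None:
    scene = SceneState()
    for frame in range(6):                          # a real loop runs at 24+ fps
        pose = read_tracked_pose(frame / 24.0)      # 1. where is the camera now?
        print(f"frame {frame}: {render_background(pose, scene)}")  # 2. redraw wall
        if frame == 3:                              # 3. operator request mid-take:
            scene.sun_elevation_deg = 60.0          #    jump the sun to "high noon"
        time.sleep(0.01)

if __name__ == "__main__":
    main()
```

The point of the sketch is the ordering: the wall is redrawn after the camera pose is read, which is why the perspective always matches the shot instead of being reconciled months later in post.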

Gaming's strengths benefit other forms of entertainment

How does a production create and change a digital environment that can be used during filming? By borrowing a bit of help from video games. 

The Unreal Engine — the baseline tool many triple-A video game development studios use to create their products, including Fortnite — has found a home assisting Disney. The game development engine is designed to give artists a head start, offering modular tools and assets that can be customized without the lengthy process of coding from scratch. It's fast and easy to manipulate, so the background set can be built to the exact specifications of the cinematographer, who is in charge of lighting and blocking every shot of the show, before filming even begins. Two workstation operators were also constantly on set to make changes for cinematographers Baz Idoine and Greig Fraser.
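
A rough way to picture that modular, "no coding from scratch" workflow is a scene described as data that an operator can edit and reload on the spot. The toy Python sketch below is only an analogy under that assumption; real engines like Unreal have their own asset and Blueprint systems, and nothing here is actual Unreal API.

```python
# Illustrative only: a toy, data-driven scene description in the spirit of
# the modular approach described above. The asset names and fields are
# invented for the example.

import json

scene_description = """
{
  "environment": "desert_dusk",
  "props": [
    {"asset": "moisture_vaporator", "position": [12.0, 0.0, -30.0]},
    {"asset": "rock_outcrop_large", "position": [-8.0, 0.0, -45.0]}
  ],
  "lighting": {"sun_elevation_deg": 8, "haze": 0.4}
}
"""

def load_scene(text: str) -> dict:
    """Parse the operator-editable description into something a renderer
    could consume. Editing this data (or a UI bound to it) is the
    'change it without recoding' workflow, in miniature."""
    scene = json.loads(text)
    print(f"Loaded environment '{scene['environment']}' with "
          f"{len(scene['props'])} props, sun at "
          f"{scene['lighting']['sun_elevation_deg']} deg elevation")
    return scene

if __name__ == "__main__":
    load_scene(scene_description)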

"I worked with guys who built 3D models, and to some degree, I had to teach them filmmaking," Fraser explained to ICG Magazine. "These images were not just created for a movie; they were going up on a screen that had to fully integrate with what I did on the live-action side. That meant showing them how I would light something and also why it was important to me that the light be controlled in that fashion, and how it was in service to the story." 

This way, the background setting is custom-made to the circumstances of the storytelling, rather than being forced to fit imperfectly after the fact with no input from the person who crafted the visual narrative. Your average 3D modeler isn't necessarily trained to appreciate the variables light brings to a real-world set, and they certainly can't anticipate the artistic vision of a cinematographer they usually never meet, so this technology's ability to let them collaborate before and during production is a milestone in television and film work.

Limitations don't make for a magic bullet

Every technology has its limits, and understanding where and when any tool is best utilized is part of quality filmmaking. This LED panel system was first used by Disney for certain shots in Rogue One: A Star Wars Story, but when filming was underway in 2015, pixel size prevented using the screens for anything more than wide shots. When cameras focus on screens too closely, a moiré effect is produced. (If you've ever taken a photo of a TV or computer screen with your phone and seen a wavy pattern, you've seen the moiré effect in action.) The newest version of the panels has drastically reduced pixel size — Rogue One's screens were 9mm, and today it's about 4mm — but the cinematographer still needs to use particular lenses to avoid focusing issues.
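
As a rough back-of-the-envelope illustration of why pitch and distance matter, the sketch below computes how small a single LED pixel looks from the camera's position. The distances are arbitrary examples chosen for the math, not figures from the production; the pitches are the ones quoted above.

```python
# Back-of-the-envelope look at pixel pitch: the finer the LED grid appears
# from the camera's position, the less likely the sensor is to resolve it
# and produce moire. Distances here are arbitrary examples.

import math

def pixel_angle_arcmin(pitch_mm: float, distance_m: float) -> float:
    """Angle one LED pixel subtends at the camera, in arcminutes."""
    radians = math.atan(pitch_mm / 1000.0 / distance_m)
    return math.degrees(radians) * 60.0

for pitch in (9.0, 4.0):                 # older vs newer panel pitch (mm)
    for distance in (3.0, 6.0, 12.0):    # camera-to-wall distance (m)
        angle = pixel_angle_arcmin(pitch, distance)
        print(f"{pitch:.0f} mm pitch at {distance:>4.0f} m -> "
              f"{angle:5.2f} arcmin per pixel")

# Halving the pitch (or doubling the distance) roughly halves the apparent
# pixel size; lens choice and keeping the wall softly out of focus do the rest.
```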

The curved stage can also only contain so much. The "volume" of a set is the total area the camera can see, which depends on how wide its angle of view is. If the relative perspective or size of an object doesn't work for rendering onto the LED screen, it's constructed practically or filled in with traditional post-production VFX. The LED panels' RGB colors can be difficult to match against real-world color gamuts, too.
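
For a sense of scale on the "volume" idea, the snippet below works out how wide a slice of background a lens needs at a given distance, using basic geometry. The field-of-view and distance figures are illustrative, not the stage's real dimensions.

```python
# Quick illustration of the "volume" idea: how wide a slice of the world the
# camera sees depends on its horizontal angle of view and distance to the
# wall. Numbers are illustrative only.

import math

def coverage_width_m(horizontal_fov_deg: float, distance_m: float) -> float:
    """Width of the scene visible at a given distance for a given field of view."""
    return 2.0 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)

for fov in (30.0, 60.0, 90.0):     # narrow, medium, wide angles of view
    width = coverage_width_m(fov, distance_m=6.0)
    print(f"{fov:.0f} deg field of view at 6 m needs about {width:.1f} m of background")

# Anything the lens sees beyond what the LED wall can plausibly show has to
# be built practically or painted in later with traditional VFX.
```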

Finally, as you might imagine, it's insanely expensive. Single LED panels can reach $3,000 apiece, and we're talking about an entire custom-made stage of them for The Mandalorian, including a ceiling. Unreal does distribute its software suite for free if you're an independent creator, but it's a very different story for custom non-game uses like Disney's projects, not to mention all the hardware required to do that kind of on-the-spot rendering. Using the system can reduce overall cost because it limits the need to shoot on location, but the up-front expense and training are the kind of thing only mega studios can afford right now. It'll be some time before this remarkable technology truly takes over the entertainment world.