‘The Mandalorian’: How ILM’s Innovative StageCraft Tech Created a ‘Star Wars’ Virtual Universe

After making the virtual production leap on “The Lion King” with the help of virtual reality and a real-time game engine, director Jon Favreau took it a step further with Industrial Light & Magic in the making of his Disney+ “Star Wars” series, “The Mandalorian.” The VFX studio created an innovative virtual production workflow called ILM StageCraft, which allowed the filmmakers to generate complex and exotic digital backdrops in real time (using Epic’s Unreal Engine) while shooting the eight-episode bounty hunter series at Manhattan Beach Studios in L.A.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” said Favreau in a prepared statement. “We are proud of what was achieved and feel that the system we built was the most efficient way to bring ‘The Mandalorian’ to life.” This tech partnership also includes Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI, and ILM announced that it is making StageCraft available as an end-to-end virtual production solution to the industry.

More than 50% of “The Mandalorian’s” first season was filmed using StageCraft, largely eliminating the need for costly and time-consuming location shoots. Instead, actors in “The Mandalorian” performed inside a massive, immersive virtual set: a 20-foot-high, 270-degree semicircular LED video wall and ceiling enclosing a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.
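For a rough sense of scale, a back-of-the-envelope calculation based on those quoted dimensions puts the curved wall at roughly 177 feet long and about 3,500 square feet of LED surface:

    import math

    diameter_ft = 75     # quoted performance-space diameter
    arc_degrees = 270    # quoted extent of the semicircular wall
    height_ft = 20       # quoted wall height

    arc_length_ft = math.pi * diameter_ft * (arc_degrees / 360)  # ~176.7 ft of curved wall
    wall_area_sqft = arc_length_ft * height_ft                   # ~3,534 sq ft of LED surface
    print(f"wall length ≈ {arc_length_ft:.0f} ft, area ≈ {wall_area_sqft:.0f} sq ft")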

CG environments created by ILM were played back interactively on the LED walls and could be edited in real time during the shoot. Precise camera tracking allowed perspective-correct 3D imagery to be rendered at high resolution on systems powered by NVIDIA GPUs. The environments were lit and rendered from the perspective of the camera to provide real-time parallax, as though the camera were truly capturing the physical environment, with accurate interactive light falling on the actors and practical sets. This gave showrunner Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser (“Rogue One”) and Barry “Baz” Idoine (“Rogue One”), and the episodic directors (including Taika Waititi and Bryce Dallas Howard) the ability to work in a comfortable live-action setting.
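The perspective-correct rendering that produces this parallax is, in essence, an off-axis (asymmetric-frustum) projection computed from the tracked camera position relative to the screen. The article doesn’t detail ILM’s implementation, but a minimal sketch of the standard technique (Kooima’s generalized perspective projection), assuming the wall is approximated as a flat screen defined by three corner points, might look like this:

    import numpy as np

    def off_axis_projection(pa, pb, pc, pe, near, far):
        """Generalized perspective projection (Kooima, 2008).

        pa, pb, pc: lower-left, lower-right, upper-left screen corners (world space)
        pe: tracked camera/eye position (world space); near, far: clip distances
        """
        # Orthonormal basis of the screen plane
        vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right
        vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up
        vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (toward viewer)

        # Vectors from the eye to the screen corners
        va, vb, vc = pa - pe, pb - pe, pc - pe

        # Distance from the eye to the screen plane
        d = -np.dot(va, vn)

        # Asymmetric frustum extents at the near plane
        l = np.dot(vr, va) * near / d
        r = np.dot(vr, vb) * near / d
        b = np.dot(vu, va) * near / d
        t = np.dot(vu, vc) * near / d

        # Standard off-axis frustum matrix
        P = np.array([
            [2 * near / (r - l), 0, (r + l) / (r - l), 0],
            [0, 2 * near / (t - b), (t + b) / (t - b), 0],
            [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
            [0, 0, -1, 0],
        ])

        # Rotate the world into screen-aligned space, then move the eye to the origin
        M = np.eye(4)
        M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
        T = np.eye(4)
        T[:3, 3] = -pe

        return P @ M @ T

In practice, a game-engine renderer would recompute this matrix every frame from the live camera-tracking data, which is what keeps the imagery on the wall shifting with correct parallax as the physical camera moves.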

StageCraft’s genesis actually dates back to 2008, when George Lucas was planning the first live-action “Star Wars” series and realized he would need to employ a real-time game engine to help generate CG environments on set. At the time, ILM’s Bluff ran the digital-matte environments department and was brought in to help create a virtual production solution.

But this game changer didn’t come to fruition until Favreau came aboard “The Mandalorian” and began brainstorming a solution predicated on his familiarity with VR prep for location scouting, camera layout, and full previs. One of the biggest benefits came from relying on small location shoots in Death Valley and Chile, supplemented by ILM’s vast archive of plate photography from previous “Star Wars” movies, to source the digital environments.

“We searched out these exotic locations, where we then photographed them extensively and then rebuilt them all in the computer based on this imagery that we captured,” said Bluff. “And then we could transfer them to the LED screen and effectively shoot the actors as if they were there on the day.”

ILM worked with Epic Games CTO Kim Libreri (a former ILM VFX supervisor) to customize Unreal Engine to drive the VFX content in real time on the LED screens. But the biggest breakthrough of StageCraft was the ability to track the camera on the stage and update the CG environments with the proper perspective changes, so that the result was seamless, dynamic, and believable.

“For an interior, you would effectively see a camera pass by a doorway through the screen,” added Bluff. “So, in order for us to do the perspective correcting, which was a huge leap technologically, the pixels had to go on the screen via a real-time game engine.”

For Season 1, the environments were first created in VR during pre-production, then turned over to the ILM VFX team to be completed to full photo-realism while still ensuring they could run performantly in a game engine on set during production.

ILM also pulled out a few old-school techniques, such as model building, to relieve pressure on the artists generating these environments. “So whenever we could, we would build miniature models that would ultimately be turned into a real-time, CG environment,” Bluff said. This was accomplished by building miniature objects and shooting a full photography pass of thousands of images, which were then processed to generate a CG model. The photography was then wallpapered back onto the model to give it the proper texture.
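That “wallpapering” step is essentially camera projection: once a photogrammetry solve recovers each photo’s camera pose, any point on the reconstructed model can be projected back into the source photos to pick up its color. The article doesn’t describe ILM’s pipeline, but a minimal sketch of that projection step, assuming a simple pinhole camera model and hypothetical inputs, might look like this:

    import numpy as np

    def project_point(X, K, R, t):
        """Project a 3D world-space point X into pixel coordinates for one photo.

        K: 3x3 camera intrinsics; R, t: world-to-camera rotation and translation,
        as recovered by a photogrammetry solve over the full image set.
        """
        Xc = R @ X + t                  # world -> camera space
        if Xc[2] <= 0:
            return None                 # point is behind this camera
        uv = K @ (Xc / Xc[2])           # perspective divide + intrinsics
        return uv[:2]

    def sample_color(photo, uv):
        """Nearest-neighbour color lookup used to 'wallpaper' a photo back onto
        the reconstructed model at the projected pixel."""
        h, w = photo.shape[:2]
        u, v = int(round(uv[0])), int(round(uv[1]))
        if 0 <= u < w and 0 <= v < h:
            return photo[v, u]
        return None

A production pipeline would blend colors from many overlapping photos and bake the result into texture maps, but the underlying idea is the same projection shown here.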

The most challenging environment for ILM to generate was the iconic Mos Eisley Cantina on Tatooine. “The risk was very high using this technology,” said Bluff. “If we didn’t [do it right], then we would be doing ourselves and, of course, the audience and the original movie, a disservice. And so that was one that we paid particular attention to, rehearsing in VR exactly where the cameras were going to go and making sure that we blocked it in, while we were building the environment, in order to make the magic trick work.”

Watch a video about the innovative digital backdrops below:
