Step inside the innovative technology developed by Jon Favreau and ILM for the Disney+ series, which is changing the face of in-camera visual effects and the future of filmmaking.
When The Mandalorian opens on a frigid ice planet, the titular bounty hunter zeroing in on his next prize, you can almost feel the gusts of wind howling across the unforgiving terrain. The effect is visceral: a seamless marriage of tried-and-true practical sets and costuming with an innovative new technique that brings visual effects to the forefront of the production process.
Behind the scenes on the set of The Mandalorian, the storytellers and visual effects engineers creating the first live-action Star Wars television series collaborated to crack the code for what has become a game-changing creation: StageCraft, a technological marvel that immerses the cast and production crew inside their CG environments in real time with the help of a massive wraparound LED screen.
“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of real-time, in-camera rendering,” showrunner Jon Favreau has said.
“Jon Favreau found the breakthrough that George [Lucas] was always looking for when he first explored the idea of a live-action TV show,” adds Richard Bluff, the visual effects supervisor on the acclaimed series. With StageCraft, the innovators at Industrial Light & Magic, Lucasfilm, and their collaborative partners on the project achieve the planet-hopping magic of Star Wars storytelling that transports viewers to the galaxy far, far away, setting The Mandalorian in a largely computer-generated, photo-real environment that wraps around physical sets and real actors to create a seamless effect.
It’s staggering to watch the final effect unfold, as a team of artists and engineers known as the Brain Bar acts as mission control just a few yards away from the Volume, a curved cocoon of glowing LED screens ready to transport those standing inside it practically anywhere. “It’s exactly the same sort of technology as the large LED screens you see in Times Square,” Bluff says. “What we wanted to do was shoot on a small stage with small physical sets that could be wheeled in and out fairly quickly and then extend those physical sets on the wrap-around LED wall.” And by moving visual effects to the beginning of the filming process, StageCraft enriches the actors’ performances and gives directors and cinematographers a new methodology for more precise storytelling in a completely fabricated galaxy.
It was just about six months before filming began that showrunner Jon Favreau, executive producer Dave Filoni, and DP Greig Fraser joined forces with ILM, Epic Games (maker of the Unreal Engine), and production technology partners Golem Creations, Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI to unlock this innovative achievement. While ILM had pioneered virtual production tools and had worked successfully with LED technology on previous Star Wars films, StageCraft was still very much in its infancy at the time, a virtual reality platform that helped storytellers scout fabricated environments to set up their shots. Propelled by the support of Lucasfilm president Kathleen Kennedy and the sheer will of Favreau, who was always pushing the collaborative team to try new things and bringing together some of the brilliant minds capable of making it happen, the crew took their next steps into the larger world by crafting the prototype of the Volume.
Fresh off projects like The Jungle Book and The Lion King, Favreau was passionate about employing new technology to enrich storytelling as he began his work on The Mandalorian. But the scheduling constraints of a television show -- and a Star Wars show that had to satisfy the planet-hopping scope fans have come to expect while still feeling entirely authentic and accessible, no less -- meant that whatever the team came up with had to look realistic and be shootable on a Los Angeles soundstage, without the traditional challenges of location shooting. “One of the things we wanted to do is move away from green screens and make the scale of a Star Wars TV show work,” Bluff says. “And we knew that we needed a technological innovation to push the boundaries and provide a solve for the production. Through the collaboration with Jon Favreau, Greig Fraser, ILM, Epic Games, and others we landed on the idea of utilizing video wall technology.”
"The moment everybody realized this was going to work.”
Bluff, like many fans, discovered Star Wars as a child enthralled by the galaxy far, far away and the stories that unfolded there. “I fell in love with Star Wars as a kid. I would watch the movies over and over again and I would lie in front of the TV drawing the characters. I desperately wanted to get into any sort of creative environment, particularly the film world, through drawing.” When Jurassic Park broke new ground with realistic computer graphics animation, Bluff saw his opening. “I understood that the doors were about to be blown off the industry and it was going to be a race to learn this technology because it was in its infancy. So I took that opportunity with both hands!”
“It was that pioneering spirit from the early days of ILM and the birth of computer graphics and the birth of the Pixar computers that really got me into it,” adds Kim Libreri, the chief technical officer of Epic Games, who collaborated closely with Bluff and the crew. “Honestly, ILM and George’s legacy had a big influence on me choosing my career path.” One night when he was studying in college in the UK, Libreri had a dream about a realistic video game that involved flying a snowspeeder over Hoth during the battle with the AT-ATs from Star Wars: The Empire Strikes Back. “It was totally real. I was in the movie, but it was a game. I thought, one day it will be possible that a video game will look as good as a movie. And that has driven me my entire career.”
Bluff went on to land a job with ILM as part of the team of matte artists on Star Wars: Revenge of the Sith; Libreri also went on to work as a visual effects supervisor with ILM before moving on. And they’re both thrilled to be working on new Star Wars stories. “For me coming back to Star Wars in the role that I’m in advancing technology, I hope it inspires kids that were my age when I [discovered Star Wars],” Bluff says.
Nearly two decades after his first role working behind the scenes building the galaxy, Bluff recalls those early StageCraft tests in the spring of 2018, when the crew working on The Mandalorian had mere weeks to come up with the wraparound screen prototype. “We got the go-ahead to start actively developing the tech sometime in the end of March 2018. The idea was that on the 10th of June the same year we would do a test on the stage with a much smaller LED Volume wraparound screen with our picture camera, a prototype of a costume, and many different LED environments to test the technology.”
“The theory I had was that the textures and the lighting from the content had to be unmistakably real,” Bluff says. Some attempts using procedural, computer-built landscapes fell flat. “There were a lot of test environments that actually didn’t work, which was a beneficial part of the process. More often than not they were environments that were built in a more make-everything-from-scratch approach.”
When they put high-definition images of real deserts and snowy landscapes, evoking Tatooine and Hoth, on the Volume, the effect felt tangible, yet somehow distant. With the action happening so far away from the landscape background, it behaved more like a traditional matte background. “When the camera’s moving 10 feet, the mountain’s not going to move. So that really only proved one thing, which is if we can put photo-real imagery on the screen, we will believe that the character is in that environment in a certain lighting situation.”
On the second-to-last day of testing, the ILM team had concocted a 3D interior environment using thousands of overlapping photographs of a real location on Angel Island in San Francisco Bay. Parallax had to be built in, so that as the camera moved, the background shifted on screen to match the new perspective. Finally, an actor dressed in the Mandalorian’s costume was placed in the center of the curved Volume. “That was our ‘Eureka!’ moment,” Bluff says. “It wasn’t a wide-open space, it was a much smaller environment and everyone on set believed it was real.
“The Angel Island set was actually a collection of rooms with doorways in between, and we had the camera dollying back, and on screen it gave the impression that we were moving backwards, traveling from room to room,” Bluff says. “We had the Mandalorian in the center of frame walking and he was completely integrated because the lighting from the CG environment moved with him as the camera tracked backwards. Nobody could tell that he wasn’t in that room. And that was the moment everybody realized this was going to work.”
“Everybody was scared there would be flickering and all sorts of technical problems,” Libreri adds. “And people were literally gobsmacked at how good it looked. It was just mind-blowing. Everybody sat on the stage going, ‘Oh my god, I think this is the future. I think we’ve all together, as a team, worked out what is going to be the template for making visual effects in-camera.’ The birth of a new way of working.”
"Everybody was standing on a desert"
Months later, the Los Angeles set of The Mandalorian was a bustling hive of activity. At the epicenter stood the Volume, a curved, 20-foot-high, 270-degree LED video wall made up of 1,326 individual LED screens and topped with an LED ceiling. Inside the 75-foot-diameter performance space, a variety of physical props and partial sets, created by production designer Andrew Jones and his team, could be swapped out to match the screens and quickly transform the innovative arena into a variety of planets and interior locations.
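For a sense of the scale involved, those published figures are enough for a rough back-of-the-envelope check (ours, not the production’s). The short sketch below assumes the 270-degree wall traces a circular arc at the full 75-foot diameter, and works out how much glowing real estate each of those 1,326 panels accounts for:

```python
import math

# Quoted specs: a 270-degree arc of a 75-foot-diameter circle, 20 feet
# high, built from 1,326 individual LED panels.
DIAMETER_FT = 75
HEIGHT_FT = 20
ARC_DEGREES = 270
PANEL_COUNT = 1326

# Length of the curved wall: that fraction of a full circumference.
arc_length_ft = math.pi * DIAMETER_FT * (ARC_DEGREES / 360)

# Total display area, and the share covered by a single panel.
wall_area_sqft = arc_length_ft * HEIGHT_FT
panel_area_sqft = wall_area_sqft / PANEL_COUNT

print(f"wall arc length: {arc_length_ft:.1f} ft")      # ~176.7 ft
print(f"wall area:       {wall_area_sqft:.0f} sq ft")  # ~3,534 sq ft
print(f"area per panel:  {panel_area_sqft:.1f} sq ft") # ~2.7 sq ft
```

That comes out to tiles roughly a foot and a half on a side, consistent with the modular, Times Square-style LED screens Bluff describes.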
StageCraft’s genius lies in the way it unites the collaborative creative forces on the first Star Wars live-action series and allows them to work together in real time. The green and blue screens that were once standard for shooting real actors and props before digitally swapping out the background for computer-generated environments would have been a nightmare for the post-production effects team with a central character like the Mandalorian, whose gleaming beskar armor would have reflected back the very hues intended to be cut from the final frame. In that way, StageCraft is a boon for cast, crew, and fans alike. “Wrap-around green screens cause confusion for both actors and crew, limiting spontaneity and on-the-fly creativity,” Bluff says.
But beyond the time and cost savings of eliminating the step of meticulously replacing the green screens, StageCraft allows for better lighting on set, with environments reflected back in Mando’s armor without laborious post-production detailing work. And StageCraft’s magic effectively allows creators to produce high-resolution visual effects in real time thanks to the game engine technology. “Everything we are doing we would have always done in post-production and it would have always looked exactly the same,” Bluff says. “That was the goal. We never wanted to compromise the quality of the show. In fact, to the contrary we wanted to improve on the quality because our challenge was a main character with a fully reflective costume.
“This approach was a game-changer for us, not only by eliminating the green screens but by providing everyone on set with a VFX set extension that typically isn’t created until weeks after the shoot wraps. Instead it was there on the day, people could interact with it and immerse themselves -- the actors, the camera operators, the director of photography. Everybody was standing on a desert or in a forest or in some hangar somewhere. There was no need for the questions of: where am I going? What is this location? How deep does this go back? How tall is the ceiling?
“With the addition of camera tracking technology, wherever the physical camera was moving as it was photographing the actors, we were able to offset the background in real time as if the camera was looking through a magic mirror into a [computer-generated] world that we built,” Bluff says.
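In principle, that magic mirror is a perspective re-projection: every frame, each virtual point is drawn on the wall wherever the tracked camera’s line of sight to it crosses the screen. The sketch below is a deliberately simplified illustration of that idea, with a flat wall and made-up coordinates rather than ILM’s actual curved Volume or renderer:

```python
import numpy as np

def draw_position_on_wall(camera_pos, scene_point, wall_z=0.0):
    """Return the (x, y) spot on a flat LED wall (the plane z = wall_z)
    where a virtual point must be drawn so that, seen from the tracked
    camera position, it sits along the correct line of sight."""
    cam = np.asarray(camera_pos, dtype=float)
    pt = np.asarray(scene_point, dtype=float)
    ray = pt - cam                      # line of sight, camera -> point
    if abs(ray[2]) < 1e-9:
        raise ValueError("line of sight never crosses the wall")
    t = (wall_z - cam[2]) / ray[2]      # parameter where the ray meets the wall
    hit = cam + t * ray
    return round(hit[0], 2), round(hit[1], 2)

# A distant virtual mountain and a nearby virtual crate, both "behind"
# the wall; the tracked camera dollies 3 feet to the right between frames.
mountain = (0.0, 30.0, -500.0)
crate = (1.0, 1.0, -4.0)
for cam in [(0.0, 5.0, 15.0), (3.0, 5.0, 15.0)]:
    print(cam, "mountain ->", draw_position_on_wall(cam, mountain),
          "crate ->", draw_position_on_wall(cam, crate))
```

Run the two camera positions and the distant mountain barely changes direction through the lens while the nearby crate swings across frame: exactly the parallax cue the early static-backdrop tests lacked, restored by re-projecting from the tracked camera every frame.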
“It’s like we’ve put them inside a video game,” Libreri adds, although the photo-realistic images go beyond the scope of current gaming technology. “The fact that we’re able to go to these locations, I think they got better content, better performances. It’s like going back to the days when George [Lucas] was out there in Tunisia,” or more recent Star Wars films that took the crew of Star Wars: The Rise of Skywalker to the sands of Jordan. “You get that feeling back,” Libreri says. “It feels like classic Star Wars…and it’s going to have resonance for many years.”
"Schooled by George"
Such groundbreaking innovations are often the culmination of years of small, hard-won victories that propel filmmaking technology ahead inch by inch. Favreau and other filmmakers had already been experimenting with the virtual production technology that made films like The Jungle Book and, before it, Gravity possible when Lucasfilm first used LED screens for lighting on Rogue One: A Star Wars Story. At the time, the quality of the images meant most of the screens still had to be replaced by higher-resolution effects in post-production on the Star Wars standalone film. “We built on what was done for Rogue One; we both utilized a wraparound LED screen, but at the time they were limited to playing back pre-rendered environments from a single camera perspective,” Bluff says. “Their intention was to focus on capturing the interactive lighting from the screens knowing that the content playing back would be replaced in post-production for a high-fidelity version which would maintain the exact same lighting.”
Jon Favreau introduced his own groundbreaking VR advancements developed on both The Jungle Book and The Lion King, but wanted to take the effect to the next level. “It takes an enormous amount of courage and vision to believe that we could pull this off,” Bluff says. “And there were many times when the folks involved in building out the technology wondered whether or not it was going to be feasible,” he admits. But Favreau never faltered. “Jon helped us pull through and prove the technology out.”
A longtime caretaker of Star Wars storytelling, Filoni worked at the right hand of Lucas himself when he first signed on to helm the animated series Star Wars: The Clone Wars. “Dave [Filoni] brings a connection to George’s original vision,” Bluff says. “Dave is incredibly creative and is a wonderful storyteller in his own right, but the fact that he was schooled by George himself keeps everybody honest. You feel like you’ve got an open door back to what George was trying to do in the original movies. He guides us. He inspires us. He makes us laugh every day on set. He’s a wonderful, wonderful human. And everybody loves working with him.”
Libreri first got involved in discussions when Filoni was exploring using Unreal Engine and similar platforms for animated storytelling. “We talked about the idea that a TV show like Clone Wars and some episodic animated content could be done in engine,” Libreri says. Tracking physical cameras in a virtual environment was something that Favreau had helped advance with his work on The Jungle Book and, more recently, The Lion King. “As you start to look at all these different projects, Jon was always in the conversation and always pushing for what this next advance was going to be.” For years, “nobody quite had the answer,” Bluff says. “It took everybody coming together and talking about their experiences and what Jon wanted to do, whether it was Greig Fraser the DP, whether it was Kim Libreri at Epic, myself from ILM, or of course Jon and Dave the filmmakers themselves. And it was through those conversations that we came up with this idea of how we wanted to try to shoot The Mandalorian.”
For Bluff, StageCraft is the crowning glory of two decades in the industry. “It’s been the most collaborative experience I’ve ever had in 20 years working in this industry,” he says.
The Brain Bar
Day to day, the real-time rendering and manipulation of the environments being beamed on screen comes down to an elite crew of visual effects artists and ILM engineers known as the “Brain Bar.”
“They are basically air traffic control,” Bluff says, although huddled together just feet from the Volume, they resemble NASA’s stoic mission control room. “They are the ones that are operating the massive screens. They are bringing up all the different environments that you would see, that you would shoot against. They’re able to move mountains quite literally. They can rotate the world. They can move us from one end of the hangar to the other end of the hangar. They can add extra lighting into the scene that of course would appear to have an effect on the actors on the stage so they do many, many, many things to continue to make the camera believe the magic trick.”
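Conceptually, “rotating the world” or sliding the set from one end of a hangar to the other amounts to applying a rigid transform to every virtual point before it is re-projected onto the wall. The toy sketch below, with hypothetical coordinates and function names rather than the Brain Bar’s real tools, shows how little that operation involves:

```python
import numpy as np

def yaw_matrix(degrees):
    """Rotation about the vertical axis (y-up coordinates)."""
    r = np.radians(degrees)
    c, s = np.cos(r), np.sin(r)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def reposition_world(points, slide, yaw_degrees):
    """Rigidly transform every virtual point so the physical stage
    effectively stands somewhere else in the CG set: rotate the world
    about the stage center, then slide it past the performers."""
    pts = np.asarray(points, dtype=float)
    return pts @ yaw_matrix(yaw_degrees).T + np.asarray(slide, dtype=float)

# A marker at the far end of a virtual hangar, 200 feet from the stage.
far_wall = [(0.0, 10.0, -200.0)]

# Slide the whole world 180 feet toward the stage: the actors now appear
# to be at the other end of the hangar without taking a step.
print(reposition_world(far_wall, slide=(0.0, 0.0, 180.0), yaw_degrees=0.0))
# -> [[  0.  10. -20.]]
```

Because the wall is re-rendered from the tracked camera each frame, a transform like this reads on camera as the set itself moving around the performers rather than as an image sliding across a screen.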
With the crew hard at work on The Mandalorian Season Two, Bluff says the lessons from the first eight chapters of the groundbreaking series have left the team with new ideas for even better integration of the technology moving forward. “We learned an enormous amount from Season One,” Bluff says. “Chapter 6 was one of the last episodes that we shot. The Roost hangar was probably the most complicated environment that we did because it was so vast. And in that environment we not only had the Razor Crest in the virtual environment, we also had people walking around the virtual environments. We had sparks dropping, we had steam rising, and cranes moving. So you could tell that by the time we got towards the end of the show, we got quite comfortable with the technology and were excited to just continue to push the boundaries of what was possible.” And although he can’t go into details just yet, “I can say for Season Two we went above and beyond anything we did for Season One with this technology,” Bluff hints. “It has even left the crew open-mouthed with some of the magic tricks that we were able to perform with the video wall technology.”
None of it would have been possible without the vision of George Lucas, and the inspiring achievements of Industrial Light & Magic and the other visual effects wizards who have strived for ever more realistic and authentic effects since the days of stop-motion and matte painting.
“We’re just very proud to be part of The Mandalorian and we’re very, very grateful to Kathy [Kennedy], Lucasfilm, and Jon Favreau,” Libreri says. “He pushed us all. He pushed everybody to do something that’s never been done before. And when we look back in 20 years, this is a landmark. It’s as much of a landmark as The Adventures of André and Wally B. was when the Pixar group made its first animated short.”
“I don’t believe that the technology would have worked as well as it did without the special place that Star Wars has in everybody’s heart,” Bluff adds. “You can see it in the work that they did and the love that they had for everything that we did. Everybody brought something to this project that you rarely find anywhere else and it all goes back to George Lucas and the wonderful movies he made decades ago.”