THE MANDALORIAN
3D World UK|September 2020
ILM DISCUSS A BOUNTY OF VISUAL EFFECTS
3D World UK sits down for a conversation with Richard Bluff, visual effects supervisor, and Hal Hickel, animation supervisor, both at ILM in San Francisco, to unpack the creative adventure of telling the story of a galactic bounty hunter.

Werner Herzog, world-renowned filmmaker and occasional actor, takes a key role in the new Star Wars TV series The Mandalorian. Speaking in 2019 about the series, Herzog made the key point that it deploys what he described as mythic images, and his description speaks to the visually compelling quality of the series and the rich tradition of Star Wars. It’s a pop-culture phenomenon that has captivated the imagination in two ways: the story unfolding on screen, and also the story of the creative impulses, choices and challenges involved in putting those stories in motion.

Richard Bluff begins our conversation by identifying the project’s landmark approach to environment creation: “I think the biggest challenge was wrapping our head around how we wanted to utilise real-time game technology in collaboration with the LEDs, effectively prototyping out what that technology would look like, and then of course executing a production-ready tool for the first day of shooting. That was by far and away one of the greatest ever challenges that I’ve faced in the visual effects industry.”

Fascinatingly, ILM has a connection to game engine use that dates back to their work on Steven Spielberg’s dazzling science-fiction fairy tale, A.I. Artificial Intelligence, where it was used for virtual production approaches during filming of the Rouge City sequence.

Bluff sketches out the longstanding relationship between ILM and its use of LED: “There had been an awful lot of work done prior to The Mandalorian utilising LED screens at ILM and Lucasfilm: they’d been used on Rogue One for example, and the game engine technology, particularly Unreal Engine for season one, had been used extensively by ILMxLAB, our immersive development department at ILM, on various augmented reality and VR projects. So, there were various pieces of all of the pipeline that had been utilised in visual effects, or with Jon Favreau [series creator] and his past projects including The Lion King and The Jungle Book. I think the biggest challenge was pulling all of that together, but more than that it was the goal that we set ourselves of shooting half of the season in the LED volume, and within that amount of work making sure more than 50 per cent of every take would constitute an in-camera final. And, as a result, that would have meant us building over 110 real-time environments that in theory had to be photoreal and played in-camera. There was nothing that existed prior to The Mandalorian that had attempted anything near what we tried to do. Up until now it was isolated to one or two shots or scenes with content that was intended as previz only for dynamic lighting, whereas we were attempting in-camera set extensions – effectively taking the post-production aspect and putting it in prep.”

Bluff goes on to offer context for ILM’s work in applying the game engine technology to their production and visual effects collaboration: “ILM has a rich history of projects, plus supervisors and artists working in new media. Prior to Kim Libreri joining Epic he was a visual effects supervisor at ILM and had been the lead supervisor behind an ambitious project that took a video game environment (one that was never used) and tried to imagine how we could utilise that world and those characters in a real-time game engine to generate content. So, he’d already been pursuing game engines for television or theatrical content for a long time.

“The same goes for various artists – take Landis Fields, he and I had worked on the Millennium Falcon Smuggler’s Run ride [for Disneyland] a year or two prior to The Mandalorian starting, and the list goes on in terms of the projects that various folks at ILM had been involved in, so I think people saw that this was a possibility on the horizon. There were few people that saw the link with the LEDs. We were also pursuing game engines as a back-end post-production renderer to see what leverage we could get out of there. But this was certainly the project that pulled all of those ideas together and it goes without saying that without Jon Favreau, his vision and his thorough understanding of the technology, we wouldn’t have done what we did. Various pieces of this tech have existed for a long time, but there’d never been a filmmaker that ILM and Lucasfilm had come across since George who was so willing to put the entire project on a theory behind what we could do with technology. And for Jon it was his ‘Holy Grail’ shooting methodology that would allow him to reduce the stage footprint, maintain the project in LA as well as increase the speed of production, while advancing the technology and giving his actors, directors and DP something to shoot against aside from just green screens.”
