Watch this previously unseen ‘Mandalorian’ previs reel from The Third Floor

September 28, 2020

The studio also gives new insights into the virtual art department approach on the show.

The Third Floor is well known as one of the leading specialist previs studios working today. On The Mandalorian, the studio took its role to a new level by integrating its work closely with the production's Virtual Art Department, partly to help enable the heavy virtual production side of the show, where LED screens were used during shooting for in-camera virtual backgrounds.

Below, check out a new visualization reel from The Third Floor for The Mandalorian, and hear from some of the team as they discuss the previs and virtual blocking, including for elements such as the Blurrg rides, speeder chases and droid fights.

b&a: Can you talk about The Third Floor’s collaboration directly with Lucasfilm’s art department, and your role in feeding things to the virtual art department and ILM? How was this different than perhaps other projects you’ve worked on?

Chris Williams, visualization supervisor, The Third Floor: Collaborating with Lucasfilm's art department was incredible. Traditionally, when working on television and film projects, there can be substantial wait times for clearance and approvals on certain assets. Here, we were brought in and quickly given access to all the designs and references we would need.

The Virtual Art Department was heavily focused on the design of environments, so where The Third Floor played a vital role was with characters, props and creatures. When layout began on a sequence, we would immediately get started on creating assets for the hero creatures and characters. Oftentimes the assets we created would then be used by production later for scale reference and joint structure. The Third Floor would also model backlot sets and exterior location environments to spec, in hopes of achieving a true 1:1 virtual blocking layout.

b&a: There are so many legacy assets in the Star Wars universe—how did you approach building or sourcing character, vehicle, creature and other CG assets for previs?

Chris Williams: The Lucasfilm art department and Doug Chiang's design team are a well-oiled machine. They were so organized, and we also had access to all the knowledge and expertise of executive producer Dave Filoni at Lucasfilm. If you had the wildest question about a character's scale, size or even temperament, he would have George Lucas' answer ready to give. There wasn't a Star Wars question he didn't know the answer to.

Whenever we would start building a hero creature, character, prop or vehicle model, Doug was always part of the decision to begin. We would then receive multiple forms of reference and designs to achieve the best version for layout. Occasionally, they would run with the models as stand-ins and then find the best creature or character for the scene later, closer to the shoot. It's an absolute pleasure to work with their team.

For the 9-on-1 fight in Season 1, Episode 6, in which Mando fends off multiple droids at once, we used the previs assets with motion capture to visualize character choreography in real time, working with director Rick Famuyiwa and the stunts crew. Our team, including Casey Schatz, Johnson Thomasson and myself, set up a motion capture performer to play Mando and three additional stunt performers to represent the droids. We captured motion in multiple passes, with Mando repeating his motion and the other performers playing different droids for each pass.

The data from the Xsens suits was retargeted live onto characters in Unreal Engine, enabling the director to work hands-on with the stunt performers while getting real-time feedback of the virtual characters in the virtual environment. Afterwards, with the motion captured and stitched together into a Maya master scene, it was possible to lens the previs into a fully realized stunt beat. Within a week, we were able to get an entirely blocked action sequence, with fully rehearsed stunts on a spatially accurate set build.
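
To make the pass-stitching idea concrete, here is a minimal, self-contained Python sketch—not The Third Floor's actual pipeline—of how multiple capture passes might be merged onto one master timeline: the hero performer is kept from the first pass only, while each additional pass contributes a different set of droids. All names and data are hypothetical.

```python
# Illustrative sketch only: combining multiple mocap "passes" into one master
# scene, in the spirit of the 9-on-1 droid fight described above. Names and
# data structures are hypothetical, not The Third Floor's pipeline.
from dataclasses import dataclass

@dataclass
class Clip:
    performer: str   # who was captured ("hero", "stunt_a", ...)
    character: str   # which previs character the data drives
    frames: list     # per-frame root positions (x, y, z), simplified

def stitch_passes(passes):
    """Merge several capture passes into one master timeline.

    The hero repeats his motion every pass, so he is kept from the first
    pass only; each additional pass contributes different droids.
    """
    master = {}
    for i, capture_pass in enumerate(passes):
        for clip in capture_pass:
            if clip.character == "mando" and i > 0:
                continue  # hero already placed from pass 0
            if clip.character in master:
                raise ValueError(f"duplicate character: {clip.character}")
            master[clip.character] = clip
    return master

# Example: two passes, Mando plus three droids per pass -> seven characters total.
pass_0 = [Clip("hero", "mando", [(0, 0, 0)]),
          Clip("stunt_a", "droid_1", [(2, 0, 0)]),
          Clip("stunt_b", "droid_2", [(0, 0, 2)]),
          Clip("stunt_c", "droid_3", [(-2, 0, 0)])]
pass_1 = [Clip("hero", "mando", [(0, 0, 0)]),
          Clip("stunt_a", "droid_4", [(2, 0, 2)]),
          Clip("stunt_b", "droid_5", [(-2, 0, 2)]),
          Clip("stunt_c", "droid_6", [(0, 0, -2)])]
master_scene = stitch_passes([pass_0, pass_1])
print(sorted(master_scene))  # ['droid_1', ..., 'droid_6', 'mando']
```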

b&a: Did TTF’s previs move into techvis specifically for working out LED wall shooting scenarios? Can you describe how that worked?

Chris Williams: The virtual blocking we did through previs was already a kind of real-time techvis pass for shooting on the LED wall. We would follow the geographic dimensions of the LED footprint and create the visualized scenes within this, so when we lensed the previs with the DP or a director, it was one-to-one with the LED filming setup on the day of the shoot. When they went to shoot the scene, it was like they had already been on set.

The Third Floor did specific techvis for shots that had specialized equipment and camera movement—including motion base rigs, cranes, dollies, jib arms, etc. The Third Floor’s head of virtual production, Casey Schatz, and team worked closely with Richard Bluff, VFX supervisor, to determine how these camera moves could be achieved “on the day.” The techvis was based on the virtual blocking we had done in previs and was an important piece of the puzzle in filming some of Season 1’s dynamic action scenes.
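
As a rough illustration of the footprint constraint Williams describes, the following self-contained Python sketch flags any blocked position that falls outside an assumed LED stage footprint. The dimensions and positions are invented for the example and are not the actual stage specs.

```python
# Hypothetical sketch of the "1:1" idea above: previs blocking is laid out
# inside the same footprint as the LED volume, so anything that drifts outside
# the physical stage can be flagged long before the shoot day.
STAGE_RADIUS_M = 9.0  # assumed circular LED-volume footprint, not a real spec

def inside_footprint(x, y):
    """Rough containment test for a point on the stage floor (origin at centre)."""
    return (x * x + y * y) ** 0.5 <= STAGE_RADIUS_M

# Invented blocking positions, in metres from stage centre.
blocking = {"mando": (1.5, 2.0), "blurrg": (4.0, 3.5), "camera_dolly": (8.5, 5.9)}
for name, (x, y) in blocking.items():
    status = "ok" if inside_footprint(x, y) else "OUTSIDE stage footprint"
    print(f"{name:12s} {status}")
```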

b&a: For other kinds of techvis, say shooting stormtroopers on speeders, how would you approach the previs/techvis here?

Chris Williams: Casey Schatz and Kevin Tan from The Third Floor supported a number of virtual production shoots that used motion bases, programmed to create a real-world look as characters rode speeder bikes or Blurrgs.

The Blurrg training montage in Episode 1, in which Mando learns to ride the creatures as if on horseback, is one example of this. The sequence needed to be tech'ed extensively in order to control the animation of the motion base and to plan for shooting on ILM's StageCraft virtual production filmmaking platform.

Casey Schatz, head of virtual production, The Third Floor: Working from the blocking that Chris and his artists did in previs and that had already taken the configurations of the shooting space into account, we “tech’ed” the motion base shots to the specs of the intended film camera and rigs. We could take pre-animation, for example a Blurrg walk cycle from the visual effects team, and run a technical “solve” to apply and play back that movement to the physical motion base.

This allowed us to have repeatable, believable buck rig motion to record the actors' performances as if they were interacting with the CG Blurrgs. The virtual programming for the motion base was particularly useful in achieving dynamic interaction between performer and rig, such as during Mando's bucking bronco-esque first attempted Blurrg ride.
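
The "technical solve" Schatz describes can be pictured, very loosely, as sampling the animated creature's root motion frame by frame and clamping it to the rig's physical limits. The Python sketch below does exactly that with an invented bucking cycle and made-up actuator limits; it is an illustration of the idea, not the production tool.

```python
# A loose illustration of the motion base "solve": take per-frame motion from a
# pre-animated creature (here a fake, procedurally generated buck cycle) and
# convert it into drive values for a motion base, clamped to the rig's limits.
# Limits and units are invented for the example.
import math

PITCH_LIMIT_DEG = 15.0   # assumed actuator limits, not a real rig spec
ROLL_LIMIT_DEG = 12.0
HEAVE_LIMIT_M = 0.30

def clamp(value, limit):
    return max(-limit, min(limit, value))

def solve_motion_base(root_samples):
    """Map per-frame (pitch, roll, heave) of the animated root onto the rig."""
    return [
        {
            "frame": frame,
            "pitch_deg": clamp(pitch, PITCH_LIMIT_DEG),
            "roll_deg": clamp(roll, ROLL_LIMIT_DEG),
            "heave_m": clamp(heave, HEAVE_LIMIT_M),
        }
        for frame, (pitch, roll, heave) in enumerate(root_samples)
    ]

# Fake "bucking" cycle standing in for a creature animation export.
cycle = [(25.0 * math.sin(f / 4.0), 8.0 * math.cos(f / 6.0), 0.4 * math.sin(f / 5.0))
         for f in range(48)]
for sample in solve_motion_base(cycle)[:3]:
    print(sample)
```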

Kevin Tan, postvis lead, The Third Floor: The techvis for shooting characters on speeder bikes focused on the relationship between the camera and the subject. By isolating specific axes of motion and sharing them between the camera and the speeder, we were able to provide clear and cohesive shooting solutions easily replicated by both the camera and SFX departments with minimal equipment. The speeder bike sequence in Episode 5 used this approach. Using only a small amount of track for the vehicle, we "tech'ed" this high-speed sequence to fit inside the ILM StageCraft virtual platform, leveraging the real-time lighting of the animated backgrounds.
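
One way to picture the axis-sharing Tan describes—purely as an illustration, not the actual solve—is to split the speeder's story-space forward travel between the short physical track and the moving background on the LED wall, as in this self-contained Python sketch with invented numbers.

```python
# Sketch of the axis-sharing idea: on screen the speeder covers a long, fast
# run, but the physical vehicle only has a short length of track, so the
# remaining travel is carried by the animated background (and camera) moving
# past it. All numbers are illustrative only.
TRACK_LENGTH_M = 4.0  # assumed physical track available on stage

def share_forward_travel(world_positions):
    """Split the speeder's story-space forward travel per frame.

    The speeder buck takes whatever fits on its track; the remainder is handed
    to the background plate moving past it instead.
    """
    plan = []
    for world_x in world_positions:
        buck_x = min(world_x, TRACK_LENGTH_M)   # physical speeder on the track
        background_x = world_x - buck_x         # travel given to the LED wall
        plan.append({"world_x": world_x, "buck_x": buck_x, "background_x": background_x})
    return plan

# The speeder covers 60 m in story space, but only 4 m of that is physical.
world_path = [i * 2.5 for i in range(25)]  # metres of story-space travel per frame
for frame in share_forward_travel(world_path)[::8]:
    print(frame)
```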

