Here’s how Quitasueño Studio crafted the scene with motion capture and real-time tools.
Caribbean-based outfit Quitasueño Studio recently opened a new motion capture volume, adopting tools such as Reallusion’s iClone and Character Creator, as well as solutions from Unreal Engine, Vicon, Faceware and Rokoko, to offer capture services.
To demo its new wares, Quitasueño decided to make a fun short video replicating the famous ‘I am your father’ scene from The Empire Strikes Back.
The re-creation of this Star Wars moment would take advantage of motion capture (body, facial and finger tracking), mocap clean-up, real-time asset development, royalty-free assets, virtual cinematography techniques and final photorealistic rendering of digital humans.
Here, Quitasueño’s studio director Luis Cepeda shares with befores & afters how the demo was made, including his thoughts on building the pipeline around user-friendly real-time tools so the production could be achieved as efficiently as possible.
b&a: Tell me about the idea behind re-creating the ‘I am your father’ sequence—what was behind that choice and what did you make the piece for?
Luis Cepeda: Since we are a new mocap studio, we needed to show the possibilities of our facilities and the interoperability of capturing body, face and hands at the same time, as well as both fast and subtle movements including facial expressions. That scene covered all those needs.
b&a: What were the steps you took to capture the action for this sequence, in terms of staging on the mocap volume? What equipment and capture workflow did you use?
Luis Cepeda: Two professional actors were selected for rehearsals in which they had to replicate the exact performances of Luke and Vader. After that, elements such as ladders and platforms were built in the studio, placed in the same positions as in the scene, for the actors to interact with. Body, face and hands were captured simultaneously with Vicon, Faceware and Rokoko gloves.
b&a: How did you go about building assets for this sequence?
Luis Cepeda: Aside from the aforementioned set items, the big challenge was the sabers. We had to have several because we knew they were going to break during the fight, but the installed Vicon capture system is so precise that even placing the markers slightly differently from one saber to the next was enough for the system to detect it as a different prop. So we had to go through the process of adding and calibrating a new prop to the scene every time we changed sabers.
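Why does an optical system treat a near-identical saber as a new prop? Rigid-body tracking typically identifies a prop by the relative geometry of its markers. As a rough illustration (not Vicon’s actual algorithm — the marker layouts, precision value and function below are hypothetical), a prop’s inter-marker distance “signature” changes as soon as markers are glued on a few millimetres apart:

```python
import itertools
import math

def marker_signature(markers, precision_mm=1.0):
    """Pairwise inter-marker distances, rounded to an assumed system
    precision and sorted so the signature is order-independent."""
    dists = []
    for a, b in itertools.combinations(markers, 2):
        d = math.dist(a, b)
        dists.append(round(d / precision_mm) * precision_mm)
    return tuple(sorted(dists))

# Hypothetical marker layouts (mm) for two "identical" sabers whose
# markers were placed a few millimetres apart.
saber_a = [(0, 0, 0), (120, 0, 0), (60, 40, 0), (60, 0, 300)]
saber_b = [(0, 0, 0), (123, 0, 0), (58, 41, 0), (60, 0, 297)]

# The signatures differ, so a sufficiently precise optical system
# reads saber_b as a brand-new prop that must be calibrated again.
print(marker_signature(saber_a) == marker_signature(saber_b))  # False
```

With millimetre-level precision, the two layouts never match, which is consistent with the studio’s experience of re-calibrating every replacement saber.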
b&a: Can you talk about the steps from capture through to using Reallusion tools such as iClone and CC, and other tools? What were some of the benefits, but also challenges, you faced in terms of motion editing and mocap data cleaning?
Luis Cepeda: From the beginning, this project was conceived to be carried out and finished within the Reallusion universe, since I have been a user since the first version of iClone and knew how far I could take it. We did the motion capture in Vicon and brought it into iClone as FBX, then cleaned and edited the mocap data directly within the software, while the facial capture from Faceware was brought in directly with the Faceware Profile. We then significantly improved the facial expressions with iClone’s facial expression tools before proceeding to the final render.
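Mocap clean-up largely means taming jitter and one-frame spikes in the captured joint curves. iClone’s actual curve-editing tools aren’t shown here; as a minimal sketch of the idea (a centered moving average — real tools use more sophisticated filters, and the sample curve is invented):

```python
def smooth_curve(values, window=5):
    """Centered moving average over a joint-angle curve; a simple
    stand-in for the jitter reduction done during mocap clean-up."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# A hypothetical elbow-rotation curve (degrees) with a one-frame spike.
noisy = [10.0, 10.5, 11.0, 25.0, 11.5, 12.0, 12.5]
smoothed = smooth_curve(noisy, window=3)
print(smoothed)  # the frame-3 spike is damped toward its neighbours
```

An animator would apply this kind of pass selectively, since over-smoothing also flattens the fast, deliberate movements a fight scene depends on.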
b&a: How were the close-ups of Luke’s face achieved?
Luis Cepeda: We did this by combining the different facial animation tools included in iClone. Interestingly, the shots of Luke’s close-ups are the only ones rendered with the Iray plugin; everything else came straight from the iClone rendering engine, which is extremely fast and delivers incredible results, like those seen in the demo.
The reflections on Vader’s helmet, the dynamics of his cloak’s fabric, the glow of the lightsabers, the smoke from the explosions–all of it was done inside iClone. The only FX not done in iClone were the sparks; we wanted them to match the original scene exactly, so we used several layers of tricks before compositing them into the final shot.
b&a: After this experience, and also from any other Quitasueño projects, what do you think is exciting right now about real-time capture and the results you can achieve with tools like iClone and others?
Luis Cepeda: Right now the studio staff is very excited, as our workflow will be greatly improved by a new plugin that sends Vicon data in real time directly to iClone, allowing us to visualize the characters as we capture them.
The real-time preview is the most powerful tool that every director, producer or creator should have on hand to greatly facilitate their work and allow creativity to prevail over outdated technological limitations.