How the filmmaker has adopted Reallusion’s Character Creator, iClone and Unreal Engine.
In 2016, I met director Ishan Shukla at SIGGRAPH Asia in Macau, where I interviewed him about his CG short Schirkoa. The film, a clever dystopian take on a world where people must wear paper bags on their heads, would go on to win several awards, with many recognizing the short’s impressive aesthetic, which combined 3D imagery with a distinctive post-stylization treatment.
Since Schirkoa’s release, Shukla has been concentrating on making a feature film version of the short. This project, of course, requires significantly more planning, and Shukla also wanted more control and the ability to iterate on shots. So he revised the more traditional CG workflow used on the 2016 film, adopting real-time tools such as Reallusion’s Character Creator, iClone and Unreal Engine.
Recently, Shukla’s Schirkoa feature was announced as one of the first recipients in Reallusion’s Pitch & Produce program. The program provides tools and financial support to indie creators and commercial studios using the iClone character and animation pipeline with Unreal Engine.
A look at the old approach
For the Schirkoa short, Shukla relied on a classic pipeline of Maya, Photoshop, After Effects and Premiere Pro. Rendering was carried out in Redshift, and character animation began with Mixamo.
“I had one master character rig for each character and I would modify its appearance according to the scene,” explains Shukla, of his previous approach. “Apart from the main props that were built in Maya, many assets in the background, especially vehicles and buildings, were brought from the Daz marketplace. The textures were a mix of hand-painted and filtered to achieve the flat look of the film.”
Above: a breakdown of the original Schirkoa short.
“Once the rough props and characters were more or less there, I started laying out my shots inside Maya,” adds Shukla. “Once the layout was done, I iterated on the Maya playblasts inside Premiere Pro with temp sounds and music until I achieved my first cut. Then it was a matter of polishing the animations and rendering the shots. Compositing-wise, I had rendered a few extra ‘toon’ passes for outlines and diffuse information. These were then merged creatively in After Effects to create the look of the film.”
For Shukla, the main challenges he faced in creating the short film were generating crowd and secondary character animations, reducing rendering times and optimizing scene assembly. “While I could handle the crowds with Mixamo and optimize rendering times with GPU rendering,” he says, “the scene assembly aspect was still very much an obstacle. With a single workstation and limited resources, I just dealt with the constant crashes by optimizing the assets and scenes as much as possible. Another challenge was iterations. Once a render was done after a week or so, I had to rewire my brain and suck up whatever renders I would end up with.”
A new real-time paradigm
Looking to enhance this workflow and get more instant and interactive feedback for a feature-film version of Schirkoa, Shukla shifted to an array of real-time tools for character builds, animation and rendering. From an overall point of view, the new workflow now revolves around Reallusion’s Character Creator 3, Daz, iClone, Maya/Blender and Unreal Engine.
Most of the characters are crafted in Character Creator 3 and iClone (iClone is also used to animate the characters). Special characters—those requiring flamboyant costumes, horns or wings—are first built in Daz and brought into Character Creator via its Transformer tool. Hero characters are exported as FBX using the Unreal and Maya/Blender presets, ready for auto-rigging later.
Secondary characters are exported as FBX with InstaLOD to reduce geometry and consolidate materials and UVs. Hero characters are brought into Unreal Engine through the Character Creator Auto Setup, and are also brought into Maya/Blender and rigged.
Motion capture is recorded in Reallusion’s iClone using a combination of Perception Neuron suits for body capture and Reallusion’s LIVE FACE App for facial capture. Secondary motions are directly sent to Unreal and assembled. Primary motions are sent to Maya/Blender for further polishing. They are then exported as Alembic or FBX.
A first pass of props and environments, drawn from a mix of marketplace assets and basic blocks, is assembled in Unreal, where a 3D layout is also done for the animatic. A second asset pass then follows, with custom assets created based on camera angles.
The 3D layout is then further polished in Unreal, where post-process materials are added and color correction is carried out. The animatic is brought into Premiere Pro and the process repeats until, says Shukla, “the film looks great.”
What this new approach offers
Shukla’s adoption of this range of real-time tools has made the most difference for him in terms of cinematography. “With a WYSIWYG workflow I am able to realize my shots in a far more creative way than before,” the director attests. “If I can have a sense of camera focus, color, final look, textures, fx, post processing all in my shot, I basically end up ‘shooting’ the film in a more grounded way.”
“I have now started actually editing my film in a more traditional way with multiple camera angles and b-rolls,” continues Shukla. “I have a lot more wiggle room as the director/editor in the animatic stage to make the most engaging sequence possible.”
The other significant leap for Shukla is the ability to do many more shot iterations. He says he can keep re-shooting sequences, essentially rendering again and again, until they look right and make sense in the overall narrative.
“I have always avoided storyboarding personally as it leaves little room for creative choices and happy accidents. A gray viewport with a temporary look of the film, however, didn’t help in my previous workflow.”
Where the feature film is at
The feature film version of Schirkoa is progressing. Shukla says he has been working with a team of producers based in Europe and Asia, and is in late-stage negotiations with voice actors and music composers. “Once that part is locked, we should be able to start production very soon.”
Shukla adds that he’s been enjoying learning tools such as Character Creator, iClone and Unreal Engine along the way. “With iClone and Character Creator the learning curve has been quite straightforward. They have some good learning resources online, especially their YouTube channel.”
“If you come from a traditional 3D app like Maya, Max or Blender, you can get started in a couple of days. With Unreal the learning was manageable, but it took me a while to get used to the workflow. It’s quite a shift in terms of linear storytelling inside a 3D application. The Unreal documentation and YouTube channel were a tremendous help in making that learning as swift as possible.”
For more information about Reallusion’s Pitch & Produce program, including how to apply, visit the Reallusion website.
Brought to you by Reallusion:
This article is part of the befores & afters VFX Insight series.