VFX Insight

Building a real-time pipeline for an animated feature AND an accompanying game all at once

How Hasraf ‘HaZ’ Dulull and his team at HaZimation jumped into Character Creator, iClone and Unreal Engine for ‘RIFT’.

Director and producer HaZ Dulull is one of those filmmakers finding new ways to create content with real-time pipelines. His animated sci-fi feature RIFT, which is simultaneously being made as a game, is one such project.

Here, Dulull and his team, via the HaZimation outfit, are capitalizing on several tools to make the film, including Reallusion’s Character Creator and iClone, while also incorporating several motion capture workflows and live links into Unreal Engine. The filmmaker shares his experience on RIFT with befores & afters.

b&a: What did you set out to achieve with RIFT?

Hasraf Dulull: We set out to produce a full-length animated feature film, stylistically inspired by anime, that would be rendered entirely in Unreal Engine as final pixels.

The creative team behind ‘RIFT’.

b&a: What drove you and your collaborators’ decision to use Reallusion tools to help craft the different versions of the characters for RIFT? What specific tools and techniques did you use to model/build the characters?

Hasraf Dulull: As a hands-on director I am responsible for all shot creation, working closely with the team, who build the key components that I then slot together to create the shots on my end. The team is small and everyone is pretty much a generalist.

Andrea Tedeschi, my long-time collaborator and CG supervisor, wanted to streamline the character process because the plot of the film, which deals with the multiverse and alternate realities, meant we had various iterations of the same characters. So we looked at the Reallusion pipeline: Character Creator for the characters (designed, built, shaded and rigged) and iClone for body animation, facial animation and lipsync. All of this then linked seamlessly into Unreal Engine.

Andrea also used Substance Painter for additional texturing of the characters, which he then finished off in Character Creator before exporting into Unreal Engine.

A character comes together in Character Creator.

b&a: How did you use ActorCore and other tools for the animation/mocap process? How did this help inform the final approach?

Hasraf Dulull: We knew that on this ambitious project there would be a need for specific motion capture sessions using an Xsens suit, but we also knew we didn’t have the budget to motion capture every single movement in the film.

So I broke down the script and went straight into blocking out shots. Anything specific, like a choreographed action scene, was motion captured by our mocap artists, Gabriella Krousaniotakis (who also helped us lock down a pipeline for this) and Ace Ruele.

But all the other mocap, such as idles, running, kneeling and generic conversations, we were able to grab from the ActorCore library, download as Unreal Engine FBX files and retarget nicely onto our characters.
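The article doesn’t detail how HaZimation brought those clips in, but as a rough sketch of the general idea, animation-only FBX files can be batch-imported onto an existing character skeleton with Unreal’s Python editor scripting. The file and asset paths below are hypothetical placeholders, not the production’s actual structure.

```python
import unreal

# Hypothetical paths -- the real project layout isn't described in the article.
FBX_FILES = [
    "C:/mocap/actorcore/idle_01.fbx",
    "C:/mocap/actorcore/run_01.fbx",
]
TARGET_SKELETON = "/Game/Characters/Hero/Hero_Skeleton"
DEST_PATH = "/Game/Animations/ActorCore"

def build_import_task(fbx_path):
    """Configure an animation-only FBX import targeting an existing skeleton."""
    options = unreal.FbxImportUI()
    options.import_mesh = False
    options.import_animations = True
    options.import_as_skeletal = False
    options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
    options.skeleton = unreal.load_asset(TARGET_SKELETON)

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = DEST_PATH
    task.options = options
    task.automated = True   # suppress the import dialog
    task.save = True
    return task

tasks = [build_import_task(f) for f in FBX_FILES]
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks(tasks)
```

Once the clips are on a compatible skeleton, retargeting and placement in shots can proceed in the editor as usual.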

For the last few scenes of the movie we recently used a machine learning approach to motion capture via Move.AI, which uses GoPro footage to capture the motion without the need for our actors to wear suits, and again the same FBX workflow brought it into our pipeline.

A frame from the film.

b&a: Tell me a little about character performance. How did you handle lipsync, facial movement and also hand movement?

Hasraf Dulull: Lipsync has always been tricky to get right, as well as time-consuming if the audio changes during editorial. So we needed something that worked well with the iterative nature of how the shots were being created, and a solution that didn’t involve us animating a mouth rig against a video reference.

AccuLipSync is a plugin in iClone that pretty much saved the day, and I do literally mean it saved days of work! Even more efficient was that I tasked myself with handling all the lipsync work in iClone, so I could jump between editing and creating lipsync from the audio exported from the editorial timeline (we edit in DaVinci Resolve).

For facial animation we used the Live Link straight out of iClone, and our talented animator Alex Kong actually had fun performing many of the facial expressions and core moments of the characters himself via the iClone Live Link, then adjusting them further in iClone.

Alex Kong performing using Live Link in iClone.

For other generic facial expressions, I was able to dive into the iClone Face Puppet, where by simply selecting parts of the face and moving the mouse around I could create organic-looking facial expressions and performances, keyframe them all in iClone’s timeline and then export them as FBX. These came into Unreal Engine via proprietary tools we developed there, which let us control face, hands, body and lipsync as individual animation tracks in Sequencer.

This meant that Unreal Engine artists we brought on later in the production, such as filmmaker/animator Mark Cheng, who joined the team to help create shots with me, were able to hit the ground running and dive into the shots on day one, animating with the FBX exports of lipsync, body animation and facial animation from iClone in Unreal Engine Sequencer.
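HaZimation’s proprietary tools aren’t public, but the underlying idea of layering body and face as separate Sequencer tracks can be sketched with Unreal’s Python API. The sequence name, binding name, animation assets and frame range below are hypothetical, purely for illustration.

```python
import unreal

# Hypothetical asset paths -- HaZimation's actual tools and content are not public.
SEQUENCE_PATH = "/Game/Sequences/Shot_0120"
BODY_ANIM = "/Game/Animations/Hero_Body_Walk"
FACE_ANIM = "/Game/Animations/Hero_Face_Dialogue"

sequence = unreal.load_asset(SEQUENCE_PATH)

# Assume the character already has a binding in the sequence, found by display name.
binding = next(b for b in sequence.get_bindings()
               if str(b.get_display_name()) == "Hero")

def add_anim_track(binding, anim_path, start_frame, end_frame):
    """Add one skeletal animation track with a single section to the binding."""
    track = binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)
    section = track.add_section()
    section.set_range(start_frame, end_frame)
    section.params.animation = unreal.load_asset(anim_path)
    return track

# Body and face live on separate tracks, so either can be swapped or re-exported
# from iClone independently without touching the other layers of the performance.
add_anim_track(binding, BODY_ANIM, 0, 240)
add_anim_track(binding, FACE_ANIM, 0, 240)
```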

For the hand animation we used Manus gloves with Xsens. Gabriella did pretty much all of those shots in one take, but as I was creating the shots in Unreal and bouncing back and forth with my edit, I realized I needed more specific hand gestures for other shots and didn’t want to go back to another session just to capture hands. So we used a plugin inside iClone called Hand Gestures, where I could just move the mouse around to pose the fingers and create the gestures I wanted. It was such a quick process, a matter of minutes, and then I exported the result as FBX into Unreal and applied it to the hands later in our animation pipeline. That’s the beauty of the way we designed the pipeline: it allows each of us to contribute to various parts of a character’s performance.

Using AccuLipSync for lipsync.

b&a: Can you talk a little about what it has been like building both a film and a game with the Reallusion tools, and with Unreal? What are some of the benefits of this cross-over do you think, as well as things you need to look out for?

Hasraf Dulull: The idea of going into game development alongside the animated film was something that happened midway through production, and it was influenced by two things. The first core reason was that the plot of the film made for a perfect video game plot and game design. There were so many things I wanted to do in the film but couldn’t, due to the linear format of a 90-minute feature, whereas with the game’s branching narrative there were so many ideas on where we could take the story and the fates of our characters based on the player’s choices.

The second reason was that we were already using a game engine. We decided to do a game jam session one weekend with our awesomely talented Unreal Engine dev/artist Sam Rebelo, and the outcome was a very rough level with the characters moving around and shooting, but it was enough for us all on the team to get excited and see that we had a game.

An Unreal Engine UI screenshot for RIFT.

The big decision I made about the game was to ensure we were not creating an AAA-scale game, even though it may look polished and super slick thanks to the art direction and assets already created for the film. I looked at the early days of arcade shooters and we used that as our game design approach: a fun, intense arcade shooter which blends first person and third person while seamlessly bleeding into story cinematics, all in-game.

I remember back in the day when I worked in video games, we had to create several levels of detail for assets and environments, but with RIFT we didn’t do any re-designing, remodelling or LOD reduction; we literally just migrated our Reallusion characters and animation into the game and started developing it. This meant we were only building assets once but utilizing them across two projects at the same time, and now we are also looking to turn some of them into NFTs to really increase the longevity of all the art we create.

Brought to you by Reallusion: This article is part of the befores & afters VFX Insight series. If you’d like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
