VFX Insight

Telling an all-CG animated Japanese samurai story, when live-action wasn’t possible

How Neil Biondich used Reallusion’s Character Creator and iClone for ‘Chiburi’.

When writer, animator and director Neil Biondich set out to make a short film set in medieval Japan, his first approach was to shoot live action in that country. COVID-19 prevented that from happening, but he quickly pivoted to an animated short with digital humans and a real-time workflow that incorporated Reallusion’s Character Creator and iClone, and Epic Games’ Unreal Engine.

Here, Biondich runs through the story he has told in the still-in-progress Chiburi, made with the assistance of Reallusion’s Pitch & Produce program, and the tools and techniques—including his own martial arts motion capture—he used to make it.

b&a: How did your Chiburi project get started?

Neil Biondich: Well, I have kind of a deep connection to Japan. I’m a lifelong martial artist and I’ve been doing traditional Japanese martial arts since 1989. I find it one of the most mystical, beautiful places. It’s very inspiring. So that was kind of the backdrop for where I wanted to tell a story, and my original intent was to come up with the ‘ultimate underdog’. Somebody who has nothing going for them and has everything going against them.

I came up with the idea of the character Sister, who is the younger of two siblings in a Japanese family raised by a single dad who is a retired veteran samurai. And he would have been a man who was an executioner for a warlord. He’s somebody who’s killed tons of people and he was good at it. But now, he’s retired and he’s raising two kids and he largely ignores his daughter because she can’t be samurai. She can’t even touch a sword according to their customs.

Sister watches her slightly older brother get trained as a samurai and yearns to have what he has, including getting a name in a naming day ceremony. She doesn’t want to be Sister and just be this utility. She does all the chores, all the cooking. So it’s really the story of a young female trying to overcome the culture of the time telling her that she can’t be something, that she can’t even have a name.

In the story, ironically, her brother doesn’t want to be a samurai. He’s an artist. He has kind of a soft heart and he doesn’t want to become a trained killer like his father was. He wants to make a name for himself.

b&a: When you came up with that story, did you have a firm feeling in your mind about the look and feel of the film, in terms of photoreal versus stylized versus digital humans?

Neil Biondich: Well, originally this project was supposed to be shot in Japan as a live production. I went to Japan right before the pandemic broke out and I did location scouting and I was prepping to shoot in 2020. And, of course, COVID shut all of that down.

Coincidentally, at the time I happened to be studying Unreal Engine, which I fell in love with. I was an animator for 10 years before moving into live production, so animation was my first love, and Unreal was kind of my path back to doing some animation. COVID kind of forced it on me. So I had this great story, production got shut down, and I needed a project to teach myself Unreal Engine. I was like, ‘Sounds like this is going to be an animated story!’

b&a: What were the different tools and techniques you considered when setting out to make this as an animated film, then?

Neil Biondich: I instantly set out doing research. I quickly landed on Reallusion’s Character Creator as one of the easiest ways to create very realistic-looking humans. I delved into their ecosystem and I was able to get some great results. The more I messed with it and the better I got at it, the more I realized I could get some pretty photorealistic humans out of this. And so I spent considerable time learning Character Creator and iClone doing just that.

I bought myself a Rokoko suit and used an iPhone for facial capture, so I could do motion capture on my own and I practiced that for a while. I became pretty good at cleaning up the data, but then I found iClone to be a really nice central place to bring the mocap data in to clean and enhance it. It was easy to add layers and make refinements to the animation. So, I found it a really good tool and I’m not somebody who likes to use tons of tools. If I can get everything done in one package, I will.

b&a: Was Sister or any of the other characters based on any particular person or reference?

Neil Biondich: No, not really. I’m a very visual person and I’m a face person too. So I just had an idea of what I wanted her to look like, and Father and Brother too. I shopped around for starting points with Asian characters and, sadly, there weren’t many. There aren’t a lot of great Asian models out there. So it took a lot of research to find a base model to work from. In Character Creator, I found it very handy to morph the siblings between each other, so they definitely had some similarities to them. And then the age differences, obviously, were really key.

b&a: When I’ve talked to some other people about those tools, they’re always fascinated by SkinGen, and I’m wondering how much you used it, because you can go really far in terms of, say, adding sweat and all those other things, or you can keep it a bit more stylized, I guess. Where did you settle for these characters?

Neil Biondich: SkinGen came out mid-process for me on this project. So it was a tool that got thrown in the mix, but it was very welcome. I don’t shy away from added layers of detail. To me, it gets me excited. But on the other hand, Japanese people have very good skin and, traditionally, compared to other ethnicities, they have pretty blemish-free, smooth skin. So my use of it wasn’t that overt. I did put some capillaries and other things subtly underneath the skin for all the characters. The older characters got more: Father has a nice scar running down his forehead, and the bad guy who eventually shows up at the end has a couple of major scars on his face. So mostly what I used it for was subtle touches like capillaries and scars.

b&a: One of the things that interests me is that you did adopt a real-time Unreal Engine workflow, which has really changed the game. How did you find that as a new workflow, and also going from Character Creator and iClone into Unreal Engine?

Neil Biondich: I consider them siblings of each other, in that Character Creator and iClone are real-time, too. And I’ve got an RTX 3090 card, which rasters nice and fast. But just in general, the whole real-time workflow is incredibly refreshing, especially coming from a traditional 3D background. It was always ‘hurry up and wait’, and you’d never know what you were going to get. To me, the most frustrating thing about doing 3D was having to hit that render button and wait a day or two to check out your product. Getting to rapidly prototype stuff, to move a light and see it in real time, that is just great. I mean, it really helps the creative process to be able to iterate that quickly.

b&a: When there’s more of the sword fighting action, how did you tackle that with the motion capture, but also mix in reference from your own martial arts and other martial arts work?

Neil Biondich: I did all of the movements for it, so I had to choreograph myself, but the difficult thing is choreographing me against me. So I developed a methodology where I made a verbal script describing the action of the scene. The verbal script created a timeline of when things would happen, and I could play it back while I was acting and act according to it, so my timing was correct. Then I would do the same thing for both sides, so both characters were operating on the same script.

b&a: I think another fascinating thing is that it was intended to be live action. Now that you’ve made it into a CG short, where is it at? What’s your plan with it in terms of showing it and getting it out there?

Neil Biondich: I don’t have any major plans other than if I like it enough, I’ll enter it in some festivals and I’ll show it around to people who I know and like. For me, more than anything, it was the first project where I really learned these tools, and I’m somebody who has to do a project to learn it. I don’t enjoy academic learning in a vacuum as much. I need to kind of just jump in the deep end and go for it, and that’s really what I did with this project. And it got me to really understand the nuts and bolts of all of these tools. So if I get to the end and I like the product, I’ll spread it around to some festivals, I think.

Part of me has been tempted, since I’ve got this world built out, to maybe do a hybrid. I’ve finished the animation, but I’m tempted to go back with the live actors who I wanted to work with and maybe bring them into my 3D environment and do a virtual production version of the same story. We’ll see.

Brought to you by Reallusion:
This article is part of the befores & afters VFX Insight series.

