The making of ‘The Liberator’.
There’s something different about The Liberator, the four-part live-action/CGI hybrid animation series directed by Grzegorz Jonkajtys and now streaming on Netflix. At first, you might think it’s simply a different take on the rotoscoped animation process. But look closer and you see so much more detail and emotion in the performances.
befores & afters decided to ask Polish studio Juice, which produced the series with US-based Trisoscope Studios, about how this look and this emotion were made. While the specific live-action-to-animation workflow—something dubbed ‘Trioscope Enhanced Hybrid Animation’—is being kept under wraps for now, we dived into what was involved in crafting the series.
b&a: What was different about The Liberator compared to a normal live action or animated film?
Michał Misiński (art director and second unit director): From the very beginning, we tried to incorporate real emotion into the comic book style. We have a real actor who is actually playing the part on set, and on top of that we could add a style which looks like a comic book. That generates something new, something unique. It was very challenging for us from the very beginning, because we made tons of tests of how we could cover the faces with crosshatching, with stylization, without losing the emotions.
Marko Zarić (visual effects supervisor): From a technical standpoint, when you do a live action shoot and then you’re supposed to incorporate some sort of CG, your CG part is directed by the live action, by the lighting conditions, by environmental conditions, by matching all this nuance. Here, it was a little bit different because we shot something on stage, we directed it the way we wanted, and then we needed to adapt all the live footage and CG to the style, which gives us this final look and feel.
If you look at comic books, sometimes they will intentionally emphasize everybody in the foreground, and the further back the image goes, the less visible and noticeable the details become. So, from a technical standpoint, for us, who come from a regular CG world, we needed to find this balance. Where do we stop with details? Where is the thin line between a very realistic building or prop and something that still looks graphical and gives you the sense of reading a comic? It needs to shift your focus to the actors and their performance and not to the environment, but at the same time it needs to look real. It needs to look like part of the same world.
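That depth-based balance, full detail up front and a flatter, more graphic look further back, can be sketched as a simple compositing pass. This is not Trioscope's actual pipeline, which is being kept under wraps; it is a generic illustration that assumes a per-pixel depth map and uses posterization to stand in for the "graphic" look:

```python
import numpy as np

def depth_detail_falloff(image, depth, flat_levels=4):
    """Blend full-detail pixels toward posterized (flattened) pixels
    based on depth: the foreground keeps detail, the background goes
    graphic. `image` is HxWx3 float in [0, 1]; `depth` is HxW with
    0 = near and 1 = far."""
    # Posterize: quantize each channel to a few flat tones as a
    # stand-in for a comic-book treatment.
    flat = np.round(image * (flat_levels - 1)) / (flat_levels - 1)
    # Per-pixel blend weight grows with depth, so detail fades
    # the further back a pixel sits in the frame.
    w = np.clip(depth, 0.0, 1.0)[..., None]
    return (1.0 - w) * image + w * flat
```

A real show would drive many more attributes this way (line density, texture, color count), but the principle is the same: depth controls how much detail survives.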
b&a: How did the process you followed differ from traditional rotoscoped animation?
Marko Zarić: Well, in traditional rotoscoped animation, say A Scanner Darkly, you would paint frame by frame, or maybe use some semi-automatic tools to recognize the faces. You get a particular kind of final look and feel, which ends up being a little bit flat. What Trioscope developed is actually a little bit more procedural, so you get all these nice nuances, all of this nice acting, and the subtle details that you don't see in regular rotoscoped animation. It's very present and very driven by what they do on stage. We are actually using the actors' performance, and everything else is driven by it, so the final look is heavily dependent on how they behave during the shoot.
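To make the contrast concrete: where frame-by-frame rotoscoping paints the lines by hand, a "procedural" approach derives line work directly from the footage itself. The sketch below is purely illustrative, a basic gradient-magnitude edge detector rather than Trioscope's undisclosed method:

```python
import numpy as np

def line_art(gray, threshold=0.2):
    """Derive comic-style ink lines procedurally from one grayscale
    frame (HxW float in [0, 1]) instead of painting them by hand.
    Returns a boolean mask where True marks a line pixel."""
    # Finite-difference gradients; np.roll keeps the array shape
    # (note it also wraps at the image border).
    gx = np.roll(gray, -1, axis=1) - gray
    gy = np.roll(gray, -1, axis=0) - gray
    # Strong local contrast becomes an ink line.
    mag = np.hypot(gx, gy)
    return mag > threshold
```

Because the lines are computed per frame from the plate, every subtle shift in the performance carries through automatically, which is the property Zarić describes.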
b&a: On set, the makeup, wardrobe and props were stylized. Can you tell me a little bit about what went into that and how far you needed to take that stylization?
Michał Misiński: Basically we needed to add line art to everything. The face has lines. The wardrobe has the same lines. Those lines came from a comic book kind of style. The set is actually less problematic because it’s not part of the emotion so much. So we painted as much as we could, but the key here was the face.
b&a: You used a lot of bluescreen sets. How did you make decisions about how much of the set needed to be built? How did you make decisions about whether it should be bluescreen or greenscreen?
Marko Zarić: The bluescreen/greenscreen decision was strictly practical. We were shooting a World War II drama, which means 90% of the uniforms are either green or grey, so bluescreen was the obvious choice. We tried to limit the props on set to everything the actors needed to interact with, like a specific table or a chair or a gun or other weapon. We built everything else in CG, which gave us a lot of freedom to interpret it the way we wanted.
When it comes to the exteriors, we built different platforms and blue boxes and sets, so when the actors were running downhill, they were actually running downhill. Then there was a funny thing on set. We had a bunch of blue rocks made of styrofoam that were supposed to be placeholders, so the actors knew how to behave. Those started flying around quite fast, because they were very light props, falling down and going everywhere!
You know, the entire technology was something new, so we had a lot of trial and error before we finally figured out the limits: where to use a real prop, something very real on the set, and at which point CG could take over. One of the things to work out, for example, was water. Interaction with water gave us a headache from day one until we figured out how to do it. I'm not really allowed to disclose exactly what we did, but if I had to pick one challenge from the stage shoot, it's definitely the interaction with water, because that took a lot of time and engineering to figure out.
b&a: Because you were shooting on bluescreen stages, and because so much of the work had to be done in post, was there any part of this process that gave you any real-time feedback on set, say a live-comp or a virtual production approach?
Michał Misiński: We are trying to figure out how to do it 'live'. It's possible, but it needs some time to develop; it's like doing post-production at the pre-production stage. We did previs all four episodes, and that previs showed us how many shots we needed and how we should set up the camera, that kind of thing. But for future productions, a virtual studio is the key to getting feedback. We have a bunch of soldiers on the bluescreen and they have to pretend that this is a war, and it just looks funny seeing them running and screaming and shooting on bluescreen. So for future productions, we would like to develop a real-time preview.