Behind the real-time approach.
VFX studios are fast becoming some of the principal adopters of new real-time tools. That includes game engines such as Epic’s Unreal Engine, which Zoic Studios recently employed to revisit a battle scene on Stargirl that it had previously completed with more traditional CG software.
For this revisited scene of S.T.R.I.P.E. and Grundy battling it out, which you can watch below, Zoic used Unreal Engine as part of its Epic MegaGrant to test what could be achieved with the real-time toolset.
befores & afters asked Zoic executive creative director Andrew Orloff what was required to adjust to a game engine approach for the scene.
b&a: What was different and what was the same in terms of following an Unreal Engine workflow for this scene?
Andrew Orloff: We were able to utilize a good amount of the artistic framework and a lot of the modeling, texturing and rigging, but moving into a real-time sequencer required a major shift in approach. Unreal operates like a simulation of an event, much more like filming on a set with actors in real time. Instead of making an individual animation for each shot, we made very long animations and found the cameras that would best serve those shots, which shaped the overall continuity and motion of the piece. Traditionally, we would have had over 50 animation files working together to achieve continuity; in Unreal we had seven animation files with multiple camera angles on each. We were able to do things like adjust the lighting with the artist and see how it would play out without re-rendering, which is a huge innovation for the VFX world.
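To make that contrast concrete, here is a minimal Python sketch of the idea. This is not Unreal’s actual scripting API, and all asset names, frame ranges and data structures are invented for illustration: one long, continuous animation “beat” is covered by a track of camera cuts, so each editorial shot is just a camera span over the same ongoing performance.

```python
from dataclasses import dataclass

@dataclass
class CameraCut:
    camera: str  # virtual camera covering this span of the performance
    start: int   # first frame of the cut
    end: int     # last frame of the cut (inclusive)

# One continuous animation "beat" covered by several cameras, rather
# than a separate animation file per shot. Names and frame ranges are
# hypothetical.
battle_beat = {
    "animation": "ANIM_Stripe_vs_Grundy_Beat01",  # one continuous performance
    "camera_cuts": [
        CameraCut("cam_wide", 0, 180),
        CameraCut("cam_grundy_close", 181, 260),
        CameraCut("cam_stripe_low", 261, 420),
    ],
}

def shot_list(beat):
    """Derive editorial shots from the camera-cut track. Re-editing a
    shot only re-points a camera; the underlying animation is untouched."""
    return [(cut.camera, cut.start, cut.end) for cut in beat["camera_cuts"]]

print(shot_list(battle_beat))
```

The payoff of this structure is that continuity lives in the single performance itself, rather than being stitched together across dozens of per-shot files.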
b&a: What did you feel that the real-time aspect gave you in this test, and how do you think it might benefit future productions at Zoic?
Andrew Orloff: I think the way we iterate and loop communication in real time is entirely different. Our artistic pursuit when we approach a project comes with a certain amount of experimentation. With a traditional VFX process, a filmmaker will look at a shot, make a note, wait, view the interpretation of that note and then repeat as many times as the delivery date will allow. In Unreal, the ability to look at a scene and change an aspect of it in real time means a round of two or three revisions can be executed seamlessly in just 5-10 seconds. Real time allows you to experiment and find what will work best for the scene or sequence. You can see how the decisions you make affect everything automatically, and suddenly your pipeline operates instantaneously.
In a traditional VFX pipeline, we have to go back to the top of the pipeline every time we get a note. To make this easier, we don’t render just one image; instead, we render every layer of an image as a separate pass and then recombine them in compositing so we have control over each layer. When we render in real time, we don’t have to make those passes; the render occurs instantaneously in the engine. The traditional VFX pipeline was built to generate that additional material so changes could be made more easily throughout the iteration process. Game engine technology allows us to interact with working material at a time scale we haven’t seen before. We’re not just using it for previs; we are actively using it as the mechanism for making imagery for some of our current shows.
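As a rough illustration of that layered workflow, here is a small NumPy sketch of additively recombining separate render passes (AOVs) into a “beauty” image. The layer names and the gain control are assumptions for illustration, not Zoic’s actual setup; the point is the per-layer control a compositor gets without a re-render.

```python
import numpy as np

# Stand-ins for per-layer render passes (AOVs); in a real pipeline these
# would be loaded from EXR files rendered by the 3D department.
height, width = 540, 960
diffuse  = np.random.rand(height, width, 3).astype(np.float32)
specular = np.random.rand(height, width, 3).astype(np.float32)
emission = np.zeros((height, width, 3), dtype=np.float32)

# Additively recombine the light passes into the final "beauty" image.
# Dimming one layer (here, specular) answers a lighting note in
# compositing without sending the shot back up the pipeline to re-render.
specular_gain = 0.8  # hypothetical per-layer dial
beauty = diffuse + specular_gain * specular + emission
```

In a real-time engine that separation becomes unnecessary: the note is answered by changing the light itself, and the full frame simply re-renders in place.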
The technology lets us do what we are most passionate about: being dynamic partners and bringing the imagination of creators to life. With such a unified process, we are better able to understand what filmmakers want and put their vision on the screen in an unprecedented way. The time freed up can be spent amplifying the visuals, which allows us to keep pushing the envelope on the VFX quality audiences will see on television, and in content in general. What excites us at Zoic is having the right tools to dial in on the needs of the filmmaker, and Unreal has been, and will continue to be, a game-changer in that endeavor.
Not only does leveraging Unreal provide massive efficiencies in creative collaboration, but it also allows for unmatched render quality. Because the final rendered material came out of Unreal without needing any compositing treatment, the resulting render quality was far superior to what would have been possible with a standard VFX pipeline on a typical network television delivery schedule.
It’s an intuitive tool because it’s essentially a simulation of a real shooting environment, so it makes sense to creators immediately. I’m able to have real-time sessions with a director, and for them it’s very similar to being on set, talking to a camera operator while actors go through their blocking. That ease of adoption for creators is essential to our ability to keep moving towards a fully real-time pipeline.
b&a: If you had to pinpoint any challenges of using Unreal Engine here, or things you’d like to be able to do in future versions that weren’t available for this test, what would they be?
Andrew Orloff: The most challenging part is convincing our artists that this new way of thinking about VFX is the right way. A lot of VFX artists are deeply invested in the software and methodologies they currently use, and in some ways this shift can feel like a steep hill to climb. However, we have seen across the board that once they get involved, they quickly see the benefits.
In terms of the future, we have watched Epic roll out new features aligned with the needs of filmmakers, and we are excited to see that growth continue and to embrace the new creative possibilities it presents.