Foundry gets real about real-time rendering.
With Nuke 13.1, Foundry has made several updates to its flagship compositing software, while also recognizing the growing role of virtual production and real-time rendering in visual effects and animation.
That recognition comes in the form of the UnrealReader node. With many artists now also working within Epic Games’ Unreal Engine, Foundry has, with this new node, made it easy to work directly with the pixels coming out of the game engine and to continue fine-tuning them in Nuke, as artists normally would.
So what does UnrealReader actually let you do?
With UnrealReader, compositors can break objects into render layers, pull AOVs, build environment maps, and make other tweaks. It operates by connecting NukeX to Unreal Engine over a TCP/IP connection. Unreal can run on the same machine as Nuke, on a different machine, or even across different operating systems.
UnrealReader lets you control, from within Nuke, the data you want out of Unreal Engine. You can pull AOVs only when needed, isolate objects into different render layers, and even override cameras in Unreal Engine from within Nuke, non-destructively.
Remember, the killer thing here is that you can do all of this without ever leaving Nuke. The idea is that Nuke users, already intimately familiar with the fine detail that can be achieved with Nuke nodes, can take advantage of the speed and efficiency now prevalent in real-time renders.
These real-time renders, of course, are being used more and more in different VFX workflows, such as previs, or directly on LED wall stages. With UnrealReader, you can shoot LED wall footage and have the flexibility of bringing the streamed or captured footage directly into Nuke for any tweaking, bringing the promise of real-time technology into Foundry tools you already know.
Real-life customer experiences
Several UnrealReader / Nuke beta testers were on board early to try out the node as it was developed (and it continues to be refined by Foundry).
For example, freelance senior artist Shahin Toosi recently collaborated with DNEG’s Paul Franklin, the VFX supervisor and director, on his virtual production project Fireworks, which involved significant real-time rendering and LED stage work.
Here, the team wanted to utilize a number of Nuke tools to undertake the traditional kinds of live-action compositing tweaks, in particular balancing the moire effect that sometimes comes from LED walls, and adjusting lighting. Toosi used the UnrealReader node together with Nuke’s CopyCat (which was also used to try out some A.I. rotoscoping on the footage) to do that. In addition, crowds for a marketplace scene were enhanced by re-rendering scenes out through UnrealReader into Nuke.
Meanwhile, VFX supervisor Matt Jacobs from Technicolor recently embarked on an LED stage project with Intrepid in San Rafael. One challenge he faced while shooting on a relatively compact LED stage was how to capture a wide master shot.
Ultimately, he talked to Foundry about working directly in Unreal Engine, and then outputting the renders directly to Nuke in order to show the director how the plate and the final Unreal scene would appear. “It was a big eye opener for me,” says Jacobs. “It’s an amazing process for the empirical approach and interactivity that you don’t [usually] get.”
Pixomondo senior compositor Indrajeet Sisodiya had a somewhat similar issue while working on his studio’s new LED stage set-up in Toronto, where, for instance, they’ve been filming sequences for Star Trek: Discovery.
“We had a lot of scenes where the camera needed to be far away and wide, or where the camera was on the LED environment, but then pans up to reveal all the light rigs and top ceiling. I was looking into the default passes that come out of Unreal. I looked at, how do we extract passes out of the game engine that is not built to extract passes out of?”
Sisodiya was able to experiment with the UnrealReader node to extract the different AOVs he needed, which meant Pixomondo could then use them directly in Nuke.
Where to go to learn more
With UnrealReader, you can give your real-time rendered scenes that all-important extra plus’ing that comes from fine-tuning in Nuke. If you’d like to learn more about how to do that, here are some videos, sites and courses covering the new UnrealReader node:
– Course: UnrealReader | Visualizing Unreal Scenes in NukeX
– VFX Notes podcast with Ian Failes and Hugo Guerra on UnrealReader, in a discussion with Mathieu Mazerolle, Foundry’s Director of Product for New Technologies
Feature image courtesy of Epic Games.