The machine learning tools now in Nuke 13

See them demo’d in this video.

Foundry’s Nuke 13 was released today. Its new features include a machine learning toolset, a new Hydra 3D viewport renderer, extended monitor out functionality, enhanced workflows for collaborative review, and Python 3 support.

I thought I would share some more details on the machine learning toolset specifically. In its press release, Foundry notes that “applications of this flexible toolset include upres, removing motion blur, tracker marker removal, beauty work, garbage matting, and more.”

Foundry also goes on to detail the key components of the machine learning toolset:

CopyCat – an artist can create an effect on a small number of frames in a sequence and train a network to replicate this effect with the CopyCat node. This artist-focused, shot-specific approach enables the creation of high-quality, bespoke models relatively quickly within Nuke, without custom training environments, complex network permissions, or sending data to the cloud.

Inference – the node that runs the neural networks produced by CopyCat, applying the trained model to your image sequence or to another sequence.

Upscale and Deblur – two new tools for common compositing tasks, developed using the ML methodology behind CopyCat and the open-source MLServer. Beyond their primary uses of resizing footage and removing motion blur, the ML networks for these nodes can be refined using CopyCat to create even higher-quality results or studio-specific versions.

You can see how some of these machine learning features work in the video below. Check out Foundry’s page for more info.
