How does USD actually get used at a VFX studio?

Luma Pictures breaks down its Katana and USD implementation.

You’ve probably heard of USD, Pixar’s open source Universal Scene Description framework. But you might not know just how visual effects studios are using it in their own pipelines to interchange 3D data between different DCC apps.

A major adopter of USD is Luma Pictures, which jumped head-first into the framework to help deliver the elaborate kinds of VFX it’s known for from films such as Doctor Strange, Captain Marvel and Spider-Man: Far From Home.

Here’s how the studio has implemented USD, in particular in its Foundry Katana lighting and lookdev pipeline: why it did it, how it did it, and how it’s been used on recent projects.

The problem to solve

In November 2016, shortly after wrapping on Doctor Strange, Luma Pictures flew the company to Fiji for a special employee retreat. The Marvel film had been one of the most complex in the studio’s history.

“In post-mortem conversations over Mai Tais,” recalls Luma Pictures head of software Chad Dombrova, “it became clear that while the film was a high-water mark, parts of the pipeline were pushed to their breaking points due to the enormous scale required for the film’s effects. The main pain points identified were Maya’s inability to operate at scale and a lack of flexibility in the overall pipeline.”

Luma Pictures’ head of software Chad Dombrova.

It was ultimately decided that there would be a two-pronged attack on the problem: switching from Maya to Katana for lighting and look development, and rebuilding the pipeline around USD.

“Katana has first-class support for USD via Pixar’s usdKatana plugin, now maintained by Foundry,” outlines Dombrova. “So by switching to Katana and USD simultaneously, Luma was able to leverage Pixar’s battle-tested usdKatana plugin to sidestep numerous Katana development tasks, which freed up time to focus on the core USD pipeline and add more artist-facing tools to Katana.”

A final visual effects shot by Luma Pictures from ‘Spider-Man: Homecoming’.

“As a testament to both platforms and our development team, we were executing shots on Spider-Man: Homecoming using Katana and USD just three months after beginning development.”

It has now been almost four years since that USD development decision, and Luma Pictures has successfully integrated USD into all of its primary applications – Katana, Maya, Nuke, Houdini, Arnold and Mari.

A Luma Pictures Katana template.

Luma has also created a number of open source projects of its own, as part of a significant collaboration with Pixar, Foundry and other studios heavily invested in USD, such as Animal Logic.

For example, discusses Dombrova, “our usdArnold has shader importers and exporters for various DCCs as well as a Hydra render delegate, and has since become the basis for Autodesk’s official arnold-usd repo. Maya-to-Hydra is a Hydra scene delegate for Maya which has been rolled into Autodesk’s official maya-usd repo. And UsdQt provides a set of Qt models for creating USD user interfaces.”

The action shot in Prague for ‘Spider-Man: Far From Home’, and initial CG render.
Luma Pictures’ final VFX shot.

What the USD and Katana combination solved for Luma

With a lot of complex visual effects work going through Luma’s pipeline – often with so many different tools and file formats being used – jumping into USD helped solve the massive challenge of data interchange. USD essentially provides a universal format for loading common objects like cameras and meshes in a multitude of applications.

“But,” adds Dombrova, “USD goes much further by encompassing features such as layering, references between files, overrides of attribute values between layers, mutually exclusive variants (like ‘hi’, ‘mid’ and ‘low’ levels of detail), deferred loading, procedural scene generation, and many, many other features.”
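Several of the features Dombrova lists are visible directly in USD’s human-readable .usda text format. The fragment below is an illustrative sketch, not one of Luma’s actual files: all file paths, prim names and the variant selection are hypothetical. It shows a shot layer that references a published asset, selects a mutually exclusive ‘lod’ variant, and non-destructively overrides one attribute authored in the referenced layer:

```usda
#usda 1.0

def Xform "MoltenMan" (
    # Reference in the published asset (hypothetical path).
    prepend references = @assets/moltenMan/moltenMan.usda@
    # Select one of the asset's mutually exclusive variants.
    variants = {
        string lod = "hi"
    }
)
{
    over "Geom"
    {
        over "LeftArm"
        {
            # Shot-level override: stronger layers win, and the
            # asset file on disk is never modified.
            token visibility = "invisible"
        }
    }
}
```

Because composition is resolved at load time, the same asset file can be referenced into hundreds of shots, each carrying only its own thin layer of overrides.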

Molten Man from ‘Spider-Man: Far From Home’, in pre-final rendered form.
The final shot with effects and lighting.

Switching from Alembic (which had previously been relied upon at Luma) to USD gave the studio a way to describe and organize an entire shot, not just the leaf-level caches. “Reaching that level of completeness means your shots are truly fully portable between applications,” says Dombrova.
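The “whole shot, not just leaf caches” idea maps naturally onto USD’s sublayer composition. A shot-level layer can be little more than an ordered stack of department layers, strongest first; the layer names below are illustrative, not Luma’s actual structure:

```usda
#usda 1.0
(
    # Hypothetical shot assembly: later (weaker) layers are
    # overridden by earlier (stronger) ones.
    subLayers = [
        @lighting.usda@,
        @fx.usda@,
        @anim.usda@,
        @layout.usda@
    ]
)
```

Any application that understands USD can open this one file and see the fully composed shot, which is what makes shots portable between DCCs.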

In terms of where the Katana and USD combination has been critical, it starts with lighting and lookdev; this is what Luma uses Katana for. As part of its USD development, the studio wrote a UsdShade exporter – based on the LookFileBake API – to inject materials and their bindings into the USD stage of an asset. Says Dombrova: “This process uses our usdArnold library, which adds support for AOVs and object properties, like subdiv settings, so that lookdev artists can pass these along with the asset to the downstream lighting scenes.”
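As a rough illustration of the kind of data a UsdShade export serializes, the sketch below uses the standard UsdShade schema with an Arnold shader, in the style of the arnold-usd conventions. The shader and prim names are invented for the example; only the schema attributes (`info:id`, the output connection, `material:binding`) follow real USD usage:

```usda
#usda 1.0

def Scope "Looks"
{
    def Material "lavaMtl"
    {
        # Arnold render-context surface output, connected to the shader below.
        token outputs:arnold:surface.connect = </Looks/lavaMtl/lavaShader.outputs:surface>

        def Shader "lavaShader"
        {
            uniform token info:id = "arnold:standard_surface"
            float inputs:emission = 5.0
            color3f inputs:emission_color = (1.0, 0.3, 0.05)
            token outputs:surface
        }
    }
}

def Mesh "Body" (
    prepend apiSchemas = ["MaterialBindingAPI"]
)
{
    # Binds the baked material to this geometry.
    rel material:binding = </Looks/lavaMtl>
}
```

Baking looks into this form is what lets downstream lighting scenes pick up materials, bindings and object properties without re-running lookdev.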

Katana node graph from Luma Pictures.

“Nathan Rusch, our lead Katana developer,” continues Dombrova, “has added numerous improvements and bug fixes to the usdKatana plugin, including import support for usdVol volumes and usdSkel agents, and a custom node for managing graph state for upstream PxrUsdIn stages – variants, population masking, etc. – which allows us to integrate our USD scene management UI within the node graph.”

Katana and USD in action

So how does this all get used day to day at Luma? Dombrova notes, first, that the general approach at the studio is to operate at the sequence level as much as possible. Artists in lighting, FX and compositing create scene files that act as templates for many similar shots in a sequence.

Another key moment from ‘Far From Home’.

Luma then uses its custom batch submission UIs and its event-based orchestration engine, Rill, to automatically propagate changes across shots within a sequence and through the successive steps of each shot’s pipeline. The end result is that an export at any point in the pipe can automatically trigger a regeneration of any downstream composites.
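The propagation behavior described above can be sketched as a small dependency graph where an export event fans out to every downstream step for each affected shot. Rill itself is Luma’s proprietary engine, so every class, method and step name below is hypothetical; this only illustrates the event-driven cascade, not Luma’s implementation:

```python
from collections import defaultdict, deque

class Pipeline:
    """Toy sketch of event-based downstream regeneration."""

    def __init__(self):
        # step -> set of steps that consume its output
        self.downstream = defaultdict(set)
        self.runs = []  # record of (step, shot) jobs triggered

    def add_edge(self, upstream, downstream):
        self.downstream[upstream].add(downstream)

    def on_export(self, step, shots):
        # An export event for `step` triggers every downstream step,
        # breadth-first, once per affected shot.
        queue = deque([step])
        seen = {step}
        while queue:
            current = queue.popleft()
            for nxt in sorted(self.downstream[current]):
                if nxt not in seen:
                    seen.add(nxt)
                    for shot in shots:
                        self.runs.append((nxt, shot))
                    queue.append(nxt)

pipe = Pipeline()
pipe.add_edge("asset", "animation")
pipe.add_edge("animation", "fx")
pipe.add_edge("fx", "lighting")
pipe.add_edge("lighting", "comp")

# A new asset version cascades through every shot, no human intervention.
pipe.on_export("asset", shots=["sq010_sh010", "sq010_sh020"])
```

In a real system each `runs` entry would be a farm job submission, but the cascade logic is the same: one upstream event, many regenerated downstream outputs.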

“For a large sequence where many shots share similar assets and lighting setups, we can empower a handful of artists to manage hundreds of shots,” states Dombrova.

“On Spider-Man: Far From Home, we had a sequence where Molten Man battles Spider-Man and Mysterio in a town square in Prague. When an asset artist released a new version of the Molten Man asset, we would automatically re-export animation, regenerate lava sims, re-render lighting scenes, and recreate a final comp for every shot in the sequence, with no human intervention required.”

Spider-Man has to save the day again in ‘Far From Home’.

In addition to Far From Home, Luma has utilized this end-to-end procedural pipeline on several shows, including Spider-Man: Homecoming, Thor: Ragnarok and Birds of Prey. “Katana is the backbone of this kind of workflow since it is able to represent sophisticated per-shot variation so elegantly in its node graph,” attests Dombrova.

Why USD is important

Combining USD and Katana has, adds Dombrova, enabled Luma to operate at a scale that was not possible before, both in terms of sheer scene complexity as well as shot throughput.

“We are several orders of magnitude beyond where we were before the switch to USD and Katana, and we’re still nowhere near the ceiling. We’ve worked on shows where a single lighter was able to deliver over 100 shots out of a single Katana script, and the script opened in seconds. It’s nothing short of game changing.”

Head to Foundry’s Katana page for more information about the lighting and lookdev tool. 

Sponsored by Foundry:
This is a sponsored article and part of the befores & afters VFX Insight series. If you’d like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series right here.
