Eagle vs. meerkat: behind Weta Digital’s stunning Unreal Engine short

November 18, 2020

Fur and feathers!

If you watched the Epic Games Unreal Build: Virtual Production event last week, you might have seen Weta Digital’s presentation of their ‘meerkat’ short.

It utilized Unreal Engine’s hair and fur rendering and simulation tool (coming in UE 4.26) to help tell a fun story of a meerkat and an eagle—which meant millions of fur and feather strands.

To find out a little more, befores & afters asked Keith Miller, Weta Digital visual effects supervisor and creative director – special projects, to explain how the team used Unreal Engine for the short.

b&a: What were the things in Unreal Engine you particularly wanted to experiment with, and what surprised you about what could (or couldn’t) be done?

Keith Miller: We really wanted to accomplish a deep dive into a couple of main areas – ongoing real-time fur exploration and using Unreal for rendering linear media. The former was really just an extension of the initial work we started in 2019 with Caesar and Maurice.

We’d been trying to target a GDC fur demo working in 4.25, but plans for a big showcase fell apart early in the year with COVID. We continued talking with Epic’s fur team and wanted to pick things up again with the new work going into 4.26, to make sure we were able to help push that development and ensure it handled our needs.

In parallel, it seemed like a great opportunity to approach Unreal as a framework for rendering linear media. We’d created a number of interactive experiences previously and we knew there were workflows there that were likely to benefit production of ‘offline’ rendered media in the engine as well.

We were definitely surprised at what the engine was able to handle with the new strand system. The overall fidelity, combined with the performance, allowed us to really push the limit and directly start to integrate the types of dense grooms Weta Digital typically produces for VFX work. The feathers were the main challenge, primarily due to the real-time rigging requirements and the structure of individual feathers. Nathan Farquhar, our modeller/groomer, worked out the setup for the geometry-based rachis/spines as part of the skeletal mesh, enabling us to bind the barbs as strands.

b&a: Can you talk about what you were able to accomplish with meerkat fur/hair and the eagle feathers? How far could you take these in real-time?

Keith Miller: It was probably back in 2017 that we first started exploring fur in game engines. Back then, you didn’t really have many options outside of labor-intensive card setups that always required subjective interpretation to translate from the original grooms, at least for the type of heavily-furred VFX assets Weta Digital frequently works with.

This was also true of some of the earlier shell-based systems out there at the time. NVIDIA had HairWorks of course, but development stopped and it was always difficult to incorporate into more recent engine updates. Once Epic indicated their planned developments in the strand rendering space we immediately got on board, and we’ve been amazed at what we’ve been able to make work.

As noted in our talk at the Virtual Production Summit, we were able to avoid the type of aggressive optimization that might have been required for most interactive products. Since we knew we could render “offline”, we were able to preserve as much detail as we wanted and use the 400k-strand meerkat groom and 2.5M-strand eagle groom we rendered with. That said, we were still getting solid interactive performance in the editor for most of the content. In the few shots close to camera where performance was an issue, we simply managed a swap to a lower-resolution groom as required.


b&a: The lighting and camera movement for the piece were fantastic – can you run down how you approached these in-engine?

Keith Miller: Richard Frances-Moore, Weta’s Senior Head of Motion, blocked out the initial cameras from the storyboards against chess-pieced motion during staging early on. This work was all done in Maya, and Ludo Chailloleau – our Animation Supervisor – picked things up from there and continued to develop the cameras and animation throughout the project.

That’s the beauty of real-time workflows. We had time to continue to refine things in motion all the way down to the wire really, whereas with traditional feature film VFX, it’s usually “pencils down” much earlier due to the downstream dependencies. We would even continue to refine the cameras in the engine and add additive tracks in Sequencer to nudge them around as required to tweak framing in individual shots.

Since the motion was continuous throughout, animation would work with a single ‘uber camera’ that represented the entire short, giving them some real efficiency. Whenever we wanted to update the cameras in the engine, we would just run a script that would chop up the camera into individual shot cameras based on an EDL and bring those into Sequencer and into the shot tracks.
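To make that EDL-driven splitting step concrete, here’s a minimal sketch (not Weta Digital’s actual pipeline script) of the general idea: parse the record-in/record-out timecodes from a CMX 3600-style EDL into per-shot frame ranges, which could then be used to slice the continuous uber-camera animation into per-shot camera cuts. The `FPS` constant, the `Shot` container, and the shot-naming scheme are all illustrative assumptions.

```python
# Hypothetical sketch: derive per-shot frame ranges from a CMX 3600-style EDL.
# Not Weta Digital's actual tool; names and frame rate are assumptions.
import re
from dataclasses import dataclass

FPS = 24  # assumed project frame rate

@dataclass
class Shot:
    name: str
    start_frame: int
    end_frame: int

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode into an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Matches a video cut event line and captures the event number plus the
# record-in/record-out timecodes (the last two timecodes on the line).
EVENT = re.compile(
    r"^(\d+)\s+\S+\s+V\s+C\s+"
    r"\d{2}:\d{2}:\d{2}:\d{2}\s+\d{2}:\d{2}:\d{2}:\d{2}\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def shots_from_edl(edl_text: str) -> list[Shot]:
    """Turn each EDL cut event into a named per-shot frame range."""
    shots = []
    for line in edl_text.splitlines():
        m = EVENT.match(line.strip())
        if m:
            num, rec_in, rec_out = m.groups()
            shots.append(Shot(f"shot_{int(num):03d}",
                              tc_to_frames(rec_in),
                              tc_to_frames(rec_out)))
    return shots

# Each Shot's frame range would then drive the slicing of the continuous
# uber-camera animation into an individual shot camera and Sequencer track.
```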

Our CG supervisor Thelvin Cabezas did some inspiring lighting work. We studied reference of the Kalahari and tweaked the lighting and Megascans assets to maintain that beautiful red saturation you see in the bounce lighting on the rock structures in similar regions. We opted for Lightmass and baked lighting for the indirect, while the direct lighting was kept live. Shot-specific lighting tweaks were laid out in the sequencer shot tracks, as well as shot-specific controls for various lighting/rendering settings.

b&a: What sort of team was involved? Did you give the piece a ‘name’? What do you think this all means for changing any traditional VFX workflows right now?

Keith Miller: It was a very small core team overall. Once creative was established, it was one main animator throughout, one artist handling modelling and grooms, and primarily one artist handling environment / lighting / FX as required. Oh, and a couple of creature TDs for shorter windows handling the eagle / meerkat rigging requirements.

We did not give the piece an official title as we didn’t set out trying to make this a major production. We wanted to emphasize the workflows and processes behind the scenes and just see where we landed.

With respect to those processes, I don’t know that VFX facilities are going to be abandoning their path tracers just yet, at least for traditional VFX. I do, however, think we will continue to see an increase in the utilization of real-time technology across a number of areas in the filmmaking process. I think as Unreal continues to improve the technology and embrace and refine the processes, we’ll continue to see widening adoption and ever-increasing potential use-cases for VFX and filmmaking in general.

