befores & afters got an update from Unity itself.
In November 2021, Unity announced that it was acquiring Wētā Digital. The deal covered Wētā Digital's artist tools, core pipeline, intellectual property and engineering talent, while Wētā FX would continue as a standalone entity.
Ever since the Wētā Digital acquisition, the VFX and real-time communities have been keen to know what its impact would be. For example, would tools developed at Wētā become part of Unity? Would they be available to other studios and artists to use?
In the wake of the release of Avatar: The Way of Water, where so much of Wētā Digital’s R&D, tool development and artistry is on display, befores & afters spoke to Allan Poore, Senior Vice President, Wētā Tools at Unity, about the state of play following the acquisition.
Here you’ll read about what’s already been happening and what some of the next plans are with Unity Wētā Tools.
b&a: I think people are interested in where the relationship between Wētā Digital and Unity is at. Can you share some info on that?
Allan Poore: We've recently made good progress productizing three tools, which are going into alphas and betas right now: Wig, deep comp and the Eddy tool. Obviously, that's just the initial push of the tools we want to release. We also want to bring a lot of the other tools from Wētā into customers' hands.
How we do that for each of these, and how we make them usable outside the Wētā pipeline and more broadly applicable, is what we're spending time on. We're looking at things like Loki, Manuka and these other big pieces of technology that we can bring in, too. We might not bring them over exactly as they exist today, but the premise of what they are, what they can do and how they plug into other tools [is how we'll implement them].
As we've gone around, we've talked to a lot of studios. The message we're really clear about is that we want to provide modular pieces that can plug into existing pipelines. We're not going to force anybody down one particular path. It's about bringing those tools in so studios can plug them into their existing pipelines without major disruption.
b&a: Are there any plans for a release of Wētā Digital tools into Unity or anywhere else?
Allan Poore: There are three or four that we're planning to roll out by the end of summer. You'll start to see a lot of that. Some of them are already in studios today, where we're doing early testing, getting feedback and working through those pieces. So you'll start to see those roll out in the Q2 timeframe.
b&a: One thing that people are interested in is, of course, that Unity is a real-time gaming company, and some of these tools are not real-time tools. How will that play out?
Allan Poore: Well, these worlds are converging more and more all the time, and we'll get there. You're not going to have something like Manuka and what it can output in real-time anytime soon, but there are a lot of things you can do up front to get there. We talk a lot about time to first decision for an artist, and how you can make that as quick as possible.
We are spending a huge amount of time seeing what we can do to make these things real-time while preserving fidelity, where it makes sense, because I think that's really important in both the visual effects and gaming industries. A lot of time has been spent on how we can improve the HDRP (High Definition Render Pipeline) in Unity to pick up these pieces and do different things at higher scale and fidelity in real-time, in ways that apply to both these markets.
b&a: Specifically on The Way of Water, there were so many advancements made. How do you feel Unity and Wētā Digital might leverage some of the tools that have been developed since literally 2017?
Allan Poore: Yeah, there's a lot in there. We're looking at how we can pull some of that tech across. For example, Wētā Digital made major improvements to its facial animation system, and we have a similar solution called Ziva that is shipping today. We need to pull the best from both of these technologies to create even better products for our customers.
We're looking at: what are the overlaps? Where can we have the best of both worlds in these pieces of technology? By combining them, could they be a better long-term solution for both visual effects and gaming? Those are the things I see us pulling across. The facial animation and some of the water dynamics are, I think, a little more challenging to bring over, but that's something we want to do moving forward.
b&a: Just to revisit how the acquisition worked, how are R&D and tech projects managed between Unity and Wētā Digital? How does it work?
Allan Poore: Basically, Unity acquired the tools, technology, pipeline, all of those kinds of pieces as part of the acquisition, which is a year and a half ago now. But also, the 275 engineers report up through me in the group. So the roadmaps are being defined right now for the next year, and we're looking at: where do we plug in? What are things that we can share? What are the pieces we can translate over? Where is there alignment? Where is there a disconnect?
So those are the kinds of things we're looking at: where can we find common ground on what makes sense to productize, or even start to build tools in ways that let us pull them out of the studio faster and get them to people.
b&a: Just finally, I've covered a lot of virtual production work done at Unity, and now you have Wētā Digital there too. What can you say about that overall ecosystem in terms of virtual production? I think it's another exciting thing for VFX artists and filmmakers to be able to draw from.
Allan Poore: I believe we'll continue to see lots of innovation in virtual production. I think that's also married directly to real-time, which Unity has a lot of experience in. How those two things merge together, and what that looks like, is something we're working through. Talking to the folks at Wētā FX and other places, there's a large investment in this area moving forward, and in rethinking filmmaking itself.
We'll be partnering with them to work through that and decide which pieces of technology we can pull in from our real-time engine that could apply to the work they're doing, and vice versa. The plan is then to productize them, giving everyone access.