How and why the studio built a proprietary ray tracer
When Gravity was released in 2013, one of the major technical achievements by Framestore in crafting the film was integrating the Arnold renderer into the studio’s pipeline.
Now, some six years later, Framestore has followed in the footsteps of several other VFX and animation studios and developed its own proprietary ray tracer. It’s called freak.
befores & afters sat down with Framestore global head of CG Mark Wilson and head of rendering Nathan Walster to find out how the studio embarked on the in-house rendering project, what projects it has already been used on, and just where that name ‘freak’ came from.
b&a: Why did Framestore decide to make its own in-house renderer?
Mark Wilson (global head of CG): Well, when we took on Gravity there was a need to do multi-bounce path tracing for the film. At that specific time, we were evaluating different renderers to see what would be the best ones to give us that look. We were a RenderMan house at that particular point, and PRMan was only just starting to develop its ray tracing engine, so it was early days there. We evaluated V-Ray and Arnold. V-Ray and Arnold performed comparably well then, but from a workflow and pipeline point of view, Arnold, with its pedigree of coming out of Imageworks, had a more attractive workflow that was similar to how we were currently working in PRMan. So it fitted into our pipeline quite well.
We made the big decision then to switch to Arnold. Throughout the course of Gravity, we spent a lot of time developing our shaders and workflow in Arnold. That development went quite far in terms of all the things that were developed inside the Arnold renderer. Even after Gravity we carried on developing a lot of shaders and then implementing our own lights, implementing our own ray integrators, to the point where, really, we were using Arnold to tell us where the ray hit a piece of geometry. And then we were taking over pretty much everything, even with our own volumes.
It got to the point where it wasn’t a crazy discussion to ask, well, could we replace the Arnold part of that with something else? That’s when the idea of freak was born. And then Nathan went on a voyage of discovery to find a good ray intersector that could replace Arnold.
Nathan Walster (head of rendering): We got to this point where we had Arnold as our ray intersector and we’d built so much technology on top of it: integration, BRDFs, volumes, lights. We questioned, what would we actually need to change to just have the whole thing ourselves? It was also good timing. Embree was coming out from Intel, and some of these open source technologies were maturing to the point where they were production-ready. So we started piecing those things together and fleshing out a renderer.
In some ways, freak is a hybrid because it started as something that piggybacks on top of Arnold and has then become its own thing where we swapped in other pieces. So in that respect it remains very modular. There’s no need inside the render itself for it to be based on Embree or anything like that – we can swap these pieces around – maybe that’s why ‘freak’ is a good name for it!
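The intersector-swapping idea Walster describes can be sketched, very loosely, as a pluggable interface: the renderer only asks “where does this ray first hit geometry?”, so the component answering that question can be a brute-force loop, Embree, or another engine. The names below (`Intersector`, `BruteForceIntersector`, `Sphere`) are hypothetical illustrations, not Framestore’s actual API:

```python
import math
from dataclasses import dataclass
from typing import Optional, Protocol


@dataclass
class Ray:
    origin: tuple    # (x, y, z)
    direction: tuple  # (x, y, z), assumed normalized


class Intersector(Protocol):
    """The only contract the renderer relies on: nearest-hit queries.
    Any backend (naive loop, Embree-based, etc.) can satisfy it."""
    def first_hit(self, ray: Ray) -> Optional[float]:
        """Distance along the ray to the nearest hit, or None."""


@dataclass
class Sphere:
    center: tuple
    radius: float


class BruteForceIntersector:
    """Naive stand-in backend; a faster implementation could be
    swapped in without touching the code that calls first_hit()."""
    def __init__(self, spheres):
        self.spheres = spheres

    def first_hit(self, ray: Ray) -> Optional[float]:
        nearest = None
        for s in self.spheres:
            # Solve |o + t*d - c|^2 = r^2 for the smallest positive t.
            oc = tuple(o - c for o, c in zip(ray.origin, s.center))
            b = 2.0 * sum(d * e for d, e in zip(ray.direction, oc))
            c = sum(e * e for e in oc) - s.radius ** 2
            disc = b * b - 4.0 * c
            if disc < 0.0:
                continue  # ray misses this sphere
            t = (-b - math.sqrt(disc)) / 2.0
            if t > 1e-6 and (nearest is None or t < nearest):
                nearest = t
        return nearest


scene = BruteForceIntersector([Sphere((0.0, 0.0, 5.0), 1.0)])
t = scene.first_hit(Ray((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
print(t)  # 4.0: the ray hits the front of the unit sphere at z=4
```

The design point is that everything above the intersection query (shading, lights, integration) is written against the interface, which is what made swapping Arnold out for Embree-based code feasible.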
Mark Wilson: Also, there are several driving forces behind making your own renderer, one of which is obviously the cost of buying someone else’s software. Generally speaking, it quite often works out cheaper to buy software than to develop your own: although you pay license costs, you get all of the support, the ongoing development, and the result of lots of people providing feedback to that software company.
But when you’re doing very large scale rendering, you have some very specific requirements based on certain projects, things you’re trying to get through. Even on Gravity, we ended up developing the precursor to our bi-directional path tracer integrator, which wasn’t available in Arnold at the time. So you start to branch away from where the core software is. Having freak gives us a way to really develop the areas that are important for our productions. Having our own platform where we can develop the latest rendering and shading technology in that particular area just gives us a bit more flexibility.
b&a: Is there a particular way that you describe freak, in terms of a ray tracer?
Nathan Walster: I don’t think there’s anything unique to freak in that way. It’s not a spectral renderer, for instance. We support all manner of integration. So, it could be a path tracer, it could be anything. Our lead engineer on the project, Jose Miguel Esteve, was keen to get across the idea of modularity and how it’s built of components that we can just swap around.
b&a: What’s the general timeline for when freak came into existence?
Mark Wilson: I think like all software projects, we’d hoped it would roll out sooner than it actually did. But the first use of it in production was on Alita: Battle Angel, where everything was rendered with freak. The first rollout of freak didn’t have support for effects rendering. It couldn’t do volume rendering at that point, and particle support was limited. But luckily the work we did on Alita was a lot of environment rendering and not a huge amount of effects.
We did have some effects rendering to do – actually, that was what was cool about developing freak, as we switched over to the renderer, we had our complete shader library and ray integration already in existence to be run in Arnold. So we could actually render the same frame in both freak and Arnold and get a pretty good pixel match from both renderers. It enabled us to render the majority of shots in freak, and then for any effects passes we could actually use Arnold to fill in the gaps.
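The dual-renderer validation Wilson describes, rendering the same frame in both engines and checking for a pixel match, can be sketched as a simple tolerance comparison over the two framebuffers. This is a hypothetical illustration of the idea, not Framestore’s tooling, and the tolerance value is an assumed threshold:

```python
def max_pixel_difference(frame_a, frame_b):
    """Largest absolute per-channel difference between two frames,
    each given as a list of (r, g, b) float pixels in the same order."""
    return max(
        abs(ca - cb)
        for pa, pb in zip(frame_a, frame_b)
        for ca, cb in zip(pa, pb)
    )


def frames_match(frame_a, frame_b, tolerance=1.0 / 512):
    """True when every channel agrees within the tolerance.
    The default threshold is illustrative, not a Framestore number."""
    return max_pixel_difference(frame_a, frame_b) <= tolerance


# Toy 2-pixel "renders" of the same frame from two engines:
arnold_frame = [(0.25, 0.50, 0.75), (0.10, 0.20, 0.30)]
freak_frame = [(0.25, 0.50, 0.75), (0.10, 0.20, 0.301)]
print(frames_match(arnold_frame, freak_frame))  # True: within tolerance
```

In practice a comparison like this would run per render pass, which is what let one engine fill in passes (such as effects) that the other couldn’t yet produce.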
Then once that first project was in production in freak, everything snowballed and got a lot faster. We were obviously focusing on getting the effects rendering in place. Once that was done, everything quickly switched to freak rendering. However, it wasn’t a case of, ‘Oh, we’ll just turn off our nodes and then try and render with freak.’ It was very much a transition period and it pushed freak a lot. It had to match Arnold, and that was a very high bar to set.
Nathan Walster: It was actually quite a nice way to develop something, though, because you know you’ve got this target that you’ve got to hit, but you’ve also got a safety net.
Mark Wilson: Since we’ve switched to freak fully, i.e., since after Alita, we don’t have that option anymore! But that’s also where the sweet spot of freak really comes in. Now we’ve got everything that the studio needs to be able to deliver big shots. Now we can do the interesting things in terms of developing it further and customizing it for our productions.
b&a: Had you tested freak on any other production earlier?
Mark Wilson: During Thor: Ragnarok, we were taking some of the shots and running them through freak to test how it was going and identifying any areas that we needed to push further and do the Arnold comparison with performance. On Avengers: Infinity War, we were running renders through freak as well. We actually had hoped that we would run Infinity War solely through freak, but a couple of things meant that we only ran some of it through freak, and the majority through Arnold.
Freak was in a really good spot, but then you get down to discrepancies in subdivision surfaces with displacements and micro-bump detail.
If there was a slight pixel difference, then people would be flagging it. Nathan and his team would have to go through the pain of figuring out why half a pixel was wrong. Although that meant we really were getting down to the detail, it also gave us a really clear goal to try and achieve. Without that, there could have been the potential where we would’ve rolled freak out with some problem areas in there.
Nathan Walster: Having those shows like Ragnarok meant we could just pick up that data and run shots. That was like a test environment that we were free to try out things in, and figure out why things weren’t working. Having those projects there accelerated the development even though we didn’t use freak directly on them.
Mark Wilson: It’s really important in developing any software internally – and this is something we’ve learned over various projects – it’s really important that when you do roll out your first version of the software, you’ve already generated a lot of data that it’s compatible with. It’s really frustrating to roll out a piece of software and then realize you actually haven’t got any data to test it.
b&a: So have you fully moved onto freak for the whole of Framestore?
Mark Wilson: Everywhere in film, yes. Elsewhere it depends on the scope of the project. We recently delivered His Dark Materials for the BBC and HBO, and that was all rendered in freak.
b&a: Tell me about the name – I’m familiar with how Framestore likes to use ‘f’ at the beginning of its tool names – but how did ‘freak’ as a name come about?
Nathan Walster: Yes, we try and make most of our tools begin with ‘f’ at least, or ‘fr’. To be honest, the inspiration was the Chic song, ‘Le Freak’ [laughs].
b&a: I remember covering Gravity and the advent of your use of Arnold and how, of course, that involved a major shader writing exercise. Did you have to do that all over again, or was it a more streamlined process since you’d already been adapting Arnold to your pipeline so much?
Nathan Walster: Actually, the shader development that was done for Gravity when we first moved to Arnold was, in some ways, quite simplistic compared to what we’re doing nowadays. That has all been re-written and done again as we built on top of Arnold, so we never had to re-write everything just for freak. It was designed in a way where, because we were piggybacking on top of Arnold, we could slowly migrate it across. We’ve ended up with a system which is extremely flexible and able to accommodate state-of-the-art shading models.
As Mark said, now every show has moved to freak, we can start to reap the rewards of that. I think now’s the time where we’re starting to look at, how can we push all this kind of shading and integration and look forward to the next five years. That’s really exciting for us at the moment.
Mark Wilson: In our roadmap, phase one was to be able to replace Arnold. This next phase is the exciting part where we can implement a lot of very custom optimizations for render times and also workflow, and tailoring it to the types of projects we’re doing. All of the heavy engineering work is out of the way.
b&a: There are a lot of new things happening in rendering, such as machine learning, GPU rendering, and interactivity. Where is freak at with these kinds of things?
Mark Wilson: Throughout the course of developing freak, Arnold has improved significantly. We’ve talked about how we replaced Arnold, but although we were aiming to replace it, we are constantly looking at what Autodesk is doing with Arnold and also what Pixar is doing with PRman. In a lot of ways we’re trying to catch them up, because we started from scratch when they already had mature products. They’re already in that sweet spot of adding cool new technology. We’re following their developments very closely and it’s really interesting where they’re going.
There are a few big things in rendering that we’re looking at. One of those is GPU accelerated rendering, which is interesting to us from an artist interaction point of view. We’ve been debating how we can fit some of our production scenes in the memory that is available to the GPU. A lot of our production data is enormous.
Also, rendering on render farms or in the cloud is something we’re looking at. Generally speaking, GPU solutions are significantly more expensive. So we’re keeping an eye on that, but that’s not an area of development that we’re pushing to catch up with or implement in freak. Our workflow for production doesn’t necessarily demand that so much.
One of the key things that we’re trying to improve massively is our artist interaction for when they’re doing look development and lighting set-up and how we get to that first pixel faster. Currently our IPR isn’t fantastic and we’re looking at adopting USD solutions to make that a lot better. That’s the big thing internally, to make a really nice interactive environment for working with freak.
In terms of machine learning, we have a machine learning department at Framestore. Their goal at the moment – their key projects – aren’t really around rendering. There are things on the periphery but nothing with the actual ray tracer itself just yet.
What we have with freak and our render pipeline at the moment is the ability to deliver large scale. We can deliver a lot of shots with a lot of complexity very efficiently, which I think is a really good first milestone to have. And now we are looking at, how can we make the artist experience as interactive and as fast as possible.
b&a: One thing that I think is interesting about rendering is the community. Do you see freak getting out there more in terms of, say, discussion at SIGGRAPH or in the general CG community?
Nathan Walster: I hope in the future we can have a bit more of a presence at places like SIGGRAPH now that we’ve got this technology. I think maybe it’s a kind of reserved ‘Britishness’ that has kept us from talking about it until now. At Framestore we tend to wait until we’ve got something that we consider really cool to talk about before we do.
Mark Wilson: Now that we’ve got the foundation work done, we’re hoping to be in a better place to shout about all the latest things we’re doing. And hopefully next SIGGRAPH we’ll be talking about our amazing interactive lighting in freak!
b&a: Did you have a Eureka moment in the development of freak where perhaps you saw something rendered for the first time and went, yep, we’ve done it?
Nathan Walster: Well, I think the Eureka moment for us was when you see someone other than you, say, an artist who has actually delivered a shot, and it’s turned up in dailies, and you wouldn’t realize it was rendered with your renderer. There was this cross-over point around Alita where that happened and it was fantastic.
We’ve also got, on the other end of the spectrum, what Jose calls the ‘gallery of horrors’, which is this folder on his desktop where he’s been compiling images as he’s been testing renders. Things like the Stanford bunny, but the head’s missing or something. [Editor’s note: Unfortunately Framestore didn’t supply me with any of these…]
Every step of the way has been exciting. You start off from nothing so it’s really cool to see any images. When you see a shot come out and know it’s going to go in the movie, and that our renderer was behind it, that’s a really cool thing.