Join the VFX community by becoming a b&a Patreon...and get bonus content!
The details on Unity’s journey into real-time ray tracing.
At GDC 2019, Unity announced that it had partnered with NVIDIA to offer real-time ray tracing in the game engine. This was showcased via a 2019 BMW 8 Series Coupe demo, with a CG car rendered in Unity’s High Definition Render Pipeline (HDRP) with an NVIDIA RTX 2080Ti card.
Ray tracing has been something of a holy grail for real-time game engines – Unreal Engine introduced the capability through its own NVIDIA partnership last year – but it now feels more achievable than ever. befores & afters asked Natalya Tatarchuk, VP of Graphics at Unity Technologies, about the decision to pursue real-time ray tracing in Unity.
b&a: Before this real-time ray tracing announcement, what were some of the ways Unity or its users were doing real-time ray tracing with the engine?
Natalya Tatarchuk (VP of Graphics, Unity Technologies): Last year we harnessed the power of real-time ray tracing technology to accelerate lighting workflows: our GPU progressive lightmapper bakes lighting using a real-time path tracer that executes on the GPU. With that, we brought the efficiency of a GPU real-time ray tracing implementation to content creators’ workflows.
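The “progressive” idea behind such a lightmapper can be sketched in a few lines: each pass adds one more noisy path-traced sample per texel, and the running average converges toward the true lighting value, so artists see an immediate preview that refines over time. This is an illustrative toy, not Unity’s GPU implementation; the names and the stand-in `TRUE_LIGHTING` value are assumptions.

```python
import random

# Toy sketch of progressive lightmap baking (NOT Unity's implementation):
# each pass contributes one unbiased-but-noisy path-traced sample, and the
# running average of the samples converges toward the true lighting value.

random.seed(7)
TRUE_LIGHTING = 0.5          # stand-in for the lighting the texel should receive

def noisy_sample():
    """One path-traced sample: unbiased but noisy estimate of the lighting."""
    return TRUE_LIGHTING + random.uniform(-0.4, 0.4)

accumulated, passes = 0.0, 0
for _ in range(10000):       # each iteration plays the role of one progressive pass
    accumulated += noisy_sample()
    passes += 1

estimate = accumulated / passes
print(round(estimate, 2))    # converges close to 0.5 after many passes
```

Early passes are visibly noisy; the estimate tightens as passes accumulate, which is why a progressive baker is useful for interactive iteration.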
But this was just the beginning of our real-time ray tracing journey. We needed to make amazing, offline-quality professional graphics accessible to all our users at real-time frame rates, so they could take full advantage of ray tracing technology. With real-time ray tracing, interactive graphics reaches new levels of realism, allowing for truly dynamic global rendering effects not possible before at fast frame rates.
This year, we focused on expanding the toolbox shipping with our High Definition Render Pipeline (HDRP) to provide a wider variety of sophisticated tools to users. Not every situation calls for real-time ray tracing, so we’ve been considerate when it comes to forcing innovation before the technology is in place to use it. Now, with the introduction of faster, more performant GPUs such as NVIDIA’s RTX line, we are nearing a place where real-time ray tracing is ready for adoption by developers at large.
In 2017 we introduced the High Definition Render Pipeline in Preview, which is our raster-based rendering solution that unlocks beautiful graphics and high performance on modern hardware. Like the rest of Unity, it was built to be extensible, so it makes the perfect foundation on which to build our real-time ray tracing solution. In addition to being highly performant on consumer hardware, it also helps solve one of the biggest bottlenecks in asset creation, which is the pipeline for creating high-fidelity content. Our real-time ray tracing solution allows creators to author once, and then determine whether they want to utilize the result for raster-based creations or for real-time ray tracing.
Our ray tracing tech is built on top of our High Definition Render Pipeline with C# scripts and shaders, quick to reload and iterate. Our solution is built with choice, flexibility, and power in mind. We provide a wide variety of ray trace effects for you to play with.
C# and shader workflows give developers great flexibility in how they make changes and dig into the source, as they have full source code access to our High Definition Render Pipeline implementation, including the real-time ray tracing tech. With C#, one of the main advantages for graphics developers is the super quick hot-reload iteration loop for fast feedback.
With this approach, we provide a solution that maximizes performance and gives great flexibility and control to developers themselves. Of course, one of the biggest advantages of Unity’s scriptable render pipeline architecture is the ease with which we make graphics algorithms and pipeline architecture accessible for users’ customization. One such example is the effort the MPC team took to build real-time ray tracing on top of the High Definition Render Pipeline for their production needs.
b&a: What tools and techniques come together to enable real-time ray tracing with Unity? What were the harder things you and partners had to solve to make it possible?
Natalya Tatarchuk: Last year we introduced the High Definition Render Pipeline, which is our raster-based solution capable of producing beautiful visual fidelity on modern consumer hardware. We built this technology to be forward-thinking and future-proof, and to that end, we are building our real-time ray tracing solution on top of this scaffolding. This actually solves one of the hardest parts of asset authoring, which is the creation of a reliable pipeline of gorgeous assets. With our solution, you can author once and create assets for both raster and ray-traced situations.
We also had to solve the problem of authoring shaders in our Shader Graph tool for both rasterization and ray tracing rendering, as well as making sure that our real-time ray tracing solution supports the full variety of production materials our users rely on to build powerful real-time content within the High Definition Render Pipeline. And, of course, we had to work closely with partners to enable fast, efficient real-time ray tracing in our rendering architecture on the engine side, so we can handle complex, dynamic scenes at runtime.
b&a: Why did you want to introduce real-time ray tracing in Unity? Where do you see users taking advantage of it?
Natalya Tatarchuk: Rasterization brings a fair amount of advantages to rendering efficiency. In our High Definition Render Pipeline, we are already performing physically accurate computations for lighting and shadowing. But in the rasterization world, we only operate on small, local parts of the scene. These approaches are just clever approximations that are fast on console hardware.
A trained eye can easily spot where they don’t match reality. If you don’t see the object on the screen, you don’t see its reflection. It’s just not there. And this is a common set of artifacts with all screen-space implementations, for example, with screen-space reflection algorithms.
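The screen-space limitation described above is easy to demonstrate: a screen-space reflection can only look up surfaces already rendered into the frame buffer, so anything outside the frame simply cannot appear in a reflection. The following is a deliberately simplified 1-D sketch of that idea (the buffer, function names, and values are illustrative assumptions, not any engine’s actual SSR implementation).

```python
import math

# Minimal sketch of the screen-space reflection limitation: the lookup can
# only return surfaces that were rendered into the (here 1-D) depth buffer.
# Anything that projects outside the screen is invisible to the reflection.

WIDTH = 8
depth_buffer = [math.inf] * WIDTH   # depths of on-screen surfaces
depth_buffer[2] = 1.0               # one visible object at pixel 2

def ssr_lookup(x):
    """Screen-space lookup: a hit only if x in [0, 1) maps to a filled pixel."""
    pixel = int(x * WIDTH)
    if 0 <= pixel < WIDTH and math.isfinite(depth_buffer[pixel]):
        return depth_buffer[pixel]
    return None  # off screen (or empty): the reflection is simply missing

print(ssr_lookup(0.3))   # on-screen object -> its depth is found (1.0)
print(ssr_lookup(1.2))   # off-screen object -> None, no reflection possible
```

A real SSR implementation ray-marches through the depth buffer rather than doing a single lookup, but the failure mode is the same: the buffer holds no information about off-screen geometry.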
Real-time ray tracing moves real-time graphics significantly closer to realism, opening the gates to global rendering effects never before possible in the real-time domain. We know that not everyone will implement this tech in their creations, but for those who do, we wanted to make the option available.
With the addition of real-time ray tracing, developers are going to be able to implement global illumination, reflections, and refractions that further blur the line between real life and real-time. With our real-time ray tracing integration in the High Definition Render Pipeline, you truly get accurate global reflections or full-scene lighting bounce. It doesn’t matter if something is visible in the frame or not. You will still see its reflections. We give you the ability to render primary rays with rasterization or ray tracing [like for a full path tracer] for best quality.

With that, you can get proper refractions for multi-layer transparent materials, or global dynamic shadows, or global occlusion – we’re now in the global space, where it’s all accounted for. You also get the full gamut of HD materials to play with: opaque, transparent, alpha-tested, double-sided, with full Shader Graph support. And, of course, dynamic real-time global illumination. And you also get great scaffolding for building custom effects with real-time ray tracing technology on top of our High Definition Render Pipeline.
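The contrast with the screen-space approach comes down to where the visibility query runs: a ray-traced reflection tests the actual world-space scene geometry, so an object outside the camera frustum still shows up in a mirror. Below is a minimal ray/sphere intersection, the core query behind such reflections; the scene layout and function are illustrative assumptions, not Unity’s HDRP code.

```python
import math

# Hedged sketch: a world-space ray/sphere intersection, the basic visibility
# query behind ray-traced reflections. Because it tests scene geometry rather
# than a screen buffer, off-screen objects still contribute to reflections.

def intersect_sphere(origin, direction, center, radius):
    """Return distance t along the ray to the sphere surface, or None on a miss.
    `direction` is assumed normalized, so the quadratic's a-coefficient is 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A sphere far off to the side, well outside any camera frustum:
off_screen_sphere = ((10.0, 0.0, 5.0), 1.0)

# A reflection ray bounced off a mirror toward that sphere still hits it:
t = intersect_sphere((0.0, 0.0, 5.0), (1.0, 0.0, 0.0), *off_screen_sphere)
print(t)  # 9.0: the off-screen sphere contributes to the reflection
```

An acceleration structure (such as the BVHs that RTX hardware traverses) makes this query fast over millions of triangles, but the principle is the same single intersection test.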
We see users harnessing the power of real-time reflections to bring sophistication to their lighting pipeline, enabling true area light soft shadows. We also see our users getting superb reflections with our solution, creating incredible fidelity in material responses. The visual quality of transparents available with our real-time ray tracing is also a completely new level of realism previously unachievable in real time – we know our users will be able to harness that for realistic scenes and incredible art direction. Lastly, we know that many productions would love to rely on dynamic global illumination for fully dynamic or procedurally mixed worlds – our real-time ray tracing dynamic global illumination solution lets them harness that without having to author the dreaded lightmap UVs (something many developers are excited to be done with!).
b&a: One of the obvious challenges with real-time ray tracing is having the hardware/GPUs/etc to be able to produce good results, which can be expensive. What are some of the options available for everyone to take advantage of real-time ray tracing?
Natalya Tatarchuk: You’re right. In large part, real-time ray tracing was out of reach of the common creator because the hardware was too expensive. Now, thanks to advancements by partners such as NVIDIA, we are reaching a level of affordability that hasn’t previously existed. And as the prices of GPUs come down, the number of people who have access to and can enjoy the technology will only increase. In particular, wider availability of even mobile parts that support ray tracing technology makes this accessible to a broader range of our customers.
The preview release of NVIDIA RTX in Unity is planned for later this year, but an experimental release is available right now. Unity has also just posted a breakdown of how they made the BMW demo in this blog post.