Exclusive excerpt from ‘The Virtual Production Field Guide’

How previs and virtual production meet, with Happy Mushroom’s Felix Jorge.

Earlier today, Epic Games – the makers of Unreal Engine – released an in-depth field guide to virtual production. It’s available for anyone to download as a 90-plus page resource about how virtual production can impact the work of filmmakers, visual effects artists and others in the industry, and it includes interviews with VFX supes, directors, stunt coordinators, art directors, and previs professionals.

This includes Felix Jorge, co-founder and creative director at Happy Mushroom, who was also a virtual art department supervisor on ​The Jungle Book​. In this exclusive excerpt from his interview in ‘The Virtual Production Field Guide’, Jorge discusses how real-time engines have become part of his workflow, including in terms of the much-hyped LED wall virtual production. Check out the interview below and download the PDF of the field guide at uevirtualproduction.com.

Felix Jorge.

Can you describe your current virtual production workflow?

Felix Jorge​: We’re living inside of Unreal Engine. And so we do photoreal environments, and we interact with production designers, art directors, and directors inside of the engine. The fidelity that we’re getting is changing everything.

I was a previs artist for eight years, and we used Maya and MotionBuilder. Part of my decision to found Happy Mushroom was about bringing that pipeline into the engine and showing it as a stronger way to do previs.

Do you see real-time engines transitioning from previs to production?

Felix Jorge​: Everything we build asset-wise, we do it for the traditional visual effects pipeline and the real-time pipeline. So we build assets that live in both worlds because we were native visual effects artists. But then that same previs asset, if it’s photogrammetric or if it’s photoreal, goes straight into a visual effects pipeline.

Is it easier to adapt assets originally developed for post-production into real-time or the other way around?

Felix Jorge: It is a lot easier to start from the get-go knowing which pipeline you want. When we start on a job and we know that we're going to build these assets for production, we make sure that our high-res maps are there. We also structure everything in a certain way.

We’ve also built tools to track every single element, so that the moment it’s time to deliver, we push a button and it’ll collect all the higher-res assets and the real-time assets and send them to a post-production house. It’s more manageable if you plan and you know that’s the pipeline you’re trying to follow.
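To make that delivery step concrete, here is a minimal sketch of what a collection tool along these lines might look like. It assumes a simple JSON manifest that maps each tracked element to its high-res (VFX pipeline) and real-time (engine) file paths; the manifest format, file names, and package_delivery function are illustrative assumptions, not Happy Mushroom's actual tooling.

```python
import json
import shutil
from pathlib import Path

# Hypothetical inputs: a manifest that tracks, per element, where the high-res
# (VFX-pipeline) files and the real-time (engine) files live on disk.
MANIFEST = Path("asset_manifest.json")   # hypothetical manifest file
DELIVERY_DIR = Path("delivery_package")  # hypothetical output folder


def package_delivery(manifest_path: Path, out_dir: Path) -> None:
    """Collect every tracked high-res and real-time asset into one delivery folder."""
    elements = json.loads(manifest_path.read_text())
    for name, paths in elements.items():
        for tier in ("high_res", "real_time"):
            for src in map(Path, paths.get(tier, [])):
                dst = out_dir / name / tier / src.name
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)  # preserve timestamps for versioning
    print(f"Packaged {len(elements)} elements into {out_dir}")


if __name__ == "__main__":
    package_delivery(MANIFEST, DELIVERY_DIR)
```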

How do you optimize real-time assets?

Felix Jorge​: We get the highest quality real-time asset because we kick out the highest possible cache maps or 3D meshes. Those aren’t touched, and those are the ones that you would use in a traditional visual effects pipeline. We always start with making an extremely high-res, high fidelity asset that looks photoreal, and then we create a real-time asset from that.
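As an illustration of that high-res-first approach, the sketch below derives a lighter real-time copy from a dense source mesh and exports it for the engine, leaving the original untouched for the traditional VFX pipeline. It uses Blender's Python API purely as an example DCC, since the interview doesn't name specific tools; the decimation ratio and export path are placeholder assumptions.

```python
import bpy

# Illustrative sketch only: assumes the untouched high-res mesh is the active,
# selected object in an open Blender scene. The high-res source stays on disk
# for the traditional VFX pipeline; only this derived copy goes to the engine.
obj = bpy.context.active_object
obj.select_set(True)

# Reduce triangle count for real-time use; the ratio is a placeholder budget.
decimate = obj.modifiers.new(name="RealtimeDecimate", type='DECIMATE')
decimate.ratio = 0.05  # keep roughly 5% of the original triangles

bpy.ops.object.modifier_apply(modifier=decimate.name)

# Export the lightweight version for the engine (path is hypothetical).
bpy.ops.export_scene.fbx(filepath="//asset_realtime.fbx", use_selection=True)
```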

How do you communicate the virtual production workflow to producers who are new to it?

Felix Jorge: Whenever we create environments, we do it in stages. We typically start with blocking, which comes from an art department, and since we live in a real-time ecosystem, before textures and photogrammetry we place cameras with people in headsets in the grayscale environment. This allows us to be informed by the director and the DP at the earliest stages. The real-time engine has provided an iterative process that enables us to build what is necessary and nothing more. We’re able to get better answers way faster than ever before.

Because we have an ecosystem where five people can each jump into a VR headset, we’re doing remote sessions over the Internet. We’re not just making previs assets; we’re making final assets that start at grayscale and progress to final quality assets that can carry through post-production. As a previs artist, I saw my work get thrown away all the time once it came time to do the final version. Now, we can do previs work that is more meaningful in the film.

What technological advancements helped enable the final pixel in real-time?

Felix Jorge: One big leap is lighting fidelity. We’ve been fighting in the engine for a long time, but this past year, with the addition of light baking and global illumination, we ran tests comparing V-Ray and Unreal. We have them posted on our website. The quality of the bounce light that Unreal Engine is giving us is as good as a final render from V-Ray or RenderMan.

How about tech advancements on the hardware side?

Felix Jorge​: We’re using NVIDIA 1070s and 1080s. We also use a swarm server and we bring it with us wherever we go. We have a couple of Intel processors, and they’re churning everything. The rendering has gotten a lot quicker and our video cards are tearing up entire scenes.

Do crew members take higher-quality previs more seriously?

Felix Jorge: As someone who started in previs and later got into finals, it used to be that previs and finals didn’t mix. Part of what’s happening now is that, because the fidelity of the engine has increased, the two areas are overlapping more. As soon as we started using Unreal for previs and improving the quality of our sets and our lighting, we got a lot more attention. People wanted to use the engine to get closer to their vision. They want to be able to inform a full production about their desired look and mood.

How does LED wall virtual production come into play?

Felix Jorge​: It’s going to change the way films are made entirely. We’re doing a lot of tests with it. We’re pushing the engine in a lot of different directions and testing the ray tracing aspects of it. We can hit 90 frames per second on a pretty standard computer and have photoreal, production-ready environments on a decent scale.

I see it as a similar business model to mocap where you can arrange a stage and get practically unlimited amounts of footage. It’s a revolution for asset creation, set design, production designers, and DPs. If you’re shooting in front of an LED screen into Unreal, you can modify your environment live on set. It feels like you’re shooting a production with classic rear projection as they did in the ‘40s.

It can also cut the amount of time by more than half if you’re on an episodic series. For the first season, you build most of your assets. In the second season, you’re going into the stage to shoot whatever you want. The savings per minute is just insane.

How do you align the virtual screen with the live action set for extensions?

Felix Jorge: We typically do photogrammetry of the area so that we have it at exactly its real size. Then on the day, when we bring in that 3D environment, we do a LIDAR scan, which gives immediate point data, and we align the environment using that point data. It’s virtually impossible to mess that one up once you do the LIDAR.
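For readers curious what aligning a virtual environment to a set scan "using the point data" can look like in practice, here is a minimal sketch that registers a photogrammetry point cloud to a LIDAR scan with point-to-point ICP via the Open3D library. The file names, voxel size, and distance threshold are assumptions for illustration only; this is not a description of Happy Mushroom's in-house pipeline.

```python
import numpy as np
import open3d as o3d

# Hypothetical inputs: a LIDAR scan of the physical set and a point cloud
# sampled from the photogrammetry-based virtual environment.
scan = o3d.io.read_point_cloud("set_lidar.ply")
virtual = o3d.io.read_point_cloud("virtual_env.ply")

# Downsample so ICP runs quickly; voxel size is scene-dependent.
scan_ds = scan.voxel_down_sample(voxel_size=0.02)
virtual_ds = virtual.voxel_down_sample(voxel_size=0.02)

# Start from an identity transform, i.e. both clouds exported in roughly the same frame.
init = np.identity(4)

# Refine the alignment with point-to-point ICP; 5 cm correspondence tolerance assumed.
result = o3d.pipelines.registration.registration_icp(
    virtual_ds, scan_ds, 0.05, init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("fitness:", result.fitness)
print("transform to apply to the virtual environment:")
print(result.transformation)
```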

What are the limitations of the live effects approach?

Felix Jorge​: Big blockbuster shots that are coming out from outer space or long panning shots are not the best shots for this kind of technology. But part of the beauty is because you’re making photoreal environments on Unreal, you can kick out those assets for visual effects. They can then rebuild your environment and create the shot within a traditional visual effects post-production pipeline. The difference with this new technology compared to old school rear-projection is it’s tracked and the parallax is real. So there’s a lot more subtlety and you’re able to get a lot more in camera than you’d think.

How do you involve the creative team in previs?

Felix Jorge: Directors and DPs are our biggest fans, because now we work with a production designer who is able to create an environment at a much higher fidelity than previs used to offer, and in a fraction of the time. We do a photogrammetry capture of the space, and then we cut it up, and the first pass looks fantastic. The director can then come in and jump in a headset, or literally just stand there, and look at an environment that is not only a 2D image but something representative of what might actually make it into the final product.

We find that directors are very attracted to that and want to direct and show their shots. They’re placing cameras, and everyone can see what they’re framing. You used to have to wait until the previs shot was rendered out; now everything’s happening live. Everyone’s more interested and engaged in working in the same live previs environment.

Part of what we’re working on right now is building infrastructure and making sure that we’re growing at a healthy rate. It’s a technical field, so we’re trying to build our engineering teams so that they can support the content team, which is super important, and maintain the speed at which we’re moving. I’m pretty excited about where the engine and the technology are headed.
