Join the VFX community by becoming a b&a Patreon...and get bonus content!
The story behind how Unity made the game in the film.
Shawn Levy’s The Adam Project includes several scenes of sixth grader Adam Reed (Walker Scobell) playing a spaceship video game made by his deceased father, Louis (Mark Ruffalo). At one point, the gameplay is intercut with the flight and crash of Adam’s older self (played by Ryan Reynolds) from the future. Young Adam’s experience flying the game ship also proves useful later in the film.
The spaceship gameplay was crafted, suitably, by Unity Technologies, which generated a real, playable environment and ship path that could be ‘played’ in order to produce shots for the film. Indeed, in recent times, Unity has used its real-time engine not only for game development but also for virtual production, especially visualization projects. Here, the engine was used to directly devise and iterate on the gameplay of the ship flying through canyons, and to render the final gameplay scenes shown in the film.
Habib Zargarpour, virtual production supervisor at Unity Technologies, who worked with visual effects supervisors Erik Nash and Alessandro Ongaro on the game, explores its creation with befores & afters. The video below is actually from a review session for the content.
b&a: What were you asked to do for this ‘game’ played and shown in The Adam Project?
Habib Zargarpour: There was a lot of flexibility in designing what the world looked like, and what Adam would see. We were also tasked with making it look a little like an 80s or 90s game, so we didn’t have to make it look like a super-polished, current-day game. The look of the game had to be slightly retro, matching the period when Adam’s father made it, and we stuck to the things that were possible then. Having said that, one of the shots does have motion blur! You wouldn’t expect that in the 90s. But we still had to make it filmic.
What was really interesting was harnessing what benefits you get from an actual game engine to do these things interactively. By the end of it, it was tempting to see if we could actually release a game, since we had the content. I mean, it would’ve been a very short game, but it was just enough material to explore the worlds.
b&a: Did the game need to ‘work’ during filming, or was it something you ultimately did in post?
Habib Zargarpour: It was sorted out in post. When they filmed the scene, they shot with a TV displaying a greenscreen. In the final film, there are actually shots where this game comes on full-frame. I was surprised to see that–it was great! The other exciting part was that the footage was exported directly from the Unity game engine as industry-standard, high-dynamic-range EXRs with alpha channels, which meant it could then go to a visual effects studio to be placed into shots.
b&a: How did the game get designed?
Habib Zargarpour: We received some concept art and then launched into more design of the world ourselves. We had to work out what happens in the scene, the different actions we wanted to pace out, how many enemy ships there would be, where they would come from, what would happen, and what the weapons would look like. The important part was the path the ship flies, what the camera was doing, and what the cockpit was doing.
We did a lot of tests, and this is where interactivity within the game engine really helps. It meant, in real-time, we could be flying around and we could change the path of where the ship was going at any time. What we also used was our virtual camera, in particular, the rail rig. We could fly along and then attach the camera to the cockpit, then do camera moves based on that.
b&a: What fascinates me about that is maybe someone watching the film might think this game animation was crafted in your normal 3D animation tools. But here you’re drawing upon your own previous virtual camera work and virtual production work to make it as dynamic as possible.
Habib Zargarpour: Absolutely. You could have used a fully traditional pipeline for this. But then it would’ve taken a lot longer and with a bigger crew as well. I think it’s a perfect place to harness all the benefits of using a real-time engine, especially since the output had to have that look anyway. You’d have to spend R&D time to make a render look like it’s in-engine.
I think this was a blend of the best of both worlds, where you’re in the real-time engine, so you’re iterating in real-time. You’re iterating quickly, and you’re also benefiting from a lot of tools that we now have. Things like a vegetation tool that we used to populate trees over the whole planet. In fact, at one point, each one of the trees in some of the scenes had tens of thousands of polygons. When I was rendering the finals, this was pushing 500 million polygons. My previous record was 50 million.
I would do passes in real-time, and what’s interesting is that I did those while on Zoom calls with Erik Nash and Alessandro Ongaro and other crew on the film. So, we never had any waiting. They’d see the flight path, give me comments directly, and then I’d make the camera changes right there and then. And I’d record all the takes. If it was environmental re-modeling, then we had to do that offline. But a lot of things could be done in real-time, including lighting and tweaking the motion path and the speed of flight.
b&a: What were some of the tougher design challenges, do you think?
Habib Zargarpour: The cockpit was tricky. We had some shadows and reflections on the glass, as if it was a little dirty. One of the things that was fun to implement was this: in space you don’t have any sensation of speed, because there’s nothing there. So I put that ‘space gas’ in there so that you get this sensation of moving through stuff. It was just a particle emitter around the cockpit.
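Zargarpour doesn’t go into the emitter’s setup, but the effect he describes can be sketched in plain code: motes scattered around the cockpit stream backwards in ship-local space, and any mote that falls behind is recycled to the front. This is a minimal illustration of the idea, not Unity’s actual particle system; all names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def spawn_motes(n, radius=20.0):
    # Scatter "space gas" motes at random positions around the ship,
    # which sits at the origin of its own local space (+z is forward).
    return rng.uniform(-radius, radius, size=(n, 3))

def step_motes(motes, ship_speed, dt, radius=20.0):
    # Relative to the ship, the motes stream backwards along -z,
    # which is what sells the sensation of speed.
    motes = motes - np.array([0.0, 0.0, ship_speed * dt])
    # Recycle any mote that has fallen behind the cockpit to the front.
    behind = motes[:, 2] < -radius
    n_recycled = int(behind.sum())
    if n_recycled:
        motes[behind] = spawn_motes(n_recycled, radius)
        motes[behind, 2] = radius
    return motes
```

Each frame the renderer would just draw the motes (often as short streaks along the motion vector); because everything lives in ship-local space, the emitter automatically travels with the cockpit.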
We also had the HUD. Supervixen did the final HUD design. There was this ‘targeting’ animation, and I had to do some crazy math to make it work. I had to script it so that it existed on the actual plane of the window, yet would still appear, from the pilot’s view, to be tracking the target.
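He doesn’t spell out the math, but the classic way to pin a marker to the glass while keeping it visually locked onto a 3D target is a ray–plane intersection: cast a ray from the pilot’s eye through the target, and draw the reticle where that ray crosses the window plane. A minimal sketch of that idea, with all positions and names hypothetical:

```python
import numpy as np

def hud_marker_position(eye, target, plane_point, plane_normal):
    # Intersect the eye->target ray with the window plane. Returns the
    # point on the glass where the reticle should be drawn so that,
    # seen from the eye, it lines up with the target.
    direction = target - eye
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the glass; no intersection
    t = np.dot(plane_point - eye, plane_normal) / denom
    if t <= 0:
        return None  # target is behind the eye
    return eye + t * direction

# Eye at the origin, glass plane one unit ahead (z = 1), target off to the side.
eye = np.array([0.0, 0.0, 0.0])
plane_point = np.array([0.0, 0.0, 1.0])
plane_normal = np.array([0.0, 0.0, 1.0])
target = np.array([4.0, 2.0, 10.0])

marker = hud_marker_position(eye, target, plane_point, plane_normal)
# marker lies on the z = 1 glass plane at [0.4, 0.2, 1.0]
```

Re-evaluating this every frame keeps the reticle sliding across the window as the ship and target move, which matches the behavior described above.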
b&a: I was wondering, Habib, if you’d had the time in pre-production, and perhaps also time to rehearse with the actor, whether you think this is something that could even have been ‘filmed’ in situ? I mean, if you’d made the real-time game before the shoot?
Habib Zargarpour: Well, yes. I mean, it’s very similar to doing an LED shoot. If you set it all up and do the preparations before the shoot.
b&a: It would have been interesting if the actor had had time to practice the game, and go through particular canyons, for example. I suppose it could all still be replaced later if need be, but that would have been fun to see!
Habib Zargarpour: Actually, I think they had some stock footage at some point that they were using. I’m not sure if that was used on-set, maybe during the edit, but yes, in real-time it could have happened. It’s just that maybe we couldn’t have had 500 million polygon trees!
Unity Producer: Ron Martin
Virtual Production Supervisor: Habib Zargarpour
Concept Artist and Previs Artist: Ksenia Kozhevnikova
Unity Technical Supervisors: Patrick Nugent, Jean Blouin
Environment Artists: Chris Nelder, Maritza Louis