How the director jumped into 3D and real-time tools to help make the film.
Aaron Schneider’s more-than-four-year journey on Greyhound included, early on, him tackling some of the shot design himself in Maya. As the film took shape, he relied on previs teams to help flesh out complex ship-to-ship and ocean scenes, and then finally on the VFX expertise of visual effects supervisor Nathan McGuinness and DNEG.
Here, Schneider chats to befores & afters about his involvement with visual effects, starting with his early love of the craft and the kinds of things he could work on himself in planning for the movie.
b&a: I wanted to start with your own journey—knowing that this film would involve a lot of visual effects, what were some of the things that you had to do to get your mind into that visual effects mindset as part of the filmmaking process?
Aaron Schneider: It’s a great question. The [VFX] guys did an amazing job with the sort of limited time and budget that they had. They worked their rear ends off, and it shows on screen. But the truth is, that effort began four-and-a-half years ago when I first came on the film.
When I left engineering at Iowa State to come out to USC film school, I wanted to get into special effects. That’s what they called them at the time. I wanted to shoot the Millennium Falcon against greenscreen. I wanted to build models and work with motion control. And so, I had the ILM book and all that good stuff, and I read Cinefex magazine.
And in fact, my first semester at SC, they held, for the first and only time, at least during my years, a special effects class. It was led by Mark Vargo, who had been the optical printer lineup guy for Return of the Jedi. He knew all of the legends, because he was part of the ILM club, and he would invite guys like Richard Edlund. Even Pangrazio, the matte painter, came in, and Phil Tippett, the animator, came in and brought some of his stop-motion models. And we went over to Trumbull’s studio and got to see Showscan, which he was developing at the time.
So it was this amazing class, but every one of them would say at the end of class, the future of effects is computers. And I had just left engineering and was really turned off to zeros and ones. I said, ‘Look, I want a career in the hammers and nails of filmmaking. I don’t want to sit at a computer.’ And that’s when I gravitated towards cinematography. I came out of SC and came into the business through the MTV music video craze. That’s where I cut my teeth. And that eventually led to television and feature films. Then I made a decision to direct and dumped my life savings into a short film that got my career started back in 2004, when the short [Two Soldiers] won the Oscar for my producer and me.
That was my road, and my road was scattered with the passion for visual effects. My first film was very much a character piece, Get Low, very much an actor piece. We had a few cleanups to do for the period, but not much by way of visual effects. So when I went in to meet Tom and Gary Goetzman, his partner, on Greyhound, I felt the need to pitch that background, that visual effects background, that passion I’d had for so many years.
b&a: How did you get started with planning out the film?
Aaron Schneider: I really dove in. See, there was nobody on the film. Here’s the irony of the story: by that point I had become a self-taught visual effects artist. I got myself a copy of Maya 2005 and started experimenting and watching tutorials. Today, you can go on YouTube and find a trillion Maya tutorials. But back then you had to really seek them out. And so, I spent years on it, as a hobby really at the beginning. But then eventually, I have a director friend who would every now and then call on me to do a couple of visual effects for his films. A couple of them were even studio films, and one of them was native 3D. I had to render and comp in 3D.
And so, when Greyhound started, I had the tools and the experience to start getting inside Maya and playing around with how you would make Greyhound. And when I say that, it wasn’t born from any desire to just start making Greyhound in the computer. My first thought was, ‘There’s no way we’re going to be successful with this film, with all these digital effects, if we don’t find a way for the visual effects to be grounded in a sense of reality and verisimilitude, whereby it looks like we were out on the water shooting this ourselves.’
I engaged in this long journey, keeping most of it to myself, until I had some crew on that I could share it with, where I wanted to recreate the digital version of open ocean photography. The idea that if you shot this for real, you’d be out in the North Atlantic, with the destroyer, ploughing through these waves and getting thrown around. And you’d be on a camera ship, with the wind in your hair and with spinners on the lens, and the horizon moving around, and the poor bastard camera operator, trying to keep the destroyer in frame, out there on the deck, rolling in the seas, the way a lot of this Navy archival footage is shot, right?
I set about literally recreating open ocean photography. But as you can imagine, it’s very difficult to do in a digital world. Because as soon as you’re designing the way a ship moves, you’re already behind the eight ball. Because nature isn’t involved.
So I did some research, and I found this cool plug-in that Nvidia had developed for its game creators called WaveWorks. It was basically a plug-in that allowed Nvidia-based game designers to throw in Beaufort scales, a way of measuring sea state, plus wind parameters and a few other attributes. And it created a deformed ocean surface that was based on the math of open ocean waves, a live displacement map, that not only created the ocean, but actually floated objects on that deforming surface, based on the physics of their weight and their angular momentum.
If you entered the tonnage of a ship, it would look at the geometry of the hull, look at the matrix of the displacement map, and calculate proper ship movement for that deforming surface, just like the real world. That’s what the plug-in did. And it was the only thing in the world at the time, maybe still is, that did that in a live way.
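The core idea Schneider describes, deriving a ship’s motion by reading a live, deforming displacement surface, can be sketched in a few lines. This is a toy illustration, not WaveWorks itself: a single sine swell stands in for a full wave spectrum, and the function and parameter names (`ocean_height`, `ship_pose`, the 100m hull) are made up for the example.

```python
import math

def ocean_height(x, z, t, amplitude=2.0, wavelength=60.0, speed=8.0):
    """Height of a deforming ocean surface at point (x, z) and time t.
    One sine swell travelling along x stands in for a spectrum of waves."""
    k = 2 * math.pi / wavelength  # wavenumber
    return amplitude * math.sin(k * (x + speed * t))

def ship_pose(cx, cz, length, beam, t):
    """Approximate a ship's heave, pitch and roll by sampling the surface
    at the bow, stern, port and starboard edges of its hull footprint."""
    bow   = ocean_height(cx + length / 2, cz, t)
    stern = ocean_height(cx - length / 2, cz, t)
    port  = ocean_height(cx, cz - beam / 2, t)
    stbd  = ocean_height(cx, cz + beam / 2, t)
    heave = (bow + stern + port + stbd) / 4.0          # vertical bob
    pitch = math.degrees(math.atan2(bow - stern, length))  # nose up/down
    roll  = math.degrees(math.atan2(stbd - port, beam))    # lean side to side
    return heave, pitch, roll

# A 100m-long, 12m-beam hull riding the swell at two moments in time:
for t in (0.0, 2.5):
    heave, pitch, roll = ship_pose(0.0, 0.0, 100.0, 12.0, t)
    print(f"t={t:4.1f}s  heave={heave:+.2f}m  pitch={pitch:+.2f}deg  roll={roll:+.2f}deg")
```

Because the toy swell travels only along the ship’s length, the roll here stays at zero; a real spectrum of waves arriving from many directions, plus the inertia and angular momentum Schneider mentions, is what turns this sampling idea into convincing ship motion.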
We worked with some really talented coders, one guy from Nvidia and a coder who helped him out in his spare time, to port this otherwise game-focused Nvidia plug-in into a plug-in for Maya. You could create a plane in Maya, attach one of these nodes to it and, with Viewport 2.0 and what Maya’s doing under the hood, literally float ships on a deforming surface, and quite a high-res surface too, by the way, if you wanted it to be. I floated two ships next to each other, and I put a camera on one ship and even experimented with some live operating, so that Maya was rendering the action in real-time, and you could sort of handhold the camera to shoot the other ship.
By recreating the chaos of open ocean photography inside of Maya, you would be grounding the visual effect before the rendering, before the modelling, before the texturing, before the comp. Before all those wonderful artists went to work on creating a photo-realistic image, the image design itself, right from the ground up, would be firmly based in authentic photorealism.
In fact, that plug-in is something our previs and postvis people used when we created slugs and temporary shots. So through the whole course of editing the film, as a team, and along with Nathan McGuinness, who was helping supervise all that stuff, we could spend our time honing what that means, what that would look like, what that should feel like. So by the time it got handed off to DNEG, the film was firmly planted in a visual approach that could help them be successful.
We had [Habib Zargarpour at Digital Monarch Media] with game engine software, where you could import the scenes into the game engine. It was a little bit like what The Mandalorian does, only it’s not a stage, it’s just a virtual camera, connected to the screen, and then you can shoot footage. And a lot of that experimentation ended up giving us some stuff to work with in our temp cut.
But what we ended up realizing is that Maya gave us much more control over the shot. We eventually got to the point, where if you floated a camera ship in Maya on a Beaufort six scale and looked through the camera, you were out in the middle of the ocean. And then you had all the advantages of Maya’s interface and GUI to sort of experiment with the finer points of the shot.
If we had all the money in the world and all the time, I would have taken each of these shots in Maya, taken them into a mo-cap stage and put a camera operator on the deck. And then exported that camera and used it.
At one point, I even had this idea: it’d be fun to go into a mo-cap stage, where you could have a virtual environment and put the camera operator on his own gimbal, so that the camera operator is standing on a gimbaled, programmed surface that’s going crazy, just like it would if the operator was at sea. So the operator himself was subject to all the crazy movement of the open ocean, which he would have to counteract and deal with as a camera operator. It’s like method-operating.
b&a: It’s interesting you mentioned that because I wanted to ask you, out of that visualisation work that you’d been doing, how did that inform what you could shoot live-action? You ended up shooting on a stationary ship and on sets but I’m curious how the previs helped you design plates.
Aaron Schneider: Yeah, the ships, they just don’t sail anymore. However, there’s a lot of verisimilitude in shooting that museum ship, down in Louisiana, that we matched to our gimbaled set.
There’s basically three categories in our movie. There’s the pilot house / bridge, where Tom is in and out of that room. And mostly, you’re cutting off what he’s looking at. You’re cutting to 100% digital POVs off that footage.
Then there’s stuff where you see Tom out on the bridge wings of the real destroyer, where there’s smokestacks in the background and guns that are firing in the foreground. That is obviously shot on the USS Kidd and intercut with the stage work. And then there’s shots of the ship ploughing through the ocean, ship-to-ship, that’s entirely 100% digital.
And there’s a price tag attached to each of those. So the film was very much an evolution, through production, in terms of what we could afford to do 100% digitally, and how much time we could spend shooting on the USS Kidd, considering that there was a full armada of equipment floating around that ship just to support us, which was expensive.
And so, we had to very carefully allot the amount of time to create that kind of footage that would give us the production value of that ship. And then there was, of course, the most efficient category, and that was a simple recreation of the bridge on a gimbal, inside a stage, which gave you a lot of control and accessibility to sort of bang off all of that intimate work inside the pilot house and out on the bridge, when you’re up next to Krause and the experience he’s going through.
And so, there were these three categories, each with a price tag, and it was a constant sort of conversation and evolution, through pre-production and even through production, of just how much of each we could afford. If I had five days out of 35 on the USS Kidd, the museum ship, then I had to go into every scene of the film and say, ‘Okay, I’ve got Tom standing out here on the bridge wing in closeup on my gimbaled set. How badly do I want to tie that into the rest of the ship? And is tying it into the rest of the ship critical to the storytelling?’ So it was a bit of a puzzle.
b&a: I’m always curious, when you’re making one of these films that ultimately has a lot of visual effects in it, whether there are some budget constraints early on, and then, once you see what you’re getting from the visual effects, whether you felt like you could add more, or whether you wanted to keep it restrained. I’m curious about that evolution of what you could show and not show.
Aaron Schneider: Well to DNEG’s credit, and to Nathan’s, there was no room for error. In post-production, in the editing phase, we were inter-cutting stage footage with shots of the USS Kidd, with a bridge and a highway in the background, with postvis that was a hundred percent digital, of a ship taking a dramatic turn in the middle of the Atlantic. And because of the schedule at the time, we were racing to get into the theatres. That was before 2020. But, time is money. So even if we had more time, you needed more money to use that time.
So to their credit, it was what it was. It’s like, here’s where we need a 100% digital shot of the ship, and it’s got to be that way, because that’s all there is. And so, there was really no space, no room for error. The film defined what shots we needed as it evolved. And then those shots defined how they would be executed, whether they were a sky replacement, or putting water in the background, or whether it was a 100% digital shot of a ship crashing through unbelievably realistic water sims. That was pretty much baked into the cut. There was no time to say, ‘Hey, this is working better than this is working, so let’s do it this way.’ So Nathan and DNEG had to find solutions, and they had to deliver.
b&a: There are so many impactful scenes in the film. I wonder whether you can talk about some of your favorites. I just love when you’re looking over the bow, and this water comes crashing at you. I have to admit I wish I’d been able to see it in a cinema for those kinds of shots.
Aaron Schneider: Well, I think I agree with you. The points of view, when Tom is staring out the front portholes at his own bow, and the bow is crashing through the water, all this big tonnage coming down, and you’ve got the sound design, and the bottom end, sort of rumbling, and you’ve got all this water work. It’s just all zeros and ones.
And I’ll tell you something else. That footage out the front of the bow, in the rough cut, I found on YouTube. I remember kind of yelping when I found it. There was a view through a glass porthole, where someone had stuck a camera up to the glass, and there was a windshield wiper, just like ours. Of course, this was a modern ship, but it had a windshield wiper going back and forth in the foreground, out of focus. And it was off the bow of a cargo ship, but it had a bow very similar to ours. And it was rough and tumble ocean, with spectacular water bursts. So this was something I found on YouTube, and I ripped it and brought it into the editing room. And we used it as temp footage, to cut off of all those same POVs you’re talking about.
And I remember every time I would see that sequence, thinking, ‘Oh my God. This is going to be 100% digital. God help us.’ When I saw those shots, I was amazed. There’s a couple in there that, even to the most trained eye, are indistinguishable from reality.
But that’s what I mean. It’s like, we needed a shot of Tom looking, his POV, over the bow, with water spraying up. And that was the story, and we could not go out onto the ocean. We could not take the museum ship out into ‘Beaufort 7’ North Atlantic seas. So by definition, it’s got to be 100% digital, which means they’ve got to deliver or fail. And this was the achievement of the visual effects on Greyhound. They nailed it.