How ILM’s work on ‘Hook’ led to ‘Phantom Menace’s’ podrace


Projection mapping played a key role in making those racers go fast. Real fast.

Six hundred miles an hour was how fast the podracers in The Phantom Menace needed to travel. But how do you film such a thing? The racers had to hover just above a sandy surface, zoom through canyons and sometimes crash spectacularly. It wasn’t something that could be filmed live-action, and a miniature for the backgrounds would need to have been enormous to cover the area of land those podracers would travel in mere seconds.

Such was the task set for ILM visual effects supervisor John Knoll (now also chief creative officer at the studio) as production on The Phantom Menace ramped up. He and his team would ultimately look back to one of the VFX studio’s first instances of projection mapping – on Steven Spielberg’s Hook – to help conceive of a method of combining photos of miniatures with low-poly proxy geometry to synthetically realize the photographic quality of a real desert landscape.

Knoll shares his thought process behind that groundbreaking work with befores & afters as part of our 20th anniversary coverage of Episode I, plus how those racers were made to skim along the surface with ‘springy’ action, and then crash with an early version of Maya.


The problem to solve

John Knoll (The Phantom Menace visual effects supervisor): Just as an overall overarching thing on the entire show, I was worried about how massive it was and how to distribute the work appropriately across different departments. We had a pretty good size computer graphics department by that point, but we hadn’t really done CG terrains before.

Just looking at the kinds of backgrounds that needed to exist for the podrace, I’d seen all these production paintings and designs that came out of the art department that were beautiful – they had these big mushroom-shaped rocks, and these big fields of arches, and all these exotic looking terrains.

Anakin’s podracer speeds along the desert landscape. Source: Alamy.

It was apparent you’d have to really compromise if you wanted to try and find a real location somewhere to shoot, because there wasn’t anything quite like that anywhere in the world. But even if you could find places that were sufficiently exotic looking, how would you shoot them?

At that point, I’d already done a number of aerial shoots from fixed wing planes and helicopters. A helicopter can fly, at most, 120 knots. You’re really pushing it at that. And even then, they don’t fly exactly straight. If we’re trying to make stuff that looks like it’s going 600 miles an hour, we were going to have to speed this footage up pretty dramatically, or under-crank the camera, and helicopters just don’t fly straight enough for that. So I was worried that even if we could find locations that were right, we couldn’t really shoot them. And then there were all these very dangerous things being depicted, like flying through arches.

So, pretty quickly, shooting live action plates for this on some location went off the table. There wasn’t really any good way of doing that. And then the next thought was miniatures. ‘How much can we do with miniatures?’ Some of the more closed environments, like Beggar’s Canyon, we did do with miniatures. And there was a partial cave we did with miniatures. So there were a handful of things that I figured I could do in miniature, but some of the more wide open areas were a different story.

Cockpit shots made use of a live action greenscreen shoot. Source: Alamy.

We were travelling so fast that I would just have to build this gigantic miniature, because something that’s way on the horizon at the beginning of the shot was going to be under camera a few seconds later, and I just couldn’t imagine how many miniatures we would have to build to be able to pull something like that off. The wider terrain areas just didn’t really lend themselves to that kind of approach, so that left synthetic terrains of one kind or another.

Hook to the rescue

I remembered that we had done work on Hook, the Peter Pan movie, where we did some 3D matte paintings – the two 3D matte paintings of Neverland. I helped out a little bit with those shots and remember thinking at the time, ‘This is a really amazing and super powerful technique of being able to take a 2D image and project it onto 3D geometry and then move the camera.’ Suddenly you’ve transformed this flat thing into something that looks fully three dimensional and it’s very compelling.

Matte artist Yusei Uesugi at work on the Neverland painting in this ILM photo.
The final shot.

But for whatever reason, after Hook, despite what I thought was the tremendous success of those shots and this revealing of a very powerful technique, we just didn’t do it much anymore after that. Apparently it was kind of a pain to set up, and because it was a pain to set up and hard to do, nobody was really excited about doing a lot more of those. But I kept thinking how powerful a technique that was, and I talked a friend of mine at Electric Image into writing a camera projection tool for Electric Image. And I used it on Mission: Impossible, and Star Trek: First Contact, and the Special Editions. And as soon as we started doing that, our matte department just went nuts over it because it was such a super powerful technique, and the tool that got written into Electric Image to do that was super easy to use. Basically, you just take this geometry and hook it up to this camera and stick this texture onto it, and there you go, it works, and it rendered really fast.

Making mushrooms

Then, Paul Huston at ILM and I got talking about the use of that technique, and we did a couple of shots on Mission: Impossible that way. And then when the podrace came up, Paul said, ‘I think I want to do a test where I’m going to build some pieces of geometry, some rocks, just out of plaster and tin foil, and I’m going to build them up as nice looking miniatures, but then I’m going to digitize them, essentially scan them in, so I’ve got a lightweight CG version of them. But I’m going to take those same models out in the parking lot in the sunlight, photograph them with the sun coming at the right angle, and I’m going to project the photography of those models back onto the CG model.’

The mushroom shaped rock formations began life as miniatures that were then digitized and photographed, and replicated in the scene via projection mapping. Source: Alamy.

The idea was, we weren’t actually rendering anything other than projecting the photo back onto the model. So what you end up with is this thing that looks completely photographically real because it is a photograph. Essentially, you’re just using the computer graphics to distort the photograph. So Paul put this test together. We built a couple of mushroom-shaped rocks, and we did this little fly through of part of the podrace called Mushroom Mesa. And it was pretty stunning to see that first test, because it succeeded in every way that you could want.
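The projection step described here – sticking a photograph onto proxy geometry through the camera that took it, so the render just distorts the photo – can be sketched in a few lines. This is a minimal, hypothetical illustration using a simple pinhole camera looking down -Z (all names and numbers are made up for the example), not ILM’s actual tool:

```python
def project_to_uv(vertex, focal, img_w, img_h):
    """Project a camera-space 3D point through a pinhole camera looking
    down -Z; return (u, v) texture coordinates into the photograph,
    normalized to [0, 1]."""
    x, y, z = vertex
    px = focal * x / -z                 # perspective divide
    py = focal * y / -z
    return (px / img_w + 0.5, py / img_h + 0.5)

# A proxy-geometry vertex straight ahead of the "taking" camera lands in
# the center of the photo, so the texture lookup samples the middle of
# the photograph. Any new render camera then sees the photo "stuck" to
# the 3D surface via these fixed UVs.
uv = project_to_uv((0.0, 0.0, -10.0), focal=50.0, img_w=36.0, img_h=24.0)
print(uv)  # (0.5, 0.5)
```

Because the UVs are computed once from the taking camera, moving the render camera afterwards keeps the photographic detail locked to the geometry – which is why the result reads as fully three dimensional.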

It looked pretty photographically real. We were travelling at 600 miles an hour. You could control the flight path to be exactly what you want, and we could manufacture as much of this imagery as we wanted. So as soon as I saw that, that became the preferred technique for at least all the wide open terrains where we had a lot of scope and scale.

New tools to fly and destroy podracers

Then we needed to make these podracers crash. I brought up with the R&D folks, ‘I think we need a rigid body system. I think we need a cloth simulation package.’ And then I was told that, ‘Hey, well, pretty soon we’re going to have our first betas of Maya, and that’s got a rigid body engine in it, so we should look at doing that.’ Habib Zargarpour was one of the beta testers, and he was really excited about it.

Now, even apart from the crashing, I felt like the podracer motion itself should be a simulation. And I had a 2D rigid body animation package called Working Model. It was fun. It was kind of meant as an educational tool, but you could draw things in 2D, rectangles and circles and irregular polygons, and you could attach forces to them, springs and gravity and propulsive forces. And then you could set up these initial conditions and hit a go button, and they would collide against each other and the springs would spring, and all that sort of thing. I had done exploration by building a podracer from top down, with the two engines and the cables that went back to the cockpit, and I hooked them all together with a crisscrossed spring network, and then I would fire a cannonball at it.

The one engine would hit the ground and then it would spring around and bounce, and it would transmit the vibrations or the energy from that impact to the next engine, but it was damped and had this wonderful complexity to it that I thought, ‘That’s what I want the podracers to look like’. It’s almost as if you imagine an invisible framework and the engines are suspended inside that framework by springs, and then you shake the frame around – that’s what I wanted them to look like, where they had that springy thing like Luke’s speeder. And so, I’d done those tests with that 2D program, and I showed them to Habib.

So, we built a similar thing in Maya, where it was a framework of nulls, and we had springs that went from those nulls that suspended the engines. And we did pretty much the full 3D version of that 2D test that I did, and that’s actually how all the podracers were animated. You’d animate the frame, and the motion of the engines was all that kind of bouncy stuff that came from the simulator.
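The ‘engines suspended by springs inside an invisible frame’ rig can be sketched as a damped spring simulation: the frame position is keyframed directly, and each engine mass is pulled toward its anchor on the frame, so jerking the frame makes the engine overshoot, oscillate and settle. This is a hedged one-dimensional illustration of the idea (the constants and function names are invented for the example), not the Maya setup itself:

```python
def simulate(frame_positions, k=40.0, c=4.0, mass=1.0, dt=1.0 / 24.0):
    """1D damped spring: an engine mass chases an animated frame anchor.
    frame_positions is the keyframed anchor position per frame."""
    pos, vel = frame_positions[0], 0.0
    out = []
    for anchor in frame_positions:
        force = k * (anchor - pos) - c * vel   # spring pull + damping
        vel += force / mass * dt               # semi-implicit Euler step
        pos += vel * dt
        out.append(pos)
    return out

# Jerk the frame from 0 to 1 and hold: the engine overshoots past 1,
# then oscillates with damping and settles near 1.0 -- the springy,
# damped motion described above.
path = simulate([0.0] * 5 + [1.0] * 200)
```

In the production rig described above the same idea applies per engine in 3D, with the crisscrossed springs coupling the engines so an impact on one transmits damped vibration to the other.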

Anakin catches up to Sebulba.

Then for the crashes, in the podrace animatics, George had started off with what we called a ‘Ripomatic’, where he had pieces of other movies cut in for vibe and the general feeling of the shots. So there was stuff from Ben-Hur, and Grand Prix, and Le Mans. And what he was using for the crashes was Formula 1 car crashes, just real ones from sports footage. And the thing that George was really pointing me to with those references is the way these things blow up – it’s all about kinetic energy, it’s not so much about there being a big fireball.

When a Formula 1 car crashes, there’s very little fire. It’s mostly just that it’s going so fast, it takes a long time for it to slow down, and it just tumbles and tumbles, and pieces tear off it. And it gets torn apart and shredded by this process, but it’s mostly a mechanical process. And the speed and kinetic energy that takes so long to scrub off is really what he wanted to capture about that. So Habib and I spent some time picking those shots apart, the reference of the Formula 1 car crashes, and we worked out a plan for how we wanted it to do that.

I think most of the crashes are seen in at least two angles, and we had this idea that, ‘All right, we should only simulate one version of each crash, and we’ll photograph it from multiple cameras. So we’ll run the simulations long enough to cover the full sequence, and then we’ll put multiple cameras on it. It will be the same sim seen from each different angle so that we contain how complicated the work is on putting those things together.’
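The ‘simulate one version of each crash, photograph it from multiple cameras’ plan amounts to caching the simulation output once and giving each camera angle a cheap view of the same cache. A toy sketch of that workflow (all names and numbers are illustrative, not production code):

```python
def run_crash_sim(frames):
    """Stand-in for the expensive crash simulation: run it once and
    cache a position for every frame of the full sequence."""
    return [(f * 0.5, 0.0, -f * 0.25) for f in range(frames)]

def view_from(cache, cam_offset):
    """A cheap per-camera pass: re-express the cached world-space
    motion relative to one camera's position."""
    ox, oy, oz = cam_offset
    return [(x - ox, y - oy, z - oz) for (x, y, z) in cache]

cache = run_crash_sim(frames=48)              # sim runs once, covering all cuts
angle_a = view_from(cache, (0.0, 2.0, 5.0))   # first camera angle
angle_b = view_from(cache, (10.0, 1.0, 0.0))  # second camera angle
# Both angles show identical motion because they share one cached sim,
# which is what keeps the multi-angle crash shots consistent and contained.
```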

Those sims were all done in Maya. It’s a combination of rigid body stuff and soft body things. A lot of the tearing metal is actually using the cloth simulator. A couple of the engines have whipping, licking flames coming out of them, and those are cloth simulations as well. They’re cloth strips, but they’re shaded like they are fire.

Explore more of our in-depth ‘The Phantom Menace’ 20th anniversary coverage during #phantommenaceweek including an upcoming piece on the podrace previs.
