How mocap – and motion blur – were critical for these early digi-doubles.
One of the landmark effects sequences in Danny Cannon’s Judge Dredd, released in 1995, sees Sylvester Stallone (as Judge Joseph Dredd) and Rob Schneider (Herman ‘Fergie’ Fergusson) together on a flying motorcycle being pursued – aerially – by a group of ‘judge hunters’ through Mega-City One.
The scene was realized predominantly via close-ups of the actors atop a gimballed motorcycle prop against greenscreen, composited into motion control footage of Mega-City One miniatures (the film’s visual effects supervisor was Joel Hynek, from Mass.Illusion). Wider shots, however, made use of CG flying motorcycles and digital stunt doubles crafted by Kleiser-Walczak Construction Company.
It was still early days for digital doubles, but Kleiser-Walczak had already established itself as a leader in the area on projects that combined motion capture, cyberscanning and animation, which is what brought the studio on board for this particular work.
“Sylvester and Rob had filmed the live action in London, so they were cyberscanned over there,” details Kleiser-Walczak’s Jeff Kleiser, who, with his partner Diana Walczak, now operates the company as Synthespian Studios. “With those scans, we were able to model them to create ‘moveable’ IK bodies.”
Looking to mimic the kind of actions Stallone and Schneider were performing on the bike gimbal, Kleiser-Walczak used motion capture as a starting point for animating their digital doubles. “First we built a scale model of the motorcycle and then we put it on a gimbal so we could rotate it forward and backwards and sideways,” explains Kleiser. “Then Diana figured out that in terms of proportions, her body was the closest to Sylvester Stallone’s body. So she became our motion capture subject. She put all these magnetic trackers all over her [a Flock of Birds electromagnetic set-up from Ascension Technology Corporation] and we put her up on the motorcycle.”

The next step was to shoot Walczak replicating the right moves. “We’d watch a laser disc playback of the scene on a loop which had a previs of the motorcycle flying through the shot,” says Kleiser. “The motorcycle would be rotated to approximate the attitude and pathway for each shot. Diana was just kind of holding on, but her body was getting the real dynamics of what would happen.”
The resulting data was applied to the digital puppets of Stallone and Schneider in Wavefront. That had been Kleiser-Walczak’s go-to toolset for digital human work, but on Judge Dredd they knew Wavefront would need more accurate rendering capabilities. Luckily, Wavefront had purchased TDI, which had the Explore renderer with its Interactive Photorealistic Renderer module. “With this,” recalls Kleiser, “you could actually move lights and see the effect in almost real-time. You didn’t have to wait an hour to see the lighting.”
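Kleiser-Walczak's actual Wavefront pipeline isn't documented here, but the basic retargeting step, turning each magnetic tracker's world-space orientation into the local joint rotation a hierarchical IK body is keyed with, can be sketched. The joint names, hierarchy and the one-sensor-per-joint assumption below are purely illustrative, not a description of the studio's rig.

```python
import numpy as np

# Illustrative joint hierarchy; the real rig's joint names and topology
# are assumptions made for this sketch.
PARENT = {
    "pelvis": None,
    "spine": "pelvis",
    "chest": "spine",
    "l_upper_arm": "chest",
    "l_forearm": "l_upper_arm",
}

def to_local_rotations(world_rot, parent=PARENT):
    """Convert per-sensor world-space rotations (one 3x3 numpy matrix per
    tracker, per frame) into the local joint rotations a hierarchical
    IK body would be keyed with."""
    local = {}
    for joint, R_world in world_rot.items():
        p = parent[joint]
        if p is None:
            # The root joint keeps its world-space rotation.
            local[joint] = R_world
        else:
            # Local rotation = inverse(parent world) @ world.
            # For pure rotation matrices the inverse is the transpose.
            local[joint] = world_rot[p].T @ R_world
    return local
```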

That solved the issue of making the digital doubles appear realistic on screen (hair and cloth sims were not needed in a big way for the shots). But there was another problem: motion blur. The CG motorcycles needed to fly past camera at something like 90 miles an hour and would look out of place without the right blur. “There was no motion blur available through Wavefront,” states Kleiser. “Well, there was, but it was a 3D motion blur and took six hours a frame to calculate! We said to Wavefront, you have got to come up with a better solution.”
Incredibly, Wavefront sent a number of software developers to Lenox, Massachusetts, where Kleiser-Walczak was based. Says Kleiser: “We put them up in our house, and they wrote a bunch of code for us to do vector-based motion blur. Rather than calculating all the 3D objects and the blur, they were taking a 2D sample of that 3D object and figuring out what direction each of the pixels was going and then blurring each of those pixels along that vector.”
“It was flat motion blur rather than a 3D motion blur, but we went from six hours a frame to three minutes a frame,” continues Kleiser. “They saved our butts and we got this vector-based motion blur to work for the rest of the show. Those were the two big software hurdles we had to overcome and we were really proud of how it looked.”
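The bespoke Wavefront code itself was never published, so the following is only a minimal sketch of the idea Kleiser describes: take the rendered 2D frame, look up each pixel's screen-space velocity, and average samples along that vector instead of re-evaluating the 3D scene over the shutter interval. The function name, parameters and numpy approach are assumptions for illustration.

```python
import numpy as np

def vector_motion_blur(frame, velocity, samples=8, shutter=0.5):
    """Blur each pixel along its screen-space motion vector.

    frame    : (H, W, 3) float image, the un-blurred render
    velocity : (H, W, 2) per-pixel motion in pixels per frame (dx, dy),
               i.e. the 3D object's motion projected into 2D
    samples  : number of taps averaged along each pixel's vector
    shutter  : fraction of the frame's motion the virtual shutter sees
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    out = np.zeros_like(frame, dtype=np.float64)

    # Average several taps spaced along each pixel's motion vector,
    # centred on the pixel so the streak trails in both directions.
    for i in range(samples):
        t = (i / (samples - 1) - 0.5) * shutter
        sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
        sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
        out += frame[sy, sx]

    return out / samples
```

Run on a rendered motorcycle pass with its velocity buffer, a gather like this produces the kind of flat, per-pixel streaking Kleiser describes, at the cost of a handful of 2D lookups per pixel rather than hours of 3D sampling.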