The amazing engineering behind that eyeline system on ‘Avatar: The Way of Water’

March 1, 2023

The tech of the Spidercam-like set-up that allowed live-action actors to interact with CG ones.

In one of the many stunning sequences in James Cameron’s Avatar: The Way of Water, the imprisoned child Spider (Jack Champion) is talking to Colonel Miles Quaritch (Stephen Lang), who is in avatar form.

Spider is a live-action actor, but Quaritch is a much taller CG character realized by Wētā FX. Yet, it feels like they are both there in the same room, properly looking at each other with the correct ‘eyelines’.

Indeed, convincing eyelines have long been a challenge in the world of visual effects and CG characters, especially where the live-action actor is playing against nothing, or just a tennis ball on a stick, with the CG element to be added later.

On The Way of Water, the filmmakers wanted the eyelines to be as accurate as possible for human/CG interaction, so they engineered a system that mounted a monitor and a speaker on a Spidercam-like apparatus (the kind you see used to film sporting events) and moved it around the set at the correct height. On that monitor played the previously captured performance capture footage of the actor whose character would eventually be CG.

How the eyeline system came about

Ryan Champney.

The idea for the eyeline system emanated from the experience on the first Avatar (released in 2009), when virtual production and simul-cam approaches were still completely novel. “There also wasn’t a lot of human/Na’vi interaction, just a handful of scenes,” observes Lightstorm Entertainment virtual production supervisor Ryan Champney, who orchestrated the eyeline system on The Way of Water.

“On the first film, one of the big scenes was when Jake Sully first goes into his avatar body and he gets off the bed and then he’s kind of frantically stumbling around. That was the first real use of the simul-cam system where Jim could be ‘live compositing’ and framing the shots with both the CG and the live-action characters together.”

“Before that,” continues Champney, “there were often these conservative CG shots where you do a slow pan, or a locked-off camera, and you say, ‘Okay, the CG will be there, let’s leave some negative space…’. Jim wanted something that felt more organic and natural where it could jump from one character to the other.”

Champney relates that on 2009’s Avatar it became challenging for actors playing against something that would later be CG to hold their marks and maintain their eyelines.

“It was very quickly evident to Jim that the system broke down because he could see where everything was [in the simul-cam], but the actors couldn’t. Even with the ADs in there with their stick and the tennis ball, it was still very hard.”

So for the live-action portions of The Way of Water, where there would now be an abundance of human/CG character interaction, it was decided that a new system would be built to help actors hit their marks and eyelines, one that was both repeatable and incorporated the already filmed performance capture.

Flying drones: what could have been

While a cable-cam setup was ultimately utilized for the eyeline system, one initial idea was actually a flying drone. “We tried a series of different technologies and one of them was drones that were kind of autonomous,” shares Champney.

“We already had a mo-cap system going as well, so we thought we could just give the drone a point in space and the mo-cap system would self-correct it, i.e. you could push it away and it would fly back. We ended up not going that route just because of sound.”
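What Champney describes is essentially a closed-loop position hold: the mo-cap system reports where the drone actually is, and the control software keeps nudging it back toward the point it has been given. A minimal sketch of that idea, with illustrative names and gains rather than anything from the production’s actual control code:

```python
import numpy as np

def hold_position(target, mocap_position, gain=1.5, max_speed=0.5):
    """Proportional correction toward a fixed point in space.

    target, mocap_position: 3D positions in metres (the latter reported by
    the mo-cap system). Returns a velocity command (m/s) that pushes the
    drone back toward its mark, so a shove away from the point is
    automatically corrected.
    """
    error = np.asarray(target, dtype=float) - np.asarray(mocap_position, dtype=float)
    velocity = gain * error
    speed = np.linalg.norm(velocity)
    if speed > max_speed:              # clamp so corrections stay gentle
        velocity *= max_speed / speed
    return velocity
```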

Still looking for something programmable and repeatable that could also be kept safely away from all the gear typically found on a film set, Champney and his team instead gravitated towards the cable-suspended aerial camera systems offered by Supracam. These are commonly used at sporting events, where a camera housing rigged on four cables delivers sweeping, motion-controlled camera moves around stadiums and other venues.

“We contacted them to see if they could miniaturize it and customize it to our needs and give us a real scaled-down version of just the components so we could build them ourselves,” says Champney, who then set about applying Supracam’s tech to a film set world.
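Positioning a payload on four cables is, at its core, an exercise in geometry: to park the carriage at a given point, each winch pays its cable out to the straight-line distance between its anchor and that point. The sketch below illustrates that inverse kinematics with hypothetical anchor positions; real systems like Supracam’s also have to account for cable sag, tension and carriage size.

```python
import numpy as np

# Hypothetical winch/anchor positions at the four corners of a stage (metres).
ANCHORS = np.array([
    [0.0,  0.0,  8.0],
    [20.0, 0.0,  8.0],
    [20.0, 15.0, 8.0],
    [0.0,  15.0, 8.0],
])

def cable_lengths(target):
    """Cable length required from each anchor to reach the target carriage position."""
    target = np.asarray(target, dtype=float)
    return np.linalg.norm(ANCHORS - target, axis=1)

# Example: park the carriage at roughly the head height of a nine-foot character.
print(cable_lengths([10.0, 7.5, 2.7]))
```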

This short breakdown showcases the eyeline system, as well as a glimpse at the depth compositing process employed by Wētā FX on the film.

One more challenge

The idea was now in place, but as Champney recalls, one further hurdle remained: “How do you track where four cables are going to go and the device and all the equipment and all the set pieces and not have it run into actors or anything else?”

Enter Casey Schatz from The Third Floor. The experienced visualization, virtual production and previs artist (his credit on The Way of Water is simulcam supervisor, and Champney describes him as a ‘techvis genius’) ensured the on-set operation of the eyeline system was as smooth as possible by mapping out its movement in virtual space.

“Casey was doing almost a kind of live Lidar on set,” explains Champney. “He was constantly doing these quick Lidar scans of the set and then readjusting where the pick points and the cables should go. In Maya and through his techvis system, he could have the move going through the computer ahead of time to see where it was going to collide. He always had a 3D up-to-date version of the live-action set at any given time.”
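In effect, the workflow keeps asking one question: does the planned move, at any frame, pass too close to the freshly scanned set? A toy version of that clearance check, assuming the scan has been reduced to a point cloud and the move to sampled rig positions (the real Maya techvis setup is of course far richer):

```python
import numpy as np
from scipy.spatial import cKDTree

def find_collisions(path_positions, lidar_points, clearance=0.5):
    """Flag frames of a planned move that pass too close to scanned geometry.

    path_positions: (N, 3) sampled positions of the eyeline rig over the move.
    lidar_points:   (M, 3) points from the latest quick Lidar scan of the set.
    clearance:      required minimum distance in metres.
    Returns the frame indices that violate the clearance.
    """
    tree = cKDTree(lidar_points)
    distances, _ = tree.query(path_positions)
    return np.nonzero(distances < clearance)[0]
```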

Maya screenshot of Casey Schatz’s rig used to keep track of the eyeline system.

A feat of engineering

On a practical level, the eyeline system (Champney says some people called it Eyeliner–“I wasn’t a big fan of that one”) was made up of Supracam pieces including motors, motor controllers, cables and machined parts that were then ‘bolted together’ by Lightstorm.

Champney himself re-jigged a version of Supracam’s Linux control software used to control the camera at sporting events. “All that software was based on having a joystick or a controller or someone flying it around. We came up with a file specification that dealt with having a series of frames with time code, and X, Y, Z and rotational positions.”
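The actual file specification hasn’t been published, but the concept is straightforward: a pre-baked move is a list of frames, each carrying a timecode plus a position and an orientation. A hypothetical reader for that kind of file might be as simple as:

```python
from dataclasses import dataclass

@dataclass
class MoveFrame:
    timecode: str   # e.g. "01:02:03:12" (HH:MM:SS:FF)
    x: float        # position in metres
    y: float
    z: float
    pan: float      # rotations in degrees
    tilt: float
    roll: float

def load_move(path):
    """Parse a hypothetical pre-baked move file.

    Assumed layout: one frame per line, whitespace-separated:
        timecode x y z pan tilt roll
    Lines starting with '#' are treated as comments.
    """
    frames = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            tc, *values = line.split()
            x, y, z, pan, tilt, roll = map(float, values)
            frames.append(MoveFrame(tc, x, y, z, pan, tilt, roll))
    return frames
```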

Since it was hung on cables, the positioning of the eyeline system meant, of course, that it came in above the set. This in turn meant no ceilings. Champney and Lightstorm liaised with production designers Ben Procter and Dylan Cole on that side of things, too. “They were like, ‘So Jim, you don’t want us to design any roofs on any of this stuff?’ And he’d say, ‘Nope, all the roofs are going to be CG.’”

The system was also designed to integrate with the other virtual production systems used during filming, which took place both at Manhattan Beach Studios in Los Angeles and in Wellington, New Zealand.

“It wasn’t just the eyeline system that we were triggering,” notes Champney. “We had to have a universal trigger that had everything go at once. This included the simulcam playback, any lighting effects on stage, the motion bases, etc. I created a UDP (user datagram protocol) network so that when all these devices would come on, they would announce themselves to the stage, like when your printer shows up on a network.”
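Champney’s printer analogy maps neatly onto UDP broadcast: each device periodically announces its name and address on the stage network, and the controller listens for those announcements. The sketch below illustrates the pattern; the port number and message format are assumptions, not the production’s values.

```python
import json
import socket
import time

ANNOUNCE_PORT = 49200              # hypothetical stage discovery port
BROADCAST_ADDR = "255.255.255.255"

def announce(device_name, control_port):
    """Broadcast a small JSON 'hello' so the stage controller can find this device."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    message = json.dumps({"device": device_name, "port": control_port}).encode()
    while True:
        sock.sendto(message, (BROADCAST_ADDR, ANNOUNCE_PORT))
        time.sleep(2.0)            # re-announce every couple of seconds

def listen_for_devices():
    """Controller side: collect announcements, like printers appearing on a network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ANNOUNCE_PORT))
    while True:
        data, addr = sock.recvfrom(1024)
        info = json.loads(data)
        print(f"{info['device']} is on stage at {addr[0]}:{info['port']}")
```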

Safety was key. In addition to the crucial work undertaken by The Third Floor’s Casey Schatz, the eyeline system was engineered so that triggering it would involve, as Champney details, a steady ‘ramp-in’ as it got to where it needed to go.

“During testing, we’d hit go, and the system would just jump to where it thought it needed to be. And it would literally just launch! All the batteries would fly off of the thing, and they’re not light. We said, ‘Alright, let’s rewrite this software so we know that that can never happen again…’.”
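In other words, the software was rewritten so the rig could never snap straight to its start position; it had to ease into it. One common way to express that kind of ‘ramp-in’ is an ease-in/ease-out interpolation between the current and target positions, sketched here with illustrative parameters:

```python
import numpy as np

def ramp_in(current, target, duration, dt=1.0 / 48.0):
    """Yield smoothed positions from `current` to `target` over `duration` seconds.

    Uses a smoothstep ease so velocity starts and ends at zero, meaning the rig
    can never 'launch' toward its mark. The 48 fps step is just an assumption.
    """
    current = np.asarray(current, dtype=float)
    target = np.asarray(target, dtype=float)
    steps = max(1, int(duration / dt))
    for i in range(1, steps + 1):
        t = i / steps
        s = 3 * t**2 - 2 * t**3        # smoothstep: zero slope at both ends
        yield (1 - s) * current + s * target
```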

James Cameron during some of the live-action shooting.

Eyeline evolutions

Displayed on the eyeline system’s monitor was the facial capture footage from the performance capture session of the particular actor who would end up being a CG creation by Wētā FX. This included the audio performance as well. The rationale for both the video and audio was clear, according to Champney.

“We’d gone to all this trouble to have the eyeline system there as something in the space that we thought–and this was Jim’s idea–why don’t we just have a video of their performance as well so you can actually see the actor that you’re performing against with the audio from that point of the space? The idea was that even if the character is walking behind someone, that actor can still hear and have an awareness of where it is spatially when they’re not looking at it.”

Asked whether any consideration in testing had also been given to rendering out a CG Na’vi character for placement on the eyeline monitor, Champney says there had been some discussion of that, but actually having the actor interact with the other actor that was playing that part was important. “Instead of what is an interpretation of what the eyes and the emotion would be, you’re actually just seeing straight away what it is.”

There was even some early debate about whether the footage on the eyeline monitor could be ‘live’, as in, having a performance captured actor act out scenes on the volume while the live action was simultaneously being shot.

“There was no technical reason we couldn’t do that,” states Champney. “We’re just sending it a stream of data. Whether it’s a stream of data reading from a text file or if it’s a stream of data coming from the mo-cap system. But the problem was we’d be negating all of our pipeline, and Casey wouldn’t be able to predict what they’re going to do.”

“Imagine if an actor tripped,” adds Champney. “It’s not an actor tripping, it is now a nine-foot tall creature [as the eyeline system] tripping and crashing through things unpredictably on a live-action stage. All those cables were Kevlar. I’ve heard a bunch of war stories from the guys that built these systems and that Kevlar line will actually slice like a bandsaw through a steel I-beam. So I was like, ‘We’ve got to be careful with it.’”

The eyeline system in operation. Source: Vanity Fair.

What a live-action actor saw

The eyeline system monitor was a small HD monitor that could pan and tilt relative to the head motion of the character. Champney says some of the actors embraced it more than others. “I think some were like, ‘This is great.’ And others were more, ‘There’s a mechanical thing flying at me at the scale of a nine-foot tall character!’”

“In fact,” Champney adds, “everyone was convinced at first that it was going too fast. Even Jim was like, ‘You have it going too fast.’ But we would do the composite and it would be right on. It’s just that when a nine-foot tall character leans into you, it comes at you pretty quickly.”

The facial capture footage was presented vertically. Since the black and white infrared footage from the capture stage is inherently fish-eye warped, Champney and his team wrote a piece of software–some OpenGL-like filters–that would de-warp the video and also align the video to the relevant time code of what was being played back.

“If Jim said, ‘Go to this time or this frame,’ it would put the motorized system in that place, but it would also synchronize the video so that those things were all moving together,” describes Champney. “The video, the audio and all these things were locked in. We wrote our own kind of FFmpeg playback system that de-warped the video live and applied a blue grade to the black and white footage and then sent it out to the eyeline system.”
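The underlying idea is a single master clock: one requested timecode drives the rig’s pre-baked move, the video and the audio to the same frame. A toy illustration of that behavior, with hypothetical device objects and an assumed frame rate:

```python
def timecode_to_frame(timecode, fps=24):
    """Convert an 'HH:MM:SS:FF' timecode string to an absolute frame number."""
    hh, mm, ss, ff = (int(part) for part in timecode.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

class EyelinePlayback:
    """Toy master clock: one seek call drives the rig move, video and audio together."""

    def __init__(self, rig, video, audio, fps=24):
        # Each device is assumed to expose a hypothetical .seek(frame) method.
        self.devices = [rig, video, audio]
        self.fps = fps

    def go_to(self, timecode):
        frame = timecode_to_frame(timecode, self.fps)
        for device in self.devices:
            device.seek(frame)         # everything jumps to the same frame
        return frame
```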

“I have to admit, the blue grade was unnecessary,” says Champney. “But, I thought, well, I’ve already put these OpenGL filters on there to de-warp it. I might as well put a little bit of an avatar hue to it.”
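As a rough illustration of that per-frame treatment, here is a sketch using OpenCV rather than the production’s OpenGL-style filters: undistort the fish-eyed black-and-white frame with an assumed camera model, then push it toward an ‘avatar’ blue. The camera intrinsics, distortion coefficients and tint values are all placeholders.

```python
import cv2
import numpy as np

# Placeholder intrinsics for the capture camera; the real calibration is unknown.
K = np.array([[640.0,   0.0, 640.0],
              [  0.0, 640.0, 360.0],
              [  0.0,   0.0,   1.0]])
DIST = np.array([-0.35, 0.12, 0.0, 0.0, 0.0])   # rough barrel-distortion guess

def dewarp_and_grade(gray_frame, tint=(1.0, 0.6, 0.4)):
    """De-warp a fish-eyed greyscale frame and give it a blue cast.

    gray_frame: single-channel uint8 image from the facial capture camera.
    tint: per-channel (B, G, R) multipliers; leaving blue strongest gives
    the footage its Na'vi-like hue.
    """
    undistorted = cv2.undistort(gray_frame, K, DIST)
    color = cv2.cvtColor(undistorted, cv2.COLOR_GRAY2BGR).astype(np.float32)
    graded = color * np.array(tint, dtype=np.float32)
    return np.clip(graded, 0, 255).astype(np.uint8)
```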

