In the volume with gladiators, chariots and horses 

September 23, 2024

A round-table discussion on the virtual production challenges of Those About to Die. An excerpt from befores & afters magazine.

Roland Emmerich’s Peacock series, Those About to Die, tells the story of the Roman Empire’s games. The show relied on virtual production techniques to bring audiences into key locations, such as the 80,000-seat Colosseum, and to witness dramatic chariot racing.

Production filmed on a revolving LED volume setup at Cinecittà Studios in Italy. Visual effects supervisor and second unit director Pete Travers and visual effects producer Tricia Mulgrew collaborated with Dimension Studio, DNEG and ReDefine to deliver VP and VFX services, ranging from Roman environments to crowds and even pre-rendered horses.

In this round-table with befores & afters, Travers is joined by virtual production supervisor James Franklin (Dimension Studio) and visual effects supervisor Izet Buco (ReDefine) to break down the VP aspects of the show.

b&a: Pete, how were you and Roland Emmerich planning to use virtual production on the show? Tell me about the early conversations you had about this with him. 

Pete Travers (production visual effects supervisor and second unit director): Well, the first kinds of conversations–and these happened months before pre-production–were about determining where we would shoot. We ended up visiting Cinecittà Studios, the famous studio in Rome. There were tremendous advantages to shooting at Cinecittà, in particular because they had shot the HBO miniseries Rome back in the day. The recent Ben-Hur movie also came into play, because for it they had put a chariot racetrack at Cinecittà World, which is a Universal Studios-like amusement park. It meant there was a backlot of the Forum and the track from Ben-Hur.

Then, the pièce de résistance was that they had actually built an LED stage there that had been barely used. And it was big. The joke we made was that it was a little bit like Field of Dreams: ‘If you build it, we will come’. It was eight meters tall and had a total circumference of approximately 51 meters, in a U-shape. So, we got there and we were like, well, there’s nothing else that comes close to this place.

Now, there was the LED wall, but there were no guts, so to speak, for the wall. Tricia Mulgrew, my visual effects producer, and I had to aggressively find a partner not just for visual effects but for virtual production and previs as well. DNEG and Dimension were unquestionably our best choice. With this being a television program and with the appetite of Roland Emmerich to do 10 hours of television at Roland Emmerich ambitions, we needed a partner that was very, very willing to work within the budget that we had. I had a number of great conversations with DNEG 360/Dimension Studio managing director Steve Griffith. It was almost like we instantly had a shorthand, and that’s how we got started.

THOSE ABOUT TO DIE — Episode 101 — (Photo by: Reiner Bajo/Peacock)

b&a: What were some of the things you knew virtual production would be most useful for?

Pete Travers: Roland really wanted to do chariot racing on the LED wall. The idea was that we were going to film the charioteers riding on chariots with no horses in the foreground for the close-ups of the sequence, and then in the background we’d need horses. We had to figure out, how do we render the horses and how do we do crowds? The crowds, well, they have costumes. That means the costume department had to complete the design of their costumes well before they usually would, because we had to do a whole volumetric capture setup in London months before we started shooting. ReDefine would then have to render it all, including the stadium, and have it ready for the wall.

It really takes the entire typical schedule of film production and completely reorders it. Typically in CG when you say, ‘We need this quick,’ that means a week or two. With virtual production, it means, ‘We need it in an hour.’ The only way to do that was to manufacture calendar time, and there were some pretty harrowing moments. One thing James had to deal with right out of the gate was that we had to build another LED wall because we weren’t ready on the main wall. It came down to actor availability with Anthony Hopkins. He was only available much earlier on, so we had to build a smaller wall for his scenes. It was like, ‘Welcome to the show, you’re behind already.’

James Franklin (virtual production supervisor, Dimension): [laughs] I think with hindsight, though, that was actually quite good because it was a good dry run for the big wall. I think if we’d gone straight into the big wall, it would’ve been daunting straight off the bat. So we were lucky that we first had a go at the smaller wall and got our feet wet. The smaller wall had a resolution of approximately 6K. It was a flat wall and we primarily used it for views outside of windows and off balconies. It provided a lot of lighting to the set itself. These sets are mainly made of marble and stone so you get a lot of bounce light. We were stopping down on the wall because the ISO was so high that the wall was just blowing out.

Pete Travers: That’s an important point in that virtual production has a very important relationship with the camera and the camera that you pick. We were testing a lot of cameras, but the Sony VENICE 2 blew everything else out of the water when it came to its low light capabilities. We wanted the walls to do the work–the interactivity–with the shiny marble. When you typically do this kind of environment work in post you usually just throw up a bluescreen and you don’t get any of that subtle bounce/interactive light that you would get on a VP stage. So, there’s some things that would never look as good if we threw up a bluescreen and just did it as a comp in post.

THOSE ABOUT TO DIE — Episode 102 — (Photo by: PEACOCK)

b&a: Tell me more about chariot racing.

Pete Travers: Well, we knew early on that there was no way that we were going to get real-time horses. So we had to pre-render those. For Izet, that was his trial by fire, I would say: making that work for the big wall, which was 16K. And then there was the crowd. On a typical VFX show, for designing something like the Circus Maximus, you have until about halfway into post to complete the design of any kind of major thing that you’re doing, whether it’s a creature or an environment. But here we had to have the design completed in pre-production and then rendered while we were shooting other things. Luckily, we had the good sense to push all the chariot racing stuff to the very end of the VP schedule.

Izet Buco (visual effects supervisor, ReDefine): To be honest, I never had a plan to do anything with virtual production. I’m a traditional visual effects supervisor. I went to Pete and Tricia’s office and they said, ‘Well, we need to do this in VP.’ So we had to design all the creatures and all of the Circus Maximus. We had to do crowd work with volumetric capture. All upfront. For the horses, we spent quite a bit of time scanning them. They of course needed to match the horses from the practical shoot. As Pete mentioned, we couldn’t bring the horses into Unreal and start moving things around. It had to be pre-rendered, and the challenge was that it was 16K. Also, it had to be rendered as a cylindrical or lat-long projection because the whole wall was built as a U-shape. And for the crowds, it all needed to be loopable so that it worked at any point in a take.
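The cylindrical mapping Buco describes can be sketched in a few lines. This is an illustrative geometry only, not production code: the 16.2 m radius simply back-solves the quoted ~51 m circumference over an assumed 180-degree U, and the pixel dimensions are placeholders.

```python
import math

def wall_point(px, py, width=16384, height=4320,
               arc_deg=180.0, wall_height_m=8.0, radius_m=16.2):
    """Map a pixel of a cylindrical (lat-long style) render to a 3D point
    on a U-shaped LED wall.

    The horizontal pixel axis sweeps the wall's arc; the vertical axis
    spans its physical height. A flat-projection render would stretch
    when wrapped around the curve, which is why the content is rendered
    cylindrically in the first place.
    """
    theta = math.radians((px / width - 0.5) * arc_deg)  # azimuth along the arc
    y = (py / height - 0.5) * wall_height_m             # height on the wall
    return (radius_m * math.sin(theta), y, radius_m * math.cos(theta))
```

The center pixel lands straight ahead on the wall, while the leftmost column lands 90 degrees around the arc, which is exactly the wrap a flat render cannot represent without distortion.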

The problem becomes, if you try to make a loop and the clips don’t match, that introduces all kinds of problems. The crowd had certain kinds of clothing and we designed particular tools for the clothing and the kinds of reaction to the chariots, where they might stand up and clap. It was quite tricky to nail but we managed to fill that whole stadium with real-time crowds.
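A common way to make a clip loopable without a visible jump is to crossfade its tail into its head, so the last frame flows straight back into the first. A minimal NumPy sketch, with an assumed blend window and frame layout rather than the show's actual tooling:

```python
import numpy as np

def make_loopable(frames, blend=24):
    """Return a seamlessly looping clip of length len(frames) - blend.

    The clip's tail (last `blend` frames) is faded into its head, so the
    final output frame leads directly back into the first: a crowd loop
    then works at any point in a take.
    """
    assert len(frames) >= 2 * blend, "clip too short to crossfade"
    loop_len = len(frames) - blend
    out = frames[:loop_len].astype(np.float64)
    # Per-frame crossfade weight, broadcast over whatever a frame holds
    # (pixels, joint angles, etc.).
    w = np.linspace(0.0, 1.0, blend).reshape((blend,) + (1,) * (frames.ndim - 1))
    out[:blend] = (1.0 - w) * frames[loop_len:] + w * frames[:blend]
    return out
```

On a linear ramp of values, the loop's first frame picks up exactly where its last frame left off, which is the property that hides the loop point.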


James Franklin: For the crowds, we had 90 actors in London and out in Rome, and we captured 500 individual volumetric performances on the Polymotion Stage (a Dimension/MRMC partnership). For the actual Circus Maximus, which is live in Unreal, we had 32,000 animated characters. Normally in VFX, you have cards with the performance on them and you offset the timings so that the crowd agents all look different. Instead, because we shot volumetrically, we were able to extract the normal maps from the volumetric captures and apply those to the cards, which means we could relight the cards in real-time. So they’re self-shadowing, and if you change the time of day, they look correct in terms of lighting–that was a new thing to us. We’ve done some tests since then and we’ve got that up to 80,000 characters, so we can fill a large stadium quite comfortably.
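The timing-offset trick Franklin mentions can be sketched as below. This is a hypothetical helper, not Dimension's implementation; hashing the agent ID is just one deterministic way to scatter start frames across a crowd.

```python
import hashlib

def agent_start_frame(agent_id, loop_frames):
    """Deterministic playback offset for one crowd-card agent.

    Hashing the agent ID scatters start frames roughly uniformly, so
    thousands of cards playing the same handful of performance loops
    never move in lockstep, and the offsets are stable from frame to
    frame and from run to run.
    """
    digest = hashlib.sha256(str(agent_id).encode()).digest()
    return int.from_bytes(digest[:4], "big") % loop_frames
```

Because the offset depends only on the agent ID, a crowd rebuilt for a reshoot or a later VFX pass desynchronizes the same way every time.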

What we did in VP had to also match with what Izet’s team was doing later in VFX, because when you see the wides and the aerials, the fans watching the racing were in different factions. So they were grouped in terms of colors but not religiously grouped because some people venture into different areas.

Pete Travers: I had the opportunity as second unit director to direct all of the chariot race stuff, the practical stuff around the track. And of course, from all the nightmares that I’ve worked on in the past where you don’t have the vital reference that you need to do something in post, I’m like, okay, well, while I’m shooting the photography that is going to go into the show, I’m also shooting reference of horses charging with chariots, because I knew that Izet was going to need that reference, too.

The dynamics of the riding turned out to be the toughest thing. It’s always surprising because you’re thinking, okay, all the horses are all going to be super coordinated. But then we get there and we’re watching it, and it’s chaos. Those horses are bumping into each other, some horses are biting other horses and dirt’s flying everywhere. It’s a mess, but it looks really good. All of that stuff for VP had to be synthesized and mimicked and it worked.

Izet Buco: It was always great to have real reference for what you’re matching to. We followed a traditional process for the VFX by building the horses, going from bones to groom. Then it was just about how fast we could build them and render at such high resolution. We knew upfront where the priority was, so early on we could start spreading teams across environments, creatures and effects.

James Franklin: That’s a good point actually. On a wider note–and Pete and Tricia were very clear about this from the start–you are one team. There was a good chance that Izet was going to go and build environments that we were building in Unreal and vice versa. We also had a fantastic production designer, Johannes Mücke, who was designing interiors as well. But we didn’t want everyone building things twice in their little silos. We live in a digital age where we can quite easily share assets. So, you have to have that oversight to say, okay, if Johannes has built a fantastic interior, which he did on one of the scenes, can we use it? Why would we go off and rebuild it?

Pete Travers: Yes, and in terms of the chariot racing, which is the pre-rendered scene, the vast majority of the rest of the work involves Unreal and doing things with a Rome model. We acquired a Rome model from a company in Germany and quickly handed it off to DNEG/Dimension so that they could start using it. I would say there was a tremendous amount of modification to it. But nevertheless, it gave us a great head start. We had to figure out ways to manufacture calendar time and we knew if we didn’t do something like this–if they were designing Rome from scratch–we wouldn’t have made it. Then of course we also gave it to both Izet and James so that James could start getting these environments ready in Unreal on the wall.

There is another component of the VP approach and the rescheduling that’s worth mentioning, and that is plate shoots: you typically do them at the end of production, but all of our plate shoots had to be done before we started shooting. Tuscany was our key landscape location for Ostia Harbor, which is Rome’s main harbor, but the real Ostia is not what you want to shoot. You need to find some pristine area that you can add ancient Roman buildings to. So the scheduling of the plate shoots and the value of the plate photography were monumentally important, because they let you lean more and more on plate photography rather than relying completely on Unreal. In fact, there are some shots in the show that I don’t think people will realize were shot indoors.

Johannes the production designer had built this beautiful tent for the scene in episode 10. I am looking at it going, ‘That just totally looks like it’s outside.’ And that was actually a very important conversation Roland and I had at the very beginning, saying, let’s try to rely on as much plate photography as we can and put it on the wall because if you shoot it right and you get it right, then it works great.

The trick with it, and this is the big part about VP and where the advantage of VP lies, is magic hour, I think. Even that term is a lie, because magic hour lasts for about 15 minutes. It’s not an hour. But if you can shoot these plates and get these things, then you can walk into magic hour and magic hour becomes magic day; it can last all day long. That being said, we also heavily relied on HDRIs, and Dimension/DNEG 360 built this awesome cloud mover tool. That was proprietary, right, James?

James Franklin: Yes, because we had these beautiful skies with amazing clouds in them and we didn’t want them to be static. So we came up with a way of using flow maps and various other techniques to very gently animate the clouds in a believable way, but more importantly on a loop so that when we were shooting, we didn’t have to go, ‘Oh, sorry, you have to cut there because we’ve had a jump in the clouds.’
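The flow-map approach Franklin describes is a known real-time trick: two distortion passes slide the sky texture along a flow map half a cycle out of phase, and each pass is faded out at the moment it snaps back to zero distortion. A schematic of the blend weights, as an assumption about the general technique rather than Dimension's actual tool:

```python
def flow_weights(t, cycle=30.0):
    """Blend weights for two phase-offset flow-map passes at time t (seconds).

    Each pass distorts the texture from zero to full over one cycle, then
    resets. The reset is invisible because a pass only resets when its
    weight is zero and the other pass carries the image, so the clouds
    drift forever without a visible jump.
    Returns ((phase_a, weight_a), (phase_b, weight_b)); weights sum to 1.
    """
    phase_a = (t / cycle) % 1.0
    phase_b = (phase_a + 0.5) % 1.0            # half a cycle out of phase
    weight_a = 1.0 - abs(phase_a * 2.0 - 1.0)  # 0 at reset, 1 mid-cycle
    return (phase_a, weight_a), (phase_b, 1.0 - weight_a)
```

Because the weights are periodic in `t`, the result is exactly the kind of endless, cut-safe loop described above: the camera can roll through the loop point without anyone calling cut.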

It was essentially a two-and-a-half-D scene. We had these HDRIs being animated in Unreal. We had some 360-degree photographic captures that Pete’s team had gone out and captured for us, placed in front of the sky. In the mid-ground, we added 3D geometry, and things like the sea as well, to give it parallax, and then Johannes’ set dressing in front. So you had all these layers and it didn’t just look like a flat plate.

Get the full interview in the print magazine.
