How that stunning ‘House of the Dragon’ series one finale dragon fight was brought to life with new approaches to virtual production.
Season one of House of the Dragon, the prequel to Game of Thrones, ends with a vicious aerial dragon battle between Lucerys (riding Arrax) and Aemond (on Vhagar).
The dynamic sequence was heavily previsualized by The Third Floor and then set up for filming on the LED wall ‘V’ stage at Leavesden, where actors rode a pre-programmed motion base against virtual environments built by Pixomondo (which also crafted the final dragon animation and visual effects).
Lux Machina operated the stage on a custom branch of Unreal 4.27, managing color calibration for the LEDs and camera as well as the motion capture set-up, a Vicon-based camera system with custom tracking devices.
During the shoot, too, actors were pummelled with rain, smoke and wind, something rarely attempted on an LED wall set.
To find out more, befores & afters spoke to series visual effects supervisor Angus Bickerton about a revamped motion base, and matching to pre-built virtual environments on the LED wall.
b&a: For that final episode dragon battle, how did you come to the approach of shooting actors on motion bases and using the LED wall?
Angus Bickerton: Well, one of the things we did here was re-design the motion base that had been used on the original Game of Thrones. In fact, when I first joined, I had the good fortune to do a couple of calls with Joe Bauer, who was the visual effects supervisor for the last six series and extremely kind in sharing his knowledge. Over the series they’d honed their methodology for shooting dragon riding sequences, and I learnt a lot about their process.
One of the key things we tried to do was produce longer moves. This was at [executive producer] Miguel Sapochnik’s request–Miguel had obviously been a director on that series–and he said he just hated the fact that you turn up and you go, ‘Okay, we’re about to do shot six…great, got that, thank you very much. And moving on.’ It was all pieces of shots. He said, ‘I want to do continuous shots. I want to do one-minute-long takes where the actors get to feel the ride. I want them to experience the ride much more.’
Sometimes to achieve the moves, it’s an aggregate of motion base moves and camera moves as well. But Miguel said, ‘No, I want to put the emphasis on the motion base.’ So in very early discussions with Mike Dawson, special effects supervisor, we talked about a different motion base and we came up with a new configuration.
Previously it had been an eight-point hydraulic base. We went to a servo motor driven configuration, which gave us a lot more range of movement. We had plus or minus 45 degrees on the roll. I think we had about 35 degrees pitch forward, about 45 degrees pitch back. And then we put that on a slew ring so that we could rotate as much as possible as well. And we tried to make the ride experience as physical as possible for the actors.
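To put some numbers on that envelope, here is a minimal sketch (in Python, with hypothetical names throughout; the production’s actual control software isn’t public) of clamping a requested pose to the rough limits quoted above, with yaw left free thanks to the slew ring.

```python
from dataclasses import dataclass

# Approximate envelope quoted in the interview: +/-45 degrees of roll,
# ~35 degrees of pitch forward, ~45 back, and free yaw via the slew ring.
ROLL_LIMIT = 45.0
PITCH_FORWARD_LIMIT = 35.0
PITCH_BACK_LIMIT = 45.0

@dataclass
class BasePose:
    roll: float   # degrees, positive = bank right
    pitch: float  # degrees, positive = nose down (pitch forward)
    yaw: float    # degrees, unconstrained thanks to the slew ring

def clamp_pose(target: BasePose) -> BasePose:
    """Clamp a requested pose to the rig's mechanical envelope."""
    roll = max(-ROLL_LIMIT, min(ROLL_LIMIT, target.roll))
    pitch = max(-PITCH_BACK_LIMIT, min(PITCH_FORWARD_LIMIT, target.pitch))
    return BasePose(roll=roll, pitch=pitch, yaw=target.yaw % 360.0)

# A hard bank from the previs that exceeds the rig gets pulled back in:
print(clamp_pose(BasePose(roll=60.0, pitch=-50.0, yaw=370.0)))
# BasePose(roll=45.0, pitch=-45.0, yaw=10.0)
```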
b&a: How did the LED wall add to the mix here?
Angus Bickerton: On top of that we then took it a step further and stuck it in a volume stage in Leavesden. It was a big build, it was a big operation, it was a big investment on the part of Warner Bros. And again, thank the Lord that they had a head of virtual production called Ryan Beagan. He came in and he oversaw the construction of the volume and the management of it.
Then we put the motion base in the middle of it and figured out an optimum position for it. We were then talking early on to Pixomondo, who are Game of Thrones veterans, and I know they had their own volumes in Vancouver and Toronto. So it seemed like the perfect coming together, really: they could potentially be a post house, and we also needed to bring someone on early to build our environments for Unreal Engine.
b&a: Tell me about those environments.
Angus Bickerton: Yes, that’s the next thing–we tried to create interactive environments for the riding sequences. One thing that was always reported was that they were plagued by the difficulty of trying to make the lighting interactive. They learned that over the course of the eight series of Game of Thrones. One of the big problems, of course, is moving key lights. If someone’s banking to the left, or you want to suggest that not only are they banking but they’re arcing around 90 degrees, then you have to move your key light relative to the artist.
So, how do you do that? Traditionally you can put a light on a track, you can put a light on a crane, you can do a set of sequenced lights. We hoped that by putting it in a volume we could create light panels and fly them around at any speed we wanted, unhindered by any physical constraints.
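As a rough illustration of why a volume helps here (this is the generic light-card idea, not the production’s actual Unreal setup), if the virtual world keeps a fixed sun direction while the dragon arcs around, a key-light card on the walls only has to counter-rotate by the dragon’s heading:

```python
def key_light_azimuth(sun_world_deg: float, dragon_heading_deg: float) -> float:
    """Where the key-light card should sit on the volume walls.

    If the world keeps a fixed sun direction while the dragon arcs,
    the card counter-rotates around the rider by the same amount.
    Angles are in degrees, measured around the stage's vertical axis.
    """
    return (sun_world_deg - dragon_heading_deg) % 360.0

# A dragon arcing through 90 degrees sweeps the card 90 degrees the
# other way, at whatever speed the animation demands:
for heading in (0, 30, 60, 90):
    print(heading, key_light_azimuth(sun_world_deg=120.0, dragon_heading_deg=heading))
# 0 120.0 / 30 90.0 / 60 60.0 / 90 30.0
```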
On top of all that, you don’t shoot a single Game of Thrones/House of the Dragon sequence without smoke or some sort of physical atmos. And for episode 10, we wanted smoke and driving rain. That sends everyone into conniptions, ‘Whoa, whoa, wait a minute, you can’t do that in a volume. One, you’re going to coat our screens with oily smoke. Two, you’re going to get our electric LEDs wet.’
And then on top of that, of course we did all the other scenes where we always had flambeaus as well. So the next concern was that flambeaus would be pockets of heat that would start to either melt the LEDs or leave blackened stains on the screen. We had to do extensive testing for that to really make sure that we were not going to be leaving the screens in an unusable state at the end.
b&a: What did you end up doing for the rain and smoke? I feel like you just contained it as much as possible.
Angus Bickerton: We did. In the case of flames, the good news was that we had a ceiling panel that was broken into about six or seven sections. And they were all on programmable winches. They could either be varied in height or they could even be angled if necessary. If we had a big brazier or a large flame source and we were worried about a heat spot, we could lift the ceiling panels up in that area to get a greater gap. And in fact, special effects were charged with doing tests where we literally turned on a brazier and left it going for six hours and we put white cards at different heights above it to see what effect there would be on those.
From those tests we determined a height and worked out things that we could do. The downside is that if you have to do that, it starts to defeat the whole principle of the volume, which is that you’ve got this sort of omega shape of screen with a capping LED ceiling to create a wraparound environment. If you’ve suddenly got to move the ceiling up to create a gap or to pull yourself away from the heat, suddenly it’s not quite working as well as you’d like. We’d literally be on set going, ‘Raise that ceiling, it’s not in shot.’ There was a constant back-and-forth on that.
In the case of wind and rain, wind obviously is not so much of a problem, but for driving rain we had to configure our shots on the motion base so that we were blowing the rain along the longest axis, as it were. Then, in addition to that, we would have additional wind machines set up along the bottom of the screen to blow the rain away from the screen as well. If we were blowing across at 90 degrees, there’d be a line of fans on the floor blowing perpendicular to that. There’d be a line of us standing around the screens going, ‘No, I’m not feeling any rain.’
b&a: Just to go back to the motion base and filming the action, obviously for many years now previs and techvis has often driven the motion of a motion base and the movement of cameras on set. But then when you’ve got the added complexity of what might be shown in the LED wall panels as a virtual environment, what approach did you take to previs or techvis of the sequence?
Angus Bickerton: One legacy that I picked up from Game of Thrones was that by the end of series eight there were greater and greater demands on visual effects, and consequently visual effects said, ‘You must do things the way we want,’ which is quite unusual these days. Often productions drive the way things are done more so than visual effects. It meant that when I joined, production said they already knew the rules: ‘Yes we will storyboard, yes we will previs, yes we will sign off on that previs and that will be what we will shoot on the motion base.’
I always try to find a balance where visual effects are sensitive to the fact that directors can’t always decide things so early. In fact, I challenge anyone to be able to guess perfectly what the end result of a show is going to be. It evolves. So you need to have some latitude for the filmmakers because we’re all figuring it out as we go along.
What we did was we storyboarded, we previs’d with The Third Floor, who were also veterans of Game of Thrones. Then once we got the previs where we liked it, we would then cut it in the cutting room with our editors, and once we liked that, then the methodology was that we would send it to the post house that was going to ultimately do the shots.
By then, we’d got our dragon designs to a certain point. That was another thing: we had to sign off on dragon designs fairly early. The Third Floor had built their dragons, but they were fairly simplistic dragons for the previs. Ultimately, we then initiated the higher-res dragons with the post houses. In the case of episode 10, that was Pixomondo again. We exported all the scene files, which had been done in Maya and Unreal by The Third Floor, and gave them to Pixomondo. Pixomondo then fine-tuned the animation to make sure that the movement of the dragons was truly being conveyed to the actors when they were on the motion base.
Once Pixomondo had fine-tuned the animation, they then gave the scene files back to Third Floor. And then Third Floor translated that into a motion base move combined with the camera move.
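A loose sketch of that translation step, under one big assumption (the real Third Floor/Pixomondo toolchain isn’t public): the base can rotate but not fly across the stage, so the dragon’s world travel is discarded, its per-frame orientation becomes the base program, and the previs camera is re-expressed relative to the saddle so the framing survives.

```python
def to_base_and_camera(dragon_anim, camera_anim):
    """Split a previs dragon/camera animation into a motion base
    program plus a saddle-relative camera move.

    Per-frame dicts, e.g. {"pos": (x, y, z), "roll": r, "pitch": p,
    "yaw": y} for the dragon and {"pos": (x, y, z), "rot": (rx, ry, rz)}
    for the camera. Hypothetical layout, purely for illustration.
    """
    base_program, camera_program = [], []
    for dragon, camera in zip(dragon_anim, camera_anim):
        # The base can't translate across the stage: keep orientation only.
        # (The travel is sold by the screens, wind and rain instead.)
        base_program.append(
            {"roll": dragon["roll"], "pitch": dragon["pitch"], "yaw": dragon["yaw"]}
        )
        # Camera position relative to the saddle rather than the world;
        # a full version would also rotate this offset into the saddle frame.
        rel = tuple(c - d for c, d in zip(camera["pos"], dragon["pos"]))
        camera_program.append({"pos": rel, "rot": camera["rot"]})
    return base_program, camera_program
```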
b&a: For the virtual environments on the LED walls, were they retained as ‘previs’ level environments or ‘final pixel’ ones?
Angus Bickerton: If we take episode 10 as an example, I asked Pixo to create a 30-second travel through stormy clouds, giving us a 90-degree view. We then took that 90-degree view and flopped it to make a 180-degree view. And then we flopped that 180-degree view to get, effectively, a full 360.
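The flopping trick is simple enough to show in a few lines. A toy NumPy sketch, assuming the 90-degree render arrives as a flat image strip (whatever latlong or cubemap format the stage really used isn’t specified); mirroring at each seam keeps the joins pixel-continuous:

```python
import numpy as np

def expand_to_360(tile_90: np.ndarray) -> np.ndarray:
    """Mirror a 90-degree cloud render out to a full 360-degree wrap.

    tile_90: (height, width, 3) array, one frame of the 30-second
    stormy-cloud travel covering a 90-degree view.
    """
    # Flop (horizontal mirror) takes 90 degrees to 180...
    half = np.concatenate([tile_90, tile_90[:, ::-1]], axis=1)
    # ...then flopping the 180 closes the loop at 360.
    return np.concatenate([half, half[:, ::-1]], axis=1)

frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in render
panorama = expand_to_360(frame)
assert panorama.shape == (1080, 1920 * 4, 3)
```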
For every shot in that sequence, we would always start off by doing it with our 360-degree environment in the inner frustum. We usually shot with two cameras when we were doing motion base flying sequences. We were able to track two cameras with the on-set tracking system. And then obviously you’ve got an inner and an outer frustum. So we had the overall scene playing in the outer frustum, and then we had the camera tracking data driving the scene in the inner frustums. If the cameras overlapped, we’d designate an A camera, and that would be the camera that overrode the other one.
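That overlap rule amounts to a priority list. A small sketch of the logic as described (the interface is hypothetical; on a real stage this sits inside the operator’s nDisplay tooling): the A camera keeps its inner frustum, and any camera whose wall footprint collides with an already-active one falls back to photographing the outer-frustum scene.

```python
from typing import Callable, List

def active_inner_frustums(
    cameras: List[str],
    overlaps: Callable[[str, str], bool],
) -> List[str]:
    """Decide which tracked cameras get an inner frustum this frame.

    cameras: camera ids in priority order (A camera first).
    overlaps: returns True if two cameras' wall footprints intersect.
    """
    active: List[str] = []
    for cam in cameras:
        if any(overlaps(cam, other) for other in active):
            continue  # the higher-priority camera wins the overlap
        active.append(cam)
    return active

# Two cameras whose frustums collide: only 'A' renders an inner frustum;
# 'B' shoots the outer-frustum environment through the same wall.
print(active_inner_frustums(["A", "B"], overlaps=lambda a, b: True))  # ['A']
```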
Now we could see the shot through the camera, we could blow the rain past the camera and we could have smoke atmos in the volume as well. We ended up putting huge amounts of Vaseline on the lens as well. Just nose grease and Vaseline, putting it on the lens to smudge things. Once we were getting into the mood of things and started to get the feel right, then we would do takes with blue in the inner frustum. We would have the interactive environment in the outer frustum, but we’d have blue in the camera frustum.
We would shoot bluescreen if it was a big shot where we started super wide and ended up moving into the actor, because it was going to be a big visual effects shot. But we were hoping that we might be able to get rider shots in-camera as much as possible. The problem with that always is that you have to go with a pretty darn long lens, tight on an actor, to not see the wings flapping in the side views or the tail flicking in the back views.
I knew what might happen, too: editorial would go, ‘For performance reasons, we’d like this take with the actor with the background in the frustum,’ and we’d have to roto. But by definition those would be four to five-second shots, and doable.
b&a: It’s interesting where LED wall shoots have gone, which is maybe that to begin with, the goal was to try and do in-camera visual effects with them. Now it’s still that, but also the fact that you’re getting great interactive light, the actors and crew kind of experience the sequence on set and you’re getting the benefit of that. But I think we’ve moved on somewhat from saying we’re only trying to get in-camera shots.
Angus Bickerton: You’re absolutely right. It can be a hard thing for the bean counters to get their heads around. ‘What, so we’re going to shoot in the volume and then you’re still going to do a little bit of effects to tweak it? We haven’t saved anything, have we, because you’re still doing visual effects?’ However, budgetarily, when we set off for these sequences, we said: because we’re shooting it in the volume, are we going to save 100% of shots? No. We figured we might save 20% of the shots; the other 80% might still need a tweak of some sort. Not just because you shot off the ceiling; you might want to brighten certain areas, or tweak something in the imagery, or make the flames just a little bit brighter. For a lot of that we had an in-house team.
So, I think you’re right. I think I prefer that as a methodology, to have a reasonable expectation of what the volume’s going to give you. Because for me, I still want to shoot things as much as possible in-camera. I want to try and get a lot of those shots for real. And if you can get 80%, 90% of the shot looking pretty darn good, there’s a ripple effect all the way down the line. The crew see the shots on the monitors. I think there are challenges for the actors, still, but they do see an environment, they’re not standing in a blue void. The editors see the shots as well and they cut in a different rhythm.
b&a: The choreography of the flying scenes was great. How did you tackle that?
Angus Bickerton: Well, firstly, dragons have pretty big wing spans, and if you’re doing air-to-air and you’re not on the dragon, you should be on a long lens because you’re quite far away. And if you’re doing a rider shot and you’re not on the dragon, you’ve got to be on a 100mm lens plus. If you’re on the dragon, then you’re adopting more of a GoPro language, where you need to be close and wide and fixed relative to the rider, and then the background needs to move. So we tried to follow those rules.
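Those rules reduce to a small lookup. A sketch, where everything other than the quoted 100mm-plus is purely illustrative:

```python
def lens_for_shot(on_dragon: bool, rider_framing: bool) -> str:
    """Rule-of-thumb lens grammar from the interview, as a lookup.

    Off the dragon you're far away, so long lenses; on the dragon
    you're in GoPro territory: close, wide and fixed to the rider.
    """
    if on_dragon:
        return "wide prime, rigged close and fixed relative to the rider"
    if rider_framing:
        return "100mm or longer, tight enough to lose the wings and tail"
    return "long telephoto for air-to-air coverage"

print(lens_for_shot(on_dragon=False, rider_framing=True))
```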
b&a: What were you finding was working well in terms of the animation language of flying and battling dragons in the episode 10 sequence?
Angus Bickerton: That was another thing that was impressed upon me: how important it was to get the character animation sorted out very early on. Because obviously if you just put everything into post, you defer that decision a little bit. But having to make sure that our motion base was doing appropriate moves meant that we did have to sort out some character things early.
Now, our dragons are wild, they’re feral, not even semi-trained. They have this connection with their riders, but other than that they’re really quite feral. This means that they fly in a different way. One of the things that I loved about working with Pixo, and I had such a great experience with them, especially with Sven Martin, who is also a veteran of the series. They get it. They know the series, they know how it works.