A visual breakdown of Weta Digital’s killer shots from the final season.
Weta Digital handled a lot of complex visual effects work in the final season of Game of Thrones, totalling some 600 shots. To get a sense of how some of that broke down, befores & afters talked to visual effects supervisor Martin Hill about three particular aspects of the work: creatures, fire and atmospherics. Come with us on a visual journey through this work to find out how it was accomplished on a very large scale.
1. Creatures: the death of Rhaegal
Martin Hill (visual effects supervisor, Weta Digital): Originally the actual death shot in episode four was going to be three separate shots, but after we talked about it with [one of the production visual effects supervisors] Stefen Fangmeier and our animation supervisor Dave Clayton, we came to this idea that if we combined all the shots into this really sweeping, graceful, orbit move around Rhaegal as he gets hit by these arrows, it could have a lot more impact.
As the graceful camera move is happening it just gets punctuated by the arrows hitting him and then an extra design piece we added, which was his last gasp. This is because the first arrow hits him in the chest – and we have seen Drogon get hit in the chest before in the Loot Train sequence in season seven – so we know that doesn’t necessarily kill the dragons, but the one through the neck is very much the death blow.
We thought it would be great, as the camera swoops around towards his head, to have him cough up the blood that’s filled his lungs from the first arrow – you see it pulse up his neck and just spray right into the camera as it comes around. I think it gives real punctuation to the fact that this has actually just taken him out and killed him, and then he just folds up and becomes limp as he crashes into the water.
2. Atmospherics: the tendril storm and the dragons take flight
For episode three there were some shots where we have the tendril storm – the big wall of storm that comes in when the Night King arrives. It has this almost ‘wave’ look that we designed to really have it towering over the armies of the living.
One of the things we thought was, well, if Drogon is flying and he is flaming the ground and of course Wights are flying everywhere within the Massive army sims, we can use that light source to uplight the dragon and cast this enormous shadow of the dragon onto the underside of the big storm wave.
In episode three, there was quite a lot of atmospherics work. We had a baseline wind direction – wind was coming towards Winterfell from the north – and as soon as we had tracked the camera for each shot, we ran three simulations of the general mist and fog at varying speeds, vorticity and turbulence, plus a snow pass that lighters used in the same direction. That was pretty much automatic at the start of the shot, so when the comper came to the shot, they had all these things ready.
The comper would look at the different speeds and we would pick the one that was most suitable. That actually worked in a lot of cases. Often you want to art direct it, so the lighter would move the snow sim, and then all the mist and fog – which is also run through Eddy [see more on Eddy below] – would get updated automatically for the comper, who could then just pick another speed or adjust the speed themselves. Because all that extra fog that’s constantly moving through the environment is lit by the environment lighting, and not necessarily by any of the active shadows, it can be re-lit on the fly by [our proprietary deep compositing tool] Shadow Sling and the other volumetric tools we use in deep compositing.
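As a rough sketch of the kind of automated per-shot setup described above – a baseline wind direction fanned out into a few ready-made simulation variants for the compositor to choose from – consider the following. All names here are illustrative assumptions, not Weta Digital's actual pipeline API.

```python
from dataclasses import dataclass

@dataclass
class AtmosVariant:
    """One mist/fog simulation variant queued automatically per shot."""
    name: str
    wind_dir: tuple   # shared baseline direction (here: blowing from the north)
    speed: float      # wind speed multiplier
    vorticity: float
    turbulence: float

def build_variants(wind_dir=(0.0, 0.0, -1.0)):
    """Return three mist/fog variants at increasing speed/vorticity/turbulence.
    A matching snow pass would reuse the same wind direction."""
    presets = [("slow",   0.5, 0.2, 0.10),
               ("medium", 1.0, 0.4, 0.25),
               ("fast",   2.0, 0.7, 0.50)]
    return [AtmosVariant(n, wind_dir, s, v, t) for n, s, v, t in presets]

# Kicked off as soon as the camera track lands; the comper just picks one.
variants = build_variants()
```

The point of the pattern is that the expensive decision (which speed looks right) is deferred to the compositor, while the setup itself is deterministic and needs no per-shot artist time.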
3. Fire: dragon breath, and how Melisandre lit up those Dothraki swords
For our fire, we used a combination – some shots were plate fire, some shots were CG fire, some were a combination. Even when we used the plate fire what we needed to do was a full simulation anyway for the destruction and impact and also for the re-lighting of the environment.
For example, in episode five, where we destroyed the boats and all the water is being kicked up, we needed to do a fire sim anyway to light up the interface between the water, the water spray, and the aeration and turbulence under the water as the flame goes beneath the surface – even though we were actually compositing a plate element of the fire in amongst the water layers.
One of the big things was just watching the big wave of fire that [Melisandre uses to] light up the swords of the Dothraki. As a CG piece, if they were Massive assets holding their swords up, we knew where each of the swords was. We knew we could simulate a flame starting up for each of those and illuminate the surroundings, but doing a similar thing for all the plate shots was actually quite a challenge.
The plates we were given for those guys would have maybe one or two swords that were on fire, and then a few other swords that were lit up by LED lights which turned on when they got to the right point in the progression. The rest were just prop swords. So, rather than having to track and matchmove every sword and then attach a sim to it, or simulate the situation like that, we used an optical flow tool to stabilize all the swords, then on a still frame painted all the ones that didn’t have LEDs up to something equivalent to the brightness of the LED swords. Then we could unwind that with optical flow and the brightness would travel around with each sword.
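The stabilize/paint/unwind idea can be shown with a toy example: warp a frame to a reference with per-pixel optical flow, paint the unlit sword bright on that one still, then push the paint back through the inverse flow so the brightness travels with the sword. This is a minimal NumPy sketch under assumed flow values, not the actual production tool.

```python
import numpy as np

def warp(image, flow):
    """Nearest-neighbour backward warp: out(p) = image(p + flow(p))."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    return image[src_y, src_x]

h = w = 4
frame = np.zeros((h, w))
frame[2, 1] = 0.2                     # dim prop sword at x=1 in this frame

# Flow mapping this frame onto the reference still (sword sits at x=2 there).
flow_frame_to_ref = np.zeros((h, w, 2)); flow_frame_to_ref[..., 0] = -1.0
stabilized = warp(frame, flow_frame_to_ref)       # sword now parked at x=2

painted = stabilized.copy()
painted[2, 2] = 1.0                   # paint it up to LED-sword brightness

# Unwind: inverse flow carries the painted brightness back to the frame.
flow_ref_to_frame = np.zeros((h, w, 2)); flow_ref_to_frame[..., 0] = 1.0
relit = warp(painted, flow_ref_to_frame)          # brightness back at x=1
```

The win is that the paint work happens once, on a single stabilized still, instead of being tracked and matchmoved sword by sword across every frame.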
We used a piece of software called Eddy, developed by a couple of guys who work here, which is essentially a fluid simulator within NUKE. Since we knew we had these bright swords moving around the frame, we could assign each of them a depth – the same way we would to add fog around these objects – and then use the brightness, the luminance, as well as its position, as a fuel source for the 3D solver within NUKE. What that meant was that the compositors could have complete control over the simulation. The simulation only took two or three seconds per frame, so there was no simulate-then-cache-then-render-then-see-what-it-looks-like cycle; they could see it pretty much on the fly.
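The mapping described here – screen-space luminance plus an assigned depth becoming a 3D fuel source – can be sketched as follows. This is a hypothetical illustration of the idea, not Eddy's actual API; the function and parameter names are assumptions.

```python
def luminance(rgb):
    """Rec. 709 luminance from linear RGB."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def fuel_emitters(pixels, depths, threshold=0.8, gain=1.0):
    """pixels: {(x, y): (r, g, b)}, depths: {(x, y): z}.
    Pixels bright enough to be LED/burning swords become 3D fuel
    sources placed at their assigned depth, with strength driven
    by their luminance."""
    emitters = []
    for (x, y), rgb in pixels.items():
        lum = luminance(rgb)
        if lum >= threshold:              # only the bright swords emit fuel
            emitters.append(((x, y, depths[(x, y)]), gain * lum))
    return emitters

pixels = {(10, 4): (1.0, 0.9, 0.7),   # burning sword
          (11, 4): (0.1, 0.1, 0.1)}   # dark background
depths = {(10, 4): 35.0, (11, 4): 120.0}
sources = fuel_emitters(pixels, depths)
```

Because the fuel field is derived directly from the plate's luminance, re-grading or repainting the plate immediately changes the sim inputs, which is what lets the compositor iterate in seconds rather than round-tripping through a 3D department.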
If they needed to make any adjustments or move the horses around, they could re-simulate, and the compositors also had control of how much wind or turbulence was in the fire elements themselves. In that way, for the close-up fire, we were able to get really good control of the flames of the arakhs. That worked for all the charging shots as well. You are taking into account the motion of the horse in one space – there’s a full 3D sim, but it’s powered by the plate or the augmented plate.