Behind the visual effects on season two with VFX supervisor Raymond McIntyre Jr., including shooting on a greenscreen court with rollerblades (!), developing a specific tracking marker approach, adding crowds, and dealing with film and even broadcast video stock.
The penultimate episode of season 2 of HBO/Max’s Winning Time: The Rise of the Lakers Dynasty includes a dramatic 1984 playoffs montage featuring the LA Lakers and the Boston Celtics which plays out essentially as a ‘oner’.
As players take shots or pass the ball around, the action shifts seamlessly between games inside the Lakers' Forum arena and the Celtics' Boston Garden. This was made possible by stitching several game-play plates together.
The game-play took place on a purpose-built stadium set on a Warner Bros. soundstage, surrounded by greenscreen and filmed by many camera operators, one of whom was on rollerblades.
The stitching, stadium and crowd builds and final compositing were overseen by visual effects supervisor Raymond McIntyre Jr. (Buf, one of several vendors on season two, handled this playoffs sequence).
Here, McIntyre Jr. tells befores & afters how the playoffs scene was designed and executed, and what the VFX supervisor also brought to the show for season two, such as a re-think of the tracking marker approach on the greenscreens and the lighting for the arenas.
We also dive into the challenges of shooting and compositing on film and 1980s-era broadcast camera footage.
b&a: From an overall point of view, effectively VFX needed to craft crowds and craft stadiums for the basketball matches. How were those matches in season two filmed?
Raymond McIntyre Jr.: It was not too dissimilar from season one. We built a basketball court on a stage at Warner Bros. We had the court floor, and then behind the basket hoops, we had four or five rows of seats. On one side of the court, we had three rows of seats, and on the other side we had eight. During shooting, we did not have enough extras to fill all those seats at any one time, so we had to move the extras around to best fit the cameras that were shooting on the tightest lenses. Many times we would just cover the seats with greenscreen due to lack of extras. Sometimes we were capturing the action with more than six cameras at one time, so it was a juggling act to move around the extras.
That approach was similar to season one, but one of the main things we did differently concerned the tracking markers. In season one they used laser tracking markers, which sounds like a great idea, but in practice it didn't work so well. You can move the lasers and position them so they're not in somebody's hair if you've got a near-static shot, which is nice. But the problem is, as soon as you move the camera, laser marks are too small to be seen by matchmoving software. The motion blur just kills them. This time around, we used one-foot-by-one-foot squares and patterns that were permanently attached through the fabric of the greenscreen so that they would last the entire season; otherwise, when you just stick them up there, they fall off in a week or two.
Next we did a geometry scan of our stage set build. My VFX team put tracking markers up once production built the basketball stage. Then we performed an in-house photogrammetry scan and solved for the geometry ourselves. This was done for several reasons.
First, we provided this geometry to all the VFX vendors to be used during post.
Second, my VFX team created the hero Forum geometry incorporating the stage scan described above and generated the final Forum .obj.
Third, with so many non-static production cameras, including a rollerblade cam skating through the court, I was concerned that once we got into post, it could be difficult to tell where the camera was physically in the arena and what part of the arena needed to be generated for a specific VFX composite. To solve this, my VFX team created markers of different shapes and affixed them in unique patterns to the 38’ tall, 360 degree greenscreen. I figured if you could see a particular pattern of the tracking markers later when looking at a VFX pull, then you could go to the LIDAR scan and say, ‘Oh, I see those four tracking markers here. I must be looking in that direction.’ It really helped with orienting yourself to the world.
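As a toy illustration of that orientation trick (the pattern names and positions below are purely hypothetical, not the actual stage survey), you could map each unique marker pattern to its surveyed position from the scan and recover a rough camera heading from whichever pattern is in view:

```python
# Minimal sketch (not production code): inferring a rough camera heading from
# which tracking-marker pattern is visible in a VFX pull. Pattern names and
# positions are made up; real positions would come from the stage scan.
import math

# Hypothetical surveyed positions (in feet) of a few unique marker patterns
# on the 360-degree greenscreen, relative to center court at the origin.
MARKER_PATTERNS = {
    "triangle-cluster": (60.0, 10.0, 20.0),   # x, y, z
    "double-diamond": (-55.0, 25.0, 18.0),
    "offset-squares": (5.0, -62.0, 22.0),
}

def approximate_heading(pattern_name, camera_xy=(0.0, 0.0)):
    """Return an approximate azimuth (degrees) from the camera position
    toward the wall section carrying the named marker pattern."""
    mx, my, _ = MARKER_PATTERNS[pattern_name]
    cx, cy = camera_xy
    return math.degrees(math.atan2(my - cy, mx - cx))

if __name__ == "__main__":
    # 'I can see the double-diamond pattern, so I must be looking roughly this way.'
    print(f"Roughly facing {approximate_heading('double-diamond'):.0f} degrees from court center")
```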
b&a: When I talked to the team on the first season, I think one of the challenges was definitely working with different film and video stock. What challenges did you face for season two, and what different cameras and stock were involved?
Raymond McIntyre Jr.: We shot on everything from a RED at 600 frames per second to 35mm film for the majority of the series. We also shot on 16mm and 8mm film. There was even some footage shot on the 525 Ikegami sports broadcast cameras of the day and on a VHS camcorder, as that was the desired look.
The film footage was interesting. I've been around and doing this a long time, from the days of film prior to digital cameras. I thought I remembered that film was great! But you quickly realize today how far digital cameras have progressed and how bad film can be in comparison. The exposure latitude you don't get with film, and the heavy noise and grain, are very noticeable. So, shooting on film certainly made pulling keys harder, even more so when we shot on the 16mm and 8mm stocks.
Thankfully, we didn't do very many visual effects shots with the 525 Ikegami cameras, but we did some. And boy, when you start looking at that and try to pull a key, whoa. Even if you could pull a good key, your keyed edge might look different from the real edge captured by the video camera. So now you have to dirty up your edge to make it match the practical 525 edges.
Frequently, we would have a person partially on greenscreen and partially off it due to camera angles. If the practical edge below the greenscreen has this ugly kind of edge thing going on (due to the source material), and the keyed green edge up higher does not, the composite doesn't look good or natural. So we would often have to add a matching ugly edge treatment to the keyed edge in order to make it match the practical, non-greenscreen edges. All the varieties of original source material made this a challenge and added an additional step to the VFX compositing process.
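To illustrate the general idea (this is a generic NumPy/SciPy sketch, not the show's actual comp setup; the function name and parameter values are assumptions), the snippet below softens and adds chatter to a clean keyed matte so its edge character sits closer to a degraded analog-video edge:

```python
# Illustrative sketch only: roughing up a clean keyed alpha edge so it sits
# better next to practical edges from a soft, noisy analog source.
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade_edge(alpha, softness=1.5, chatter=0.08, seed=0):
    """alpha: float matte in [0, 1]. Returns a softened, slightly noisy matte
    whose edge character is closer to a low-res broadcast-video edge."""
    rng = np.random.default_rng(seed)
    soft = gaussian_filter(alpha, sigma=softness)             # soften the keyed edge
    edge_zone = np.abs(gaussian_filter(alpha, 1.0) - alpha)   # isolate the edge band
    noise = rng.normal(0.0, chatter, alpha.shape)             # per-pixel edge chatter
    return np.clip(soft + noise * edge_zone, 0.0, 1.0)

if __name__ == "__main__":
    matte = np.zeros((64, 64), dtype=np.float32)
    matte[:, 32:] = 1.0                      # hard vertical edge
    print(degrade_edge(matte)[32, 28:36])    # softened, chattery transition
```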
Ultimately, there was a tremendous amount of extra rotoscoping done. Cameras were frequently pointing up off the greenscreen and seeing the stage ceiling.
b&a: Which companies did you bring on as your vendors for this season?
Raymond McIntyre Jr.: There were five interior arena and crowd vendors. First, our internal visual effects department built the hero geometry for the Forum, which was shared with all vendors. Since we didn't see the Boston Garden until a later episode, FuseFX modified the VFX-supplied Forum model into the Boston Garden .obj. This Boston Garden model was then shared with all vendors as well.
I deliberately staggered episodes and vendors. For example, Buf worked on the second episode and then didn't work on another episode until episode six. I would give a vendor a lot of shots in one episode and then not give them shots in the episodes in between. If any one vendor got behind, they could only make that one episode late, with no effect on other episodes.
The five vendors were Buf, Pixomondo, Pixel Magic, PFX and FuseFX/Folks VFX, all of whom did a fantastic job and delivered on schedule.
b&a: How were you approaching the interior stadium build this time around?
Raymond McIntyre Jr.: On the first season, on the side of the court with six or eight rows, production built a big practical gray wall immediately behind the last row of seats. I'm not exactly sure why that was the case. I think the idea was that you didn't have to blend CGI people and CGI seats immediately behind practical people and seats on stage. The wall acted as a divider, so there was no direct comparison of practical to CGI.
My issue was that the arenas in season one ended up looking more like high school arenas, because at a high school the seats go right up to a wall and the seating stops. It took me out of every stadium shot in season one to see this big gray wall immediately behind the eighth row. The added CGI seats above felt disconnected from the seats below. Seating sections in NBA arenas of the day were much larger than what was built on stage. So, one of the first things I said was, let's not put the gray wall in our stage build. Our practical seats now butt right up against the digital seats. The result is that the CGI arenas of season two match the seating look of the real NBA arenas much better.
The other thing was that the season one stadiums tended to be lit brighter and flatter with less contrast. When you look at pictures from the Forum in 1980, which are exposed for the court, the seats/crowd/stadium falls off into darkness pretty quickly. In season one, the arena in most shots is brighter than it should have been, in my opinion. Fortunately all the creative parties in production agreed to make these changes for season two right from the beginning. This affected the physical stage build as well as lighting design and placement for better realism. My goal was to make each shot look like the real NBA arena from the 1980s.
There was actually a moment in the playoffs sequence where we had to stitch from a darker interior to a brighter arena. The Forum is the darker arena and the people in the arena are lit darker than the Boston Garden. There’s actually a little bit of story behind that. Apparently, [LA Lakers owner] Jerry Buss and the Lakers in 79-80 were the first team to ever do that. He darkened the arena and where the fans sat and made it all about the court. So the court is lit brightly and the fans are not. Other arenas started doing it in that season as well, but Boston did not. So we creatively made a distinction between the look of the CGI Forum and the CGI Boston Garden by playing the Boston Garden arena brighter.
Also, the Boston Garden could be very hot inside. Apparently they didn't have air conditioning. In the playoff series, there's a thermometer shot showing the interior temperature at 94 degrees. That shot was manufactured entirely in visual effects. It was done to illustrate the heat and humidity inside. There was haze inside the actual arena, so all VFX comps of the Boston Garden had various levels of haze added.
What it meant for some of the transitions was that we had to go from a well-lit arena to a dark one. Initially, the transitions just looked odd. Even though the camera move was perfect and everything was smooth, it just didn't look right. So I started having to introduce the upper portion of the Forum to replace the upper portion of the Boston Garden sooner, so that the shot could take longer to transition from light to dark within the same number of frames. This made the density transitions look better.
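As a rough illustration of that timing idea (a minimal sketch with made-up values, not the actual comp script), the function below starts an eased blend from the brighter arena to the darker one earlier in the shot, so the brightness change gets more frames to resolve within the same shot length:

```python
# Hedged sketch: within the same number of frames, start the arena blend earlier
# and ease it, so the light-to-dark density change has more screen time.
# All values are illustrative, not taken from the actual composite.
def arena_blend_weight(frame, total_frames, start_fraction=0.25):
    """0.0 = fully the brighter Boston Garden, 1.0 = fully the darker Forum.
    Starting the ramp at start_fraction of the shot (rather than much later)
    stretches the brightness transition over more frames."""
    start = start_fraction * total_frames
    if frame <= start:
        return 0.0
    t = min((frame - start) / (total_frames - start), 1.0)
    return t * t * (3 - 2 * t)   # smoothstep easing

if __name__ == "__main__":
    for f in range(0, 49, 8):
        print(f, round(arena_blend_weight(f, 48), 2))
```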
b&a: How was the scoreboard handled in scenes?
Raymond McIntyre Jr.: On the stage we had a green 12-foot box that could be lowered down on chains to represent the placement of the scoreboards. The scoreboard was much higher in the Boston Garden and a bit lower in the Forum, so this allowed us to position it creatively in our shots on stage. For all our darker shots, we put lit tracking markers on the box, which helped VFX and also allowed the camera operators to see the box and frame for it creatively.
b&a: How did your vendors tackle crowd replication or generation?
Raymond McIntyre Jr.: Well, in season one they used a lot of sprites. What tends to happen there is that, if you pay attention, you see the same people in multiple shots at different games. You also have very limited actions. So, what I requested from everybody was that if they used sprites, they only used them for the first few rows behind the practical people. After that, I wanted them to use digital people.
Pixel Magic, for example, used Reallusion's Character Creator. It produces a digital human that looks plenty good at this distance from camera. It also comes with built-in mocap actions as well as rigging. You spend the time to texture it. You can scan your own heads and apply them. So, if you have a bunch of photogrammetry-acquired people, you can apply them to a Character Creator human, which is very useful.
When you use sprites, you are stuck with the lighting the sprite character was shot in. As soon as you have to change the lighting or put a directional light on a sprite, you can't do that very well in comp. You can darken them overall. You can try to pull down the specular highlights and things like that independently of the overall, but you can only do so much to them.
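As a loose illustration of how limited that is (a generic sketch, not any vendor's actual setup; names and values are assumptions), a sprite "relight" in comp is roughly an overall gain plus a highlight roll-off, with no way to add a new directional key:

```python
# Hedged sketch of the limited relighting possible on a pre-lit sprite in comp:
# compress the baked-in highlights, then darken overall. Parameter names and
# values are illustrative assumptions, not production settings.
import numpy as np

def relight_sprite(rgb, gain=0.7, highlight_knee=0.8, highlight_scale=0.5):
    """rgb: float array in [0, 1]. Pull down values above the knee to tame
    baked-in specular highlights, then apply an overall darken. There is no
    way here to add a new directional light; that needs a true digital human."""
    over = np.clip(rgb - highlight_knee, 0.0, None)       # portion above the knee
    compressed = rgb - over * (1.0 - highlight_scale)     # roll off the highlights
    return np.clip(compressed * gain, 0.0, 1.0)           # overall darken

if __name__ == "__main__":
    pixel = np.array([0.95, 0.9, 0.85])                   # bright, pre-lit sprite pixel
    print(relight_sprite(pixel))
```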
With a digital human, we can match every lighting situation. I wanted the vendors to use a digital crowd for the majority of the arena work. Now, with that said, when you’re on a 120mm or 150mm lens and you’re pretty tight with only three to four rows of real people, then you see these people very clearly. That’s a stretch for a digital human simply because it costs considerably more to make a digital human look good, that close.
So, it didn't make sense to say you could only use digital humans. It was just that I did not want the vendors to only use sprites to fill an arena, because in wider shots that are sharp, with no motion blur, you end up seeing the repeated people and the repeated actions and picking them out.
b&a: While I was watching the show I noticed there were plenty of camera flashes, perhaps intended to be from the crowd or from press. Was there an attempt to incorporate that into live-action photography or was it all a post effect?
Raymond McIntyre Jr.: Yes, that was built into the live-action by our DP Todd Banhazl. The intent was that it's somebody in the crowd doing flash photography. I would have to say, if we did it again we would tone it down a touch, because it ended up looking brighter than a single person's camera flash would. The lighting crew had the flash mechanisms all around the stage. Every practical flash was then matched during the CGI lighting of the arena so that the added CGI crowd flashed as well.
b&a: For that playoffs montage, I’m really curious about what your brief in visual effects was for that. What were the earliest conversations you had about how it would be brought together?
Raymond McIntyre Jr.: The director of that episode was Salli Richardson-Whitfield and the DP was Todd Banhazl. That pair had actually done the first episode and the last two episodes. The reason I bring that up is they were the perfect pair to do this with because they were very organized. They made shotlists and we would talk through things. You don't necessarily get that with every director and DP combo, where either they're not in tune or it never filters down to visual effects. So, I'm very, very glad that it was this pair. They were fantastic to work with.
Todd was very articulate in saying, 'This is what we would like. How do we achieve it?' Because, as a visual effects person, that's what I rely on. I rely on somebody saying, 'Okay. This is what I want, and let's talk about how to get there.' If the creative team does not articulate or communicate that early on, then VFX is standing there on the day, only then starting to realize what the creatives want.
In the script it just reads as 'the Lakers and the Boston Celtics are each progressing in the playoffs'. From reading the script alone, I did not get that we were showing it as one continuous scene until talking to Todd and Salli. They explained to me that they wanted to follow a pass or follow the action from one arena and game to the other arena and game in a seamless pass or transition. There are 10 of those transitions in that one shot: 10 stitches of passes and teams.
b&a: It’s quite subtle in the end. I enjoyed watching it a few times to work out who was passing to who.
Raymond McIntyre Jr.: Thanks, yes, the stitches could end up being fairly complicated. For one of them, you're in the Boston Garden, you've already transitioned from the Lakers to Boston, and then the Boston Celtics' Larry Bird shoots and scores, and as the ball is coming down, the Celtics are now playing a different team inside the Boston Garden. It's a subtle moment where most people wouldn't realize there's a stitch there.
For the stitches, I would have my VFX team standing at the side, hiding as best they could, to record all the information. Remember, every shot in that sequence is a guy on rollerblades skating around with a camera. The camera might see 360 degrees in every shot, or at least 270. So, finding a place where someone could stand was tricky. But what I wanted was for everybody to understand, 'Okay, this is the stitch for this piece on the A side, and two weeks later we're going to come back and shoot the Lakers side, or the B side.' That two-week break was because they needed to swap the floor out depending on which arena it was meant to be.
In terms of recording the information, I needed to know exactly where the rollerblade camera operator was on the court and where he had been pointing the camera during the A side shoot so that when we came back to shoot the B side weeks later, we knew how to line it up again. We’d say, ‘Okay, you have to be here pointing this direction at this time for the stitch to the B side.’ We could put a mark on the court and show him, and then the rollerblade operator could practice. We’d watch it and say, ‘Yep, that should work.’
We actually laid out on paper each of the stitch points and then showed the operator. Just to reiterate, this operator was on skates and moving constantly, so he's not paying attention to where he is on the court; he's capturing the basketball play as it happens. He's looking at his camera monitor. For all the B side stitches, the operator practiced the matching move a bunch of times to get it right, just so it became ingrained in his action prior to the actual shoot.
We tried to do something different and unique for each transition. One time it’s a bounce pass, another is a pass across, next it might be a long high wide pass following the ball, and yet another time it’s after a basket has been made. Each one of the actions that generates the next stitch is different.
b&a: There’s obviously a huge editorial part of this. How did the editor tackle this in terms of temporarily lining things up?
Raymond McIntyre Jr.: VFX would note the takes that worked best while shooting, and I would go over to the script supervisor and say, 'Okay, the first three takes are no good. Visual effects can't use them. Take four is good and take five is good.' But we also had to pick one before shooting the matching B side. From the A side of the stitch, we (director, DP, VFX) had to pick the take we liked, because in each A side take the camera wasn't in the same position or pointing in the same direction, due to the movement mentioned above. After collaborating with the creatives, we might pick take five from the A side and lock that in. Then VFX would set up and place the roller cam operator in the appropriate place for the B side stitch and rehearse as needed until the camera movement would work. Once we were in editorial, our hero takes were already defined.
b&a: What did you find were the challenges of making those stitches as seamless as possible, apart from in the shooting of them?
Raymond McIntyre Jr.: Once the shooting was done, I approached Buf specifically for this playoffs sequence. I like how they think creatively about something before they just jump into it, and that was definitely the main thing here. Buf was an excellent partner for this.
One of the biggest problems we had, and this was really for all of the basketball scenes for all the vendors, was that whenever the camera tilts up, there are all these lights in the stage ceiling. They became a real problem because they're not the correct distance away from the camera; the stage ceiling is much closer than the arena ceiling would be. You can't leave them there, because then they would float incorrectly in z-space once the camera is looking up at the digitally added ceiling.
So, one of the biggest things we decided to do was to lose every practical light unless it causes a flare or interaction with a foreground player. Here, I don't mean a camera flare; I mean where it might make the side of a player's face really bright as the player passes through it. If we got rid of the light that was making the interaction, then the side of the person's face would not look good, since there'd be no light to justify why it got bright.
As soon as we realized this, we would put matching CGI lights in our geometry, which is easier said than done due to parallax shifting: the CGI ceiling light would not move like the stage ceiling light did. So, we had to do a lot of cheating. If a light starts on the left side of a player's head and, as the camera moves, goes behind the head and comes out the right side, well, the one that comes out the right side is a different CGI light than the one that started on the left, in order to help with the parallax and distance issues from camera. We'd have light #1 positioned for a few frames, then it gets blocked by the head, and then we would add a second light in a different position for the remaining frames to get the tracking to look correct.
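To make that cheat concrete (a hypothetical sketch with made-up frame ranges, light names and positions, not the actual lighting setup), the schedule below swaps which CGI light is active while the practical light is hidden behind the player's head:

```python
# Hedged illustration of the two-light cheat described above: light A covers the
# frames before the head occludes the practical light, and light B (in a different
# position) covers the frames after it re-emerges. All values are invented.
LIGHT_SCHEDULE = [
    # (first_frame, last_frame, light_name, (x, y, z) position)
    (1001, 1012, "ceiling_light_A", (-3.0, 28.0, 6.0)),   # visible left of the head
    (1013, 1019, None, None),                              # occluded by the head
    (1020, 1034, "ceiling_light_B", (2.5, 28.0, 5.0)),     # re-emerges on the right
]

def active_light(frame):
    """Return (light_name, position) for a frame, or (None, None) while occluded."""
    for first, last, name, pos in LIGHT_SCHEDULE:
        if first <= frame <= last:
            return name, pos
    return None, None

if __name__ == "__main__":
    for f in (1005, 1015, 1025):
        print(f, active_light(f))
```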
That was one challenge. Another was playing temporally with the footage to match stitches as seamlessly as possible. I made the decision that we would stretch out a stitch if we needed to, or shorten one, to make them work better. For example, for a stitch that crosses right to left mid-pass from one game and arena on the right to the other game and arena on the left, we needed to get out of the A side sooner than the shot initially did, in order to introduce the B side without a visible lighting pop during the transition. To do this we rotoscoped all the players on the court in both plates, which gave us the ability to introduce a virtual camera move adjustment along with a temporal timing change, making for a seamless camera transition as well as a seamless lighting transition.
It was a very fun sequence to design, shoot and achieve. And thanks to Buf, it’s a beautiful sequence.
VFX Team for Winning Time s2:
Raymond McIntyre Jr.: VFX Supervisor
Victor DiMichina: VFX Producer
Reno Warmath: VFX Coordinator
Ray McIntyre III: VFX Data Wrangler / Coordinator
Rachel Chang: VFX Editor
Tanya Phipps: Assistant VFX Editor
Cherise Pascual: VFX PA / Junior Coordinator