The virtual production, character anim and FX behind Robert Zemeckis’ ‘The Witches’—including how ‘BobZ oners’ were crafted.
Visual effects supervisor Kevin Baillie has worked with Bob Zemeckis for the past 14 years; indeed, he’s been part of every single one of the director’s films over that time span. “People occasionally ask me,” relates Baillie, “does it ever get boring working with the same director for so long? It only takes me a split second to emphatically respond with ‘NO!’”
That’s partly because the more-than-a-decade partnership has seen a continuous evolution in the kind of work and the approach to the work that Baillie and Zemeckis have undertaken. For example, adopting the latest virtual production techniques has been a key development—“it’s certainly one of the things that keeps the relationship spicy!” suggests Baillie.
“In all seriousness,” the VFX supervisor adds, “every project of his is so different, meaning that each tends to call for a unique bag of tricks. And, since Bob never shies away from new technology or techniques, it keeps the door open to combining the best of what we did on previous shows with the latest and greatest in the industry.”
A case in point is The Witches, Zemeckis’ adaptation of the Roald Dahl novel that sees a young boy and his friends encounter actual witches, and get turned into mice in the process. The film took advantage of several virtual production techniques Baillie and Zemeckis had helped pioneer individually on previous films.
“The Walk was one of the first feature films to adopt on-set optically-based simulcam,” details Baillie. “For Allied, we used LED wall technology at a scale that was a first at the time. On Welcome to Marwen, we utilized Unreal Engine to pre-light/block the entire film, and collaborate with the production designer on sets from the earliest stages of prep. With The Witches, we combined all of these techniques and, since nothing ever goes perfectly on the first attempt, we were able to use lessons learned in the past to significantly improve on each stage in the process!”
“Ultimately,” adds Baillie, “each of these technologies helps to ensure that Bob has the best opportunity possible to tell the story he wants to tell – that he’s able to get as close as possible to ‘touching the final pixels’ on the screen. To paraphrase Bob, the end goal of virtual production is for it to act like a typewriter that, instead of producing a screenplay, ‘prints’ his vision into viewers’ eyeballs.”
Visualization and virtual production
Nearly every shot in The Witches was visualized, mostly prior to and during production, using a range of virtual production tools. Baillie explains how it was approached.
“In prep, we worked with production designer Gary Freeman to get a clear idea for his set designs; whether they were locked solid or still being worked out. The teams at Method Studios and NVIZ, led by Ryan Beagan and his operator Mooj Sadri, worked to build those sets and optimize them for real-time viewing in Unreal Engine.”
“NVIZ animators then set to work bringing to life our film’s hero characters—three kids-turned-mice and a bunch of nasty witches—in those sets. Each scene’s action was designed start to end, without any concept of ‘shots’ in the process yet. Bob then shot virtual cameras in those scenes using a custom camera-like mount with a Vive puck on it, which fed into a custom NVIZ VCam plugin in Unreal Engine, spearheaded by Hugh Macdonald, that did everything from take management to lens swapping to character attachments to focus pulling. Bob was able to shoot up to 200 setups in a single 4-hour session using this toolset, which then fed to editorial to create the previs cuts.”
A major benefit of this approach to the previs was, according to Baillie, getting Zemeckis to be heavily involved. “I can’t overstate how much value it brings when the director is able to oversee and ‘touch’ the previs like this, as opposed to having a team of animators do it through verbal direction. It helps to genuinely engage the creative muscles of every live-action department very early in the process, and served to inform everything from lighting to lens choices to changes to the designs of live action sets which were yet to be built.”
During the shoot, NVIZ operated an Ncam system that allowed Zemeckis and director of photography Don Burgess to see the CG mice and set extensions ‘through the lens’ of the live action camera, live-comped onto the backgrounds. “This let us complete complex shots that required multiple interlinked setups,” details Baillie, “and gave us confidence that our plates would work once they were turned over to Method’s or NVIZ’s VFX teams in post.”
“Virtual production obliterated the opaque curtain that traditionally sits between live action departments and VFX,” continues Baillie, “inviting pre and post production teams to mingle and benefit from each other’s ideas, gaining an understanding of one another in the process. That, to me, is the shining success story of The Witches.”
In post, key animators from Method Studios, NVIZ and Day For Nite worked hand-in-hand with Zemeckis, Baillie and the editorial team to complete postvis for any live-action shots. This process was pretty ‘normal’ as far as postvis goes, remarks Baillie. “While the team had far more work to complete than originally anticipated, there were relatively few surprises due to the success of our shooting team’s early planning efforts.”
Crafting shots: the transformations
Some of the signature kinds of visual effects in the film are the transformations, in which humans morph into mice. Baillie observes that the director wanted the transformations to feel like they were physically plausible, rather than ‘glowy magical’ or ‘sci-fi.’
“In this version of the classic Dahl tale, the transformation is a one-way affair—unlike the 90s film—so we were free to imbue them with a sense of permanence. The shot of our hero character is the best example of this: his skin bubbles and pops, emitting purple vapor and revealing a furry skin underneath. Method’s talented FX Animation teams, led by VFX supervisor Sean Konrad and his ‘right hand’ in Montreal, Christian Emond, walked a fine line between realistic and grotesque to create the effect, which had to comply with the film’s PG rating.”
Konrad describes the creation of this transformation effect further for befores & afters: “We created a set of blend shapes on our Hero Boy asset that was morphed into the shape of the mouse’s head, which we then groomed using the hero boy mouse as reference for the relative lengths of the fur. This was put into a custom rig so that animation could control various parts of the face—the ears pop at one rate, the nose at another, hands shrink, eyes bulge, etc. A reference plate was shot for his facial performance and we initially considered using that as a projection, but the timing and head angle of the final image became a barrier.”
For the skin, a simulation was conducted in Houdini, with primvars exported into the cache so that artists could pick this up in Katana/RenderMan. “We wanted the bubbles to look painful but not grotesque,” says Konrad, “so we made it look purple as it expanded, kind of like bubblegum. This became the justification for the purple smoke that bursts out after. We discussed adding veins or other details but ultimately we felt like this would be unappealing.”
“As the bubbles get larger the primvars not only allowed us to drive the color, but the apparent translucency, which you can feel in the subsurface. A small amount of particulate was added as the pops occurred to give the sense of some of the skin detaching.”
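As a rough illustration of the idea (not Method’s actual shader setup), a per-point inflation attribute exported from the sim can drive both the color tint and the apparent translucency at shading time. The function name and ramp values below are hypothetical:

```python
def bubble_shading(inflation):
    """Map a 0-1 bubble-inflation primvar to an RGB tint and a translucency value.

    inflation: per-point attribute from the sim cache; 0 = resting skin,
    1 = fully stretched bubble about to pop.
    (Hypothetical sketch of the primvar-driven lookup, not production code.)
    """
    # Blend from a neutral multiplier toward a purple, bubblegum-like tint.
    purple = (0.5, 0.25, 0.75)
    skin = (1.0, 1.0, 1.0)
    rgb = tuple(s + (p - s) * inflation for s, p in zip(skin, purple))
    # Stretched skin thins out, so translucency rises with inflation.
    translucency = inflation ** 2
    return rgb, translucency
```

The quadratic falloff keeps the translucency subtle until a bubble is close to popping, which matches the described look of the effect only appearing as bubbles grow large.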
Meanwhile, the fur was groomed in XGen. A fur length multiplier with a noise breakup was used to make the hair ‘grow’ from the root. A mask of where the skin had burst was also used to drive the length. “The fur guides were then sim’d in nHair against the popping geometry to drive collisions,” explains Konrad. “The guide length discrepancy meant blending a few different sims together. There were some issues with bits of fur intersecting the skin geometry and the cloth, but once motion blur was applied, we didn’t see a ton of issues, and a quick paint pass dealt with the minor ones that were left.”
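The growth-from-the-root logic Konrad describes can be sketched as a simple length multiplier per guide hair; this is a hypothetical illustration of the approach, not the production XGen expression:

```python
def fur_length(t, base_len, noise, burst_mask):
    """Per-guide fur length at normalized time t in [0, 1].

    base_len:   groomed final length of this guide hair
    noise:      per-guide random value in [0, 1] that staggers growth
                (the 'noise breakup' mentioned in the article)
    burst_mask: 0-1 mask of where the skin has burst open; fur only
                grows where the skin has already popped
    (Hypothetical sketch; coefficients are illustrative.)
    """
    # Offset each guide's growth start by its noise value, then clamp to [0, 1].
    grow = min(max(t * 1.5 - noise * 0.5, 0.0), 1.0)
    return base_len * grow * burst_mask
```

Because each guide reaches full length at a slightly different time, the fur appears to sprout unevenly rather than extruding in lockstep.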
For shots of Hero Boy’s eyes going black, Method initially discussed carrying out a simulation, but ultimately a compositing artist mocked up a temp solution for a trailer version. “Everyone really liked it, and leaving it in 2D meant we were able to quickly adjust the timing up to the end,” discusses Konrad.
“Throughout the shot we see a smoke trail coming from Hero Boy, too,” notes the VFX supervisor. “The reference on this had been the way that, when you see cowboys or ranch hands in a field after they’ve been rolling around in the dirt, there’s almost an aura of turbulent smoke coming off of them. A funny idea from Kevin Baillie was that this was meant to look like what a fart would look like if you could see the smell, which fit with the initial pyro event that launches Hero Boy coming from his butt.”
“This shot,” says Konrad, “also happens to take place as a variable speed retime—Bob wanted a ‘Fast and Furious’ moment. While animators were working on the shot, they animated at 96 fps, then handed it to production editorial and internal compositors to play with different speed options. The variable speed, combined with a changing point count in the face topology and Houdini/nHair/nCloth simulation on various pieces of the rest of the digi-double, meant there were some motion blur problems. Most were resolved with some creative hacking, and then a non-motion-blurred version of the render with some Nuke Kronos applied helped resolve any issues that remained.”
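The arithmetic behind this workflow: rendering at 96 fps for a 24 fps delivery banks 4x as many frames as needed, so editorial can ramp the speed anywhere down to 0.25x without interpolating new frames. A minimal sketch of that frame remapping (an assumed illustration, not the production retime tool):

```python
def source_frames(speeds, capture_fps=96, delivery_fps=24):
    """Integrate a per-output-frame speed curve into source frame positions.

    speeds: playback speed for each output frame (1.0 = real time,
            0.25 = the slowest ramp possible without frame blending
            when capturing at 4x the delivery rate).
    Returns the source frame position each output frame should show.
    (Hypothetical sketch of variable-speed retime bookkeeping.)
    """
    step = capture_fps / delivery_fps  # source frames per output frame at 1x
    pos, out = 0.0, []
    for s in speeds:
        out.append(pos)
        pos += s * step
    return out
```

Non-integer source positions are where a tool like Nuke’s Kronos has to synthesize in-between frames; keeping speeds at multiples of 0.25x lands on whole rendered frames instead.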
Shooting mouse scenes
To film sequences that would ultimately feature CG mice, the filmmakers orchestrated a number of methods to help with actor interaction and to sell scale.
“While we relied on simulcam for shooting some of the more complex mouse-driven shots, we used stuffies of the mice and stand-in ‘poles’ to frame and shoot actor-driven moments,” explains Baillie. “A physical proxy is always helpful to ensure a consistent and believable eyeline, since it’s remarkably difficult for most people to lock in on, and focus their eyes towards, an object that only exists in an imaginary space 3ft away from your face!”
“We also made a ‘bean bag’ stuffy of the film’s sentient cat, Hades, for the Grand High Witch played by Anne Hathaway to pet and hold,” says Baillie. “This stuffy, which helped to sell the physical contact between Hathaway and her virtual cat, was later painted out and replaced by Glenn Melenhorst’s team at Method Studios Melbourne.”
Stuffies worked well for framing and interaction, notes Baillie, but not necessarily lighting reference since the synthetic fur “never really looks quite right on the camera.” For Zemeckis and Baillie’s next film, the visual effects supervisor hints that they are “forgoing stuffies in favor of something much better, which I think y’all will get a kick out of. See ya’ in mid-2022 to talk about that one – haha!”
An additional aspect of shooting the mice scenes involved recording kid actor voice performances. This was handled with real-time facial motion capture using Cubic Motion’s head-mounted cameras and their real-time solving software. Says Baillie: “This data was used to produce basic animation on our previs mice and, in some shots, to give our VFX animators timing guidance for the mouse performances. A lot of the inspiration for the final mouse performances came from our wonderful actors Jahzir Bruno, Chris Rock, Codie-Lei Eastick and Kristin Chenoweth, as well as the talented animation teams at Method Studios in San Francisco, Montreal and Vancouver.”
Making CG mice
The mice characters had to perform like real mice would, albeit with some very specific behaviors. This meant they were always considered as CG creations, and Method Studios drew upon initial concepts in making them. However, a real mouse was used on set for the character Daisy, which meant the CG mice had to be grounded slightly more in reality to match.
“We used that real mouse to give us the proportions of key features like the legs, arms and hands,” outlines Konrad. “The anatomical limitations were incorporated for their arms and legs, slightly limiting their mobility and forcing them to act more broadly with their whole upper bodies rather than just their arms when handling objects. We also incorporated some of the real mouse’s twitchiness, especially early on for Daisy.”
An important aspect was to still identify the real actors through their faces, even as mice. For example, notes Konrad, “Hero Boy’s eyebrows and lips became features we tried to emphasize, as well as Bruno’s forehead, freckles, and larger body. Initially we kept the facial movements a little bit more limited like the real actor expressions, but found that especially in wides their expressions were hard to read, so we overcranked the base shapes so that animators could push them to extreme levels.”
“This ended up being used in closeups as well in the end,” adds Konrad. “We found that the characters were funnier when the expressions were really cranked. The real actors were also used as reference for their body performances: Bruno is quite stiff, while Hero Boy walks with extremely long strides. But we also wanted them to become comfortable with their mice bodies quickly, so those distinctions are mostly visible early on when they’re bipedal, and unique gaits were used for each of them, drawing from the unique anatomical proportions of each mouse (a higher center of gravity on Bruno from his weight, wider hips on Daisy).”
One complication proved to be the heavy black eyes of the mice, which sometimes made their eyelines hard to read. To solve this, Konrad says Method Studios artists tried to rely on classic stage actor approaches and “use their noses to really sell where a character was looking. When Hero Boy talks to Grandma, her face is so large compared to his that it would occupy a large field of vision, so we actually had him scan between her eyes.”
‘Gags’ for the mice in most scenes were an element pushed by Zemeckis, says Konrad. “These weren’t always humorous moments, although wherever possible we amplified the humor. It was sometimes just a small physical movement that communicated the mice’s mental states in each scene and shot. Bruno clutches his tail as a security blanket in a moment where Grandma tries to make an antidote, while Daisy rubs her hands and Hero Boy bounces excitedly. Bruno gets stuck in things because of his size, Daisy cleans herself, Hero Boy spins while explaining one of his plans, etc. All these things created moments of characterization outside of dialogue.”
An assortment of animals
In addition to the mice, the visual effects crew delivered CG cats, snakes, chickens and other creatures for the film. “This makes me think about the unsung heroes of many character-driven VFX films: the animation supervisors and their teams,” attests Baillie. “Even with a brilliant actor performance to build off of, the translation from actor to character is a subjective and artistic endeavor—one that rarely gets the credit it deserves.”
“For an animator to understand an actor’s intent well enough to faithfully transpose it onto a CG character, and elevate that performance using the CG character’s unique features, is a very special skill indeed,” adds Baillie. “A big shout out to Jye Skinn, Randall Rosa, Patrick Heumann, Nicholas Tripodi and Marc Chu for their leadership of these talented artists.”
Augmenting the Grand High Witch
The Grand High Witch sports an evil grin that ‘rips’ from ear to ear, with early concept art defining the look that would be achieved with the aid of visual effects. “Once we knew the look we were aiming for,” notes Baillie, “we scanned Anne Hathaway in a full FACS session with a 3D Systems photogrammetry-based rig to serve as a baseline for her inhuman abilities.”
No facial markers or HMCs were worn by Hathaway during principal photography, since, as Baillie advises, Zemeckis wanted flexibility during editorial to decide when the grin appeared and vanished. “Just Anne performing on camera like she would in a normal film! That turned out to be a great decision, since Anne’s inspired performance itself ended up being what defined when the rip happened—something we could never have predicted ahead of time.”
“After the shoot,” says Baillie, “Method’s facial animation team, led by Traci Horie at Method Studios in San Francisco, painstakingly embellished Anne’s face with the gruesome grin, featuring razor-sharp teeth and a forked tongue. A complex comp process in Nuke, which involved re-projections of the moving plate onto 3D geometry and UV-space warping, ensured that the digital prosthetic blended seamlessly with our plate photography.”
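The UV-space warp Baillie mentions works along the lines of Nuke’s STMap node: a rendered UV pass tells each output pixel where to sample the plate, so a distortion authored on the 3D geometry can be replayed in 2D. A toy nearest-neighbor version, assuming normalized UVs (an illustrative sketch, not the production comp setup):

```python
def stmap_warp(plate, uv):
    """Warp a plate through a UV map, STMap-style.

    plate: 2D list of pixel values, indexed plate[y][x]
    uv:    2D list of (u, v) pairs in [0, 1], one per output pixel;
           each output pixel pulls its value from the plate at (u, v).
    (Hypothetical sketch; real comps use filtered, subpixel sampling.)
    """
    h, w = len(plate), len(plate[0])
    out = []
    for row in uv:
        out_row = []
        for u, v in row:
            x = min(int(u * (w - 1) + 0.5), w - 1)  # nearest-neighbor sample
            y = min(int(v * (h - 1) + 0.5), h - 1)
            out_row.append(plate[y][x])
        out.append(out_row)
    return out
```

An identity UV map returns the plate unchanged; animating the UVs over the grin region is what lets the plate photography stretch with the digital prosthetic.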
How Method shared shots around the world
To complete the visual effects for The Witches, which was released in October 2020, Method Studios relied on its network of studios around the world, including in Vancouver, Montreal, Melbourne, New York, Los Angeles, Pune and San Francisco.
Some of the approaches to sharing shots between studios depended on the infrastructure set-up at each studio. “There was a small amount of asset sharing and that was achieved in the usual ways—handing off alembics, textures, etc,” says Konrad. “One shared shot where we see the cat and the Grand High Witch rat was a particular challenge. We sent blocking animation back and forth as both playblasts and alembic caches so that we could make our characters react to each other. Ultimately, the cat being the predator in this situation did a lot of the driving, and our rat mostly just reacted to her. We re-projected the cat render onto the geometry so that we could get accurate reflections; they also provided a render of the same thing so that comp could mix the two.”
Meanwhile, Konrad notes that Vancouver, New York, LA, Pune, San Francisco and Montreal all work on the same pipeline. “We’ve invested heavily in syncing systems, the backbone of which is Signiant, with our asset management tools all being multi-site aware. From a collaboration point of view, COVID presented some challenges but also some opportunities that benefitted multi-site collaboration—remote reviewing and screen-sharing tools needed a shot in the arm, so we started providing multiple options for tools that exist within the secure VPN for ad hoc artist reviews and screensharing, such as Jitsi. Linked RV reviews were also used, but their netcode needs a pretty significant rework, so they weren’t a great solution for large groups.”
So, Konrad says that for high-resolution, full-frame-rate remote review, they used Evercast. “This allowed teams in Montreal, San Francisco, LA and Vancouver—where the most collaboration was happening—to see each other’s work in real-time regardless of whether or not the image data was sitting locally in each facility. We also set up boxes for artists working remotely so that they could get into any of the individual facilities’ networks in case there was a specific technical reason that was needed (large FX caches, debugging TD scripts).”
“This meant,” continues Konrad, “that we could also shift shots around if capacity or skillsets became a challenge in any of the facilities. One example of this was the shadow puppet sequence—early in the story Hero Boy sees the outline of parts of the evil witches projected through the rain on the wall of a hotel room. The concept evolved into something that became more appropriate for a commercials pipeline approach. While we share pipelines, commercials use a lot of non-standard software, and our New York facility ended up taking on this work, which also allowed us to feed them some titles at the end of the film.”
Konrad says that as a VFX supervisor having to watch over the bulk of 600+ shots mostly out of Vancouver and Montreal (which split the work largely by sequence), he ‘virtually’ lived in a persistent Evercast room for a majority of the days, crossing Pacific and Eastern time zones.
“Myself and other supes were able to annotate on images so that artists could see areas of interest, and Shotgun provided us the ability to easily do comparisons and solve problems on the fly. Teams would have their own internal reviews per site, but also communicated frequently to make sure that setups for the mice, which were shared, were looking and behaving the same.”
Trademark Zemeckis shots
“In every Zemeckis movie,” advises Baillie, “there are trademark ‘long Zemeckis concept shots’ – surprise! Bob loves the challenge of telling in a single shot a story that might take other directors several cuts, which is evidenced by the fact that the shot count for his films almost always comes in at under 1,000. And that’s not VFX shots I’m talking about—it’s total shots in the film—whereas most films these days have well over 2,000 shots.”
For Baillie, there were a couple of shots like this in The Witches that stood out in particular, shots he calls ‘BobZ oners.’ The first of these features the cat Hades, who is sitting on a rainy ledge spying on the protagonists. “As the shot progresses,” outlines Baillie, “Hades jumps down onto the ledge of a balcony, slides down a column to the balcony below, and runs into the Grand High Witch’s suite to report his findings.”
“This shot required three individual setups on three different sets, a cat visualized through simulcam to help guide the camera action, and CG rain and environments to help bridge everything together. The final result took 393 internal takes to complete and was over 2,800 frames long—that’s 2 minutes—for a single shot! Kudos to Method Melbourne for sticking with that one.”
The second shot Baillie highlights follows the mice running covertly, Mission: Impossible-style, through the hotel’s lobby. “This shot also took three separate setups to achieve due to camera and set requirements. Planning through previs was crucial to pulling off this shot. Thanks to simulcam, those previs mice could then be seen through the lens of our live-action camera, giving us guidance in terms of the camera’s physical pacing and positioning.”
“The ‘glue’ between all of these plates,” states Baillie, “was accomplished through 3D re-projections, digital environments, comp warping/stabilizing and fine artistry. The result is really seamless, which makes me super proud—especially since Bob entrusted me with the task of directing that shot on set!”
Baillie calls out 1st assistant director Lee Grumett and his team as a crucial part of making these oners possible. “I can’t overstate how big of a difference it makes to have an AD who not only recognizes the importance of the VFX process, but thoroughly understands it and is looking out for us at every opportunity. Lee made sure that we had the chance to get every plate, LiDAR scan, reference photo, HDR, chrome ball and bit of data that we needed, and liaised between departments anytime a little bit of translation or motivation was necessary.”
Indeed, Baillie goes further to argue that “the degree to which Lee and his team helped to make the shoot productive and enjoyable just goes to show how important it is for the VFX and physical production teams to build a bridge early in the process; be it via concept art, virtual production or simply having conversations.”
“Not every team will be so lucky as to have a ‘Lee’ on their side,” says Baillie, “but, especially with advancements in virtual production, we VFX folk can increasingly bring physical departments into our virtual world. By doing so, we’ll help them to help us, and I’m confident that films will be better off because of it!”