The art and tech behind how the visual effects studio fills out stadiums and battle scenes.
Hybride Technologies was founded in 1991 and was delivering visual effects on feature films by 1995. Since then, one area the Quebec-based visual effects studio has specialized in is crowd replication.
This has come in handy for filling out stadiums with cheering fans, augmenting live action photography to add more attendees, and generating completely digital crowds for battle scenes.
Here, we take a look at three different films where Hybride’s crowd duplication and simulation came into play: Jappeloup, Unbroken and Warcraft. Key crew members from those productions break down the process.
Olympic-sized crowds on Jappeloup
For Christian Duguay’s Jappeloup, about the great showjumping horse, Hybride delivered 427 stadium crowd shots within a 4-month deadline. While VFX artists were delivering final shots, R&D engineers developed crowd-generating tools and pipelines, making sure the new features wouldn’t compromise any work that had already been completed.
Reconstituting the original Olympic events was Hybride’s main challenge. Using footage provided by the television station that covered the relevant horse events as reference, Hybride recreated and matched the environments and crowds from those competitions. The diversity of clothing was one of the first tricky problems the studio encountered. “Since show jumping is not a team sport, spectators are all dressed differently as there are no team jerseys to wear,” outlines Hybride president and head of operations Pierre Raymond. “If that had been the case, the crowds would have been much easier to recreate.”
Different technical approaches were adopted for the five stadiums represented in the film. Since the stadiums varied in size, and each hosted a different type of competition, the conceptual approach for crowd multiplications varied according to each environment.
Two main techniques were used. The first involved projecting live-action footage onto cards and then using a variety of techniques to control lighting and animation. “Actors were filmed in different sections of a stadium and the footage was used for references as well as for projections,” explains Raymond. “Originally, entire sections of the stadium were to be rotoscoped, then projected onto grids per section. We quickly realized that the camera movements were too extreme so we used a lot of parallax effects to achieve the desired results.”
“Early tests being inconclusive,” adds Raymond, “we decided to rotoscope individual agents and repopulate the stadiums with cards containing the individual rotoscoped agents. The results were immediately conclusive and we knew that was the way to go. Since it was based on live footage, the motions were very natural and the detailing, acute.”
An issue Hybride did come up against, however, was consistent lighting. A tool was ultimately designed that automatically created animated normal maps from cutout mattes to detect the edges and create smooth round edges for the agents. “It allowed us to relight and render proper rim lights and produce highlights based on the plates we were using,” says Raymond. “We created controls allowing users to paint where agents were required in the stadium and select the action type per agent—we’d filmed several assorted actions which would cover a wide range of crowd actions depending on the type of competition being watched—and we also had controls ensuring there would be no duplicates sitting side by side.”
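A rough idea of that placement logic can be sketched in Python. This is an illustration only, not Hybride’s tool: the action names, weights and seat IDs are placeholders, but it shows the core rule of drawing an action clip per painted seat from per-shot probabilities while re-rolling any pick that would duplicate its immediate neighbour.

```python
import random

# Rough sketch of the painted-placement controls described above. Action
# names and weights are placeholders, not Hybride's tool: each seat draws a
# clip from per-shot probabilities and re-rolls whenever it would sit next
# to an identical copy of itself.
ACTION_WEIGHTS = {"sit_idle": 0.5, "clap": 0.3, "stand_cheer": 0.2}

def assign_clips(seat_ids, variants_per_action=8, seed=1):
    rng = random.Random(seed)
    assignments = {}
    previous = None
    for seat in seat_ids:
        while True:
            action = rng.choices(list(ACTION_WEIGHTS),
                                 weights=list(ACTION_WEIGHTS.values()))[0]
            clip = (action, rng.randrange(variants_per_action))
            if clip != previous:   # no identical clips sitting side by side
                break
        assignments[seat] = clip
        previous = clip
    return assignments

print(assign_clips([f"rowA_seat{i}" for i in range(6)]))
```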
Since the technique was card-based, render times proved short with a maximum of 2 to 3 minutes per frame. “Ideally,” notes Raymond, “we would’ve wanted to render in real-time, but due to tight deadlines, we didn’t have the time to fully optimize this rendering technique.”
The second technique centered on generating 3D crowd agents. Here, Hybride built fully digital characters using complex instancing techniques in Softimage ICE, rendered in Solid Angle’s Arnold. “We used Ubisoft’s motion capture stage to build extensive lists of various pre-defined actions based on the expected behaviors of a crowd,” details Raymond. “Cloth and hair simulations were baked onto the animated clips using every type of hair and cloth an agent could have. Based on various per shot probability inputs—animation cycles, piece of clothing and hair style selection—we used Softimage ICE to distribute the proper mixture of instance pieces per agent, and sent the information to Arnold for rendering.”
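As a hedged illustration of that probability-driven mix, written in plain Python rather than ICE and with invented categories and weights, each agent can deterministically pick an animation cycle, clothing set and hair style from weighted lists; the resulting combination is the set of pre-baked instance pieces sent to the renderer.

```python
import random

# Plain-Python sketch (not actual ICE code) of the per-agent probability mix.
# Categories and weights are invented; the point is the deterministic,
# weighted selection of instance pieces per agent.
CYCLES   = [("sit_idle", 0.6), ("clap", 0.3), ("stand_cheer", 0.1)]
CLOTHING = [("suit_a", 0.4), ("jacket_b", 0.4), ("coat_c", 0.2)]
HAIR     = [("short", 0.5), ("long", 0.3), ("hat", 0.2)]

def weighted_pick(options, rng):
    names, weights = zip(*options)
    return rng.choices(names, weights=weights)[0]

def build_agent(agent_id, shot_seed=42):
    # Seeding per agent keeps the crowd identical between re-renders.
    rng = random.Random(shot_seed * 100003 + agent_id)
    return {
        "agent": agent_id,
        "cycle": weighted_pick(CYCLES, rng),
        "clothing": weighted_pick(CLOTHING, rng),
        "hair": weighted_pick(HAIR, rng),
    }

for agent in (build_agent(i) for i in range(5)):
    print(agent)
```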
“We originally thought the geometry’s high resolution would be too much for the renderer to handle,” continues Raymond, “so we explored using a deep image compositing approach, but Arnold proved able to manage heavy geometry so we didn’t need to go down that road. Also, since everything was based on instances, we made sure that the same geometry would be used as much as possible so the amount of memory required to render these shots was minimal. Render times where also surprising since they allowed us to render a majority of the shots in less than 30 minutes per frame. In the end, this allowed us to deliver hundreds of shots in a very tight deadline, using a limited render farm, with just over 50 render nodes.”
Artists could switch between the two techniques, cards and fully 3D crowds, via a unified pipeline. That pipeline also obeyed the same rules defined in the shared crowd parameters, including crowd placement, stadium section occupation level and animation zoning.
This allowed Hybride, says Raymond, to easily select the best technique for each situation. “In cases where lighting changes were extreme and agents far enough to not notice the 3D, we used full 3D agents, and switched to cards when scenes required that the agents be shown closer than a defined threshold and lighting management was minimal. Simplifying the pipeline meant that the bottlenecks were essentially limited to tracking and layout, and the compositing department’s final tweaks.”
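One way to picture that switching rule is the minimal sketch below; the threshold value and parameter names are illustrative assumptions, not Hybride’s actual settings.

```python
# Minimal sketch of the cards-versus-3D switching rule; the threshold is an
# illustrative number, not Hybride's actual value.
CARD_DISTANCE_THRESHOLD = 40.0  # scene units (assumption)

def pick_technique(distance_to_camera, extreme_lighting_changes=False):
    # Near agents in shots where lighting management is minimal -> cards;
    # otherwise (far agents, or big lighting shifts) -> full 3D instances.
    if distance_to_camera < CARD_DISTANCE_THRESHOLD and not extreme_lighting_changes:
        return "rotoscoped_card"
    return "full_3d_agent"

print(pick_technique(15.0))                                   # rotoscoped_card
print(pick_technique(120.0, extreme_lighting_changes=True))   # full_3d_agent
```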
“The only thing lighting artists needed to do was refer to a document prepared by the supervisors detailing the required crowd actions for a specific shot, enter the matching probability, and then tweak the lighting and various probability options until the desired results were achieved,” states Raymond. “This allowed each artist to manage entire sequences within a few hours.”
Back to the Olympics, and more, for Unbroken
In collaboration with Industrial Light & Magic, Hybride produced a total of 50 visual effects shots for Angelina Jolie’s Unbroken. The film tells the story of US Olympian and army officer Louis ‘Louie’ Zamperini’s plight in a Japanese prisoner of war camp, after he earlier survived a crash-landing at sea and weeks adrift on a raft. Hybride’s VFX shots included large-scale crowd simulations and a virtual replica of the Berlin stadium for the 1936 Olympic Games.
To populate the stands with a massive crowd for the Berlin Olympic shots, extras in period costumes were filmed live action from a number of angles. “The creation of CG crowds is a complex technique that we’ve developed over several projects,” discusses Hybride visual effects supervisor Philippe Théroux. “Since no existing software allowed the flexibility we needed back then, we created a tool that would allow us to combine all kinds of variations, therefore avoiding a ‘cloning’ effect.”
Many of the live action shots took place on the sides of the field, which meant the camera was relatively close to the CG crowds. “In fact,” notes Théroux, “the CG crowds were close enough that we were able to read the facial expression for each individual in the crowd so we chose the ‘real people on cards’ solution here.”
First, Hybride’s artists went through the sequence and decided on all the actions needed for the crowds—people standing up, sitting down, applauding, waving flags, and so on. All of these actions were then incorporated into a very specific choreography comprising 12 different actions.
A total of 48 extras were required to create the entire crowd. Hybride would film 12 extras at a time, sitting in a row, in front of a blue screen. The extras who were not being shot went through a costume change while they waited, to maximize crowd variations. “When they performed,” says Théroux, “the people on set would follow a video playback of the choreography that lasted a little over 4 minutes. Once the choreography was finished, the extras would rotate in their places, and the sequence was performed one more time to capture a different angle of the same action.”
“In the end,” says Théroux, “every group of 12 performed the routine 5 times, from 5 different angles. Then they would start over, but this time with the camera placed at a lower position looking up, so that it would look like the people were sitting in the higher sections of the stadium.”
The next step consisted of placing a grid for every seat in the stadium. Hybride’s software analyzed the position of every grid in relation to the position of the camera, and chose which angle of the choreography needed to be projected on each grid in the stadium. “This proved to be really helpful in the curved sections of the stadium, where a single shooting angle would look totally unnatural and 2D,” attests Théroux.
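A simplified version of that per-seat selection might look like the Python sketch below, assuming five passes shot at known horizontal angles; the angle values, positions and function names are illustrative, not taken from Hybride’s software.

```python
import math

# Simplified version of the per-seat pass selection: each of the five
# blue-screen passes was shot from a known horizontal angle, and every seat
# card is given the pass whose shooting angle best matches the card's bearing
# from the shot camera. Angle values and positions here are illustrative.
CAPTURED_ANGLES = [-60.0, -30.0, 0.0, 30.0, 60.0]  # degrees, one per pass

def bearing_deg(camera_pos, card_pos, card_facing_deg):
    # Horizontal angle between the direction the card faces and the camera.
    dx = camera_pos[0] - card_pos[0]
    dz = camera_pos[2] - card_pos[2]
    to_camera = math.degrees(math.atan2(dx, dz))
    return (to_camera - card_facing_deg + 180.0) % 360.0 - 180.0

def pick_pass(camera_pos, card_pos, card_facing_deg):
    angle = bearing_deg(camera_pos, card_pos, card_facing_deg)
    return min(CAPTURED_ANGLES, key=lambda a: abs(a - angle))

# A card in a curved section, angled back toward the infield:
print(pick_pass(camera_pos=(0.0, 1.5, 0.0),
                card_pos=(30.0, 8.0, 40.0),
                card_facing_deg=-100.0))   # -> the -30.0 degree pass
```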
“Using multi-angled shooting we achieved a crowd effect that looked three-dimensional,” he adds. “We could also decide on a point of interest to have our crowds looking toward a specific part of the stadium when necessary. Using sliders, we could set parameters for the crowd, designating it to be 60% male, 40% female, for example, or with 20% of the characters wearing hats, and 5% waving flags. When the crowd performed a specific action such as transitioning from sitting idly to standing up cheering, we simply needed to select the corresponding section from our choreography.”
Artists could offset the timings from the original choreography displayed on each card to create randomness in the transitions. “Our crowd system has a lot of other built-in tools,” notes Théroux, “so it’s super easy to make modifications and procedurally control the behavior of the clips projected on each card. Since the crowd is made up of cards and textures, edits and renders are lightning fast.”
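A hedged sketch of those slider-style controls could look like this; the percentages, attribute names and offset range are made up for illustration.

```python
import random

# Hedged sketch of the slider-style controls described above: per-card
# attributes follow the requested percentages, and each card gets a small
# random offset into the choreography so transitions don't fire in perfect
# sync. Percentages and the offset range are illustrative, not Hybride's.
def populate_cards(num_cards, pct_female=0.4, pct_hats=0.2, pct_flags=0.05,
                   max_offset=1.5, seed=7):
    rng = random.Random(seed)
    cards = []
    for i in range(num_cards):
        cards.append({
            "card": i,
            "sex": "female" if rng.random() < pct_female else "male",
            "hat": rng.random() < pct_hats,
            "flag": rng.random() < pct_flags,
            # De-synchronise stand-up/sit-down moments between neighbours.
            "clip_offset": round(rng.uniform(0.0, max_offset), 2),
        })
    return cards

for card in populate_cards(4):
    print(card)
```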
The end result was around 1,444 elements, which were cleaned up and introduced into Hybride’s crowd system. 3D characters were also used in places where 2D cards were not practical.
Théroux marvels at his team’s intricate work in rotoscoping live action plates that mostly had to be filmed without greenscreen, since the size of the environment and the camera movements were so vast.
“This technique allowed for greater speed and freedom in shooting the foreground action,” he notes. “The goal was to always have the effects shots fit seamlessly into the shots that made up the live action portion of the film, so tracking and layout work required the highest accuracy to perfectly blend the CG elements, and final rendering of the shots needed to be done in accordance with the rest of the film’s look and feel.”
On the Warcraft crowds warpath
Hybride’s extensive experience in crowd simulations was also deployed to create and produce gigantic crowds in Duncan Jones’ Warcraft, based on the hugely popular video game series. Crowd shots made up almost half of the 329 visual effects shots the VFX studio worked on for the film.
To tackle this workload, ILM handled the foreground characters and entrusted the characters in the background to Hybride, with the studios agreeing on some rules of thumb to determine where to hand off the action, as Hybride Senior R&D – TD Mathieu Leclaire sets out. “Whenever we needed to see chest hair, ILM did those characters and whenever characters needed super-great cloth simulation, we would let ILM do the first rows, and then drop in our crowds to fill in the background.”
To get character performances, a two-week motion capture shoot took place on Animatrik’s grey volume stage, with performers enacting a wide range of combat actions for both orcs and humans.
“We did two passes of motion capture,” says Leclaire. “First, we did the technical locomotion files, like walk, stand, jog, run, and all the transitions, like 90-degree crouch and stand-to-walk. Then we had all the actors doing vignettes—two, three or four characters fighting or doing other actions.”
A major challenge for Hybride on the Warcraft crowds was having to work with high-resolution assets at all times. There were 72 different orcs and 12 different humans, with thousands of textures and millions of polygons per character. The crowd management solution used here was built around Fabric Engine, and designed to run the performances of over 200 high-resolution orcs simultaneously in real-time.
“Crowd numbers varied between 50 and 1,000 per shot, with wide shots presenting as many as 6,600 individual combatants,” details Leclaire. “For big shots, we’d take about 900 million polygons. We’re not talking about instanced objects—the system loaded specific animation per character, which was pretty incredible.”
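To illustrate that non-instanced idea in the simplest possible terms, the sketch below assigns each agent its own clip and start frame rather than a shared cached performance; the clip names, ratio and frame range are placeholders, not the production library.

```python
import random

# Minimal sketch of the non-instanced approach: instead of stamping one cached
# performance across the crowd, every agent is assigned its own mocap clip
# (locomotion or vignette) and its own start frame. Clip names and the ratio
# are placeholders, not the production library.
LOCOMOTION = ["walk", "jog", "run", "stand_to_walk", "crouch_turn"]
VIGNETTES  = ["duel_2way_a", "brawl_3way_b", "shield_push_c"]

def cast_crowd(num_agents, vignette_ratio=0.3, seed=3):
    rng = random.Random(seed)
    agents = []
    for i in range(num_agents):
        library = VIGNETTES if rng.random() < vignette_ratio else LOCOMOTION
        agents.append({
            "agent": i,
            "clip": rng.choice(library),         # a unique clip per character...
            "start_frame": rng.randint(0, 200),  # ...with its own timing
        })
    return agents

wide_shot = cast_crowd(6600)  # wide shots ran into the thousands of combatants
print(len(wide_shot), wide_shot[0])
```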
Only simplified environment data came from ILM, since all the vegetation and rocks on the ground were too much geometry to share. Hybride then used Arnold to generate final renders of their crowd elements, which were sent back to ILM for final compositing. “We gave them deep holdouts. We also generated low-res Alembic files of our crowd geometry, which they used to do all the effect passes, like atmospherics and dust,” says Leclaire.
Hybride spent around six months revamping its pipeline to deal with the high-fidelity nature of the assets and to enable the efficient sharing of data with ILM. For Leclaire, a significant achievement of the project was that the team could swap clans, weapons and actions on very tight turnarounds. “Our crowd system could respond quickly to specific direction from Duncan Jones for smaller shots while scaling to handle thousands of orc and human knights fighting each other. Since it was all about the orcs on this show, we’re quite proud of how we efficiently handled the workload that ILM entrusted to us.”
To see more of Hybride’s work over its more than quarter-century history, check out the studio’s website.
Brought to you by Hybride:
This article is part of the befores & afters VFX Insight series. If you’d like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.