Weta Digital, 2n Design and some neat real-time tech combined for Childish Gambino’s Pharos festival.

Yesterday at SIGGRAPH 2019, Weta Digital visual effects supervisor Keith Miller and 2n Design co-founder Alejandro Crawford discussed their collaboration on the immersive music event, Childish Gambino’s Pharos festival.
It was here that an intricate stream of imagery was displayed inside a 160-foot wide inflatable dome. Achieving that took more than just projection mapping; it also involved planning in VR, rendering and streaming imagery in real-time, and making the visuals react to the music as Childish Gambino (aka Donald Glover) performed on stage.
Here’s a visual look at how the VES Award-winning work was achieved, courtesy of images from the event.
First, the visual content – produced largely via classic visual effects modeling, animation and rendering means – was ‘scored’ to the music in Unreal Engine. Here, the imagery was rendered in real-time at 60fps, then delivered to 13 laser projectors under the dome. Unreal Engine’s nDisplay utility enabled the artists to combine visuals from multiple projections into a single scene, in real-time.
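Neither Weta Digital nor Epic detailed the exact cluster layout, but the core nDisplay idea is that several frame-locked render nodes each draw their own view of the same synchronized scene. As a rough, purely illustrative Python sketch (the node names, camera angles and fields of view below are assumptions, not the actual Pharos setup):

```python
# Hypothetical split of a hemispherical dome across five frame-locked
# render nodes: one camera looks straight up at the zenith, the other
# four tile the lower band of the dome in 90-degree wedges.
# (Illustrative only -- the actual Pharos nDisplay layout wasn't published.)

def node_cameras(num_side_nodes=4):
    cameras = [{"node": "zenith", "yaw_deg": 0.0, "pitch_deg": 90.0, "fov_deg": 100.0}]
    for i in range(num_side_nodes):
        cameras.append({
            "node": f"side_{i}",
            "yaw_deg": i * 360.0 / num_side_nodes,  # spin around the dome
            "pitch_deg": 30.0,                      # tilt up toward the dome surface
            "fov_deg": 100.0,                       # generous FOV so views overlap for blending
        })
    return cameras

for cam in node_cameras():
    print(cam)
```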
Individual streams were stitched in real-time downstream, resulting in a high-resolution fisheye projection. The 5.4K by 5.4K image was rendered across five frame-locked machines (via NVIDIA Quadro P6000 cards) before being split up and delivered to the 13 projectors.
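The article doesn’t specify the lens model used, but dome masters are conventionally addressed with an equidistant fisheye mapping. Here’s a minimal Python sketch of that mapping at the 5.4K resolution mentioned above (the exact projection Weta and 2n used is an assumption):

```python
import math

# A minimal sketch of an equidistant ("domemaster") fisheye mapping, the
# standard way a hemispherical dome is addressed as a single square image.

SIZE = 5400  # pixels per side of the square fisheye frame

def direction_to_pixel(x, y, z):
    """Map a unit view direction (z = up toward the dome zenith)
    to pixel coordinates in the square fisheye image."""
    # Angle away from the zenith: 0 at the top of the dome, pi/2 at the rim.
    theta = math.acos(max(-1.0, min(1.0, z)))
    # Angle around the dome.
    phi = math.atan2(y, x)
    # Equidistant fisheye: radius in the image is proportional to theta.
    r = (theta / (math.pi / 2)) * (SIZE / 2)
    px = SIZE / 2 + r * math.cos(phi)
    py = SIZE / 2 + r * math.sin(phi)
    return px, py

# The zenith lands at the image centre, the dome rim at the edge of the circle.
print(direction_to_pixel(0, 0, 1))   # -> (2700.0, 2700.0)
print(direction_to_pixel(1, 0, 0))   # -> (5400.0, 2700.0)
```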
Importantly, the content had to flow across the underside of the dome, but also work from any vantage point, since an audience member could be standing anywhere and move around. The stage was actually ‘in-the-round’, with Glover at the center. To ensure the imagery would work in that design, a VR prototyping system was developed that allowed the 360-degree content to be mapped onto a virtual dome and previewed in VR goggles.
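The specifics of that prototyping tool weren’t shared, but the underlying geometry is straightforward: a hemispherical mesh whose UVs sample the square fisheye frame, which a VR viewer can then render from any seat position. A hedged Python sketch, assuming the equidistant mapping from above and leaving out the engine/headset plumbing:

```python
import math

# A rough sketch of the kind of geometry a VR dome previewer needs:
# a hemispherical mesh whose UVs sample the square fisheye frame.
# The 160-foot width comes from the article; mesh density is arbitrary.

RADIUS_FT = 80.0  # 160-foot-wide dome -> 80-foot radius

def dome_mesh(rings=32, segments=64, radius=RADIUS_FT):
    vertices = []  # (x, y, z) positions in feet
    uvs = []       # matching fisheye texture coordinates in [0, 1]
    for ri in range(rings + 1):
        theta = (ri / rings) * (math.pi / 2)   # 0 at zenith, pi/2 at rim
        for si in range(segments + 1):
            phi = (si / segments) * 2 * math.pi
            x = radius * math.sin(theta) * math.cos(phi)
            y = radius * math.sin(theta) * math.sin(phi)
            z = radius * math.cos(theta)
            vertices.append((x, y, z))
            # Equidistant fisheye UVs: zenith at the centre of the texture.
            r = theta / (math.pi / 2) * 0.5
            uvs.append((0.5 + r * math.cos(phi), 0.5 + r * math.sin(phi)))
    return vertices, uvs

verts, uvs = dome_mesh()
print(len(verts), "vertices")  # (32+1) * (64+1) = 2145
```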
Interaction was critical, since the visuals needed to integrate tightly with the overall performance during the live event. Audio-reactive events and parameters were designed and mapped to MIDI inputs, with everything from motion and lighting to shaders and dynamics able to be influenced by the performance. For an audience member, that translated into seamless navigation through, and transitions between, the environments in the display.
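The actual MIDI routing used on the night wasn’t published, but the general pattern is easy to sketch: listen for control-change and note messages and turn them into normalized parameters the real-time scene can read every frame. Below is a minimal Python sketch using the mido library; the port name and CC assignments are hypothetical.

```python
import mido  # third-party MIDI library, used here purely for illustration

# A minimal sketch of MIDI-driven show control. Each control-change message
# nudges a named visual parameter; note hits trigger one-off events.

PARAMETERS = {"light_intensity": 0.0, "shader_pulse": 0.0, "camera_drift": 0.0}

# Hypothetical CC-number-to-parameter assignments.
CC_MAP = {20: "light_intensity", 21: "shader_pulse", 22: "camera_drift"}

with mido.open_input("Stage Rig") as port:  # assumed port name
    for msg in port:
        if msg.type == "control_change" and msg.control in CC_MAP:
            # MIDI CC values are 0-127; normalise to 0.0-1.0.
            PARAMETERS[CC_MAP[msg.control]] = msg.value / 127.0
        elif msg.type == "note_on":
            # Note hits could trigger one-off events such as scene transitions.
            print("trigger event for note", msg.note)
```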
You can find out more, and see footage from the event, in Unreal Engine’s project spotlight video, below.
This week at befores & afters is #realtimewrapup week, with reports on real-time tech and virtual production, direct from SIGGRAPH 2019.