How VR painting and animation tool Quill helped make this watercolor-like short

Behind the scenes of Baobab’s latest film, ‘Namoo’.

The latest film in Baobab Studios' rich history of immersive and experimental projects is Erick Oh's Namoo. The name translates to 'tree' in Korean, and the film tells the story of a man's life from birth to death.

To make the film, Baobab utilized Oculus Quill's VR toolset for modeling, animation and lighting. Importantly, the film's watercolor, painterly look was achieved entirely within Quill. Here, Oh and lead Quill artist Nick Ladd tell befores & afters how Namoo was produced.

b&a: What was the main reason behind choosing Quill to help make Namoo, as opposed to other techniques Baobab could have used?

Erick Oh: Quill was the perfect tool for making Namoo for many reasons. This was my first time telling a story in a VR medium, and Quill helped me skip all the technical difficulties because it enables an artist to draw, paint and animate intuitively. That helped us achieve a hand-painted watercolor and oil painting look right away in VR space. It was very important because Namoo is a very poetic piece, talking about life and growth. Delivering warmth visually was the biggest element. Furthermore, we were able to achieve this look with a small, talented team of only six key artists: Nick Ladd, Dan Franke, Jon Brower, Eusong Lee, Javier Moya and myself. If we were going to achieve this look and style through a traditional CG pipeline, we would have needed a bigger team and budget.

b&a: Can you break down how it worked? How did artists craft characters, props and environments with Quill — what was the process?

Nick Ladd: The process starts with Eusong's 2D concept art, which we import into Quill. In Quill, we use the concept art as a reference and then draw and place colored 3D strokes to form each character, prop, and environment.



It’s a lot like creating miniatures and dioramas; there’s no complex modeling, UV unwrapping, or texturing. It’s just colored strokes and much less technical than other 3D pipelines. Once we have our first passes of the characters, Erick typically does draw-overs so we can fine-tune the positioning and shape of everything.

b&a: For animation, how did this work in VR? What kind of ‘testing’ could you do before final’ing shots? How could you also go back into scenes and make any changes?

Nick Ladd: Quill has an animation timeline, but no interpolation like you find in 3D software such as Maya. Instead, the animation exists in a middle ground between 2D animation and stop motion. Each object/character is on its own layer, and each new frame is a copy of the previous one, whose geometry you can then change to create animation. There's no rig, so you can manipulate the geometry however you want and even add or remove strokes from one frame to the next. Quill has a red/blue onion skin that helps you see your previous and upcoming frames, similar to 2D animation.

The process for animating is similar to 2D; the animator will block out key poses, then set breakdowns, and then manually do inbetweens once everything is approved. We were also working with Erick, who was providing rough 2D animations as a timing guide.

b&a: In terms of lighting, what did you need to develop or do to produce that very painterly lighting design and change things for time of day, etc.?

Nick Ladd: All of the lighting and cast shadows were hand-painted in Quill. The software renders entirely unlit vertex color, with no built-in light rendering. The lack of software lighting gives the artist total creative freedom to light or paint however they want. In our case, the Quill artists worked with Eusong Lee, who provided us with 2D painted images of the exact lighting we wanted for each scene and character. From there, we painted the scenes in Quill based on Eusong's paintings. The same objects were recolored multiple times in the film to suit the different lighting scenarios.



Painting by Eusong Lee.
Photoshop painting by Eusong Lee and Quill painting by Dan Franke.

b&a: Since it was crafted in Quill, what challenges then arose in terms of editorial, especially in making this work as a 2D narrative? (How was the interactive 'VR' version also challenging from an editorial viewpoint?)

Erick Oh: From the beginning, we planned to make both versions: 2D narrative and VR. The same story and idea, but executed differently, fully taking advantage of the strengths of each medium. For the 2D narrative, we had to come up with a whole new creative way to export Quill footage to Unity, work on the camera, and then re-export it to Premiere Pro, where we did the final editorial. That being said, the editorial itself was not too different from a traditional filmmaking pipeline, and we were able to focus on pacing and timing for the right emotional arc. On the other hand, we cannot do any of those things in VR. VR is all about experience, and the audience becomes a part of the story. You can look around or lean forward to observe details more closely. You can also experience the changing weather very realistically, almost having the illusion that you are right there under the rain or in the middle of the thunderstorm. Those are things we cannot do in the 2D narrative. So after all, the key to being able to make both versions was understanding the strengths and characteristics of each medium.

Wireframe character and final.
