
All this week we’ve been running advice from freelance 3D artists about, well, freelancing. But how do you get to become a freelancer? There are many paths; some artists are self-taught, others might receive some kind of formal training.
Schools, colleges, and other institutions are very varied, but I thought it might be interesting to look at one school – Animationsinstitut at Filmakademie Baden-Württemberg – and a project students from the school produced.
It’s called Tiefenrausch, and it combined 3D, animation, and VR (the final results were both a VR/eye-tracking experience and a fully CG trailer).
Here’s my interview with the team behind Tiefenrausch, which tells the story of a diver who finds something more than expected in the deep.
b&a: What were some of the intentions and ideas behind the VR experience and trailer?
Monja Dietrick (director, VR experience): The team started with a common fascination for underwater mystery. We gathered around this setting first and developed our character and the emotional journey afterwards. Researching oceanic creatures was a lot of fun, and our excitement for the ambivalence of danger and beauty down there had a big impact on the trailer and game. The setting not only made for an interesting atmosphere and tonality, it also gave us a lot of freedom in the execution.
With the trailer and the VR experience we wanted to transport the viewer into a world that is equally exciting and terrifying. The deep dark waters are a great playground for exploring this dynamic. An estimated 95 percent of the world’s oceans are unexplored territory filled with otherworldly wonder and life-threatening horror. We are looking to spark curiosity and provide our viewers and players with a variety of discoveries, along with the consequences that follow from them, beyond their control.
In addition to that, the VR experience seeks to test how eye tracking technology can be employed as an attentive and invisible user interface allowing people to use reflexive and emotional behavior as a game controller.
b&a: For the VR experience, how did you implement eye tracking into the project?
Vincent Suttner (game designer, VR experience): Fear has always been very important for survival, since ancient times. But it has also come to establish unnecessary barriers for people, especially in modern life. We wanted to reward the player for actively overcoming fear by creating scenarios that are alternately comforting and threatening.
With eye tracking as the core mechanic, and therefore the eyes as a controller, we had to work very cinematically. We took a lot of inspiration from horror films, because comfort is much more rewarding after enduring threatening scenarios. Also, to make the experience as immersive as possible, we wanted the player to learn the mechanic intuitively while playing. We used a very simple structure of different stages that guide the player deeper into the ocean.
In the first stage, the world and the eye tracking controller are introduced in a safe environment. There is a school of fish that follow your gaze, but stay at a respectful distance. If you look down into the deeper waters, you involuntarily sink into the darkness. This takes away the sense of control that you had over the fish and creates nervousness or even fear.
In the next stage you get rewarded for daring to look down. The player learns how to use their gaze to light up jellyfish and create a harmonic soundscape, brightening up the darkness. This creates the illusion of being back in control. However, after you light up six jellyfish, the creepy viperfish spawns and starts circling you, getting closer.
Once you notice the viperfish, the jellyfish disappear. In this stage the same principles you learned in the previous stages hit you with a twist to challenge you. When you look at the viperfish, it gets really close to your face and rushes past you, unlike the school of fish or the jellyfish. You eventually discover its “weak spot” to make it disappear again.
Finally you reach the last stage. You touch the ocean floor and create a trail of light, looking at the plants around you. You are in control again. This stage works as a reward for overcoming your fear.
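To make the eye-as-controller idea a little more concrete, here is a rough, engine-agnostic sketch of the kind of gaze logic that stage structure implies: an object triggers an event once the player’s gaze has rested on it for long enough. The names and thresholds (GazeTarget, dwell_seconds, on_gaze_trigger) are hypothetical, not taken from the team’s actual implementation.

```python
# A minimal sketch of gaze-as-controller logic; all names and values are
# illustrative assumptions, not the project's own code.
import math
from dataclasses import dataclass

@dataclass
class GazeTarget:
    position: tuple          # world-space position of the creature or "weak point"
    cone_half_angle: float   # how precisely the player must look at it (radians)
    dwell_seconds: float     # how long the gaze must rest on it before triggering
    _dwell: float = 0.0

    def update(self, eye_origin, gaze_dir, dt, on_gaze_trigger):
        """gaze_dir is the normalised gaze direction reported by the eye tracker."""
        # Direction from the eye to the target, normalised.
        to_target = [t - e for t, e in zip(self.position, eye_origin)]
        length = math.sqrt(sum(c * c for c in to_target)) or 1e-6
        to_target = [c / length for c in to_target]

        # Angle between where the player is looking and the target.
        cos_angle = sum(a * b for a, b in zip(gaze_dir, to_target))
        looking_at_it = cos_angle > math.cos(self.cone_half_angle)

        # Accumulate dwell time while the gaze stays on the target; reset it
        # as soon as the player looks away.
        self._dwell = self._dwell + dt if looking_at_it else 0.0
        if self._dwell >= self.dwell_seconds:
            self._dwell = 0.0
            on_gaze_trigger(self)   # e.g. light up a jellyfish or repel the viperfish
```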
b&a: How did you approach building assets and animating them for the underwater environments of the VR experience?
Enzio Probst (technical director, trailer and VR experience): As the project was driven by a certain fascination with underwater worlds and creatures in general, we did a lot of research first. We looked at lots of documentary images and videos, first just selecting what we liked, and later analyzing why we were fascinated by it.
After collecting images in mood boards, we started to draw concepts for how our characters should look and (for the experience) defined what their “weak points” were. “Weak points” would be where the player’s gaze can affect the animal or plant and trigger a certain event.
Texturing was done in Substance Painter, which gave us a nice preview of how the assets would look in the game engine. As for the waving motion, there are two different kinds: the first originates from the muscle movement of the fish, and the second from water turbulence, mainly seen on thinner fins and seagrass.
For the animation system of the long, snake-like viperfish, we actually built three totally different animation systems until we reached the point where we were happy with the results. We built on the first prototype, where the viperfish was just a block moving around.
We tried to match pre-animated cycles and poses to the movement of the fish, so that when taking a corner, the body would bend accordingly in the right direction.
But it still felt like a big piece of styrofoam quickly moving through thin air, with no inertia or resistance.
So in the second approach I used root motion, a system where only the motion contained in the animation itself is used to move the creature as a whole. This worked a little better, but it made the turning radius of the fish very large, so it was unable to turn quickly when reacting to the player.
The last approach, which we ended up taking, was to procedurally make the body of the fish follow the head with a delay. Kind of like a rope being pulled around. On top of that we additively layered animation cycles with only small waving motion and fin flaps.
This enabled us to let the fish move around freely without its tail making unbelievably fast turns, and it gave us the responsiveness we needed for direct reactions and an interactive impression.
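As a rough illustration of the “rope being pulled around” approach Enzio describes, here is a minimal sketch of a body chain that follows the head with a delay, with a small wave layered additively on top. The segment length, smoothing speed and wave constants are invented for the example, not the project’s own numbers.

```python
# A sketch of a procedural follow chain: each joint trails the one ahead of it,
# so the body lags behind the head like a pulled rope. Values are illustrative.
import math

def follow_chain(head_pos, joints, segment_length, follow_speed, dt, time):
    """joints: list of (x, y, z) positions from just behind the head to the tail."""
    target = head_pos
    updated = []
    for i, joint in enumerate(joints):
        # Pull each joint towards a point one segment length behind its leader.
        dx, dy, dz = (joint[0] - target[0], joint[1] - target[1], joint[2] - target[2])
        dist = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-6
        desired = (target[0] + dx / dist * segment_length,
                   target[1] + dy / dist * segment_length,
                   target[2] + dz / dist * segment_length)

        # Lerp towards the desired position so the tail lags behind the head.
        t = min(1.0, follow_speed * dt)
        joint = tuple(j + (d - j) * t for j, d in zip(joint, desired))

        # Additively layer a small wave travelling down the body, standing in
        # for the looping swim and fin-flap animation cycles.
        wave = 0.05 * math.sin(time * 4.0 - i * 0.6)
        joint = (joint[0] + wave, joint[1], joint[2])

        updated.append(joint)
        target = joint
    return updated
```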
In addition to muscular or propelling motion, we added small waves through a vertex shader that deforms the mesh only on the thin parts that were prone to being moved around by water turbulence.
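A turbulence deformation of this kind can be sketched, in plain Python rather than shader code, as a displacement along the vertex normal scaled by a per-vertex weight that is high on thin fins and seagrass and near zero on rigid areas. The mask and constants below are assumptions for illustration only.

```python
# A sketch of the per-vertex displacement a turbulence vertex shader might apply.
import math

def wave_displace(position, normal, turbulence_weight, time,
                  amplitude=0.02, frequency=3.0, speed=2.0):
    # Travelling sine along the vertex normal; the phase varies with position
    # so neighbouring vertices do not move in lockstep.
    phase = (position[0] + position[1] * 0.7) * frequency + time * speed
    # turbulence_weight is ~1 on thin fins and seagrass, ~0 on rigid parts,
    # so only areas prone to water turbulence are deformed.
    offset = math.sin(phase) * amplitude * turbulence_weight
    return tuple(p + n * offset for p, n in zip(position, normal))
```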
Our ideas for the environment were less specific and detailed: we simply wanted it to glow, a lot. So we mainly used Megascans assets and changed them for our needs. We made simple cutout planes of the textures and stitched them together in a star shape, then twisted them to give them more depth.

The plants all had a similar vertex shader to the viperfish, which made them look like they were being waved around by water turbulence. For the bioluminescent effects, we developed a process to create special textures with a gradient going from black, at the roots of the plants, to white, at the tips, while following the shape and twigs. This allowed us to simply create waves of light pulsing through the plants and also the viperfish.
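The root-to-tip gradient idea can be illustrated with a small sketch: the gradient value stored in the texture (0 at the roots, 1 at the tips) is compared against a moving band, so a pulse of emission appears to sweep outward through the plant. The constants are illustrative, not the project’s actual values.

```python
# A sketch of gradient-driven emission pulses; values are made up for illustration.
def bioluminescent_emission(gradient_value, time, pulse_speed=0.4,
                            band_width=0.15, base_glow=0.05):
    """gradient_value comes from the root-to-tip texture: 0 at the roots, 1 at the tips."""
    # Position of the pulse front along the 0..1 gradient, wrapping around.
    front = (time * pulse_speed) % 1.0
    # Distance of this point of the plant from the front, with wrap-around.
    distance = abs(gradient_value - front)
    distance = min(distance, 1.0 - distance)
    # 1.0 right at the front, fading to 0 over band_width, plus a faint base glow.
    pulse = max(0.0, 1.0 - distance / band_width)
    return base_glow + pulse
```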
Our Lead Animator Lukas von Berg animated three cycles with different swimming speeds. By blending between them, the creatures can speed up and slow down. We used four loops for swimming: up, down, left and right, and added fin and body movements for when the creature is turning. To make the creatures feel alive, Lukas also animated breathing and chewing loops to have them breathe in the rhythm of the fin strokes.
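Blending between a small set of swim cycles by speed might look something like the following sketch, where the current speed picks a linear mix of the two cycles that bracket it. The cycle thresholds are made up for the example.

```python
# A sketch of speed-based blend weights for three swim cycles; thresholds are assumed.
def swim_cycle_weights(speed, slow=0.5, medium=1.5, fast=3.0):
    """Return blend weights for the (slow, medium, fast) swim cycles at a given speed."""
    if speed <= slow:
        return (1.0, 0.0, 0.0)
    if speed <= medium:
        t = (speed - slow) / (medium - slow)       # 0..1 between slow and medium
        return (1.0 - t, t, 0.0)
    if speed <= fast:
        t = (speed - medium) / (fast - medium)     # 0..1 between medium and fast
        return (0.0, 1.0 - t, t)
    return (0.0, 0.0, 1.0)
```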
b&a: For the trailer for the immersive experience, what approach did you take to rendering an interesting view of the underwater world?
Marvin Sprengel (director, CG trailer): When light stops getting through the water, it not only gets darker, but nature also gets very creative. The wildlife down there is fascinating and horrifying. But for humans, the conditions in deep water are very threatening: there is no breathable oxygen, the temperature is low, and the pressure is increasingly high. The pressure in particular causes the eponymous rapture of the deep, which basically intoxicates you and equips you with additional euphoria.
With this in mind we wanted to translate the balance between curiosity and fear into the visuals. In the end, both the trailer and the VR experience are meant to feel explorative. We tried to embrace the negative space and reduce what’s visible in the frame to a minimum to help with that. We used limited light sources, a thick atmosphere, and elements like floating particles or air bubbles to disturb the view and make it feel more and more claustrophobic the deeper we get. Sound and music helped a lot to fill the vastness of the ocean outside the frame.
We struggled a lot with the bioluminescent finale, and the final shot in particular. We tested different approaches for how to effectively reveal the colorful new world to the protagonist as well as the viewer. With time we realised that showing merely the reaction of our protagonist creates an even stronger finish than explicitly showing what awes her.
This week at befores & afters is #freelance3dartist week, a look at various ins and outs of working as a freelance 3D artist.