Get in there and visualize real-world filmmaking now.
There’s a lot of buzz about virtual production and real-time game engines right now. These tools and techniques are being directly used on television, commercials and films – the best tools replicate the real-world cinematography experience.
That’s what Matt Workman, the founder of Cinematography Database, has also done with Cine Tracer, a real-time cinematography simulator made in Unreal Engine and available on Steam.
Here Matt outlines exactly what Cine Tracer is, why he built it, and why the ‘game’ is already being utilized for actual production planning.
b&a: Why did you make Cine Tracer? What was it in your previous work or wish list for a game that made you jump into it?
Matt Workman: I worked as a commercial cinematographer in NYC for 10 years and by the end I was working on a lot of VFX-heavy projects. One day I decided that I wanted to be able to do my own ‘Production Previs’ and I started to work in SketchUp and Maya. Five years ago I started a company called Cinematography Database and sold a Cinema 4D previs plug-in called Cine Designer. It did really well, for a very niche market, but the high price of the DCC software kept it from going mainstream.
About a year and a half ago Epic Games approached me and asked if I would be interested in porting Cine Designer into Unreal Engine. After a few weeks of tinkering I was hooked on being able to work in real time, and I’ve spent the last year of my life dedicated to building what is now Cine Tracer.
b&a: Can you describe, from an overall point of view, how a player plays Cine Tracer? What things did you keep considering, as you’ve been developing it, to make sure it stayed fun, and like a game?
Matt Workman: My main goal with Cine Tracer is to make it simple to learn and fun to use/play. While it is technically a game made with Unreal Engine and sold on Steam, it is really an app that uses familiar ‘game mechanics.’
I’ve used almost every 3D animation program for previs and all of them have a steep learning curve. The simple task of moving a 3D camera around a viewport is not immediately intuitive. In video games you use WASD and the Mouse or a Game Pad (we support both) and most children can figure it out in seconds. So we use that for moving around.
The fun part is that I can combine the best of video games like Minecraft Creative and Fortnite Creative with the 3D applications like SketchUp, Cinema 4D, etc.
b&a: What were some of the very specific challenges of implementing different ‘systems’? For example, how do you implement a dolly with track to make it act like that, or how do you incorporate the realistic lights?
Matt Workman: The first challenge was learning how a game engine works. I had been developing plugins for Maya and Cinema 4D, but Unreal Engine is a completely different animal. Epic Games gave me some private one-on-one tutoring to get me going, and my first project was to make a camera dolly that you could ‘drive.’ I also had a 3D scan of myself and I wanted to be a playable character. Once I figured those two things out, the rest was just a matter of grinding.
The videos I posted of the pre-alpha game went viral, so I knew we had something that was at least novel.
b&a: This game is made with Unreal Engine – what things in Unreal were particularly beneficial to you in making it? Were you and are players taking advantage of particular lighting setups and abilities in Unreal to simulate cinematography?
Matt Workman: Unreal Engine has a visual scripting language called Blueprints. It’s very similar to Resolve Nodes and Cinema 4D Xpresso. That language makes programming so fast and even fun, that I was able to go from 0 knowledge of the engine to shipping a commercial early release build in less than a year.
As we are a little further along now, we are about to release our first build with RTX real-time ray tracing technology. So we have the ability to turn on ray-traced shadows, reflections, AO, translucency, and global illumination. It’s very early days, but we have a lot of excited users willing to be on the front lines.
RTX makes all of the lighting more accurate and that is what my core user is looking for. They want to be able to express lighting and framing in pre production.
b&a: To me it feels like this could of course also totally become a way of producing storyboards/previs for an actual project. Do you see it that way at all? Are any users already doing that? What would it take to make it more of a virtual filmmaking tool, if you wanted to go down that path?
Matt Workman: Cine Tracer has been embraced by the filmmaking community and it’s being used to plan feature films, television shows, and commercials around the world. I believe that for the production world, it will be the standard in a few years.
Earlier this year I was invited to Cupertino to work with Apple on the ‘new’ Mac Pro. Apple has doubled down on making hardware for professional filmmakers, and they saw Cine Tracer as an important part of that ecosystem. You can read my ‘Pro App Developer’ quote on the Mac Pro press room post.
My vision for Cine Tracer includes ‘indie virtual production’ but that term is still pretty foreign to most filmmakers. I think that introducing it as a planning game/tool helps people be less intimidated by it. I plan to incorporate Apple’s ARKit to do live facial motion capture, and now pose estimation, for an indie live mocap setup later this year. Once people see that they can ‘puppet’ 3D humans live, I think a light bulb will go off in their heads that this is possibly more than just a planning tool.
Check out more new VFX tools in our #vfxtoolsweek series.