Get ready for Real-Time Live! at SIGGRAPH 2020

One of the coolest events at SIGGRAPH.

Each year at SIGGRAPH, one of *the* events to be at is Real-Time Live!, where presenters have 6 minutes to give a demo of their real-time project. Because it’s live, anything can happen, and often does.

While SIGGRAPH 2020 is going virtual this year, Real-Time Live! is still happening…live! It’s on Tuesday, 25 August 2020, from 4pm to 6pm PT. Here are the details of the presentations. I seriously can’t wait for this session.

AI-SYNTHESIZED AVATARS: FROM REAL-TIME DEEPFAKES TO PHOTOREAL AI VIRTUAL ASSISTANT
We introduce a real-time, deep learning-based facial synthesis technology for photoreal AI avatars and demonstrate two novel applications. We showcase the first zero-shot real-time deepfake system, allowing anyone to swap faces with another subject. Then, we demonstrate how this technology can enable an AI-based photoreal virtual assistant.

CHROMA TOOLS: AI-ENHANCED STORYTELLING FOR MOTOR RACING SPORTS
Typically, TV producers of motor-racing programs manually overlay visuals to provide on-screen context, such as a driver’s name, position, or photo. Chroma Tools, deployed on live Formula E broadcasts, automates these tasks and enables dynamic overlays that track the racers as they appear on screen in strict real time.
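
To make the idea concrete, here’s a minimal sketch of a tracked on-screen overlay in Python with OpenCV. The CSRT tracker (from opencv-contrib-python) stands in for whatever detection/tracking the actual Chroma Tools pipeline uses, and the feed filename and driver label are made up:

```python
import cv2

cap = cv2.VideoCapture("race_feed.mp4")   # hypothetical broadcast feed
ok, frame = cap.read()
box = cv2.selectROI("init", frame)        # seed box; a real system would detect this
tracker = cv2.TrackerCSRT_create()        # requires opencv-contrib-python
tracker.init(frame, box)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    ok, (x, y, w, h) = tracker.update(frame)
    if ok:
        # The overlay follows the racer as they move across the screen.
        cv2.rectangle(frame, (int(x), int(y)), (int(x + w), int(y + h)), (0, 255, 0), 2)
        cv2.putText(frame, "P1  A. DRIVER", (int(x), int(y) - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("overlay", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```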

DRAWMATICAR – AUTOMAGICAL AR CONTENT FROM WRITTEN WORDS!
DrawmaticAR is an AI-AR app capable of creating interactive 3D animated experiences from story words written in the designated section on an AR marker (Magic Paper).

INTRODUCTION TO REAL-TIME USER INTERACTION IN VIRTUAL REALITY POWERED BY BRAIN-COMPUTER INTERFACE TECHNOLOGY
How can VR/AR platforms personalize the user experience? I discuss how Looxid Link, a VR-compatible brain-sensing technology, can connect users’ minds to VR, demonstrating real-time examples that visualize, interact with, and analyze users’ mental states using EEG features and mind indexes.

INTERACTIVE STYLE TRANSFER TO LIVE VIDEO STREAMS
Our tool allows artists to create living paintings or stylize a live video stream using their own artwork with minimal effort. While an artist is painting the image, our framework learns their artistic style on the fly and transfers it to the provided live video stream in real time.
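
As a rough illustration of the inference side, here’s a minimal Python/PyTorch sketch that applies a feed-forward stylization network to a live camera feed, assuming the on-the-fly training described above has already produced the (hypothetical) traced model file `style_net.pt`:

```python
import cv2
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
# Hypothetical traced network; assumed to map a BGR image tensor in [0, 1]
# to a stylized image tensor of the same shape.
model = torch.jit.load("style_net.pt").eval().to(device)

cap = cv2.VideoCapture(0)  # live camera stream
with torch.no_grad():
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # HWC uint8 -> NCHW float tensor in [0, 1]
        x = torch.from_numpy(frame).permute(2, 0, 1).float().div(255)
        y = model(x.unsqueeze(0).to(device)).clamp(0, 1)
        out = (y.squeeze(0).permute(1, 2, 0) * 255).byte().cpu().numpy()
        cv2.imshow("stylized", out)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
cap.release()
```

In the presenters’ tool, the network’s weights would be updated concurrently as the artist paints, so the stream’s look evolves with the artwork.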

THE TECHNOLOGY BEHIND MILLENNIUM FALCON: SMUGGLERS RUN
A demonstration of the technological innovations behind the delivery of the Millennium Falcon: Smugglers Run interactive attraction. This presentation will show and explain the pieces of technology we had to write to achieve our goals of high fidelity, high resolution, and high frame rate for the experience.

LIMITLESS DYNAMIC LANDSCAPES OF 1 MM PER PIXEL DENSITY, ZOOMING INCLUDED
The new ObjectLandscapeTerrain system in UNIGINE Engine fulfills a very challenging combination of requirements: very dense detail (down to 1 mm per pixel) together with huge terrain size (up to 10,000 x 10,000 km), real-time terrain geometry modification, non-destructive team collaboration, and binoculars/scope support (up to 20x zoom).
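
Some back-of-envelope arithmetic shows why those numbers are hard to hit with stored data alone (this inference is mine, not UNIGINE’s stated design):

```python
# Why 1 mm/pixel across 10,000 x 10,000 km cannot all be unique stored data
# and must come from streamed/procedural detail.
side_mm = 10_000 * 1_000 * 1_000      # 10,000 km expressed in millimetres = 1e10
texels = side_mm ** 2                 # one texel per square millimetre of terrain
print(f"{texels:.1e} texels -> ~{texels / 1e18:.0f} exabytes at just 1 byte/texel")
```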

SKETCH-TO-ART: SYNTHESIZING STYLIZED ART IMAGES FROM HAND-DRAWN SKETCHES WITH NO SEMANTIC LABELING
Sketch-to-Art is an AI tool that allows creatives to sketch an idea and get fully rendered images, stylized the way they want, in real time. Users can define a style by choosing a reference image or a group of images, or by selecting an artist or an art movement.

VOLUMETRIC HUMAN TELEPORTATION
Existing volumetric capture systems require many cameras and lengthy post-processing. We introduce the first system that can capture a completely clothed human body (including the back) using a single RGB webcam, in real time. Our deep-learning-based approach enables new possibilities for low-cost and consumer-accessible immersive teleportation.
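
For a sense of what such a pipeline involves, here’s a hedged Python sketch of a single-frame capture-to-mesh step, assuming a pretrained pixel-aligned implicit network (the `human_occupancy_net.pt` model is hypothetical) that maps one RGB frame to a 3D occupancy volume:

```python
import cv2
import torch
from skimage.measure import marching_cubes

# Hypothetical traced model: RGB frame tensor -> dense occupancy volume.
net = torch.jit.load("human_occupancy_net.pt").eval()

cap = cv2.VideoCapture(0)  # single consumer webcam
ok, frame = cap.read()
assert ok, "no frame from webcam"
cap.release()

with torch.no_grad():
    x = torch.from_numpy(frame[..., ::-1].copy())        # BGR -> RGB
    x = x.permute(2, 0, 1).float().div(255).unsqueeze(0)  # NCHW in [0, 1]
    occ = net(x).squeeze().numpy()                        # e.g. a 256^3 grid

# Extract a triangle mesh at the 0.5 iso-surface for streaming or display;
# a real-time system would repeat this (or a faster variant) every frame.
verts, faces, _, _ = marching_cubes(occ, level=0.5)
print(f"mesh: {len(verts)} vertices, {len(faces)} faces")
```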
