DeepMotion | Optimize CPU Performance with Intel VTune Profiler

Using Intel® VTune™ Profiler to optimize DeepMotion's Motion Brain for smoother character animations.

Project status: Published/In Market

Game Development, Graphics and Media

Groups
GDC 2020

Intel Technologies
Intel VTune Profiler

Overview / Usage

Since the dawn of gaming and interactive software, creating realistic, believable 3D character motion has been costly and time-consuming. Industry techniques such as key-framing are repetitive, hand-crafted, non-reactive, and hard to scale. By combining physics simulation and machine learning, DeepMotion's Motion Brain transforms your digital characters from animated to alive, empowering your applications with new levels of immersion, interaction, and realistic character motion.

Achieving this feat requires massive computing power and resources to run deep reinforcement learning algorithms. Before collaborating with Intel, our Motion Brain handled a single input motion at a time, could only mimic the reference motions, and might take a week or more to train. Intel's 192-core SDP server enabled us to train our new Generative Motion Brain, which handles multiple inputs and even generates new behaviors "on the fly", all while drastically reducing our development time and required resources.

Methodology / Approach

Used Intel VTune Profiler to locate CPU bottlenecks and optimize performance.
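The page does not show any of the profiled code, but the general workflow can be sketched. Below is a minimal, hypothetical C example (not DeepMotion's actual code) of the kind of CPU hotspot that VTune Profiler's hotspots analysis surfaces; `pairwise_sum`, `pairwise_sum_fast`, and the file names are invented for illustration.

```c
/* Hypothetical illustration -- not DeepMotion's code. A naive O(n^2) loop
 * like pairwise_sum() is the kind of routine that dominates a VTune
 * "hotspots" report; pairwise_sum_fast() is the algebraic fix.
 *
 * Typical VTune command-line workflow (assuming VTune is installed):
 *   cc -g -O2 hotspot.c -o hotspot
 *   vtune -collect hotspots -result-dir r001 -- ./hotspot
 *   vtune -report summary -result-dir r001
 */

/* Naive version: accumulates every pairwise product, O(n^2) work. */
double pairwise_sum(const double *x, int n) {
    double acc = 0.0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            acc += x[i] * x[j];
    return acc;
}

/* Equivalent O(n) rewrite: the sum of x[i]*x[j] over all i, j
 * factors as (sum of x)^2, eliminating the inner loop entirely. */
double pairwise_sum_fast(const double *x, int n) {
    double s = 0.0;
    for (int i = 0; i < n; ++i)
        s += x[i];
    return s * s;
}
```

After profiling confirms where the time goes, an algorithmic rewrite like the one above is often a bigger win than micro-tuning the hot loop in place.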

Technologies Used

Intel VTune Profiler

