Poster

NeMF: Neural Motion Fields for Kinematic Animation

Chengan He · Jun Saito · James Zachary · Holly Rushmeier · Yi Zhou

Hall J #109

Keywords: [ neural fields ] [ motion modeling ] [ implicit neural representations ]

[ Abstract ] [ Poster ] [ OpenReview ]
Thu 1 Dec 9 a.m. PST — 11 a.m. PST
 
Spotlight presentation: Lightning Talks 2A-3
Tue 6 Dec 6 p.m. PST — 6:15 p.m. PST

Abstract: We present an implicit neural representation to learn the spatio-temporal space of kinematic motions. Unlike previous work that represents motion as discrete sequential samples, we propose to express the vast motion space as a continuous function over time, hence the name Neural Motion Fields (NeMF). Specifically, we use a neural network to learn this function for miscellaneous sets of motions, which is designed to be a generative model conditioned on a temporal coordinate $t$ and a random vector $z$ for controlling the style. The model is then trained as a Variational Autoencoder (VAE) with motion encoders to sample the latent space. We train our model with a diverse human motion dataset and quadruped dataset to prove its versatility, and finally deploy it as a generic motion prior to solve task-agnostic problems and show its superiority in different motion generation and editing applications, such as motion interpolation, in-betweening, and re-navigating. More details can be found on our project page: https://cs.yale.edu/homes/che/projects/nemf/.
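The central idea — representing motion not as discrete frames but as a continuous function of a temporal coordinate $t$ conditioned on a style code $z$ — can be illustrated with a toy forward pass. Everything below (the class name, layer sizes, and the sinusoidal time encoding) is an illustrative assumption, not the paper's actual architecture; it only demonstrates why a field representation lets the same network be sampled at any frame rate:

```python
import numpy as np

def positional_encoding(t, num_freqs=4):
    # Encode scalar times t into sin/cos features. Such encodings are common
    # in neural fields; the exact encoding used by NeMF is an assumption here.
    freqs = 2.0 ** np.arange(num_freqs)
    angles = np.outer(np.asarray(t), freqs) * np.pi
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

class ToyMotionField:
    """Toy MLP f(t, z) -> pose parameters; illustrative sketch only."""

    def __init__(self, latent_dim=8, hidden=32, pose_dim=24, num_freqs=4, seed=0):
        rng = np.random.default_rng(seed)
        in_dim = 2 * num_freqs + latent_dim
        self.num_freqs = num_freqs
        self.W1 = rng.standard_normal((in_dim, hidden)) * 0.1
        self.b1 = np.zeros(hidden)
        self.W2 = rng.standard_normal((hidden, pose_dim)) * 0.1
        self.b2 = np.zeros(pose_dim)

    def __call__(self, t, z):
        # t: (T,) times in [0, 1]; z: (latent_dim,) style code shared across time.
        feats = positional_encoding(t, self.num_freqs)
        zb = np.broadcast_to(z, (feats.shape[0], z.shape[0]))
        h = np.tanh(np.concatenate([feats, zb], axis=-1) @ self.W1 + self.b1)
        return h @ self.W2 + self.b2  # (T, pose_dim) pose parameters per query time

field = ToyMotionField()
z = np.zeros(8)  # in the paper, z would be sampled from the VAE's latent space
# Because the motion is a continuous function of t, the same network can be
# queried at any temporal resolution, e.g. 30 or 120 samples per clip:
poses_30 = field(np.linspace(0.0, 1.0, 30), z)
poses_120 = field(np.linspace(0.0, 1.0, 120), z)
```

In this framing, tasks like in-betweening or re-navigating become optimizations over $z$ (and query times $t$) under a trained prior, rather than edits to a fixed frame sequence.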
