Poster
Modeling Deep Temporal Dependencies with Recurrent "Grammar Cells"
Vincent Michalski · Roland Memisevic · Kishore Konda

Thu Dec 11 11:00 AM -- 03:00 PM (PST) @ Level 2, room 210D

We propose modeling time series by representing the transformations that take a frame at time t to a frame at time t+1. To this end we show how a bi-linear model of transformations, such as a gated autoencoder, can be turned into a recurrent network, by training it to predict future frames from the current one and the inferred transformation using backprop-through-time. We also show how stacking multiple layers of gating units in a recurrent pyramid makes it possible to represent the "syntax" of complicated time series, and that it can outperform standard recurrent neural networks in terms of prediction accuracy on a variety of tasks.
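To make the idea concrete, below is a minimal NumPy sketch of one prediction step of such a recurrent bi-linear (gated autoencoder) model: mapping units are inferred from the two most recent frames and then re-applied to roll the sequence forward. The variable names, dimensions, factorization (U, V, W) and activation choices are illustrative assumptions, not the authors' released code or exact formulation; in the paper the weights would be trained with backprop-through-time, and higher layers of the pyramid would model changes of these mappings over time.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_fac, n_map = 64, 32, 16          # frame size, factors, mapping units (assumed sizes)

U = rng.normal(scale=0.1, size=(n_fac, n_in))   # filters applied to frame t
V = rng.normal(scale=0.1, size=(n_fac, n_in))   # filters applied to frame t+1
W = rng.normal(scale=0.1, size=(n_map, n_fac))  # pools factor products into mapping units

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def infer_mapping(x_t, x_tp1):
    # Infer transformation ("mapping") units from a pair of consecutive frames.
    return sigmoid(W @ ((U @ x_t) * (V @ x_tp1)))

def predict_next(x_t, m):
    # Apply the inferred transformation to the current frame to predict the next one.
    return V.T @ ((U @ x_t) * (W.T @ m))

# Recurrent use: infer the transformation from the two most recent frames,
# then re-apply it to generate predictions several steps ahead.
frames = [rng.normal(size=n_in), rng.normal(size=n_in)]   # two seed frames
for _ in range(5):
    m = infer_mapping(frames[-2], frames[-1])
    frames.append(predict_next(frames[-1], m))

print(len(frames), frames[-1].shape)   # 7 frames (2 seeds + 5 predictions) of size (64,)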

Author Information

Vincent Michalski (Université de Montréal)
Roland Memisevic (Qualcomm)
Kishore Konda (Goethe University Frankfurt)
