Poster
Modeling Deep Temporal Dependencies with Recurrent "Grammar Cells"
Vincent Michalski · Roland Memisevic · Kishore Konda
We propose modeling time series by representing the transformations that take a frame at time t to a frame at time t+1. To this end, we show how a bi-linear model of transformations, such as a gated autoencoder, can be turned into a recurrent network by training it to predict future frames from the current one and the inferred transformation, using backprop-through-time. We also show how stacking multiple layers of gating units in a recurrent pyramid makes it possible to represent the "syntax" of complicated time series, and that this can outperform standard recurrent neural networks in terms of prediction accuracy on a variety of tasks.
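To make the mechanism concrete, below is a minimal NumPy sketch of a single predictive gated-autoencoder layer in the spirit of a "grammar cell": a transformation code is inferred from two seed frames via multiplicative (bilinear) interactions between their factor projections, then applied recurrently to roll out future frames. All names and shapes here (GrammarCellSketch, U, V, W, n_fac, n_map) are illustrative assumptions, not the authors' implementation, and training via backprop-through-time is omitted.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class GrammarCellSketch:
    """Single-layer sketch of a predictive gated autoencoder.
    Weight names and shapes are illustrative assumptions, not the
    authors' released code."""

    def __init__(self, n_in, n_fac, n_map, seed=None):
        rng = np.random.default_rng(seed)
        s = 0.01
        self.U = rng.normal(0, s, (n_fac, n_in))   # factor weights for frame t
        self.V = rng.normal(0, s, (n_fac, n_in))   # factor weights for frame t+1
        self.W = rng.normal(0, s, (n_map, n_fac))  # mapping (transformation) units

    def infer_mapping(self, x_t, x_tp1):
        # Multiplicative interaction between the two frames' factor
        # projections yields the transformation code m.
        return sigmoid(self.W @ ((self.U @ x_t) * (self.V @ x_tp1)))

    def predict_next(self, m, x_t):
        # Apply the inferred transformation m to frame x_t to predict x_{t+1}.
        return self.V.T @ ((self.W.T @ m) * (self.U @ x_t))

    def rollout(self, x_t, x_tp1, n_steps):
        # Infer the transformation once from two seed frames, then apply it
        # repeatedly -- the "constant transformation" assumption that a
        # single layer makes; higher layers relax it.
        m = self.infer_mapping(x_t, x_tp1)
        frames, x = [], x_tp1
        for _ in range(n_steps):
            x = self.predict_next(m, x)
            frames.append(x)
        return frames
```

In the stacked model described in the abstract, a second layer would apply the same machinery to consecutive mapping vectors, modeling how the transformation itself changes over time; repeating this is what yields the recurrent pyramid, and prediction error backpropagated through the rollout trains all layers jointly.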
Author Information
Vincent Michalski (Université de Montréal)
Roland Memisevic (Qualcomm)
Kishore Konda (Goethe University Frankfurt)
More from the Same Authors
- 2023 Poster: Deductive Verification of Chain-of-Thought Reasoning »
  Zhan Ling · Yunhao Fang · Xuanlin Li · Zhiao Huang · Mingu Lee · Roland Memisevic · Hao Su
- 2019 Demonstration: One-on-one fitness training with an AI avatar »
  Roland Memisevic · Guillaume Berger · Tippi Puar · David Greenberg
- 2016 Poster: Architectural Complexity Measures of Recurrent Neural Networks »
  Saizheng Zhang · Yuhuai Wu · Tong Che · Zhouhan Lin · Roland Memisevic · Russ Salakhutdinov · Yoshua Bengio
- 2014 Workshop: Deep Learning and Representation Learning »
  Andrew Y Ng · Yoshua Bengio · Adam Coates · Roland Memisevic · Sharanyan Chetlur · Geoffrey E Hinton · Shamim Nemati · Bryan Catanzaro · Surya Ganguli · Herbert Jaeger · Phil Blunsom · Leon Bottou · Volodymyr Mnih · Chen-Yu Lee · Rich M Schwartz
- 2010 Poster: Gated Softmax Classification »
  Roland Memisevic · Christopher Zach · Geoffrey E Hinton · Marc Pollefeys