Poster
Ordered Memory
Yikang Shen · Shawn Tan · Arian Hosseini · Zhouhan Lin · Alessandro Sordoni · Aaron Courville
East Exhibition Hall B, C #194
Keywords: [ Memory-Augmented Neural Networks ] [ Deep Learning ] [ Applications -> Natural Language Processing ] [ Deep Learning -> Attention Models ] [ Recurrent Networks ]
Stack-augmented recurrent neural networks (RNNs) have been of interest to the deep learning community for some time. However, the difficulty of training memory models remains an obstacle to their widespread use. In this paper, we propose the Ordered Memory architecture. Inspired by Ordered Neurons (Shen et al., 2018), we introduce a new attention-based mechanism and use its cumulative probability to control the writing and erasing operations of the memory. We also introduce a new Gated Recursive Cell to compose lower-level representations into higher-level representations. We demonstrate that our model achieves strong performance on the logical inference task (Bowman et al., 2015) and the ListOps task (Nangia and Bowman, 2018). We can also interpret the model to retrieve the induced tree structures, and find that these induced structures align with the ground truth. Finally, we evaluate our model on the Stanford Sentiment Treebank tasks (Socher et al., 2013), and find that it performs comparably to state-of-the-art methods in the literature.
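To make the cumulative-probability gating idea concrete, below is a minimal NumPy sketch of how a cumulative sum over an attention distribution can act as a soft write/erase mask over ordered memory slots, in the spirit of Ordered Neurons. The function and variable names (gated_memory_update, scores, candidate) are illustrative assumptions, and the update here is deliberately simplified: it omits the candidate computation and the Gated Recursive Cell composition described in the paper.

```python
# Illustrative sketch (not the exact Ordered Memory update): cumulative
# probability over an attention distribution used as a soft erase/keep mask.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_memory_update(memory, scores, candidate):
    """memory: (n_slots, dim); scores: (n_slots,); candidate: (dim,)."""
    p = softmax(scores)            # attention over memory slots
    cum = np.cumsum(p)             # cumulative probability along the slot order
    erase = cum[:, None]           # slots past the attended position are mostly overwritten
    keep = 1.0 - erase             # earlier slots are mostly preserved
    return keep * memory + erase * candidate[None, :]

# Usage: 4 slots of dimension 3; attention peaks at slot 2, so slots 2 and 3
# are largely rewritten with the candidate while slots 0 and 1 are kept.
mem = np.zeros((4, 3))
new_mem = gated_memory_update(mem,
                              scores=np.array([0.1, 0.2, 3.0, 0.5]),
                              candidate=np.ones(3))
print(new_mem)
```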