Poster
Skip-Thought Vectors
Jamie Kiros · Yukun Zhu · Russ Salakhutdinov · Richard Zemel · Raquel Urtasun · Antonio Torralba · Sanja Fidler

Thu Dec 10 08:00 AM -- 12:00 PM (PST) @ 210 C #6

We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice. We will make our encoder publicly available.
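The abstract describes an encoder-decoder objective: encode a sentence, then decode its two neighbouring sentences. Below is a minimal sketch of that objective in PyTorch, assuming a GRU encoder and two teacher-forced GRU decoders seeded with the sentence vector (the 620/2400 dimensions follow the paper; the class names, toy data, and training loop details are illustrative, not the authors' released implementation).

```python
import torch
import torch.nn as nn

class SkipThoughts(nn.Module):
    """Encode a sentence, then decode its previous and next sentences."""

    def __init__(self, vocab_size, emb_dim=620, hid_dim=2400):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # One decoder per neighbour, each conditioned on the sentence vector.
        self.dec_prev = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.dec_next = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def encode(self, sent):
        _, h = self.encoder(self.embed(sent))
        return h  # (1, batch, hid_dim): the sentence representation

    def forward(self, sent, prev_sent, next_sent):
        h = self.encode(sent)
        out_prev, _ = self.dec_prev(self.embed(prev_sent), h)
        out_next, _ = self.dec_next(self.embed(next_sent), h)
        return self.out(out_prev), self.out(out_next)

vocab = 20000
model = SkipThoughts(vocab)
loss_fn = nn.CrossEntropyLoss(ignore_index=0)  # 0 = padding

# Toy batch: token ids for a sentence and its two neighbours.
sent, prev_s, next_s = (torch.randint(1, vocab, (2, 7)) for _ in range(3))
logits_prev, logits_next = model(sent, prev_s, next_s)

# Teacher forcing: the decoder state after token t predicts token t+1.
loss = (loss_fn(logits_prev[:, :-1].reshape(-1, vocab), prev_s[:, 1:].reshape(-1))
        + loss_fn(logits_next[:, :-1].reshape(-1, vocab), next_s[:, 1:].reshape(-1)))
loss.backward()
```

The vocabulary expansion the abstract mentions works by mapping pretrained word2vec vectors into the model's word-embedding space with a linear map fit on words shared by both vocabularies; a word never seen in training can then be embedded through that map. A rough sketch, with hypothetical randomly generated stand-ins for the aligned vector matrices:

```python
import numpy as np

# Hypothetical aligned matrices: row i in each holds the vector for the
# same word, which appears in BOTH the word2vec and model vocabularies.
w2v_shared = np.random.randn(5000, 300).astype(np.float32)
rnn_shared = np.random.randn(5000, 620).astype(np.float32)

# Least-squares fit of a linear map W such that rnn_shared ~= w2v_shared @ W.
W, *_ = np.linalg.lstsq(w2v_shared, rnn_shared, rcond=None)

def embed_unseen(w2v_vec):
    """Embed a word absent from training via its word2vec vector."""
    return w2v_vec @ W
```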

Author Information

Jamie Kiros (University of Toronto)
Yukun Zhu (University of Toronto)
Russ Salakhutdinov (University of Toronto)
Richard Zemel (University of Toronto)
Raquel Urtasun (University of Toronto)
Antonio Torralba (MIT)
Sanja Fidler (University of Toronto)