

Poster

What the Vec? Towards Probabilistically Grounded Embeddings

Carl Allen · Ivana Balazevic · Timothy Hospedales

East Exhibition Hall B + C #71

Keywords: [ Algorithms -> Representation Learning; Deep Learning ] [ Embedding Approaches ] [ Natural Language Processing ] [ Applications ]


Abstract:

Word2Vec (W2V) and GloVe are popular word embedding algorithms that perform well on a variety of natural language processing tasks. The algorithms are fast and efficient, and their embeddings are widely used. Moreover, the W2V algorithm has recently been adopted in the field of graph embedding, where it underpins several leading algorithms. However, despite their ubiquity and the relative simplicity of their common architecture, what the embedding parameters of W2V and GloVe learn, and why that is useful in downstream tasks, largely remains a mystery. We show that different interactions of PMI vectors encode semantic properties that can be captured in low-dimensional word embeddings by suitable projection, theoretically explaining why the embeddings of W2V and GloVe work and, in turn, revealing an interesting mathematical interconnection between the semantic relationships of relatedness, similarity, paraphrase and analogy.
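The abstract's central objects are PMI (pointwise mutual information) vectors and their projection into low-dimensional embeddings. As a minimal, hypothetical sketch of that general idea (not the paper's specific derivation), the toy Python below builds a PMI matrix from made-up co-occurrence counts and projects it to rank d via truncated SVD, so that inner products of the resulting low-dimensional vectors approximate PMI values. The counts, the vocabulary size, and d = 2 are arbitrary illustrative choices.

```python
import numpy as np

# Toy co-occurrence counts: C[i, j] = number of times word i appears
# in the context of word j (illustrative values, not real corpus data).
C = np.array([
    [10.0, 4.0, 1.0, 0.5],
    [ 4.0, 8.0, 2.0, 0.5],
    [ 1.0, 2.0, 6.0, 3.0],
    [ 0.5, 0.5, 3.0, 5.0],
])

total = C.sum()
p_wc = C / total                              # joint P(w, c)
p_w = C.sum(axis=1, keepdims=True) / total    # marginal P(w)
p_c = C.sum(axis=0, keepdims=True) / total    # marginal P(c)

# Pointwise mutual information: PMI(w, c) = log[ P(w, c) / (P(w) P(c)) ].
# Row i of this matrix is the "PMI vector" of word i.
PMI = np.log(p_wc / (p_w * p_c))

# Project the PMI matrix to a low dimension d via truncated SVD, splitting
# the singular values symmetrically between word and context embeddings.
d = 2
U, S, Vt = np.linalg.svd(PMI)
W = U[:, :d] * np.sqrt(S[:d])        # word embeddings (one row per word)
Ctx = Vt[:d, :].T * np.sqrt(S[:d])   # context embeddings

# Inner products of the low-dimensional vectors approximate PMI entries;
# the residual below is small when d captures most of the spectrum.
print(np.round(W @ Ctx.T - PMI, 2))
```

Under this PMI-factorization view, semantic relationships correspond to interactions of PMI vectors (e.g. sums and differences) that survive the projection; making that correspondence precise for relatedness, similarity, paraphrase and analogy is the subject of the paper.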
