

Poster

Beyond Vector Spaces: Compact Data Representation as Differentiable Weighted Graphs

Denis Mazur · Vage Egiazarian · Stanislav Morozov · Artem Babenko

East Exhibition Hall B, C #66

Keywords: [ Embedding Approaches ] [ Deep Learning ] [ Representation Learning ] [ Algorithms ]


Abstract:

Learning useful representations is a key ingredient in the success of modern machine learning. Currently, representation learning mostly relies on embedding data into Euclidean space. However, recent work has shown that data in some domains is better modeled by non-Euclidean metric spaces, and inappropriate geometry can result in inferior performance. In this paper, we aim to eliminate the inductive bias imposed by the geometry of the embedding space. Namely, we propose to map data into a more general, non-vector metric space: a weighted graph with a shortest-path distance. By design, such graphs can model arbitrary geometry given a proper configuration of edges and weights. Our main contribution is PRODIGE: a method that learns a weighted graph representation of data end-to-end by gradient descent. Greater generality and fewer model assumptions make PRODIGE more powerful than existing embedding-based approaches. We confirm the superiority of our method via extensive experiments on a wide range of tasks, including classification, compression, and collaborative filtering.
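To illustrate the core idea, the sketch below (not the authors' implementation) learns positive edge weights of a small complete graph so that its shortest-path distances match a target metric. The graph size, variable names, and the synthetic target are assumptions for the example; PRODIGE additionally learns a sparse edge structure, which this sketch omits. A Floyd-Warshall recurrence built from torch.minimum is subdifferentiable, so the graph distances can be fit by plain gradient descent:

    # Minimal sketch, assuming a tiny complete graph and a synthetic
    # target metric (distances between random 2-D points). Not PRODIGE
    # itself: edge sparsification is omitted.
    import torch

    n = 6                                        # number of nodes (hypothetical)
    params = torch.randn(n, n, requires_grad=True)

    points = torch.rand(n, 2)                    # hypothetical target metric
    target = torch.cdist(points, points)

    opt = torch.optim.Adam([params], lr=0.05)
    eye = torch.eye(n)
    for step in range(500):
        w = torch.nn.functional.softplus(params)  # positive edge weights
        w = (w + w.t()) / 2                       # undirected graph
        d = w * (1 - eye)                         # zero self-distances
        # Differentiable Floyd-Warshall: min is subdifferentiable, so
        # gradients flow along whichever paths are currently shortest.
        for k in range(n):
            d = torch.minimum(d, d[:, k:k+1] + d[k:k+1, :])
        loss = ((d - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

Here softplus keeps weights positive (so shortest paths are well defined) and symmetrization makes the graph undirected; the min in the Floyd-Warshall update routes gradients to the edges on the currently shortest paths.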
