Beyond Vector Spaces: Compact Data Representation as Differentiable Weighted Graphs
Denis Mazur · Vage Egiazarian · Stanislav Morozov · Artem Babenko

Tue Dec 10 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #66

Learning useful representations is a key ingredient in the success of modern machine learning. Currently, representation learning mostly relies on embedding data into Euclidean space. However, recent work has shown that data in some domains is better modeled by non-Euclidean metric spaces, and an inappropriate geometry can result in inferior performance. In this paper, we aim to eliminate the inductive bias imposed by the embedding space geometry. Namely, we propose to map data into a more general non-vector metric space: a weighted graph with the shortest-path distance. By design, such graphs can model arbitrary geometry with a proper configuration of edges and weights. Our main contribution is PRODIGE: a method that learns a weighted graph representation of data end-to-end by gradient descent. Greater generality and fewer model assumptions make PRODIGE more powerful than existing embedding-based approaches. We confirm the superiority of our method via extensive experiments on a wide range of tasks, including classification, compression, and collaborative filtering.
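The core idea of the abstract — representing data as a graph whose edge weights are trained by gradient descent through the shortest-path distance — can be illustrated with a toy sketch. This is not the authors' PRODIGE implementation (which additionally learns probabilistic edge existence with a sparsity penalty); it is a minimal, assumed-simplified version that fits the weights of a small complete graph so that shortest-path distances match a target metric. The key observation it exploits is that the shortest-path distance is differentiable in the edge weights: the gradient flows through the weights of the edges on the current shortest path. All names below are illustrative.

```python
import itertools
import torch

torch.manual_seed(0)

# Toy data: four points on a line; the target metric is their pairwise distance.
points = torch.tensor([0.0, 1.0, 2.0, 4.0])
n = len(points)
target = (points[:, None] - points[None, :]).abs()

# One learnable parameter per edge of the complete graph.
# Softplus keeps weights positive so shortest paths are well defined.
theta = torch.nn.Parameter(torch.zeros(n, n))

def edge_weights(theta):
    w = torch.nn.functional.softplus(theta)
    return (w + w.t()) / 2  # symmetrize: undirected graph

def shortest_path_edges(w):
    """Floyd-Warshall on detached weights; returns, for each ordered pair
    (i, j), the list of edges on a shortest i -> j path."""
    d = w.detach().clone()
    nxt = [[j for j in range(n)] for _ in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i, k] + d[k, j] < d[i, j]:
                    d[i, j] = d[i, k] + d[k, j]
                    nxt[i][j] = nxt[i][k]
    paths = {}
    for i, j in itertools.permutations(range(n), 2):
        edges, u = [], i
        while u != j:
            v = nxt[u][j]
            edges.append((u, v))
            u = v
        paths[(i, j)] = edges
    return paths

opt = torch.optim.Adam([theta], lr=0.1)
for step in range(300):
    w = edge_weights(theta)
    paths = shortest_path_edges(w)  # discrete routing, no gradient
    loss = torch.tensor(0.0)
    for (i, j), edges in paths.items():
        # Graph distance = sum of (differentiable) weights along the path.
        d_ij = sum(w[u, v] for u, v in edges)
        loss = loss + (d_ij - target[i, j]) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()

print(loss.item())  # stress between graph and target distances
```

After training, the learned weights reproduce the target metric almost exactly; on data with non-Euclidean structure, the same objective lets the graph realize geometries that no fixed-dimensional vector embedding can, which is the motivation stated in the abstract.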

Author Information

Denis Mazur (Yandex)

Vage Egiazarian (Skoltech)
Stanislav Morozov (Yandex)
Artem Babenko (Yandex)
