

Poster in Workshop: Optimal Transport and Machine Learning

Sliced Multi-Marginal Optimal Transport

Samuel Cohen · Alexander Terenin · Yannik Pitcan · Brandon Amos · Marc Deisenroth · Senanayak Sesh Kumar Karri


Abstract:

Multi-marginal optimal transport enables the comparison of multiple probability measures and increasingly finds application in multi-task learning problems. One practical limitation of multi-marginal transport is its computational scalability with respect to the number of measures, the number of samples, and the dimensionality. In this work, we propose a multi-marginal optimal transport paradigm based on random one-dimensional projections, whose (generalized) distance we term the \emph{sliced multi-marginal Wasserstein distance}. To construct this distance, we introduce a characterization of the one-dimensional multi-marginal Kantorovich problem and use it to highlight a number of properties of the sliced multi-marginal Wasserstein distance. In particular, we show that (i) the sliced multi-marginal Wasserstein distance is a (generalized) metric that induces the same topology as the standard Wasserstein distance, (ii) it admits a dimension-free sample complexity, and (iii) it is tightly connected with the problem of barycentric averaging under the sliced-Wasserstein metric. We conclude by illustrating the sliced multi-marginal Wasserstein distance on multi-task density estimation and multi-dynamics reinforcement learning problems.
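The page gives no formulas, but the abstract's construction follows the standard sliced-Wasserstein recipe: project each marginal onto a random direction, solve the resulting one-dimensional multi-marginal problem, and average over directions. One plausible way to write this (the exact cost and exponent are assumptions here, not quoted from the paper) is

\[
\mathrm{SMW}_p^p(\mu_1,\dots,\mu_P)
  = \mathbb{E}_{\theta \sim \mathcal{U}(\mathbb{S}^{d-1})}
    \Big[ \mathrm{MW}_p^p\big((P_\theta)_\#\mu_1,\dots,(P_\theta)_\#\mu_P\big) \Big],
\qquad P_\theta(x) = \langle \theta, x \rangle,
\]

where \(\mathrm{MW}_p\) is a one-dimensional multi-marginal Wasserstein distance and \((P_\theta)_\#\mu\) is the pushforward of \(\mu\) under the projection \(P_\theta\). What makes each slice cheap is that, for costs such as the pairwise quadratic one, the optimal one-dimensional coupling is monotone, so it reduces to sorting the projected samples. A minimal Monte Carlo sketch along these lines follows; the function name `sliced_mmw`, the pairwise quadratic cost, uniform projection directions, and equal sample counts per measure are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sliced_mmw(samples, n_projections=50, seed=0):
    """Monte Carlo sketch of a sliced multi-marginal Wasserstein distance.

    samples: list of P arrays, each of shape (n, d), with equal n
    (an assumption made for simplicity). Uses a pairwise quadratic
    cost; in 1D the monotone (sorted) coupling is optimal for this
    cost, so each slice reduces to sorting the projections.
    """
    rng = np.random.default_rng(seed)
    P = len(samples)
    d = samples[0].shape[1]
    total = 0.0
    for _ in range(n_projections):
        # Uniform random direction on the unit sphere S^{d-1}.
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)
        # Project each marginal onto theta and sort: the sorted
        # samples realize the monotone coupling between quantiles.
        proj = [np.sort(x @ theta) for x in samples]
        # Average pairwise squared distance between comonotone quantiles.
        cost = 0.0
        for i in range(P):
            for j in range(i + 1, P):
                cost += np.mean((proj[i] - proj[j]) ** 2)
        total += cost / (P * (P - 1) / 2)
    return np.sqrt(total / n_projections)
```

Each projection costs only a sort per measure, so the per-slice work is \(O(Pn\log n)\) rather than the exponential-in-\(P\) cost of a generic multi-marginal solver, which is consistent with the scalability motivation stated in the abstract.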
