Poster
Dimensionality Reduction for Wasserstein Barycenter
Zachary Izzo · Sandeep Silwal · Samson Zhou
The Wasserstein barycenter is a geometric construct that captures the notion of centrality among probability distributions and has found many applications in machine learning. However, most algorithms for finding even an approximate barycenter suffer an exponential dependence on the dimension $d$ of the underlying space of the distributions. In order to cope with this "curse of dimensionality," we study dimensionality reduction techniques for the Wasserstein barycenter problem. When the barycenter is restricted to support of size $n$, we show that randomized dimensionality reduction can be used to map the problem to a space of dimension $O(\log n)$ independent of both $d$ and $k$ (the number of input distributions), and that \emph{any} solution found in the reduced dimension will have its cost preserved up to arbitrarily small error in the original space. We provide matching upper and lower bounds on the size of the reduced dimension, showing that our methods are optimal up to constant factors. We also provide a coreset construction for the Wasserstein barycenter problem that significantly decreases the number of input distributions. The coresets can be used in conjunction with random projections and thus further improve computation time. Lastly, our experimental results validate the speedup provided by dimensionality reduction while maintaining solution quality.
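The randomized dimensionality reduction mentioned in the abstract is in the spirit of a Johnson–Lindenstrauss-style random projection, which approximately preserves the pairwise distances that determine Wasserstein transport costs. The sketch below is a minimal illustration of that general idea (a random Gaussian projection applied to support points), not the authors' exact construction; the dimensions `d`, `n`, and `m` are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 1000, 50   # original dimension, number of support points
m = 32            # reduced dimension, on the order of O(log n)

# Support points of a distribution in R^d (synthetic data for illustration).
X = rng.normal(size=(n, d))

# Random Gaussian projection R^d -> R^m, scaled so distances are
# preserved in expectation (Johnson-Lindenstrauss style).
G = rng.normal(size=(d, m)) / np.sqrt(m)
Y = X @ G

def pdist(Z):
    """All pairwise Euclidean distances between rows of Z."""
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

orig = pdist(X)
proj = pdist(Y)
mask = orig > 0                  # ignore the zero diagonal
ratio = proj[mask] / orig[mask]  # distortion of each pairwise distance

# The ratios concentrate around 1: distances (and hence transport costs
# built from them) survive the projection up to small multiplicative error.
print(ratio.min(), ratio.max())
```

Because the guarantee in the paper is that any solution found in the reduced space has its cost preserved in the original space, one can run a barycenter solver entirely on the projected points and avoid the exponential dependence on $d$.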
Author Information
Zachary Izzo (Stanford University)
Sandeep Silwal (Massachusetts Institute of Technology)
Samson Zhou (Carnegie Mellon University)
More from the Same Authors
- 2022 : Data-driven subgroup identification for linear regression
  Zachary Izzo · Ruishan Liu · James Zou
- 2022 : Provable Re-Identification Privacy
  Zachary Izzo · Jinsung Yoon · Sercan Arik · James Zou
- 2022 Spotlight: Lightning Talks 4A-2
  Barakeel Fanseu Kamhoua · Hualin Zhang · Taiki Miyagawa · Tomoya Murata · Xin Lyu · Yan Dai · Elena Grigorescu · Zhipeng Tu · Lijun Zhang · Taiji Suzuki · Wei Jiang · Haipeng Luo · Lin Zhang · Xi Wang · Young-San Lin · Huan Xiong · Liyu Chen · Bin Gu · Jinfeng Yi · Yongqiang Chen · Sandeep Silwal · Yiguang Hong · Maoyuan Song · Lei Wang · Tianbao Yang · Han Yang · MA Kaili · Samson Zhou · Deming Yuan · Bo Han · Guodong Shi · Bo Li · James Cheng
- 2022 Spotlight: Learning-Augmented Algorithms for Online Linear and Semidefinite Programming
  Elena Grigorescu · Young-San Lin · Sandeep Silwal · Maoyuan Song · Samson Zhou
- 2022 Panel: Panel 3C-2: Rethinking Knowledge Graph… & Faster Linear Algebra…
  Sandeep Silwal · Haotong Yang
- 2022 Poster: Faster Linear Algebra for Distance Matrices
  Piotr Indyk · Sandeep Silwal
- 2022 Poster: Learning-Augmented Algorithms for Online Linear and Semidefinite Programming
  Elena Grigorescu · Young-San Lin · Sandeep Silwal · Maoyuan Song · Samson Zhou
- 2022 Poster: Exponentially Improving the Complexity of Simulating the Weisfeiler-Lehman Test with Graph Neural Networks
  Anders Aamand · Justin Chen · Piotr Indyk · Shyam Narayanan · Ronitt Rubinfeld · Nicholas Schiefer · Sandeep Silwal · Tal Wagner
- 2021 Poster: Adversarial Robustness of Streaming Algorithms through Importance Sampling
  Vladimir Braverman · Avinatan Hassidim · Yossi Matias · Mariano Schain · Sandeep Silwal · Samson Zhou