Poster
Fast Convergence of Langevin Dynamics on Manifold: Geodesics meet Log-Sobolev
Xiao Wang · Qi Lei · Ioannis Panageas
Sampling is a fundamental and important task with numerous applications in Machine Learning. One approach to sampling from a high-dimensional distribution $e^{-f}$ for some function $f$ is the Langevin Algorithm (LA). Recently there has been substantial progress in establishing fast convergence of LA even when $f$ is non-convex, notably in \cite{VW19} and \cite{MoritaRisteski}: the former focuses on functions $f$ defined on $\mathbb{R}^n$, while the latter focuses on functions with symmetries (such as matrix-completion-type objectives) that carry a manifold structure. Our work generalizes the results of \cite{VW19} to the setting where $f$ is defined on a manifold $M$ rather than $\mathbb{R}^n$. From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution $e^{-f}$ satisfies a log-Sobolev inequality on $M$.
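For concreteness, the following is a minimal sketch of the objects involved, written in assumed notation (step size $\eta$, Gaussian noise $\xi_k$, log-Sobolev constant $\alpha$) rather than quoted from the paper. The Euclidean Langevin Algorithm iterates
$$x_{k+1} = x_k - \eta\, \nabla f(x_k) + \sqrt{2\eta}\, \xi_k, \qquad \xi_k \sim \mathcal{N}(0, I_n),$$
and a natural manifold analogue replaces the straight-line step by a geodesic step, e.g. $x_{k+1} = \exp_{x_k}\!\big(-\eta\, \mathrm{grad} f(x_k) + \sqrt{2\eta}\, \xi_k\big)$ with $\xi_k$ Gaussian in the tangent space $T_{x_k}M$. A log-Sobolev inequality with constant $\alpha > 0$ for the target $\nu \propto e^{-f}$ states that
$$\mathrm{KL}(\rho \,\|\, \nu) \;\le\; \frac{1}{2\alpha} \int_M \Big\| \nabla \log \frac{\rho}{\nu} \Big\|^2 \rho \, dV$$
for all densities $\rho$ on $M$; along the continuous-time Langevin dynamics this yields geometric decay of the form $\mathrm{KL}(\rho_t \,\|\, \nu) \le e^{-2\alpha t}\, \mathrm{KL}(\rho_0 \,\|\, \nu)$.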
Author Information
Xiao Wang (Singapore University of Technology and Design)
Qi Lei (Princeton University)
Ioannis Panageas (UC Irvine)
More from the Same Authors
- 2022 Spotlight: On Scrambling Phenomena for Randomly Initialized Recurrent Networks
  Vaggos Chatziafratis · Ioannis Panageas · Clayton Sanford · Stelios Stavroulakis
- 2022 Workshop: NeurIPS 2022 Workshop on Meta-Learning
  Huaxiu Yao · Eleni Triantafillou · Fabio Ferreira · Joaquin Vanschoren · Qi Lei
- 2022 Poster: Optimistic Mirror Descent Either Converges to Nash or to Strong Coarse Correlated Equilibria in Bimatrix Games
  Ioannis Anagnostides · Gabriele Farina · Ioannis Panageas · Tuomas Sandholm
- 2022 Poster: On Scrambling Phenomena for Randomly Initialized Recurrent Networks
  Vaggos Chatziafratis · Ioannis Panageas · Clayton Sanford · Stelios Stavroulakis
- 2019 Poster: First-order methods almost always avoid saddle points: The case of vanishing step-sizes
  Ioannis Panageas · Georgios Piliouras · Xiao Wang