

Poster

Double Randomized Underdamped Langevin with Dimension-Independent Convergence Guarantee

Yuanshi Liu · Cong Fang · Tong Zhang

Great Hall & Hall B1+B2 (level 1) #1209

Abstract: This paper focuses on high-dimensional sampling from log-concave distributions with a composite structure: p(dx) ∝ exp(−g(x) − f(x))dx. We develop a double randomization technique that leads to a fast underdamped Langevin algorithm with a dimension-independent convergence guarantee. We prove that the algorithm enjoys an overall Õ(tr(H)^{1/3}/ϵ^{2/3}) iteration complexity to reach an ϵ-tolerated sample whose distribution p̂ satisfies W₂(p̂, p) ≤ ϵ. Here, H is an upper bound on the Hessian matrices of f and does not explicitly depend on the dimension d. For posterior sampling over linear models with normalized data, we show a clear superiority in convergence rate: it is dimension-free and outperforms the previous best-known results by a factor of d^{1/3}. The analysis needed to achieve this faster convergence rate brings new insights into high-dimensional sampling.
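For readers unfamiliar with the underlying dynamics, the sketch below shows a plain Euler-Maruyama discretization of underdamped (kinetic) Langevin dynamics targeting p(x) ∝ exp(−U(x)), with U playing the role of g + f in the abstract's notation. This is a minimal baseline sampler for illustration only; it does not implement the paper's double randomization technique, and the function name, step size, and friction coefficient are assumptions, not the authors' settings.

```python
import numpy as np

def underdamped_langevin(grad_U, x0, n_steps=20000, step=0.05, gamma=2.0, seed=0):
    """Baseline Euler-Maruyama discretization of underdamped Langevin dynamics
    targeting p(x) ∝ exp(-U(x)). Illustrative sketch only; NOT the paper's
    double-randomized algorithm."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)                      # velocity (momentum) variable
    noise_scale = np.sqrt(2.0 * gamma * step) # fluctuation-dissipation balance
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        # friction, gradient force, and Brownian noise act on the velocity
        v += step * (-gamma * v - grad_U(x)) + noise_scale * rng.standard_normal(x.shape)
        x += step * v                         # position follows the velocity
        samples[t] = x
    return samples

# Example target: standard 2-D Gaussian, U(x) = ||x||^2 / 2, so grad_U(x) = x.
samps = underdamped_langevin(lambda x: x, x0=np.ones(2))
burn = samps[5000:]  # discard burn-in before estimating moments
```

With this quadratic potential the chain should mix quickly, and the post-burn-in samples should have mean near 0 and variance near 1 up to discretization bias of order the step size.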
