

Poster

Optimal Underdamped Langevin MCMC Method

Zhengmian Hu · Feihu Huang · Heng Huang

Virtual

Keywords: [ Generative Model ]


Abstract: In this paper, we study the underdamped Langevin diffusion (ULD) with a strongly-convex potential consisting of a finite sum of $N$ smooth components, and propose an efficient discretization method that requires $O(N+d^\frac{1}{3}N^\frac{2}{3}/\varepsilon^\frac{2}{3})$ gradient evaluations to achieve $\varepsilon$-error (in $\sqrt{\mathbb{E}\lVert\cdot\rVert_2^2}$ distance) for approximating the $d$-dimensional ULD. Moreover, we prove a lower bound on the gradient complexity of $\Omega(N+d^\frac{1}{3}N^\frac{2}{3}/\varepsilon^\frac{2}{3})$, which shows that our method is optimal in its dependence on $N$, $\varepsilon$, and $d$. In particular, we apply our method to sample from strongly-log-concave distributions and obtain a gradient complexity better than all existing gradient-based sampling algorithms. Experimental results on both synthetic and real-world data show that our new method consistently outperforms existing ULD approaches.
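For readers unfamiliar with the setup, the sketch below illustrates the underdamped Langevin dynamics the abstract refers to, using a plain Euler-Maruyama step on a strongly-convex quadratic potential. This is not the paper's optimal discretization; the step size, friction coefficient, and target below are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): Euler-Maruyama discretization of
# underdamped Langevin diffusion
#   dx = v dt
#   dv = -gamma * v dt - grad_f(x) dt + sqrt(2 * gamma) dB_t
# for the strongly-convex quadratic potential f(x) = 0.5 * x^T A x,
# whose stationary x-marginal is N(0, A^{-1}).
import numpy as np

def grad_f(x, A):
    """Gradient of the quadratic potential f(x) = 0.5 * x^T A x."""
    return A @ x

def uld_euler(A, n_steps=20000, h=0.01, gamma=2.0, seed=0):
    rng = np.random.default_rng(seed)
    d = A.shape[0]
    x = np.zeros(d)   # position
    v = np.zeros(d)   # velocity
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(d)
        # Velocity update: friction, gradient drift, and Brownian noise.
        v = v - h * (gamma * v + grad_f(x, A)) + np.sqrt(2.0 * gamma * h) * noise
        # Position update driven by the velocity.
        x = x + h * v
        samples.append(x.copy())
    return np.array(samples)

if __name__ == "__main__":
    A = np.array([[2.0, 0.5], [0.5, 1.0]])  # positive definite => strongly convex
    samples = uld_euler(A)
    # Discard burn-in and compare empirical covariance to the true covariance A^{-1}.
    print("empirical cov:\n", np.cov(samples[5000:].T))
    print("true cov:\n", np.linalg.inv(A))
```

The paper's contribution concerns a more refined discretization whose gradient complexity matches the stated lower bound; the example above only shows the generic ULD update that such methods discretize.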
