Asymptotically Optimal Exact Minibatch Metropolis-Hastings
Ruqi Zhang, A. Feder Cooper, Christopher De Sa
Spotlight presentation: Orals & Spotlights Track 25: Probabilistic Models/Statistics
on Thu, Dec 10th, 2020 @ 15:50 – 16:00 GMT
Poster Session 6
on Thu, Dec 10th, 2020 @ 17:00 – 19:00 GMT
GatherTown: Probabilistic Methods (Town B0 - Spot D1)
Abstract: Metropolis-Hastings (MH) is a commonly-used MCMC algorithm, but it can be intractable on large datasets due to requiring computations over the whole dataset. In this paper, we study \emph{minibatch MH} methods, which instead use subsamples to enable scaling. We observe that most existing minibatch MH methods are inexact (i.e. they may change the target distribution), and show that this inexactness can cause arbitrarily large errors in inference. We propose a new exact minibatch MH method, \emph{TunaMH}, which exposes a tunable trade-off between its minibatch size and its theoretically guaranteed convergence rate. We prove a lower bound on the batch size that any minibatch MH method \emph{must} use to retain exactness while guaranteeing fast convergence---the first such bound for minibatch MH---and show TunaMH is asymptotically optimal in terms of the batch size. Empirically, we show TunaMH outperforms other exact minibatch MH methods on robust linear regression, truncated Gaussian mixtures, and logistic regression.
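To make the full-dataset bottleneck concrete, below is a minimal Python sketch of standard (non-minibatch) Metropolis-Hastings. This is an illustration under our own assumptions, not the paper's code and not TunaMH itself; the function names and the toy Gaussian-mean model are hypothetical. Note that every accept/reject decision sums the per-datapoint log-likelihood over the whole dataset, which is the O(N)-per-step cost that minibatch MH methods aim to reduce.

import numpy as np

def metropolis_hastings(log_lik, log_prior, data, theta0,
                        proposal_std=0.1, n_steps=1000, seed=0):
    # Vanilla MH for a scalar parameter with a symmetric Gaussian proposal
    # (symmetric, so no Hastings correction term is needed).
    rng = np.random.default_rng(seed)
    theta = float(theta0)
    log_post = log_prior(theta) + log_lik(theta, data).sum()
    samples = np.empty(n_steps)
    for t in range(n_steps):
        prop = theta + proposal_std * rng.standard_normal()
        # Full pass over the dataset on every step -- the cost that
        # minibatch MH methods replace with a subsample.
        prop_log_post = log_prior(prop) + log_lik(prop, data).sum()
        # Accept with probability min(1, pi(prop) / pi(theta)).
        if np.log(rng.uniform()) < prop_log_post - log_post:
            theta, log_post = prop, prop_log_post
        samples[t] = theta
    return samples

# Toy usage (hypothetical model): posterior over the mean of a
# unit-variance Gaussian with a standard normal prior.
rng = np.random.default_rng(1)
data = rng.normal(3.0, 1.0, size=10_000)      # large N makes each step costly
log_lik = lambda th, x: -0.5 * (x - th) ** 2  # per-datapoint log-likelihood
log_prior = lambda th: -0.5 * th ** 2         # standard normal log-prior (up to a constant)
chain = metropolis_hastings(log_lik, log_prior, data, theta0=0.0, n_steps=2000)
print("posterior mean estimate:", chain[500:].mean())

Per the abstract, an exact minibatch method such as TunaMH replaces the full-data sums above with computations over a randomized, tunably sized subsample while provably leaving the target distribution unchanged.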