Poster
Local-Global MCMC kernels: the best of both worlds
Sergey Samsonov · Evgeny Lagutin · Marylou Gabrié · Alain Durmus · Alexey Naumov · Eric Moulines

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #143
Recent works leveraging learning to enhance sampling have shown promising results, in particular by designing effective non-local moves and global proposals. However, learning accuracy is inevitably limited in regions where little data is available, such as in the tails of distributions, as well as in high-dimensional problems. In the present paper, we study an Explore-Exploit Markov chain Monte Carlo strategy ($\operatorname{Ex^2MCMC}$) that combines local and global samplers, showing that it enjoys the advantages of both approaches. We prove $V$-uniform geometric ergodicity of $\operatorname{Ex^2MCMC}$ without requiring a uniform adaptation of the global sampler to the target distribution. We also compute explicit bounds on the mixing rate of the Explore-Exploit strategy under realistic conditions. Moreover, we propose an adaptive version of the strategy ($\operatorname{FlEx^2MCMC}$) in which a normalizing flow is trained while sampling to serve as a proposal for global moves. We illustrate the efficiency of $\operatorname{Ex^2MCMC}$ and its adaptive version on classical sampling benchmarks as well as in sampling high-dimensional distributions defined by Generative Adversarial Networks viewed as Energy-Based Models.
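
The abstract describes the explore-exploit idea only at a high level. The sketch below is a minimal, illustrative Python rendering of one such kernel step, not the authors' implementation: it alternates a global importance-resampling move from a fixed wide Gaussian proposal (a stand-in for the learned normalizing-flow proposal used in $\operatorname{FlEx^2MCMC}$) with a few local MALA refinement steps. The two-component Gaussian mixture target, the proposal scale, the number of candidates, and all step sizes are hypothetical choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log density of a toy two-component Gaussian mixture in 2D.
    centers = np.array([[-2.0, 0.0], [2.0, 0.0]])
    d2 = ((x - centers) ** 2).sum(axis=-1)          # squared distance to each mode
    return np.logaddexp(-0.5 * d2[0], -0.5 * d2[1])

def grad_log_target(x, eps=1e-5):
    # Finite-difference gradient of log_target; adequate for a toy example.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (log_target(x + e) - log_target(x - e)) / (2 * eps)
    return g

def global_move(x, n_props=10, prop_scale=3.0):
    # Exploration: draw candidates from a wide fixed proposal, keep the
    # current state as one candidate, and resample one according to
    # importance weights (a simple importance-resampling global move).
    cands = prop_scale * rng.standard_normal((n_props, x.size))
    cands[0] = x
    log_q = -0.5 * (cands ** 2).sum(axis=1) / prop_scale ** 2
    log_w = np.array([log_target(c) for c in cands]) - log_q
    w = np.exp(log_w - log_w.max())
    idx = rng.choice(n_props, p=w / w.sum())
    return cands[idx]

def local_move(x, step=0.1):
    # Exploitation: one MALA step refining the state near the current mode.
    mean_x = x + 0.5 * step * grad_log_target(x)
    y = mean_x + np.sqrt(step) * rng.standard_normal(x.shape)
    mean_y = y + 0.5 * step * grad_log_target(y)
    log_alpha = (log_target(y) - log_target(x)
                 - ((x - mean_y) ** 2).sum() / (2 * step)
                 + ((y - mean_x) ** 2).sum() / (2 * step))
    return y if np.log(rng.uniform()) < log_alpha else x

def explore_exploit_step(x, n_local=5):
    # One explore-exploit kernel step: a global move followed by local refinement.
    x = global_move(x)
    for _ in range(n_local):
        x = local_move(x)
    return x

x = np.zeros(2)
samples = []
for _ in range(2000):
    x = explore_exploit_step(x)
    samples.append(x.copy())
samples = np.array(samples)
print("fraction of samples in left / right mode:",
      np.mean(samples[:, 0] < 0), np.mean(samples[:, 0] > 0))
```

The intent this sketch illustrates is the one in the title: the global move lets the chain jump between modes even when the proposal fits the target only roughly, while the local steps control the accuracy of sampling within each mode, combining the strengths of both kernels.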

Author Information

Sergey Samsonov (National Research University Higher School of Economics)
Evgeny Lagutin (Moscow Institute of Physics and Technology)
Marylou Gabrié (NYU / Flatiron Institute)
Alain Durmus (Ecole polytechnique)
Alexey Naumov (HSE University)
Eric Moulines (Ecole Polytechnique)
