

Poster

Online Posterior Sampling with a Diffusion Prior

Branislav Kveton · Boris Oreshkin · Youngsuk Park · Aniket Anand Deshmukh · Rui Song


Abstract:

Posterior sampling in contextual bandits with a Gaussian prior can be implemented exactly, or approximately using the Laplace approximation. The Gaussian prior is computationally efficient, but it cannot describe complex distributions. In this work, we propose approximate posterior sampling algorithms for contextual bandits with a diffusion model prior. The key idea is to sample from a chain of approximate conditional posteriors, one for each stage of the reverse process, which are estimated in closed form using the Laplace approximation. Our approximations are motivated by posterior sampling with a Gaussian prior, and inherit its simplicity and efficiency. They are asymptotically consistent and perform well empirically on a variety of contextual bandit problems.
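To make the recipe in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' code) of the two ingredients it mentions: the closed-form Gaussian-prior posterior update used in linear contextual bandits, and a chain of Gaussian conditional posteriors walked backward through the stages of a reverse diffusion process. Everything here is a placeholder assumption: the `gaussian_posterior` helper, the toy `mean_fn` denoiser, the `betas` noise schedule, and all constants are illustrative and do not reproduce the paper's algorithm.

```python
# Hedged sketch: conjugate Gaussian posterior sampling in a linear bandit,
# then an illustrative chain of per-stage Gaussian conditional posteriors
# through a reverse diffusion process. All names and constants are
# placeholders, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)
d, sigma2 = 4, 0.25          # parameter dimension, reward noise variance

def gaussian_posterior(mu0, Sigma0, X, y):
    """Closed-form posterior of theta ~ N(mu0, Sigma0) given rewards
    y = X @ theta + noise with variance sigma2 (standard conjugate update)."""
    prec = np.linalg.inv(Sigma0) + X.T @ X / sigma2
    Sigma = np.linalg.inv(prec)
    mu = Sigma @ (np.linalg.inv(Sigma0) @ mu0 + X.T @ y / sigma2)
    return mu, Sigma

# --- Gaussian prior: one conjugate update, then sample (Thompson-style step) ---
X = rng.normal(size=(20, d))                    # observed contexts
theta_true = rng.normal(size=d)
y = X @ theta_true + rng.normal(scale=np.sqrt(sigma2), size=20)
mu, Sigma = gaussian_posterior(np.zeros(d), np.eye(d), X, y)
theta_sample = rng.multivariate_normal(mu, Sigma)

# --- Diffusion prior (illustrative only): at each reverse stage t, treat the
# diffusion transition as a Gaussian prior for that stage, combine it with the
# reward likelihood via the same conjugate update, and sample. ---
T = 10
betas = np.linspace(1e-3, 0.2, T)               # placeholder noise schedule

def mean_fn(theta_t, t):
    """Placeholder denoiser: in practice a learned network would predict
    the mean of the diffusion transition p(theta_{t-1} | theta_t)."""
    return theta_t * np.sqrt(1.0 - betas[t])

theta_t = rng.normal(size=d)                    # start from the prior at stage T
for t in reversed(range(T)):
    m_t = mean_fn(theta_t, t)                   # stage-t transition mean
    S_t = betas[t] * np.eye(d)                  # stage-t transition covariance
    mu_t, Sigma_t = gaussian_posterior(m_t, S_t, X, y)
    theta_t = rng.multivariate_normal(mu_t, Sigma_t)

print("posterior sample (Gaussian prior):", theta_sample)
print("posterior sample (diffusion-style chain):", theta_t)
```

The sketch is only meant to show the shape of the computation: each reverse stage reduces to the same kind of closed-form Gaussian update used with a Gaussian prior, which is where the claimed simplicity and efficiency come from.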
