Local Expectation Gradients for Black Box Variational Inference
Michalis Titsias · Miguel Lázaro-Gredilla

Mon Dec 07 04:00 PM -- 08:59 PM (PST) @ 210 C #46

We introduce local expectation gradients, a general-purpose stochastic variational inference algorithm that constructs stochastic gradients by sampling from the variational distribution. The algorithm divides the problem of estimating the stochastic gradients over multiple variational parameters into smaller sub-tasks, so that each sub-task intelligently explores the most relevant part of the variational distribution. This is achieved by performing an exact expectation over the single random variable that most correlates with the variational parameter of interest, resulting in a Rao-Blackwellized estimate with low variance. Our method works efficiently for both continuous and discrete random variables. Furthermore, the proposed algorithm has interesting similarities with Gibbs sampling but, unlike Gibbs sampling, can be trivially parallelized.
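To make the idea concrete, here is a minimal sketch of the estimator for a fully factorized Bernoulli variational distribution. The function name, parameterization (logits with a sigmoid), and target function are illustrative assumptions for this sketch, not the authors' implementation: for each variational parameter, the expectation over the corresponding binary variable is computed exactly (a sum over {0, 1}), while the remaining variables are sampled from the variational distribution, yielding the Rao-Blackwellized gradient described above.

```python
import numpy as np


def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))


def local_expectation_grad(f, theta, num_samples=20, rng=None):
    """Estimate d/dtheta_i E_{q(x|theta)}[f(x)] for a factorized
    Bernoulli q(x|theta) with q(x_i = 1) = sigmoid(theta_i).

    For each coordinate i, the expectation over x_i is taken exactly
    (an explicit sum over its two values), while the other coordinates
    are sampled from q -- the Rao-Blackwellization that gives the
    local expectation gradients estimator its low variance.
    """
    rng = np.random.default_rng(rng)
    p = sigmoid(theta)
    D = theta.size
    grad = np.zeros(D)
    for _ in range(num_samples):
        # Draw one joint sample of all variables from q.
        x = (rng.random(D) < p).astype(float)
        for i in range(D):
            # Hold x_{-i} fixed; enumerate both values of x_i exactly.
            x1, x0 = x.copy(), x.copy()
            x1[i], x0[i] = 1.0, 0.0
            # d q(x_i = 1) / d theta_i = p_i (1 - p_i), so the exact
            # inner sum collapses to p_i (1 - p_i) (f(x_i=1) - f(x_i=0)).
            grad[i] += p[i] * (1.0 - p[i]) * (f(x1) - f(x0))
    return grad / num_samples
```

Note that each coordinate's inner sum depends only on the sampled values of the other coordinates, so the per-parameter sub-tasks are independent and, as the abstract points out, trivially parallelizable. For a linear target f(x) = w·x the estimate is exact with zero variance, since f(x_i=1) − f(x_i=0) = w_i regardless of the sampled configuration.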

Author Information

Michalis Titsias (Athens University of Economics and Business)
Miguel Lázaro-Gredilla (Vicarious)