
Excess Risk Bounds for the Bayes Risk using Variational Inference in Latent Gaussian Models
Rishit Sheth · Roni Khardon

Wed Dec 06 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #183

Bayesian models are established as one of the main successful paradigms for complex problems in machine learning. To handle intractable inference, research in this area has developed new approximation methods that are fast and effective. However, theoretical analysis of the performance of such approximations is not well developed. The paper furthers such analysis by providing bounds on the excess risk of variational inference algorithms and related regularized loss minimization algorithms for a large class of latent variable models with Gaussian latent variables. We strengthen previous results for variational algorithms by showing they are competitive with any point-estimate predictor. Unlike previous work, we also provide bounds on the risk of the *Bayesian* predictor, and not just the risk of the Gibbs predictor, for the same approximate posterior. The bounds are applied in complex models including sparse Gaussian processes and correlated topic models. Theoretical results are complemented by identifying novel approximations to the Bayesian objective that attempt to minimize the risk directly. An empirical evaluation compares the variational and new algorithms, shedding further light on their performance.
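The distinction the abstract draws between the Gibbs predictor and the Bayesian predictor can be illustrated with a minimal sketch. Assuming a toy logistic likelihood with a Gaussian approximate posterior q(w) over the latent weights (all numbers below are hypothetical, not from the paper), the Gibbs predictor predicts with a single sampled w, so its risk is the average loss over draws, while the Bayesian predictor averages the predictive distribution over q before scoring:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical Gaussian approximate posterior q(w) = N(mean, cov) over the
# latent weights of a logistic likelihood (toy numbers, for illustration).
mean = np.array([1.0, -0.5])
cov = np.diag([0.4, 0.4])

x = np.array([0.8, 1.2])   # one test input, with true label y = 1
w = rng.multivariate_normal(mean, cov, size=5000)  # draws w ~ q

p = sigmoid(w @ x)  # predictive probability of y = 1 under each draw

# Gibbs predictor: predict with a single sampled w; its expected log-loss
# is the average of the per-sample losses.
gibbs_logloss = np.mean(-np.log(p))

# Bayesian predictor: average the predictive distribution over q first,
# then score the averaged probability.
bayes_logloss = -np.log(np.mean(p))

# By Jensen's inequality (concavity of log), the Bayesian predictor's
# log-loss never exceeds the Gibbs predictor's expected log-loss.
print(bayes_logloss <= gibbs_logloss)  # -> True
```

This gap is why a bound on the Gibbs risk does not automatically transfer to the Bayesian predictor: the two risks differ by a Jensen term, which the paper's analysis addresses directly.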

Author Information

Rishit Sheth (Microsoft Research New England)
Roni Khardon (Indiana University, Bloomington)
