Workshop

Approximate Bayesian Inference in Continuous/Hybrid Models

Matthias Seeger · David Barber · Neil D Lawrence · Onno Zoeter

Hilton: Diamond Head

Fri 7 Dec, 7:30 a.m. PST

Deterministic (variational) techniques are used throughout Machine Learning to approximate Bayesian inference in continuous- and hybrid-variable problems. In contrast to discrete-variable approximations, surprisingly little is known about their convergence, approximation quality, numerical stability, specific biases, and the differential strengths and weaknesses of known methods. In this workshop, we aim to highlight important open problems and to gather ideas for addressing them. The target audience includes practitioners, who can provide insight into and analysis of problems with particular methods or comparative studies of several methods, as well as theoreticians interested in characterizing the hardness of continuous distributions or proving relevant properties of an established method. We especially welcome contributions from Statistics (Markov Chain Monte Carlo), Information Geometry, Optimal Filtering, and other related fields that make an effort to bridge the gap towards variational techniques.
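To make the workshop topic concrete, the following is a minimal sketch of one deterministic approximation, the Laplace approximation, applied to a toy one-dimensional continuous posterior. The target density (a Gaussian prior times a logistic likelihood term) and all function names are illustrative choices, not taken from any workshop contribution; the sketch fits a Gaussian q(x) = N(mode, -1/Hessian) at the posterior mode.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy unnormalized log posterior (illustrative): Gaussian prior N(1, 1)
# combined with a logistic likelihood factor sigmoid(3x).
def grad_log_p(x):
    return -(x - 1.0) + 3.0 * (1.0 - sigmoid(3.0 * x))

def hess_log_p(x):
    s = sigmoid(3.0 * x)
    return -1.0 - 9.0 * s * (1.0 - s)

def laplace_approx(x0=0.0, iters=50):
    """Newton iteration to the posterior mode; the negative inverse
    curvature at the mode gives the variance of the Gaussian
    approximation q(x) = N(mode, -1/hess_log_p(mode))."""
    x = x0
    for _ in range(iters):
        x = x - grad_log_p(x) / hess_log_p(x)
    return x, -1.0 / hess_log_p(x)

mode, var = laplace_approx()
```

Questions the workshop raises apply even to this two-line method: the approximation matches only local curvature at the mode, so its bias for skewed or multimodal continuous posteriors is exactly the kind of property that is well understood empirically but rarely characterized formally.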
