Spotlight
Evaluating probabilities under high-dimensional latent variable models
Iain Murray · Russ Salakhutdinov

Mon Dec 8th 08:39 -- 08:40 PM

We present a simple new Monte Carlo algorithm for evaluating probabilities of observations in complex latent variable models, such as Deep Belief Networks. While the method is based on Markov chains, estimates based on short runs are formally unbiased. In expectation, the log probability of a test set will be underestimated, and this could form the basis of a probabilistic bound. The method is much cheaper than gold-standard annealing-based methods and only slightly more expensive than the cheapest Monte Carlo methods. We give examples of the new method substantially improving simple variational bounds at modest extra cost.
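The key fact the abstract relies on can be seen in even the simplest Monte Carlo setting: an unbiased estimator of p(v) yields, by Jensen's inequality, a log that underestimates log p(v) in expectation. The sketch below is not the authors' algorithm (which uses Markov chains); it is a hypothetical toy model with a discrete latent z and Gaussian observation, where sampling z from the prior and averaging p(v|z) gives an unbiased estimate of p(v) whose log is biased downward.

```python
import math
import random

random.seed(0)

# Hypothetical toy latent variable model (illustration only):
# latent z has prior pi, observation v | z ~ N(mu[z], 1).
pi = [0.5, 0.3, 0.2]
mu = [-2.0, 0.0, 3.0]

def lik(v, z):
    # Gaussian density N(v; mu[z], 1)
    return math.exp(-0.5 * (v - mu[z]) ** 2) / math.sqrt(2 * math.pi)

v = 0.5
# Exact marginal p(v) = sum_z pi[z] * p(v | z), available here because
# the toy latent space is tiny (it is intractable in deep models).
exact = sum(p * lik(v, z) for z, p in enumerate(pi))

def estimate(n):
    # Simple Monte Carlo: draw z from the prior, average p(v | z).
    # This estimator is unbiased for p(v).
    zs = random.choices(range(len(pi)), weights=pi, k=n)
    return sum(lik(v, z) for z in zs) / n

runs = [estimate(10) for _ in range(20000)]
mean_est = sum(runs) / len(runs)                        # ~= exact (unbiased)
mean_log = sum(math.log(r) for r in runs) / len(runs)   # <= log(exact) (Jensen)
```

Running this, `mean_est` concentrates around the true p(v), while `mean_log` sits strictly below log p(v): the log of a short-run estimate underestimates on average, which is exactly the property the abstract suggests could underpin a probabilistic bound on test-set log probability.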

Author Information

Iain Murray (University of Edinburgh)

Iain Murray is a SICSA Lecturer in Machine Learning at the University of Edinburgh. Iain was introduced to machine learning by David MacKay and Zoubin Ghahramani, both previous NIPS tutorial speakers. He obtained his PhD in 2007 from the Gatsby Computational Neuroscience Unit at UCL. His thesis on Monte Carlo methods received an honourable mention for the ISBA Savage Award. He was a Commonwealth Fellow in Machine Learning at the University of Toronto before moving to Edinburgh in 2010. Iain's research interests include building flexible probabilistic models of data and performing probabilistic inference from indirect and uncertain observations. Iain is passionate about teaching. He has lectured at several summer schools, is listed among the top 15 authors on videolectures.net, and was awarded the EUSA Van Heyningen Award for Teaching in Science and Engineering in 2015.

Russ Salakhutdinov (Carnegie Mellon University)
