NIPS 2009


Workshop

Nonparametric Bayes

Dilan Gorur · Francois Caron · Yee Whye Teh · David B Dunson · Zoubin Ghahramani · Michael Jordan

Westin: Emerald A

One of the major problems driving current research in statistical machine learning is the search for ways to exploit highly structured models that are both expressive and tractable. Bayesian nonparametrics (BNP) provides a framework for developing robust and flexible models that can accurately represent the complex structure in the data. Model flexibility is achieved by assigning priors with unbounded capacity, and overfitting is prevented by the Bayesian approach of integrating out all parameters and latent variables. Inference is typically performed with approximation techniques such as Markov chain Monte Carlo and variational Bayes. As a result, the model can automatically infer the amount of complexity required to model the given data.
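To make the notion of a prior with unbounded capacity concrete, the following minimal sketch (not part of the workshop materials) samples from the Chinese restaurant process, the partition prior induced by a Dirichlet process; the function name `sample_crp` and the concentration parameter `alpha` are illustrative choices, not a reference implementation.

```python
import numpy as np

def sample_crp(n_customers, alpha, rng=None):
    """Sample cluster assignments from a Chinese restaurant process.

    The number of occupied tables (clusters) is not fixed in advance:
    it grows with the data, which is the sense in which the prior has
    unbounded capacity.
    """
    rng = np.random.default_rng(rng)
    assignments = [0]      # the first customer opens the first table
    table_counts = [1]

    for _ in range(1, n_customers):
        # Existing tables are chosen with probability proportional to
        # their occupancy; a new table opens with probability
        # proportional to the concentration parameter alpha.
        weights = np.array(table_counts + [alpha], dtype=float)
        table = rng.choice(len(weights), p=weights / weights.sum())
        if table == len(table_counts):
            table_counts.append(1)  # new cluster
        else:
            table_counts[table] += 1
        assignments.append(table)

    return assignments, table_counts

if __name__ == "__main__":
    assignments, counts = sample_crp(n_customers=100, alpha=2.0, rng=0)
    print("clusters used:", len(counts))
```

Running the sketch with larger `alpha` tends to produce more clusters, illustrating how the prior lets model complexity adapt to the data rather than being fixed beforehand.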
