Variational methods provide a computationally scalable alternative to Monte Carlo methods for large-scale, Bayesian nonparametric learning. In practice, however, conventional batch and online variational methods quickly become trapped in local optima. In this paper, we consider a nonparametric topic model based on the hierarchical Dirichlet process (HDP), and develop a novel online variational inference algorithm based on split-merge topic updates. We derive a simpler and faster variational approximation of the HDP, and show that by intelligently splitting and merging components of the variational posterior, we can achieve substantially better predictions of test data than conventional online and batch variational algorithms. For streaming analysis of large datasets where batch analysis is infeasible, we show that our split-merge updates better capture the nonparametric properties of the underlying model, allowing continual learning of new topics.
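To make the split-merge idea in the abstract concrete, the sketch below is a minimal, hypothetical illustration and is not the paper's variational algorithm: it merges near-duplicate topics and splits a dominant one between online updates, operating on a simplified matrix of expected topic-word counts. All function and parameter names (merge_redundant_topics, split_heavy_topic, threshold, min_share) are assumptions introduced for illustration.

```python
# Minimal, hypothetical sketch of split-merge moves on a set of topics.
# NOT the paper's algorithm; it only illustrates merging near-duplicate
# topics and splitting a dominant one, using expected topic-word counts.

import numpy as np


def merge_redundant_topics(counts, threshold=0.95):
    """Greedily merge topic pairs whose word distributions are very similar.

    counts: (K, V) array of expected word counts per topic.
    Returns an array with K' <= K rows.
    """
    counts = counts.copy()
    merged = True
    while merged and counts.shape[0] > 1:
        merged = False
        probs = counts / counts.sum(axis=1, keepdims=True)
        K = probs.shape[0]
        for i in range(K):
            for j in range(i + 1, K):
                # Cosine similarity as a cheap proxy for topic overlap.
                sim = probs[i] @ probs[j] / (
                    np.linalg.norm(probs[i]) * np.linalg.norm(probs[j]))
                if sim > threshold:
                    counts[i] += counts[j]                # pool the evidence
                    counts = np.delete(counts, j, axis=0)
                    merged = True
                    break
            if merged:
                break
    return counts


def split_heavy_topic(counts, rng, min_share=0.4):
    """Split the most-used topic into two perturbed children if it absorbs
    more than `min_share` of the total counts; later updates can keep or
    discard the extra topic."""
    usage = counts.sum(axis=1)
    k = int(np.argmax(usage))
    if usage[k] < min_share * usage.sum():
        return counts
    noise = rng.dirichlet(np.ones(counts.shape[1]))
    child_a = 0.5 * counts[k] + 0.5 * usage[k] * noise
    child_b = np.clip(counts[k] - child_a, 1e-8, None)
    rest = np.delete(counts, k, axis=0)
    return np.vstack([rest, child_a, child_b])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "posterior": 5 topics over a 20-word vocabulary, with one topic
    # made a near-copy of another and one made deliberately heavy.
    counts = rng.gamma(2.0, 1.0, size=(5, 20))
    counts[1] = counts[0] * 1.01          # near-duplicate -> merge candidate
    counts[4] *= 10.0                     # dominant topic -> split candidate
    counts = merge_redundant_topics(counts)
    counts = split_heavy_topic(counts, rng)
    print("Number of topics after split-merge moves:", counts.shape[0])
```

In the paper itself, proposed splits and merges are accepted or rejected based on the variational objective rather than the ad hoc similarity and usage heuristics used in this toy sketch.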
Author Information
Michael Bryant (Brown University)
Erik Sudderth (University of California, Irvine)
More from the Same Authors
- 2021 Poster: Scalable and Stable Surrogates for Flexible Classifiers with Fairness Constraints
  Henry C Bendekgey · Erik Sudderth
- 2015 Poster: Scalable Adaptation of State Complexity for Nonparametric Hidden Markov Models
  Michael Hughes · William Stephenson · Erik Sudderth
- 2013 Poster: Efficient Online Inference for Bayesian Nonparametric Relational Models
  Dae Il Kim · Prem Gopalan · David Blei · Erik Sudderth
- 2013 Poster: Memoized Online Variational Inference for Dirichlet Process Mixture Models
  Michael Hughes · Erik Sudderth
- 2012 Poster: Effective Split-Merge Monte Carlo Methods for Nonparametric Models of Sequential Data
  Michael Hughes · Emily Fox · Erik Sudderth
- 2012 Poster: Minimization of Continuous Bethe Approximations: A Positive Variation
  Jason Pacheco · Erik Sudderth
- 2012 Poster: From Deformations to Parts: Motion-based Segmentation of 3D Objects
  Soumya Ghosh · Erik Sudderth · Matthew Loper · Michael J Black
- 2011 Poster: The Doubly Correlated Nonparametric Topic Model
  Dae Il Kim · Erik Sudderth
- 2011 Poster: Spatial distance dependent Chinese Restaurant Process for image segmentation
  Soumya Ghosh · Andrei B Ungureanu · Erik Sudderth · David Blei
- 2010 Poster: Global seismic monitoring as probabilistic inference
  Nimar Arora · Stuart J Russell · Paul Kidwell · Erik Sudderth
- 2010 Spotlight: Layered image motion with explicit occlusions, temporal consistency, and depth ordering
  Deqing Sun · Erik Sudderth · Michael J Black
- 2010 Poster: Layered image motion with explicit occlusions, temporal consistency, and depth ordering
  Deqing Sun · Erik Sudderth · Michael J Black
- 2009 Session: Oral session 9: Bayesian Analysis
  Erik Sudderth
- 2009 Poster: Sharing Features among Dynamical Systems with Beta Processes
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2009 Oral: Sharing Features among Dynamical Systems with Beta Processes
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2008 Oral: Shared Segmentation of Natural Scenes Using Dependent Pitman-Yor Processes
  Erik Sudderth · Michael Jordan
- 2008 Poster: Nonparametric Bayesian Learning of Switching Linear Dynamical Systems
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2008 Poster: Shared Segmentation of Natural Scenes Using Dependent Pitman-Yor Processes
  Erik Sudderth · Michael Jordan
- 2008 Spotlight: Nonparametric Bayesian Learning of Switching Linear Dynamical Systems
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2008 Session: Oral session 4: Combinatorial Approximation
  Erik Sudderth
- 2007 Poster: Loop Series and Bethe Variational Bounds in Attractive Graphical Models
  Erik Sudderth · Martin J Wainwright · Alan S Willsky