Sampling inference methods are computationally difficult to scale for many models in part because global dependencies can reduce opportunities for parallel computation. Without strict conditional independence structure among variables, standard Gibbs sampling theory requires sample updates to be performed sequentially, even if dependence between most variables is not strong. Empirical work has shown that some models can be sampled effectively by going "Hogwild" and simply running Gibbs updates in parallel with only periodic global communication, but the successes and limitations of such a strategy are not well understood. As a step towards such an understanding, we study the Hogwild Gibbs sampling strategy in the context of Gaussian distributions. We develop a framework which provides convergence conditions and error bounds along with simple proofs and connections to methods in numerical linear algebra. In particular, we show that if the Gaussian precision matrix is generalized diagonally dominant, then any Hogwild Gibbs sampler, with any update schedule or allocation of variables to processors, yields a stable sampling process with the correct sample mean.
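To make the setting concrete, the following is a minimal sketch of Hogwild Gibbs on a Gaussian, not the paper's implementation: variables are partitioned across two simulated "processors," each processor runs a few standard Gibbs sweeps on its own block while reading stale values for the other block, and the blocks synchronize once per outer iteration. The precision matrix, block assignment, and iteration counts below are illustrative assumptions chosen so that the matrix is diagonally dominant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical diagonally dominant precision matrix J and potential h,
# so the target Gaussian has mean J^{-1} h.
J = np.array([[4.0, 1.0, 0.5, 0.0],
              [1.0, 4.0, 1.0, 0.5],
              [0.5, 1.0, 4.0, 1.0],
              [0.0, 0.5, 1.0, 4.0]])
h = np.array([1.0, 2.0, 3.0, 4.0])
true_mean = np.linalg.solve(J, h)

# Illustrative allocation of variables to two "processors".
blocks = [[0, 1], [2, 3]]

def hogwild_gibbs(J, h, blocks, n_outer=5000, n_inner=2, burn_in=500):
    n = len(h)
    x = np.zeros(n)
    samples = []
    for t in range(n_outer):
        x_stale = x.copy()       # global state at the last synchronization
        x_new = x.copy()
        for block in blocks:     # simulated parallelism: each block only
            local = x_stale.copy()   # sees stale values of the other blocks
            for _ in range(n_inner):
                for i in block:
                    # Standard Gibbs update: x_i | x_{-i} is Gaussian with
                    # mean (h_i - sum_{j != i} J_ij x_j)/J_ii, variance 1/J_ii.
                    cond_mean = (h[i] - J[i] @ local + J[i, i] * local[i]) / J[i, i]
                    local[i] = cond_mean + rng.standard_normal() / np.sqrt(J[i, i])
            x_new[block] = local[block]
        x = x_new                # periodic global communication
        if t >= burn_in:
            samples.append(x.copy())
    return np.array(samples)

samples = hogwild_gibbs(J, h, blocks)
print("true mean:   ", true_mean)
print("sample mean: ", samples.mean(axis=0))
```

Consistent with the result stated above, for this diagonally dominant precision matrix the process stays stable and the empirical mean tracks the exact mean, though the Hogwild process is not guaranteed to reproduce the target covariance.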
Author Information
Matthew Johnson (MIT)
James Saunderson (Massachusetts Institute of Technology)
Alan S Willsky (Massachusetts Institute of Technology)
More from the Same Authors
- 2016: Matthew Johnson – Autodiff writes your exponential family inference code
  Matthew Johnson
- 2016 Poster: Composing graphical models with neural networks for structured representations and fast inference
  Matthew Johnson · David Duvenaud · Alex Wiltschko · Ryan Adams · Sandeep R Datta
- 2015 Poster: Dependent Multinomial Models Made Easy: Stick-Breaking with the Polya-gamma Augmentation
  Scott Linderman · Matthew Johnson · Ryan Adams
- 2013 Poster: Learning Gaussian Graphical Models with Observed or Latent FVSs
  Ying Liu · Alan S Willsky
- 2011 Poster: High-Dimensional Graphical Model Selection: Tractable Graph Families and Necessary Conditions
  Animashree Anandkumar · Vincent Tan · Alan S Willsky
- 2011 Oral: High-Dimensional Graphical Model Selection: Tractable Graph Families and Necessary Conditions
  Animashree Anandkumar · Vincent Tan · Alan S Willsky
- 2009 Poster: Sharing Features among Dynamical Systems with Beta Processes
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2009 Oral: Sharing Features among Dynamical Systems with Beta Processes
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2008 Poster: Nonparametric Bayesian Learning of Switching Linear Dynamical Systems
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2008 Spotlight: Nonparametric Bayesian Learning of Switching Linear Dynamical Systems
  Emily Fox · Erik Sudderth · Michael Jordan · Alan S Willsky
- 2007 Spotlight: Message Passing for Max-weight Independent Set
  Sujay Sanghavi · Devavrat Shah · Alan S Willsky
- 2007 Poster: Message Passing for Max-weight Independent Set
  Sujay Sanghavi · Devavrat Shah · Alan S Willsky
- 2007 Poster: Adaptive Embedded Subgraph Algorithms using Walk-Sum Analysis
  Venkat Chandrasekaran · Jason K Johnson · Alan S Willsky
- 2007 Poster: Linear programming analysis of loopy belief propagation for weighted matching
  Sujay Sanghavi · Dmitry Malioutov · Alan S Willsky
- 2007 Poster: Loop Series and Bethe Variational Bounds in Attractive Graphical Models
  Erik Sudderth · Martin J Wainwright · Alan S Willsky