In this paper we make several contributions towards accelerating approximate Bayesian structural inference for non-decomposable Gaussian graphical models (GGMs). Our first contribution is to show how to efficiently compute a BIC or Laplace approximation to the marginal likelihood of non-decomposable graphs using convex methods for precision matrix estimation. This optimization technique can be used as a fast scoring function inside standard Stochastic Local Search (SLS) for generating posterior samples. Our second contribution is a novel framework for efficiently generating large sets of high-quality graph topologies without performing local search. This graph proposal method, which we call "Neighborhood Fusion" (NF), samples candidate Markov blankets at each node using sparse regression techniques. Our final contribution is a hybrid method combining the complementary strengths of NF and SLS. Experimental results on structural recovery and prediction tasks demonstrate that NF and the hybrid NF/SLS method outperform state-of-the-art local search methods on both synthetic and real-world datasets when realistic computational limits are imposed.
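The abstract mentions two computational ingredients: a convex precision-matrix fit used as a fast BIC-style graph score, and node-wise sparse regressions that propose candidate Markov blankets. The sketch below is not the authors' implementation; it is a rough illustration under stated assumptions, using scikit-learn's graphical lasso as a stand-in for the paper's convex estimation step and Meinshausen–Bühlmann-style lasso neighborhood selection for the blanket proposals. The function names, regularization choices, and toy data are all illustrative.

# Illustrative sketch only (not the authors' code); see the caveats above.
import numpy as np
from sklearn.covariance import graphical_lasso
from sklearn.linear_model import LassoCV


def bic_score(X, alpha=0.1):
    """BIC-style score of a sparse precision-matrix fit (up to an additive constant).

    Hypothetical stand-in for the paper's convex scoring step: the graphical
    lasso supplies a sparse precision matrix, and BIC penalizes its nonzero
    off-diagonal entries.
    """
    n, d = X.shape
    S = np.cov(X, rowvar=False)
    _, prec = graphical_lasso(S, alpha=alpha)
    _, logdet = np.linalg.slogdet(prec)
    loglik = 0.5 * n * (logdet - np.trace(S @ prec))   # Gaussian log-likelihood, constants dropped
    k = d + np.count_nonzero(np.triu(prec, k=1))       # free parameters: diagonal + upper-triangle edges
    return loglik - 0.5 * k * np.log(n)


def propose_markov_blankets(X, tol=1e-8):
    """Propose a candidate neighborhood (Markov blanket) for every node.

    Each node is lasso-regressed on all the others; the nonzero coefficients
    define its proposed neighbors (Meinshausen-Buhlmann-style selection).
    """
    n, d = X.shape
    blankets = []
    for j in range(d):
        others = np.delete(np.arange(d), j)
        fit = LassoCV(cv=5).fit(X[:, others], X[:, j])
        blankets.append(others[np.abs(fit.coef_) > tol])
    return blankets


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))   # toy data; replace with real observations
    print("BIC-style score:", bic_score(X))
    print("proposed blankets:", propose_markov_blankets(X))

In a pipeline of the kind the abstract describes, a score of this flavor would be evaluated on each candidate graph visited by SLS, while the node-wise neighborhoods would be sampled and fused into full graph proposals; the exact scoring and fusion procedures are those of the paper, not this sketch.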
Author Information
Baback Moghaddam (Caltech)
Benjamin Marlin (University of Massachusetts Amherst)
Mohammad Emtiyaz Khan (RIKEN)
Emtiyaz Khan (also known as Emti) is a team leader at the RIKEN Center for Advanced Intelligence Project (AIP) in Tokyo, where he leads the Approximate Bayesian Inference Team. He is also a visiting professor at the Tokyo University of Agriculture and Technology (TUAT). Previously, he was a postdoc and then a scientist at Ecole Polytechnique Fédérale de Lausanne (EPFL), where he also taught two large machine learning courses and received a teaching award. He completed his PhD in machine learning at the University of British Columbia in 2012. The main goal of Emti's research is to understand the principles of learning from data and use them to develop algorithms that can learn like living beings. For the past 10 years, his work has focused on developing Bayesian methods that could lead to such fundamental principles. The Approximate Bayesian Inference Team now continues to use these principles, as well as derive new ones, to solve real-world problems.
Kevin Murphy (Google)
Related Events (a corresponding poster, oral, or spotlight)
-
2009 Poster: Accelerating Bayesian Structural Inference for Non-Decomposable Gaussian Graphical Models »
Thu. Dec 10th 03:00 -- 07:59 AM Room
More from the Same Authors
-
2021 : Beyond Target Networks: Improving Deep $Q$-learning with Functional Regularization »
Alexandre Piche · Joseph Marino · Gian Maria Marconi · Valentin Thomas · Chris Pal · Mohammad Emtiyaz Khan
-
2022 : Can Calibration Improve Sample Prioritization? »
Ganesh Tata · Gautham Krishna Gudur · Gopinath Chennupati · Mohammad Emtiyaz Khan
-
2022 : Reliability benchmarks for image segmentation »
Estefany Kelly Buchanan · Michael Dusenberry · Jie Ren · Kevin Murphy · Balaji Lakshminarayanan · Dustin Tran
-
2022 : Practical Structured Riemannian Optimization with Momentum by using Generalized Normal Coordinates »
Wu Lin · Valentin Duruisseaux · Melvin Leok · Frank Nielsen · Mohammad Emtiyaz Khan · Mark Schmidt
-
2023 Poster: The Memory-Perturbation Equation: Understanding Model's Sensitivity to Data »
Peter Nickl · Lu Xu · Dharmesh Tailor · Thomas Möllenhoff · Mohammad Emtiyaz Khan
-
2023 Poster: SPAE: Semantic Pyramid AutoEncoder for Multimodal Generation with Frozen LLMs »
Lijun Yu · Yong Cheng · Zhiruo Wang · Vivek Kumar · Wolfgang Macherey · Yanping Huang · David Ross · Irfan Essa · Yonatan Bisk · Ming-Hsuan Yang · Kevin Murphy · Alexander Hauptmann · Lu Jiang
-
2023 Poster: Beyond Invariance: Test-Time Label-Shift Adaptation for Addressing ``Spurious'' Correlations »
Qingyao Sun · Kevin Murphy · Sayna Ebrahimi · Alexander D'Amour
-
2022 : Invited Keynote 2 »
Mohammad Emtiyaz Khan
-
2021 Poster: Dual Parameterization of Sparse Variational Gaussian Processes »
Vincent ADAM · Paul Chang · Mohammad Emtiyaz Khan · Arno Solin
-
2021 Poster: Knowledge-Adaptation Priors »
Mohammad Emtiyaz Khan · Siddharth Swaroop
-
2019 Poster: Approximate Inference Turns Deep Networks into Gaussian Processes »
Mohammad Emtiyaz Khan · Alexander Immer · Ehsan Abedi · Maciej Korzepa
-
2019 Poster: Practical Deep Learning with Bayesian Principles »
Kazuki Osawa · Siddharth Swaroop · Mohammad Emtiyaz Khan · Anirudh Jain · Runa Eschenhagen · Richard Turner · Rio Yokota
-
2019 Tutorial: Deep Learning with Bayesian Principles »
Mohammad Emtiyaz Khan
-
2016 Poster: A scalable end-to-end Gaussian process adapter for irregularly sampled time series classification »
Steven Cheng-Xian Li · Benjamin Marlin
-
2015 Poster: Kullback-Leibler Proximal Variational Inference »
Mohammad Emtiyaz Khan · Pierre Baque · François Fleuret · Pascal Fua
-
2014 Workshop: 4th Workshop on Automated Knowledge Base Construction (AKBC) »
Sameer Singh · Fabian M Suchanek · Sebastian Riedel · Partha Pratim Talukdar · Kevin Murphy · Christopher Ré · William Cohen · Tom Mitchell · Andrew McCallum · Jason E Weston · Ramanathan Guha · Boyan Onyshkevych · Hoifung Poon · Oren Etzioni · Ari Kobren · Arvind Neelakantan · Peter Clark
-
2014 Poster: Decoupled Variational Gaussian Inference »
Mohammad Emtiyaz Khan
-
2014 Session: Tutorial Session A »
Kevin Murphy
-
2012 Poster: Fast Bayesian Inference for Non-Conjugate Gaussian Process Regression »
Mohammad Emtiyaz Khan · Shakir Mohamed · Kevin Murphy
-
2010 Poster: Variational bounds for mixed-data factor analysis »
Mohammad Emtiyaz Khan · Benjamin Marlin · Guillaume Bouchard · Kevin Murphy
-
2007 Workshop: Statistical Network Models »
Kevin Murphy · Lise Getoor · Eric Xing · Raphael Gottardo