Poster
Conditional Matrix Flows for Gaussian Graphical Models
Marcello Massimo Negri · Fabricio Arend Torres · Volker Roth
Great Hall & Hall B1+B2 (level 1) #1219
Abstract:
Studying conditional independence among many variables with few observations is a challenging task. Gaussian Graphical Models (GGMs) tackle this problem by encouraging sparsity in the precision matrix through $\ell_q$ regularization with $q \leq 1$. However, most GGMs rely on the $\ell_1$ norm because the objective is highly non-convex for sub-$\ell_1$ pseudo-norms. In the frequentist formulation, the $\ell_1$ norm relaxation provides the solution path as a function of the shrinkage parameter $\lambda$. In the Bayesian formulation, sparsity is instead encouraged through a Laplace prior, but posterior inference for different $\lambda$ requires repeated runs of expensive Gibbs samplers. Here we propose a general framework for variational inference with a matrix-variate Normalizing Flow in GGMs, which unifies the benefits of the frequentist and Bayesian frameworks. As a key improvement on previous work, we train with one flow a continuum of sparse regression models jointly for all regularization parameters $\lambda$ and all $\ell_q$ norms, including non-convex sub-$\ell_1$ pseudo-norms. Within one model we thus have access to (i) the evolution of the posterior for any $\lambda$ and any $\ell_q$ (pseudo-) norm, (ii) the marginal log-likelihood for model selection, and (iii) the frequentist solution paths through simulated annealing in the MAP limit.
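For reference, a minimal sketch of the $\ell_q$-penalized GGM objective the abstract alludes to; the notation ($\Theta$ for the precision matrix, $S$ for the empirical covariance) is an assumption on our part and is not taken from the poster:

$$
\hat{\Theta}_{\lambda, q} \;=\; \arg\min_{\Theta \succ 0} \; -\log\det\Theta \;+\; \operatorname{tr}(S\Theta) \;+\; \lambda \,\lVert \Theta \rVert_q^q, \qquad 0 < q \leq 1.
$$

Its Bayesian counterpart places a prior $p(\Theta \mid \lambda, q) \propto \exp\!\bigl(-\lambda \lVert \Theta \rVert_q^q\bigr)$ on the precision matrix, which reduces to the Laplace prior at $q = 1$; training the flow conditionally on $(\lambda, q)$ is what lets one model cover the whole family of posteriors and, in the MAP limit, the frequentist solution paths.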