Combining predictive distributions is a central problem in Bayesian inference and machine learning. Currently, predictives are almost exclusively combined using linear density mixtures such as Bayesian model averaging, Bayesian stacking, and mixtures of experts. Nonetheless, linear mixtures impose traits that might be undesirable for some applications, such as multi-modality. While there are alternative strategies (e.g., geometric bridges or superpositions), optimizing their parameters usually requires repeatedly computing intractable normalizing constants. In this extended abstract, we present two novel Bayesian model combination tools. Both generalize \emph{stacking}, but combine posterior densities by log-linear pooling (\emph{locking}) and quantum superposition (\emph{quacking}), respectively. To optimize model weights while avoiding the burden of normalizing constants, we maximize the Hyv\"arinen score of the combined posterior predictions. We demonstrate locking and quacking with an illustrative example.
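To make the combination rules concrete, here is a minimal sketch in our own notation; the exact parameterization (e.g., simplex constraints on the weights, or complex phases in the superposition) is an assumption and may differ from the paper's. Given component predictive densities $p_1, \dots, p_K$ and weights $w_1, \dots, w_K$, locking pools log-densities,
\[
p_{\mathrm{lock}}(y \mid w) \;\propto\; \prod_{k=1}^{K} p_k(y)^{w_k},
\]
while quacking superposes density amplitudes,
\[
p_{\mathrm{quack}}(y \mid w) \;\propto\; \Big( \sum_{k=1}^{K} w_k \sqrt{p_k(y)} \Big)^{2}.
\]
Both pools carry a weight-dependent normalizing constant $Z(w)$ that is generally intractable. The Hyv\"arinen score,
\[
\mathcal{H}(y, p) \;=\; 2\,\Delta_y \log p(y) \;+\; \big\lVert \nabla_y \log p(y) \big\rVert_2^2,
\]
depends on $p$ only through derivatives of $\log p(y)$ with respect to $y$, so the additive $\log Z(w)$ term vanishes and the weights can be optimized without ever evaluating $Z(w)$.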
Author Information
Yuling Yao (Flatiron Institute)
Luiz Carvalho (Fundação Getulio Vargas)
Biologist by training, statistician by trade. Lecturer at the School of Applied Mathematics, Getulio Vargas Foundation, Brazil.
Diego Mesquita (Getulio Vargas Foundation)
More from the Same Authors
- 2021 : Make cross-validation Bayes again »
  Yuling Yao · Aki Vehtari
- 2022 : Provably expressive temporal graph networks »
  Amauri Souza · Diego Mesquita · Samuel Kaski · Vikas Garg
- 2022 Poster: Provably expressive temporal graph networks »
  Amauri Souza · Diego Mesquita · Samuel Kaski · Vikas Garg