We develop a framework for generalized variational inference in infinite-dimensional function spaces and use it to construct a method termed Gaussian Wasserstein inference (GWI). GWI leverages the Wasserstein distance between Gaussian measures on the Hilbert space of square-integrable functions to determine a variational posterior via a tractable optimization criterion, and it avoids the pathologies that arise in standard variational function-space inference. An exciting application of GWI is the use of deep neural networks in its variational parametrization, combining their superior predictive performance with principled uncertainty quantification analogous to that of Gaussian processes. The proposed method obtains state-of-the-art performance on several benchmark datasets.
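To make the central quantity concrete: for two Gaussian measures with means m₁, m₂ and covariance operators C₁, C₂, the squared 2-Wasserstein distance has the closed form ||m₁ − m₂||² + tr(C₁ + C₂ − 2(C₁^{1/2} C₂ C₁^{1/2})^{1/2}). The sketch below is not the authors' implementation; it is a minimal finite-dimensional illustration, assuming the means and covariance operators have been discretized to vectors and positive semi-definite matrices:

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, C1, m2, C2):
    """Squared 2-Wasserstein distance between the Gaussians N(m1, C1) and N(m2, C2).

    Finite-dimensional analogue of the distance between Gaussian measures
    on a function space, after discretizing means and covariance operators.
    """
    sqrt_C1 = sqrtm(C1)
    # Cross term (C1^{1/2} C2 C1^{1/2})^{1/2}; np.real drops tiny
    # imaginary round-off from sqrtm.
    cross = np.real(sqrtm(sqrt_C1 @ C2 @ sqrt_C1))
    mean_term = np.sum((np.asarray(m1) - np.asarray(m2)) ** 2)
    cov_term = np.trace(C1 + C2 - 2.0 * cross)
    return mean_term + cov_term
```

Unlike the KL divergence, this distance remains well defined even when the two Gaussian measures are mutually singular, which is one reason it yields a tractable criterion in function space.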
Author Information
Veit David Wild (University of Oxford)
Robert Hu (Amazon)
Dino Sejdinovic (University of Adelaide)
More from the Same Authors
- 2022: Bayesian inference for aerosol vertical profiles
  Shahine Bouabid · Duncan Watson-Parris · Dino Sejdinovic
- 2022 Poster: Giga-scale Kernel Matrix-Vector Multiplication on GPU
  Robert Hu · Siu Lun Chau · Dino Sejdinovic · Joan Glaunès
- 2022 Poster: Explaining Preferences with Shapley Values
  Robert Hu · Siu Lun Chau · Jaime Ferrando Huertas · Dino Sejdinovic
- 2022 Poster: RKHS-SHAP: Shapley Values for Kernel Methods
  Siu Lun Chau · Robert Hu · Javier González · Dino Sejdinovic