

Poster

Kernel Bayesian Inference with Posterior Regularization

Yang Song · Jun Zhu · Yong Ren

Area 5+6+7+8 #132

Keywords: [ (Other) Bayesian Inference ] [ (Other) Probabilistic Models and Methods ] [ Kernel Methods ]


Abstract:

We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator that is faster to compute than, and performs comparably to, the squared regularization in kernel Bayes' rule. This regularization coincides with an earlier thresholding approach used in kernel POMDPs, whose consistency had remained an open problem; our theoretical work settles this question and provides a consistency analysis in the regression setting. Building on our optimization formulation, we further propose a flexible Bayesian posterior regularization framework that, for the first time, allows regularization to be imposed directly at the distribution level. We apply this method to nonparametric state-space filtering tasks with highly nonlinear dynamics and show performance gains over all baselines.
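To make the two regularizers concrete, the following is a minimal NumPy sketch (not the authors' code) contrasting the squared regularization of kernel Bayes' rule with a thresholding variant of the kind used in kernel POMDPs. The Gaussian kernel, the bandwidth and regularization constants, and the simplified weight formulas (which assume the prior embedding is already given as weights mu on the joint sample) are all illustrative assumptions.

import numpy as np

def gram(X, Z, bandwidth=1.0):
    # Gaussian (RBF) Gram matrix between sample sets X (n x d) and Z (m x d).
    d2 = np.sum((X[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def kbr_weights_squared(G_y, mu, k_y, lam=1e-3):
    # Squared regularization of kernel Bayes' rule (Fukumizu et al. style):
    # w = Lambda G_y ((Lambda G_y)^2 + lam * n * I)^{-1} Lambda k_y,
    # where Lambda = diag(mu) holds the (possibly signed) prior weights.
    n = len(mu)
    L = np.diag(mu) @ G_y
    return L @ np.linalg.solve(L @ L + lam * n * np.eye(n), mu * k_y)

def kbr_weights_thresholded(G_y, mu, k_y, lam=1e-3):
    # Thresholding regularization: clip negative prior weights to zero,
    # renormalize, and solve a single linear system instead of a squared one.
    mu_plus = np.maximum(mu, 0.0)
    mu_plus = mu_plus / mu_plus.sum()
    n = len(mu_plus)
    L = np.diag(mu_plus) @ G_y
    return np.linalg.solve(L + lam * n * np.eye(n), mu_plus * k_y)

# Toy usage: embed the posterior of X given a noisy observation y = 0.3.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))                # latent samples
Y = X + 0.1 * rng.normal(size=(50, 1))      # noisy observations
G_y = gram(Y, Y)
mu = np.full(50, 1.0 / 50)                  # uniform prior weights on the sample
k_y = gram(Y, np.array([[0.3]]))[:, 0]      # kernel evaluations at the query y
w_sq = kbr_weights_squared(G_y, mu, k_y)
w_th = kbr_weights_thresholded(G_y, mu, k_y)
post_mean = w_th @ X[:, 0]                  # posterior mean read off the embedding

The thresholded estimator needs only one linear solve per query rather than forming and solving a squared system, which is consistent with the speed advantage noted in the abstract; with a uniform prior the two estimators give similar posterior means on this toy example.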
