Poster
Implicit Posterior Variational Inference for Deep Gaussian Processes
Haibin YU · Yizhou Chen · Bryan Kian Hsiang Low · Patrick Jaillet · Zhongxiang Dai

Tue Dec 10 05:30 PM -- 07:30 PM (PST) @ East Exhibition Hall B + C #146

A multi-layer deep Gaussian process (DGP) model is a hierarchical composition of GP models with greater expressive power. Exact DGP inference is intractable, which has motivated the recent development of deterministic and stochastic approximation methods. Unfortunately, the deterministic approximation methods yield a biased posterior belief while the stochastic ones are computationally costly. This paper presents an implicit posterior variational inference (IPVI) framework for DGPs that can ideally recover an unbiased posterior belief and still preserve time efficiency. Inspired by generative adversarial networks, our IPVI framework achieves this by casting the DGP inference problem as a two-player game in which a Nash equilibrium, interestingly, coincides with an unbiased posterior belief. This consequently inspires us to devise a best-response dynamics algorithm to search for a Nash equilibrium (i.e., an unbiased posterior belief). Empirical evaluation shows that IPVI outperforms the state-of-the-art approximation methods for DGPs.
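To make the two-player game concrete, the sketch below is a minimal, hypothetical illustration of best-response dynamics for an implicit variational posterior, written in PyTorch. It is not the authors' implementation: the single-layer setup, the network sizes, the standard-normal prior over inducing outputs, and the placeholder expected_log_likelihood are all assumptions made purely for illustration. A generator plays the role of the implicit posterior sampler, a discriminator estimates the intractable log-density ratio needed in the ELBO, and the two are updated in alternation.

import torch
import torch.nn as nn

M, NOISE_DIM = 16, 8          # number of inducing points, generator noise size (assumed)

generator = nn.Sequential(     # implicit posterior sampler: noise eps -> sample of inducing outputs U
    nn.Linear(NOISE_DIM, 64), nn.ReLU(), nn.Linear(64, M))
discriminator = nn.Sequential( # density-ratio estimator T(U), trained toward log q(U) - log p(U)
    nn.Linear(M, 64), nn.ReLU(), nn.Linear(64, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

def expected_log_likelihood(u_samples):
    # Placeholder for the data-fit term E_q[log p(y | U)]; a real DGP would
    # propagate samples through the GP layers here (detail elided in this sketch).
    return -0.5 * (u_samples ** 2).mean()

for step in range(1000):
    # Discriminator best response: logistic classification of posterior samples
    # against prior samples, whose optimum is T*(U) = log q(U) - log p(U).
    u_q = generator(torch.randn(128, NOISE_DIM)).detach()
    u_p = torch.randn(128, M)  # samples from an assumed prior p(U) = N(0, I)
    d_loss = (nn.functional.softplus(-discriminator(u_q)).mean()
              + nn.functional.softplus(discriminator(u_p)).mean())
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator best response: maximise ELBO = E_q[log p(y | U)] - E_q[T(U)],
    # with T(U) standing in for the intractable KL term of the implicit q.
    u_q = generator(torch.randn(128, NOISE_DIM))
    elbo = expected_log_likelihood(u_q) - discriminator(u_q).mean()
    g_loss = -elbo
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

Under these assumptions, alternating the two updates is one simple way to search for the Nash equilibrium that the paper identifies with an unbiased posterior belief.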

Author Information

Haibin YU (National University of Singapore)
Yizhou Chen (National University of Singapore)
Bryan Kian Hsiang Low (National University of Singapore)
Patrick Jaillet (MIT)
Zhongxiang Dai (National University of Singapore)
