
Particle-based Variational Inference with Preconditioned Functional Gradient Flow
Hanze Dong · Xi Wang · Yong Lin · Tong Zhang
Event URL: https://openreview.net/forum?id=it6CsoPZfk

Particle-based variational inference (VI) minimizes the KL divergence between model samples and the target posterior using gradient flow estimates. With the popularity of Stein variational gradient descent (SVGD), particle-based VI algorithms have focused on the properties of functions in a Reproducing Kernel Hilbert Space (RKHS) to approximate the gradient flow. However, the RKHS requirement restricts the function class and algorithmic flexibility. This paper remedies the problem by proposing a general framework for obtaining tractable functional gradient flow estimates. The functional gradient flow in our framework can be defined by a general functional regularization term that includes the RKHS norm as a special case. We also use our framework to propose a new particle-based VI algorithm: preconditioned functional gradient flow (PFG). Compared with SVGD, the proposed preconditioned functional gradient method has several advantages: a larger function class, greater scalability with large particle sizes, better adaptation to ill-conditioned target distributions, and provable continuous-time convergence in KL divergence. Both theory and experiments demonstrate the effectiveness of our framework.
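For context on the particle-update template that the abstract generalizes, here is a minimal sketch of the RKHS special case (plain SVGD), not the PFG algorithm itself. All names and parameters (`svgd_step`, the RBF bandwidth `h`, the step size `eps`) are illustrative assumptions, not from the paper:

```python
import numpy as np

def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One SVGD update on a particle matrix X of shape (n, d).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j)
                               + grad_{x_j} k(x_j, x_i) ]
    with an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    """
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    d2 = np.sum(diff ** 2, axis=-1)               # pairwise squared distances
    K = np.exp(-d2 / (2 * h ** 2))                # K[j, i] = k(x_j, x_i)
    gradK = -diff * K[..., None] / h ** 2         # grad_{x_j} k(x_j, x_i)
    S = grad_logp(X)                              # score at each particle, (n, d)
    # Driving term (kernel-smoothed score) plus repulsion term, averaged over j.
    phi = (K[..., None] * S[:, None, :] + gradK).mean(axis=0)
    return X + eps * phi

# Toy usage: transport particles toward a standard 1-D Gaussian target.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1)) + 5.0               # start far from the target
for _ in range(500):
    X = svgd_step(X, lambda X: -X)                # score of N(0, 1) is -x
```

The kernel-smoothed score term drives particles toward high-density regions, while the kernel-gradient term keeps them spread out; PFG replaces this RKHS-constrained velocity field with one drawn from a larger, regularized function class.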

Author Information

Hanze Dong (The Hong Kong University of Science and Technology)
Xi Wang
Yong Lin (The Hong Kong University of Science and Technology)

I am a CSE PhD student at HKUST, supervised by Professor Tong Zhang. My research interests include out-of-distribution generalization, robustness of deep learning, and learning theory. In particular, we are currently working on topics related to invariant learning. If you are interested in these fields or in my work, feel free to come have a chat with me.

Tong Zhang (The Hong Kong University of Science and Technology)
