Variational Wasserstein gradient flow
Jiaojiao Fan · Amirhossein Taghvaei · Yongxin Chen
Event URL: https://openreview.net/pdf?id=WZR7ckBkzPY

The gradient flow of a functional over the space of probability densities with respect to the Wasserstein metric often exhibits nice properties and has been utilized in several machine learning applications. The standard approach to computing a Wasserstein gradient flow is the finite-difference method, which discretizes the underlying space over a grid and is therefore not scalable. In this work, we propose a scalable proximal-gradient-type algorithm for Wasserstein gradient flow. The key to our method is a variational formulation of the objective function, which makes it possible to realize the JKO proximal map through primal-dual optimization. This primal-dual problem can be solved efficiently by alternately updating the parameters in the inner and outer loops. Our framework covers all the classical Wasserstein gradient flows, including the heat equation and the porous medium equation. We demonstrate the performance and scalability of our algorithm with several numerical examples.
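To make the JKO proximal map concrete, here is a minimal particle-based sketch of a single JKO step for the simplest case, the potential-energy functional F(ρ) = E_ρ[V]. This is an illustration of the general scheme, not the paper's algorithm: the potential `V`, the step size `tau`, and the inner gradient-descent loop are all illustrative assumptions. Parameterizing the next density as a pushforward y_i = T(x_i) of the current particles reduces the JKO objective to a sum over particles, which can be minimized directly.

```python
import numpy as np

# One JKO step for F(rho) = E_rho[V] on a particle cloud (hypothetical
# illustration). Restricting to transport maps y_i = T(x_i), the step
#   rho_{k+1} = argmin_rho F(rho) + W_2^2(rho, rho_k) / (2 * tau)
# becomes, per particle,
#   min_y  V(y) + |y - x|^2 / (2 * tau),
# which we solve by plain gradient descent.

def grad_V(y):
    # Gradient of the quadratic potential V(y) = |y|^2 / 2.
    return y

def jko_step(x, tau=0.5, lr=0.1, iters=200):
    y = x.copy()
    for _ in range(iters):
        g = grad_V(y) + (y - x) / tau  # gradient of the JKO objective
        y -= lr * g
    return y

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=1.0, size=(500, 2))  # current particles
y = jko_step(x)
# For this quadratic V the minimizer is available in closed form,
# y = x / (1 + tau), which the inner loop converges to.
```

Iterating `jko_step` drives the particles toward the minimizer of V, i.e. it simulates the gradient flow of the potential energy. The paper's contribution replaces this toy inner loop with a variational (primal-dual) formulation so that functionals such as entropy, which have no simple particle form, can also be handled.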

Author Information

Jiaojiao Fan (Georgia Institute of Technology)
Amirhossein Taghvaei (University of Illinois at Urbana-Champaign)

Amirhossein Taghvaei graduated from Sharif University of Technology, Tehran, Iran, in 2013 with two B.S. degrees, in Mechanical Engineering and in Physics. He is currently a PhD student in the Mechanical Science and Engineering department at the University of Illinois at Urbana-Champaign, working under the direction of Prof. Prashant Mehta in the Decision and Control group of the Coordinated Science Laboratory. His research interests lie at the intersection of probability and PDEs. He is currently working on the feedback particle filter algorithm and its application to high-dimensional problems.

Yongxin Chen (Georgia Institute of Technology)
