

Keynote talk in Workshop: Optimal Transport and Machine Learning

Variational inference via Wasserstein gradient flows (Sinho Chewi)


Abstract:

Probabilistic problems that involve the non-smooth entropy functional benefit from the design of proximal algorithms. I will showcase the use of Wasserstein gradient flows as a conceptual framework for developing principled algorithms for variational inference (VI) with accompanying convergence guarantees, particularly for Gaussian VI and mean-field VI. This is joint work with Francis Bach, Krishnakumar Balasubramanian, Silvère Bonnabel, Michael Diao, Yiheng Jiang, Marc Lambert, Aram-Alexandre Pooladian, Philippe Rigollet, and Adil Salim.
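
To make the framework concrete, below is a minimal sketch of Gaussian VI viewed as gradient descent of KL(q || pi) over the space of Gaussians equipped with the Wasserstein (Bures–Wasserstein) geometry, with pi proportional to exp(-V). This is an illustrative toy, not the speaker's exact algorithm: the quadratic target, step size, Monte Carlo sample size, and the symmetrized covariance update are all assumptions chosen to keep the example short and runnable.

```python
# Sketch (not the talk's exact scheme): Gaussian VI as Bures-Wasserstein
# gradient descent on KL(N(m, S) || pi), pi proportional to exp(-V).
# Restricted to Gaussians, the Wasserstein gradient flow of the KL reads
#   dm/dt = -E_q[grad V(X)],
#   dS/dt = 2I - E_q[hess V(X)] S - S E_q[hess V(X)],
# and the expectations are estimated here by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: pi = N(mu_star, Sigma_star), i.e.
# V(x) = 0.5 * (x - mu_star)^T P (x - mu_star) with P the precision matrix.
mu_star = np.array([1.0, -2.0])
Sigma_star = np.array([[2.0, 0.5], [0.5, 1.0]])
P = np.linalg.inv(Sigma_star)

def grad_V(x):      # gradient of the potential at x
    return P @ (x - mu_star)

def hess_V(x):      # Hessian of the potential (constant for a Gaussian target)
    return P

d = 2
m = np.zeros(d)     # variational mean
S = np.eye(d)       # variational covariance
h = 0.1             # step size (illustrative)
n_samples = 256     # Monte Carlo samples per iteration (illustrative)

for _ in range(200):
    xs = rng.multivariate_normal(m, S, size=n_samples)
    g = np.mean([grad_V(x) for x in xs], axis=0)   # estimate of E_q[grad V]
    H = np.mean([hess_V(x) for x in xs], axis=0)   # estimate of E_q[hess V]
    m = m - h * g
    # Symmetrized first-order step, consistent with the flow above and
    # guaranteed to keep S positive definite.
    M = np.linalg.inv(S) - H
    A = np.eye(d) + h * M
    S = A @ S @ A.T

print("mean error:", np.linalg.norm(m - mu_star))
print("cov  error:", np.linalg.norm(S - Sigma_star))
```

The symmetrized covariance update (I + hM) S (I + hM)^T agrees with the flow to first order in h while preserving positive definiteness, which a plain forward Euler step would not guarantee; the talk and the associated papers develop discretizations of this kind with accompanying convergence guarantees.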
