

Poster in Workshop: Optimal Transport and Machine Learning

Gradient flows on graphons: existence, convergence, continuity equations

Sewoong Oh · Soumik Pal · Raghav Somani · Raghav Tripathi


Abstract:

Wasserstein gradient flows on probability measures have found a host of applications in various optimization problems. They typically arise as the continuum limit of exchangeable particle systems evolving by some mean-field interaction involving a gradient-type potential. However, in many problems, such as multi-layer neural networks, the so-called particles are edge weights on large graphs whose nodes are exchangeable. Such large graphs are known to converge to continuum limits called graphons as their size grows to infinity. We show that the Euclidean gradient flow of a suitable function of the edge weights converges to a novel continuum limit given by a curve on the space of graphons that can be appropriately described as a gradient flow or, more technically, a curve of maximal slope. Furthermore, such curves can also be recovered as the limit of a sequence of continuity equations satisfied by the time-marginal laws of stochastic processes of random matrices.
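To make the finite-n setup concrete, the following is a minimal illustrative sketch (not the paper's construction): the edge weights of a symmetric weighted graph are the "particles", the objective is a permutation-invariant function of the matrix (here, the triangle homomorphism density — a hypothetical choice for illustration), and the dynamics are explicit Euler steps of the Euclidean gradient flow on the entries. Because the objective is invariant under relabeling of the exchangeable nodes, its gradient is permutation-equivariant, which is the structure that lets such flows pass to a graphon limit as n grows.

```python
import numpy as np

def triangle_density(W):
    # Homomorphism density of the triangle K3 in the weighted graph W:
    # t(K3, W) = (1/n^3) * sum_{i,j,k} W_ij W_jk W_ki = tr(W^3) / n^3
    n = W.shape[0]
    return np.trace(W @ W @ W) / n**3

def grad_triangle_density(W):
    # Euclidean gradient of t(K3, W) with respect to the entries of a
    # symmetric W: d/dW_ij tr(W^3)/n^3 = 3 (W^2)_ij / n^3
    n = W.shape[0]
    return 3.0 * (W @ W) / n**3

def gradient_flow_step(W, lr=0.1):
    # One explicit Euler step of the (descending) Euclidean gradient flow,
    # projected back onto [0, 1] so the weights stay in the graphon range.
    return np.clip(W - lr * grad_triangle_density(W), 0.0, 1.0)

rng = np.random.default_rng(0)
A = rng.uniform(size=(6, 6))
W = (A + A.T) / 2          # symmetric edge weights; nodes are exchangeable

d0 = triangle_density(W)   # objective before the flow
for _ in range(50):
    W = gradient_flow_step(W)
d1 = triangle_density(W)   # objective after the flow (nonincreasing here)
```

Note the key invariance: relabeling nodes by a permutation matrix P sends W to P W Pᵀ, and the gradient transforms the same way, so the flow itself is well defined on the equivalence class of the graph rather than on a particular labeling.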
