Poster in Workshop: Optimal Transport and Machine Learning
Sinkhorn EM: An Expectation-Maximization algorithm based on entropic optimal transport
Gonzalo Mena · Amin Nejatbakhsh · Erdem Varol · Jonathan Niles-Weed
We study Sinkhorn EM (sEM), a variant of expectation-maximization (EM) based on entropic optimal transport, as an algorithm for mixture model inference when prior information about the mixing weights is known. sEM differs from the classic EM algorithm in the way responsibilities are computed during the expectation step: rather than assigning data points to clusters independently, sEM uses optimal transport to compute responsibilities that respect the known proportions. Like EM, sEM has a natural interpretation as a coordinate ascent procedure, which iteratively constructs and optimizes a lower bound on the log-likelihood. However, when the mixing weights are known, we show theoretically and empirically that sEM has better behavior than EM: it possesses better global convergence guarantees and is less prone to getting stuck in bad local optima. We complement our theoretical findings with experiments on simulated data, and demonstrate an application of sEM to an image segmentation task arising in neuroscience. In this setting, sEM yields segmentations that are significantly better than other approaches.
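The abstract's key idea, an expectation step whose responsibilities are a regularized optimal transport plan between data points and clusters, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the regularization parameter `eps`, and the iteration count are assumptions, and the Sinkhorn iterations are written in the standard log-domain form with row marginals uniform over data points and column marginals equal to the known mixing weights.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_e_step(log_lik, weights, eps=1.0, n_iter=300):
    """Sketch of an OT-based E-step (hypothetical helper, not the paper's code).

    log_lik : (n, k) array of log p(x_i | theta_j) for n points, k clusters.
    weights : (k,) known mixing proportions, summing to 1.
    Returns an (n, k) responsibility matrix whose rows sum to 1 and whose
    column sums (divided by n) approximately match `weights`.
    """
    n, k = log_lik.shape
    # Entropic OT with cost -log_lik, so the Gibbs kernel is exp(log_lik / eps).
    log_K = log_lik / eps
    f = np.zeros(n)                      # dual potentials for data points
    g = np.zeros(k)                      # dual potentials for clusters
    log_r = np.full(n, -np.log(n))       # each data point carries mass 1/n
    log_c = np.log(weights)              # each cluster must receive its known share
    for _ in range(n_iter):
        # Alternately rescale rows and columns to hit the target marginals.
        f = eps * (log_r - logsumexp(log_K + g[None, :] / eps, axis=1))
        g = eps * (log_c - logsumexp(log_K + f[:, None] / eps, axis=0))
    P = np.exp(log_K + f[:, None] / eps + g[None, :] / eps)
    # Responsibilities: normalize each row of the transport plan to sum to 1.
    return P / P.sum(axis=1, keepdims=True)
```

Unlike the classic E-step, which normalizes each row of the likelihood matrix independently, the coupling here is constrained jointly: the column marginals force the aggregate responsibilities to respect the known cluster proportions.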