
Submodular Bregman Divergences with Applications
Rishabh K Iyer · Jeff Bilmes

Mon Dec 03 07:00 PM -- 12:00 AM (PST) @ Harrah’s Special Events Center 2nd Floor

We introduce a class of discrete divergences on sets (equivalently binary vectors) that we call the submodular Bregman divergences. We consider two kinds, defined from either tight modular upper or tight modular lower bounds of a submodular function. We show that the properties of these divergences are analogous to those of the (standard continuous) Bregman divergence. Further, we demonstrate how they generalize many useful divergences, including the weighted Hamming distance, squared weighted Hamming, weighted precision, recall, conditional mutual information, and a generalized KL-divergence on sets. We also show that the lower-bound submodular Bregman is in fact a special case of the generalized Bregman divergence on the Lovász extension of a submodular function, which we call the Lovász Bregman divergence. We then point out a number of applications of the submodular Bregman divergences, and in particular show that a proximal algorithm defined through the submodular Bregman divergences provides a framework for many mirror-descent-style algorithms related to submodular function optimization. We also show that a generalization of the k-means algorithm using the Lovász Bregman divergence is natural in clustering scenarios where ordering is important. A unique property of this algorithm is that computing the mean ordering is extremely efficient, unlike with other order-based distance measures. Finally, we provide a clustering framework for the submodular Bregman divergences, and we derive fast algorithms for clustering sets of binary vectors (equivalently, sets of sets).
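To make the two ingredients of the abstract concrete, here is a minimal sketch (not from the paper; all function names are ours) of the Lovász extension and of a lower-bound submodular Bregman divergence built from a tight modular lower bound, using a toy submodular function f(S) = sqrt(|S|) (a concave function of cardinality is submodular):

```python
import math

def f(S):
    # Toy submodular function: f(S) = sqrt(|S|).
    # Any concave function of |S| is submodular.
    return math.sqrt(len(S))

def lovasz_extension(f, w):
    """Lovász extension: sort the coordinates of w in decreasing order
    and accumulate the marginal gains of f along the induced chain of sets."""
    order = sorted(range(len(w)), key=lambda i: -w[i])
    val, S, prev = 0.0, set(), f(frozenset())
    for i in order:
        S.add(i)
        cur = f(frozenset(S))
        val += w[i] * (cur - prev)  # w_i times the marginal gain of i
        prev = cur
    return val

def modular_lower_bound(f, order):
    """Tight modular lower bound of f along a permutation:
    h[i] is the marginal gain of adding element i in that order."""
    h, S, prev = {}, set(), f(frozenset())
    for i in order:
        S.add(i)
        cur = f(frozenset(S))
        h[i] = cur - prev
        prev = cur
    return h

def lower_submodular_bregman(f, Y, X, n):
    """d_f(Y || X) = f(Y) - h_X(Y), where h_X is a modular lower bound
    of f that is tight at X (elements of X listed first in the permutation),
    so d_f(X || X) = 0 and d_f(Y || X) >= 0."""
    order = sorted(X) + [i for i in range(n) if i not in X]
    h = modular_lower_bound(f, order)
    return f(frozenset(Y)) - sum(h[i] for i in Y)
```

As a sanity check, the divergence vanishes at Y = X and is nonnegative elsewhere, and the Lovász extension evaluated at the indicator vector of a set recovers the set function's value there.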

Author Information

Rishabh K Iyer (University of Washington, Seattle)
Jeff Bilmes (University of Washington, Seattle)
