Poster
Generalised Mutual Information for Discriminative Clustering
Louis Ohl · Pierre-Alexandre Mattei · Charles Bouveyron · Warith Harchaoui · Mickaël Leclercq · Arnaud Droit · Frederic Precioso

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #1035

In the last decade, successes in deep clustering have largely relied on mutual information (MI) as an unsupervised objective for training neural networks, with increasingly heavy regularisation. While the quality of these regularisations has been widely discussed, little attention has been paid to the relevance of MI itself as a clustering objective. In this paper, we first highlight how maximising MI does not necessarily lead to satisfying clusters, and we identify the Kullback-Leibler divergence as the main reason for this behaviour. Hence, we generalise mutual information by changing its core distance, introducing the generalised mutual information (GEMINI): a set of metrics for unsupervised neural network training. Unlike MI, some GEMINIs do not require regularisation during training, and some are geometry-aware thanks to distances or kernels in the data space. Finally, we show that GEMINIs can automatically select a relevant number of clusters, a property that has been little studied in the deep clustering context, where the number of clusters is a priori unknown.
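As a rough sketch of the generalisation the abstract describes (not necessarily the paper's exact formulation), mutual information between the data X and the cluster assignment Y can be written as an expected Kullback-Leibler divergence between the cluster-conditional data distribution and the data marginal; a GEMINI-style objective would then swap that KL term for a generic distance D between distributions:

```latex
% MI as an expected KL divergence over cluster assignments y ~ p(y):
\[
  I(X;Y) = \mathbb{E}_{y \sim p(y)}\!\left[ \mathrm{KL}\big(p(x \mid y) \,\|\, p(x)\big) \right]
\]
% GEMINI-style generalisation (sketch): replace KL with a generic
% distance D between distributions, e.g. an f-divergence, MMD, or a
% Wasserstein distance. Kernel- or metric-based choices of D are what
% make the objective geometry-aware in the data space:
\[
  \mathcal{I}_{D}(X;Y) = \mathbb{E}_{y \sim p(y)}\!\left[ D\big(p(x \mid y),\, p(x)\big) \right]
\]
```

Under this reading, choosing D = KL recovers standard MI, while bounded or kernel-based choices of D change the optimisation landscape, which is consistent with the abstract's claim that some GEMINIs need no regularisation during training.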

Author Information

Louis Ohl (Université Côte d'Azur & Université Laval)
Pierre-Alexandre Mattei (INRIA)
Charles Bouveyron (Université Côte d'Azur)
Warith Harchaoui (Jellysmack)
Mickaël Leclercq
Arnaud Droit
Frederic Precioso (Université Côte d'Azur)

More from the Same Authors

  • 2022 Poster: Concept Embedding Models
    Mateo Espinosa Zarlenga · Pietro Barbiero · Gabriele Ciravegna · Giuseppe Marra · Francesco Giannini · Michelangelo Diligenti · Zohreh Shams · Frederic Precioso · Stefano Melacci · Adrian Weller · Pietro Lió · Mateja Jamnik