

Poster in Workshop: Information-Theoretic Principles in Cognitive Systems

Higher-order mutual information reveals synergistic sub-networks for multi-neuron importance

Kenzo Clauw · Daniele Marinazzo · Sebastiano Stramaglia


Abstract:

Quantifying which neurons are important with respect to the classification decision of a trained neural network is essential for understanding its inner workings. Previous work primarily attributed importance to individual neurons. In this work, we study which groups of neurons contain synergistic or redundant information using a multivariate mutual information measure called the O-information. We observe that the first layer is dominated by redundancy, suggesting general shared features (e.g. edge detection), while the last layer is dominated by synergy, indicating local class-specific features (e.g. concepts). Finally, we show that the O-information can be used for multi-neuron importance: re-training only a synergistic sub-network results in a minimal change in performance. These results suggest our method can be used for pruning and unsupervised representation learning.
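As a rough illustration of the quantity the abstract refers to, the sketch below computes the O-information of a set of neuron activations under a Gaussian approximation of their joint distribution. The O-information is Ω(X₁,…,Xₙ) = (n−2)·H(X₁,…,Xₙ) + Σⱼ [H(Xⱼ) − H(X₋ⱼ)]; positive values indicate redundancy-dominated groups and negative values synergy-dominated groups. The Gaussian entropy estimator and the function names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of a multivariate Gaussian with covariance `cov`."""
    k = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

def o_information(X):
    """O-information of the columns of X (samples x variables), Gaussian estimator.

    Positive: redundancy-dominated group of variables.
    Negative: synergy-dominated group of variables.
    """
    n = X.shape[1]
    cov = np.cov(X, rowvar=False)
    omega = (n - 2) * gaussian_entropy(cov)
    for j in range(n):
        rest = [i for i in range(n) if i != j]
        omega += gaussian_entropy(cov[np.ix_([j], [j])]) - gaussian_entropy(cov[np.ix_(rest, rest)])
    return omega

# Hypothetical usage: activations of a group of neurons recorded over a validation set.
# activations = np.random.randn(10_000, 5)   # 10k samples, 5 neurons
# print(o_information(activations))
```

In practice one would scan groups of neurons (e.g. via a greedy or exhaustive search over small subsets) and rank them by the sign and magnitude of their O-information to identify synergistic or redundant sub-networks.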
