

Poster
in
Workshop: Symmetry and Geometry in Neural Representations (NeurReps)

What shapes the loss landscape of self-supervised learning?

Liu Ziyin · Ekdeep S Lubana · Masahito Ueda · Hidenori Tanaka

Keywords: [ phase transition ] [ symmetry breaking ] [ collapse ] [ self-supervised learning ]


Abstract:

Preventing complete and dimensional collapse of representations has recently become a design principle for self-supervised learning (SSL). However, questions remain in our theoretical understanding: under what precise conditions do these collapses occur? We provide theoretically grounded answers to this question by analyzing the SSL loss landscape of a linear model. We derive an analytically tractable theory of the landscape and show that it accurately captures an array of collapse phenomena and identifies their causes.
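The collapse phenomena the abstract refers to can be diagnosed empirically. A minimal sketch, not the authors' analysis: the entropy-based effective rank of a batch of representations (a common diagnostic in the SSL literature; the helper name `effective_rank` is ours) is close to the embedding dimension for healthy representations, drops sharply under dimensional collapse, and approaches 1 under complete collapse.

```python
import numpy as np

def effective_rank(embeddings, eps=1e-12):
    """Effective rank via the entropy of the normalized singular-value spectrum.

    High values mean the batch spans most embedding dimensions; a value far
    below the embedding dimension indicates dimensional collapse, and a value
    near 1 indicates complete collapse.
    """
    # Center the batch so a constant offset does not inflate the rank.
    centered = embeddings - embeddings.mean(axis=0, keepdims=True)
    s = np.linalg.svd(centered, compute_uv=False)
    p = s / (s.sum() + eps)                    # normalized spectrum
    entropy = -(p * np.log(p + eps)).sum()     # Shannon entropy of spectrum
    return float(np.exp(entropy))

rng = np.random.default_rng(0)
healthy = rng.normal(size=(512, 64))                              # spans all 64 dims
collapsed = rng.normal(size=(512, 4)) @ rng.normal(size=(4, 64))  # rank at most 4

print(effective_rank(healthy))    # high: near the embedding dimension
print(effective_rank(collapsed))  # low: near the latent rank of 4
```

Tracking this quantity on a linear model's outputs over training is one way to observe the collapse transitions that the paper's landscape theory predicts.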
