Poster

Structured and Deep Similarity Matching via Structured and Deep Hebbian Networks

Dina Obeid · Hugo Ramambason · Cengiz Pehlevan

East Exhibition Hall B, C #101

Keywords: [ Biologically Plausible Deep Networks ] [ Deep Learning ] [ Plasticity and Adaptation ] [ Neuroscience and Cognitive Science ]


Abstract:

Synaptic plasticity is widely accepted to be the mechanism behind learning in the brain’s neural networks. A central question is how synapses, with access to only local information about the network, can nonetheless organize collectively and perform circuit-wide learning efficiently. In single-layer, all-to-all connected neural networks, local plasticity has been shown to implement gradient-based learning on a class of cost functions that contain a term aligning the similarity of outputs to the similarity of inputs. Whether such cost functions exist for networks with other architectures is not known. In this paper, we introduce structured and deep similarity matching cost functions, and show how they can be optimized in a gradient-based manner by neural networks with local learning rules. These networks extend Földiák’s Hebbian/Anti-Hebbian network to deep architectures and structured feedforward, lateral, and feedback connections. The credit assignment problem is solved elegantly by a factorization of the dual learning objective into synapse-specific local objectives. Simulations show that our networks learn meaningful features.
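For orientation, the single-layer setting the abstract refers to is classical similarity matching: minimize, over outputs Y, the mismatch ||XᵀX − YᵀY||²_F between input and output similarity matrices, which yields a network with Hebbian feedforward and anti-Hebbian lateral weights. Below is a minimal sketch of that baseline network (not the paper's deep/structured extension); all variable names, dimensions, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 16, 4   # illustrative input/output dimensions
eta = 0.01            # learning rate (assumed)
W = 0.1 * rng.standard_normal((n_out, n_in))  # feedforward (Hebbian) weights
M = np.zeros((n_out, n_out))                  # lateral (anti-Hebbian) weights, zero diagonal

def neural_dynamics(x, n_steps=100, dt=0.1):
    """Relax recurrent dynamics to the fixed point y = W x - M y."""
    y = np.zeros(n_out)
    for _ in range(n_steps):
        y += dt * (W @ x - M @ y - y)
    return y

# Online learning loop on random inputs (a stand-in for a real dataset)
for t in range(5000):
    x = rng.standard_normal(n_in)
    y = neural_dynamics(x)
    # Hebbian update: correlate pre- and postsynaptic activity
    W += eta * (np.outer(y, x) - W)
    # Anti-Hebbian update on lateral weights; diagonal kept at zero
    M += eta * (np.outer(y, y) - M)
    np.fill_diagonal(M, 0.0)
```

Both updates use only locally available quantities (the activities of the two neurons a synapse connects), which is the locality property the paper carries over to deep and structured architectures via synapse-specific local objectives.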
