H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks
Thomas Limbacher, Robert Legenstein
Spotlight presentation: Orals & Spotlights Track 08: Deep Learning
on 2020-12-08, 07:50-08:00 (PST)
Abstract: The ability to base current computations on memories from the past is critical for many cognitive tasks such as story understanding. Hebbian-type synaptic plasticity is believed to underlie the retention of memories over medium and long time scales in the brain. However, it is unclear how such plasticity processes are integrated with computations in cortical networks. Here, we propose Hebbian Memory Networks (H-Mems), a simple neural network model that is built around a core hetero-associative network subject to Hebbian plasticity. We show that the network can be optimized to utilize the Hebbian plasticity processes for its computations. H-Mems can memorize associations between stimulus pairs in one shot and use these associations in later decisions. Furthermore, they can solve demanding question-answering tasks on synthetic stories. Our study shows that neural network models can enrich their computations with memories through simple Hebbian plasticity processes.
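To make the core mechanism concrete, below is a minimal sketch of a hetero-associative memory with a Hebbian outer-product write rule, the kind of component the abstract describes. This is an illustrative assumption, not the authors' H-Mem implementation: the class name `HebbianMemory`, the dimensions, and the learning rate are all hypothetical.

```python
import numpy as np


class HebbianMemory:
    """Hetero-associative memory with a Hebbian (outer-product) write rule.

    Illustrative sketch only, not the authors' H-Mem code: an association
    between a key vector and a value vector is stored in one shot by adding
    their outer product to a weight matrix, and retrieved by a
    matrix-vector product.
    """

    def __init__(self, key_dim: int, value_dim: int, learning_rate: float = 1.0):
        self.W = np.zeros((value_dim, key_dim))  # association matrix, initially empty
        self.learning_rate = learning_rate

    def write(self, key: np.ndarray, value: np.ndarray) -> None:
        # Hebbian update: strengthen connections between co-active
        # key and value units (one-shot outer-product storage).
        self.W += self.learning_rate * np.outer(value, key)

    def read(self, key: np.ndarray) -> np.ndarray:
        # Retrieval: project the key through the association matrix.
        return self.W @ key


# One-shot association between two random stimulus patterns.
rng = np.random.default_rng(0)
key = rng.standard_normal(64)
key /= np.linalg.norm(key)      # unit-norm key makes retrieval exact here
value = rng.standard_normal(32)

mem = HebbianMemory(key_dim=64, value_dim=32)
mem.write(key, value)
recalled = mem.read(key)
print(np.allclose(recalled, value))  # True: the stored value is recovered
```

With a unit-norm key, reading back gives `W @ key = value * (key . key) = value`, so a single Hebbian write suffices for exact recall. In the paper's setting, the surrounding network is additionally optimized to decide what to write and when to read, which this sketch omits.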