

Poster in Workshop: Memory in Artificial and Real Intelligence (MemARI)

Learning to Control Rapidly Changing Synaptic Connections: An Alternative Type of Memory in Sequence Processing Artificial Neural Networks

Kazuki Irie · Jürgen Schmidhuber

Keywords: [ synaptic connection weights ] [ short-term memory ] [ fast weight programmers ]


Abstract:

Short-term memory in standard, general-purpose, sequence-processing recurrent neural networks (RNNs) is stored as activations of nodes or "neurons." Generalizing feedforward NNs (FNNs) to such RNNs is mathematically straightforward and natural, and even historical: already in 1943, McCulloch and Pitts proposed this as a surrogate for "synaptic modifications," generalizing the Lenz-Ising model, the first RNN architecture of 1925. A lesser-known alternative approach stores short-term memory in "synaptic connections": by parameterising and controlling the dynamics of a context-sensitive, time-varying weight matrix through another NN, one obtains another "natural" type of short-term memory in sequence-processing NNs: the Fast Weight Programmers (FWPs) of the early 1990s. FWPs have seen a recent revival as generic sequence processors, achieving competitive performance across various tasks. They are formally closely related to the now popular Transformers. Here we present them in the context of artificial NNs as an abstraction of biological NNs, a perspective that has not been stressed enough in previous FWP work. We first review aspects of FWPs for pedagogical purposes, then discuss connections to related works motivated by insights from neuroscience.
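To make the fast-weight idea concrete, the following is a minimal sketch in PyTorch of the additive outer-product FWP variant, the form that is formally related to linear attention in Transformers. All names (fwp_step, Wq, Wk, Wv) and the ELU+1 feature map are illustrative assumptions for this sketch, not the authors' code.

import torch
import torch.nn.functional as F

def fwp_step(W_fast, x, Wq, Wk, Wv):
    """One step of a minimal Fast Weight Programmer (additive variant).

    The slow net (projections Wq, Wk, Wv) reads the current input x and
    'programs' the fast weight matrix W_fast, so short-term memory lives
    in synaptic connections rather than in neuron activations.
    """
    # Slow net emits a query, key, and value for this time step.
    q = F.elu(Wq @ x) + 1.0   # positive feature map, as in linear attention
    k = F.elu(Wk @ x) + 1.0
    v = Wv @ x
    # Write: rank-one additive update of the fast weight matrix.
    W_fast = W_fast + torch.outer(v, k)
    # Read: apply the just-updated fast weights to the query.
    y = W_fast @ q
    return W_fast, y

# Usage sketch: process a random sequence step by step.
d_in, d_key, d_out, seq_len = 8, 16, 8, 20
Wq = torch.randn(d_key, d_in) / d_in ** 0.5
Wk = torch.randn(d_key, d_in) / d_in ** 0.5
Wv = torch.randn(d_out, d_in) / d_in ** 0.5
W_fast = torch.zeros(d_out, d_key)   # fast weights start empty
for x in torch.randn(seq_len, d_in):
    W_fast, y = fwp_step(W_fast, x, Wq, Wk, Wv)

The purely additive write shown here is the simplest update rule; FWP variants differ mainly in the choice of feature map and in the update rule, for example replacing the plain addition with a delta-rule-style correction that first subtracts the value currently associated with the key.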
