Poster

Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules

Kazuki Irie · Francesco Faccio · Jürgen Schmidhuber

Hall J (level 1) #932

Keywords: [ Linear Transformers ] [ Continuous-Time Sequence Processing ] [ Neural ODEs ] [ Neural Controlled Differential Equations ] [ Fast Weight Programmers ]


Abstract:

Neural ordinary differential equations (ODEs) have attracted much attention as continuous-time counterparts of deep residual neural networks (NNs), and numerous extensions for recurrent NNs have been proposed. Since the 1980s, ODEs have also been used to derive theoretical results for NN learning rules, e.g., the famous connection between Oja's rule and principal component analysis. Such rules are typically expressed as additive iterative update processes which have straightforward ODE counterparts. Here we introduce a novel combination of learning rules and Neural ODEs to build continuous-time sequence processing nets that learn to manipulate short-term memory in the rapidly changing synaptic connections of other nets. This yields continuous-time counterparts of Fast Weight Programmers and linear Transformers. Our models outperform the best existing Neural Controlled Differential Equation-based models on various time series classification tasks, while also addressing their fundamental scalability limitations. Our code is public.
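To make the continuous-time view of learning rules concrete: the classic example cited in the abstract is Oja's rule, whose discrete additive update has a direct ODE counterpart. In standard textbook notation (not quoted from the paper):

    \Delta w_t = \eta \, ( y_t x_t - y_t^2 w_t ), \qquad y_t = w_t^\top x_t
    \frac{dw(t)}{dt} = y(t)\, x(t) - y(t)^2\, w(t)

In the same spirit, a Fast Weight Programmer's outer-product update W_t = W_{t-1} + v_t k_t^\top has the matrix ODE counterpart dW(t)/dt = v(t) k(t)^\top. Below is a minimal Euler-integrated sketch of such a continuous-time fast weight memory, assuming a purely Hebbian update; the paper's actual models learn the update rule and build on neural controlled differential equations, and all function and variable names here are hypothetical:

    import numpy as np

    def continuous_fwp(keys, values, queries, dt=0.01):
        """Euler integration of the matrix ODE dW/dt = v(t) k(t)^T.
        W acts as short-term memory written by a (key, value) stream
        and read out by queries. Illustrative sketch only."""
        d_k, d_v = keys.shape[1], values.shape[1]
        W = np.zeros((d_v, d_k))           # fast weight matrix
        outputs = []
        for k, v, q in zip(keys, values, queries):
            W = W + dt * np.outer(v, k)    # one Euler step of the ODE
            outputs.append(W @ q)          # query the fast weights
        return np.array(outputs)

    # Usage: random key/value/query streams of length 50, width 8.
    rng = np.random.default_rng(0)
    T, d = 50, 8
    ys = continuous_fwp(rng.normal(size=(T, d)),
                        rng.normal(size=(T, d)),
                        rng.normal(size=(T, d)))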
