
Combining Recurrent, Convolutional, and Continuous-Time Models with Structured Learnable Linear State-Space Layers
Isys Johnson · Albert Gu · Karan Goel · Khaled Saab · Tri Dao · Atri Rudra · Christopher Ré

The Linear State-Space Layer (LSSL) is a model family that combines the strengths of several sequence-modeling paradigms: recurrence, convolution, and differential equations. For example, LSSLs generalize convolutions to continuous time, explain common RNN heuristics, and share features of neural differential equations (NDEs) such as time-scale adaptation. Although naive LSSLs struggle to model long dependencies, we introduce a class of LSSLs (SLLSSL) that overcomes these limitations by employing a trainable set of structured matrices, endowing the model with long-range memory.
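The equivalence between the recurrent and convolutional views mentioned in the abstract can be illustrated with a small numerical sketch. The snippet below is not the authors' implementation; it is a minimal NumPy example, assuming a generic linear state-space model x'(t) = Ax(t) + Bu(t), y(t) = Cx(t), discretized with the bilinear (Tustin) transform. The same discretized system is run once as an RNN-style recurrence and once as a causal convolution with kernel K_k = C Ā^k B̄, and the two outputs agree.

```python
import numpy as np

def discretize(A, B, dt):
    # Bilinear (Tustin) transform of a continuous-time state-space model.
    N = A.shape[0]
    I = np.eye(N)
    inv = np.linalg.inv(I - dt / 2 * A)
    Ab = inv @ (I + dt / 2 * A)   # discrete state matrix A-bar
    Bb = inv @ (dt * B)           # discrete input matrix B-bar
    return Ab, Bb

def ssm_recurrent(Ab, Bb, C, u):
    # RNN view: x_k = Ab x_{k-1} + Bb u_k,  y_k = C x_k.
    x = np.zeros(Ab.shape[0])
    ys = []
    for uk in u:
        x = Ab @ x + (Bb * uk).ravel()
        ys.append(float(C @ x))
    return np.array(ys)

def ssm_convolutional(Ab, Bb, C, u):
    # CNN view: kernel K_k = C Ab^k Bb, then a causal convolution y = K * u.
    L = len(u)
    K, Abk_B = [], Bb.ravel()
    for _ in range(L):
        K.append(float(C @ Abk_B))
        Abk_B = Ab @ Abk_B
    return np.convolve(u, np.array(K))[:L]

# Hypothetical sizes and random (stable-ish) parameters for illustration only.
rng = np.random.default_rng(0)
N, L = 4, 32
A = -np.eye(N) + 0.1 * rng.standard_normal((N, N))
B = rng.standard_normal((N, 1))
C = rng.standard_normal((1, N))
u = rng.standard_normal(L)

Ab, Bb = discretize(A, B, dt=0.1)
y_rec = ssm_recurrent(Ab, Bb, C, u)
y_conv = ssm_convolutional(Ab, Bb, C, u)
assert np.allclose(y_rec, y_conv)  # both views compute the same sequence
```

The recurrence costs O(N²) per step but supports online inference; the convolutional form can process the whole sequence at once, which is the trade-off the LSSL family exploits.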

Author Information

Isys Johnson (State University of New York at Buffalo)
Albert Gu (Stanford)
Karan Goel (Stanford)
Khaled Saab (Stanford University)
Tri Dao (Stanford University)
Atri Rudra (University at Buffalo, SUNY)
Christopher Ré (Stanford)
