Linear Dynamical Systems as a Core Computational Primitive

Shiva Kaul

Spotlight presentation: Orals & Spotlights Track 16: Continual/Meta/Misc Learning
on 2020-12-09, 07:10–07:20 PST
Poster Session 4
on 2020-12-09, 09:00–11:00 PST
GatherTown: Algorithms (Town A3, Spot B0)
Abstract: Running nonlinear RNNs for T steps takes O(T) time. Our construction, called LDStack, approximately runs them in O(log T) parallel time, and obtains arbitrarily low error via repetition. First, we show nonlinear RNNs can be approximated by a stack of multiple-input, multiple-output (MIMO) LDS. This replaces nonlinearity across time with nonlinearity along depth. Next, we show that MIMO LDS can be approximated by an average or a concatenation of single-input, multiple-output (SIMO) LDS. Finally, we present an algorithm for running (and differentiating) SIMO LDS in O(log T) parallel time. On long sequences, LDStack is much faster than traditional RNNs, yet it achieves similar accuracy in our experiments. Furthermore, LDStack is amenable to linear systems theory. Therefore, it improves not only speed, but also interpretability and mathematical tractability.
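The last step of the construction, running a SIMO LDS in O(log T) parallel time, rests on the fact that composing the affine per-step state updates is associative, so the whole recurrence can be evaluated with a parallel scan. The sketch below illustrates that primitive only; it is not the authors' LDStack code. It assumes a diagonal (eigenvalue) state matrix, and the function name `run_simo_lds` and its parameter names are chosen here for illustration. It uses `jax.lax.associative_scan`, which evaluates an associative combine in logarithmic depth.

```python
import jax
import jax.numpy as jnp

def run_simo_lds(lam, b, C, u):
    """Run a SIMO LDS s_t = diag(lam) s_{t-1} + b u_t, y_t = C s_t.

    lam: (n,) eigenvalues (possibly complex); b: (n,) input vector;
    C: (m, n) output matrix; u: (T,) scalar input sequence.
    Returns y: (T, m), assuming s_0 = 0.
    """
    T = u.shape[0]
    # Step t is the affine map s -> lam * s + b * u[t], represented by
    # the pair (lam, b * u[t]). Composing two such maps is associative,
    # so the cumulative states can be computed by a parallel scan in
    # O(log T) depth.
    coeffs = jnp.broadcast_to(lam, (T, lam.shape[0]))
    offsets = u[:, None] * b[None, :]

    def combine(first, second):
        a1, c1 = first
        a2, c2 = second
        # Applying (a1, c1) then (a2, c2): s -> a2 * (a1 * s + c1) + c2
        return a2 * a1, a2 * c1 + c2

    _, states = jax.lax.associative_scan(combine, (coeffs, offsets))
    # states[t] is the state after processing inputs u[0..t] from s_0 = 0.
    return (states @ C.T).real

# Hypothetical usage: a 2-state SIMO LDS driven by a length-16 input.
y = run_simo_lds(jnp.array([0.9, 0.5]), jnp.ones(2), jnp.eye(2), jnp.ones(16))
```

Reverse-mode differentiation through such a scan is itself scan-shaped, which is consistent with the abstract's claim that SIMO LDS can be both run and differentiated in O(log T) parallel time.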
