Dissecting Neural ODEs
Stefano Massaroli · Michael Poli · Jinkyoo Park · Atsushi Yamashita · Hajime Asama
2020 Poster
Abstract
Continuous deep learning architectures have recently re-emerged as Neural Ordinary Differential Equations (Neural ODEs). This infinite-depth approach theoretically bridges the gap between deep learning and dynamical systems, offering a novel perspective. However, deciphering the inner workings of these models remains an open challenge, as most applications use them as generic black-box modules. In this work we "open the box", further developing the continuous-depth formulation with the aim of clarifying the influence of several design choices on the underlying dynamics.
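To make the continuous-depth formulation concrete, here is a minimal sketch (not the authors' code) of a Neural ODE block in PyTorch: the depth variable s replaces the layer index, and the output is obtained by integrating dz/ds = f(s, z) with a fixed-step Euler scheme. The class names ODEFunc and NeuralODE and the step count are illustrative assumptions.

```python
# Minimal Neural ODE sketch (assumption: fixed-step Euler integration,
# hypothetical class names ODEFunc / NeuralODE).
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Vector field f(s, z) defining the continuous-depth dynamics dz/ds."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, s, z):
        return self.net(z)

class NeuralODE(nn.Module):
    """Maps z(0) to z(1) by integrating dz/ds = f(s, z) over s in [0, 1]."""
    def __init__(self, func, n_steps=20):
        super().__init__()
        self.func, self.n_steps = func, n_steps

    def forward(self, z):
        ds = 1.0 / self.n_steps
        s = 0.0
        for _ in range(self.n_steps):  # explicit Euler step: z <- z + ds * f(s, z)
            z = z + ds * self.func(s, z)
            s += ds
        return z

model = NeuralODE(ODEFunc(dim=2))
z0 = torch.randn(8, 2)       # batch of initial conditions z(0)
z1 = model(z0)               # terminal state z(1), used as the model output
```

In practice an adaptive-step solver with the adjoint method (e.g. as provided by the torchdiffeq library) would typically replace the Euler loop; the sketch above only illustrates the continuous-depth idea.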