Dissecting Neural ODEs
Stefano Massaroli · Michael Poli · Jinkyoo Park · Atsushi Yamashita · Hajime Asama

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #222

Continuous deep learning architectures have recently re-emerged as Neural Ordinary Differential Equations (Neural ODEs). This infinite-depth approach theoretically bridges the gap between deep learning and dynamical systems, offering a novel perspective. However, deciphering the inner workings of these models remains an open challenge, as most applications treat them as generic black-box modules. In this work we "open the box", further developing the continuous-depth formulation with the aim of clarifying the influence of several design choices on the underlying dynamics.
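The continuous-depth idea mentioned in the abstract can be sketched in a few lines: the hidden state evolves according to a learned vector field, and the forward pass is an ODE solve. The snippet below is a minimal illustration only, not the authors' implementation; the parameters, the single tanh layer, and the fixed-step Euler solver are assumptions for the sketch (practical Neural ODEs use adaptive solvers and adjoint-based training).

```python
import numpy as np

# Hypothetical parameters for illustration (not from the paper).
rng = np.random.default_rng(0)
dim = 4
W = rng.normal(scale=0.5, size=(dim, dim))  # weights of the vector field
b = rng.normal(scale=0.1, size=dim)

def f(h, t):
    """Vector field f(h, t; theta): dh/dt = f. Here a single tanh layer."""
    return np.tanh(W @ h + b)

def neural_ode_forward(h0, n_steps=100, t0=0.0, t1=1.0):
    """Forward pass as a fixed-step explicit Euler solve on [t0, t1].

    Each Euler step h <- h + dt * f(h, t) is exactly one residual-network
    layer, which is why Neural ODEs are the continuous-depth limit of
    residual networks.
    """
    h = h0.astype(float)
    dt = (t1 - t0) / n_steps
    for k in range(n_steps):
        h = h + dt * f(h, t0 + k * dt)
    return h

x = rng.normal(size=dim)
out_coarse = neural_ode_forward(x, n_steps=10)
out_fine = neural_ode_forward(x, n_steps=1000)
# Refining the discretization only perturbs the output slightly: the model
# is defined by the continuous flow, not by a fixed layer count.
print(np.linalg.norm(out_fine - out_coarse))
```

Viewing depth as integration time is what lets design choices (solver, vector-field parametrization, augmentation) be analyzed through the lens of the underlying dynamics, which is the perspective the paper develops.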

Author Information

Stefano Massaroli (The University of Tokyo)
Michael Poli (KAIST)

My work spans topics in deep learning, dynamical systems, variational inference, and numerical methods. I am broadly interested in ensuring that the successes achieved by deep learning methods in computer vision and natural language processing are extended to other engineering domains.

Jinkyoo Park (KAIST)
Atsushi Yamashita (The University of Tokyo)
Hajime Asama (The University of Tokyo)
