

Poster in Workshop: The Symbiosis of Deep Learning and Differential Equations II

Efficient Robustness Verification of Neural Ordinary Differential Equations

Mustafa Zeqiri · Mark Müller · Marc Fischer · Martin Vechev


Abstract:

Neural Ordinary Differential Equations (NODEs) are a novel neural architecture, built around initial value problems with learned dynamics. Thought to be inherently more robust against adversarial perturbations, they were recently shown to be vulnerable to strong adversarial attacks, highlighting the need for formal guarantees. In this work, we tackle this challenge and propose GAINS, an analysis framework for NODEs based on three key ideas: (i) a novel class of ODE solvers, based on variable but discrete time steps, (ii) an efficient graph representation of solver trajectories, and (iii) a bound propagation algorithm operating on this graph representation. Together, these advances enable the efficient analysis and certified training of high-dimensional NODEs, which we demonstrate in an extensive evaluation on computer vision and time-series forecasting problems.
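
To make the certification idea concrete, the toy sketch below propagates an axis-aligned input box through a few discrete solver steps of a simple linear ODE using standard interval bound propagation. This is only an illustrative assumption of how bounds could flow through discrete, variable-size time steps; the function names, the explicit-Euler scheme, and the linear dynamics are hypothetical and do not reflect the GAINS solvers, trajectory graph, or bound propagation algorithm described in the paper.

```python
# Illustrative sketch only: interval bound propagation through a fixed
# number of discrete solver steps for a linear ODE dx/dt = A x + b.
# All names and the Euler/IBP scheme are assumptions for illustration,
# NOT the GAINS solver class or its graph-based algorithm.
import numpy as np

def ibp_affine(lower, upper, W, bias):
    """Propagate an axis-aligned box through x -> W @ x + bias (standard IBP)."""
    center = (upper + lower) / 2.0
    radius = (upper - lower) / 2.0
    new_center = W @ center + bias
    new_radius = np.abs(W) @ radius
    return new_center - new_radius, new_center + new_radius

def euler_step_bounds(lower, upper, A, b, dt):
    """One explicit-Euler step x_{k+1} = x_k + dt*(A x_k + b), viewed as an affine map."""
    W = np.eye(len(lower)) + dt * A          # affine map induced by one solver step
    return ibp_affine(lower, upper, W, dt * b)

def verify_trajectory(x0, eps, A, b, step_sizes):
    """Propagate the input box [x0 - eps, x0 + eps] along a discrete-step trajectory."""
    lower, upper = x0 - eps, x0 + eps
    for dt in step_sizes:                    # discrete, possibly variable step sizes
        lower, upper = euler_step_bounds(lower, upper, A, b, dt)
    return lower, upper

if __name__ == "__main__":
    A = np.array([[-1.0, 0.5], [0.0, -2.0]])  # toy stable linear dynamics
    b = np.zeros(2)
    x0 = np.array([1.0, -1.0])
    lo, up = verify_trajectory(x0, eps=0.1, A=A, b=b, step_sizes=[0.1] * 10)
    print("output bounds:", lo, up)
```

In this toy setting each solver step is an affine map, so the interval bounds stay tight; handling learned nonlinear dynamics and adaptive step-size decisions, which create branching trajectories, is precisely what motivates the graph representation and bound propagation algorithm proposed in the paper.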
