Zonotope Domains for Lagrangian Neural Network Verification
Matt Jordan · Jonathan Hayase · Alex Dimakis · Sewoong Oh

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #723

Neural network verification aims to provide provable bounds on the output of a neural network over a given input range. Notable prior works in this area have either generated bounds using abstract domains, which preserve some dependencies between intermediate neurons in the network, or framed verification as an optimization problem and solved a relaxation using Lagrangian methods. A key drawback of the latter technique is that each neuron is treated independently, thereby ignoring important neuron interactions. We provide an approach that merges these two threads, using zonotopes within a Lagrangian decomposition. Crucially, we can decompose the problem of verifying a deep neural network into the verification of many 2-layer neural networks. While each of these subproblems is provably hard, we provide relaxations that are amenable to efficient dual ascent procedures. Our technique yields bounds that improve upon both linear programming and Lagrangian-based verification techniques in both runtime and bound tightness.
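To make the zonotope abstract domain concrete, the following is a minimal illustrative sketch (not the paper's implementation, and the function names are hypothetical): a zonotope is the set {c + G e : e ∈ [-1, 1]^m} for a center c and generator matrix G, it maps exactly through affine layers, and it can be concretized into elementwise interval bounds.

```python
import numpy as np

# A zonotope is the set {c + G @ e : e in [-1, 1]^m},
# where c is the center and the columns of G are generators.

def affine_push(c, G, W, b):
    """Push a zonotope exactly through an affine layer x -> W @ x + b."""
    # Affine maps preserve zonotopes: the center moves, generators transform.
    return W @ c + b, W @ G

def concretize(c, G):
    """Elementwise interval bounds implied by the zonotope."""
    # Each coordinate's radius is the sum of absolute generator entries.
    radius = np.abs(G).sum(axis=1)
    return c - radius, c + radius

# Example: a 2-D zonotope pushed through a small affine layer.
c = np.array([1.0, 0.0])
G = np.array([[0.5, 0.2],
              [0.0, 0.3]])
W = np.array([[1.0, -1.0],
              [2.0,  0.5]])
b = np.array([0.0, 1.0])

c2, G2 = affine_push(c, G, W, b)
lo, hi = concretize(c2, G2)
```

Because the generator matrix is shared across coordinates, the resulting bounds retain dependencies between neurons that independent per-neuron intervals would discard; handling nonlinear activations such as ReLU is where the paper's relaxation and dual ascent machinery comes in.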

Author Information

Matt Jordan (UT Austin)
Jonathan Hayase (University of Washington)
Alex Dimakis (University of Texas, Austin)
Sewoong Oh (University of Washington)