We propose a new parametric framework, called k-ReLU, for computing precise and scalable convex relaxations used to certify neural networks. The key idea is to approximate the output of multiple ReLUs in a layer jointly instead of separately. This joint relaxation captures dependencies between the inputs to different ReLUs in a layer and thus overcomes the convex barrier imposed by the single-neuron triangle relaxation and its approximations. The framework is parametric in the number k of ReLUs it considers jointly and can be combined with existing verifiers to improve their precision. Our experimental results show that k-ReLU enables significantly more precise certification than existing state-of-the-art verifiers while maintaining scalability.
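To make the "convex barrier" concrete, the sketch below (an illustrative toy example, not the paper's implementation; the setup, variable names, and the input polytope are assumptions) shows how relaxing each ReLU separately with the triangle relaxation loses a dependency that a joint (k = 2) relaxation would keep. It requires numpy and scipy.

```python
# Minimal sketch (NOT the paper's code): single-neuron triangle relaxation
# vs. the joint relaxation idea, on a hypothetical 2-ReLU toy example.
import numpy as np
from scipy.optimize import linprog

# Toy layer: y_i = ReLU(x_i) for i = 1, 2, with x_1, x_2 in [-1, 1] and an
# input polytope that additionally enforces x_1 = x_2 (perfectly correlated
# inputs). On this set the exact value of max (y_1 - y_2) is 0.

# Triangle relaxation of y = ReLU(x) on [l, u] = [-1, 1]:
#   y >= 0,  y >= x,  y <= u * (x - l) / (u - l) = (x + 1) / 2.
# LP variables z = [x1, x2, y1, y2]; maximize y1 - y2, i.e. minimize -(y1 - y2).
c = np.array([0.0, 0.0, -1.0, 1.0])

# A_ub @ z <= b_ub encodes the two per-neuron triangle relaxations.
A_ub = np.array([
    [ 1.0,  0.0, -1.0,  0.0],   # x1 - y1 <= 0        (y1 >= x1)
    [ 0.0,  1.0,  0.0, -1.0],   # x2 - y2 <= 0        (y2 >= x2)
    [-0.5,  0.0,  1.0,  0.0],   # y1 <= (x1 + 1) / 2  (upper face)
    [ 0.0, -0.5,  0.0,  1.0],   # y2 <= (x2 + 1) / 2  (upper face)
])
b_ub = np.array([0.0, 0.0, 0.5, 0.5])

# The input-level dependency x1 = x2 is visible to the LP ...
A_eq = np.array([[1.0, -1.0, 0.0, 0.0]])
b_eq = np.array([0.0])

bounds = [(-1, 1), (-1, 1), (0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print("triangle relaxation: max (y1 - y2) =", -res.fun)   # prints 0.5
# ... yet each ReLU is relaxed separately, so the LP attains 0.5
# (at x1 = x2 = 0, y1 = 0.5, y2 = 0) instead of the exact value 0.

# A joint (k = 2) relaxation instead over-approximates the set
# {(x1, x2, ReLU(x1), ReLU(x2)) : x1 = x2 in [-1, 1]} as a whole; its convex
# hull satisfies y1 - y2 = 0, so the joint relaxation certifies
# max (y1 - y2) = 0 and closes the gap left by the single-neuron barrier.
```

The per-neuron relaxation is optimal for each ReLU in isolation, which is exactly why no amount of tightening individual neurons can recover the lost precision; only constraints that couple several ReLUs, as in k-ReLU, can.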
Author Information
Gagandeep Singh (ETH Zurich)
Rupanshu Ganvir (ETH Zurich)
Markus Püschel (ETH Zurich)
Martin Vechev (ETH Zurich)
More from the Same Authors
- 2020 Poster: Learning Certified Individually Fair Representations
  Anian Ruoss · Mislav Balunovic · Marc Fischer · Martin Vechev
- 2020 Poster: Certified Defense to Image Transformations via Randomized Smoothing
  Marc Fischer · Maximilian Baader · Martin Vechev
- 2019 Poster: Powerset Convolutional Neural Networks
  Chris Wendler · Markus Püschel · Dan Alistarh
- 2019 Poster: Certifying Geometric Robustness of Neural Networks
  Mislav Balunovic · Maximilian Baader · Gagandeep Singh · Timon Gehr · Martin Vechev
- 2018 Poster: Learning to Solve SMT Formulas
  Mislav Balunovic · Pavol Bielik · Martin Vechev
- 2018 Oral: Learning to Solve SMT Formulas
  Mislav Balunovic · Pavol Bielik · Martin Vechev
- 2018 Poster: Fast and Effective Robustness Certification
  Gagandeep Singh · Timon Gehr · Matthew Mirman · Markus Püschel · Martin Vechev