Poster
Fast Adversarial Robustness Certification of Nearest Prototype Classifiers for Arbitrary Seminorms
Sascha Saralajew · Lars Holdijk · Thomas Villmann

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1659

Methods for adversarial robustness certification aim to provide an upper bound on the test error of a classifier under adversarial manipulation of its input. Current certification methods are computationally expensive and limited to attacks that optimize the manipulation with respect to a norm. We overcome these limitations by investigating the robustness properties of Nearest Prototype Classifiers (NPCs) like learning vector quantization and large margin nearest neighbor. For this purpose, we study the hypothesis margin. We prove that if NPCs use a dissimilarity measure induced by a seminorm, the hypothesis margin is a tight lower bound on the size of adversarial attacks and can be calculated in constant time; this provides the first adversarial robustness certificate calculable in reasonable time. Finally, we show that each NPC trained by a triplet loss maximizes the hypothesis margin and is therefore optimized for adversarial robustness. In the presented evaluation, we demonstrate that NPCs optimized for adversarial robustness are competitive with state-of-the-art methods and set a new benchmark with respect to computational complexity for robustness certification.
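The certificate described in the abstract can be sketched in a few lines. The hypothesis margin of an NPC is half the gap between the distance to the closest wrong-class prototype and the distance to the closest correct-class prototype; for a dissimilarity induced by a seminorm, this value lower-bounds the size of any label-flipping perturbation. The sketch below is illustrative, not the authors' implementation: it assumes a Euclidean (p-norm) dissimilarity, and the function and variable names are hypothetical.

```python
import numpy as np

def hypothesis_margin(x, prototypes, labels, y, norm_ord=2):
    """Certified robustness radius of an NPC at input x (illustrative sketch).

    x          : input vector, shape (d,)
    prototypes : prototype matrix, shape (k, d)
    labels     : class label of each prototype, shape (k,)
    y          : true class of x
    norm_ord   : order of the p-norm inducing the dissimilarity (assumption:
                 a plain p-norm stands in for the paper's general seminorm)
    """
    # Distances from x to all prototypes under the chosen norm.
    d = np.linalg.norm(prototypes - x, ord=norm_ord, axis=1)
    d_plus = d[labels == y].min()    # closest prototype of the true class
    d_minus = d[labels != y].min()   # closest prototype of any other class
    # Positive margin => x is correctly classified, and no perturbation of
    # norm smaller than the margin can change the predicted class.
    return 0.5 * (d_minus - d_plus)

# Toy example: two prototypes for two classes on a line.
prototypes = np.array([[0.0, 0.0], [4.0, 0.0]])
labels = np.array([0, 1])
margin = hypothesis_margin(np.array([1.0, 0.0]), prototypes, labels, y=0)
# d_plus = 1, d_minus = 3, so the certified radius is (3 - 1) / 2 = 1.0
```

Note that the cost is a single pass over the k prototypes, independent of any attack optimization, which is what makes the certificate so cheap compared to verification methods that solve an optimization problem per input.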

Author Information

Sascha Saralajew (Dr. Ing. h.c. F. Porsche AG)
Lars Holdijk (University of Amsterdam)
Thomas Villmann (University of Applied Sciences Mittweida)
