Poster
Most ReLU Networks Suffer from $\ell^2$ Adversarial Perturbations
Amit Daniely · Hadas Shacham

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1556
We consider ReLU networks with random weights, in which the dimension decreases at each layer. We show that for most such networks, most examples $x$ admit an adversarial perturbation at Euclidean distance $O\left(\frac{\|x\|}{\sqrt{d}}\right)$, where $d$ is the input dimension. Moreover, this perturbation can be found via gradient flow, as well as via gradient descent with sufficiently small steps. This result can be seen as an explanation of the abundance of adversarial examples, and of the fact that they are found via gradient descent.
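The setting described in the abstract can be sketched numerically: draw a random ReLU network with decreasing layer widths, then run gradient descent on the input until the sign of the output flips, and compare the perturbation's norm to $\|x\|/\sqrt{d}$. This is an illustrative sketch, not the paper's construction; the depth, widths, weight scaling (He-style Gaussian), and step size below are all assumptions chosen for stability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random ReLU network with decreasing widths (d > d1 > d2), He-style
# Gaussian weights. The exact distribution is an assumption here.
d, d1, d2 = 1000, 500, 250
W1 = rng.normal(0, np.sqrt(2.0 / d), (d1, d))
W2 = rng.normal(0, np.sqrt(2.0 / d1), (d2, d1))
w = rng.normal(0, np.sqrt(1.0 / d2), d2)

def forward(x):
    """Two hidden ReLU layers followed by a linear readout."""
    h1 = np.maximum(W1 @ x, 0.0)
    h2 = np.maximum(W2 @ h1, 0.0)
    return h1, h2, w @ h2

def input_gradient(x):
    """Gradient of the scalar output with respect to the input x."""
    h1, h2, _ = forward(x)
    g2 = w * (h2 > 0)             # backprop through readout + top ReLU
    g1 = (W2.T @ g2) * (h1 > 0)   # through the second hidden layer
    return W1.T @ g1

x = rng.normal(size=d)
_, _, y0 = forward(x)

# Small-step gradient descent on the input: move against sign(y0) * grad
# until the network's sign (its "decision") flips.
x_adv = x.copy()
eta = 0.05
for _ in range(20000):
    _, _, y = forward(x_adv)
    if np.sign(y) != np.sign(y0):
        break
    x_adv -= eta * np.sign(y0) * input_gradient(x_adv)

delta = np.linalg.norm(x_adv - x)
print(f"||x|| = {np.linalg.norm(x):.2f}, "
      f"||perturbation|| = {delta:.2f}, "
      f"||x||/sqrt(d) = {np.linalg.norm(x) / np.sqrt(d):.2f}")
```

In runs like this the perturbation that flips the sign typically has norm on the order of $\|x\|/\sqrt{d}$, i.e. a tiny fraction of the input's norm, matching the scale of the theorem's bound.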

Author Information

Amit Daniely (Hebrew University and Google Research)
Hadas Shacham (Hebrew University)
