Exact Gradient Computation for Spiking Neural Networks
Jane Lee · Saeid Haghighatshoar · Amin Karbasi
Event URL: https://openreview.net/forum?id=UC_gA3cyFNu

Spiking neural networks (SNNs) have recently emerged as an alternative to traditional neural networks, holding promise for energy-efficiency benefits. However, the classic backpropagation algorithm for training traditional networks has been notoriously difficult to apply to SNNs because of the hard thresholding and the discontinuities at spike times. As a result, a large majority of prior work has assumed that exact gradients of SNNs w.r.t. their weights do not exist and has focused on approximation methods that produce surrogate gradients. In this paper, (1) by applying the implicit function theorem to SNNs at the discrete spike times, we prove that, although they are non-differentiable in time, SNNs have well-defined gradients w.r.t. their weights, and (2) we propose a novel training algorithm, called forward propagation (FP), that computes exact gradients for SNNs. Our derivation of FP also provides insight into why related algorithms such as Hebbian learning, as well as recently proposed surrogate gradient methods, may perform well.
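A minimal sketch of the implicit-function-theorem argument (the notation below is illustrative and assumes a standard threshold-crossing spike condition; it is not taken from the paper): if the k-th spike time t_k of a neuron is defined implicitly by V(t_k; w) = \theta, where V is the membrane potential, w a synaptic weight, and \theta the firing threshold, and if V is smooth with a transversal crossing (\partial V / \partial t \neq 0 at t_k), then the implicit function theorem gives a well-defined derivative of the spike time with respect to the weight:

    \frac{\partial t_k}{\partial w}
        = - \left. \frac{\partial V / \partial w}{\partial V / \partial t} \right|_{t = t_k},
    \qquad \text{provided } \left. \frac{\partial V}{\partial t} \right|_{t = t_k} \neq 0 .

Such spike-time sensitivities can in principle be chained through downstream neurons; the existence of these derivatives is the kind of statement result (1) establishes, and it is what an exact-gradient algorithm such as FP can build on.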

Author Information

Jane Lee (Yale University)
Saeid Haghighatshoar (SynSense AG)

Saeid Haghighatshoar received his B.Sc. in electronics and his M.Sc. in communications systems from Sharif University of Technology, Tehran, Iran, in 2007 and 2009, respectively, and his Ph.D. in computer and communication sciences from EPFL, Lausanne, Switzerland, in 2014. Since 2015, he has held several R&D positions in signal processing, wireless communications, machine learning, smart sensing, and the Internet of Things at TU Berlin, Germany (2015-2019), and CSEM, Neuchâtel, Switzerland (2020-2021). He is currently a senior R&D machine learning engineer working on software and hardware development for spiking neural networks and neuromorphic computation at SynSense, Zurich, Switzerland (www.synsense.ai).

Amin Karbasi (Yale University)
