
Deep Residual Learning in Spiking Neural Networks
Wei Fang · Zhaofei Yu · Yanqi Chen · Tiejun Huang · Timothée Masquelier · Yonghong Tian

Tue Dec 07 04:30 PM -- 06:00 PM (PST)

Deep Spiking Neural Networks (SNNs) present optimization difficulties for gradient-based approaches due to their discrete binary activations and complex spatial-temporal dynamics. Considering the huge success of ResNet in deep learning, it would be natural to train deep SNNs with residual learning. Previous Spiking ResNet mimics the standard residual block in ANNs and simply replaces ReLU activation layers with spiking neurons; it suffers from the degradation problem and can hardly implement residual learning. In this paper, we propose the spike-element-wise (SEW) ResNet to realize residual learning in deep SNNs. We prove that the SEW ResNet can easily implement identity mapping and overcome the vanishing/exploding gradient problems of Spiking ResNet. We evaluate our SEW ResNet on ImageNet, DVS Gesture, and CIFAR10-DVS datasets, and show that SEW ResNet outperforms the state-of-the-art directly trained SNNs in both accuracy and time-steps. Moreover, SEW ResNet can achieve higher performance by simply adding more layers, providing a simple method to train deep SNNs. To the best of our knowledge, this is the first time that directly training deep SNNs with more than 100 layers becomes possible. Our code is available at https://github.com/fangwei123456/Spike-Element-Wise-ResNet.
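The core idea of the SEW block can be sketched in a few lines: the residual branch's spike output is combined with the shortcut spikes by an element-wise function g (e.g. ADD) rather than being summed before the spiking neuron, which lets the block reduce to an identity mapping when the residual branch emits no spikes. The sketch below is a minimal illustration under simplifying assumptions (a hard-reset Integrate-and-Fire neuron, a linear layer standing in for convolution, and the function names `if_neuron`/`sew_block` are hypothetical, not the repository's API):

```python
import numpy as np

def if_neuron(x, v_threshold=1.0):
    """Simplified hard-reset Integrate-and-Fire neuron.
    x: input currents of shape [T, N] over T time-steps.
    Returns binary spikes of shape [T, N]."""
    v = np.zeros(x.shape[1])
    spikes = np.zeros_like(x)
    for t in range(x.shape[0]):
        v = v + x[t]                       # integrate input current
        s = (v >= v_threshold).astype(x.dtype)
        spikes[t] = s
        v = v * (1.0 - s)                  # hard reset where a spike fired
    return spikes

def sew_block(x, weight, g="ADD"):
    """Sketch of a spike-element-wise (SEW) residual block.
    x: binary input spikes [T, N]; weight: [N, N] linear stand-in for conv.
    g combines the residual branch's spikes with the shortcut spikes."""
    out = if_neuron(x @ weight)            # SN(conv(x))
    if g == "ADD":
        return out + x                     # element-wise addition of spike trains
    if g == "AND":
        return out * x                     # both branches must spike
    if g == "IAND":
        return (1.0 - out) * x             # shortcut spike, residual silent
    raise ValueError(f"unknown element-wise function: {g}")

# Identity mapping: with zero weights the neuron never reaches threshold,
# the residual branch is silent, and g = ADD returns the shortcut unchanged.
T, N = 4, 8
rng = np.random.default_rng(0)
x = (rng.random((T, N)) < 0.5).astype(np.float64)
y = sew_block(x, np.zeros((N, N)), g="ADD")
assert np.array_equal(y, x)
```

This makes the abstract's identity-mapping claim concrete: a silent residual branch leaves the shortcut spikes untouched, so stacking more SEW blocks cannot degrade an already-trained shallower network.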

Author Information

Wei Fang (School of Electronics Engineering and Computer Science, Peking University)
Zhaofei Yu (Peking University)
Yanqi Chen (Peking University)
Tiejun Huang (Peking University)
Timothée Masquelier (CNRS)
Yonghong Tian (Peking University)
