Spotlight
Removing the Feature Correlation Effect of Multiplicative Noise
Zijun Zhang · Yining Zhang · Zongpeng Li

Thu Dec 06 07:25 AM -- 07:30 AM (PST) @ Room 517 CD

Multiplicative noise, including dropout, is widely used to regularize deep neural networks (DNNs), and has been shown to be effective across a wide range of architectures and tasks. From an information perspective, we view injecting multiplicative noise into a DNN as training the network to solve the task with noisy information pathways, which leads to the observation that multiplicative noise tends to increase the correlation between features, thereby increasing the signal-to-noise ratio of those pathways. However, high feature correlation is undesirable, as it increases redundancy in representations. In this work, we propose non-correlating multiplicative noise (NCMN), which exploits batch normalization to remove the correlation effect in a simple yet effective way. We show that NCMN significantly improves the performance of standard multiplicative noise on image classification tasks, providing a better alternative to dropout for batch-normalized networks. Additionally, we present a unified view of NCMN and shake-shake regularization, which explains the performance gain of the latter.
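To make the setting concrete, below is a minimal PyTorch sketch of the baseline the abstract refers to: standard multiplicative (Gaussian) noise applied to the activations of a batch-normalized network. This is an illustration of the general technique only, not the authors' NCMN formulation; the module name, noise scale, and placement after the BatchNorm layer are all assumptions for the example.

```python
import torch
import torch.nn as nn

class MultiplicativeNoise(nn.Module):
    """Dropout-like multiplicative Gaussian noise (illustrative only).

    At training time, each activation is scaled by an independent
    factor u ~ N(1, sigma^2); at evaluation time, inputs pass
    through unchanged, so the expected output matches training.
    """
    def __init__(self, sigma: float = 0.5):
        super().__init__()
        self.sigma = sigma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training or self.sigma == 0.0:
            return x
        # Sample a noise factor per activation and scale the input.
        noise = 1.0 + self.sigma * torch.randn_like(x)
        return x * noise

# Hypothetical placement in a batch-normalized conv block, the
# setting in which the paper studies the feature correlation effect:
block = nn.Sequential(
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),
    MultiplicativeNoise(sigma=0.5),  # assumed noise scale
    nn.ReLU(),
)
```

In this baseline, the noise encourages features to become correlated, since redundant features average out the noise; NCMN's contribution, per the abstract, is to use batch normalization itself to remove that correlation effect.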

Author Information

Zijun Zhang (University of Calgary)
Yining Zhang (University of Calgary)
Zongpeng Li (Wuhan University)
