Backpropagation-Friendly Eigendecomposition
Wei Wang · Zheng Dang · Yinlin Hu · Pascal Fua · Mathieu Salzmann

Tue Dec 10 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #43

Eigendecomposition (ED) is widely used in deep networks. However, backpropagating through its results tends to be numerically unstable, whether ED is computed directly or approximated with the Power Iteration (PI) method, particularly when dealing with large matrices. While this can be mitigated by partitioning the data into small, arbitrary groups, doing so has no theoretical basis and makes it impossible to exploit the full power of ED. In this paper, we introduce a numerically stable and differentiable approach to leveraging eigenvectors in deep networks. It can handle large matrices without requiring them to be split. We demonstrate the improved robustness of our approach over standard ED and PI for ZCA whitening, an alternative to batch normalization, and for PCA denoising, which we introduce as a new normalization strategy for deep networks, aiming to further denoise the network's features.
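To make the role of ED in ZCA whitening concrete, the following is a minimal NumPy sketch of the forward computation only. It is not the authors' stable, differentiable formulation; it simply shows how the eigendecomposition of the covariance matrix yields the whitening transform that the paper backpropagates through. The function name `zca_whiten` and the regularizer `eps` are illustrative choices, not from the paper.

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    """ZCA-whiten the rows of X (n_samples x n_features) via eigendecomposition.

    Forward pass only; differentiating through np.linalg.eigh is where the
    numerical instability discussed in the paper arises.
    """
    Xc = X - X.mean(axis=0)                 # center each feature
    cov = Xc.T @ Xc / (Xc.shape[0] - 1)     # sample covariance matrix
    # Eigendecomposition of the symmetric covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(cov)
    # ZCA transform: U diag(1/sqrt(lambda + eps)) U^T, where eps guards
    # against near-zero eigenvalues.
    W = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
    return Xc @ W

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5)) @ rng.normal(size=(5, 5))  # correlated features
Xw = zca_whiten(X)
# The whitened features have (approximately) identity covariance.
```

Unlike PCA whitening, the ZCA transform multiplies back by the eigenvector basis (`eigvecs @ ... @ eigvecs.T`), so the whitened output stays maximally close to the original feature space, which is why it is attractive as a batch-normalization alternative.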

Author Information

Wei Wang (EPFL)
Zheng Dang (Xi'an Jiaotong University)
Yinlin Hu (EPFL)
Pascal Fua (EPFL, Switzerland)
Mathieu Salzmann (EPFL)