
On the Sparsity of Image Super-resolution Network
Chenyu Dong · Hailong Ma · Jinjin Gu · Ruofan Zhang · Jieming Li · Chun Yuan
Event URL: https://openreview.net/forum?id=n1aYPBNibBQ

The over-parameterization of neural networks has long attracted wide attention. It offers the opportunity to find, within an over-parameterized network, sub-networks that improve parameter efficiency. In this study, we use EDSR as the backbone network to explore parameter efficiency in super-resolution (SR) networks through the lens of sparsity. Specifically, we search for sparse sub-networks at two granularities, weight and convolution kernel, using a variety of methods, and analyze the relationship between the structure and performance of the sub-networks. (1) At weight granularity, we observe the ``Lottery Ticket Hypothesis'' from a new perspective in the regression task of SR. (2) At convolution kernel granularity, we apply several methods to explore how different sparse sub-networks affect network performance and find that, under certain rules, the performance of different sub-networks rarely depends on their structures. (3) We propose a very convenient width-sparsity method at convolution kernel granularity, which can improve the parameter utilization efficiency of most SR networks.
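The abstract contrasts sparsity at weight granularity with sparsity at convolution kernel granularity. As a minimal sketch of the latter (not the paper's actual search procedure, whose details are not given here), the snippet below zeroes out entire 2D kernels of a conv weight tensor by L1-norm magnitude; the function name and the magnitude criterion are illustrative assumptions.

```python
import numpy as np

def prune_kernels(weight, sparsity):
    """Zero out the lowest L1-norm 2D kernels of a conv weight tensor.

    weight:   array of shape (out_ch, in_ch, kh, kw)
    sparsity: fraction of kernels to remove, in [0, 1]
    Returns (pruned_weight, mask), where mask has shape (out_ch, in_ch)
    and is False for pruned kernels.
    NOTE: magnitude-based selection is an illustrative choice, not
    necessarily the criterion used in the paper.
    """
    out_ch, in_ch, kh, kw = weight.shape
    norms = np.abs(weight).sum(axis=(2, 3))          # per-kernel L1 norm
    n_prune = int(sparsity * out_ch * in_ch)
    prune_idx = np.argsort(norms, axis=None)[:n_prune]
    mask = np.ones(out_ch * in_ch, dtype=bool)
    mask[prune_idx] = False
    mask = mask.reshape(out_ch, in_ch)
    # Broadcast the (out_ch, in_ch) mask over each kh x kw kernel.
    return weight * mask[:, :, None, None], mask
```

Weight-granularity pruning would instead threshold individual scalar entries of `weight`; kernel granularity removes whole 2D kernels at once, which maps more directly onto structured speedups.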

Author Information

Chenyu Dong (Electronic Engineering, Tsinghua University)
Hailong Ma (Tsinghua University)
Jinjin Gu (University of Sydney)
Ruofan Zhang
Jieming Li
Chun Yuan (Tsinghua University)
