

Poster

TETRIS: TilE-matching the TRemendous Irregular Sparsity

Yu Ji · Ling Liang · Lei Deng · Youyang Zhang · Youhui Zhang · Yuan Xie

Room 210 #68

Keywords: [ Efficient Inference Methods ] [ Optimization for Deep Networks ]


Abstract:

Compressing neural networks by pruning weights with small magnitudes can significantly reduce computation and storage costs. Although pruning makes the model smaller, the resulting irregularity makes it difficult to obtain practical speedup on modern computing platforms such as CPUs and GPUs. Structural pruning has therefore attracted considerable research interest as a way to make sparsity hardware-friendly. Increasing the sparsity granularity leads to better hardware utilization, but it sacrifices achievable sparsity in order to maintain accuracy.

In this work, we propose a novel method, TETRIS, that achieves both better hardware utilization and higher sparsity. Just as in a tile-matching game, we cluster the irregularly distributed small-valued weights into structured groups by reordering the input/output dimensions, and then prune them structurally. Results show that TETRIS achieves sparsity comparable to irregular element-wise pruning with negligible accuracy loss. The experiments also show near-ideal speedup, proportional to the sparsity, on GPU platforms. Our proposed method offers a new approach to algorithm-architecture co-optimization for the accuracy-efficiency trade-off.
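The reorder-then-prune idea can be illustrated with a toy NumPy sketch. Note this is only a simplified illustration, not the paper's actual algorithm: a greedy norm-based sort of rows and columns stands in for the clustering/reordering step, and all function names and the block-scoring heuristic are hypothetical.

```python
import numpy as np

def tetris_sketch(W, block=4, sparsity=0.5):
    """Toy reorder-then-block-prune sketch (illustrative, not the paper's method).

    Permutes rows/columns so low-magnitude weights cluster together,
    then zeroes whole block x block tiles with the smallest L1 norms.
    """
    # Reorder rows and columns by ascending L1 norm so that small
    # weights tend to collect in the same corner of the matrix.
    row_perm = np.argsort(np.abs(W).sum(axis=1))
    col_perm = np.argsort(np.abs(W).sum(axis=0))
    Wp = W[row_perm][:, col_perm].copy()

    # View the matrix as a grid of block x block tiles and score each
    # tile by its L1 norm. tiles[i, a, j, b] == Wp[i*block+a, j*block+b].
    n, m = Wp.shape
    nb, mb = n // block, m // block
    tiles = Wp[:nb * block, :mb * block].reshape(nb, block, mb, block).copy()
    scores = np.abs(tiles).sum(axis=(1, 3))

    # Zero out the lowest-scoring tiles until the target fraction of
    # tiles has been pruned (structured sparsity).
    k = int(sparsity * scores.size)
    thresh = np.partition(scores.ravel(), k)[k]
    tiles *= (scores >= thresh)[:, None, :, None]

    Wp[:nb * block, :mb * block] = tiles.reshape(nb * block, mb * block)
    return Wp, row_perm, col_perm
```

Because pruning removes entire tiles of the reordered matrix, the surviving weights form dense blocks that map efficiently onto GPU-style hardware, while the permutations (which must be undone, or folded into adjacent layers, at inference time) let the method reach sparsity levels close to element-wise pruning.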
