

Poster

Activation Map Compression through Tensor Decomposition for Deep Learning

Le-Trung Nguyen · Aël Quélennec · Enzo Tartaglione · Samuel Tardieu · Van-Tam Nguyen

Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

The Internet of Things and Deep Learning are two rapidly growing industrial fields, and there is a strong push to unify them into a common framework known as Edge AI. While on-device inference is a well-explored topic in recent research, backpropagation remains an open challenge due to its computational and memory costs, which are prohibitive given the extreme resource constraints of embedded devices. Drawing on tensor decomposition research, we tackle the major memory bottleneck of backpropagation: the storage of activation maps. We investigate and compare the effects of compressing activations with Singular Value Decomposition (SVD) and its tensor variant, Higher-Order Singular Value Decomposition (HOSVD). Applying low-order decomposition yields considerable memory savings while preserving the features essential for learning, and also provides theoretical guarantees on convergence. Experimental results on mainstream architectures and tasks demonstrate Pareto superiority over other state-of-the-art solutions in the trade-off between generalization and memory footprint.
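As a rough illustration of the activation-compression idea (a minimal sketch, not the authors' implementation: the PyTorch setting, function names, rank choice, and mode-0 unfolding below are assumptions), one can store a truncated SVD of an activation map's unfolding instead of the full tensor, and rebuild an approximation only when the backward pass needs it. The HOSVD variant generalizes this by truncating along each tensor mode.

import torch

def compress_activation(act: torch.Tensor, rank: int):
    # Store a rank-`rank` truncated SVD of the (N, C*H*W) unfolding of a
    # 4D activation map instead of the full tensor (illustrative choice).
    n, c, h, w = act.shape
    mat = act.reshape(n, c * h * w)
    u, s, vh = torch.linalg.svd(mat, full_matrices=False)
    return u[:, :rank], s[:rank], vh[:rank, :], (n, c, h, w)

def decompress_activation(u, s, vh, shape):
    # Rebuild an approximation of the activation map from the stored factors.
    return ((u * s) @ vh).reshape(shape)

if __name__ == "__main__":
    act = torch.randn(32, 64, 28, 28)   # hypothetical activation map
    u, s, vh, shape = compress_activation(act, rank=8)
    approx = decompress_activation(u, s, vh, shape)
    stored = u.numel() + s.numel() + vh.numel()
    err = ((approx - act).norm() / act.norm()).item()
    # On random data the truncation error is large; the premise of this line
    # of work is that real activation maps are far more compressible.
    print(f"stored elements: {stored} vs. full map: {act.numel()}")
    print(f"relative reconstruction error: {err:.3f}")

With these hypothetical shapes, the stored factors occupy roughly a quarter of the memory of the full activation map; the paper studies how such truncation trades memory footprint against generalization and convergence.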
