
CL-LSG: Continual Learning via Learnable Sparse Growth
Li Yang · Sen Lin · Junshan Zhang · Deliang Fan
Event URL: https://openreview.net/forum?id=xdAVqcCCBg

Continual learning (CL) aims to learn new tasks sequentially and transfer knowledge from old tasks to new ones without forgetting the old ones, a failure mode well known as catastrophic forgetting. While recent structure-based methods can alleviate forgetting, they require a complex learning process that gradually grows and prunes a full-size network for each task, which is inefficient. To address this problem and enable efficient network expansion for new tasks, we develop, to the best of our knowledge, the first learnable sparse growth (LSG) method, which explicitly optimizes model growth so that only important and necessary channels are selected for growing. Building on LSG, we then propose CL-LSG, a novel end-to-end CL framework that grows the model dynamically and sparsely for each new task. Unlike all previous structure-based CL methods, which start from a full-size network and then prune it (i.e., two steps), our framework starts from a compact seed network of much smaller size and grows it to the necessary model size for each task (i.e., one step), eliminating the additional pruning required by previous structure-based growing methods.
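The core idea of learnable sparse growth, selecting which candidate channels to add via learnable gate scores, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the sigmoid gating, and the fixed threshold are all assumptions made here for clarity.

```python
import numpy as np

def select_growth_channels(gate_scores, tau=0.5):
    """Illustrative channel selection for sparse growth.

    Each candidate channel has a learnable scalar gate score (assumed
    here; the paper's exact parameterization may differ). A channel is
    grown only if its sigmoid-activated gate exceeds the threshold tau,
    so training the scores (e.g., with a sparsity penalty) controls how
    many channels are actually added per task.
    """
    probs = 1.0 / (1.0 + np.exp(-np.asarray(gate_scores, dtype=float)))
    return np.flatnonzero(probs > tau)

# Example: 6 candidate channels; only those whose gate passes the
# threshold (sigmoid > 0.5, i.e., positive raw score) are grown.
scores = np.array([2.0, -1.5, 0.3, -0.2, 4.0, -3.0])
grown = select_growth_channels(scores)  # indices 0, 2, 4
```

In an end-to-end framework like the one described, such gates would be optimized jointly with the task loss, so the model size grows only as much as each task requires, rather than pruning down from a full-size network.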

Author Information

Li Yang (Arizona State University)
Sen Lin (Ohio State University, Columbus)
Junshan Zhang (University of California, Davis)
Deliang Fan (Arizona State University)
