This paper considers incremental few-shot learning, which requires a model to continually recognize new categories from only a few examples. Our study shows that existing methods severely suffer from catastrophic forgetting, a well-known problem in incremental learning, which is aggravated by the data scarcity and imbalance of the few-shot setting. Our analysis further suggests that to prevent catastrophic forgetting, action must be taken at the primitive stage -- the training of the base classes -- rather than in the later few-shot learning sessions. We therefore propose to search for flat local minima of the base training objective function and then fine-tune the model parameters within the flat region on new tasks. In this way, the model can efficiently learn new classes while preserving the old ones. Comprehensive experimental results demonstrate that our approach outperforms all prior state-of-the-art methods and comes very close to the approximate upper bound. The source code is available at https://github.com/moukamisama/F2M.
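The core idea of preferring flat minima, that is, parameter regions where the loss stays low under small perturbations, can be sketched with a toy flatness-seeking objective: average the loss over random parameter perturbations, so that sharp minima are penalized. This is an illustrative sketch only, not the paper's F2M implementation; the function `flat_loss`, the `noise_scale` parameter, and the quadratic toy losses are hypothetical names chosen here.

```python
import numpy as np

def flat_loss(loss_fn, params, noise_scale=0.01, n_samples=4, rng=None):
    """Flatness-seeking surrogate objective: average the loss over small
    uniform random perturbations of the parameters. Minimizing this value
    favors minima whose loss stays low in a neighborhood (flat regions)."""
    rng = np.random.default_rng(rng)
    losses = []
    for _ in range(n_samples):
        noise = rng.uniform(-noise_scale, noise_scale, size=params.shape)
        losses.append(loss_fn(params + noise))
    return float(np.mean(losses))

# Two toy quadratic bowls with the same minimizer (the origin) but
# different curvature: `sharp` rises steeply, `flat` rises gently.
sharp = lambda p: 100.0 * float(np.sum(p ** 2))
flat = lambda p: 1.0 * float(np.sum(p ** 2))

p = np.zeros(3)
# At the shared minimum, the perturbed (flatness-aware) loss is much
# higher for the sharp bowl, so the surrogate prefers the flat one.
assert flat_loss(sharp, p, rng=0) > flat_loss(flat, p, rng=0)
```

In this spirit, a model trained on the base classes with such an objective ends up in a region where nearby parameter settings also perform well, which is why later fine-tuning within that region on new classes can avoid destroying base-class performance.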
Author Information
Guangyuan Shi (The Hong Kong Polytechnic University)
Jiaxin Chen (The Hong Kong Polytechnic University)
Wenlong Zhang (The Hong Kong Polytechnic University)
Li-Ming Zhan (The Hong Kong Polytechnic University)
Xiao-Ming Wu (The Hong Kong Polytechnic University)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Spotlight: Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima
More from the Same Authors
- 2022: Emergent collective intelligence from massive-agent cooperation and competition
  Hanmo Chen · Stone Tao · Jiaxin Chen · Weihan Shen · Xihui Li · Chenghui Yu · Sikai Cheng · Xiaolong Zhu · Xiu Li
- 2023 Poster: PromptRestorer: A Prompting Image Restoration Method with Degradation Perception
  Cong Wang · Jinshan Pan · Wei Wang · Jiangxin Dong · Mengzhu Wang · Yakun Ju · Junyang Chen · Xiao-Ming Wu
- 2023 Poster: Real-World Image Super-Resolution as Multi-Task Learning
  Wenlong Zhang · Xiaohui Li · Guangyuan Shi · Xiangyu Chen · Yu Qiao · Xiaoyun Zhang · Xiao-Ming Wu · Chao Dong
- 2023 Competition: Lux AI Challenge Season 2 NeurIPS Edition
  · Qimai Li · Yuhao Jiang · Jiaxin Chen · Xiaolong Zhu · Bovard Doerschuk-Tiberi · Isabelle Pan · Addison Howard
- 2020 Poster: A Closer Look at the Training Strategy for Modern Meta-Learning
  Jiaxin Chen · Xiao-Ming Wu · Yanke Li · Qimai Li · Li-Ming Zhan · Fu-lai Chung