

Poster in Workshop: Adaptive Foundation Models: Evolving AI for Personalized and Efficient Learning

Dynamically Managing a Prompt Pool via Self-Enhancement in Continual Learning

Hayun Lee · Kiseong Hong · Hwanhee Lee · Sungho Suh · Eunwoo Kim


Abstract:

Prompt-based continual learning methods have emerged to address catastrophic forgetting by leveraging large-scale foundation models. These methods keep pretrained models frozen and tune only small sets of parameters, called prompts, to learn tasks sequentially. However, when a new task arrives, the key-query matching mechanism in prompt-based methods selects the most relevant prompt without adequately considering whether that prompt is actually suitable for learning the task. To address this, we propose CoEn (Continual Enhanced prompt pool), which dynamically manages the prompt pool each time a new task is introduced. Our goal is to replace static management of the prompt pool with a dynamic approach, enabling greater flexibility in adapting to new tasks and reducing the risk of catastrophic forgetting. Specifically, CoEn includes a new self-enhancement mechanism that assesses whether the prompts in the pool can positively transfer knowledge to a new task and selectively strengthens those prompts. We evaluate the proposed method on image classification benchmarks for class-incremental learning. Experimental results show that it outperforms existing prompt-based methods by an average margin of 3.8% across all scenarios.
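The abstract does not include implementation details, so the following is only a minimal PyTorch sketch of the general idea it describes: key-query prompt selection plus a transfer-aware gate that updates only the prompts estimated to help a newly arrived task. The toy linear encoder, the mean-pooled prompt conditioning, the loss-drop heuristic, and all names (select, transfer_gain, enhance_step) are illustrative assumptions, not the authors' actual CoEn implementation.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
DIM, POOL, PLEN, TOPK, CLASSES = 32, 10, 4, 3, 5

# Stand-in for a frozen pretrained encoder (kept fixed, as in prompt-based CL).
encoder = torch.nn.Linear(DIM, DIM)
for p in encoder.parameters():
    p.requires_grad_(False)
head = torch.nn.Linear(DIM, CLASSES)  # task classification head

# The prompt pool: learnable keys for matching, learnable prompt parameters.
keys = torch.nn.Parameter(torch.randn(POOL, DIM))
prompts = torch.nn.Parameter(torch.randn(POOL, PLEN, DIM))

def select(query, top_k=TOPK):
    """Key-query matching: pick the prompts whose keys best match the query."""
    sim = F.cosine_similarity(query.unsqueeze(1), keys.unsqueeze(0), dim=-1)
    return sim.topk(top_k, dim=-1).indices  # (B, top_k)

def logits_with(prompt_idx, x):
    """Toy prompt conditioning: shift the frozen feature by the mean of the
    selected prompts (a simplification of prepending prompt tokens)."""
    feat = encoder(x)
    if prompt_idx is not None:
        feat = feat + prompts[prompt_idx].mean(dim=(1, 2))
    return head(feat)

@torch.no_grad()
def transfer_gain(x, y):
    """Per-prompt gain on the new task: loss drop when that prompt is used.
    A positive value is read as positive transfer (assumed heuristic)."""
    base = F.cross_entropy(logits_with(None, x), y)
    gains = []
    for j in range(POOL):
        idx = torch.full((x.size(0), 1), j, dtype=torch.long)
        gains.append((base - F.cross_entropy(logits_with(idx, x), y)).item())
    return torch.tensor(gains)

def enhance_step(x, y, lr=1e-2):
    """Selective strengthening: only prompts (and their keys) with positive
    estimated transfer receive gradient updates on the new task."""
    mask = transfer_gain(x, y) > 0
    opt = torch.optim.SGD([keys, prompts, *head.parameters()], lr=lr)
    opt.zero_grad()
    q = encoder(x)
    idx = select(q)
    # Pull selected keys toward the query so matching improves over time.
    pull = 1 - F.cosine_similarity(q.unsqueeze(1), keys[idx], dim=-1).mean()
    loss = F.cross_entropy(logits_with(idx, x), y) + 0.1 * pull
    loss.backward()
    prompts.grad[~mask] = 0  # freeze prompts judged not to transfer
    keys.grad[~mask] = 0
    opt.step()

# Usage on a toy batch standing in for data from a newly arrived task:
x, y = torch.randn(8, DIM), torch.randint(0, CLASSES, (8,))
enhance_step(x, y)
```

The gating mask is the dynamic-management element: rather than always tuning whatever key-query matching returns, the pool is reassessed each time a new task arrives, and only prompts estimated to transfer positively are strengthened.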
