Poster in Workshop: OPT 2023: Optimization for Machine Learning

(Un)certainty selection methods for Active Learning on Label Distributions

James Spann · Christopher Homan


Abstract:

Some supervised learning problems require predicting a probability distribution over possible answers rather than a single (set of) answer(s). In such cases, a major scaling issue is the number of labels needed, since compared to their single- or multi-label counterparts, distributional labels are typically (1) harder to learn and (2) more expensive to obtain for training and testing. In this paper, we explore the use of active learning to alleviate this bottleneck. We progressively train a label distribution learning model by selectively labeling data, achieving the minimum error rate with fifty percent fewer data items than non-active-learning strategies. Our experiments show that certainty-based query strategies outperform uncertainty-based ones on the label distribution learning problems we study.
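The contrast between certainty- and uncertainty-based query selection can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each unlabeled item already has a predicted label distribution, and uses Shannon entropy as the (un)certainty score, which is one common choice among several.

```python
import numpy as np

def entropy(dists, eps=1e-12):
    """Shannon entropy of each predicted label distribution (rows)."""
    dists = np.clip(np.asarray(dists, dtype=float), eps, 1.0)
    return -np.sum(dists * np.log(dists), axis=-1)

def select_queries(pred_dists, k, strategy="certainty"):
    """Pick k unlabeled items to send to annotators next.

    'uncertainty' queries the highest-entropy (least confident)
    predictions; 'certainty' queries the lowest-entropy ones,
    the family of strategies the abstract reports as stronger
    for label distribution learning.
    """
    order = np.argsort(entropy(pred_dists))  # ascending entropy
    if strategy == "uncertainty":
        order = order[::-1]
    return order[:k].tolist()

# Hypothetical predicted distributions over 3 labels for 3 items.
preds = [[0.9, 0.05, 0.05],   # confident prediction
         [0.4, 0.35, 0.25],   # near-uniform, uncertain
         [0.6, 0.3, 0.1]]
print(select_queries(preds, 1, "certainty"))    # -> [0]
print(select_queries(preds, 1, "uncertainty"))  # -> [1]
```

In each active-learning round, the selected items would be labeled and added to the training set before retraining the model and re-scoring the remaining pool.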
