Learning from Complementary Labels
Takashi Ishida · Gang Niu · Weihua Hu · Masashi Sugiyama

Tue Dec 06:30 PM -- 10:30 PM PST @ Pacific Ballroom #14

Collecting labeled data is costly and thus a critical bottleneck in real-world classification tasks. To mitigate this problem, we propose a novel setting, namely learning from complementary labels for multi-class classification. A complementary label specifies a class that a pattern does not belong to. Collecting complementary labels would be less laborious than collecting ordinary labels, since users do not have to carefully choose the correct class from a long list of candidate classes. However, complementary labels are less informative than ordinary labels, and thus a suitable approach is needed to learn from them effectively. In this paper, we show that an unbiased estimator of the classification risk can be obtained only from complementarily labeled data, if a loss function satisfies a particular symmetric condition. We derive estimation error bounds for the proposed method and prove that the optimal parametric convergence rate is achieved. We further show that learning from complementary labels can be easily combined with learning from ordinary labels (i.e., ordinary supervised learning), providing a highly practical implementation of the proposed method. Finally, we experimentally demonstrate the usefulness of the proposed methods.
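To make the setting concrete, the sketch below generates complementary labels (a class each pattern does *not* belong to, drawn uniformly from the non-true classes) and trains a linear softmax classifier on them. It uses a simple illustrative surrogate, minimizing -log(1 - p_ybar), rather than the paper's exact unbiased risk estimator; the toy data, model, and hyperparameters are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 3, 300

# Toy data: three well-separated 2-D Gaussian clusters, one per class.
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
y = rng.integers(0, K, size=n)                      # ordinary (true) labels
X = centers[y] + rng.normal(scale=0.5, size=(n, 2))

# Complementary labels: uniformly pick a class the pattern does NOT belong to.
ybar = (y + rng.integers(1, K, size=n)) % K

# Linear softmax model; training pushes probability mass away from the
# complementary class by minimizing the surrogate loss -log(1 - p_ybar).
W = np.zeros((2, K))
b = np.zeros(K)
lr = 0.1
for _ in range(500):
    z = X @ W + b
    z -= z.max(axis=1, keepdims=True)               # numerical stability
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)
    p_bar = p[np.arange(n), ybar]
    # Gradient of -log(1 - p_ybar) w.r.t. the logits:
    onehot = np.eye(K)[ybar]
    g = p_bar[:, None] * (onehot - p) / (1.0 - p_bar[:, None])
    W -= lr * X.T @ g / n
    b -= lr * g.mean(axis=0)

# Evaluate against the ordinary labels the learner never saw during training.
pred = (X @ W + b).argmax(axis=1)
acc = (pred == y).mean()
print(f"training accuracy vs. ordinary labels: {acc:.2f}")
```

Even though the learner only ever sees classes to avoid, averaging over complementary labels concentrates probability on the true class, so accuracy well above chance (1/K) is recoverable on separable data.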

Author Information

Takashi Ishida (The University of Tokyo, RIKEN, SMAM)
Gang Niu (RIKEN)
Weihua Hu (The University of Tokyo)
Masashi Sugiyama (RIKEN / University of Tokyo)