
Positive-Unlabeled Learning with Non-Negative Risk Estimator
Ryuichi Kiryo · Gang Niu · Marthinus C du Plessis · Masashi Sugiyama

Tue Dec 05 10:55 AM -- 11:10 AM (PST) @ Hall A

From only positive (P) and unlabeled (U) data, a binary classifier can be trained with PU learning, in which the state of the art is unbiased PU learning. However, if the model being trained is very flexible, the empirical risk on the training data will go negative, and we will suffer from serious overfitting. In this paper, we propose a non-negative risk estimator for PU learning. When it is minimized, it is more robust against overfitting, and thus we are able to train very flexible models given limited P data. Moreover, we analyze the bias, consistency, and mean-squared-error reduction of the proposed risk estimator, as well as the estimation error of the corresponding risk minimizer. Experiments show that the proposed risk estimator successfully fixes the overfitting problem of its unbiased counterparts.
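The idea in the abstract can be sketched numerically: the unbiased PU estimator expresses the negative-class risk as (risk of labeling U negative) minus pi_p times (risk of labeling P negative), a difference that can dip below zero with flexible models; the non-negative estimator clips that difference at zero. Below is a minimal NumPy sketch under assumed choices (a sigmoid surrogate loss, raw classifier scores as input, a known class prior `pi_p`); the function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def sigmoid_loss(z):
    # Surrogate loss l(z) = 1 / (1 + exp(z)): near 0 for large positive
    # margins z, near 1 for large negative margins.
    return 1.0 / (1.0 + np.exp(z))

def nn_pu_risk(scores_p, scores_u, pi_p):
    """Non-negative PU risk estimate (illustrative sketch).

    scores_p : classifier outputs g(x) on positive (P) data
    scores_u : classifier outputs g(x) on unlabeled (U) data
    pi_p     : class prior of the positive class (assumed known)
    """
    # Risk of labeling P data positive: mean of l(g(x)) over P.
    r_p_pos = sigmoid_loss(scores_p).mean()
    # Risk of labeling P data negative: mean of l(-g(x)) over P.
    r_p_neg = sigmoid_loss(-scores_p).mean()
    # Risk of labeling U data negative: mean of l(-g(x)) over U.
    r_u_neg = sigmoid_loss(-scores_u).mean()
    # Unbiased PU learning estimates the negative-class risk as
    # r_u_neg - pi_p * r_p_neg, which can go negative under overfitting;
    # the non-negative estimator clips this difference at zero.
    return pi_p * r_p_pos + max(0.0, r_u_neg - pi_p * r_p_neg)
```

For example, if the classifier scores all P data strongly positive and all U data strongly negative, the unclipped difference becomes negative and the clipping activates, keeping the risk estimate non-negative.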

Author Information

Ryuichi Kiryo (UTokyo/RIKEN)
Gang Niu (RIKEN)
Marthinus C du Plessis (The University of Tokyo)
Masashi Sugiyama (RIKEN / University of Tokyo)
