Poster
The Sample Complexity of Semi-Supervised Learning with Nonparametric Mixture Models
Chen Dan · Liu Leqi · Bryon Aragam · Pradeep Ravikumar · Eric Xing

Thu Dec 6th 05:00 -- 07:00 PM @ Room 210 #96
We study the sample complexity of semi-supervised learning (SSL) and introduce new assumptions based on the mismatch between a mixture model learned from unlabeled data and the true mixture model induced by the (unknown) class conditional distributions. Under these assumptions, we establish an $\Omega(K\log K)$ labeled sample complexity bound without imposing parametric assumptions, where $K$ is the number of classes. Our results suggest that even in nonparametric settings it is possible to learn a near-optimal classifier using only a few labeled samples. Unlike previous theoretical work which focuses on binary classification, we consider general multiclass classification ($K>2$), which requires solving a difficult permutation learning problem. This permutation defines a classifier whose classification error is controlled by the Wasserstein distance between mixing measures, and we provide finite-sample results characterizing the behaviour of the excess risk of this classifier. Finally, we describe three algorithms for computing these estimators based on a connection to bipartite graph matching, and perform experiments to illustrate the superiority of the MLE over the majority vote estimator.
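The permutation-learning step described above can be sketched in a toy form. Below, a hypothetical count matrix `counts[i][j]` records how many labeled samples of true class `j` fall into mixture component `i`; the majority-vote estimator labels each component independently, while a maximum-agreement permutation (found here by brute-force search, which for small `K` is equivalent to a maximum-weight bipartite matching) is a simplified stand-in for the MLE discussed in the abstract. All names and the example data are illustrative, not from the paper.

```python
import itertools

def majority_vote(counts, K):
    # Each mixture component independently takes its most frequent
    # observed label; the result need not be a valid permutation,
    # so two components can collapse onto the same class.
    return [max(range(K), key=lambda j: counts[i][j]) for i in range(K)]

def matching_estimator(counts, K):
    # Search all permutations of components -> classes and keep the
    # one maximizing total label agreement. This exhaustive search
    # solves the same maximum-weight bipartite matching problem that
    # the Hungarian algorithm would solve in O(K^3).
    best = max(itertools.permutations(range(K)),
               key=lambda perm: sum(counts[i][perm[i]] for i in range(K)))
    return list(best)

# Toy example with K = 3 classes: components 0 and 1 are confusable.
counts = [
    [3, 2, 0],
    [3, 1, 0],
    [0, 0, 4],
]
print(majority_vote(counts, 3))       # components 0 and 1 collapse onto class 0
print(matching_estimator(counts, 3))  # the matching recovers a valid permutation
```

Here majority vote assigns both of the first two components to class 0, whereas the matching-based estimator is forced to output a permutation, illustrating why the abstract's experiments favor the MLE over majority vote.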

Author Information

Chen Dan (Carnegie Mellon University)
Liu Leqi (Carnegie Mellon University)
Bryon Aragam (Carnegie Mellon University)
Pradeep Ravikumar (Carnegie Mellon University)
Eric Xing (Petuum Inc. / Carnegie Mellon University)
