Little work has been done to directly combine the outputs of multiple supervised and unsupervised models, yet doing so can increase both the accuracy and the applicability of ensemble methods. First, we can boost the diversity of a classification ensemble by incorporating multiple clustering outputs, each of which provides grouping constraints on the joint label predictions of a set of related objects. Second, ensembles of supervised models alone are of limited use in applications that have access only to meta-level model outputs rather than to the raw data. In this paper, we compute a consolidated classification solution for a set of objects by maximizing the consensus among both supervised predictions and unsupervised grouping constraints. We seek a globally optimal label assignment for the target objects, which differs from the results of traditional majority voting and model combination approaches. We cast the problem as an optimization over a bipartite graph, where the objective function favors smoothness of the conditional probability estimates over the graph and penalizes deviation from the initial labeling given by the supervised models. We solve the problem through iterative propagation of conditional probability estimates among neighboring nodes, and interpret the method both as a constrained embedding in a transformed space and as a ranking on the graph. Experimental results on three real applications demonstrate the benefits of the proposed method over existing alternatives.
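The iterative propagation scheme the abstract describes can be sketched roughly as follows. This is a minimal illustration under simplifying assumptions, not the paper's exact objective: the function name `consensus_maximization`, the specific update rules, and the `alpha` penalty weight are all hypothetical. Objects and groups (predicted classes from classifiers, clusters from clusterings) form the two sides of a bipartite graph, and class-probability estimates are alternately averaged across edges, with group nodes that come from supervised models pulled toward their initial labels.

```python
import numpy as np

def consensus_maximization(A, Y, alpha=2.0, iters=50):
    """Iteratively propagate class-probability estimates on a bipartite
    object-group graph (a simplified sketch, not the paper's formulation).

    A     : (n_objects, n_groups) 0/1 adjacency; A[i, j] = 1 if object i
            belongs to group j (a predicted class or a cluster).
    Y     : (n_groups, n_classes) initial labels; one-hot rows for groups
            produced by supervised models, all-zero rows for clusters,
            which carry no initial label.
    alpha : penalty weight for deviating from the supervised labels.
    """
    n_objects, n_classes = A.shape[0], Y.shape[1]
    # Start every object from the uniform distribution over classes.
    U = np.full((n_objects, n_classes), 1.0 / n_classes)
    has_label = (Y.sum(axis=1) > 0).astype(float)  # supervised group nodes

    for _ in range(iters):
        # Group update: average the estimates of member objects, pulled
        # toward the initial labels for supervised group nodes.
        deg_g = A.sum(axis=0)[:, None]
        Q = (A.T @ U + alpha * Y) / (deg_g + alpha * has_label[:, None])
        # Object update: average over the groups the object belongs to.
        deg_o = A.sum(axis=1)[:, None]
        U = (A @ Q) / deg_o
    return U
```

Each row of `U` stays a valid probability distribution because both updates are convex combinations of distributions; objects placed in the same cluster are driven toward the same label even when the supervised models disagree on them.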
Author Information
Jing Gao (University of Illinois at Urbana-Champaign)
Feng Liang (University of Illinois at Urbana-Champaign)
Wei Fan (Huawei Noah's Ark Lab, Hong Kong)
Yizhou Sun
Jiawei Han (University of Illinois at Urbana-Champaign)
More from the Same Authors
- 2022: Shift-Robust Node Classification via Graph Clustering Co-training
  Qi Zhu · Chao Zhang · Chanyoung Park · Carl Yang · Jiawei Han
- 2022 Poster: Generating Training Data with Language Models: Towards Zero-Shot Language Understanding
  Yu Meng · Jiaxin Huang · Yu Zhang · Jiawei Han
- 2021 Poster: Universal Graph Convolutional Networks
  Di Jin · Zhizhi Yu · Cuiying Huo · Rui Wang · Xiao Wang · Dongxiao He · Jiawei Han
- 2021 Poster: Shift-Robust GNNs: Overcoming the Limitations of Localized Graph Training Data
  Qi Zhu · Natalia Ponomareva · Jiawei Han · Bryan Perozzi
- 2021 Poster: Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization
  Qi Zhu · Carl Yang · Yidan Xu · Haonan Wang · Chao Zhang · Jiawei Han
- 2021 Poster: COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
  Yu Meng · Chenyan Xiong · Payal Bajaj · Saurabh Tiwary · Paul Bennett · Jiawei Han · Xia Song
- 2019 Poster: Bayesian Joint Estimation of Multiple Graphical Models
  Lingrui Gan · Xinming Yang · Naveen Narisetty · Feng Liang
- 2014 Poster: PAC-Bayesian AUC classification and scoring
  James Ridgway · Pierre Alquier · Nicolas Chopin · Feng Liang
- 2014 Poster: Generalized Higher-Order Orthogonal Iteration for Tensor Decomposition and Completion
  Yuanyuan Liu · Fanhua Shang · Wei Fan · James Cheng · Hong Cheng
- 2014 Poster: Robust Tensor Decomposition with Gross Corruption
  Quanquan Gu · Huan Gui · Jiawei Han
- 2014 Poster: On a Theory of Nonparametric Pairwise Similarity for Clustering: Connecting Clustering to Classification
  Yingzhen Yang · Feng Liang · Shuicheng Yan · Zhangyang Wang · Thomas S Huang
- 2012 Poster: Selective Labeling via Error Bound Minimization
  Quanquan Gu · Tong Zhang · Chris Ding · Jiawei Han