
Efficient Convex Relaxation for Transductive Support Vector Machine
Zenglin Xu · Rong Jin · Jianke Zhu · Irwin King · Michael Lyu

Tue Dec 04 10:30 AM -- 10:40 AM (PST)

We consider the problem of Support Vector Machine transduction, which involves a combinatorial optimization problem whose computational complexity is exponential in the number of unlabeled examples. Although several studies have been devoted to Transductive SVM, they suffer either from high computational complexity or from locally optimal solutions. To address this problem, we propose solving Transductive SVM via a convex relaxation that converts the NP-hard problem into a semi-definite program. Compared with other SDP relaxations for Transductive SVM, the proposed algorithm is computationally more efficient, reducing the number of free parameters from O(n^2) to O(n), where n is the number of examples. An empirical study on several benchmark data sets shows the promising performance of the proposed algorithm in comparison with other state-of-the-art implementations of Transductive SVM.
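To make the combinatorial problem in the abstract concrete, the sketch below brute-forces transduction on a toy 2-D dataset: it enumerates all 2^u labelings of the unlabeled points, trains a soft-margin linear SVM for each labeling, and keeps the labeling with the smallest primal objective. This is purely illustrative of the exponential search the paper's convex relaxation avoids; the subgradient SVM trainer, the toy data, and all parameter values are assumptions for this sketch, not the authors' method.

```python
import numpy as np
from itertools import product

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=500):
    # Subgradient descent on the primal soft-margin objective:
    #   0.5*||w||^2 + C * sum_i max(0, 1 - y_i (w.x_i + b))
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # points inside the margin contribute hinge subgradient
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    obj = 0.5 * w @ w + C * np.maximum(0.0, 1 - y * (X @ w + b)).sum()
    return w, b, obj

def brute_force_tsvm(X_l, y_l, X_u, C=1.0):
    # Enumerate all 2^u labelings of the unlabeled points: the cost of this
    # loop is what makes exact TSVM transduction exponential in u.
    X = np.vstack([X_l, X_u])
    best_w, best_b, best_obj, best_labels = None, None, np.inf, None
    for labels in product([-1.0, 1.0], repeat=len(X_u)):
        y = np.concatenate([y_l, labels])
        w, b, obj = train_linear_svm(X, y, C)
        if obj < best_obj:
            best_w, best_b, best_obj, best_labels = w, b, obj, np.array(labels)
    return best_w, best_b, best_obj, best_labels

# Toy data: two labeled points per class plus two unlabeled points.
X_l = np.array([[-2.0, 0.0], [-1.5, 0.5], [2.0, 0.0], [1.5, -0.5]])
y_l = np.array([-1.0, -1.0, 1.0, 1.0])
X_u = np.array([[-1.0, 0.2], [1.2, 0.1]])

w, b, obj, y_u = brute_force_tsvm(X_l, y_l, X_u)
print("inferred labels for unlabeled points:", y_u)
```

The relaxation in the paper replaces this exponential enumeration with a single semi-definite program over O(n) free parameters, so the labeling is recovered from a convex solve rather than a search over all assignments.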

Author Information

Zenglin Xu (University of Electronic Science & Technology of China)
Rong Jin (Michigan State University (MSU))
Jianke Zhu (The Chinese University of Hong Kong)
Irwin King (Chinese University of Hong Kong)
Michael Lyu (CUHK)
