Efficient Convex Relaxation for Transductive Support Vector Machine

Zenglin Xu [1], Rong Jin [2], Jianke Zhu [1], Irwin King [1], and Michael R. Lyu [1]

[1] Department of Computer Science and Engineering, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong. {zlxu, jkzhu, king, lyu}@cse.cuhk.edu.hk
[2] Department of Computer Science and Engineering, Michigan State University, East Lansing, MI 48824. rongjin@cse.msu.edu

1. Introduction

We consider the problem of Support Vector Machine transduction, which involves a combinatorial optimization problem whose computational complexity is exponential in the number of unlabeled examples. Although several studies have been devoted to Transductive SVM (TSVM), they suffer either from (1) high computational complexity, or from (2) solutions that are only locally optimal.

Contributions

We propose an improved SDP relaxation algorithm for TSVM.
(1) Unlike the semi-definite relaxation of [Xu et al., 2005], which approximates TSVM by dropping the rank constraint, the proposed approach approximates TSVM through its dual problem, which provides a tighter approximation.
(2) The proposed algorithm involves fewer free parameters and therefore significantly improves efficiency by lowering the worst-case computational complexity.

2. Problem

Following [Lanckriet et al., 2004], the dual form of SVM can be formulated as a maximization over the dual variables, where K is the kernel matrix. TSVM then minimizes this dual objective over the unknown labels of the unlabeled examples, and the last constraint of the resulting formulation is the balance constraint, which avoids assigning all the data to one of the classes.

Replacing the rank-one label matrix with a positive semi-definite matrix variable and dropping the rank constraint yields the approximation of [Xu et al., 2005]: a semi-definite program over an n x n matrix variable, whose variable scale and worst-case time complexity are considerably larger than those of the relaxation proposed here. A toy sketch of this rank-dropping device is given below.
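To make the rank-dropping relaxation device concrete, the following is a minimal Python sketch (assuming the cvxpy package and an SDP-capable solver such as SCS are installed). It applies the same device, replacing the unknown rank-one label matrix y y^T by a positive semi-definite matrix M with unit diagonal and dropping the rank-one constraint, to a toy quadratic labeling objective; the coupling matrix W is an illustrative stand-in, not the paper's TSVM objective.

```python
# Toy sketch of an SDP relaxation obtained by dropping the rank-one constraint.
# Assumes cvxpy (with an SDP-capable solver such as SCS) is available.
import numpy as np
import cvxpy as cp

n = 8
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                      # symmetric coupling matrix (illustrative stand-in)

# Exact combinatorial problem: max over y in {-1, +1}^n of y^T W y.
# Relaxation: write y y^T as M, require M to be PSD with unit diagonal,
# and drop the constraint rank(M) = 1.
M = cp.Variable((n, n), symmetric=True)
constraints = [M >> 0, cp.diag(M) == 1]
prob = cp.Problem(cp.Maximize(cp.trace(W @ M)), constraints)
prob.solve(solver=cp.SCS)
print("relaxed objective value:", prob.value)

# A heuristic labeling can be recovered from the leading eigenvector of M.
vals, vecs = np.linalg.eigh(M.value)
y_hat = np.sign(vecs[:, -1])
print("rounded labels:", y_hat)
```

The matrix variable of size n x n in this pattern is what makes the earlier relaxation expensive as the number of unlabeled examples grows, which motivates the lower-dimensional relaxation proposed in this work.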
3. Efficient Relaxation for TSVM

To solve the above problem, we derive a theorem by computing the Lagrangian of the above non-convex problem. We then introduce a new variable z, define several auxiliary quantities, and, based on a result characterizing the resulting feasible set, rewrite the dual problem in terms of z (the explicit formulas appeared as equations on the original poster and are not reproduced in this transcript). Using the Schur complement, the rewritten problem becomes a semi-definite program (SDP); a small numerical check of the Schur complement fact is given at the end of this transcript. Whereas the relaxation of [Xu et al., 2005] introduces an n x n matrix variable, the proposed SDP involves far fewer free parameters, which is the source of its lower worst-case time complexity.

The decision function can be written in closed form: we first propagate the class labels from the labeled examples to the unlabeled ones, and then adjust the predicted labels by a correction factor.

4. Experimental Results and Conclusion

To evaluate the performance of the proposed method, we perform experiments on several data sets of different scales.

The following figure compares the training time of the traditional relaxation method with that of the proposed efficient convex relaxation method.

[Figure: computation time of the proposed convex relaxation approach for TSVM (CTSVM) and the traditional semi-definite relaxation approach for TSVM (RTSVM) versus the number of unlabeled examples. The Course data set is used, and the number of labeled examples is 20.]

The following table shows the classification performance of supervised SVM and of a variety of approximation methods for TSVM (the table itself is not reproduced in this transcript).

Notes: (1) Due to its high computational complexity, the traditional SDP-relaxation method is evaluated only on the small-size data sets, i.e., "IBM-s" and "Course-s". Its results are 68.57±22.73 and 64.03±7.65, respectively, which are worse than those of the proposed CTSVM method. (2) SVM-light fails to converge on Banana within one week.

Discussion

(1) The proposed convex approximation method provides a tighter approximation than the previous method: the prediction function implements the conjugate of the conjugate of the prediction function f(x), which is the convex envelope of f(x).
(2) A constraint of the original formulation is dropped in the relaxation.
(3) The solution of the proposed algorithm is related to that of harmonic functions; a small numerical sketch of this connection follows.
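To make the harmonic-function connection concrete, the following is a minimal NumPy sketch of classical graph-based harmonic label propagation, in which labels spread from labeled to unlabeled examples through a similarity graph. It illustrates the general idea only and is not the paper's decision function; the RBF width gamma and the toy data are assumptions made for the example.

```python
# Minimal sketch of classical harmonic-function label propagation on a graph.
# Illustration of the general idea only, not the paper's decision function.
import numpy as np

rng = np.random.default_rng(0)
X_l = rng.standard_normal((4, 2)) + np.array([[2.0, 0.0]] * 2 + [[-2.0, 0.0]] * 2)
y_l = np.array([1.0, 1.0, -1.0, -1.0])          # labeled examples
X_u = rng.standard_normal((6, 2))                # unlabeled examples

X = np.vstack([X_l, X_u])
gamma = 0.5                                      # assumed RBF width for the toy graph
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-gamma * sq)                          # RBF similarity matrix
np.fill_diagonal(W, 0.0)

D = np.diag(W.sum(axis=1))
L = D - W                                        # graph Laplacian
l, u = len(y_l), len(X_u)
L_uu = L[l:, l:]
L_ul = L[l:, :l]

# Harmonic solution: labels propagate from the labeled to the unlabeled examples.
f_u = -np.linalg.solve(L_uu, L_ul @ y_l)
print("soft labels on unlabeled points:", np.round(f_u, 3))
print("hard labels:", np.sign(f_u))
```

Here the soft labels on the unlabeled points are obtained by propagating the known labels through the graph, which mirrors the "propagate, then adjust" description of the decision function given above.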

Conclusion

The experimental results demonstrate that the proposed algorithm is effective in decreasing the time complexity of convex relaxation for TSVM and in improving transductive classification accuracy.

Acknowledgments

The authors thank the anonymous reviewers and PC members for their valuable suggestions and comments, which helped improve this paper.
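Finally, a brief numerical illustration of the Schur complement step referenced in Section 3: for a block matrix [[A, B], [B^T, C]] with A positive definite, positive semi-definiteness of the whole block matrix is equivalent to positive semi-definiteness of the Schur complement C - B^T A^{-1} B. The NumPy check below uses random matrices that are illustrations only, not quantities from the paper.

```python
# Numerical check of the Schur complement fact used to obtain the SDP form.
# Random illustrative matrices; not quantities from the paper.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)); A = A @ A.T + 4 * np.eye(4)   # A positive definite
B = rng.standard_normal((4, 3))
S = rng.standard_normal((3, 3)); S = S @ S.T                   # S is PSD by construction
C = B.T @ np.linalg.solve(A, B) + S                            # so the Schur complement equals S

block = np.block([[A, B], [B.T, C]])
eig_block = np.linalg.eigvalsh(block)
eig_schur = np.linalg.eigvalsh(C - B.T @ np.linalg.solve(A, B))

print("min eigenvalue of the block matrix    :", eig_block.min())
print("min eigenvalue of the Schur complement:", eig_schur.min())
# Both minima are (numerically) non-negative, consistent with the equivalence.
```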