Multiclass boosting with repartitioning
Graduate student: Chen, Shao-Pei
Author: Ling Li (ICML)
Outline • Motivation • Objective • Methodology • AdaBoost.ECC • AdaBoost.ERP • Experimental Results • Conclusion
Motivation The quality of the final multiclass solution is affected by both the performance of the base learner and the error-correcting ability of the coding matrix. A coding matrix with strong error-correcting ability may not be optimal overall, because the binary subproblems it induces can be too hard for the base learner.
Objective • A new multiclass boosting algorithm that modifies the coding matrix according to the learning ability of the base learner.
Methodology – AdaBoost.ECC
[Flow diagram from the slide] Training: each column of the coding matrix M splits the K classes into two groups (e.g., T1: -1 -1 -1 1 1 1 1 ...), and a binary base learner (SVM or perceptron) is trained on the relabeled training set X. Testing: a test point is passed through all binary hypotheses, and it is assigned to the class whose codeword in the code book has the smallest Hamming distance to the vector of binary predictions (e.g., T1: y3, T2: y2, ...).
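The decoding step in the flow above can be sketched as follows. This is a minimal illustration, not the paper's code: the code book, class names, and prediction vector are made up for the example.

```python
# Hamming-distance decoding for an ECC-style multiclass scheme:
# each class has a codeword (a row of the coding matrix); a test
# point is assigned to the class whose codeword is closest to the
# vector of binary predictions.

def hamming_decode(codebook, predictions):
    """codebook: dict class -> tuple of +/-1; predictions: tuple of +/-1."""
    best_class, best_dist = None, float("inf")
    for label, codeword in codebook.items():
        dist = sum(p != c for p, c in zip(predictions, codeword))
        if dist < best_dist:
            best_class, best_dist = label, dist
    return best_class

# Illustrative 3-class code book with 3 columns (hypotheses).
codebook = {
    "y1": (-1, -1, +1),
    "y2": (-1, +1, -1),
    "y3": (+1, -1, -1),
}
print(hamming_decode(codebook, (+1, -1, -1)))  # exactly matches y3's codeword
```

With longer codewords, the extra Hamming distance between rows is what gives the code its error-correcting ability: a few wrong binary predictions can still leave the correct class closest.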
Methodology – AdaBoost.ECC
For a K-class problem, the training set contains N examples {(x_n, y_n)}_{n=1}^{N}, where x_n ∈ X is the input and y_n ∈ {1, ..., K} is the label. Given an input x, the ensemble output is the class

ŷ(x) = argmax_k Σ_{t=1}^{T} α_t M(k, t) f_t(x),

where M is the K × T coding matrix, f_t are the binary hypotheses, and α_t their weights.
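The ensemble output above can be sketched directly. This is an illustrative toy, assuming a small coding matrix and made-up hypothesis weights and outputs:

```python
# Weighted decoding for ECC-style boosting: with coding matrix M
# (K rows, one codeword per class), hypothesis weights alpha_t and
# binary outputs f_t(x), predict the class k maximizing
#   sum_t alpha_t * M[k][t] * f_t(x).

def ensemble_predict(M, alphas, outputs):
    """M: list of K codeword rows; alphas, outputs: length-T lists."""
    scores = [
        sum(a * m * f for a, m, f in zip(alphas, row, outputs))
        for row in M
    ]
    return max(range(len(M)), key=lambda k: scores[k])

M = [[-1, -1, +1], [-1, +1, -1], [+1, -1, -1]]  # 3 classes, 3 columns
alphas = [0.8, 0.5, 0.3]                        # hypothesis weights
outputs = [+1, -1, -1]                          # f_t(x) for one test x
print(ensemble_predict(M, alphas, outputs))     # class 2 agrees with all f_t
```

Unlike plain Hamming decoding, this weights each hypothesis by its α_t, so more reliable binary learners count more in the final vote.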
The tangram experiment compares ways of choosing each column: max-cut picks the partition of the classes that maximizes the cut value (the error-correcting ability of the column), while rand-half splits the classes into two random halves. We have to find a good trade-off between maximizing the error-correcting ability of the partition and minimizing the training error of the base learner on the resulting binary problem.
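The max-cut criterion above can be sketched as follows. The pairwise weight matrix U and the candidate partitions are made up for illustration; in the algorithm, U is derived from the current example weights.

```python
# Max-cut criterion for choosing a coding column: a column assigns
# each class a label in {-1, +1}, and its error-correcting quality
# is the total pairwise weight U[i][j] "cut" by the partition,
# i.e. summed over pairs of classes placed on opposite sides.

def cut_value(U, coloring):
    """U: symmetric K x K weight matrix; coloring: list of +/-1 per class."""
    total = 0.0
    K = len(coloring)
    for i in range(K):
        for j in range(i + 1, K):
            if coloring[i] != coloring[j]:
                total += U[i][j]
    return total

U = [[0, 3, 1],
     [3, 0, 2],
     [1, 2, 0]]
print(cut_value(U, [+1, -1, -1]))  # cuts pairs (0,1) and (0,2): 3 + 1 = 4
print(cut_value(U, [+1, -1, +1]))  # cuts pairs (0,1) and (1,2): 3 + 2 = 5
```

The catch, as the slide notes, is that the highest-cut partition may define a binary problem the base learner cannot fit well, which is exactly the trade-off the paper targets.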
Methodology – AdaBoost.ERP
[Flow diagram from the slide] AdaBoost.ERP follows the same training/testing pipeline as AdaBoost.ECC, but after a binary hypothesis (SVM or perceptron) is trained on a column, the column is repartitioned (M → M') to reduce the cost, alternating learning and repartitioning until convergence or a specified number of steps. Testing is unchanged: a test point is assigned to the class whose codeword (e.g., T1: -1 -1 -1 1 1 1 1 ...) has the smallest Hamming distance to the vector of binary predictions (e.g., T1: y3, T2: y2, ...).
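One simple way to realize the repartitioning step can be sketched as below. This is a hypothetical illustration, not the paper's exact procedure: it relabels each class to the side that the trained hypothesis already predicts for most of that class's weighted examples, which cannot increase the weighted binary error. The data, weights, and hypothesis are made up.

```python
# Repartitioning sketch: after training a binary hypothesis f_t for
# a column, move each class to the side (+1 or -1) that f_t already
# predicts for the majority (by weight) of that class's examples,
# lowering the cost of the column without retraining f_t.

def repartition_column(examples, weights, f):
    """examples: list of (x, class_id); returns new class -> +/-1 mapping."""
    vote = {}  # class_id -> accumulated weighted output of f
    for (x, k), w in zip(examples, weights):
        vote[k] = vote.get(k, 0.0) + w * f(x)
    return {k: (+1 if v >= 0 else -1) for k, v in vote.items()}

examples = [(0.2, 0), (0.4, 0), (0.9, 1), (1.1, 1), (1.5, 2)]
weights = [0.2, 0.2, 0.2, 0.2, 0.2]
f = lambda x: +1 if x > 1.0 else -1  # a trained binary hypothesis
print(repartition_column(examples, weights, f))
```

Alternating this relabeling with retraining the base learner, as the slide's loop suggests, lets the coding matrix adapt to what the base learner can actually separate.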
Conclusion • The improvement from repartitioning can be especially significant when the base learner is not very powerful. • Compared to other boosting algorithms, the training time of the repartitioning variants is usually much less, while the test error is comparable or even lower.