Variations of Minimax Probability Machine Huang, Kaizhu 2003-09-16
Overview • Classification • types, problems • Minimax Probability Machine • Main work • Biased Minimax Probability Machine • Minimum Error Minimax Probability Machine • Experiments • Future work
Types of Classifiers • Generative Classifiers • Discriminative Classifiers
Classification—Generative Classifier A generative model assumes specific distributions (densities p1, p2) on the two classes of data and uses these distributions to construct the classification boundary.
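As an illustrative sketch (not code from the talk; the function names are hypothetical), a minimal 1-D generative classifier fits a Gaussian to each class and assigns a point to the class with the higher fitted density:

```python
import numpy as np

def fit_gaussian(x):
    """Per-class Gaussian fit: the 'specific distribution' a generative
    classifier assumes (illustrative 1-D sketch)."""
    return x.mean(), x.std(ddof=1)

def log_density(x, mu, sd):
    """Log of the Gaussian density, dropping constants shared by both classes."""
    return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd)

def classify(x, params1, params2):
    """Assign x to the class whose fitted Gaussian gives the higher density."""
    return 1 if log_density(x, *params1) >= log_density(x, *params2) else 2
```

The classification boundary is exactly where the two fitted densities cross, which is how the distributional assumption shapes the decision rule.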
Problems of Generative Models • "All models are wrong, but some are useful" (George E. P. Box) • The distributional assumptions lack generality and are often invalid in real cases It seems that a generative model should not assume a specific model on the data
Classification—Discriminative Classifier: SVM support vectors
Problems of SVM (support vectors) It seems that the SVM should consider the distribution of the data
SVM: it seems that the SVM should consider the distribution of the data. GM: it seems that a generative model should not assume specific models on the data.
Minimax Probability Machine (MPM) • Features: • With distribution considerations • With no specific distribution assumption
Minimax Probability Machine • With distribution considerations • Assume the mean and covariance directly estimated from the data reliably represent the real mean and covariance • Without specific distribution assumption • Directly construct classifiers from data
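The "plug-in" moments the slide refers to can be sketched as follows (an illustrative snippet, not the talk's code; the function name is hypothetical):

```python
import numpy as np

def plug_in_estimates(X):
    """Sample mean and covariance of one class, used by MPM as plug-in
    surrogates for the true class moments (no distributional assumption).
    X has one row per data point."""
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)  # unbiased sample covariance
    return mu, Sigma
```

MPM then builds the classifier from these two moments alone, rather than from a fitted density.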
Minimax Probability Machine (Cont’d) • MPM problem leads to Second Order Cone Programming • Dual Problem • Geometric interpretation
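The formulas on this slide were images; a hedged reconstruction, in the notation of Lanckriet et al. (classes x and y with means μ_x, μ_y and covariances Σ_x, Σ_y, hyperplane a^T z = b):

```latex
% MPM: maximize the worst-case accuracy \alpha over all distributions
% with the given moments
\max_{\alpha,\, a \neq 0,\, b} \; \alpha
\quad \text{s.t.} \quad
\inf_{x \sim (\mu_x, \Sigma_x)} \Pr\{a^\top x \geq b\} \geq \alpha, \qquad
\inf_{y \sim (\mu_y, \Sigma_y)} \Pr\{a^\top y \leq b\} \geq \alpha.

% Via the multivariate Chebyshev bound this reduces to the
% second-order cone program
\min_{a} \; \sqrt{a^\top \Sigma_x a} + \sqrt{a^\top \Sigma_y a}
\quad \text{s.t.} \quad a^\top (\mu_x - \mu_y) = 1,

% whose optimal value equals \kappa^{-1}, giving the worst-case bound
\alpha = \frac{\kappa^2}{1 + \kappa^2}.
```

Geometrically, the optimal hyperplane touches two ellipsoids of equal "size" κ centered at the two class means.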
Minimax Probability Machine (Cont’d) • Summary • Distribution-free • In the general case, the accuracy of classification of future data is bounded below by α • Demonstrated to achieve comparable performance with the SVM.
Problems of MPM • In real cases, the importance of the two classes is not always the same, which implies the lower bound α for the two classes is not necessarily the same. – Motivates Biased Minimax Probability Machine • On the other hand, there is no reason that the two bounds are required to be equal; the derived model is thus non-optimal in this sense. – Motivates Minimum Error Minimax Probability Machine
Biased Minimax Probability Machine • Observation: In diagnosing a severe epidemic disease, misclassification of the positive class causes more serious consequences than misclassification of the negative class. • A typical setting: as long as the accuracy of classification of the less important class maintains an acceptable level (specified by the real practitioners), the accuracy of classification of the important class should be as high as possible.
Biased Minimax Probability Machine (BMPM) • Objective • α: the same meaning as previously • β0: an acceptable accuracy level • Equivalently
BMPM (Cont’d) • Objective • Equivalently, • Equivalently,
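The BMPM objective on these slides was an image; a hedged reconstruction consistent with the setting above (x the important class, y the less important class, β0 the pre-specified acceptable level):

```latex
% BMPM: maximize the worst-case accuracy of the important class while
% the less important class keeps a pre-specified acceptable level \beta_0
\max_{\alpha,\, a \neq 0,\, b} \; \alpha
\quad \text{s.t.} \quad
\inf_{x \sim (\mu_x, \Sigma_x)} \Pr\{a^\top x \geq b\} \geq \alpha, \qquad
\inf_{y \sim (\mu_y, \Sigma_y)} \Pr\{a^\top y \leq b\} \geq \beta_0.
```

Setting β0 below α breaks the symmetry of MPM in favor of the important class.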
BMPM (Cont’d) • Parametric Method • Find by solving • Update • Equivalently • Least-squares approach
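The "parametric method" here is a fractional-programming technique: the optimization reduces to maximizing a ratio, solved by repeatedly maximizing f − λ·g and updating λ to the ratio at the maximizer until the subproblem's optimum reaches zero. A minimal Dinkelbach-style sketch over a finite candidate set (illustrative only; the function name is hypothetical and the real BMPM inner step is a cone program, not a grid search):

```python
import numpy as np

def dinkelbach_max_ratio(f_vals, g_vals, tol=1e-9, max_iter=100):
    """Maximize f/g over a finite candidate set, assuming g > 0 everywhere.
    Iterate: maximize f - lam*g, then set lam to the ratio at the
    maximizer; stop when the subproblem maximum is (numerically) zero."""
    lam = 0.0
    i = 0
    for _ in range(max_iter):
        vals = f_vals - lam * g_vals
        i = int(np.argmax(vals))
        if vals[i] < tol:
            break  # max of f - lam*g is ~0, so lam is the optimal ratio
        lam = f_vals[i] / g_vals[i]
    return i, lam
```

Each update strictly increases λ, and at convergence λ equals the maximal ratio, which is why the subproblem sequence terminates.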
MPM → BMPM: the Biased Minimax Probability Machine fixes the less important class at an acceptable accuracy level.
Minimum Error Minimax Probability Machine (MEMPM, a generalization of MPM) The MEMPM achieves the distribution-free Bayes optimal hyperplane in the worst-case setting.
Minimum Error Minimax Probability Machine • MEMPM achieves the Bayes optimal hyperplane when we assume some specific distribution, e.g., a Gaussian distribution, on the data. Lemma: If the distribution of the normalized random variable is independent of a, the classifier derived by MEMPM will exactly represent the real Bayes optimal hyperplane.
MEMPM (Cont’d) • Objective • Equivalently
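The MEMPM objective was also an image; a hedged reconstruction (θ the prior probability of class x, so the objective is one minus the worst-case error rate):

```latex
% MEMPM: maximize the prior-weighted worst-case accuracy, i.e.
% minimize the worst-case error rate
\max_{\alpha,\, \beta,\, a \neq 0,\, b} \; \theta\,\alpha + (1 - \theta)\,\beta
\quad \text{s.t.} \quad
\inf_{x \sim (\mu_x, \Sigma_x)} \Pr\{a^\top x \geq b\} \geq \alpha, \qquad
\inf_{y \sim (\mu_y, \Sigma_y)} \Pr\{a^\top y \leq b\} \geq \beta.
```

Fixing β recovers a BMPM subproblem, which is why a line search over β combined with sequential BMPM solves MEMPM.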
MEMPM (Cont’d) • Objective • Line search + sequential BMPM method
Kernelized Version • Kernelized BMPM • where
Kernelized Version (Cont’d) • Kernelized BMPM • where • and
Illustration of kernel methods Kernel Linear
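As a concrete illustration of the kernel step (not the talk's code; the function name is hypothetical), the Gaussian kernel replaces inner products with a Gram matrix computed directly from pairwise distances:

```python
import numpy as np

def gaussian_kernel_matrix(X, Y, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2*sigma^2)),
    computed from pairwise squared distances without explicit loops."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    # clip tiny negative values caused by floating-point cancellation
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))
```

Working with this matrix instead of raw features is what turns the linear MPM/BMPM into their kernelized versions.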
Experimental results (BMPM) • Five benchmark datasets • Twonorm, Breast, Ionosphere, Pima, Sonar • Procedure – 5-fold cross validation • Linear • Gaussian Kernel • Parameter setting • Pima • others
Experiments for MEMPM • Six benchmark datasets • Twonorm, Breast, Ionosphere, Pima, Heart, Vote • Procedure – 10-fold cross validation • Linear • Gaussian Kernel
Conclusions and Future Work • Conclusions • First quantitative method to analyze the biased classification task • Minimizes the classification error rate in the worst case • Future work • Improve the efficiency of the algorithm, especially in the kernelized version • Any decomposition method? • Robust estimation • Relation between the VC bound in the Support Vector Machine and the bound in MEMPM • Regression model?
References • Popescu, I. and Bertsimas, D. (2001). Optimal inequalities in probability theory: A convex optimization approach. Technical Report TM62, INSEAD. • Lanckriet, G. R. G., El Ghaoui, L., Bhattacharyya, C., and Jordan, M. I. (2002). Minimax probability machine. In Advances in Neural Information Processing Systems (NIPS) 14, Cambridge, MA: MIT Press. • Huang, K., Yang, H., King, I., Lyu, M. R., and Chan, L. (2003). Biased minimax probability machine. • Huang, K., Yang, H., King, I., Lyu, M. R., and Chan, L. (2003). Minimum error minimax probability machine.