Making classifiers by supervised learning
• Naïve Bayes
• Support Vector Machine
• Fisher's discriminant
• Logistic
• Mahalanobis
[Figure: optimal hyperplane separating plasma cells from non-plasma cells]
Naïve Bayes Classifier

We want the posterior $P(C \mid F_1,\dots,F_n)$, where $C$ and $\neg C$ represent plasma cell and non-plasma cell, and $F_i$ represents the $i$-th discretized fluorescence measurement. Using Bayes' theorem,

$$P(C \mid F_1,\dots,F_n) = \frac{P(C)\,P(F_1,\dots,F_n \mid C)}{P(F_1,\dots,F_n)}.$$

Our model makes the simplifying assumption that, conditional on the state of the cell (i.e. $C$ or $\neg C$), the fluorescence channels are independent (conditional independence):

$$P(F_1,\dots,F_n \mid C) = \prod_{i=1}^{n} P(F_i \mid C).$$

Similarly, for the non-plasma cell,

$$P(\neg C \mid F_1,\dots,F_n) = \frac{P(\neg C)\prod_{i=1}^{n} P(F_i \mid \neg C)}{P(F_1,\dots,F_n)}.$$

Finally, the log-likelihood ratio can be written as

$$\log \frac{P(C \mid F_1,\dots,F_n)}{P(\neg C \mid F_1,\dots,F_n)} = \log \frac{P(C)}{P(\neg C)} + \sum_{i=1}^{n} \log \frac{P(F_i \mid C)}{P(F_i \mid \neg C)}.$$

http://en.wikipedia.org/wiki/Naive_Bayes_classifier
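The Naïve Bayes log-likelihood ratio can be sketched in a few lines of NumPy. The data below are made up for illustration (two fluorescence channels, each discretized into 3 bins; label 1 = plasma cell); Laplace smoothing (`alpha`) is an added assumption to avoid zero probabilities, not something stated on the slide.

```python
import numpy as np

# Hypothetical toy data: rows = cells, columns = discretized fluorescence channels.
# Labels: 1 = plasma cell (C), 0 = non-plasma cell (notC).
X = np.array([[0, 2], [0, 2], [1, 2], [2, 0], [2, 1], [2, 0]])
y = np.array([1, 1, 1, 0, 0, 0])
n_bins = 3

def fit_naive_bayes(X, y, n_bins, alpha=1.0):
    """Estimate class priors and P(F_i = v | class) with Laplace smoothing."""
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        probs = np.empty((X.shape[1], n_bins))
        for i in range(X.shape[1]):
            counts = np.bincount(Xc[:, i], minlength=n_bins) + alpha
            probs[i] = counts / counts.sum()
        params[c] = (len(Xc) / len(X), probs)
    return params

def log_likelihood_ratio(x, params):
    """log P(C|x) - log P(notC|x), assuming conditional independence of channels."""
    prior1, p1 = params[1]
    prior0, p0 = params[0]
    llr = np.log(prior1) - np.log(prior0)
    for i, v in enumerate(x):
        llr += np.log(p1[i, v]) - np.log(p0[i, v])
    return llr

params = fit_naive_bayes(X, y, n_bins)
```

A positive ratio classifies the cell as a plasma cell, a negative one as non-plasma.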
SVM (Hard margin)

Distance between the hyperplane $w^\top x + b = 0$ and a point $x_i$ (with labels $y_i \in \{+1, -1\}$ and all points correctly classified):

$$\frac{|w^\top x_i + b|}{\|w\|} = \frac{y_i (w^\top x_i + b)}{\|w\|}.$$

Maximize the margin:

$$\max_{w,b} \left\{ \frac{1}{\|w\|} \min_i \left[ y_i (w^\top x_i + b) \right] \right\}.$$

Scaling: since $(w, b)$ can be rescaled freely, fix $\min_i y_i(w^\top x_i + b) = 1$; maximizing the margin then becomes

$$\min_{w,b} \frac{1}{2}\|w\|^2 \quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge 1.$$

http://en.wikipedia.org/wiki/Support_vector_machine, Pattern Recognition And Machine Learning (Christopher M. Bishop)
Quadratic programming (Primal and dual form)

Lagrangian:

$$L(w, b, a) = \frac{1}{2}\|w\|^2 - \sum_i a_i \left[ y_i(w^\top x_i + b) - 1 \right], \quad a_i \ge 0.$$

KKT conditions: setting $\partial L / \partial w = 0$ and $\partial L / \partial b = 0$ gives $w = \sum_i a_i y_i x_i$ and $\sum_i a_i y_i = 0$; complementary slackness requires $a_i \left[ y_i(w^\top x_i + b) - 1 \right] = 0$.

Dual QP (solved by SMO):

$$\max_a \; \sum_i a_i - \frac{1}{2} \sum_i \sum_j a_i a_j y_i y_j \, x_i^\top x_j \quad \text{s.t.} \quad a_i \ge 0, \; \sum_i a_i y_i = 0.$$

Only a few $a_i$ will be greater than 0 (the support vectors), which lie on the margin and satisfy $y_i(w^\top x_i + b) = 1$.
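The KKT structure can be verified on a tiny toy problem without a full SMO solver: guess which points are support vectors, solve the resulting 2×2 system, and check that every constraint holds, which certifies optimality. The four points below are invented for illustration.

```python
import numpy as np

# Assumed toy data: linearly separable, labels +/-1.
X = np.array([[3.0, 3.0], [3.0, 4.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Guess that only points 0 (y=+1) and 3 (y=-1) are support vectors.
# Then sum_i a_i y_i = 0 forces a_0 = a_3 = a, and w = a (x_0 - x_3).
i, j = 0, 3
d = X[i] - X[j]

# Margin conditions on the support vectors: a (d.x_i) + b = 1, a (d.x_j) + b = -1.
A = np.array([[d @ X[i], 1.0],
              [d @ X[j], 1.0]])
a, b = np.linalg.solve(A, np.array([1.0, -1.0]))
w = a * d

# KKT check: every point must satisfy y_i (w.x_i + b) >= 1,
# with equality on the support vectors.
margins = y * (X @ w + b)
```

Since all margins come out ≥ 1 and the dual variable `a` is positive, the guess was correct and only those two $a_i$ are nonzero, matching the slide's remark that few support vectors survive.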
SVM (Soft margin)

Introduce slack variables $\xi_i \ge 0$ and a penalty parameter $C$:

$$\min_{w,b,\xi} \; \frac{1}{2}\|w\|^2 + C \sum_i \xi_i \quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge 1 - \xi_i, \; \xi_i \ge 0.$$

Lagrangian:

$$L = \frac{1}{2}\|w\|^2 + C \sum_i \xi_i - \sum_i a_i \left[ y_i(w^\top x_i + b) - 1 + \xi_i \right] - \sum_i \mu_i \xi_i.$$

Dual QP: the same objective as the hard-margin case, now with the box constraint $0 \le a_i \le C$ and $\sum_i a_i y_i = 0$.
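One way to see the soft margin in action without a QP solver is the equivalent primal hinge-loss form, $\frac{1}{2}\|w\|^2 + C\sum_i \max(0, 1 - y_i(w^\top x_i + b))$, trained by subgradient descent. This is a sketch under assumed synthetic data (two overlapping Gaussian clouds), not the slides' actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
# Assumed overlapping classes: a hard margin has no solution here,
# but the soft margin tolerates a few violations.
X = np.vstack([rng.normal([2.0, 2.0], 1.0, (20, 2)),
               rng.normal([0.0, 0.0], 1.0, (20, 2))])
y = np.hstack([np.ones(20), -np.ones(20)])

def train_soft_margin(X, y, C=1.0, lr=0.001, epochs=5000):
    """Subgradient descent on 1/2 ||w||^2 + C * sum_i hinge(1 - y_i(w.x_i + b))."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                      # points with nonzero slack xi_i
        gw = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        gb = -C * y[viol].sum()
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = train_soft_margin(X, y, C=1.0)
acc = np.mean(np.sign(X @ w + b) == y)
```

Larger `C` penalizes slack more heavily (approaching the hard margin); smaller `C` widens the margin at the cost of more violations.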
Fisher discriminant analysis

Project the data onto a direction $w$, giving the scalar $y = w^\top x$. Measure the separation of the projected class means $m_k = w^\top \mathbf{m}_k$ relative to the within-class variance for class $k$, $s_k^2 = \sum_{n \in \mathcal{C}_k} (w^\top x_n - m_k)^2$. The Fisher criterion to maximize is

$$J(w) = \frac{(m_2 - m_1)^2}{s_1^2 + s_2^2} = \frac{w^\top S_B w}{w^\top S_W w},$$

where $S_B = (\mathbf{m}_2 - \mathbf{m}_1)(\mathbf{m}_2 - \mathbf{m}_1)^\top$ is the between-class scatter and $S_W = \sum_k \sum_{n \in \mathcal{C}_k} (x_n - \mathbf{m}_k)(x_n - \mathbf{m}_k)^\top$ is the within-class scatter. Substituting these definitions and setting the derivative of $J(w)$ to zero gives the optimal direction

$$w \propto S_W^{-1}(\mathbf{m}_2 - \mathbf{m}_1).$$
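The closed-form Fisher direction $w \propto S_W^{-1}(\mathbf{m}_2 - \mathbf{m}_1)$ is a one-liner in NumPy. The two Gaussian classes below are assumed for illustration; they are anisotropic on purpose, so the Fisher direction beats the naive choice $w = \mathbf{m}_2 - \mathbf{m}_1$ on the criterion $J$.

```python
import numpy as np

rng = np.random.default_rng(1)
# Assumed classes: means differ in both axes, but variance is large along y,
# so the best projection is not simply the line between the means.
X1 = rng.normal([0.0, 0.0], [1.0, 3.0], (200, 2))
X2 = rng.normal([3.0, 3.0], [1.0, 3.0], (200, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter S_W (sum of per-class scatter matrices).
Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
w = np.linalg.solve(Sw, m2 - m1)     # Fisher direction: S_W^{-1} (m2 - m1)

def criterion(direction):
    """J = (difference of projected means)^2 / (sum of projected variances)."""
    p1, p2 = X1 @ direction, X2 @ direction
    return (p1.mean() - p2.mean()) ** 2 / (p1.var() + p2.var())

J_fisher = criterion(w)
J_naive = criterion(m2 - m1)         # projecting straight onto the mean difference
```

Because $w = S_W^{-1}(\mathbf{m}_2 - \mathbf{m}_1)$ maximizes the Rayleigh quotient $J$, `J_fisher` is guaranteed to be at least `J_naive` on the same sample.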
Naïve Bayes Classification
[Figure: classification of plasma cells vs. non-plasma cells]
Sensitivity = 91.63 %
Specificity = 99.90 %
SVM classification (Radial kernel)
[Figure: classification of plasma cells vs. non-plasma cells]
Sensitivity = 97.02 %
Specificity = 99.97 %
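For reference, sensitivity and specificity are computed from the confusion counts as follows. The labels below are a tiny hypothetical example, not the slides' data (whose reported values come from the authors' own flow-cytometry set).

```python
import numpy as np

# Hypothetical predictions: 1 = plasma cell, 0 = non-plasma cell.
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 1])

tp = np.sum((y_true == 1) & (y_pred == 1))   # plasma cells correctly found
fn = np.sum((y_true == 1) & (y_pred == 0))   # plasma cells missed
tn = np.sum((y_true == 0) & (y_pred == 0))   # non-plasma correctly rejected
fp = np.sum((y_true == 0) & (y_pred == 1))   # non-plasma wrongly flagged

sensitivity = tp / (tp + fn)   # fraction of true plasma cells recovered
specificity = tn / (tn + fp)   # fraction of non-plasma cells correctly rejected
```

The slides' comparison (SVM: 97.02 % / 99.97 % vs. Naïve Bayes: 91.63 % / 99.90 %) uses exactly these two ratios.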