1. Stat 231. A.L. Yuille. Fall 2004 • AdaBoost: Summary and Extensions • Read the Viola and Jones handout.
2. Basic AdaBoost Review • Data: labelled examples $\{(x_i, y_i)\}_{i=1}^{N}$ with $y_i \in \{-1,+1\}$. • Set of weak classifiers $\{h_t(x)\}$, each outputting $\pm 1$. • Weights $w_t(i)$, one per data point, updated each round. • Parameters $\alpha_t$, one per selected weak classifier. • Strong classifier: $H(x) = \mathrm{sign}\!\left(\sum_t \alpha_t h_t(x)\right)$ (a short code sketch follows).
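Below is a minimal Python sketch of evaluating this strong classifier, assuming the weak classifiers are $\pm 1$-valued callables; the function name and interface are illustrative, not part of the lecture.

```python
def strong_classify(x, weak_classifiers, alphas):
    """Strong classifier H(x) = sign(sum_t alpha_t * h_t(x)).

    weak_classifiers: list of callables h(x) -> {-1, +1} (assumed interface).
    alphas: the corresponding coefficients alpha_t.
    """
    score = sum(a * h(x) for a, h in zip(alphas, weak_classifiers))
    return 1 if score >= 0 else -1
```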
3. Basic AdaBoost Algorithm • Initialize $w_1(i) = 1/N$. • Update rule: $w_{t+1}(i) = \dfrac{w_t(i)\,e^{-\alpha_t y_i h_t(x_i)}}{Z_t}$, where $Z_t$ is the normalization constant. • Let $\epsilon_t = \sum_i w_t(i)\,\mathbb{1}[h_t(x_i) \neq y_i]$ be the weighted error. • Pick the weak classifier that minimizes $\epsilon_t$. • Set $\alpha_t = \tfrac{1}{2}\log\dfrac{1-\epsilon_t}{\epsilon_t}$. • Repeat (a code sketch of the full loop follows).
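A hedged Python sketch of this loop, assuming a finite pool of candidate $\pm 1$-valued weak classifiers is supplied as callables; the names and the candidate-pool interface are illustrative assumptions, not from the lecture.

```python
import numpy as np

def adaboost_train(X, y, candidates, T):
    """Basic AdaBoost sketch.

    X: list of examples, y: labels in {-1, +1},
    candidates: pool of weak classifiers h(x) -> {-1, +1}, T: number of rounds.
    """
    y = np.asarray(y)
    N = len(y)
    w = np.full(N, 1.0 / N)                      # initialize w_1(i) = 1/N
    chosen, alphas = [], []
    for _ in range(T):
        # weighted error eps_t of every candidate under the current weights
        all_preds = [np.array([h(x) for x in X]) for h in candidates]
        errs = [np.sum(w * (preds != y)) for preds in all_preds]
        t = int(np.argmin(errs))                 # pick the classifier minimizing eps_t
        eps = errs[t]
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))  # alpha_t = (1/2) log((1-eps)/eps)
        w = w * np.exp(-alpha * y * all_preds[t])           # update rule
        w = w / w.sum()                                      # divide by Z_t (normalization)
        chosen.append(candidates[t])
        alphas.append(alpha)
    return chosen, alphas
```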
4. Basic AdaBoost Algorithm • Errors: the training error of the strong classifier is bounded by $\prod_t Z_t$, which equals $\prod_t 2\sqrt{\epsilon_t(1-\epsilon_t)}$. • AdaBoost is a greedy algorithm that tries to minimize this bound by minimizing the $Z_t$'s in order, w.r.t. $\alpha_t$ and the choice of weak classifier (a numeric illustration follows).
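To make the bound concrete, here is a small numeric illustration; the per-round error values are made up for the example.

```python
import numpy as np

# Hypothetical weighted errors eps_t achieved over five rounds.
eps = [0.30, 0.25, 0.20, 0.30, 0.28]
bound = np.prod([2 * np.sqrt(e * (1 - e)) for e in eps])
print(bound)  # ≈ 0.52: each factor is < 1 whenever eps_t < 1/2, so the bound shrinks geometrically
```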
5. AdaBoost Variant 1 • In preparation for Viola and Jones: a new parameter is introduced. • Strong classifier: a weighted combination of the selected weak classifiers. • Modify the update rule accordingly. • Let $W_{pq}$ denote the sum of the weights of the examples for which the weak classifier outputs $p$ and the true class is $q$. • Pick the weak classifier to minimize the resulting normalizer and set the new parameter from the $W_{pq}$'s (a sketch follows).
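A sketch of one selection step using the $W_{pq}$ bookkeeping. The coefficient shown is the standard choice that minimizes the normalizer for $\pm 1$-valued classifiers; whether the lecture's new parameter is exactly this quantity is an assumption.

```python
import numpy as np

def variant1_step(preds, y, w):
    """One selection step using the W_pq sums from the slide.

    preds: weak classifier outputs in {-1, +1}, y: true labels, w: current weights
    (all numpy arrays). W[(p, q)] = sum of weights with weak output p and true class q.
    """
    W = {(p, q): np.sum(w[(preds == p) & (y == q)]) for p in (-1, 1) for q in (-1, 1)}
    W_right = W[(1, 1)] + W[(-1, -1)]      # weak classifier agrees with the true class
    W_wrong = W[(1, -1)] + W[(-1, 1)]      # weak classifier disagrees
    alpha = 0.5 * np.log(W_right / max(W_wrong, 1e-12))  # assumed: minimizes the normalizer
    Z = 2.0 * np.sqrt(W_right * W_wrong)                  # normalizer value at that alpha
    return alpha, Z
```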
6. AdaBoost Variant 1 • As before, the error is bounded by the product of the normalization constants. • The same "trick" applies, splitting each term according to whether the weak classifier is right or wrong on the example (reconstructed below).
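A reconstruction of the two missing cases, assuming the standard AdaBoost update; the slide's exact notation may differ.

```latex
% Weight update split by whether the weak classifier is right or wrong:
\[
  w_{t+1}(i) \;=\; \frac{w_t(i)\, e^{-\alpha_t y_i h_t(x_i)}}{Z_t}
  \;=\;
  \begin{cases}
    w_t(i)\, e^{-\alpha_t} / Z_t, & h_t(x_i) = y_i \quad (\text{weak classifier right}),\\[4pt]
    w_t(i)\, e^{+\alpha_t} / Z_t, & h_t(x_i) \neq y_i \quad (\text{weak classifier wrong}).
  \end{cases}
\]
```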
7. AdaBoost Variant 2 • We have assumed a loss function that pays equal penalties for false positives and false negatives. • But we may want false negatives to cost more (Viola and Jones). • Use an asymmetric loss function (one possible form is written out below).
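One common way to write such an asymmetric exponential loss (an assumption; the exact form used on the slide is not shown): attach a per-class cost, larger for the positive class, so that false negatives are penalized more.

```latex
\[
  L \;=\; \sum_{i=1}^{N} C_{y_i}\, \exp\!\bigl(-y_i F(x_i)\bigr),
  \qquad F(x) = \sum_t \alpha_t h_t(x),
  \qquad C_{+1} > C_{-1}.
\]
```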
8. AdaBoost Variant 2 • Modify the update rule to account for the asymmetric costs. • Verify that the loss is still bounded by the product of the normalizers. • Same update rule as for Variant 1, except for how the class-dependent costs enter the weights (a sketch follows).
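A minimal sketch of one way the costs can enter: fold them into the initial weights and then run the Variant 1 update unchanged. This is an assumption about the mechanism, not necessarily the lecture's exact rule.

```python
import numpy as np

def asymmetric_init(y, cost_pos=5.0, cost_neg=1.0):
    """Cost-weighted analogue of the uniform initialization w_1(i) = 1/N.

    Positives start with more weight, so misclassifying them (false negatives)
    contributes more to the weighted error and to the loss.
    """
    y = np.asarray(y)
    c = np.where(y == 1, cost_pos, cost_neg)
    return c / c.sum()
```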
9. AdaBoost Extensions • AdaBoost can be extended to multiple classes (Singer and Schapire). • The weak classifiers can take multiple values, not just $\pm 1$. • The conditional probability interpretation of the strong classifier applies to these extensions (see below).
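For reference, the usual conditional-probability reading of the strong classifier under the exponential loss (assumed to be the interpretation meant here):

```latex
\[
  P(y = +1 \mid x) \;\approx\; \frac{e^{F(x)}}{e^{F(x)} + e^{-F(x)}},
  \qquad F(x) = \sum_t \alpha_t h_t(x),
\]
% so thresholding F(x) at zero picks the more probable class.
```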
10. AdaBoost Summary • Basic AdaBoost: combine weak classifiers to make a strong classifier. • Dynamically re-weight the data so that misclassified examples weigh more (like SVMs: pay more attention to hard-to-classify data). • The empirical risk decreases exponentially fast (under weak conditions). • Useful for combining weak cues in visual detection tasks. • Probabilistic interpretation / multiclass / multi-valued classifiers.