
Co-operative Training in Classifier Ensembles


  1. Co-operative Training in Classifier Ensembles Rozita Dara PAMI Lab University of Waterloo

  2. Outline • Introduction • Sharing Training Resources • Sharing Training Patterns • Sharing Training Algorithms • Sharing Training Information • Sharing Training Information: An Algorithm • Experimental Study • Discussion and Conclusions

  3. Introduction • Multiple Classifier Systems provide • Improved performance • Better reliability and generalization • Motivations for Multiple Classifier Systems include • Empirical observation • Problems that decompose naturally across multiple sensors • Avoiding commitment to arbitrary initial conditions or parameters

  4. Introduction (cont'd) “Combining identical classifiers will not lead to improved performance.” • The importance of creating diverse classifiers • How does the amount of “sharing” between classifiers affect performance?

  5. Sharing Training Resources • A measure of the degree of co-operation between various classifiers. • Sharing Training Patterns • Sharing Training Algorithms • Sharing Training Information

  6. Sharing Training Patterns
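
The figure on this slide is not reproduced in the transcript. As an illustration of pattern sharing, the sketch below splits a dataset into one disjoint core per classifier plus a shared pool of patterns; the function name and the `share` parameter are illustrative, not from the original slides.

```python
import numpy as np

def make_partially_disjoint_subsets(n_patterns, n_classifiers, share=0.2, seed=0):
    """Index sets, one per classifier: a disjoint core each, plus a common
    pool of shared patterns. share=0.0 gives fully disjoint training sets;
    share=1.0 gives (near-)identical ones."""
    rng = np.random.default_rng(seed)
    cores = np.array_split(rng.permutation(n_patterns), n_classifiers)
    shared = rng.choice(n_patterns, size=int(share * n_patterns), replace=False)
    # np.unique drops patterns that land in both a core and the shared pool.
    return [np.unique(np.concatenate([core, shared])) for core in cores]

subsets = make_partially_disjoint_subsets(1000, 5, share=0.2)
```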

  7. Sharing Training Algorithms

  8. Sharing Training Information

  9. Training • Training each component independently • Optimizing individual components may not lead to overall improvement • Collinearity: high correlation between classifiers • Components may be under-trained or over-trained

  10. Training (cont'd) • Adaptive training • Selective: Reduces correlation between components • Focused: Re-training focuses on misclassified patterns • Efficient: Determines the duration of training

  11. Adaptive Training: Main loop • Share Training Information between members of the ensemble • Incremental learning • Evaluation of training to determine the re-training set
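
A minimal, runnable sketch of this loop, assuming scikit-learn-style members with `partial_fit`/`score`/`predict`. The per-round re-selection rule here (keep every misclassified pattern plus a random half of the correct ones) is a simplified stand-in for the data-selection step detailed on slide 14.

```python
import numpy as np

def adaptive_main_loop(clfs, train_sets, X_eval, y_eval, classes, rounds=20, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(rounds):
        # Incremental learning: one partial_fit pass per member per round.
        for clf, (X, y) in zip(clfs, train_sets):
            clf.partial_fit(X, y, classes=classes)
        # Joint evaluation; in the full algorithm these shared results drive
        # data selection and the stopping decision.
        accs = [clf.score(X_eval, y_eval) for clf in clfs]
        new_sets = []
        for clf, (X, y) in zip(clfs, train_sets):
            wrong = clf.predict(X) != y                 # focus on errors
            keep = wrong | (rng.random(len(y)) < 0.5)   # plus some correct ones
            new_sets.append((X[keep], y[keep]))
        train_sets = new_sets
    return clfs, accs
```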

  12. Adaptive Training: Training • Save classifier if it performs well on the evaluation set • Determine when to terminate training for each module
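
One way to realise both bullets, as a hedged sketch: snapshot the classifier whenever its evaluation accuracy improves, and stop the module once it has gone `patience` rounds without improving. The snapshot-and-patience mechanics are assumptions, not from the slides.

```python
import copy

def train_module(clf, X, y, X_eval, y_eval, classes, max_rounds=100, patience=5):
    best_acc, best_clf, stale = -1.0, None, 0
    for _ in range(max_rounds):
        clf.partial_fit(X, y, classes=classes)
        acc = clf.score(X_eval, y_eval)
        if acc > best_acc:
            # Save the classifier: it performs well on the evaluation set.
            best_acc, best_clf, stale = acc, copy.deepcopy(clf), 0
        else:
            stale += 1
        if stale >= patience:        # terminate training for this module
            break
    return best_clf, best_acc
```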

  13. Adaptive Training: Evaluation • Train aggregation modules • Evaluate training sets for each classifier • Compose new training data
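
The slides do not specify the fusion scheme; as one plausible reading of "train aggregation modules", the sketch below fits a logistic-regression combiner on the members' class-probability outputs (stacked generalization). Treating the aggregation module this way is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_aggregator(clfs, X_eval, y_eval):
    # Stack every member's predicted class probabilities side by side.
    Z = np.hstack([clf.predict_proba(X_eval) for clf in clfs])
    return LogisticRegression(max_iter=1000).fit(Z, y_eval)

def ensemble_predict(clfs, agg, X):
    Z = np.hstack([clf.predict_proba(X) for clf in clfs])
    return agg.predict(Z)
```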

  14. Adaptive Training: Data Selection • New training data are composed by concatenating • Error_i: misclassified entries of the training data for classifier i • Correct_i: a random choice of R·(P·δ_i) correctly classified entries of the training data for classifier i
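
A runnable sketch of this rule, with assumptions made explicit: P is read as the size of classifier i's training set, δ_i as its error rate on that set, and R as a user-chosen ratio; none of these readings are confirmed by the slides.

```python
import numpy as np

def compose_retraining_set(clf, X, y, R=1.0, seed=0):
    rng = np.random.default_rng(seed)
    pred = clf.predict(X)
    wrong = np.flatnonzero(pred != y)                  # Error_i
    right = np.flatnonzero(pred == y)
    P, delta_i = len(y), len(wrong) / len(y)           # set size, error rate
    n_correct = min(len(right), int(round(R * P * delta_i)))
    correct = (rng.choice(right, size=n_correct, replace=False)
               if n_correct else right[:0])            # Correct_i
    idx = np.concatenate([wrong, correct])             # concatenate both parts
    return X[idx], y[idx]
```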

  15. Results Five one-hidden-layer BP classifiers • Training used partially disjoint data sets • No optimization was performed on the trained networks • The same network parameters were used for all trained classifiers • Three data sets • 20-Class Gaussian • Satimages
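
A sketch of this setup using scikit-learn MLPs as stand-ins for the backpropagation networks; the hidden-layer size, SGD solver, and learning rate are assumptions, since the slides do not state the original parameters. Every member gets the same hyperparameters, per the slide; only the seed (initial weights) differs.

```python
from sklearn.neural_network import MLPClassifier

def build_ensemble(n_members=5, hidden=16):
    # Identical hyperparameters for every network; the seed varies only
    # the initial weights.
    return [MLPClassifier(hidden_layer_sizes=(hidden,), solver="sgd",
                          learning_rate_init=0.1, random_state=i)
            for i in range(n_members)]
```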

  16. Results (cont'd)

  17. Conclusions • Exchanging information during training allows for an informed fusion process • Enhances diversity amongst classifiers • Algorithms that share training information can improve overall classification accuracy

  18. Conclusions (cont'd)

  19. References
