
Week 2 Video 5: Cross-Validation and Over-Fitting


Presentation Transcript


  1. Week 2 Video 5 Cross-Validation and Over-Fitting

  2. Over-Fitting • I’ve mentioned over-fitting a few times during the last few weeks • Fitting to the noise as well as the signal

  3. Over-Fitting • [Figure: two example model fits on the same data, labeled “Good fit” and “Over fit”]
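
As a concrete illustration of the “good fit vs. over fit” contrast, here is a minimal sketch in Python (not part of the original lecture; the synthetic data and the degree-15 polynomial are arbitrary choices for illustration):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Noisy samples of a simple underlying signal: y = x plus noise
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 1, 20)).reshape(-1, 1)
y = X.ravel() + rng.normal(scale=0.1, size=20)

# A low-degree model captures the signal; a very high-degree model
# also fits the noise, so it scores better on the training data but
# generalizes worse to new points.
good_fit = make_pipeline(PolynomialFeatures(degree=1), LinearRegression()).fit(X, y)
over_fit = make_pipeline(PolynomialFeatures(degree=15), LinearRegression()).fit(X, y)
print(good_fit.score(X, y), over_fit.score(X, y))  # over_fit "wins" on training data
```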

  4. Reducing Over-Fitting • Use simpler models • Fewer variables (BIC, AIC, Occam’s Razor) • Less complex functions (MDL: Minimum Description Length)
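
For reference, the penalty formulas behind two of the criteria named above, as a minimal sketch (the function names are mine, not the lecture’s):

```python
import numpy as np

def aic(log_likelihood, k):
    # Akaike Information Criterion: 2k - 2 ln L (lower is better);
    # k is the number of fitted parameters.
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    # Bayesian Information Criterion: k ln(n) - 2 ln L (lower is better);
    # penalizes extra parameters more heavily as sample size n grows.
    return k * np.log(n) - 2 * log_likelihood
```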

  5. Eliminating Over-Fitting? • Every model is over-fit in some fashion • The questions are: • How bad? • What is it over-fit to?

  6. Assessing Generalizability • Does your model transfer to new contexts? • Or is it over-fit to a specific context?

  7. Training Set/Test Set • Split your data into a training set and test set

  8. Notes • Model tested on unseen data • But uses data unevenly
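
A minimal sketch of a train/test split with scikit-learn (the synthetic data and the 80/20 ratio are illustrative assumptions, not from the lecture):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Illustrative data: 100 rows, 5 features, a binary label
rng = np.random.RandomState(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=100) > 0).astype(int)

# Hold out 20% as a test set; fit only on the remaining 80%,
# then evaluate on data the model has never seen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
model = LogisticRegression().fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy on unseen data
```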

  9. Cross-validation • Split data points into N equal-size groups

  10. Cross-validation • Train on all groups but one, test on the held-out group • Repeat for each possible combination
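
In scikit-learn, KFold handles this train-on-all-but-one loop automatically; a sketch reusing X, y, and LogisticRegression from the split example above (5 folds is an arbitrary choice):

```python
from sklearn.model_selection import KFold, cross_val_score

# Each of the 5 folds takes one turn as the test set while the
# other 4 folds serve as training data.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(), X, y, cv=kf)
print(scores.mean())  # average over all 5 train/test combinations
```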

  16. How many groups? • K-fold • Pick a number K, split into this number of groups • Leave-out-one • Every data point is a fold

  17. How many groups? • K-fold • Pick a number K, split into this number of groups • Quicker; preferred by some theoreticians • Leave-out-one • Every data point is a fold • More stable • Avoids issue of how to select folds (stratification issues)
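
Leave-out-one is the K-fold extreme where K equals the number of data points; a sketch, again reusing X and y from above:

```python
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Every single data point takes one turn as the test set,
# so n data points means n separate model fits.
scores = cross_val_score(LogisticRegression(), X, y, cv=LeaveOneOut())
print(scores.mean())
```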

  18. Cross-validation variants • Flat Cross-Validation • Each point has equal chance of being placed into each fold • Stratified Cross-Validation • Biases fold selection so that some variable is equally represented in each fold • The variable you’re trying to predict • Or some variable that is thought to be an important context
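
A sketch of the stratified variant, here balancing the predicted label y across folds (same illustrative data as above):

```python
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Fold assignment is biased so that each fold contains roughly
# the same proportion of each class of y.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LogisticRegression(), X, y, cv=skf)
```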

  19. Student-level cross-validation • Folds are selected so that no student’s data is represented in two folds • Allows you to test model generalizability to new students • As opposed to testing model generalizability to new data from the same students
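
A sketch of student-level cross-validation using scikit-learn’s GroupKFold (the student labels here are hypothetical; the lecture itself does this in RapidMiner, mentioned below):

```python
import numpy as np
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.linear_model import LogisticRegression

# Hypothetical: the 100 rows above came from 20 students, with one
# student label per row. GroupKFold keeps all of a student's rows
# in a single fold, so test folds contain only new students.
students = np.random.RandomState(1).randint(0, 20, size=100)
gkf = GroupKFold(n_splits=5)
scores = cross_val_score(LogisticRegression(), X, y, cv=gkf, groups=students)
print(scores.mean())  # performance on entirely new students
```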

  20. Student-level cross-validation • Usually seen as the minimum cross-validation needed at the EDM conference • Papers that don’t pay attention to this issue are usually rejected • OK to explicitly choose something else and discuss that choice • Not OK to just ignore the issue and do what’s easiest

  21. Student-level cross-validation • Easy to do with Batch X-Validation in RapidMiner

  22. Other Levels Sometimes Used for Cross-Validation • Lesson/Content • School • Demographic (Urban/Rural/Suburban, Race, Gender) • Software Package

  23. Important Consideration • Where do you want to be able to use your model? • New students? • New schools? • New populations? • New software content? • Make sure to cross-validate at that level
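
With a group-based splitter, cross-validating at a different level just means changing the group labels; a short sketch with hypothetical school IDs, continuing from the student-level example above:

```python
# Hypothetical: one school label per row. Grouping folds by school
# tests whether the model generalizes to entirely new schools.
school_ids = np.random.RandomState(2).randint(0, 5, size=100)
scores = cross_val_score(LogisticRegression(), X, y,
                         cv=GroupKFold(n_splits=5), groups=school_ids)
```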

  24. Next Lecture • More on Generalization and Validity
