Machine Learning Lecture 5: Theory I – PAC Learning. Moshe Koppel. Slides adapted from Tom Mitchell. To shatter n examples, we need 2^n hypotheses (since there are that many dichotomies). So the number of examples we can shatter with |H| hypotheses is at most log₂ |H|.
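The counting argument above can be written out as a short derivation (this is the standard VC-dimension bound for a finite hypothesis class, filled in here for clarity rather than taken from the slides):

```latex
% To shatter a set of n points, H must realize every dichotomy
% (labeling) of those points. There are 2^n dichotomies, and each
% must be produced by a distinct hypothesis, so
\[
  2^n \le |H| \quad\Longrightarrow\quad n \le \log_2 |H| .
\]
% Since this holds for any shattered set, the VC dimension of a
% finite class is bounded:
\[
  \mathrm{VC}(H) \le \log_2 |H| .
\]
```

For example, with |H| = 8 hypotheses at most log₂ 8 = 3 points can be shattered, since shattering 4 points would already require 2⁴ = 16 distinct hypotheses.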