September 2, 2009 10601 Machine Learning Recitation 2 Öznur Taştan
Logistics • Homework 2 is going to be out tomorrow. It is due on Wed, Sep 16. • There is no class on Monday, Sep 7th (Labor Day) • Those who have not returned Homework 1 yet, please turn it in • For details of the homework submission policy, please check: http://www.cs.cmu.edu/~ggordon/10601/hws.html
Outline • We will review • Some probability and statistics • Some graphical models • We will not go over Homework 1 • Since the grace period has not ended yet. • Solutions will be up next week on the web page.
We’ll play a game: Catch the goof! • I’ll be the sloppy TA… I will make ‘intentional’ mistakes • You’ll catch those mistakes and correct me! • Slides with mistakes and their corrected versions are each marked with an icon
Law of total probability • Given two discrete random variables X and Y, where X takes values in {x1, …, xm} and Y takes values in {y1, …, yn}: P(X = x) = Σy P(X = x, Y = y) = Σy P(X = x | Y = y) P(Y = y)
Law of total probability • Given two discrete random variables X and Y, the terms in the formula above were labelled: joint probability, marginal probability, conditional probability of X conditioned on Y • The formulas are fine. Anything wrong with the names?
Law of total probability • Given two discrete random variables X and Y: P(X = x) = Σy P(X = x, Y = y) = Σy P(X = x | Y = y) P(Y = y) • P(X = x, Y = y): joint probability of X and Y • P(X = x | Y = y): conditional probability of X conditioned on Y • P(Y = y) and P(X = x): marginal probabilities
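To make the formula concrete, here is a minimal Python sketch; the joint table below is hypothetical, not the one from the slides:

```python
# A minimal sketch of the law of total probability (hypothetical numbers).
# The joint P(X, Y) is stored as a dict keyed by (x, y).
joint = {
    (0, 0): 0.30, (0, 1): 0.20,
    (1, 0): 0.10, (1, 1): 0.40,
}

# Marginal of X: P(X = x) = sum over y of P(X = x, Y = y)
p_x = {x: sum(p for (xi, y), p in joint.items() if xi == x) for x in (0, 1)}

# Marginal of Y, needed for the conditional form of the law
p_y = {y: sum(p for (x, yi), p in joint.items() if yi == y) for y in (0, 1)}

# Conditional P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
p_x_given_y = {(x, y): joint[(x, y)] / p_y[y] for (x, y) in joint}

# Law of total probability: P(X = x) = sum over y of P(X = x | Y = y) P(Y = y)
for x in (0, 1):
    total = sum(p_x_given_y[(x, y)] * p_y[y] for y in (0, 1))
    assert abs(total - p_x[x]) < 1e-12

print(p_x)   # {0: 0.5, 1: 0.5} (up to floating point)
```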
In a strange world • Two discrete random variables X and Y take binary values • Joint probabilities (table on the slide) • Catch the goof: the joint probabilities should sum up to 1
The world seems fine Two discrete random variables X and Y take binary values Joint probabilities
What about the marginals? Joint probabilities Marginal probabilities
This is a strange world Joint probabilities Marginal probabilities
In a strange world Joint probabilities Marginal probabilities
Let’s have a simple problem Joint probabilities Marginal probabilities
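Both the "strange world" goofs above and this exercise come down to two mechanical checks: the joint must sum to 1, and every marginal must equal the corresponding sum of joint entries. A sketch with hypothetical numbers standing in for the tables on the slides:

```python
# Hypothetical joint and claimed marginals (the actual numbers were on the slides).
joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}
claimed_p_x = {0: 0.6, 1: 0.4}   # a "strange world" style claim

# Check 1: the joint probabilities must sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Check 2: each claimed marginal must equal the corresponding sum of joint entries.
for x, claimed in claimed_p_x.items():
    from_joint = sum(p for (xi, _), p in joint.items() if xi == x)
    if abs(from_joint - claimed) > 1e-12:
        print(f"strange world: P(X={x}) claimed {claimed}, but the joint gives {from_joint}")
```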
Conditional probabilities • What is the complementary event of P(X=0 | Y=1)? Is it P(X=1 | Y=1) or P(X=0 | Y=0)? • It is P(X=1 | Y=1): within the same conditioning event Y=1, P(X=0 | Y=1) + P(X=1 | Y=1) = 1, whereas P(X=0 | Y=0) conditions on a different event
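A quick numerical check of the complement relation, again with a hypothetical joint table:

```python
# Hypothetical joint over binary X, Y.
joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

p_y1 = joint[(0, 1)] + joint[(1, 1)]          # P(Y = 1)
p_x0_given_y1 = joint[(0, 1)] / p_y1          # P(X = 0 | Y = 1)
p_x1_given_y1 = joint[(1, 1)] / p_y1          # P(X = 1 | Y = 1)

# Complementary events under the same conditioning sum to 1.
assert abs(p_x0_given_y1 + p_x1_given_y1 - 1.0) < 1e-12
# P(X = 0 | Y = 0) lives under a different conditioning event, so no such relation holds.
```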
Number of independent parameters • Assume X and Y take Boolean values {0,1}. How many independent parameters do you need to fully specify: • the marginal probability of X? • the joint probability P(X,Y)? • the conditional probability P(X|Y)?
Number of parameters • Assume X and Y take Boolean values {0,1}. • Marginal probability of X: P(X=0), 1 parameter only, since P(X=1) = 1 - P(X=0) • Joint probability P(X,Y): P(X=0,Y=0), P(X=0,Y=1), P(X=1,Y=0), i.e. 3 parameters, since all four entries must sum to 1 • Conditional probability P(X|Y): P(X=0|Y=0), P(X=0|Y=1), i.e. 2 parameters, one per value of Y
Number of parameters • What about P(X | Y, Z)? How many independent parameters do you need to fully specify it? • Assume X takes m values, Y takes n values, and Z takes q values • Number of independent parameters: (m-1) · n · q, i.e. one distribution over the m values of X (with m-1 free entries) for each of the n · q joint settings of (Y, Z)
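The same counting rule can be written as a small helper; the function below is illustrative, not from the course materials:

```python
def num_independent_params(child_values, parent_values=()):
    """Independent parameters needed to specify P(X | parents).

    For each joint setting of the parents we need one distribution over the
    child's values, and each distribution has (child_values - 1) free
    entries because it must sum to 1.
    """
    n_parent_settings = 1
    for k in parent_values:
        n_parent_settings *= k
    return (child_values - 1) * n_parent_settings

print(num_independent_params(2))          # marginal P(X), binary X         -> 1
print(num_independent_params(2, (2,)))    # conditional P(X | Y)            -> 2
print(num_independent_params(2, (2, 2)))  # P(X | Y, Z), all binary         -> 4
# Joint P(X, Y) over binary X, Y: treat (X, Y) as one variable with 4 values -> 3
print(num_independent_params(4))
```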
Graphical models • A graphical model is a way of representing probabilistic relationships between random variables • Variables are represented by nodes • Edges indicate probabilistic relationships: You miss the bus → Arrive class late
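One simple way to hold such a model in code is a dict of parents plus one conditional probability table per node. A minimal sketch for the two-node "miss the bus → arrive class late" example; the structure matches the slide, but the probabilities are made up:

```python
# Nodes and their parents encode the graph structure.
parents = {
    "miss_bus": [],
    "arrive_class_late": ["miss_bus"],
}

# One CPT per node: P(node = 1 | parent values). All numbers are hypothetical.
cpt = {
    "miss_bus": {(): 0.1},                          # P(miss_bus = 1)
    "arrive_class_late": {(0,): 0.05, (1,): 0.8},   # P(late = 1 | miss_bus = 0 or 1)
}

def prob(node, value, parent_values=()):
    """P(node = value | parents = parent_values) for binary variables."""
    p1 = cpt[node][tuple(parent_values)]
    return p1 if value == 1 else 1.0 - p1

# P(miss the bus, but still arrive on time) = P(miss_bus=1) * P(late=0 | miss_bus=1)
print(prob("miss_bus", 1) * prob("arrive_class_late", 0, (1,)))   # 0.1 * 0.2 ≈ 0.02
```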
Serial connection (X → Y → Z) • Are X and Z independent? • No, X and Z are not independent
Serial connection (X → Y → Z) • Is X conditionally independent of Z given Y? • Yes, X and Z are conditionally independent given Y
How can we show it? • For the serial connection X → Y → Z, the joint factorizes as P(X, Y, Z) = P(X) P(Y | X) P(Z | Y) • So P(Z | X, Y) = P(X, Y, Z) / P(X, Y) = [P(X) P(Y | X) P(Z | Y)] / [P(X) P(Y | X)] = P(Z | Y) • Since P(Z | X, Y) = P(Z | Y), X is conditionally independent of Z given Y
An example case • Studied late last night → Wake up late → Arrive class late (a serial connection)
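The same conclusion can be checked numerically for the chain on this slide. The sketch below uses made-up conditional probabilities; the point is only that P(Z = 1 | X, Y) stops depending on X once Y is fixed, while P(Z = 1 | X) does depend on X:

```python
from itertools import product

# Hypothetical CPTs for the chain X -> Y -> Z
p_x = {1: 0.3, 0: 0.7}              # P(studied late last night)
p_y_given_x = {1: 0.8, 0: 0.2}      # P(wake up late = 1 | X = x)
p_z_given_y = {1: 0.9, 0: 0.1}      # P(arrive class late = 1 | Y = y)

def joint(x, y, z):
    """Joint factorizes as P(X) P(Y | X) P(Z | Y)."""
    py = p_y_given_x[x] if y == 1 else 1 - p_y_given_x[x]
    pz = p_z_given_y[y] if z == 1 else 1 - p_z_given_y[y]
    return p_x[x] * py * pz

def cond_z1(given):
    """P(Z = 1 | given), where given is a dict of observed values for X and/or Y."""
    num = sum(joint(x, y, 1) for x, y in product((0, 1), repeat=2)
              if all({"X": x, "Y": y}[k] == v for k, v in given.items()))
    den = sum(joint(x, y, z) for x, y, z in product((0, 1), repeat=3)
              if all({"X": x, "Y": y}[k] == v for k, v in given.items()))
    return num / den

# Once Y is fixed, X carries no extra information about Z:
print(cond_z1({"Y": 1, "X": 0}), cond_z1({"Y": 1, "X": 1}))   # both 0.9
print(cond_z1({"X": 0}), cond_z1({"X": 1}))                   # differ: not marginally independent
```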
Common cause • Z: Age, X: Shoe Size, Y: Gray Hair (Shoe Size ← Age → Gray Hair) • X and Y are not marginally independent • X and Y are conditionally independent given Z
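A numerical illustration of both claims, with hypothetical (binarized) probabilities for Age, Shoe Size, and Gray Hair:

```python
from itertools import product

# Common cause Z = Age with children X = Shoe Size and Y = Gray Hair,
# all binarized (e.g. "adult", "large shoes", "has gray hair"). Numbers are made up.
p_z = {1: 0.5, 0: 0.5}
p_x_given_z = {1: 0.9, 0: 0.2}     # P(X = 1 | Z = z)
p_y_given_z = {1: 0.7, 0: 0.05}    # P(Y = 1 | Z = z)

def joint(x, y, z):
    px = p_x_given_z[z] if x == 1 else 1 - p_x_given_z[z]
    py = p_y_given_z[z] if y == 1 else 1 - p_y_given_z[z]
    return p_z[z] * px * py

def p(event):
    """Probability of an event given as a dict, e.g. {"X": 1, "Z": 0}."""
    return sum(joint(x, y, z) for x, y, z in product((0, 1), repeat=3)
               if all({"X": x, "Y": y, "Z": z}[k] == v for k, v in event.items()))

# Marginally dependent: P(Y=1 | X=1) != P(Y=1)
print(p({"Y": 1, "X": 1}) / p({"X": 1}), p({"Y": 1}))
# Conditionally independent given Z: P(Y=1 | X=1, Z=1) == P(Y=1 | Z=1)
print(p({"Y": 1, "X": 1, "Z": 1}) / p({"X": 1, "Z": 1}),
      p({"Y": 1, "Z": 1}) / p({"Z": 1}))
```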
Explaining away • Flu → Sneeze ← Allergy, with the parents Flu and Allergy as X and Z and the child Sneeze as Y • X and Z are marginally independent • X and Z are conditionally dependent given Y
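And the mirror-image check for the collider, again with made-up numbers: Flu and Allergy start out independent, but once Sneeze is observed, learning about Allergy changes our belief about Flu ("explaining away"):

```python
from itertools import product

# Collider: Flu and Allergy are independent causes of Sneeze. Numbers are hypothetical.
p_flu, p_allergy = 0.1, 0.2
# P(Sneeze = 1 | Flu, Allergy): either cause makes sneezing likely.
p_sneeze = {(0, 0): 0.05, (0, 1): 0.8, (1, 0): 0.8, (1, 1): 0.95}

def joint(f, a, s):
    pf = p_flu if f == 1 else 1 - p_flu
    pa = p_allergy if a == 1 else 1 - p_allergy
    ps = p_sneeze[(f, a)] if s == 1 else 1 - p_sneeze[(f, a)]
    return pf * pa * ps

def p(event):
    """Probability of an event given as a dict over F (Flu), A (Allergy), S (Sneeze)."""
    return sum(joint(f, a, s) for f, a, s in product((0, 1), repeat=3)
               if all({"F": f, "A": a, "S": s}[k] == v for k, v in event.items()))

# Marginally independent: P(Flu=1 | Allergy=1) == P(Flu=1)
print(p({"F": 1, "A": 1}) / p({"A": 1}), p({"F": 1}))
# Conditioning on Sneeze couples them: Allergy=1 "explains away" the sneeze,
# so P(Flu=1 | Sneeze=1, Allergy=1) < P(Flu=1 | Sneeze=1)
print(p({"F": 1, "S": 1, "A": 1}) / p({"S": 1, "A": 1}),
      p({"F": 1, "S": 1}) / p({"S": 1}))
```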
D-separation • X and Z are conditionally independent given Y if Y d-separates X and Z, i.e. every path between X and Z is blocked by Y • A node on a path blocks it if it is a chain or common-cause node that is observed (in Y), or a collider such that neither the collider nor any of its descendants is observed
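This blocking rule can also be tested mechanically: X and Z are d-separated by Y exactly when they are disconnected in the moralized ancestral graph after the observed nodes are removed. A sketch of that test; the example DAG at the bottom is hypothetical (a chain plus a collider), not the graph from the following slides:

```python
def d_separated(dag, xs, zs, ys):
    """Return True if xs is d-separated from zs given ys in the DAG.

    dag maps each node to the list of its parents. Uses the classic reduction:
    keep only ancestors of xs | zs | ys, moralize (connect co-parents), drop
    edge directions, delete the observed nodes ys, and check whether xs can
    still reach zs.
    """
    # 1. Ancestral subgraph of all nodes involved in the query.
    relevant = set(xs) | set(zs) | set(ys)
    stack = list(relevant)
    while stack:
        node = stack.pop()
        for parent in dag[node]:
            if parent not in relevant:
                relevant.add(parent)
                stack.append(parent)

    # 2. Moralize: undirected child-parent edges plus "married" co-parents.
    neighbors = {n: set() for n in relevant}
    for child in relevant:
        ps = list(dag[child])
        for p in ps:
            neighbors[child].add(p)
            neighbors[p].add(child)
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                neighbors[p].add(q)
                neighbors[q].add(p)

    # 3. Remove the observed nodes and test reachability from xs to zs.
    seen = set(x for x in xs if x not in ys)
    stack = list(seen)
    while stack:
        node = stack.pop()
        if node in zs:
            return False            # an unblocked path survives
        for nb in neighbors[node]:
            if nb not in ys and nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return True

# Hypothetical example DAG: a chain X -> Y -> Z plus a collider X -> W <- Z.
dag = {"X": [], "Y": ["X"], "Z": ["Y"], "W": ["X", "Z"]}
print(d_separated(dag, {"X"}, {"Z"}, {"Y"}))        # True: Y blocks the chain, W is unobserved
print(d_separated(dag, {"X"}, {"Z"}, {"Y", "W"}))   # False: observing the collider W opens a path
```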
D-separation example • Are B and C independent given A? • Yes • A is observed, so it blocks the path between B and C • The other connecting node is not observed, and neither are its descendants, so the path through it is also blocked
D-separation example • Are A and F independent given E?
Naïve Bayes Model • Variables: J, D, C, R • J: The person is a junior • D: The person knows calculus • C: The person lives on campus • R: Saw “Return of the King” more than once
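Under the naïve Bayes assumption, and assuming (as the variable ordering suggests) that J is the class variable with D, C, R as its children, the joint factorizes as P(J, D, C, R) = P(J) P(D | J) P(C | J) P(R | J). A sketch with hypothetical CPT values:

```python
# Hypothetical CPTs for a naive Bayes model with class J and features D, C, R.
p_j = 0.25                              # P(J = 1): prior probability of being a junior
p_feature_given_j = {                   # P(feature = 1 | J = j)
    "D": {1: 0.7, 0: 0.3},
    "C": {1: 0.8, 0: 0.4},
    "R": {1: 0.5, 0: 0.2},
}

def joint(j, features):
    """P(J = j, D, C, R) under the naive Bayes factorization."""
    p = p_j if j == 1 else 1 - p_j
    for name, value in features.items():
        p1 = p_feature_given_j[name][j]
        p *= p1 if value == 1 else 1 - p1
    return p

# Posterior over J for someone who knows calculus, lives on campus,
# and has not seen the movie more than once.
obs = {"D": 1, "C": 1, "R": 0}
unnorm = {j: joint(j, obs) for j in (0, 1)}
print(unnorm[1] / (unnorm[0] + unnorm[1]))   # P(J = 1 | D = 1, C = 1, R = 0)
```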