Intelligent Systems (AI-2), Computer Science CPSC 422, Lecture 18, Feb 26, 2014. Slide sources: Raymond J. Mooney (University of Texas at Austin); D. Koller (Stanford CS, Probabilistic Graphical Models)
Lecture Overview • Indicator function • P(X,Y) vs. P(X|Y) and Naïve Bayes • Model P(Y|X) explicitly with Markov Networks • Parameterization • Inference • Generative vs. Discriminative models
P(X,Y) vs. P(Y|X) Assume that you always observe a set of variables X = {X1…Xn} and you want to predict one or more variables Y = {Y1…Ym} You can model P(X,Y) and then infer P(Y|X)
P(X,Y) vs. P(Y|X) With a Bnet we can represent a joint as the product of conditional probabilities. With a Markov Network we can represent a joint as the product of factors. We will see that Markov Networks are also suitable for representing the conditional probability P(Y|X) directly
Directed vs. Undirected • Factorization
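To make the contrast concrete, here is a minimal sketch on a hypothetical pair of binary variables A and B (not from the slides): a Bnet joint factorizes as a product of CPTs and is normalized by construction, while a Markov Network joint is a product of non-negative factors normalized by the partition function Z.

```python
# Directed (Bnet): joint is a product of CPTs, already normalized.
# P(A, B) = P(A) * P(B | A)
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}

def joint_bnet(a, b):
    return P_A[a] * P_B_given_A[a][b]

# Undirected (Markov net): joint is a product of factors,
# normalized by the partition function Z.
# P(A, B) = (1/Z) * phi(A, B), with hypothetical factor values:
phi = {(0, 0): 4.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}
Z = sum(phi.values())  # sum over all assignments

def joint_mn(a, b):
    return phi[(a, b)] / Z

# Both factorizations define proper distributions (sum to 1):
assert abs(sum(joint_bnet(a, b) for a in (0, 1) for b in (0, 1)) - 1.0) < 1e-9
assert abs(sum(joint_mn(a, b) for a in (0, 1) for b in (0, 1)) - 1.0) < 1e-9
```

Note that the factor values phi need not be probabilities; only the normalization by Z makes the undirected product a distribution.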
Naïve Bayesian Classifier P(Y,X) A very simple and successful Bnet that allows classifying entities into a set of classes Y1, given a set of features (X1…Xn) • Example: • Determine whether an email is spam (only two classes, spam=T and spam=F) • Useful attributes of an email? • Assumptions • The value of each attribute depends on the classification • (Naïve) The attributes are independent of each other given the classification • P(“bank” | “account”, spam=T) = P(“bank” | spam=T)
Naïve Bayesian Classifier for Email Spam • The corresponding Bnet represents: P(Y1, X1…Xn) Email Spam • What is the structure? Words: Email contains “free” Email contains “money” Email contains “ubc” Email contains “midterm”
NB Classifier for Email Spam: Usage Can we derive: P(Y1| X1…Xn) for any x1…xn? “free money for you now” Email Spam Email contains “free” Email contains “money” Email contains “ubc” Email contains “midterm” But you can also perform any other inference, e.g., P(X1| X3)
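The classification query above can be sketched as follows. All probability values and the `p_spam_given_words` helper are hypothetical, chosen only to illustrate how P(Y1 | X1…Xn) follows from the prior and the per-word CPTs via Bayes rule.

```python
# Hypothetical prior P(Y1) and CPTs P(Xi | Y1); not values from the lecture.
P_spam = {True: 0.3, False: 0.7}
P_word_given_spam = {
    "free":    {True: 0.80, False: 0.10},
    "money":   {True: 0.60, False: 0.05},
    "ubc":     {True: 0.05, False: 0.40},
    "midterm": {True: 0.02, False: 0.30},
}

def p_spam_given_words(present):
    """P(Y1=spam | X1..Xn): prior times product of per-word likelihoods
    (the naive independence assumption), normalized over both classes."""
    scores = {}
    for y in (True, False):
        s = P_spam[y]
        for word, cpt in P_word_given_spam.items():
            p = cpt[y]
            s *= p if word in present else (1 - p)
        scores[y] = s
    return scores[True] / (scores[True] + scores[False])

# "free money for you now" contains "free" and "money" but not "ubc"/"midterm"
print(p_spam_given_words({"free", "money"}))
```

With these made-up numbers the email comes out very likely spam; an email mentioning “ubc” and “midterm” instead would score low.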
Lecture Overview • P(X,Y) vs. P(X|Y) and Naïve Bayes • Model P(Y|X) explicitly with Markov Networks • Parameterization • Inference • Generative vs. Discriminative models
Model P(Y|X) explicitly • This can be done with a special case of Markov Networks where all the Xi are always observed • These are called Conditional Random Fields (CRFs) • Simple case P(Y1| X1…Xn)
What are the Parameters?
Let’s derive the probabilities we need (figure: naïve Markov network with Y1 connected to X1, X2, …, Xn; the derivation is developed across several slides)
Sigmoid Function used in Logistic Regression • Great practical interest • Number of parameters wi is linear instead of exponential in the number of parents • Natural model for many real-world applications • Naturally aggregates the influence of different parents (figure: Y1 connected to X1, X2, …, Xn)
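A minimal sketch of this parameterization, with hypothetical weights (one wi per feature, so the parameter count is linear in n rather than exponential in the number of parents):

```python
import math

def sigmoid(t):
    """The logistic sigmoid: 1 / (1 + e^-t)."""
    return 1.0 / (1.0 + math.exp(-t))

def p_y1_given_x(x, w, w0):
    # Naive Markov / logistic regression form:
    # P(Y1 = 1 | x1..xn) = sigmoid(w0 + sum_i wi * xi)
    return sigmoid(w0 + sum(wi * xi for wi, xi in zip(w, x)))

# Hypothetical weights for the spam features "free", "money", "ubc", "midterm"
w = [2.0, 1.5, -3.0, -2.5]
w0 = -1.0

# Email containing "free" and "money" only:
print(p_y1_given_x([1, 1, 0, 0], w, w0))
```

Each wi simply adds its contribution to the argument of the sigmoid, which is how the model naturally aggregates the influence of the different parents.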
Logistic Regression as a Markov Net (CRF) Logistic regression is a simple Markov Net (a CRF), aka a naïve Markov model Y … Xn X2 X1 • But it only models the conditional distribution P(Y | X), not the full joint P(X,Y)
Naïve Bayes vs. Logistic Regression Y Naïve Bayes … Xn X2 X1 Generative Discriminative Conditional Y Logistic Regression (Naïve Markov) Xn X2 X1 …
Generative vs. Discriminative Models • Generative models (like Naïve Bayes): not directly designed to maximize performance on classification. They model the joint distribution P(X,Y). • Classification is then done using Bayesian inference • But a generative model can also be used to perform any other inference task, e.g. P(X1 | X2, …, Xn) • “Jack of all trades, master of none.” • Discriminative models (like CRFs): specifically designed and trained to maximize performance of classification. They only model the conditional distribution P(Y | X). • By focusing on modeling the conditional distribution, they generally perform better on classification than generative models when given a reasonable amount of training data.
On Fri: Sequence Labeling .. Y2 YT Y1 HMM … XT X2 X1 Generative Discriminative Conditional .. Y2 YT Y1 Linear-chain CRF … XT X2 X1
Learning Goals for today’s class • You can: • Describe how a Naïve Bayes model models P(X,Y) and can be used for computing P(Y|X) • Describe a natural parameterization for a Naïve Markov model (which is a simple CRF) • Derive how P(Y|X) can be computed for a Naïve Markov model • Explain the discriminative vs. generative distinction and its implications
Next class Linear-chain CRFs • To Do Revise generative temporal models • Assignment-1 marked, Midterm will be marked by Fri