INC 551 Artificial Intelligence Lecture 11 Machine Learning (Continued)
Bayes Classifier Bayes Rule: P(A | B) = P(B | A) P(A) / P(B)
Play Tennis Example John wants to play tennis every day. However, on some days the conditions are not good, so he decides not to play. The following table is the record of the last 14 days.
Question: today's condition is <Sunny, Mild Temperature, Normal Humidity, Strong Wind>. Do you think John will play tennis?
Find P(Play = Yes | condition) and P(Play = No | condition). We need the naïve Bayes assumption: assume all features are independent given the class, so P(condition | class) is the product of the individual feature probabilities. Now, let's look at each property.
Since P(condition) is the same for both classes, we can compare the unnormalized scores directly and conclude that John is more likely to play tennis today. Note that we do not need to compute P(condition) to get the answer. However, if we want the actual probability, P(condition) is obtained by normalization: divide each score by the sum of the two scores.
Therefore, John is more likely to play tennis today, with a 58% chance.
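As a check, here is a minimal sketch of the whole computation. The 14-day table itself did not survive into these notes, so the counts below assume the classic PlayTennis dataset (Mitchell, 1997); that assumption reproduces the 58% figure quoted above.

```python
# Naive Bayes for the PlayTennis example.
# Assumption: the 14-day record is the classic PlayTennis table
# (Mitchell, 1997), which is consistent with the 58% result above.

# Class priors from the assumed table: 9 "Yes" days, 5 "No" days.
prior = {"Yes": 9 / 14, "No": 5 / 14}

# Per-feature conditional probabilities for today's condition
# <Sunny, Mild, Normal humidity, Strong wind>, counted per class.
likelihood = {
    "Yes": {"Sunny": 2 / 9, "Mild": 4 / 9, "Normal": 6 / 9, "Strong": 3 / 9},
    "No":  {"Sunny": 3 / 5, "Mild": 2 / 5, "Normal": 1 / 5, "Strong": 3 / 5},
}

condition = ["Sunny", "Mild", "Normal", "Strong"]

# Naive Bayes: P(class | condition) is proportional to
# P(class) * product of P(feature | class).
score = {}
for cls in ("Yes", "No"):
    s = prior[cls]
    for feature in condition:
        s *= likelihood[cls][feature]
    score[cls] = s

# Normalize so the two scores sum to 1; this stands in for
# computing P(condition) explicitly.
total = score["Yes"] + score["No"]
for cls in score:
    print(cls, round(score[cls] / total, 2))  # Yes 0.58, No 0.42
```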
Learning and Bayes Classifier Learning, in this setting, is the adjustment of the probability values (priors and likelihoods) used to compute the posterior probability as new data is added.
Classifying Object Example Suppose we want to classify objects into two classes, A and B. There are two features that we can measure from each object, f1 and f2. We sample four objects at random to form a database and classify them by hand. Now we have another sample with f1 = 3.2, f2 = 4.2, and we want to know what class it is.
We want to find P(A | f1, f2) and P(B | f1, f2). Using Bayes rule, each is proportional to P(f1, f2 | class) P(class). From the table, we count the number of events: with two samples in each class, the priors are P(A) = P(B) = 2/4 = 0.5.
Find P(f1, f2 | class). Again, we use the naïve Bayes assumption that the features are independent given the class, so P(f1, f2 | class) = P(f1 | class) P(f2 | class). To find these likelihoods we need to assume a probability distribution, because the features take continuous values. The most common choice is the Gaussian (normal) distribution.
Gaussian distribution There are two parameters: the mean µ and the standard deviation σ. Using the maximum likelihood principle, the mean and the variance can be estimated from the samples in the database.
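Written out (the original slide's equations did not survive extraction; these are the standard density and maximum-likelihood estimates):

```latex
p(x \mid \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}
  \exp\!\left( -\frac{(x - \mu)^2}{2\sigma^2} \right),
\qquad
\hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} x_i,
\qquad
\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \hat{\mu})^2
```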
Class A f1: mean = (2.3 + 1.5)/2 = 1.9, SD = 0.4; f2: mean = (5.4 + 4.4)/2 = 4.9, SD = 0.5
Class B f1: mean = (5.2 + 4.5)/2 = 4.85, SD = 0.35; f2: mean = (1.2 + 2.1)/2 = 1.65, SD = 0.45
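A sketch of the classification step, using the means and SDs above. The equal priors are an inference from the database having two samples of each class (anything not shown on the slides, such as that prior, is an assumption here).

```python
import math

def gaussian_pdf(x, mean, sd):
    """Gaussian density N(x; mean, sd)."""
    return math.exp(-(x - mean) ** 2 / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

# Per-class (mean, SD) for each feature, taken from the slide above.
params = {
    "A": {"f1": (1.9, 0.4),   "f2": (4.9, 0.5)},
    "B": {"f1": (4.85, 0.35), "f2": (1.65, 0.45)},
}
prior = {"A": 0.5, "B": 0.5}  # two samples per class in the database

query = {"f1": 3.2, "f2": 4.2}

# Naive Bayes score: P(class) * P(f1 | class) * P(f2 | class).
score = {}
for cls in params:
    s = prior[cls]
    for feat, (mean, sd) in params[cls].items():
        s *= gaussian_pdf(query[feat], mean, sd)
    score[cls] = s

print(score)                      # Class A's score is orders of magnitude larger
print(max(score, key=score.get))  # -> 'A'
```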
Therefore, the likelihood P(f1 = 3.2, f2 = 4.2 | A) is far larger than P(f1 = 3.2, f2 = 4.2 | B). From Bayes rule, with equal priors the posterior for Class A dominates, so we should classify the sample as Class A.
Nearest Neighbor Classification NN is considered a model-free classification method. Nearest Neighbor's principle: the unknown sample is classified as the same class as the sample at the closest distance.
[Figure: two-dimensional feature space (Feature 1 vs. Feature 2); the sample at the closest distance to the unknown point is a circle.] We classify the sample as a circle.
Distance between Samples Samples X and Y have multi-dimensional feature values. The distance between samples X and Y can be calculated with the Minkowski formula: d(X, Y) = ( Σ_i |x_i − y_i|^k )^(1/k)
If k = 1, the distance is called the Manhattan distance. If k = 2, the distance is called the Euclidean distance. If k = ∞, the distance is the maximum absolute feature difference. Euclidean is the best-known and the preferred one.
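A minimal sketch of the formula for the three special cases, using the query point and one database point from this lecture as sample vectors:

```python
def minkowski(x, y, k):
    """Minkowski distance between feature vectors x and y.

    k=1 -> Manhattan, k=2 -> Euclidean, k=float('inf') -> max difference.
    """
    diffs = [abs(a - b) for a, b in zip(x, y)]
    if k == float("inf"):
        return max(diffs)  # limit case: largest per-feature difference
    return sum(d ** k for d in diffs) ** (1 / k)

x, y = (3.2, 4.2), (2.3, 5.4)
print(minkowski(x, y, 1))             # ~2.1 (Manhattan)
print(minkowski(x, y, 2))             # 1.5  (Euclidean)
print(minkowski(x, y, float("inf")))  # 1.2  (max feature difference)
```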
Classifying Object with NN Now we have another sample, f1 = 3.2, f2 = 4.2, and we want to know its class.
Compute the Euclidean distance from it to all other samples. The unknown sample is closest to the second sample in the database, so we classify it as the same class as that sample, which is Class A.
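A sketch of this nearest-neighbor step. The four database points are reconstructed by pairing the f1 and f2 values from the Gaussian example above; that pairing is an assumption, but either pairing puts a Class A sample nearest to the query.

```python
import math

# Database reconstructed from the f1/f2 values used earlier; the exact
# pairing of f1 with f2 within each class is an assumption.
database = [
    ((2.3, 5.4), "A"),
    ((1.5, 4.4), "A"),
    ((5.2, 1.2), "B"),
    ((4.5, 2.1), "B"),
]
query = (3.2, 4.2)

def euclidean(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

# 1-NN: take the class of the single closest database sample.
nearest = min(database, key=lambda item: euclidean(query, item[0]))
print(nearest)  # a Class A point, at distance 1.5, is the closest
```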
K-Nearest Neighbor (KNN) Instead of using only the single closest sample to decide the class, we take a majority vote among the k closest samples.
[Figure: example with k = 3 — the majority of the three nearest neighbors are circles, so the data is classified as a circle.]
[Figure: the same data with k = 5 — the majority of the five nearest neighbors are stars, so the data is classified as a star.]
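A sketch of the voting rule. The 2-D points below are hypothetical, chosen only to reproduce the behavior in the two figures above (k = 3 gives a circle, k = 5 gives a star); with k = 1 the rule reduces to plain nearest neighbor.

```python
import math
from collections import Counter

def knn_classify(query, database, k):
    """Classify `query` by majority vote among its k nearest samples.

    `database` is a list of ((features...), label) pairs.
    """
    by_distance = sorted(
        database,
        key=lambda item: math.dist(query, item[0]),  # Euclidean distance
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Hypothetical points illustrating that the choice of k can flip the answer.
data = [((1.0, 1.0), "circle"), ((1.2, 0.8), "circle"),
        ((0.0, 0.2), "star"), ((0.2, 0.0), "star"), ((0.1, 0.3), "star")]
print(knn_classify((0.6, 0.6), data, 3))  # 2 circles among 3 nearest -> 'circle'
print(knn_classify((0.6, 0.6), data, 5))  # 3 stars among all 5     -> 'star'
```

Note the design point the two figures make: the same unknown sample can be assigned to different classes depending on k, so k is a parameter that must be chosen (odd values avoid ties in two-class problems).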