Lab 5 TA: Nouf Al-harbi (nouf200@hotmail.com)
Lab objective: • Applying Bayesian classification • Classify an input feature value into one of two classes • Compute and draw the posteriors for 2 classes
Bayesian Decision Theory Decide ω1 if P(ω1 | x) > P(ω2 | x), and decide ω2 if P(ω2 | x) > P(ω1 | x)
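As a quick illustrative sketch of this decision rule (written in Python rather than Matlab, and using the priors and likelihoods given later in this lab), note that the evidence P(x) is the same on both sides of the comparison, so comparing P(x|ω)P(ω) is enough:

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Normal density N(mu, sigma); sigma is the standard deviation
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def decide(x, mu1=20, sigma1=3, mu2=30, sigma2=2, Pw1=1/3, Pw2=2/3):
    # Decide w1 if P(w1|x) > P(w2|x); P(x) cancels out of the comparison,
    # so we only compare likelihood * prior for each class.
    if gaussian_pdf(x, mu1, sigma1) * Pw1 > gaussian_pdf(x, mu2, sigma2) * Pw2:
        return 'w1'
    return 'w2'
```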
Part 1 Classify a given input feature value into one of two classes
function rslt = BayesClassifier(x)   % output: w1 or w2
Applying part 1 Practical Learning 1 • Write a Matlab function that classifies a feature value into one of two classes w1, w2 with these properties: • P(x|w1) ≈ N(20, 3) • P(x|w2) ≈ N(30, 2) • P(w1) = 1/3 • P(w2) = 2/3 • Then use it to classify the input value x = 15
Full code

function rslt = BayesClassifier(x)
  Pw1 = 1/3;    % prior P(w1)
  Pw2 = 2/3;    % prior P(w2)
  mu1 = 20;
  sigma1 = 3;   % std of P(x|w1) = N(20,3)
  mu2 = 30;
  sigma2 = 2;   % std of P(x|w2) = N(30,2)
  % normalfn is a helper that evaluates the normal pdf N(mu,sigma) at x
  Pxw1 = normalfn(x, mu1, sigma1);
  Pxw2 = normalfn(x, mu2, sigma2);
  Px = Pxw1 * Pw1 + Pxw2 * Pw2;     % evidence P(x)
  Pw1x = Pxw1 * Pw1 / Px;           % posterior P(w1|x)
  Pw2x = Pxw2 * Pw2 / Px;           % posterior P(w2|x)
  if (Pw1x > Pw2x)
      rslt = sprintf('\n%d belongs to w1, with an error = %g\n', x, Pw2x);
  else
      rslt = sprintf('\n%d belongs to w2, with an error = %g\n', x, Pw1x);
  end
end
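For comparison, here is a minimal Python sketch of the same classifier. The helper gaussian_pdf is a hypothetical stand-in for the lab's normalfn, and the function returns the decision together with the probability of error (the posterior of the losing class):

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Normal density N(mu, sigma); sigma is the standard deviation
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bayes_classifier(x):
    Pw1, Pw2 = 1/3, 2/3              # priors
    Pxw1 = gaussian_pdf(x, 20, 3)    # likelihood P(x|w1) = N(20,3)
    Pxw2 = gaussian_pdf(x, 30, 2)    # likelihood P(x|w2) = N(30,2)
    Px = Pxw1 * Pw1 + Pxw2 * Pw2     # evidence P(x)
    Pw1x = Pxw1 * Pw1 / Px           # posterior P(w1|x)
    Pw2x = Pxw2 * Pw2 / Px           # posterior P(w2|x)
    if Pw1x > Pw2x:
        return 'w1', Pw2x            # error = posterior of the rejected class
    return 'w2', Pw1x
```

For x = 15 the first class wins by a wide margin, since 15 is over seven standard deviations below the mean of the second class.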
Part 2 Compute and draw the posteriors for 2 classes
Applying part 2 Practical Learning 2 • Write a Matlab function that computes the posterior probabilities P(w1|x) and P(w2|x) given this information: • P(x|w1) ≈ N(20, 3) • P(x|w2) ≈ N(30, 2) • P(w1) = 1/3 • P(w2) = 2/3 • Plot both likelihoods and posteriors • Draw decision regions
Full code

function BayesClassifier2()
  Pw1 = 1/3;
  Pw2 = 2/3;
  mu1 = 20;  sigma1 = 3;
  mu2 = 30;  sigma2 = 2;
  % cover both densities out to 4 standard deviations
  x1min = mu1 - 4 * sigma1;  x1max = mu1 + 4 * sigma1;
  x2min = mu2 - 4 * sigma2;  x2max = mu2 + 4 * sigma2;
  xmin = min(x1min, x2min);
  xmax = max(x1max, x2max);
  x = xmin : 0.1 : xmax;
  n = length(x);
  % likelihoods P(x|w1) and P(x|w2) over the grid
  Pxw1 = zeros(1, n);
  Pxw2 = zeros(1, n);
  for i = 1:n
      Pxw1(i) = normalfn(x(i), mu1, sigma1);
      Pxw2(i) = normalfn(x(i), mu2, sigma2);
  end
  plot(x, Pxw1, 'r-', x, Pxw2, 'b:');
  title('Likelihoods plot');
  xlabel('feature values');
  ylabel('likelihood');
  % posteriors P(w1|x) and P(w2|x)
  Pw1x = zeros(1, n);
  Pw2x = zeros(1, n);
  for i = 1:n
      Px = Pxw1(i) * Pw1 + Pxw2(i) * Pw2;
      Pw1x(i) = Pxw1(i) * Pw1 / Px;
      Pw2x(i) = Pxw2(i) * Pw2 / Px;
  end
  R = Pw1x > Pw2x;   % decision regions: 1 where w1 is chosen, 0 where w2 is
  figure, plot(x, Pw1x, 'r-', x, Pw2x, 'b:', x, R, 'g-');
  title('Posteriors plot');
  xlabel('feature values');
  ylabel('posterior');
end
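The same posterior sweep can be sketched in Python (plotting omitted; gaussian_pdf again stands in for the lab's normalfn helper). The list R plays the role of the decision-region indicator from the Matlab code:

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Normal density N(mu, sigma); sigma is the standard deviation
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu1, sigma1, Pw1 = 20, 3, 1/3
mu2, sigma2, Pw2 = 30, 2, 2/3

# Cover both densities out to 4 standard deviations, step 0.1
xmin = min(mu1 - 4 * sigma1, mu2 - 4 * sigma2)
xmax = max(mu1 + 4 * sigma1, mu2 + 4 * sigma2)
xs = [xmin + 0.1 * i for i in range(int((xmax - xmin) / 0.1) + 1)]

posteriors = []
for x in xs:
    Pxw1 = gaussian_pdf(x, mu1, sigma1)
    Pxw2 = gaussian_pdf(x, mu2, sigma2)
    Px = Pxw1 * Pw1 + Pxw2 * Pw2           # evidence P(x)
    posteriors.append((x, Pxw1 * Pw1 / Px, Pxw2 * Pw2 / Px))

# Decision region indicator: 1 where w1 wins, 0 where w2 wins
R = [1 if p1 > p2 else 0 for (_, p1, p2) in posteriors]
```

Since the two posteriors must sum to 1 at every x, the green indicator curve in the Matlab plot switches from 1 to 0 exactly where the red and blue posterior curves cross.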
H.W. 3: It has been uploaded on the website.