Ch. 18 – Learning
Supplemental slides for CSE 327, Prof. Jeff Heflin
Decision Tree Learning

function Dec-Tree-Learn(examples, attribs, parent_examples) returns a decision tree
  if examples is empty then return Plurality-Value(parent_examples)
  else if all examples have the same classification then return the classification
  else if attribs is empty then return Plurality-Value(examples)
  else
    A ← argmax_{a ∈ attribs} Importance(a, examples)
    tree ← a new decision tree with root test A
    for each value v_k of A do
      exs ← {e : e ∈ examples and e.A = v_k}
      subtree ← Dec-Tree-Learn(exs, attribs – A, examples)
      add a branch to tree with label (A = v_k) and subtree subtree
    return tree

From Figure 18.5, p. 702
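The pseudocode above can be sketched in Python. The dict-based example representation, the attribute `domains` map, and information gain as the Importance measure are assumptions made for illustration; the slide's pseudocode leaves Importance abstract.

```python
import math
from collections import Counter

def plurality_value(examples):
    # Most common classification among the examples (ties broken arbitrarily)
    return Counter(e["class"] for e in examples).most_common(1)[0][0]

def entropy(examples):
    n = len(examples)
    counts = Counter(e["class"] for e in examples).values()
    return -sum((c / n) * math.log2(c / n) for c in counts) if n else 0.0

def info_gain(attr, examples, domains):
    # Importance(a, examples) measured as information gain (an assumption here)
    remainder = 0.0
    for v in domains[attr]:
        exs = [e for e in examples if e[attr] == v]
        if exs:
            remainder += len(exs) / len(examples) * entropy(exs)
    return entropy(examples) - remainder

def dec_tree_learn(examples, attribs, parent_examples, domains):
    if not examples:
        return plurality_value(parent_examples)
    classes = {e["class"] for e in examples}
    if len(classes) == 1:
        return classes.pop()              # all examples have the same classification
    if not attribs:
        return plurality_value(examples)
    a = max(attribs, key=lambda att: info_gain(att, examples, domains))
    tree = {"test": a, "branches": {}}
    for vk in domains[a]:                 # one branch per value of A, even if empty
        exs = [e for e in examples if e[a] == vk]
        rest = [x for x in attribs if x != a]
        tree["branches"][vk] = dec_tree_learn(exs, rest, examples, domains)
    return tree
```

Note that empty branches recurse with the parent's examples, so they fall into the `Plurality-Value(parent_examples)` case, exactly as in the pseudocode.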
Decision Tree Result

Examples at the root — +: X3,X6; −: X1,X2,X4,X5,X7,X8

Shape?
├─ circle   (+: none; −: X2,X7)      → No
├─ square   (+: X3,X6; −: X5)        → Color?
│   ├─ green  (+: X3,X6; −: none)    → Yes
│   ├─ red    (+: none; −: X5)       → No
│   ├─ blue   (no examples)          → Yes
│   └─ yellow (no examples)          → Yes
└─ triangle (+: none; −: X1,X4,X8)   → No
Perceptron Learning

function Perceptron-Learning(examples, network) returns a perceptron hypothesis
  inputs: examples, a set of examples with input x and output y
          network, a perceptron with weights W_j and activation function g
  repeat
    for each example (x, y) in examples do
      in ← Σ_j W_j x_j
      Err ← y − g(in)
      for each j in 0..n do
        W_j ← W_j + Err × g′(in) × x_j
  until some stopping criterion is satisfied
  return Neural-Net-Hypothesis(network)
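A minimal Python sketch of this update loop follows. It assumes a hard-threshold activation g, for which the g′(in) factor is dropped (the standard threshold-perceptron rule); the learning rate `alpha`, the fixed epoch count as the stopping criterion, and the bias handled as weight W_0 with x_0 = 1 are all assumptions for illustration.

```python
def perceptron_learning(examples, weights, epochs=100, alpha=0.1):
    """examples: list of (x, y) pairs with x a tuple of inputs, y in {0, 1}.
    weights: list [W_0, W_1, ..., W_n]; W_0 is the bias weight (x_0 = 1)."""
    def g(s):
        return 1 if s >= 0 else 0        # hard threshold activation

    for _ in range(epochs):              # stopping criterion: fixed epoch budget
        for x, y in examples:
            xs = [1.0] + list(x)         # prepend bias input x_0 = 1
            s = sum(w * xi for w, xi in zip(weights, xs))   # in = sum_j W_j x_j
            err = y - g(s)               # Err = y - g(in)
            for j in range(len(weights)):
                weights[j] += alpha * err * xs[j]           # W_j += alpha*Err*x_j
    return weights
```

For linearly separable data (e.g. boolean OR) the perceptron convergence theorem guarantees this loop reaches a separating weight vector.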
ALVINN

[Figure: ALVINN steering network — output units range from Sharp left through Straight ahead to Sharp right]
- 30 output units
- 4 hidden units
- Input: 30×32 pixels = 960 values (one input unit per input pixel)
SVM Kernels
• Non-linear separator in 2 dimensions…
• …mapped to 3 dimensions, where the data become linearly separable