Symbolic learning • Points • Definitions • Representation in logic • What is an arch? • Version spaces • Candidate elimination • Learning decision trees • Explanation-based learning
Definitions • Learning is a change that helps improve future performance. • (a paraphrase of Herbert Simon's definition) • Broad categories of machine learning methods and systems: • Symbolic (our focus in this brief presentation) • Statistical • Neural / Connectionist • Genetic / Evolutionary
Definitions (2) • Various characterizations of symbolic learning • Learning can be • supervised: pre-classified data, • unsupervised (conceptual clustering): raw data. • Data for supervised learning can be • many positive and negative examples, • a theory and one example. • The goal of learning can be • concept discovery (section 10.6), • generalization from examples (inductive learning), • a heuristic, a procedure, and so on.
Representation in logic • The AI techniques in symbolic ML: • search, • graph matching, • theorem proving. • Knowledge can be represented in logic: • a list of elementary properties and facts about things, • examples are conjunctive formulae with constants, • generalized concepts are formulae with variables.
Representation in logic (2) • Two instances of a concept: • size( i1, small ) ∧ colour( i1, red ) ∧ shape( i1, ball ) • size( i2, large ) ∧ colour( i2, red ) ∧ shape( i2, brick ) • A generalization of these instances: • size( X, Y ) ∧ colour( X, red ) ∧ shape( X, Z ) • Another representation: • obj( small, red, ball ) • obj( large, red, brick ) • A generalization of these instances: • obj( X, red, Z )
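To make the matching of a generalized formula against specific instances concrete, here is a small Python sketch; it is not from the slides, and the helper names (is_var, matches) are mine. Variables are written as capitalized names, as in the formulas above.

```python
# A minimal sketch (not from the slides) of the obj(...) representation and of
# checking whether a generalized description covers a ground instance.

def is_var(term):
    """A term is a variable if it is a capitalized name such as 'X' or 'Size'."""
    return isinstance(term, str) and term[:1].isupper()

def matches(general, instance):
    """True if the generalized tuple covers the ground instance tuple."""
    return all(is_var(g) or g == i for g, i in zip(general, instance))

i1 = ("small", "red", "ball")        # obj( small, red, ball )
i2 = ("large", "red", "brick")       # obj( large, red, brick )
generalization = ("X", "red", "Z")   # obj( X, red, Z )

print(matches(generalization, i1))   # True
print(matches(generalization, i2))   # True
```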
What is an arch? • Positive and negative examples • Learning the concept of an arch (section 10.1) • An arch (figure): • A hypothesis: an arch has three bricks as parts. • Another example, not an arch (figure): • Another hypothesis: two bricks support a third brick.
What is an arch? (2) • Another arch (figure): • Two bricks support a pyramid. We can generalize over both positive examples if we have a taxonomy of blocks: the supported object is a polygon. • Not an arch (figure): • We have to specialize the hypothesis: two bricks that do not touch support a polygon.
Version spaces • This is a method of learning a concept from positive and negative examples. • In the first stage, we have to select features that characterize the concepts, and their values. Our example: • size = {large, small} • colour = {red, white, blue} • shape = {ball, brick, cube} • We will represent a feature “bundle” like this: • obj( Size, Colour, Shape )
Version spaces (2) • There are 18 specific objects and 30 classes of objects (descriptions in which some constants are replaced by variables), at various levels of generality. They are all arranged into a graph called a version space. Here is part of this space for our example (figure):
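The counts can be checked with a short enumeration. The sketch below is illustrative only (the constant names are mine); it uses '?' as a variable slot and counts the ground objects and the generalized classes.

```python
# Confirms the counts stated above: 18 specific objects, 30 generalized classes.
from itertools import product

SIZES   = ["large", "small"]
COLOURS = ["red", "white", "blue"]
SHAPES  = ["ball", "brick", "cube"]
VAR = "?"   # stands for a variable such as X, Y, Z

specific = list(product(SIZES, COLOURS, SHAPES))
all_patterns = list(product(SIZES + [VAR], COLOURS + [VAR], SHAPES + [VAR]))
generalized = [p for p in all_patterns if VAR in p]

print(len(specific))     # 18
print(len(generalized))  # 30
```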
Version spaces (3) • Generalization and specialization in this very simple representation are also simple. • To generalize, replace a constant with a variable: • obj( small, red, ball ) → obj( small, X, ball ) • Further generalization requires introducing more (unique) variables: • obj( small, X, ball ) → obj( small, X, Y ) • To specialize, replace a variable with a constant (it must come from the set of allowed feature values): • obj( small, X, Y ) → obj( small, blue, Y ) • obj( small, blue, Y ) → obj( small, blue, cube )
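A sketch of these two operators in Python, assuming the obj(Size, Colour, Shape) representation with '?' in place of a named variable; the function names are mine.

```python
# Illustrative generalization/specialization operators; '?' marks a variable.
VAR = "?"

ALLOWED = {0: {"large", "small"},
           1: {"red", "white", "blue"},
           2: {"ball", "brick", "cube"}}

def generalize(hypothesis, slot):
    """Generalize by replacing the constant in the given slot with a variable."""
    h = list(hypothesis)
    h[slot] = VAR
    return tuple(h)

def specialize(hypothesis, slot, value):
    """Specialize by replacing a variable slot with an allowed constant."""
    assert hypothesis[slot] == VAR and value in ALLOWED[slot]
    h = list(hypothesis)
    h[slot] = value
    return tuple(h)

h = ("small", "red", "ball")
h = generalize(h, 1)            # obj( small, X, ball )
h = generalize(h, 2)            # obj( small, X, Y )
h = specialize(h, 1, "blue")    # obj( small, blue, Y )
h = specialize(h, 2, "cube")    # obj( small, blue, cube )
print(h)                        # ('small', 'blue', 'cube')
```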
Version spaces (4) • Other generalization operators (illustrated with a different representation) include the following. • Drop a conjunct: • size( i1, small ) ∧ colour( i1, red ) ∧ shape( i1, ball ) → colour( i1, red ) ∧ shape( i1, ball ) • Add a disjunct: • colour( i2, red ) ∧ shape( i2, ball ) → (colour( i2, red ) ∨ colour( i2, blue )) ∧ shape( i2, ball ) • Use a taxonomy (assuming that you have it!). • Suppose we have a hierarchy of colours where “red” is a subclass of “primaryColour”. We can generalize: • colour( i3, red ) → colour( i3, primaryColour )
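The three operators above can also be sketched in code. The encoding below is mine, not the slides': a literal is a (predicate, object, value) tuple, a concept is a list of conjuncts, and each conjunct is a frozenset of alternative literals, so a singleton is a plain literal and a larger set is a disjunction.

```python
# Rough sketches of the three generalization operators under my own encoding.

def drop_conjunct(concept, i):
    """Generalize by deleting the i-th conjunct."""
    return concept[:i] + concept[i + 1:]

def add_disjunct(concept, i, literal):
    """Generalize by adding an alternative literal to the i-th conjunct."""
    return concept[:i] + [concept[i] | {literal}] + concept[i + 1:]

def climb_taxonomy(literal, taxonomy):
    """Generalize a literal by replacing its value with its superclass."""
    pred, obj, value = literal
    return (pred, obj, taxonomy.get(value, value))

c1 = [frozenset({("size", "i1", "small")}),
      frozenset({("colour", "i1", "red")}),
      frozenset({("shape", "i1", "ball")})]
print(drop_conjunct(c1, 0))      # keeps colour(i1, red) and shape(i1, ball)

c2 = [frozenset({("colour", "i2", "red")}),
      frozenset({("shape", "i2", "ball")})]
print(add_disjunct(c2, 0, ("colour", "i2", "blue")))
                                 # colour conjunct becomes a red-or-blue disjunction

taxonomy = {"red": "primaryColour", "blue": "primaryColour"}
print(climb_taxonomy(("colour", "i3", "red"), taxonomy))
                                 # ('colour', 'i3', 'primaryColour')
```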
Version spaces (5) • The candidate elimination algorithm • There are three variants: general-to-specific, specific-to-general, and a combination of both directions. • All three methods work with sets of hypotheses (classes of concepts). • We consider, one by one, a series of examples, both positive and negative. • In the specific-to-general method, the set S holds the (evolving) candidate concepts and the set N stores the negative examples seen so far.
Version spaces (6) • Initialize the concept set S to the first positive example. • Initialize the concept set N to Ø. Then repeat: • For a positive example p • Replace every s ∈ S that does not match p with the minimal (most specific) generalization that matches p. • Remove any s ∈ S more general than another s’ ∈ S. • Remove any s ∈ S that matches some n ∈ N. • For a negative example n • Remove any s ∈ S that matches n. • Add n to N for future use.
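A minimal runnable Python sketch of this specific-to-general loop follows, assuming the obj(Size, Colour, Shape) representation with '?' standing in for a variable; the function names and the three training examples are mine, not the slides'.

```python
# Specific-to-general candidate elimination on feature tuples; '?' is a variable.
VAR = "?"

def matches(h, example):
    """True if hypothesis h covers the ground example."""
    return all(a == VAR or a == b for a, b in zip(h, example))

def more_general(h1, h2):
    """True if h1 is at least as general as h2."""
    return all(a == VAR or a == b for a, b in zip(h1, h2))

def min_generalization(s, p):
    """Most specific hypothesis covering both s and the positive example p."""
    return tuple(a if a == b else VAR for a, b in zip(s, p))

def specific_to_general(examples):
    """examples: list of (tuple, is_positive) pairs; the first must be positive."""
    S = {examples[0][0]}
    N = set()
    for ex, positive in examples[1:]:
        if positive:
            S = {s if matches(s, ex) else min_generalization(s, ex) for s in S}
            S = {s for s in S
                 if not any(s != t and more_general(s, t) for t in S)}
            S = {s for s in S if not any(matches(s, n) for n in N)}
        else:
            S = {s for s in S if not matches(s, ex)}
            N.add(ex)
    return S

examples = [(("small", "red", "ball"), True),
            (("small", "white", "ball"), True),
            (("large", "blue", "cube"), False)]
print(specific_to_general(examples))   # {('small', '?', 'ball')}
```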
Version spaces (7) • The concept of a ball • Example: obj(small, X, ball) minimally generalizes obj(small, white, ball) and obj(small, red, ball).
Version spaces (8) • Now, general-to-specific. • Initialize the concept set G to the most general concept. • Initialize the concept set P to Ø. Then repeat: • For a negative example n • Replace every g ∈ G that matches n with the minimal (most general) specialization that does not match n. • Remove any g ∈ G more specific than another g’ ∈ G. • Remove any g ∈ G that does not match some p ∈ P. • For a positive example p • Remove any g ∈ G that does not match p. • Add p to P for future use.
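And a matching sketch of the general-to-specific loop, again with '?' as the variable marker and with my own function names; the feature domains are the ones listed on the earlier slide. The negative example is chosen so that the minimal specializations include obj(large, Y, Z) and obj(X, Y, ball), as on the next slide.

```python
# General-to-specific candidate elimination on feature tuples; '?' is a variable.
VAR = "?"
DOMAINS = [("large", "small"),
           ("red", "white", "blue"),
           ("ball", "brick", "cube")]

def matches(h, example):
    return all(a == VAR or a == b for a, b in zip(h, example))

def more_general(h1, h2):
    return all(a == VAR or a == b for a, b in zip(h1, h2))

def min_specializations(g, n):
    """Most general specializations of g that exclude the negative example n."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, a in enumerate(g) if a == VAR
            for v in DOMAINS[i] if v != n[i]]

def general_to_specific(examples):
    G = {(VAR, VAR, VAR)}
    P = set()
    for ex, positive in examples:
        if positive:
            G = {g for g in G if matches(g, ex)}
            P.add(ex)
        else:
            G = {h for g in G for h in ([g] if not matches(g, ex)
                                        else min_specializations(g, ex))}
            G = {g for g in G
                 if not any(g != h and more_general(h, g) for h in G)}
            G = {g for g in G if all(matches(g, p) for p in P)}
    return G

examples = [(("small", "red", "ball"), True),
            (("small", "red", "brick"), False),
            (("large", "blue", "ball"), True)]
print(general_to_specific(examples))   # {('?', '?', 'ball')}, i.e. a ball
```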
Version spaces (9) • The concept of a ball • Example: obj(large, Y, Z), obj(X, Y, ball), ... minimally specialize obj(X, Y, Z).
Version spaces (10) • Now, both directions combined (full candidate elimination). • Initialize G to the most general concept, S to the first positive example. • For a positive example p • Remove any g ∈ G that does not match p. • Replace every s ∈ S that does not match p with the most specific generalization that matches p. • Remove any s ∈ S more general than another s’ ∈ S. • Remove any s ∈ S more general than some g ∈ G. • For a negative example n • Remove any s ∈ S that matches n. • Replace every g ∈ G that matches n with the most general specialization that does not match n. • Remove any g ∈ G more specific than another g’ ∈ G. • Remove any g ∈ G more specific than some s ∈ S. • If G = S = {c}, the learning of c succeeds. • If G = S = Ø, learning fails.
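Finally, a compact sketch of the combined, bidirectional algorithm. The names and training examples are mine, and the pruning of each boundary against the other is written in the usual version-space reading: keep s only if some g in G is at least as general as s, and keep g only if g is at least as general as some s in S. On the example below it converges to the concept of a red ball.

```python
# Combined (bidirectional) candidate elimination on feature tuples.
# '?' marks a variable slot.
VAR = "?"
DOMAINS = [("large", "small"),
           ("red", "white", "blue"),
           ("ball", "brick", "cube")]

def matches(h, e):
    """True if hypothesis h covers the ground example e."""
    return all(a == VAR or a == b for a, b in zip(h, e))

def more_general(h1, h2):
    """True if h1 is at least as general as h2."""
    return all(a == VAR or a == b for a, b in zip(h1, h2))

def min_generalization(s, p):
    """Most specific hypothesis covering both s and the positive example p."""
    return tuple(a if a == b else VAR for a, b in zip(s, p))

def min_specializations(g, n):
    """Most general specializations of g that exclude the negative example n."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, a in enumerate(g) if a == VAR
            for v in DOMAINS[i] if v != n[i]]

def candidate_elimination(examples):
    G = {(VAR, VAR, VAR)}
    S = {examples[0][0]}                 # the first example must be positive
    for e, positive in examples[1:]:
        if positive:
            G = {g for g in G if matches(g, e)}
            S = {s if matches(s, e) else min_generalization(s, e) for s in S}
            S = {s for s in S
                 if not any(s != t and more_general(s, t) for t in S)}
            S = {s for s in S if any(more_general(g, s) for g in G)}
        else:
            S = {s for s in S if not matches(s, e)}
            G = {h for g in G for h in ([g] if not matches(g, e)
                                        else min_specializations(g, e))}
            G = {g for g in G
                 if not any(g != h and more_general(h, g) for h in G)}
            G = {g for g in G if any(more_general(g, s) for s in S)}
    return S, G

# Learning "a red ball" from two positives and two near-miss negatives.
examples = [(("small", "red", "ball"), True),
            (("small", "blue", "ball"), False),
            (("large", "red", "ball"), True),
            (("large", "red", "cube"), False)]
S, G = candidate_elimination(examples)
if S == G and len(S) == 1:
    print("learned:", next(iter(S)))     # learned: ('?', 'red', 'ball')
```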
Version spaces (11) • The concept of a red ball (worked example; figure not shown).