For Friday • Read chapter 22 • Program 4 due
Program 4 • Any questions?
Learning mini-project • Worth 2 homeworks • Due Monday • Foil6 is available in /home/mecalif/public/itk340/foil • A manual and sample data files are there as well. • Create a data file that will allow FOIL to learn rules for a sister/2 relation from background relations of parent/2, male/1, and female/1. You can look in the prolog folder of my 327 folder for sample data if you like. (A small sketch of the expected positive tuples appears below.) • Electronically submit your data file, which should be named sister.d, and turn in a hard copy of the rules FOIL learns.
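Before writing sister.d, it can help to check which tuples the target relation should contain. The following Python sketch uses a made-up family (names and facts are illustrative only, not part of the assignment) to enumerate the positive sister/2 tuples implied by parent/2, male/1, and female/1 background facts.

    # Hypothetical background facts, for illustration only.
    parent = {("ann", "bob"), ("ann", "carol"), ("tom", "bob"), ("tom", "carol"),
              ("carol", "dave")}
    male = {"bob", "tom", "dave"}
    female = {"ann", "carol"}

    def sisters():
        """sister(X, Y): X is female, X != Y, and X and Y share a parent."""
        people = male | female
        parents = {p for p, _ in parent}
        return {(x, y) for x in female for y in people
                if x != y and any((p, x) in parent and (p, y) in parent
                                  for p in parents)}

    print(sorted(sisters()))   # [('carol', 'bob')] for this toy family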
Strategies for Learning a Single Rule • Top-Down (General to Specific): • Start with the most general (empty) rule. • Repeatedly add feature constraints that eliminate negatives while retaining positives. • Stop when only positives are covered. • Bottom-Up (Specific to General): • Start with a most specific rule (a complete description of a single instance). • Repeatedly eliminate feature constraints in order to cover more positive examples. • Stop when further generalization would cover negatives. (A sketch of the top-down loop follows below.)
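A minimal Python sketch of the top-down strategy, assuming propositional examples represented as feature dictionaries; the helper names and the simple greedy scoring are illustrative, not a specific system's algorithm.

    # Top-down (general-to-specific) learning of one rule, sketched for
    # propositional data. A rule is a list of (feature, value) constraints.
    def covers(rule, example):
        return all(example.get(f) == v for f, v in rule)

    def learn_one_rule(positives, negatives, candidate_constraints):
        rule = []                                  # most general rule: empty body
        neg = list(negatives)
        while neg:                                 # stop when no negatives covered
            # greedily pick the constraint that keeps the most positives
            # and the fewest negatives
            best = max(candidate_constraints,
                       key=lambda c: sum(covers(rule + [c], p) for p in positives)
                                     - sum(covers(rule + [c], n) for n in neg))
            new_neg = [n for n in neg if covers(rule + [best], n)]
            if len(new_neg) == len(neg):           # no constraint makes progress
                break
            rule.append(best)
            neg = new_neg
        return rule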
FOIL • Basic top-down sequential covering algorithm adapted for Prolog clauses. • Background provided extensionally. • Initialize the clause for target predicate P to P(X1,...,Xr) :- . • Possible specializations of a clause include adding all possible literals: • Qi(V1,...,Vr) • not(Qi(V1,...,Vr)) • Xi = Xj • not(Xi = Xj), where the X's are variables in the existing clause, at least one of V1,...,Vr is an existing variable, and the others can be new. • Allow recursive literals if they do not cause infinite regress. (A sketch of the covering loop follows below.)
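A compact Python sketch of FOIL's sequential-covering structure, under the assumption that literal generation, coverage testing, and gain scoring are supplied as helper functions (their names here are invented for illustration).

    # Sketch of FOIL's outer loop: learn clause bodies one at a time until
    # every positive tuple is covered; specialize each body until it covers
    # no negative tuples.
    def foil(positives, negatives, generate_literals, covers, gain):
        clauses, pos = [], set(positives)
        while pos:                                    # until all positives covered
            body = []                                 # start from P(X1,...,Xr) :- .
            pos_cov, neg_cov = set(pos), set(negatives)
            while neg_cov:                            # specialize: add best literal
                candidates = generate_literals(body)  # Qi(...), not(Qi(...)), Xi = Xj, ...
                lit = max(candidates,
                          key=lambda l: gain(body, l, pos_cov, neg_cov))
                body.append(lit)
                pos_cov = {e for e in pos_cov if covers(body, e)}
                neg_cov = {e for e in neg_cov if covers(body, e)}
            clauses.append(body)
            pos -= pos_cov                            # remove newly covered positives
        return clauses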
FOIL Input Data • Consider the example of finding a path in a directed acyclic graph. • Intended clauses: path(X,Y) :- edge(X,Y). path(X,Y) :- edge(X,Z), path(Z,Y). • Examples edge: { <1,2>, <1,3>, <3,6>, <4,2>, <4,6>, <6,5> } path: { <1,2>, <1,3>, <1,6>, <1,5>, <3,6>, <3,5>, <4,2>, <4,6>, <4,5>, <6,5> } • Negative examples of the target predicate can be provided directly, or produced indirectly using a closed world assumption: every pair <x,y> not among the positive tuples for path.
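The closed-world construction of negatives can be reproduced directly from the tuples above; the short Python check below is only an illustration (it also omits reflexive pairs <x,x>, matching the 20 negative tuples listed on the next slide).

    # Negative examples for path/2 under the closed world assumption.
    from itertools import product

    edge = {(1, 2), (1, 3), (3, 6), (4, 2), (4, 6), (6, 5)}
    path = {(1, 2), (1, 3), (1, 6), (1, 5), (3, 6), (3, 5),
            (4, 2), (4, 6), (4, 5), (6, 5)}

    nodes = {n for pair in edge for n in pair}
    # every distinct pair <x,y> not listed as a positive path tuple
    negatives = {(x, y) for x, y in product(nodes, repeat=2)
                 if x != y and (x, y) not in path}
    print(len(negatives))        # 20, matching the listing on the next slide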
Example Induction + : { <1,2>, <1,3>, <1,6>, <1,5>, <3,6>, <3,5>, <4,2>, <4,6>, <4,5>, <6,5> } - : { <1,4>, <2,1>, <2,3>, <2,4>, <2,5>, <2,6>, <3,1>, <3,2>, <3,4>, <4,1>, <4,3>, <5,1>, <5,2>, <5,3>, <5,4>, <5,6>, <6,1>, <6,2>, <6,3>, <6,4> } • Start with the empty rule: path(X,Y) :- . • Among others, consider adding the literal edge(X,Y) (also consider edge(Y,X), edge(X,Z), edge(Z,X), path(Y,X), path(X,Z), path(Z,X), X=Y, and negations). • 6 positive tuples and NO negative tuples covered. • Create the “base case” and remove the covered examples: path(X,Y) :- edge(X,Y).
+ : { <1,6>, <1,5>, <3,5>, <4,5> } - : { <1,4>, <2,1>, <2,3>, <2,4>, <2,5>, <2,6>, <3,1>, <3,2>, <3,4>, <4,1>, <4,3>, <5,1>, <5,2>, <5,3>, <5,4>, <5,6>, <6,1>, <6,2>, <6,3>, <6,4> } • Start with a new empty rule: path(X,Y) :- . • Consider the literal edge(X,Z) (among others...) • The 4 remaining positives satisfy it, but so do 10 of the 20 negatives. • Current rule: path(X,Y) :- edge(X,Z). • Consider the literal path(Z,Y) (as well as edge(X,Y), edge(Y,Z), edge(X,Z), path(Z,X), etc.) • No negatives covered; the clause is complete: path(X,Y) :- edge(X,Z), path(Z,Y). • The new clause actually covers all remaining positive tuples of path, so the definition is complete. (A quick check of the learned definition appears below.)
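As a quick illustration (not part of the original example), the learned definition can be checked against the data by computing its consequences in Python:

    # Check the learned clauses against the example tuples:
    #   path(X,Y) :- edge(X,Y).
    #   path(X,Y) :- edge(X,Z), path(Z,Y).
    edge = {(1, 2), (1, 3), (3, 6), (4, 2), (4, 6), (6, 5)}
    path = {(1, 2), (1, 3), (1, 6), (1, 5), (3, 6), (3, 5),
            (4, 2), (4, 6), (4, 5), (6, 5)}

    derived = set(edge)                        # base clause
    changed = True
    while changed:                             # apply the recursive clause to a fixpoint
        new = {(x, y) for (x, z) in edge for (z2, y) in derived if z == z2}
        changed = not new <= derived
        derived |= new

    assert derived == path                     # all positives covered, no negatives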
Picking the Best Literal • Based on information gain (similar to ID3): gain(L) = p * (log2(p / (p + n)) - log2(P / (P + N))), where P is the number of positives before adding literal L, N is the number of negatives before adding literal L, p is the number of positives after adding literal L, and n is the number of negatives after adding literal L. • Given n predicates of arity m there are O(n·2^m) possible literals to choose from, so the branching factor can be quite large. (A small worked computation appears below.)
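A direct transcription of the gain formula into Python, applied to the numbers from the worked example (10 positives and 20 negatives before adding edge(X,Y) to the empty path clause, 6 positives and 0 negatives after):

    from math import log2

    def foil_gain(P, N, p, n):
        """Information gain for adding literal L: P, N counted before; p, n after."""
        if p == 0:
            return float("-inf")       # literal would eliminate every remaining positive
        return p * (log2(p / (p + n)) - log2(P / (P + N)))

    print(foil_gain(10, 20, 6, 0))     # ~9.51 bits, favoring edge(X,Y) as the first literal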
Other Approaches • Golem • CHILL • Foidl • Bufoidl
Domains • Any kind of concept learning where background knowledge is useful. • Natural Language Processing • Planning • Chemistry and biology • DNA • Protein structure
Natural Language Processing • What’s the goal?
Communication • Communication for the speaker: • Intention: Deciding why, when, and what information should be transmitted. May require planning and reasoning about agents' goals and beliefs. • Generation: Translating the information to be communicated into a string of words. • Synthesis: Output of the string in the desired modality, e.g. text on a screen or speech.
Communication (cont.) • Communication for the hearer: • Perception: Mapping the input modality to a string of words, e.g. optical character recognition or speech recognition. • Analysis: Determining the information content of the string. • Syntactic interpretation (parsing): Find the correct parse tree showing the phrase structure. • Semantic interpretation: Extract the (literal) meaning of the string in some representation, e.g. FOPC. • Pragmatic interpretation: Consider the effect of the overall context on the meaning of the sentence. • Incorporation: Decide whether or not to believe the content of the string and add it to the KB.
Ambiguity • Natural language sentences are highly ambiguous and must be disambiguated. I saw the man on the hill with the telescope. I saw the Grand Canyon flying to LA. I saw a jet flying to LA. Time flies like an arrow. Horse flies like a sugar cube. Time runners like a coach. Time cars like a Porsche.
Syntax • Syntax concerns the proper ordering of words and its effect on meaning. The dog bit the boy. The boy bit the dog. * Bit boy the dog the Colorless green ideas sleep furiously.
Semantics • Semantics concerns the meaning of words, phrases, and sentences. Generally restricted to “literal meaning”. • “plant” as a photosynthetic organism • “plant” as a manufacturing facility • “plant” as the act of sowing
Pragmatics • Pragmatics concerns the overall communicative and social context and its effect on interpretation. • Can you pass the salt? • Passerby: Does your dog bite? Clouseau: No. Passerby: (pets dog) Chomp! I thought you said your dog didn't bite!! Clouseau: That, sir, is not my dog!
Modular Processing • Pipeline: sound waves → speech recognition (acoustic/phonetic) → words → parsing (syntax) → parse trees → semantics → literal meaning → pragmatics → meaning
Examples • Phonetics “grey twine” vs. “great wine” “youth in Asia” vs. “euthanasia” “yawanna” → “do you want to” • Syntax I ate spaghetti with a fork. I ate spaghetti with meatballs.
More Examples • Semantics I put the plant in the window. Ford put the plant in Mexico. The dog is in the pen. The ink is in the pen. • Pragmatics The ham sandwich wants another beer. John thinks vanilla.