
Learning Markov Logic Networks Using Structural Motifs



Presentation Transcript


  1. Learning Markov Logic Networks Using Structural Motifs Stanley Kok Dept. of Computer Science and Eng. University of Washington Seattle, USA Joint work with Pedro Domingos

  2. Outline • Background • Learning Using Structural Motifs • Experiments • Future Work

  3. Markov Logic Networks [Richardson & Domingos, MLJ’06] • A logical KB is a set of hard constraints on the set of possible worlds • Let’s make them soft constraints: when a world violates a formula, it becomes less probable, not impossible • Give each formula a weight (higher weight → stronger constraint), e.g. 2.7 Teaches(p,c) ⇒ Professor(p)

  4. Markov Logic • A Markov logic network (MLN) is a set of pairs (F, w): F is a formula in first-order logic, w is a real number • Probability of a world x (a truth assignment to ground atoms): P(X = x) = (1/Z) exp(Σᵢ wᵢ nᵢ(x)), where nᵢ(x) is the number of true groundings of the i-th formula, wᵢ is the weight of the i-th formula, and Z is the partition function
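Written out in full, the distribution this slide describes (the standard MLN form, consistent with Richardson & Domingos) is:

```latex
P(X = x) \;=\; \frac{1}{Z} \exp\Big( \sum_i w_i\, n_i(x) \Big),
\qquad
Z \;=\; \sum_{x'} \exp\Big( \sum_i w_i\, n_i(x') \Big)
```

where $n_i(x)$ counts the true groundings of formula $F_i$ in world $x$, $w_i$ is that formula's weight, and $Z$ normalizes over all possible worlds $x'$.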

  5. MLN Structure Learning • Input: relational data • Output: MLN, e.g. 2.7 Teaches(p,c) ∧ TAs(s,c) ⇒ Advises(p,s); 1.4 Advises(p,s) ∧ Teaches(p,c) ⇒ TAs(s,c); 1.1 ¬TAs(s,c) ∨ ¬Advises(s,p); … [Figure: example Advises, TAs, and Teaches tables over Pete, Sam, Paul, Sara, Saul and courses CS1, CS2]

  6. Previous Systems • MSL [Kok & Domingos, ICML’05] • BUSL [Mihalkova & Mooney, ICML’07] • Generate-and-test or greedy search • Computationally expensive; large search space • Susceptible to local maxima

  7. LHL System [Kok & Domingos, ICML’09] • ‘Lifts’ the ground network: constants are clustered into high-level concepts (Pete, Paul, … → Professor; CS1–CS8 → Course; Sam, Sara, … → Student) • Traces paths in the lifted graph and converts the paths to first-order clauses [Figure: ground network over Teaches, Advises, and TAs, and its lifted version]

  8. Outline • Background • Learning Using Structural Motifs • Experiments • Future Work

  9. Learning Using Structural Motifs (LSM) • First MLN structure learner that can learn long clauses • Captures more complex dependencies • Explores a larger space of clauses

  10. LHL Recap [Figure: a large ground network of Professor, Student, and Course constants connected by Teaches, Advises, and TAs edges]

  11. Repeated Patterns [Figure: the same ground network, with the repeated Professor–Student–Course pattern highlighted]

  12. Repeated Patterns [Figure: the repeated pattern abstracted into a lifted template: Prof, Student, and Course connected by Advises, TAs, and Teaches]

  13. Learning Using Structural Motifs (LSM) • Clusters nodes into high-level concepts, using symmetrical paths & nodes • Groups literals into structural motifs: finds literals that are densely connected, using random walks & hitting times • A structural motif is a set of literals, e.g. { Teaches(p,c), TAs(s,c), Advises(p,s) } → a set of clauses, e.g. ¬Teaches(p,c) ∨ TAs(s,c) ∨ Advises(p,s); Teaches(p,c) ∨ ¬TAs(s,c); TAs(s,c); … = a subspace of clauses

  14. LSM’s Three Components • Identify Motifs → Find Paths → Create MLN • Input: relational DB [Figure: example Advises, TAs, and Teaches tables] • Output: MLN, e.g. 2.7 Teaches(p,c) ∧ TAs(s,c) ⇒ Advises(p,s); 1.4 Advises(p,s) ∧ Teaches(p,c) ⇒ TAs(s,c); -1.1 TAs(s,c) ⇒ Advises(s,p); …

  15. Random Walk • Begin at node A • Randomly pick a neighbor n [Figure: small graph with nodes A–F]

  16. Random Walk • Begin at node A • Randomly pick a neighbor n • Move to node n

  17. Random Walk • Begin at node A • Randomly pick a neighbor n • Move to node n • Repeat
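The walk on slides 15–17 can be sketched in a few lines of Python (the adjacency list below is a hypothetical stand-in for the A–F graph in the figures):

```python
import random

def random_walk(graph, start, num_steps):
    """Start at `start`; at each step pick a uniformly random neighbor
    and move to it. Returns the sequence of visited nodes."""
    path = [start]
    node = start
    for _ in range(num_steps):
        node = random.choice(graph[node])  # randomly pick a neighbor n
        path.append(node)                  # move to node n, then repeat
    return path

# Hypothetical adjacency list standing in for the slides' A-F example.
graph = {
    "A": ["B", "C", "D"],
    "B": ["A", "E"],
    "C": ["A", "F"],
    "D": ["A", "E", "F"],
    "E": ["B", "D"],
    "F": ["C", "D"],
}
walk = random_walk(graph, "A", 5)
```

Each call returns a different 6-node path starting at A, with every consecutive pair of nodes joined by an edge.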

  18. Truncated Hitting Time [Sarkar & Moore, UAI’07] • Hitting time from node i to node j: the expected number of steps, starting from node i, before node j is visited for the first time • Smaller hitting time → closer to start node i • Truncated: random walks are limited to T steps • Can be computed efficiently and with high probability by sampling random walks [Sarkar, Moore & Prakash, ICML’08]

  19–24. Finding Truncated Hitting Time by Sampling [Figures: a single random walk from A with T = 5, traced step by step: A → D → E → D → F → E]

  25. Finding Truncated Hitting Time by Sampling • Averaging over the sampled walks (T = 5) gives estimates from A: hAA = 0, hAD = 1, hAE = 2, hAF = 4, hAB = 5, hAC = 5
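The estimator these slides step through can be sketched as: sample many T-step walks from the start node, record the first step at which each node is hit, and count T for nodes a walk never reaches. A minimal sketch (the A–F graph is again a hypothetical stand-in for the one in the figures):

```python
import random

def truncated_hitting_times(graph, start, T, num_walks=20000, seed=0):
    """Estimate truncated hitting times h(start, j) for every node j by
    sampling num_walks random walks of at most T steps each."""
    rng = random.Random(seed)
    totals = dict.fromkeys(graph, 0.0)
    for _ in range(num_walks):
        first_visit = {start: 0}  # the start node is hit at step 0
        node = start
        for step in range(1, T + 1):
            node = rng.choice(graph[node])
            first_visit.setdefault(node, step)  # record first hit only
        for j in graph:
            totals[j] += first_visit.get(j, T)  # unvisited nodes count T
    return {j: totals[j] / num_walks for j in graph}

# Hypothetical A-F graph.
graph = {
    "A": ["B", "C", "D"],
    "B": ["A", "E"],
    "C": ["A", "F"],
    "D": ["A", "E", "F"],
    "E": ["B", "D"],
    "F": ["C", "D"],
}
h = truncated_hitting_times(graph, "A", T=5)  # h["A"] == 0.0
```

Direct neighbors of A come out with small estimates, and nodes a walk rarely reaches within T steps approach the truncation limit T.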

  26. Symmetrical Paths [Figure: two departments, Physics and History, with Advises, TAs, and Teaches edges] • The paths P1→S2 and P1→S3 are symmetrical: (P1, Advises, S2) and (P1, Advises, S3) both relabel to the same pattern (0, Advises, 1)

  27. Symmetrical Paths • Longer paths can be symmetrical too: (P1, Advises, S1, TAs, C1, TAs, S2) and (P1, Advises, S4, TAs, C1, TAs, S3) both relabel to (0, Advises, 1, TAs, 2, TAs, 3)

  28. Symmetrical Nodes • Symmetrical nodes have identical truncated hitting times • Symmetrical nodes have identical path distributions in a sample of random walks [Figure: the Physics/History example, with the symmetrical paths from the previous slide]

  29. Learning Using Structural Motifs [Figure: the three-component diagram from slide 14, highlighting the Identify Motifs step]

  30. Sample Random Walks [Figure: random walks sampled from P1; each relabeled path, e.g. (0, Advises, 1, TAs, 2), is stored with its count]

  31. Estimate Truncated Hitting Times [Figure: each node labeled with its estimated hitting time from P1: P1 = 0; S1–S4 = 3.55; S5–S8 = 3.52; C1 = 3.2; C2 = 3.21; nodes in the other department ≈ 4]

  32. Prune ‘Faraway’ Nodes [Figure: nodes whose hitting times are close to the truncation limit (≈ 4) are pruned, leaving only P1’s department]

  33. Group Nodes with Similar Hitting Times • Groups are candidate symmetrical nodes [Figure: remaining nodes grouped by hitting time: {P1} (0), {S1–S4} (3.55), {S5–S8} (3.52), {C1} (3.2), {C2} (3.21)]

  34. Cluster Nodes • Cluster nodes with similar path distributions [Figure: e.g. a node whose sampled paths are (0, Advises, 1) with probability 0.5, (0, Advises, 2, …, 1) with probability 0.1, …]
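Slide 34's clustering step can be sketched as grouping nodes whose empirical path distributions are close, here with a simple greedy pass and an L1-distance threshold (the threshold and the example distributions are hypothetical; LSM's actual similarity test is more involved):

```python
def cluster_by_path_distribution(path_dists, threshold=0.2):
    """Greedy single-pass clustering: a node joins the first cluster
    whose representative distribution is within `threshold` in L1
    distance; otherwise it starts a new cluster."""
    def l1(p, q):
        keys = set(p) | set(q)
        return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

    clusters = []  # list of (representative distribution, member list)
    for node, dist in path_dists.items():
        for rep, members in clusters:
            if l1(rep, dist) <= threshold:
                members.append(node)
                break
        else:
            clusters.append((dist, [node]))
    return [members for _, members in clusters]

# Hypothetical path distributions: S1 and S2 look alike, C1 differs.
path_dists = {
    "S1": {("Advises",): 0.5, ("TAs",): 0.5},
    "S2": {("Advises",): 0.55, ("TAs",): 0.45},
    "C1": {("TAs",): 0.7, ("Teaches",): 0.3},
}
clusters = cluster_by_path_distribution(path_dists)
```

Here S1 and S2 fall into one cluster (L1 distance 0.1) while C1 starts its own, mirroring how student nodes with near-identical walk behavior get merged into a single high-level concept.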

  35. Create ‘Lifted’ Graph [Figure: clusters become lifted nodes Professor {P1}, Student {S1–S8}, Course {C1, C2}, connected by Advises, TAs, and Teaches edges]

  36. Extract Motif with DFS [Figure: a depth-first search over the lifted graph visits Professor, Student, and Course]

  37. Create Motif • { Teaches(P1,C1), TAs(S1,C1), Advises(P1,S1) } is a true grounding of the motif { Teaches(p,c), TAs(s,c), Advises(p,s) } [Figure: the triangle P1, S1, C1 with its three edges]

  38. Restart from Next Node [Figure: the random-walk process restarts from the next node, C2]

  39. Restart from Next Node • A different motif may be found over the same set of constants [Figure: from C2’s viewpoint the nodes cluster differently: Course1 {C1}, Course2 {C2}, Student1 {S1–S4}, Student {S5–S8}, Professor {P1}]

  40. Select Motifs • Choose motifs with large numbers of true groundings, e.g. { Teaches(p,c), TAs(s,c), Advises(p,s) }: est. 100 true groundings; { Teaches(p,c), …}: est. 20; …

  41. LSM • Pass the selected motifs to FindPaths & CreateMLN • Input: relational DB [Figure: example Advises, TAs, and Teaches tables] • Output: MLN, e.g. 2.7 Teaches(p,c) ∧ TAs(s,c) ⇒ Advises(p,s); 1.4 Advises(p,s) ⇒ Teaches(p,c) ∧ TAs(s,c); -1.1 TAs(s,c) ⇒ Advises(s,p); …

  42. FindPaths • Given the motif { Teaches(p,c), TAs(s,c), Advises(p,s) }, paths found: Advises(p,s); Advises(p,s), Teaches(p,c); Advises(p,s), Teaches(p,c), TAs(s,c)

  43. Clause Creation • Each path is a conjunction, e.g. Advises(p,s) ∧ Teaches(p,c) ∧ TAs(s,c) • Convert it to clauses by negating literals in every combination: ¬Advises(p,s) ∨ ¬Teaches(p,c) ∨ ¬TAs(s,c); Advises(p,s) ∨ ¬Teaches(p,c) ∨ ¬TAs(s,c); Advises(p,s) ∨ Teaches(p,c) ∨ ¬TAs(s,c); …
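The sign-flipping on this slide can be sketched with `itertools.product`, treating literals as plain strings and writing `!` for negation (a sketch only; LSM restricts and scores these candidates rather than keeping all of them):

```python
from itertools import product

def clauses_from_conjunction(literals):
    """Turn a path's conjunction into candidate clauses by negating each
    literal in every combination (2^n candidates for n literals)."""
    clauses = []
    for signs in product([False, True], repeat=len(literals)):
        clause = [lit if keep else "!" + lit
                  for lit, keep in zip(literals, signs)]
        clauses.append(" v ".join(clause))
    return clauses

path = ["Advises(p,s)", "Teaches(p,c)", "TAs(s,c)"]
candidates = clauses_from_conjunction(path)
# 2^3 = 8 candidates, from the all-negated clause to the all-positive one.
```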

  44. Clause Pruning • Score each clause:
  -1.15 ¬Advises(p,s) ∨ ¬Teaches(p,c) ∨ TAs(s,c)
  -1.17 Advises(p,s) ∨ ¬Teaches(p,c) ∨ TAs(s,c)
  …
  -2.21 ¬Advises(p,s) ∨ ¬Teaches(p,c)
  -2.23 ¬Advises(p,s) ∨ TAs(s,c)
  -2.03 ¬Teaches(p,c) ∨ TAs(s,c)
  …
  -3.13 ¬Advises(p,s)
  -2.93 ¬Teaches(p,c)
  -3.93 TAs(s,c)

  45. Clause Pruning • Compare each clause against its sub-clauses (taken individually):
  -1.15 ¬Advises(p,s) ∨ ¬Teaches(p,c) ∨ TAs(s,c)
  -1.17 Advises(p,s) ∨ ¬Teaches(p,c) ∨ TAs(s,c)
  …
  -2.21 ¬Advises(p,s) ∨ ¬Teaches(p,c)
  -2.23 ¬Advises(p,s) ∨ TAs(s,c)
  -2.03 ¬Teaches(p,c) ∨ TAs(s,c)
  …
  -3.13 ¬Advises(p,s)
  -2.93 ¬Teaches(p,c)
  -3.93 TAs(s,c)
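The comparison on this slide can be sketched as: keep a clause only if its score beats the score of each of its sub-clauses taken individually. Clauses are modeled here as frozensets of literal strings; the scores reuse the slide's numbers except for the last entry, a hypothetical low-scoring clause added to show one actually being pruned:

```python
def prune_clauses(scores):
    """Keep a clause only if its score exceeds that of every proper
    sub-clause present in `scores` (each sub-clause taken individually)."""
    kept = []
    for clause, score in scores.items():
        sub_clauses = [s for s in scores if s < clause]  # proper subsets
        if all(score > scores[s] for s in sub_clauses):
            kept.append(clause)
    return kept

scores = {
    frozenset({"!Advises(p,s)", "!Teaches(p,c)", "TAs(s,c)"}): -1.15,
    frozenset({"!Advises(p,s)", "!Teaches(p,c)"}): -2.21,
    frozenset({"!Advises(p,s)"}): -3.13,
    frozenset({"TAs(s,c)"}): -3.93,
    # Hypothetical low-scoring clause: pruned, because its sub-clause
    # TAs(s,c) alone scores better (-3.93 > -4.10).
    frozenset({"Advises(p,s)", "TAs(s,c)"}): -4.10,
}
kept = prune_clauses(scores)
```

Note that frozenset's `<` operator tests for a proper subset, which is exactly the sub-clause relation over sets of literals.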

  46. MLN Creation • Add all clauses to an empty MLN • Train the weights of the clauses • Remove clauses whose absolute weights fall below a threshold
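The final step on slide 46 can be sketched as a simple filter (the clause strings, weights, and threshold value below are hypothetical illustrations, not learned values):

```python
def prune_by_weight(weighted_clauses, threshold=0.5):
    """Drop clauses whose learned weight is below the threshold in
    absolute value."""
    return {clause: w for clause, w in weighted_clauses.items()
            if abs(w) >= threshold}

mln = {
    "Teaches(p,c) ^ TAs(s,c) => Advises(p,s)": 2.7,
    "TAs(s,c) => Advises(s,p)": -1.1,
    "Advises(p,s) => TAs(s,c)": 0.05,  # hypothetical near-zero weight
}
pruned = prune_by_weight(mln)  # the 0.05 clause is removed
```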

  47. Outline • Background • Learning Using Structural Motifs • Experiments • Future Work

  48. Datasets • IMDB: created from the IMDB.com DB; movies, actors, etc., and their relationships; 17,793 ground atoms, 1,224 true • UW-CSE: describes an academic department; students, faculty, etc., and their relationships; 260,254 ground atoms, 2,112 true

  49. Datasets • Cora: citations to computer science papers; papers, authors, titles, etc., and their relationships; 687,422 ground atoms, 42,558 true

  50. Methodology • Five-fold cross-validation • Inferred the probability of being true for groundings of each predicate, with groundings of all other predicates as evidence • For Cora, also inferred four predicates jointly: SameCitation, SameTitle, SameAut, SameVenue • MCMC to evaluate test atoms: 10^6 samples or 24 hrs • Evaluation measures: CLL, AUC
