
First-Order Probabilistic Inference

This presentation by Rodrigo de Salvo Braz of SRI International covers the goal of first-order probabilistic inference, propositionalization, lifted inference, inversion elimination, counting elimination, BLOG and DBLOG, retro-instantiation, and improving BLOG inference, closing with future directions and conclusions.


Presentation Transcript


  1. First-Order Probabilistic Inference Rodrigo de Salvo Braz, SRI International

  2. A remark Feel free to ask clarification questions!

  3. Slides online • Just search for "Rodrigo de Salvo Braz" and check the presentations page. (www.cs.berkeley.edu/~braz)

  4. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  5. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  6. Abstracting Inference and Learning AI problems (commonsense reasoning, natural language processing, planning, vision, ...) each pair their own knowledge (domain knowledge, language knowledge, domain and planning knowledge, objects and optics knowledge) with the same underlying Inference and Learning module.

  7. Inference and Learning What capabilities should it have? It should represent and use: • predicates (relations) and quantification over objects: ∀X male(X) ∨ female(X); ∀X male(X) ⇒ sick(X); sick(p121) • varying degrees of uncertainty: ∀X { male(X) ⇒ sick(X) }, since it may hold often but not absolutely; essential for language, vision, pattern recognition, commonsense reasoning etc. • modal knowledge, utilities, and more.

  8. Abstracting Inference and Learning Commonsense reasoning and Natural Language Processing sharing the inference and learning module. • Domain knowledge: male(X) ∨ female(X); { male(X) ⇒ sick(X) }; sick(o121); ... • Language knowledge: { sentence(S) ∧ verb(S,"had") ∧ object(S,"fever") ∧ subject(S,X) ⇒ fever(X) }; ...

  9. Abstracting Inference and Learning Commonsense reasoning WITH Natural Language Processing. • Domain knowledge: male(X) ∨ female(X); { male(X) ⇒ sick(X) }; sick(o121); ... • Language knowledge: { sentence(S) ∧ verb(S,"had") ∧ object(S,"fever") ∧ subject(S,X) ⇒ fever(X) }; verb(s13,"had"), object(s13,"fever"), subject(s13,o121), ... Joint solution made simpler.

  10. Standard solutions fall short • Logic: has objects and properties, but statements are absolute. • Graphical models and machine learning: have varying degrees of uncertainty, but are propositional; treating objects and quantification is awkward.

  11. Standard solutions fall short Graphical models and objects. Rules: male(X) ∨ female(X); { male(X) ⇒ sick(X) }; { friend(X,Y) ∧ male(X) ⇒ male(Y) }; evidence sick(o121), sick(o135). Grounded nodes: male_o121, female_o121, sick_o121, male_o135, female_o135, sick_o135, friend_o121_o135. • The transformation (propositionalization) is not part of the graphical model framework; • graphical models have only random variables, not objects; • different data creates distinct, formally unrelated models.
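To make the propositionalization step concrete, here is a minimal sketch (illustrative only; `ground_atoms` and the object names are mine, not the talk's) of how first-order predicates expand into the propositional variables of a graphical model:

```python
from itertools import product

def ground_atoms(predicate, arity, objects):
    """One propositional random variable per tuple of objects."""
    return [f"{predicate}_{'_'.join(args)}"
            for args in product(objects, repeat=arity)]

objects = ["o121", "o135"]
variables = (ground_atoms("male", 1, objects)
             + ground_atoms("female", 1, objects)
             + ground_atoms("sick", 1, objects)
             + ground_atoms("friend", 2, objects))
print(variables)
# ['male_o121', 'male_o135', ..., 'friend_o135_o135']
# The slide's drawback is visible here: a different object set yields a
# distinct, formally unrelated set of variables, hence a distinct model.
```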

  12. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  13. First-Order Probabilistic Inference • Many languages have been proposed • Knowledge-based model construction (Breese, 1992) • Probabilistic Logic Programming (Ng & Subrahmanian, 1992) • Probabilistic Logic Programming (Ngo and Haddawy, 1995) • Probabilistic Relational Models (Friedman et al., 1999) • Relational Dependency Networks (Neville & Jensen, 2004) • Bayesian Logic (BLOG) (Milch & Russell, 2004) • Bayesian Logic Programs (Kersting & DeRaedt, 2001) • DAPER (Heckerman, Meek & Koller, 2007) • Markov Logic Networks (Richardson & Domingos, 2004) • “Smart” propositionalization is the main method.

  14. Propositionalization • Bayesian Logic Programs: Prolog-like statements such as medicine(Hospital) | in(Patient,Hospital), sick(Patient). sick(Patient) | in(Patient,Hospital), exposed(Hospital). associated with CPTs and combination rules ("|" instead of ":-").

  15. Propositionalization medicine(Hospital) | in(Patient,Hospital), sick(Patient). (plus CPT) BN grounded from an and-or tree: in(john,hosp13) and sick(john), in(peter,hosp13) and sick(peter), ... all feeding medicine(hosp13).
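The and-or tree bottoms out in a combination rule that merges the contributions of all ground clause bodies into one CPT for medicine(hosp13). A common choice, assumed here for illustration (the talk does not fix one), is noisy-or:

```python
def noisy_or(body_states, p=0.9):
    """P(head = true | ground bodies) under a noisy-or combination rule.

    body_states has one Boolean per ground instance of the clause body,
    e.g. in(john,hosp13) & sick(john), in(peter,hosp13) & sick(peter), ...
    Each true body independently "causes" the head with probability p.
    """
    prob_all_fail = 1.0
    for body_true in body_states:
        if body_true:
            prob_all_fail *= (1.0 - p)
    return 1.0 - prob_all_fail

# john is in hosp13 and sick; peter is in hosp13 but healthy:
print(noisy_or([True, False]))  # 0.9
```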

  16. Propositionalization Multi-Entity Bayesian Networks (Laskey, 2004) First-order fragments: in(Patient,Hospital) and exposed(Hospital) determine sick(Patient); in(Patient,Hospital) and sick(Patient) determine medicine(Hospital).

  17. Propositionalization Multi-Entity Bayesian Networks (Laskey, 2004) First-order fragments instantiated: in(john,hosp13) and sick(john) feed medicine(hosp13); in(peter,hosp13) and sick(peter) feed medicine(hosp13); ...

  18. Propositionalization Multi-Entity Bayesian Networks (Laskey, 2004) Instantiated fragments merged into one grounded network: in(john,hosp13), sick(john), in(peter,hosp13), sick(peter), ... all parents of the single node medicine(hosp13).

  19. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  20. Lifted Inference medicine(Hospital) ⇐ in(Patient,Hospital) ∧ sick(Patient) { sick(Patient) ⇐ in(Patient,Hospital) ∧ exposed(Hospital) } • Faster • More compact • More intuitive • Higher level: more information/structure available for optimization. Patient remains an unbound, general variable: in(Patient, hosp13) and sick(Patient) relate to medicine(hosp13) and exposed(hosp13) without grounding.

  21. Bayesian Networks (directed) Nodes: epidemic with P(epidemic); sick_john, sick_mary, sick_bob, each with P(sick_x | epidemic). P(sick_john, sick_mary, sick_bob, epidemic) = P(sick_john | epidemic) * P(sick_mary | epidemic) * P(sick_bob | epidemic) * P(epidemic)

  22. Factor Networks (undirected) Nodes epidemic_france, epidemic_belgium, epidemic_uk, epidemic_germany connected by factors (potential functions) φ1..φ4. P(epi_france, epi_belgium, epi_uk, epi_germany) ∝ φ1(epi_france, epi_belgium) * φ2(epi_france, epi_uk) * φ3(epi_france, epi_germany) * φ4(epi_belgium, epi_germany)
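As a minimal numeric sketch (potential tables are made up; variable names follow the slide), the unnormalized product of potentials and its normalization look like this:

```python
from itertools import product

# One potential per edge of the slide's network, indexed by the two
# countries it connects; table entries are illustrative.
phi = {
    ("france", "belgium"): {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 5.0},
    ("france", "uk"):      {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 4.0},
    ("france", "germany"): {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0},
    ("belgium", "germany"): {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0},
}
countries = ["france", "belgium", "uk", "germany"]

def unnormalized(assign):
    """Product of all potentials for one joint assignment."""
    result = 1.0
    for (a, b), table in phi.items():
        result *= table[(assign[a], assign[b])]
    return result

# The proportionality constant: sum over all 2^4 joint assignments.
Z = sum(unnormalized(dict(zip(countries, values)))
        for values in product([0, 1], repeat=4))
example = {"france": 1, "belgium": 1, "uk": 0, "germany": 0}
print(unnormalized(example) / Z)  # P of that joint assignment
```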

  23. Bayesian Nets as Factor Networks Each CPT becomes a factor: P(sick_john, sick_mary, sick_bob, epidemic) ∝ P(sick_john | epidemic) * P(sick_mary | epidemic) * P(sick_bob | epidemic) * P(epidemic)

  24. Inference: Marginalization P(sick_john) ∝ Σ_epidemic Σ_sick_mary Σ_sick_bob P(sick_john | epidemic) * P(sick_mary | epidemic) * P(sick_bob | epidemic) * P(epidemic)

  25. Inference: Variable Elimination (VE) Push the sums inward: P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * [Σ_sick_mary P(sick_mary | epidemic)] * [Σ_sick_bob P(sick_bob | epidemic)]

  26. Inference: Variable Elimination (VE) Summing out sick_bob yields φ1(epidemic): P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * [Σ_sick_mary P(sick_mary | epidemic)] * φ1(epidemic)

  27. Inference: Variable Elimination (VE) P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * φ1(epidemic) * [Σ_sick_mary P(sick_mary | epidemic)]

  28. Inference: Variable Elimination (VE) Summing out sick_mary yields φ2(epidemic): P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * φ1(epidemic) * φ2(epidemic)

  29. Inference: Variable Elimination (VE) Eliminating epidemic leaves a single factor over sick_john: P(sick_john) ∝ φ3(sick_john)
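The elimination sequence of slides 25-29 can be replayed with a small sum-product sketch (CPT numbers are made up; this is a generic VE routine, not the talk's implementation):

```python
p_epi = {(0,): 0.9, (1,): 0.1}                 # P(epidemic)

def p_sick_given_epi(p_true):
    """CPT over (epidemic, sick_x), with P(sick | epidemic=1) = p_true."""
    return {(0, 0): 0.95, (0, 1): 0.05,
            (1, 0): 1 - p_true, (1, 1): p_true}

def sum_out_sick(cpt):
    """Eliminate the sick variable, leaving a factor phi(epidemic)."""
    out = {}
    for (e, _sick), value in cpt.items():
        out[(e,)] = out.get((e,), 0.0) + value
    return out

# Slides 26-28: eliminate sick_bob and sick_mary, yielding phi1, phi2.
phi1 = sum_out_sick(p_sick_given_epi(0.6))     # from sick_bob
phi2 = sum_out_sick(p_sick_given_epi(0.7))     # from sick_mary

# Slide 29: eliminate epidemic, leaving phi3(sick_john) = P(sick_john).
p_sick_john = {}
for (e,), pe in p_epi.items():
    for (e2, sj), v in p_sick_given_epi(0.8).items():
        if e2 == e:
            p_sick_john[sj] = (p_sick_john.get(sj, 0.0)
                               + v * pe * phi1[(e,)] * phi2[(e,)])
print(p_sick_john)  # a distribution over sick_john; values sum to 1
```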

  30. A Factor Network Nodes epidemic(measles), epidemic(flu), ..., sick(mary,measles), sick(mary,flu), sick(bob,measles), sick(bob,flu), ..., hospital(mary), hospital(bob), ... connected by many repeated factors.

  31. First-order Representation • Logical variables parameterize random variables: epidemic(D), sick(P,D), hospital(P). • Parfactors, for parameterized factors, tie them together. • Atoms represent a set of random variables. • Constraints restrict logical variables, e.g. P ≠ john.
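A parfactor can be represented as one compact object standing for many ground factors; a minimal sketch (field names are mine, not the talk's implementation):

```python
from dataclasses import dataclass

@dataclass
class Parfactor:
    """One potential shared by every grounding of its parameterized atoms."""
    logical_vars: list   # e.g. ["P", "D"]
    constraints: list    # e.g. [("P", "!=", "john")]
    atoms: list          # e.g. ["epidemic(D)", "sick(P,D)"]
    potential: dict      # value tuples -> numbers, shared by all groundings

g = Parfactor(
    logical_vars=["P", "D"],
    constraints=[("P", "!=", "john")],
    atoms=["epidemic(D)", "sick(P,D)"],
    potential={(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6},
)
# This single object stands for |domain(P)| * |domain(D)| ground factors.
```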

  32. Semantics The parfactor expands into one ground factor per instantiation of its logical variables, e.g. f(sick(mary,measles), epidemic(measles)), over the nodes epidemic(measles), epidemic(flu), ..., sick(mary,measles), sick(mary,flu), sick(bob,measles), sick(bob,flu), ..., hospital(mary), hospital(bob), ...

  33. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  34. Inversion Elimination (IE) Σ_sick(P,D) φ(epidemic(D), sick(P,D)) with constraint D ≠ measles: IE operates on the parfactor directly, at the abstract level, and corresponds (via grounding) to running VE over epidemic(flu), epidemic(rubella), ..., sick(john, flu), sick(mary, rubella), ...

  35. Inversion Elimination (IE) • Proposed by Poole (2003); • lacked formalization; • did not identify the cases in which it is incorrect; • we formalized it and identified correctness conditions (with Dan Roth and Eyal Amir).

  36. Inversion Elimination - Limitations • Requires eliminated RVs to occur in separate instances of the parfactor. Inversion Elimination is correct for φ(epidemic(D), sick(mary, D)) with D ≠ measles: each grounding pairs epidemic(flu) with sick(mary, flu), epidemic(rubella) with sick(mary, rubella), ...
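Why inversion is correct here: each eliminated RV sick(mary, d) occurs in exactly one instance of the parfactor, so the nested sums invert into a product of one-variable sums (a standard identity; the notation is mine):

```latex
\sum_{s_1} \cdots \sum_{s_n} \prod_{i=1}^{n}
   \phi\bigl(\mathit{epidemic}(d_i),\, s_i\bigr)
 \;=\; \prod_{i=1}^{n} \sum_{s_i}
   \phi\bigl(\mathit{epidemic}(d_i),\, s_i\bigr)
```

where s_i abbreviates sick(mary, d_i). The inner sum is computed once at the first-order level instead of once per grounding.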

  37. Inversion Elimination - Limitations • Requires eliminated RVs to occur in separate instances of the parfactor. IE is not correct for φ(epidemic(D1), epidemic(D2), month) with D1 ≠ D2: each ground RV epidemic(flu), epidemic(measles), epidemic(rubella), ... appears in many instances of the parfactor.

  38. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  39. Counting Elimination For φ(epidemic(D1), epidemic(D2), month) with D1 ≠ D2, over epidemic(flu), epidemic(measles), epidemic(rubella), ...: • Need to consider joint assignments; • exponential number of those; • but the potential actually depends only on the histogram of values in the assignment: 00101 counts the same as 11000; • polynomial number of assignments instead.
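A minimal numeric check of the histogram idea (a made-up symmetric potential; a sketch of the principle, not the full algorithm):

```python
from itertools import product
from math import comb

# A symmetric pairwise potential applied to every unordered pair of the
# n binary RVs epidemic(d1)..epidemic(dn), as in the D1 != D2 parfactor.
phi = {(0, 0): 2.0, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 3.0}
n = 5

def brute_force():
    """Sum over all 2^n joint assignments (exponential)."""
    total = 0.0
    for vals in product([0, 1], repeat=n):
        w = 1.0
        for i in range(n):
            for j in range(i + 1, n):
                w *= phi[(vals[i], vals[j])]
        total += w
    return total

def counting():
    """Sum over the n+1 histograms: k ones, n-k zeros (polynomial)."""
    total = 0.0
    for k in range(n + 1):
        w = (phi[(1, 1)] ** comb(k, 2)         # pairs of ones
             * phi[(0, 0)] ** comb(n - k, 2)   # pairs of zeros
             * phi[(0, 1)] ** (k * (n - k)))   # mixed pairs
        total += comb(n, k) * w                # assignments per histogram
    return total

print(brute_force(), counting())  # equal: 00101 weighs the same as 11000
```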

  40. A Simple Experiment

  41. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  42. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  43. BLOG (Bayesian LOGic) • Milch & Russell (2004); • A probabilistic logic language; • Current inference is propositional sampling; • Special characteristics: • Open Universe; • Expressive language.

  44. BLOG (Bayesian LOGic)
type Hospital;
#Hospital ~ Uniform(1,10);                       // Open Universe
random Boolean Large(Hospital hosp) ~ Bernoulli(0.6);
random NaturalNum Region(Hospital hosp) ~ TabularCPD[[0.3, 0.4, 0.2, 0.1]]();
type Patient;
#Patient(HospitalOf = Hospital hosp)
  if Large(hosp) then ~ Poisson(1500);
  else if Region(hosp) = 2 then = 300;
  else ~ Poisson(500);
query Average( {#Patient(b) for Hospital b} );   // Expressive language
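For readers new to open-universe semantics, this sketch mirrors the model's generative process in plain Python (distributions follow the slide; the code is illustrative, not BLOG's sampler):

```python
import random
import numpy as np

def sample_world():
    """One possible world: the set of objects is itself sampled."""
    n_hospitals = random.randint(1, 10)       # #Hospital ~ Uniform(1,10)
    patient_counts = []
    for _ in range(n_hospitals):
        large = random.random() < 0.6         # Large(hosp) ~ Bernoulli(0.6)
        region = random.choices(range(4), weights=[0.3, 0.4, 0.2, 0.1])[0]
        if large:
            n = np.random.poisson(1500)       # #Patient(HospitalOf = hosp)
        elif region == 2:
            n = 300
        else:
            n = np.random.poisson(500)
        patient_counts.append(n)
    return patient_counts

# The query: Average({#Patient(b) for Hospital b}) in one sampled world.
world = sample_world()
print(sum(world) / len(world))
```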

  45. Inference in BLOG
random Boolean Exposed(Hospital hosp) ~ Bernoulli(0.7);
random Boolean Sick(Patient patient)
  if Exposed(HospitalOf(patient)) then
    if Male(patient) then ~ Bernoulli(0.1);
    else ~ Bernoulli(0.4);
  else = False;
random Boolean Male(Patient patient) ~ Bernoulli(0.5);
random Boolean Medicine(Hospital hosp)
  = exists Patient patient HospitalOf(patient) = hosp & Sick(patient);
guaranteed Hospital hosp13;
query Medicine(hosp13);

  46. Inference in BLOG - Sampling Open Universe Sampled: Exposed(hosp13) = false; #Patient(hosp13) sampled; Sick(patient1) = false, ..., Sick(patient73) = false; hence Medicine(hosp13) = false.

  47. Inference in BLOG - Sampling Open Universe Sampled: Exposed(hosp13) = true; Male(patient1) = true; Sick(patient1) = true; hence Medicine(hosp13) = true.
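Repeating such sampled worlds yields a Monte Carlo estimate of the query; a sketch of plain forward sampling (parameters from slide 45; the patient count is fixed here for simplicity, whereas BLOG also samples it):

```python
import random

def sample_medicine(n_patients=5):
    """One forward sample of Medicine(hosp13) from the slide-45 model."""
    exposed = random.random() < 0.7           # Exposed(hosp13) ~ Bernoulli(0.7)
    any_sick = False
    for _ in range(n_patients):
        male = random.random() < 0.5          # Male(patient) ~ Bernoulli(0.5)
        if exposed:
            p_sick = 0.1 if male else 0.4
            if random.random() < p_sick:      # Sick(patient)
                any_sick = True
    return any_sick                           # exists patient: Sick(patient)

samples = [sample_medicine() for _ in range(100_000)]
print(sum(samples) / len(samples))  # estimate of P(Medicine(hosp13) = true)
```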

  48. Outline • A goal: First-order Probabilistic Inference • An ultimate goal • Propositionalization • Lifted Inference • Inversion Elimination • Counting Elimination • DBLOG • BLOG background • DBLOG • Retro-instantiation • Improving BLOG inference • Future Directions and Conclusion

  49. Temporal models in BLOG • Expressive enough to directly write temporal models
type Aircraft;
#Aircraft ~ Poisson[3];
random Real Position(Aircraft a, NaturalNum t)
  if t = 0 then ~ UniformReal[-10, 10]()
  else ~ Gaussian(Position(a, Pred(t)), 2);
type Blip;
// num of blips from aircraft a is 0 or 1
#Blip(Source = Aircraft a, Time = NaturalNum t) ~ TabularCPD[[0.2, 0.8]]();
// num false alarms has Poisson distribution
#Blip(Time = NaturalNum t) ~ Poisson[2];
random Real BlipPosition(Blip b, NaturalNum t)
  ~ Gaussian(Position(Source(b), Pred(t)), 2);

  50. Temporal models in BLOG • Inference algorithm does not use the temporal structure: • Markov property: the state depends on the previous state only; • evidence and queries arrive at successive time steps; • Dynamic BLOG (DBLOG), analogous to Dynamic Bayesian Networks (DBNs).
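An inference method that does exploit the Markov property is particle filtering, as used for DBNs; here is a minimal sketch on a one-aircraft simplification of the slide-49 model (my simplification for illustration, not DBLOG itself):

```python
import numpy as np

def particle_filter(blip_positions, n_particles=1000):
    """Track Position(a, t) from noisy blip positions, one step at a time."""
    particles = np.random.uniform(-10, 10, n_particles)   # t = 0 prior
    for z in blip_positions:
        # Transition: Position(a,t) ~ Gaussian(Position(a, t-1), variance 2).
        particles = particles + np.random.normal(0.0, np.sqrt(2), n_particles)
        # Weight by the observation model BlipPosition ~ Gaussian(Position, 2).
        weights = np.exp(-(z - particles) ** 2 / (2 * 2.0))
        weights /= weights.sum()
        # Resample; only the current state is carried forward, so the cost per
        # step stays constant: exactly the structure generic sampling misses.
        particles = np.random.choice(particles, n_particles, p=weights)
    return particles.mean()

print(particle_filter([0.5, 1.1, 2.0, 2.4]))  # posterior mean of Position
```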
