Open universes and nuclear weapons
Outline • Why we need expressive probabilistic languages • BLOG combines probability and first-order logic • Application to global seismic monitoring for the Comprehensive Nuclear-Test-Ban Treaty (CTBT)
The world has things in it!!
• Expressive language => concise models => fast learning, sometimes fast reasoning
• E.g., the rules of chess:
  • ~1 page in first-order logic: On(color, piece, x, y, t)
  • ~100,000 pages in propositional logic: WhiteKingOnC4Move12
  • ~10^38 pages as an atomic-state model: R.B.KB.RPPP..PPP..N..N.....PP....q.pp..Q..n..n..ppp..pppr.b.kb.r
[Note: chess is a tiny problem compared to the real world]
Brief history of expressiveness

               atomic     propositional   first-order/relational
logic                     5th C B.C.      19th C
probability    17th C     20th C          21st C (be patient!)
First-order probabilistic languages • Gaifman [1964]: • Possible worlds with objects and relations, probabilities attached to (infinitely many) sentences • Halpern [1990]: • Probabilities within sentences, constraints on distributions over first-order possible worlds • Poole [1993], Sato [1997], Koller & Pfeffer [1998], various others: • KB defines distribution exactly (cf. Bayes nets) • assumes unique names and domain closure like Prolog, databases (Herbrand semantics)
Herbrand vs full first-order
Given Father(Bill, William) and Father(Bill, Junior), how many children does Bill have?
• Herbrand semantics: 2
• First-order logical semantics: between 1 and ∞
Possible worlds
• Propositional
• First-order + unique names, domain closure
• First-order open-universe
[diagram: sets of possible worlds over objects A, B, C, D for each of the three cases]
Open-universe models • Essential for learning about what exists, e.g., vision, NLP, information integration, tracking, life • [Note the GOFAI Gap: logic-based systems going back to Shakey assumed that perceived objects would be named correctly] • Key question: how to define distributions over an infinite, heterogeneous set of worlds?
Bayes nets build propositional worlds
[network: Burglary and Earthquake are parents of Alarm]
Sampled world: Burglary, not Earthquake, Alarm
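The build above samples one propositional world node by node, in topological order. A minimal Python sketch of that process (the probabilities are illustrative placeholders, not numbers from the slides):

```python
import random

# Sample one propositional world from the Burglary/Earthquake/Alarm
# network by assigning each variable in topological order.
# All probabilities below are illustrative placeholders.

def sample_world(rng):
    world = {}
    world["Burglary"] = rng.random() < 0.01
    world["Earthquake"] = rng.random() < 0.02
    # P(Alarm | Burglary, Earthquake) as a small conditional table
    p_alarm = {(True, True): 0.95, (True, False): 0.94,
               (False, True): 0.29, (False, False): 0.001}
    world["Alarm"] = rng.random() < p_alarm[world["Burglary"], world["Earthquake"]]
    return world
```

Because every variable is Boolean and fixed in advance, each run yields one complete propositional world.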
Open-universe models in BLOG • Construct worlds using two kinds of steps, proceeding in topological order: • Dependency statements: Set the value of a function or relation on a tuple of (quantified) arguments, conditioned on parent values • Number statements: Add some objects to the world, conditioned on what objects and relations exist so far
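As a toy illustration of the two step kinds (in Python rather than BLOG, with invented priors and attribute names):

```python
import random

# Toy open-universe construction: a number statement adds objects to the
# world; a dependency statement then sets a function of those objects,
# conditioned on what exists so far. All priors here are invented.

def sample_open_world(rng):
    world = {"objects": [], "size": {}}
    # Number statement: how many objects exist (hypothetical prior)
    n = rng.randint(0, 4)
    world["objects"] = [f"obj{i}" for i in range(n)]
    # Dependency statement: set Size(o) for each object that now exists
    for o in world["objects"]:
        world["size"][o] = rng.gauss(10.0, 2.0)
    return world
```

The key point the sketch captures: the set of random variables (here, the Size values) is itself determined by an earlier random choice, so different worlds contain different variables.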
Semantics
Every well-formed* BLOG model specifies a unique proper probability distribution over open-universe possible worlds; equivalent to an infinite contingent Bayes net.
* No infinite receding ancestor chains, no conditioned cycles, all expressions finitely evaluable
Example: Citation Matching
[Lashkari et al 94] Collaborative Interface Agents, Yezdi Lashkari, Max Metral, and Pattie Maes, Proceedings of the Twelfth National Conference on Articial Intelligence, MIT Press, Cambridge, MA, 1994.
Metral M. Lashkari, Y. and P. Maes. Collaborative interface agents. In Conference of the American Association for Artificial Intelligence, Seattle, WA, August 1994.
Are these descriptions of the same object?
Core task in CiteSeer, Google Scholar, and over 300 companies in the record-linkage industry
(Simplified) BLOG model

#Researcher ~ NumResearchersPrior();
Name(r) ~ NamePrior();
#Paper(FirstAuthor = r) ~ NumPapersPrior(Position(r));
Title(p) ~ TitlePrior();
PubCited(c) ~ Uniform({Paper p});
Text(c) ~ NoisyCitationGrammar(Name(FirstAuthor(PubCited(c))), Title(PubCited(c)));
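To make the generative story concrete, here is a toy Python process mirroring the structure of the BLOG model, with invented name/title lists and a crude stand-in for the noisy citation grammar (random lower-casing):

```python
import random

# Toy generative process mirroring the citation model's structure:
# sample researchers, then their papers, then noisy citation texts.
# All priors and the "noise" step are invented placeholders.

def sample_citations(rng, num_citations=4):
    # #Researcher ~ NumResearchersPrior()
    researchers = list(range(rng.randint(1, 3)))
    # Name(r) ~ NamePrior()  (placeholder name list)
    name = {r: rng.choice(["Lashkari", "Metral", "Maes"]) for r in researchers}
    # #Paper(FirstAuthor = r) ~ NumPapersPrior(...)
    papers = [(r, rng.choice(["Collaborative Interface Agents",
                              "Software Agents"]))
              for r in researchers for _ in range(rng.randint(1, 2))]
    citations = []
    for _ in range(num_citations):
        author, title = rng.choice(papers)   # PubCited(c) ~ Uniform({Paper p})
        text = f"{name[author]}. {title}."   # Text(c) ~ NoisyCitationGrammar(...)
        if rng.random() < 0.3:               # crude corruption step
            text = text.lower()
        citations.append(text)
    return citations
```

Citation matching is then the inverse problem: given only the texts, infer how many papers and researchers exist and which citations co-refer.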
Citation Matching Results Four data sets of ~300-500 citations, referring to ~150-300 papers
Example: Sibyl attacks
• Typically between 100 and 10,000 real people
• About 90% are honest and have one login ID
• Dishonest people own between 10 and 1,000 logins
• Transactions may occur between logins:
  • If two logins are owned by the same person (sibyls), a transaction is highly likely;
  • Otherwise, a transaction is less likely (depending on the honesty of each login's owner)
• A login may recommend another after a transaction:
  • Sibyls with the same owner usually recommend each other;
  • Otherwise, the probability of recommendation depends on the honesty of the two owners
#Person ~ LogNormal[6.9, 2.3]();
Honest(x) ~ Boolean[0.9]();
#Login(Owner = x) ~ if Honest(x) then 1 else LogNormal[4.6, 2.3]();
Transaction(x, y) ~ if Owner(x) = Owner(y) then SibylPrior() else TransactionPrior(Honest(Owner(x)), Honest(Owner(y)));
Recommends(x, y) ~ if Transaction(x, y) then if Owner(x) = Owner(y) then Boolean[0.99]() else RecPrior(Honest(Owner(x)), Honest(Owner(y)));

Evidence: lots of transactions and recommendations, maybe some Honest(.) assertions
Query: Honest(x)
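A toy Python sampler for the model's existence structure (honest people own one login, dishonest people own many). The log-normal parameters follow the slide; the hard caps are added here only to keep the sampled world small:

```python
import random

# Toy sampler for the sibyl model's ownership structure.
# Parameters follow the slide; the caps (40 people, 50 logins)
# are placeholder choices to keep the toy world tractable.

def sample_ownership(rng):
    n_people = min(int(rng.lognormvariate(6.9, 2.3)), 40)      # #Person
    honest = {p: rng.random() < 0.9 for p in range(n_people)}  # Honest(x)
    owner = {}                                                 # Owner(login) -> person
    next_login = 0
    for p in range(n_people):
        # #Login(Owner = x) ~ if Honest(x) then 1 else LogNormal[4.6, 2.3]()
        k = 1 if honest[p] else max(1, min(int(rng.lognormvariate(4.6, 2.3)), 50))
        for _ in range(k):
            owner[next_login] = p
            next_login += 1
    return honest, owner
```

Inference runs the other way: from observed transactions and recommendations between logins, recover the hidden Owner map and each person's honesty.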
#Aircraft(EntryTime = t) ~ NumAircraftPrior();
Exits(a, t) if InFlight(a, t) then ~ Bernoulli(0.1);
InFlight(a, t) if t < EntryTime(a) then = false elseif t = EntryTime(a) then = true else = (InFlight(a, t-1) & !Exits(a, t-1));
State(a, t) if t = EntryTime(a) then ~ InitState() elseif InFlight(a, t) then ~ StateTransition(State(a, t-1));
#Blip(Source = a, Time = t) if InFlight(a, t) then ~ NumDetectionsCPD(State(a, t));
#Blip(Time = t) ~ NumFalseAlarmsPrior();
ApparentPos(r) if (Source(r) = null) then ~ FalseAlarmDistrib() else ~ ObsCPD(State(Source(r), Time(r)));
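The InFlight recursion can be unrolled for a single aircraft; a hypothetical Python sketch (the Bernoulli(0.1) exit probability is from the model, the rest is scaffolding):

```python
import random

# Unroll InFlight(a, t) for one aircraft: false before EntryTime, true at
# EntryTime, then persistent until Exits fires (Exits ~ Bernoulli(0.1)
# while in flight, as in the model above).

def in_flight_trajectory(rng, entry_time, horizon):
    traj = []
    flying = False
    for t in range(horizon):
        if t < entry_time:
            flying = False
        elif t == entry_time:
            flying = True
        elif flying and rng.random() < 0.1:   # Exits(a, t-1)
            flying = False
        traj.append(flying)
    return traj
```

Each in-flight step would then generate zero or more blips, alongside false-alarm blips with no source, which is what makes the number of objects behind the data unknown.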
Inference
Theorem: BLOG inference algorithms (rejection sampling, importance sampling, MCMC) converge to the correct posterior for any well-formed* model and any first-order query.
The current generic MCMC engine is quite slow:
• Applying compiler technology
• Developing user-friendly methods for specifying piecemeal MCMC proposals
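For intuition, here is the simplest of the three algorithms on a tiny propositional model (all numbers invented): rejection sampling draws full worlds and keeps only those consistent with the evidence.

```python
import random

# Rejection sampling for P(Burglary | Alarm = true) in a tiny model with
# made-up parameters: sample complete worlds, discard those that do not
# match the evidence, and average over the survivors.

def estimate_burglary_given_alarm(rng, n=200_000):
    accepted = hits = 0
    for _ in range(n):
        b = rng.random() < 0.1
        e = rng.random() < 0.1
        a = rng.random() < (0.95 if (b or e) else 0.01)
        if a:                      # evidence: Alarm = true
            accepted += 1
            hits += b
    return hits / accepted
```

The estimate converges to the true posterior as n grows, but the acceptance rate collapses for unlikely evidence, which is one reason the generic engines above rely on MCMC instead.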
CTBT
• Bans testing of nuclear weapons on Earth
• Allows for outside inspection of 1,000 km²
• 182/195 states have signed
• 153/195 have ratified
• Needs 9 more ratifications, including the US and China
• The US Senate refused to ratify in 1998: "too hard to monitor"