Imprecise Probabilities and Their Role in General Intelligence
A Pragmatic Approach to Calculating “Weight of Evidence” Combining Imprecise Probabilities and Confidence Intervals
Dr. Matthew Iklé, Department of Mathematics and Computer Science, Adams State College
Probability Theory • A Principled Foundation for Artificial General Intelligence: • BUT: Constraints are placed by • The need to operate within realistic computational resources. • The current, incomplete state of probabilistic mathematics. • THUS: Probability theory requires augmentation with heuristic approaches to be pragmatic for general intelligence.
Probabilistic Logic Networks (PLN) • A Logical Inference System: • Combines rigorous probabilistic formulas with heuristic rules. • Handles reasoning based on uncertain knowledge and/or reasoning leading to uncertain conclusions. • Encompasses within a single logical framework induction, abduction, analogy, speculation, and reasoning about time and causality. • Effectively propagates uncertainties through complex inferences involving quantifiers, higher-order functions, etc. • Designed for integration with a general-purpose cognition process (in the Novamente AI system).
Probabilistic Logic Networks (PLN) • A Rich Set of Inference Rules: • Deduction, Bayes Rule, Unification, Intensional/Extensional Inference, Belief Revision,… • Each rule comes with uncertain truth value formulas, calculating the truth value of the conclusion from the truth values of the premises • Inference is controlled by highly flexible forward and backward chaining processes able to take feedback from external processes and thus behave adaptively
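To make "uncertain truth value formulas" concrete, below is a minimal sketch of one of the simpler formulas, the independence-based PLN deduction strength rule (A→B, B→C |- A→C) as given in the PLN literature; the Python scaffolding, function name, and example numbers are illustrative.

```python
# One concrete truth-value formula: the independence-based PLN deduction
# strength rule (A->B, B->C |- A->C).  sAB, sBC are link strengths;
# sB, sC are node (term) probabilities.  Scaffolding is illustrative.

def deduction_strength(sAB, sBC, sB, sC):
    """Strength of A->C from A->B and B->C, assuming independence."""
    if sB >= 1.0:
        return sC   # degenerate case: B covers everything
    return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB)

print(deduction_strength(sAB=0.8, sBC=0.9, sB=0.5, sC=0.6))  # 0.78
```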
Belief Revision • One simple, but critical rule within PLN and other uncertain inference systems • Allows the combination of two different estimates of the truth value of the same proposition, to form a composite estimate • Is awkwardly handled within standard probabilistic approaches • Different estimates may come from different external sources, OR from different internal inference trails
Belief Revision: A Heuristic Rule • <s, d> = <strength, weight of evidence> • count: n = k·d/(1−d), equivalently d = n/(n+k); assume k = 10 • eat(cat, mouse) <.8, .7> // data source 1 • eat(cat, mouse) <.2, .4> // data source 2 • |- • eat(cat, mouse) <.58, .75> // sources 1 and 2 • strength: (.8·.7 + .2·.4)/(.7 + .4) ≈ .58 • d1 = .7 → n1 ≈ 23 • d2 = .4 → n2 ≈ 7 • n = n1 + n2 = 30 (assuming the two evidence sources are independent) • d = n/(n + k) = 30/40 = .75
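A minimal Python sketch of the heuristic shown above, using the slide's conventions (truth values <strength, confidence>, count n = k·d/(1−d), k = 10); the function names and packaging are illustrative, not PLN's actual implementation.

```python
# Minimal sketch of the belief-revision heuristic shown above.
# Truth values are <strength s, confidence d>, with evidence count
# n = k*d/(1-d) and d = n/(n+k), k = 10.  Illustrative only.

K = 10.0

def count_from_confidence(d, k=K):
    """Convert confidence d into an evidence count n = k*d/(1-d)."""
    return k * d / (1.0 - d)

def confidence_from_count(n, k=K):
    """Convert an evidence count n back into a confidence d = n/(n+k)."""
    return n / (n + k)

def revise(tv1, tv2, k=K):
    """Combine two <strength, confidence> estimates of the same proposition.

    Strengths are averaged, weighted by confidence; counts are summed,
    assuming the two evidence sources are independent."""
    (s1, d1), (s2, d2) = tv1, tv2
    s = (s1 * d1 + s2 * d2) / (d1 + d2)          # confidence-weighted strength
    n = count_from_confidence(d1, k) + count_from_confidence(d2, k)
    return s, confidence_from_count(n, k)

# eat(cat, mouse) <.8,.7> revised with eat(cat, mouse) <.2,.4>
print(revise((0.8, 0.7), (0.2, 0.4)))  # approx (0.58, 0.75)
```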
Weight of Evidence • What is it? • Why is it important? • E.g. belief revision • One approach: weight of ev. = interval width • [.2,.8] means less evidence than [.4,.6] • Pei Wang’s NARS system • Imprecise probabilities • Heuristic approaches (Izabela Freire Goertzel)
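As a hedged illustration of the "weight of evidence = interval width" idea, the sketch below assumes a NARS-style mapping in which an interval of width w corresponds to confidence d = 1 − w, and hence (via d = n/(n+k) from the revision slide) to an evidence count; the mapping and constant are assumptions for illustration, not formulas from the slides.

```python
# Hedged illustration of "weight of evidence = interval width".
# Assumption: interval width w corresponds to confidence d = 1 - w
# (a NARS-style convention), with d = n/(n+k) as in the revision example.

def count_from_interval(lower, upper, k=10.0):
    width = upper - lower
    d = 1.0 - width            # narrower interval -> higher confidence
    return k * d / (1.0 - d)   # -> larger evidence count

print(count_from_interval(0.2, 0.8))  # width 0.6 -> n ~ 6.7 (less evidence)
print(count_from_interval(0.4, 0.6))  # width 0.2 -> n = 40  (more evidence)
```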
Imprecise Probabilities • The foundation of one approach to weight-of-evidence calculations within PLN: • Peter Walley’s Imprecise Beta-Binomial (IBB) theory, developed in his seminal work, Statistical Reasoning with Imprecise Probabilities, 1991. • Uses a parametrized envelope of (Beta-distribution) priors rather than assuming a single prior.
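For concreteness, the sketch below shows the posterior interval produced by the two-category imprecise Beta model underlying the IBB approach: the prior is an envelope of Beta(s·t, s·(1−t)) distributions over t in (0, 1), and after x successes in n trials the posterior expectation of the chance ranges over the interval computed below. The hyperparameter name s and the function packaging are illustrative.

```python
# A minimal sketch of the posterior interval in the imprecise Beta model
# underlying Walley's IBB approach.  Prior: an envelope of Beta(s*t, s*(1-t))
# distributions over t in (0, 1).  After x successes in n trials, the
# posterior mean of the chance ranges over [x/(n+s), (x+s)/(n+s)].

def ibb_interval(x, n, s=1.0):
    """Lower and upper posterior expectations of the success probability."""
    return x / (n + s), (x + s) / (n + s)

print(ibb_interval(x=5, n=10, s=1.0))  # approx (0.455, 0.545)
```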
Imprecise Probabilities • Advantages of Imprecise Probabilities in General: • Avoid the weakness of the traditional approach to statistics, with its reliance on often unmotivated assumptions regarding the functional forms of probability distributions. • More natural and consistent with uncertain and incomplete information. • Standard Bayesian methods offer no generally viable way to assess or reason about “second-order uncertainties” or “weight of evidence” (a point eloquently made by Pei Wang).
Imprecise Probabilities are not (quite) the entire answer • Disadvantages of Imprecise Probabilities: • Overly conservative. • They profess ignorance rather than giving guidance for practical decision-making. • In practice (e.g., over chains of inference), imprecise probability intervals rapidly expand toward [0,1], even when substantial evidence is available.
The PLN Approach • A Hybrid of Imprecise Probabilities and Traditional Confidence Intervals: • Walley’s key ideas provide a solid foundation. • Natural generalization of Walley’s parametrized distributions. • All distributions replaced by envelopes of distributions.
Three Basic Stages • To calculate the weight of evidence associated with the conclusion of an uncertain inference rule (e.g. deduction, Bayes rule,…): • Translate premise strength (probability) values s, count (weight-of-evidence) values n, and standard confidence levels b, into initial intervals [L,U]. • Calculate final [L,U] interval using the inference rule and Monte-Carlo methods (see next slide) • Translate final [L,U] interval back to get final strength, count, and confidence level values.
[Diagram] Three-level architecture: at the AI engine level, inference works with strength, count, and confidence-level values; a translation layer converts these into an initial probability interval and back; at the imprecise-probability level, Monte-Carlo methods map the initial probability interval to the final probability interval.
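A hedged end-to-end sketch of the three stages and the layered architecture, for a generic one-premise inference rule: the interval-translation formulas below are simple placeholders of my own (width shrinking as count and confidence level grow), not the actual PLN formulas, and the Monte-Carlo step just samples premise values uniformly from their intervals.

```python
# A hedged sketch of the three-stage pipeline, for a generic one-premise
# inference rule.  The interval-translation formulas are placeholders
# (width shrinks as count n and confidence level b grow); the real PLN
# formulas are more involved, so treat this as illustrative only.
import random

K = 10.0  # prior-evidence parameter, as in the belief-revision example

def to_interval(s, n, b):
    """Stage 1: (strength, count, confidence level) -> initial [L, U]."""
    width = 1.0 - b * n / (n + K)             # more evidence, higher b -> narrower
    return max(0.0, s - width / 2), min(1.0, s + width / 2)

def propagate(rule, premise_intervals, samples=10000):
    """Stage 2: push intervals through an inference rule via Monte-Carlo sampling."""
    results = [rule(*[random.uniform(L, U) for (L, U) in premise_intervals])
               for _ in range(samples)]
    return min(results), max(results)

def from_interval(L, U, b):
    """Stage 3: final [L, U] -> (strength, count, confidence level)."""
    width = U - L
    d = min(max((1.0 - width) / b, 0.0), 0.999)   # invert the width rule above
    return (L + U) / 2.0, K * d / (1.0 - d), b

# Toy example: negation as a trivially simple "inference rule".
initial = to_interval(s=0.8, n=30.0, b=0.9)
final = propagate(lambda p: 1.0 - p, [initial])
print(from_interval(*final, b=0.9))   # strength ~ 0.2, count ~ 30, b = 0.9
```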
An Example • Suppose we have: • 100 gerbils of unknown color; • 10 gerbils of known color, 5 of which are blue; • and 100 rats of known color, 10 of which are blue. • We wish to estimate the probability of a randomly chosen blue rodent being a gerbil, using Bayes rule • P(gerbil | blue) = ?
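One way to set up this calculation (my reconstruction, not necessarily the exact PLN setup): the unknown quantity is the proportion p of blue gerbils among the 100 gerbils of unknown color; sampling p from a posterior based on the 5-out-of-10 observed blue gerbils and pushing each sample through Bayes rule yields a spread of values for P(gerbil | blue), which plays the role of the final probability interval. The prior parameters below are illustrative.

```python
# Hedged reconstruction of the gerbil/rat example.  Unknown: the proportion p
# of blue gerbils among the 100 gerbils of unknown color.  Each Monte-Carlo
# sample draws p from a Beta posterior based on the 5-of-10 observed blue
# gerbils (Beta(1,1) prior, an illustrative choice), then applies Bayes rule
# by direct counting.  The spread of results stands in for the final interval.
import random

def p_gerbil_given_blue(p_blue_unknown):
    blue_gerbils = 5 + 100 * p_blue_unknown   # 5 observed blue + expected among unknowns
    blue_rats = 10
    return blue_gerbils / (blue_gerbils + blue_rats)

samples = sorted(p_gerbil_given_blue(random.betavariate(1 + 5, 1 + 5))
                 for _ in range(10000))

print(samples[len(samples) // 2])              # median, around 0.85
print(samples[250], samples[9750])             # rough central 95% spread
```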
Experimental Results [plots]: PLN approach; Bayesian (standard confidence interval) approach; Walley’s approach.
The PLN Approach • Advantages of the PLN Hybrid Method: • Introduction of traditional Bayesian confidence intervals at each stage provides an easily configurable way to control the expansion of the probability intervals. • Both Walley’s IBB theory and standard Bayesian inference follow from the PLN approach as special cases. • Allows for the modeling of all probabilities by any family of distributions. • Allows for considerably more flexibility in accounting for known and unknown quantities.
Conclusions • The PLN Hybrid Method: • Combines the solid philosophical underpinnings of imprecise probability theory with the practicality of standard Bayesian methods. • Provides the ability to adjust interval widths based on confidence levels. • Interoperates smoothly with non-probabilistic heuristic methods.