Uncertainty AI Lecture #9 by Zahid Anwar
Is abduction necessary? • Although abduction is unsound, it is often essential to solving problems • The correct version of the battery rule is not particularly useful in diagnosing car troubles, since its premise, bad battery, is our goal and its conclusions are the observable symptoms we must work with • Modus Ponens cannot be applied, and the rule must be used in an abductive fashion
Abduction • This is generally true of diagnostic and other expert systems • Faults or diseases cause symptoms, but diagnosis must work from the symptoms back to the cause • Uncertainty results from the use of abductive inference, as well as from attempts to reason with missing or unreliable data • To get around this problem, we can attach some measure of confidence to the conclusion
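The backward, symptoms-to-cause reasoning described above can be sketched in a few lines. This is a minimal illustration, not the lecture's code; the rule base and symptom names are made up:

```python
# Hypothetical fault->symptoms rules: each rule predicts the symptoms a fault
# causes. Abduction runs the rules backwards, from observed symptoms to the
# faults that would explain them. Note this is unsound: several faults may
# explain the same observations.

RULES = {
    "bad_battery": {"lights_dead", "starter_dead"},
    "bad_starter": {"starter_dead"},
    "empty_tank":  {"engine_cranks_but_no_start"},
}

def abduce(observed):
    """Return every fault whose predicted symptoms are all observed."""
    return [fault for fault, symptoms in RULES.items()
            if symptoms <= observed]

print(abduce({"lights_dead", "starter_dead"}))
```

Both bad_battery and bad_starter explain a dead starter here, which is exactly why abductive conclusions need an attached measure of confidence.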
Example • For example, although battery failure does not always accompany the failure of a car's lights and starter, it almost always does, and confidence in this rule is justifiably high • There are several ways of dealing with the uncertainty that results from heuristic rules: • The Bayesian approach • Stanford certainty theory • Zadeh's fuzzy set theory • Non-monotonic reasoning
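Attaching a confidence measure to a rule, in the spirit of Stanford certainty theory, can be sketched as follows. The certainty values are illustrative assumptions, not figures from the lecture:

```python
# Sketch of certainty factors (Stanford certainty theory style).
# Rule (hypothetical): IF lights fail AND starter fails THEN bad battery,
# with certainty factor 0.9.

def combine(cf1, cf2):
    """Combine two positive certainty factors supporting the same hypothesis."""
    return cf1 + cf2 * (1 - cf1)

cf_rule = 0.9        # confidence attached to the rule itself
cf_evidence = 0.8    # confidence that the symptoms were actually observed
cf_bad_battery = cf_rule * cf_evidence

print(round(cf_bad_battery, 2))            # confidence from this one rule
print(round(combine(cf_bad_battery, 0.5), 2))  # a second supporting rule raises it
```

The combination function never exceeds 1 and grows monotonically as supporting evidence accumulates, which is the behavior a confidence measure needs.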
Bayesian Probability Theory • The Bayesian approach to uncertainty is based on formal probability theory • Assuming a random distribution of events, probability theory allows the calculation of more complex probabilities from previously known results • In the mathematical theory of probability, individual probabilities are established by sampling, and combinations of probabilities are worked out using rules such as
Bayesian Probability Theory • probability(A and B) = probability(A) * probability(B), given that A and B are independent events • One of the most important results of probability theory is Bayes' theorem • Bayes' result provides a way of computing the probability of a hypothesis following from a particular piece of evidence, given only the probabilities with which the evidence follows from actual causes
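The independence rule above can be checked by direct enumeration; the two-dice example is illustrative and not from the lecture:

```python
# Checking probability(A and B) = probability(A) * probability(B) for
# independent events: A = "first die shows 6", B = "second die shows 6".
from fractions import Fraction

p_a = Fraction(1, 6)
p_b = Fraction(1, 6)

# Count the joint outcome directly over all 36 equally likely rolls.
favourable = sum(1 for i in range(1, 7) for j in range(1, 7)
                 if i == 6 and j == 6)
p_both = Fraction(favourable, 36)

print(p_both == p_a * p_b)   # the product rule holds for independent events
```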
Bayes' theorem states • P(H | E) = P(E | H) * P(H) / Σ (from k = 1 to n) P(E | H_k) * P(H_k) where P(H | E) is the probability that H is true given evidence E, P(H) is the probability that H is true overall, P(E | H) is the probability of observing evidence E when H is true, and n is the number of possible hypotheses
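The theorem can be applied numerically as follows. The priors and likelihoods are made-up illustrative values for the battery example, not data from the lecture:

```python
# Numerical sketch of Bayes' theorem, using the full-denominator form:
# P(H_i | E) = P(E | H_i) * P(H_i) / sum_k P(E | H_k) * P(H_k)

def bayes(priors, likelihoods, i):
    """Posterior P(H_i | E) given P(H_k) and P(E | H_k) for every hypothesis k."""
    total = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / total

priors      = [0.1, 0.9]    # P(bad_battery), P(battery_ok)  (assumed values)
likelihoods = [0.95, 0.05]  # P(lights_and_starter_dead | H_k)

print(bayes(priors, likelihoods, 0))   # posterior probability of bad_battery
```

Even with a low prior, strong evidence drives the posterior for bad_battery well above one half, which is how the theorem inverts "fault causes symptom" probabilities into "symptom suggests fault" probabilities.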
Semantics of Predicate Calculus • The truth of expressions depends on the mapping of constants, variables, predicates and functions into the objects and relations of the domain of discourse • The truth of relationships in the domain determines the truth of the corresponding expressions • friends(george, susie) • friends(george, kate)
Interpretation • An interpretation is an assignment of elements and relations of the domain of discourse, D, to each of the constants, variables, predicates and functions of an expression
Satisfy • An interpretation that makes a sentence true is said to satisfy that sentence • An interpretation that satisfies every member of a set of expressions is said to satisfy the set
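An interpretation and the satisfy relation can be sketched concretely. Following the friends(george, susie) example above, an interpretation assigns each predicate symbol a relation over the domain, and a ground sentence is satisfied when its argument tuple belongs to that relation:

```python
# Sketch of an interpretation: predicate symbols map to relations (sets of
# tuples) over the domain. Names follow the slide's friends example.

domain = {"george", "susie", "kate"}
interpretation = {
    "friends": {("george", "susie"), ("george", "kate")},
}

def satisfies(interp, predicate, *args):
    """True iff the argument tuple is in the relation assigned to predicate."""
    return tuple(args) in interp[predicate]

print(satisfies(interpretation, "friends", "george", "susie"))  # satisfied
print(satisfies(interpretation, "friends", "susie", "kate"))    # not satisfied
```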
Logically Follows • An expression X logically follows from a set of predicate calculus expressions S if every interpretation that satisfies S also satisfies X • The function of logical inference is to produce new sentences that logically follow from a given set of expressions
Sound • When every sentence X produced by an inference rule operating on a set S of logical expressions logically follows from S, the inference rule is said to be sound • If the inference rule is able to produce every sentence that logically follows from S, then it is said to be complete • Modus Ponens and resolution are examples of inference rules that are sound
Example of an Interpretation • ∀ x: human(x) → mortal(x) • human(socrates) • Under an interpretation that maps mortal onto the relation dead, these sentences yield dead(socrates): every human dies
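The example above can be sketched as a single sound inference step: modus ponens applied to the universal rule and the fact about socrates. The string encoding of sentences is an illustrative simplification, not the lecture's notation:

```python
# Modus ponens sketch: from human(socrates) and the rule
# "forall x: human(x) -> mortal(x)", derive mortal(socrates).
import re

facts = {"human(socrates)"}
rules = [("human", "mortal")]   # premise predicate -> conclusion predicate

def modus_ponens(facts, rules):
    """Apply each rule to each matching ground fact; return all derived facts."""
    derived = set(facts)
    for premise_pred, conclusion_pred in rules:
        for fact in facts:
            m = re.fullmatch(rf"{premise_pred}\((\w+)\)", fact)
            if m:   # crude single-variable unification
                derived.add(f"{conclusion_pred}({m.group(1)})")
    return derived

print(sorted(modus_ponens(facts, rules)))
```

Because modus ponens is sound, the derived sentence mortal(socrates) is satisfied by every interpretation that satisfies the two original sentences.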