
Uncertainty CS570 Lecture Note by Jin Hyung Kim Computer Science Department KAIST



  1. Uncertainty CS570 Lecture Note by Jin Hyung Kim Computer Science Department KAIST

  2. Reasoning Under Uncertainty • Motivation • Truth value is unknown • Too complex to compute before a decision must be made • characteristic of real-world applications • Sources of Uncertainty • cannot be explained by a deterministic model • decay of radioactive substances • not well understood • disease transmission mechanisms • too complex to compute • coin tossing

  3. Types of Uncertainty • Randomness • Which side will come up if I toss a coin? • Vagueness • Am I pretty? • Confidence • How confident are you in your decision? • One formalism for all vs. separate formalisms • Representation + Computational Engine

  4. Representing Uncertain Knowledge • Modeling • select relevant (believed) propositions and events • ignore irrelevant, independent events • quantifying relationships is difficult • Considerations for Representation • problem characteristics • degree of detail • computational complexity • simplification by assumption • Is the data obtainable? • Is a computational engine available?

  5. Uncertain Representations • Binary Logic • Multi-valued Logic • Probability Theory • Upper/Lower Probability • Possibility Theory [Figure: these formalisms arranged along a scale from 0 to 1]

  6. Applications Involving Uncertainty • Wide range of applications • disease diagnosis • language understanding • pattern recognition • managerial decision making • useful answers from uncertain, conflicting knowledge • acquiring qualitative and quantitative relationships • data fusion • combining multiple experts' opinions

  7. Probability Theory • Frequency Interpretation • flipping a coin • generalizing to future events • Subjective Interpretation • probability of passing an exam • degree of belief

  8. Degree of Belief • Elicitation of degree of belief from a subject • lottery comparisons - utility functions • Elicitation of conditional probability • conversion using Bayes rule • Different subjects may assign different subjective probabilities to the same proposition • due to different backgrounds and knowledge [Figure: elicitation lottery - a $1/$0 bet on Rain vs. No Rain compared against a reference lottery paying $1 with probability 0.9 and $0 with probability 0.1]

  9. Human Judgement and Fallibility • Systematic bias in probability judgement • Psychologist Tversky's experiment • people are conservative in updating beliefs • A: 80% chance of $4000 • B: 100% chance of $3000 • C: 20% chance of $4000 • D: 25% chance of $3000 • A or B? C or D? • Risk-averse with high probabilities and risk-taking with low probabilities (see the sketch below)
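
A quick computation makes the bias concrete. The sketch below (Python, not from the slides) compares the expected monetary values of the four lotteries: A beats B and C beats D in expectation, yet subjects typically choose B over A and C over D.

```python
# Expected monetary value of each lottery from Tversky's experiment.
# Most subjects prefer B over A (risk-averse) yet C over D (risk-taking),
# even though A and C have the higher expected values.
lotteries = {
    "A": (0.80, 4000),  # 80% chance of $4000 -> EV = $3200
    "B": (1.00, 3000),  # sure $3000          -> EV = $3000
    "C": (0.20, 4000),  # 20% chance of $4000 -> EV = $800
    "D": (0.25, 3000),  # 25% chance of $3000 -> EV = $750
}
for name, (p, payoff) in lotteries.items():
    print(f"EV({name}) = {p * payoff:.0f}")
```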

  10. Random Variable • A variable takes a value from its sample space • values are mutually exclusive and exhaustive • Joint sample space: Ω = Ω1 × Ω2 × … × Ωn • 0 ≤ Pr(X = x) ≤ 1 for all x ∈ Ω, and Σx∈Ω Pr(X = x) = 1

  11. More on Probability Theory • Joint Distribution • the joint probability table grows exponentially in the number of random variables n • Conditional Probability • Prior Belief + New Evidence → Posterior Belief

  12. Calculus of Combining Probabilities • Marginalizing • Addition Rule • Chain Rule: Pr(X1, …, Xn) = Pr(X1|X2, …, Xn) Pr(X2|X3, …, Xn) … Pr(Xn-1|Xn) Pr(Xn) • factorization (see the sketch below)
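
A minimal sketch of marginalizing and conditioning, assuming a hypothetical joint table over two binary variables X and Y (the table values are illustrative, not from the slides):

```python
from itertools import product

# Hypothetical joint distribution over two binary variables X and Y,
# stored as a table {(x, y): Pr(X=x, Y=y)}.
joint = {
    (0, 0): 0.30, (0, 1): 0.20,
    (1, 0): 0.10, (1, 1): 0.40,
}

# Marginalizing: Pr(X=x) = sum over y of Pr(X=x, Y=y)
def marginal_x(x):
    return sum(p for (xv, yv), p in joint.items() if xv == x)

# Conditional probability: Pr(Y=y | X=x) = Pr(X=x, Y=y) / Pr(X=x)
def conditional_y_given_x(y, x):
    return joint[(x, y)] / marginal_x(x)

print(marginal_x(1))                # 0.5
print(conditional_y_given_x(1, 1))  # 0.8

# Chain rule check: Pr(X, Y) = Pr(Y | X) Pr(X) for every cell
for x, y in product((0, 1), repeat=2):
    assert abs(joint[(x, y)] - conditional_y_given_x(y, x) * marginal_x(x)) < 1e-12
```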

  13. Bayes Rule • Pr(A|B) = Pr(B|A) Pr(A) / Pr(B) • Allows converting Pr(A|B) to Pr(B|A), and vice versa • Allows eliciting psychologically obtainable values • Pr(symptom | disease) vs. Pr(disease | symptom) • Pr(cause | effect) vs. Pr(effect | cause) • Pr(object | attribute) vs. Pr(attribute | object) • Probabilistic Reasoning → Bayesian School (see the sketch below)
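
A minimal sketch of the inversion, assuming hypothetical values for the prior and the likelihoods (none of the numbers come from the slides):

```python
# Hypothetical numbers: elicit Pr(symptom | disease) and the prior,
# then invert with Bayes rule to get Pr(disease | symptom).
p_disease = 0.01                    # prior Pr(disease)
p_symptom_given_disease = 0.90      # Pr(symptom | disease)
p_symptom_given_no_disease = 0.05   # Pr(symptom | no disease)

# Total probability: Pr(symptom) = sum over disease states
p_symptom = (p_symptom_given_disease * p_disease
             + p_symptom_given_no_disease * (1 - p_disease))

# Bayes rule: Pr(disease | symptom) = Pr(symptom | disease) Pr(disease) / Pr(symptom)
p_disease_given_symptom = p_symptom_given_disease * p_disease / p_symptom
print(f"Pr(disease | symptom) = {p_disease_given_symptom:.3f}")  # ~0.154
```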

  14. Conditional Independence • Specifying the distribution of n binary variables • requires 2ⁿ numbers • Conditional Independence Assumption • Pr(A|B, C) = Pr(A|C) • A is conditionally independent of B given C • A does not depend on B given C → we can ignore B when inferring A if we have C • Pr(A,B|C) = Pr(A|C) Pr(B|C) • a symmetric relationship • The chain rule and conditional independence together save storage (see the sketch below) • ex: Pr(A,B,C,D) = Pr(A|B) Pr(B|C) Pr(C|D) Pr(D)
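
To see the storage savings, the sketch below counts free parameters for n hypothetical binary variables: the full joint table versus a chain factorization like Pr(A|B) Pr(B|C) Pr(C|D) Pr(D) generalized to n variables.

```python
# Storage needed for n binary variables:
# full joint table vs. a chain factorization such as
# Pr(X1|X2) Pr(X2|X3) ... Pr(Xn-1|Xn) Pr(Xn).
n = 20
full_joint = 2 ** n - 1   # independent entries in the full joint table
# Chain: one prior (1 number) plus n-1 conditionals of 2 numbers each
chain = 1 + (n - 1) * 2
print(full_joint, chain)  # 1048575 vs 39
```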

  15. Problems • Approximate the joint probability distribution of n variables by a set of 2nd-order joint probabilities: Pr(A,B,C,D) = P(A) P(B|A) P(C|A) P(D|C), or P(B) P(A|B) P(C|B) P(D|C), or … • From P(A) and P(B), estimate P(A, B) • Maximum Entropy Principle (see the sketch below)
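
For the second problem, the maximum-entropy joint consistent with the two marginals alone is the independent one, P(A, B) = P(A) P(B). A minimal sketch with hypothetical marginals:

```python
# With only the marginals Pr(A) and Pr(B) known, the maximum-entropy
# joint distribution treats them as independent: Pr(A, B) = Pr(A) Pr(B).
p_a, p_b = 0.3, 0.6  # hypothetical marginals
estimate = {
    (1, 1): p_a * p_b,
    (1, 0): p_a * (1 - p_b),
    (0, 1): (1 - p_a) * p_b,
    (0, 0): (1 - p_a) * (1 - p_b),
}
assert abs(sum(estimate.values()) - 1.0) < 1e-12  # a valid distribution
print(estimate[(1, 1)])  # 0.18
```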

  16. Maintaining Consistency • Pr(a|b) = 0.7, Pr(b|a) = 0.3, Pr(b) = 0.5 • these assessments are inconsistent • check by Bayes rule: Pr(a,b) = Pr(b|a) Pr(a) = Pr(a|b) Pr(b), so Pr(a) = Pr(a|b) Pr(b) / Pr(b|a) = 0.7 × 0.5 / 0.3 ≈ 1.17 > 1.0 (see the sketch below)
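
The same check as a sketch (the probability values are the ones on the slide):

```python
# Consistency check from the slide: Pr(a,b) factors two ways,
# so Pr(a) = Pr(a|b) Pr(b) / Pr(b|a) must be a valid probability.
p_a_given_b, p_b_given_a, p_b = 0.7, 0.3, 0.5

p_a = p_a_given_b * p_b / p_b_given_a
print(f"implied Pr(a) = {p_a:.3f}")  # 1.167
if not (0.0 <= p_a <= 1.0):
    print("inconsistent assessments")  # triggered: implied Pr(a) > 1
```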

  17. Probabilistic Networks • Also called Bayesian networks, causal networks, knowledge maps • Graphical model of causality and influence • directed acyclic graph, G = (V, E) • V: random variables • E: dependencies between random variables • at most one edge directly connects any two nodes • direction of an edge: causal influence • for X → Y → Z • evidence regarding X: causal support for Y • evidence regarding Z: diagnostic support for Y

  18. H: hardware problem • B: bugs in LISP code • E: editor is running • L: LISP interpreter O.K. • F: cursor is flashing • P: prompt displayed • W: dog is barking [Figure: the network over these nodes, with edges drawn from parent to child (successor)]

  19. [Figure: the burglary network - Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls]

  20. Independence Relationships • Each node is conditionally independent of all its non-successors (non-descendants) given its parents • Pr(P|E,F,L,J,B,H) = Pr(P|L) • Topological sort of the variables • each variable appears after all of its children • Let X1, X2, …, Xn be a topological sort of the variables of G • Pr(Xi | Xi+1, Xi+2, …, Xn) = Pr(Xi | parents(Xi)) (see the sketch below)
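
A minimal sketch of this factorization, assuming a sub-network of the slide-18 example (H and B are parents of L, and L is the parent of P) with hypothetical CPT numbers:

```python
# Evaluate one entry of the joint distribution as the product of
# Pr(X | parents(X)) over the nodes of the network.
# `parents` and `cpt` are illustrative; a CPT maps
# (value, parent-values) -> probability.
parents = {"H": (), "B": (), "L": ("H", "B"), "P": ("L",)}
cpt = {
    "H": {(1, ()): 0.1, (0, ()): 0.9},
    "B": {(1, ()): 0.2, (0, ()): 0.8},
    "L": {(1, (0, 0)): 0.99, (0, (0, 0)): 0.01,
          (1, (0, 1)): 0.10, (0, (0, 1)): 0.90,
          (1, (1, 0)): 0.05, (0, (1, 0)): 0.95,
          (1, (1, 1)): 0.01, (0, (1, 1)): 0.99},
    "P": {(1, (1,)): 0.95, (0, (1,)): 0.05,
          (1, (0,)): 0.02, (0, (0,)): 0.98},
}

def joint_prob(assignment):
    """Pr(assignment) = product over i of Pr(Xi = xi | parents(Xi))."""
    prob = 1.0
    for var, value in assignment.items():
        parent_vals = tuple(assignment[p] for p in parents[var])
        prob *= cpt[var][(value, parent_vals)]
    return prob

# Pr(H=0, B=0, L=1, P=1) = 0.9 * 0.8 * 0.99 * 0.95
print(joint_prob({"H": 0, "B": 0, "L": 1, "P": 1}))  # ~0.677
```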

  21. [Figure: the example network] Pr(P,F,L,E,B,H) = Pr(P|F,L,E,B,H) Pr(F|L,E,B,H) Pr(L|E,B,H) Pr(E|B,H) Pr(B|H) Pr(H) = Pr(P|L) Pr(F|E) Pr(L|H,B) Pr(E|H) Pr(B) Pr(H)

  22. Path-Based Characterization of Independence • A is dependent on C given B • F is independent of D given E [Figure: example network over nodes A, B, C, D, E, F]

  23. Direction-Dependent Separation (D-separation) • A path P is blocked given a set of nodes E if there exists a node Z ∈ P for which: (1) Z ∈ E and Z has one arrow in and one arrow out on P, or (2) Z ∈ E and Z has both path arrows leading out, or (3) Z ∉ E, no successor of Z is in E, and both path arrows lead into Z • A set of nodes E d-separates two sets of nodes X and Y if every undirected path from X to Y is blocked • If E d-separates X and Y, then X and Y are conditionally independent given E.

  24. [Figure: the three blocking cases on a path from X to Y - (1) a chain through Z, (2) a fork at Z, (3) a collider at Z - showing how the path from X to Y is blocked given evidence E]
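
A minimal sketch of the blocking test from slide 23, run on the burglary network of slide 19; the function names are illustrative, and the enumeration of undirected paths is naive (fine for small networks):

```python
# d-separation by enumerating undirected paths and applying
# blocking cases 1-3 from slide 23. Edges point from parent to child.
edges = {("Burglary", "Alarm"), ("Earthquake", "Alarm"),
         ("Alarm", "JohnCalls"), ("Alarm", "MaryCalls")}

def descendants(node):
    """All successors of `node` in the DAG."""
    kids = {c for (p, c) in edges if p == node}
    out = set(kids)
    for k in kids:
        out |= descendants(k)
    return out

def undirected_paths(x, y, visited=()):
    """Enumerate simple undirected paths from x to y."""
    if x == y:
        yield (x,)
        return
    for (p, c) in edges:
        for (a, b) in ((p, c), (c, p)):  # traverse each edge both ways
            if a == x and b not in visited:
                for rest in undirected_paths(b, y, visited + (x,)):
                    yield (x,) + rest

def blocked(path, evidence):
    """True if some inner node Z blocks the path (cases 1-3)."""
    for i in range(1, len(path) - 1):
        prev, z, nxt = path[i - 1], path[i], path[i + 1]
        if (prev, z) in edges and (nxt, z) in edges:
            # Case 3: collider -- both path arrows lead into Z
            if z not in evidence and not (descendants(z) & evidence):
                return True
        elif z in evidence:
            # Cases 1 and 2: chain or fork through an evidence node
            return True
    return False

def d_separated(x, y, evidence):
    return all(blocked(p, evidence) for p in undirected_paths(x, y))

print(d_separated("JohnCalls", "MaryCalls", {"Alarm"}))  # True: fork blocked
print(d_separated("Burglary", "Earthquake", set()))      # True: collider blocks
print(d_separated("Burglary", "Earthquake", {"Alarm"}))  # False: explaining away
```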

  25. Quantifying the Network • Supply the conditional probability of each node given its parents [Figure: the network over H, B, E, L, F, P] (see the sketch below)
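
As a concrete sketch, the burglary network of slide 19 can be quantified with one conditional probability table per node; the numbers below are illustrative values (similar to the standard textbook example), not taken from the slides:

```python
# One CPT per node, keyed by the tuple of parent values; each entry is
# the probability that the node is true given those parent values.
cpts = {
    "Burglary":   {(): 0.001},                 # Pr(B=true)
    "Earthquake": {(): 0.002},                 # Pr(E=true)
    "Alarm": {                                  # Pr(A=true | B, E)
        (True, True): 0.95, (True, False): 0.94,
        (False, True): 0.29, (False, False): 0.001,
    },
    "JohnCalls": {(True,): 0.90, (False,): 0.05},  # Pr(J=true | A)
    "MaryCalls": {(True,): 0.70, (False,): 0.01},  # Pr(M=true | A)
}

# Joint entry: Pr(B=t, E=f, A=t, J=t, M=t)
p = (cpts["Burglary"][()] * (1 - cpts["Earthquake"][()])
     * cpts["Alarm"][(True, False)]
     * cpts["JohnCalls"][(True,)] * cpts["MaryCalls"][(True,)])
print(p)  # ~5.9e-4
```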
