
CSCI 121 Special Topics: Bayesian Networks Lecture #2: Bayes Nets


Presentation Transcript


  1. CSCI 121 Special Topics: Bayesian Networks — Lecture #2: Bayes Nets

  2. The burglary network: Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls.

     P(B=F)  P(B=T)          P(E=F)  P(E=T)
      .999    .001            .998    .002

     B E | P(A=F) P(A=T)
     F F |  .999   .001
     T F |  .06    .94
     F T |  .71    .29
     T T |  .05    .95

     A | P(J=F) P(J=T)        A | P(M=F) P(M=T)
     F |  .95    .05          F |  .99    .01
     T |  .10    .90          T |  .30    .70
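The CPTs above can be encoded directly as a sketch in Python (the dictionary names P_B, P_A_T, etc. and the `joint` helper are illustrative, not from the slides); the full joint distribution is just the product of the local CPT entries:

```python
from itertools import product

# Illustrative encoding of the slide's CPTs; names are mine.
P_B = {True: .001, False: .999}                      # prior P(B)
P_E = {True: .002, False: .998}                      # prior P(E)
P_A_T = {(True, True): .95, (True, False): .94,      # P(A=T | B, E)
         (False, True): .29, (False, False): .001}
P_J_T = {True: .90, False: .05}                      # P(J=T | A)
P_M_T = {True: .70, False: .01}                      # P(M=T | A)

def cond(p_true, val):
    """Probability of `val` given the table's P(...=T) entry."""
    return p_true if val else 1 - p_true

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) as a product of local CPT entries."""
    return (P_B[b] * P_E[e]
            * cond(P_A_T[(b, e)], a)
            * cond(P_J_T[a], j)
            * cond(P_M_T[a], m))

# The 32 joint entries sum to 1, as a distribution must:
total = sum(joint(*vals) for vals in product((True, False), repeat=5))
```

Summing `joint` over subsets of assignments reproduces every marginal computed in the later slides.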

  3. Why Use Bayes Nets? • Why not just use the full joint probability table (it can answer any query)?

     B E A J M | P
     F F F F F | P1
     ...
     T T T T T | Pm

  • Table size is exponential (m = 2^n entries for n variables). • The network (graph) is sparse; it only encodes local dependencies. • It needs roughly n * 2^k entries, where k = avg. number of inputs (parents) per node.
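To make the counting concrete, here is the tally for the five-variable network above (an illustrative count: each CPT stores one P(X=T) row per parent assignment):

```python
# Full joint vs. per-node CPTs for the 5-node burglary network.
n = 5
full_joint_rows = 2 ** n                             # m = 2^n rows
parents = {"B": 0, "E": 0, "A": 2, "J": 1, "M": 1}   # parent count per node
cpt_rows = sum(2 ** k for k in parents.values())     # 2^k rows per node
print(full_joint_rows, cpt_rows)                     # 32 vs. 10
```

The gap widens quickly: with 30 binary variables the full joint needs about a billion rows, while a sparse network stays linear in n.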

  4. Types of Inferences / Queries • Diagnostic: P(B | J) = 0.02 • Causal: P(J | B) = 0.85 • Intercausal: P(B | J & M & ~E) = 0.34 • Explaining Away: a university accepts a student if s/he is a good scholar or a good athlete. If an accepted student is a good scholar, is s/he a good athlete? (Berkson's Paradox / selection bias)
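Explaining away shows up numerically in the burglary network itself: observing the alarm raises belief in a burglary, but additionally observing an earthquake drives that belief back down. A minimal sketch (dictionary names are mine), using the CPT values from slide 2:

```python
# Intercausal reasoning: compare P(B | A) with P(B | A, E).
P_B = {True: .001, False: .999}
P_E = {True: .002, False: .998}
P_A_T = {(True, True): .95, (True, False): .94,
         (False, True): .29, (False, False): .001}   # P(A=T | B, E)

def joint_bea(b, e, a):
    """P(B=b, E=e, A=a)."""
    p_a = P_A_T[(b, e)] if a else 1 - P_A_T[(b, e)]
    return P_B[b] * P_E[e] * p_a

TF = (True, False)
p_a = sum(joint_bea(b, e, True) for b in TF for e in TF)
p_b_given_a = sum(joint_bea(True, e, True) for e in TF) / p_a   # alarm only
p_ea = sum(joint_bea(b, True, True) for b in TF)
p_b_given_ae = joint_bea(True, True, True) / p_ea               # alarm + quake
```

Here `p_b_given_a` is about 0.37, while `p_b_given_ae` drops to roughly 0.003: the earthquake "explains away" the alarm, so the burglary hypothesis is no longer needed.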

  5. Answering Queries • E.g., compute P(B|A). • The Product Rule tells us that P(B|A) = P(B&A) / P(A). • Problem: the only values we can obtain directly from the probability tables are the priors and the conditionals; for P(B&A) and P(A), the values are confounded with the other variable (E). • Solution: we can marginalize (sum out) the other variables to focus on the ones we care about. • First, we must convert the tables to joint probabilities of all relevant variables:

  6. Answering Queries: Marginalization. Using the priors P(B=T) = .001 and P(E=T) = .002 together with the Alarm CPT, build the joint table over B, E, A:

     B E | P(A=T)
     T T |  .95
     T F |  .94
     F T |  .29
     F F |  .001

     B E A | Prob
     T T T | .001 * .002 * .95  = .000001900
     T T F | .001 * .002 * .05  = .000000100
     T F T | .001 * .998 * .94  = .000938120
     T F F | .001 * .998 * .06  = .000059880
     F T T | .999 * .002 * .29  = .000579420
     F T F | .999 * .002 * .71  = .001418580
     F F T | .999 * .998 * .001 = .000997002
     F F F | .999 * .998 * .999 = .996004998

  7. Answering Queries: Marginalization. Now we get P(A) by summing over the rows where A = True:

     B E A | Prob
     T T T | .000001900
     T T F | .000000100
     T F T | .000938120
     T F F | .000059880
     F T T | .000579420
     F T F | .001418580
     F F T | .000997002
     F F F | .996004998

     P(A) = .000001900 + .000938120 + .000579420 + .000997002 = .002516442
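The same sum can be done programmatically; a sketch (names are mine) that sums out B and E directly from the CPT values rather than a pre-built table:

```python
from itertools import product

# Marginalize B and E out of the rows where A = True.
P_B = {True: .001, False: .999}
P_E = {True: .002, False: .998}
P_A_T = {(True, True): .95, (True, False): .94,
         (False, True): .29, (False, False): .001}   # P(A=T | B, E)

p_a = sum(P_B[b] * P_E[e] * P_A_T[(b, e)]
          for b, e in product((True, False), repeat=2))
print(p_a)   # matches the slide's .002516442
```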

  8. Answering Queries: Marginalization • Next, we need to compute P(B&A). • Again, we marginalize, this time summing the rows where both B = True and A = True:

     B E A | Prob
     T T T | .000001900
     T T F | .000000100
     T F T | .000938120
     T F F | .000059880
     F T T | .000579420
     F T F | .001418580
     F F T | .000997002
     F F F | .996004998

     P(B&A) = .000001900 + .000938120 = .000940020

  • So P(B|A) = .000940020 / .002516442 = .373551228
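Putting the two marginals together gives the full query. A self-contained sketch (names are mine) of P(B|A) = P(B&A) / P(A), with E summed out of each term:

```python
# Diagnostic query P(B=T | A=T) on the burglary network.
P_B = {True: .001, False: .999}
P_E = {True: .002, False: .998}
P_A_T = {(True, True): .95, (True, False): .94,
         (False, True): .29, (False, False): .001}   # P(A=T | B, E)

TF = (True, False)
p_b_and_a = sum(P_B[True] * P_E[e] * P_A_T[(True, e)] for e in TF)
p_a = sum(P_B[b] * P_E[e] * P_A_T[(b, e)] for b in TF for e in TF)
p_b_given_a = p_b_and_a / p_a
print(round(p_b_given_a, 4))   # 0.3736, matching the slide
```

Note how far the posterior .3736 sits from the prior P(B) = .001: a ringing alarm is strong (though not conclusive) evidence of a burglary.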
