
Recap: Reasoning Over Time

Dive into the world of Markov models, explore filtering and smoothing techniques, and uncover the power of the Viterbi algorithm. Learn to predict states over time with hidden Markov models. Discover key concepts in probabilistic reasoning and apply them to practical scenarios.


Presentation Transcript


  1. Recap: Reasoning Over Time • Stationary Markov models • Hidden Markov models [Diagrams: a two-state weather chain over rain/sun, each state persisting with probability 0.7 and switching with probability 0.3, drawn as a chain X1 → X2 → X3 → X4; and an HMM X1 … X5 with evidence variables E1 … E5] This slide deck courtesy of Dan Klein at UC Berkeley
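Reading the diagram as the usual two-state weather chain (each state persists with probability 0.7 and switches with probability 0.3), a minimal sketch of the mini-forward algorithm shows what "stationary" means here: pushing any belief through the transition model repeatedly converges to a fixed distribution. The starting belief below is an arbitrary choice for illustration.

```python
# Mini-forward algorithm for a two-state weather chain.
# Assumed transition model, read off the slide's diagram:
# each state persists with probability 0.7, switches with 0.3.
T = {"sun": {"sun": 0.7, "rain": 0.3},
     "rain": {"sun": 0.3, "rain": 0.7}}

def step(belief):
    """One elapse-time update: P(X_t) = sum_x P(X_t | x) P(x)."""
    return {s: sum(T[x][s] * p for x, p in belief.items()) for s in T}

belief = {"sun": 1.0, "rain": 0.0}   # arbitrary starting belief
for _ in range(20):
    belief = step(belief)
print(belief)   # converges toward the stationary distribution <0.5, 0.5>
```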

  2. Recap: Filtering • Elapse time: compute P(Xt | e1:t-1) • Observe: compute P(Xt | e1:t) • Belief vector <P(rain), P(sun)>, stepping through two time slices: <0.5, 0.5> prior on X1 → <0.82, 0.18> after observing E1 → <0.63, 0.37> after elapsing time to X2 → <0.88, 0.12> after observing E2
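These belief vectors can be reproduced in a few lines. The slide gives the transition model but not the sensor model; the sketch below assumes the standard umbrella-world sensor, P(umbrella | rain) = 0.9 and P(umbrella | sun) = 0.2, with an umbrella observed at both steps.

```python
# Forward filtering: alternate elapse-time and observe updates.
# Transition model from slide 1; sensor model is an assumption
# (the usual umbrella world: P(u | rain) = 0.9, P(u | sun) = 0.2).
T = {"rain": {"rain": 0.7, "sun": 0.3},
     "sun":  {"rain": 0.3, "sun": 0.7}}
O = {"rain": 0.9, "sun": 0.2}                 # P(umbrella | state)

def elapse(b):
    return {s: sum(T[x][s] * p for x, p in b.items()) for s in T}

def observe(b):
    b = {s: O[s] * p for s, p in b.items()}   # multiply in likelihood
    z = sum(b.values())
    return {s: p / z for s, p in b.items()}   # renormalize

b = {"rain": 0.5, "sun": 0.5}   # prior on X1
b = observe(b)   # -> <0.82, 0.18>  (E1 = umbrella)
b = elapse(b)    # -> <0.63, 0.37>
b = observe(b)   # -> <0.88, 0.12>  (E2 = umbrella)
```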

  3. Filtering/Smoothing • Exercise 15.13, p. 609 • Prior of “Enough Sleep” (ES) = 0.7 • ES at t-1 => Pr(ES at t) = 0.8 • ~ES at t-1 => Pr(ES at t) = 0.3 • “Red Eyes” (RE): if ES, Pr(RE) = 0.2; if ~ES, Pr(RE) = 0.7 • “Sleep in Class” (SC): if ES, Pr(SC) = 0.1; if ~ES, Pr(SC) = 0.3 • e1 = not red eyes, not sleeping in class: ~RE, ~SC • e2 = red eyes, not sleeping in class: RE, ~SC • e3 = red eyes, sleeping in class: RE, SC (a filtering pass over this model is sketched below)
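A minimal filtering pass over this model, assuming (as the exercise intends) that RE and SC are conditionally independent given ES:

```python
# Filtering for the "Enough Sleep" model of Exercise 15.13.
# State: ES in {True, False}; evidence per step: (red_eyes, sleep_in_class).
prior = {True: 0.7, False: 0.3}
trans = {True: 0.8, False: 0.3}   # P(ES_t = True | ES_{t-1})
p_re  = {True: 0.2, False: 0.7}   # P(RE | ES)
p_sc  = {True: 0.1, False: 0.3}   # P(SC | ES)

def filter_step(b, re, sc):
    # Elapse time: sum out the previous state.
    b = {s: sum((trans[x] if s else 1 - trans[x]) * p for x, p in b.items())
         for s in (True, False)}
    # Observe: multiply in the likelihood of (re, sc), then renormalize.
    like = {s: (p_re[s] if re else 1 - p_re[s]) *
               (p_sc[s] if sc else 1 - p_sc[s]) for s in (True, False)}
    b = {s: like[s] * b[s] for s in b}
    z = sum(b.values())
    return {s: p / z for s, p in b.items()}

b = prior
for re, sc in [(False, False), (True, False), (True, True)]:  # e1, e2, e3
    b = filter_step(b, re, sc)
    print(f"P(ES | evidence so far) = {b[True]:.4f}")
```

Run by hand, the three updates give 0.8643, 0.5010, and 0.1045: the first evidence vector raises belief in ES, the red eyes at t=2 roughly cancel it, and the full evidence at t=3 pushes it down sharply.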

  4. Best Explanation Queries • Query: the most likely sequence of hidden states given the evidence, argmax over x1:5 of P(x1:5 | e1:5) [Diagram: HMM with states X1 … X5 and evidence variables E1 … E5]

  5. State Path Trellis • State trellis: graph of states and transitions over time • Each arc represents some transition • Each arc has a weight • Each path is a sequence of states • The product of weights on a path is the sequence’s probability • Can think of the Forward (and now Viterbi) algorithms as computing either the sum over all paths or the best path in this graph (checked by brute force in the sketch below) [Diagram: trellis with sun and rain states at each of four time steps]
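The sum-versus-max view can be made concrete by brute force: enumerate every path through the trellis, weight each by prior × transitions × emissions, then sum (the forward algorithm's answer) or maximize (Viterbi's). This is exponential in sequence length, so it is only a sanity check; it reuses the assumed umbrella sensor model from above.

```python
from itertools import product

# Brute-force over all paths in the trellis. Summing path weights gives
# P(e_1:T) as computed by the forward algorithm; maximizing gives the
# most likely state sequence, as computed by Viterbi.
T = {"rain": {"rain": 0.7, "sun": 0.3}, "sun": {"rain": 0.3, "sun": 0.7}}
E = {"rain": 0.9, "sun": 0.2}            # assumed P(umbrella | state)
prior = {"rain": 0.5, "sun": 0.5}
evidence = [True, True, False]           # umbrella on days 1, 2; not day 3

def path_weight(path):
    w = prior[path[0]] * (E[path[0]] if evidence[0] else 1 - E[path[0]])
    for t in range(1, len(path)):
        w *= T[path[t - 1]][path[t]]
        w *= E[path[t]] if evidence[t] else 1 - E[path[t]]
    return w

paths = list(product(("rain", "sun"), repeat=len(evidence)))
print(sum(map(path_weight, paths)))      # forward: sum over all paths
print(max(paths, key=path_weight))       # Viterbi: single best path
```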

  6. Viterbi Algorithm [Diagram: the sun/rain state trellis from the previous slide]
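A sketch of the dynamic-programming version, using the recurrence m_t[x] = P(e_t | x) · max over x' of P(x | x') · m_{t-1}[x'], with backpointers to recover the best path (same assumed models as above):

```python
# Viterbi: like the forward algorithm, but with max instead of sum,
# plus backpointers to recover the best state sequence.
T = {"rain": {"rain": 0.7, "sun": 0.3}, "sun": {"rain": 0.3, "sun": 0.7}}
E = {"rain": 0.9, "sun": 0.2}            # assumed umbrella sensor model
prior = {"rain": 0.5, "sun": 0.5}

def viterbi(evidence):
    def lik(s, e):                       # P(e | state)
        return E[s] if e else 1 - E[s]
    m = {s: prior[s] * lik(s, evidence[0]) for s in T}
    back = []
    for e in evidence[1:]:
        prev, m, ptr = m, {}, {}
        for s in T:
            ptr[s] = max(prev, key=lambda x: prev[x] * T[x][s])
            m[s] = lik(s, e) * prev[ptr[s]] * T[ptr[s]][s]
        back.append(ptr)
    best = max(m, key=m.get)             # best final state
    path = [best]
    for ptr in reversed(back):           # follow backpointers
        path.append(ptr[path[-1]])
    return list(reversed(path)), m[best]

print(viterbi([True, True, False]))      # best path and its weight
```

Replacing max with sum (and dropping the backpointers) recovers the forward algorithm, which is exactly the point of the trellis view on the previous slide.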

  7. Example
