Probabilistic reasoning over time • “This sentence is likely to be untrue in the future!”
The basic problem • What do we know about the state of the world now, given a history of the world before? • The only evidence we have is probabilistic. • “Past performance may not be a guide to future performance.”
Simplifying assumptions and notation • States are our “events”. • (Partial) states can be measured at reasonable time intervals. • X_t: the unobservable state variables at time t. • E_t (“evidence”): the observable state variables at time t. • V_{m:n}: the variables V_m, V_{m+1}, …, V_n.
Stationary, Markovian (transition model) • Stationary: the laws of probability don’t change over time. • Markovian: the current unobservable state depends on only a finite number of past states. • First-order: the current state depends only on the previous state, i.e.: P(X_t | X_{0:t-1}) = P(X_t | X_{t-1}). • Second-order: the current state depends on the previous two states, and so on.
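A minimal sketch of a stationary, first-order transition model. The two-state rain world and its numbers are illustrative assumptions, not from the slides:

```python
import random

STATES = ["rain", "no_rain"]
# T[i][j] = P(X_t = j | X_{t-1} = i); each row sums to 1.
T = {
    "rain":    {"rain": 0.7, "no_rain": 0.3},
    "no_rain": {"rain": 0.3, "no_rain": 0.7},
}

def step(state):
    """Sample the next state using only the current one (first-order Markov)."""
    r, total = random.random(), 0.0
    for nxt, p in T[state].items():
        total += p
        if r < total:
            return nxt
    return nxt  # guard against floating-point round-off

# Because the model is stationary, the same T is reused at every time step.
x = "rain"
for t in range(5):
    x = step(x)
```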
Observable variables (the sensor model) • The observable variables depend only on the current state (essentially by definition); these are the “sensors”. • The current state causes the sensor values: P(E_t | X_{0:t}, E_{1:t-1}) = P(E_t | X_t).
Start it up (the prior probability model) • What is P(X_0)? • Given the prior, the joint distribution at time t is completely determined: P(X_0, X_1, …, X_t, E_1, …, E_t) = P(X_0) ∏_{i=1}^{t} P(X_i | X_{i-1}) P(E_i | X_i).
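A sketch of computing that factored joint directly. The umbrella-style sensor model and all numbers are assumed for illustration:

```python
# P(X_0, X_{1:t}, E_{1:t}) = P(X_0) * prod_i P(X_i | X_{i-1}) * P(E_i | X_i)
prior = {"rain": 0.5, "no_rain": 0.5}                    # P(X_0)
T = {"rain":    {"rain": 0.7, "no_rain": 0.3},           # P(X_t | X_{t-1})
     "no_rain": {"rain": 0.3, "no_rain": 0.7}}
O = {"rain":    {"umbrella": 0.9, "no_umbrella": 0.1},   # P(E_t | X_t)
     "no_rain": {"umbrella": 0.2, "no_umbrella": 0.8}}

def joint(states, evidence):
    """states = [x0, x1, ..., xt]; evidence = [e1, ..., et]."""
    p = prior[states[0]]
    for i in range(1, len(states)):
        p *= T[states[i - 1]][states[i]] * O[states[i]][evidence[i - 1]]
    return p

# P(X0=rain, X1=rain, X2=no_rain, E1=umbrella, E2=no_umbrella)
print(joint(["rain", "rain", "no_rain"], ["umbrella", "no_umbrella"]))
```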
Better predictions? • More state variables (temperature, humidity, pressure, season, …). • Higher-order Markov processes (take more of the past into account). • What are the tradeoffs?
What’s it good for? • Filtering/monitoring: belief about the current state. • Prediction: what the next state is likely to be. • Smoothing: hindsight about previous states. • Most likely explanation: the causes most likely to have produced the evidence.
Hidden Markov Models (HMMs) • Further simplification: only one (discrete) state variable. • We can now use matrices: T_{i,j} = P(X_t = j | X_{t-1} = i).
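With a single discrete state variable, filtering (monitoring the current state) reduces to matrix arithmetic: f_t ∝ diag(O_{e_t}) · Tᵀ · f_{t-1}. A sketch under the same assumed rain/umbrella model:

```python
import numpy as np

T = np.array([[0.7, 0.3],      # T[i, j] = P(X_t = j | X_{t-1} = i)
              [0.3, 0.7]])
O = {"umbrella":    np.diag([0.9, 0.2]),   # diagonal of P(e | X_t = i)
     "no_umbrella": np.diag([0.1, 0.8])}
f = np.array([0.5, 0.5])       # prior belief P(X_0) over [rain, no_rain]

def filter_step(f, e):
    """One forward-algorithm update: predict with T, weight by evidence, renormalize."""
    f = O[e] @ T.T @ f
    return f / f.sum()

for e in ["umbrella", "umbrella", "no_umbrella"]:
    f = filter_step(f, e)
print(f)  # belief P(X_t | e_{1:t}) over [rain, no_rain]
```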
Speech Recognition • P(words | signal) ∝ P(signal | words) P(words). • P(signal | words): the acoustic model. • P(words): the “language model”. • “Every time I fire a linguist, the recognition rate goes up.”
Model 1: Speech • Sample the speech signal at regular intervals. • Decide the most likely sequence of speech symbols (see the decoding sketch below).
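The standard algorithm for “the most likely sequence” of hidden symbols is Viterbi decoding; the slides don’t name it, so this is a hedged sketch with a toy two-symbol model whose numbers are invented:

```python
import numpy as np

states = ["s1", "s2"]
prior = np.log([0.6, 0.4])
T = np.log([[0.7, 0.3], [0.4, 0.6]])                     # log P(next | current)
E = {"a": np.log([0.5, 0.1]), "b": np.log([0.5, 0.9])}   # log P(obs | state)

def viterbi(obs):
    """Return the single most likely hidden-state sequence for obs."""
    v = prior + E[obs[0]]          # best log-prob of any path ending in each state
    back = []
    for o in obs[1:]:
        scores = v[:, None] + T    # scores[i, j]: end at j coming from i
        back.append(scores.argmax(axis=0))
        v = scores.max(axis=0) + E[o]
    path = [int(v.argmax())]       # backtrack from the best final state
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return [states[i] for i in reversed(path)]

print(viterbi(["a", "b", "b"]))
```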
Phonetic alphabet • Phonemes: the minimal units of sound that make a meaning difference (beat vs. bit; fit vs. bit). • Phones: the physical speech sounds actually articulated, which can vary without changing meaning (the different p sounds in paid vs. tap). • English has about 40 phonemes. • Co-articulation effects are modeled as new symbols, e.g. the w in sweet becomes w(s, iy).
Model 2, 3: Words, sentences • Given the phones, what is the most likely word / sequence of words in the sentence? • “Give me all your money. I have a gub.” • “Gub” is unlikely to be a word at all, • and even if it were, it would be far less likely in context than “gun.”
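To make the “gub” example concrete, a tiny sketch of the Bayes-rule word choice from the Speech Recognition slide; every probability here is made up for illustration:

```python
# P(word | signal) ∝ P(signal | word) * P(word)
acoustic = {"gub": 0.70, "gun": 0.30}   # P(signal | word): "gub" fits the audio better
language = {"gub": 1e-9, "gun": 1e-4}   # P(word): the language model

scores = {w: acoustic[w] * language[w] for w in acoustic}
best = max(scores, key=scores.get)
print(best)   # "gun": the language-model prior overrides the acoustic score
```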