Markov chains
Markov chain model: G = ((S, E), δ). Here S is a finite set of states, δ is a probabilistic transition function, and E = { (s, t) | δ(s)(t) > 0 } is the set of edges. The graph (S, E) is useful.

Markov Chain: Example — a three-state chain in which each outgoing transition has probability 1/3 (slide figure).
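As a rough illustration of the model above, here is a minimal sketch in Python. The state names, the uniform 1/3 transition probabilities (mirroring the example slide), and the simulate helper are illustrative assumptions, not part of the original deck.

```python
import random

# Finite set S of states (illustrative three-state example).
S = ["a", "b", "c"]

# Probabilistic transition function delta: state -> distribution over states.
# Here every outgoing transition has probability 1/3, as in the example slide.
delta = {s: {t: 1.0 / 3.0 for t in S} for s in S}

# Edge set E = {(s, t) | delta(s)(t) > 0}; the graph (S, E) underlies the chain.
E = {(s, t) for s in S for t in S if delta[s][t] > 0}

def simulate(start, steps, rng=random.random):
    """Follow the chain for `steps` transitions, sampling from delta each time."""
    path = [start]
    s = start
    for _ in range(steps):
        r, acc = rng(), 0.0
        for t, p in delta[s].items():
            acc += p
            if r < acc:
                s = t
                break
        path.append(s)
    return path

print(simulate("a", 5))
```

Running the sketch prints one random walk of length six (the start state plus five sampled transitions) through the graph (S, E).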