3-2. Primer: Markov chains (cont.)

3-3. Discrete-time Markov chains
- time is discrete, t = 1, 2, ...
- the system occupies state X(t) at time t
- X(t) takes values from a finite state space S
- transitions between states: P_ij = P(X(t+1) = j | X(t) = i), the probability that the system transits from i to j
- P = [P_ij] is the transition probability matrix
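To make the definition concrete, here is a minimal sketch (not part of the original slides) that simulates a discrete-time Markov chain; the three-state matrix P is an illustrative assumption, with row i holding the distribution of the next state given the current state i.

```python
import numpy as np

# Illustrative 3-state transition matrix (assumed for this example);
# entry P[i, j] = P(X(t+1) = j | X(t) = i), so each row sums to 1.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

def simulate(P, x0, steps, rng=None):
    """Return the trajectory X(1), ..., X(1 + steps) starting from state x0."""
    rng = np.random.default_rng() if rng is None else rng
    states = [x0]
    for _ in range(steps):
        # Draw the next state from the row of P indexed by the current state.
        states.append(int(rng.choice(len(P), p=P[states[-1]])))
    return states

print(simulate(P, x0=0, steps=10, rng=np.random.default_rng(42)))
```

A usage note: because each step depends only on the current state (the Markov property), the whole simulation needs nothing but the current state index and the matrix P.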