Learn how Markov Chains, a type of stochastic process, can be used to analyze trends and predict the future in various fields such as weather, stock market, genetics, and product success.
Markov Chains Mathematical models for processes that evolve over time in a probabilistic manner are called stochastic processes. A special kind of stochastic process is the Markov chain. It is characterized by the property that the probabilities governing how the process will evolve in the future depend only on the present state of the process, and so are independent of events in the past. Markov chains are used to analyze trends and predict the future (weather, stock market, genetics, product success, etc.).
Markov Chains A finite Markov chain consists of • a set of objects, • a set of consecutive time periods, and • a finite set of different states, such that • during any given time period, each object is in exactly one state (although different objects can be in different states); • the probability that an object moves from one state to another (or remains in the same state) over a time period depends only on the beginning and ending states.
Markov Chains • A Markov chain can be represented by a matrix P = [pij], where pij is the probability that an object moves from state i to state j in one time period. Such a matrix is called a transition matrix. • The transition matrix is square (by the nature of the Markov process). • The entries in each row must sum to one. • A Markov chain can also be represented by a graph. • Example on the next slide.
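As a minimal sketch (using NumPy, and assuming the two-state Coke/Pepsi matrix introduced on the next slide), a transition matrix is just a square array whose rows each sum to one:

```python
import numpy as np

# Transition matrix P[i][j] = probability of moving from state i to state j
# in one time period. States: 0 = Coke, 1 = Pepsi (next slide's example).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Sanity checks: P is square, and each row sums to one.
assert P.shape[0] == P.shape[1]
assert np.allclose(P.sum(axis=1), 1.0)
```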
Markov Chains: Coke vs. Pepsi Example • Given that a person’s last cola purchase was Coke, there is a 90% chance that his next cola purchase will also be Coke. • If a person’s last cola purchase was Pepsi, there is an 80% chance that his next cola purchase will also be Pepsi. [Transition graph: Coke stays Coke with probability 0.9 and moves to Pepsi with 0.1; Pepsi stays Pepsi with probability 0.8 and moves to Coke with 0.2.] Transition matrix (rows and columns ordered Coke, Pepsi): P = [0.9 0.1; 0.2 0.8]
Powers of transition matrices: Coke vs. Pepsi Example (cont.) Given that a person is currently a Pepsi purchaser, what is the probability that he will purchase Coke two purchases from now? Pr[Pepsi → ? → Coke] = Pr[Pepsi → Coke → Coke] + Pr[Pepsi → Pepsi → Coke] = 0.2 * 0.9 + 0.8 * 0.2 = 0.34. P² is the transition matrix after two time periods; its (Pepsi, Coke) entry is exactly this probability, 0.34.
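The two-step calculation above is the (Pepsi, Coke) entry of the matrix power P². A quick check with NumPy:

```python
import numpy as np

P = np.array([[0.9, 0.1],   # row 0: Coke  -> (Coke, Pepsi)
              [0.2, 0.8]])  # row 1: Pepsi -> (Coke, Pepsi)

P2 = np.linalg.matrix_power(P, 2)  # two-step transition matrix

# Entry (Pepsi, Coke): probability of buying Coke two purchases from now.
print(P2[1, 0])  # 0.34 = 0.2*0.9 + 0.8*0.2
```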
Powers of transition matrices: Coke vs. Pepsi Example (cont.) Given that a person is currently a Coke purchaser, what is the probability that he will purchase Pepsi three purchases from now? This is the (Coke, Pepsi) entry of P³, which works out to 0.219.
Distribution Row Vector A distribution row vector d for an N-state Markov chain is an N-dimensional row vector with one component per state, giving the probability that an object in the system is in each of the respective states. Example (cont.): Suppose 60% of all people now drink Coke, and 40% drink Pepsi. Then the distribution vector is (0.6, 0.4). Let d(k) denote the distribution vector for a Markov chain after k time periods; thus d(0) is the initial distribution. Then d(k) = d(0)Pk
Distribution Row Vector Example (cont.): • Suppose 60% of all people now drink Coke, and 40% drink Pepsi. • What fraction of people will be drinking Coke three weeks from now? • d(0) = (0.6, 0.4) • d(3) = d(0)P3 • Pr[X3 = Coke] = 0.6 * 0.781 + 0.4 * 0.438 = 0.6438 (here 0.781 and 0.438 are the (Coke, Coke) and (Pepsi, Coke) entries of P³)
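The same answer follows from the formula d(k) = d(0)Pᵏ; a short NumPy sketch, assuming the Coke/Pepsi transition matrix from earlier slides:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
d0 = np.array([0.6, 0.4])  # initial distribution: 60% Coke, 40% Pepsi

d3 = d0 @ np.linalg.matrix_power(P, 3)  # d(3) = d(0) P^3

# Fraction drinking Coke three weeks from now.
print(d3[0])  # 0.6438
```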
Regular Markov Chains A Markov chain is regular if some power of the transition matrix contains only positive elements. If the matrix itself contains only positive elements, then that power is one and the matrix is automatically regular. Transition matrices that are regular always have an eigenvalue of unity. They also have a limiting distribution vector x(∞), where the ith component of x(∞) represents the probability of an object being in state i after a large number of time periods have elapsed.
Limiting distribution: Coke vs. Pepsi Example (cont.) [Figure: plot of Pr[Xi = Coke] against week i, converging to the stationary distribution value 2/3.]
Regular Markov Chains Definition: A nonzero vector x is a left eigenvector for a matrix A if there exists a scalar λ such that xA = λx. (Left and right eigenvectors are generally different; they coincide only for special types of matrices, such as symmetric matrices.) The limiting distribution x(∞) is a left eigenvector of the transition matrix corresponding to the eigenvalue of unity, with its components summing to one. Examples on the board.
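One way to find this limiting distribution numerically, sketched with NumPy for the Coke/Pepsi chain (left eigenvectors of P are right eigenvectors of P transposed):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Left eigenvectors of P = right eigenvectors of P.T.
eigvals, eigvecs = np.linalg.eig(P.T)

# Pick the eigenvector for the eigenvalue closest to 1 (unity).
k = np.argmin(np.abs(eigvals - 1.0))
x = np.real(eigvecs[:, k])
x = x / x.sum()  # normalize so the components sum to one

print(x)  # approximately [2/3, 1/3]: 2/3 Coke, 1/3 Pepsi in the long run
```

This matches the limiting value 2/3 for Coke seen in the earlier plot; one can also verify directly that x satisfies xP = x.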