Markov Chains Tom Finke
Overview • Outline of presentation: • The Markov chain model • Description and solution of the simplest chain • Study of steady-state solutions • Study of dependence on initial conditions • The mathematics of Markov chains: probability, matrices, discrete processes
Setup of the problem • Two activities: Miniature Golf and Frisbee Golf • Each night some people change sites and some don't • The percentage that switches stays the same from night to night • Night 1 begins with a 50-50 split • Goal: predict the percentage at each site on subsequent nights
Charts of data • The top pie chart shows the percentage at each site on night 1 • The second row of pie charts shows the movement of the population from one night to the next
Probability Tree Diagram • Columns refer to nights • Multiply the probabilities along a branch to get the probability of that sequence • In any column, add all the probabilities for a given activity to get its total
Some Mathematics
Night 2 • Miniature golf: (.50)(.60) + (.50)(.30) = .45 • Frisbee golf: (.50)(.40) + (.50)(.70) = .55
Night 3 • Miniature golf: (.45)(.60) + (.55)(.30) = .435 • Frisbee golf: (.45)(.40) + (.55)(.70) = .565
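A minimal sketch of this night-by-night arithmetic in Python (the variable names are chosen for illustration; the probabilities are the ones from the slides):

```python
# From the slides: 60% of miniature golfers stay (40% switch to frisbee golf),
# and 70% of frisbee golfers stay (30% switch to miniature golf).
p_mini_stay, p_fris_to_mini = 0.60, 0.30

mini, fris = 0.50, 0.50  # night 1: a 50-50 split
for night in range(2, 5):
    mini, fris = (mini * p_mini_stay + fris * p_fris_to_mini,
                  mini * (1 - p_mini_stay) + fris * (1 - p_fris_to_mini))
    print(f"Night {night}: miniature {mini:.4f}, frisbee {fris:.4f}")
# Night 2: miniature 0.4500, frisbee 0.5500
# Night 3: miniature 0.4350, frisbee 0.5650
```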
Markov Chain Visualized • Circles represent activities • Arrows represent movement between activities • The numbers on the arrows are probabilities
Matrix Mathematics • T is the transition matrix; from the data above, T = [[.60, .30], [.40, .70]], where each column sums to 1 • S(1) is the initial (1st) state vector, S(1) = [.50, .50] • Matrix multiplication gives the next state: S(2) = T·S(1)
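The same night-2 computation as a matrix-vector product, sketched with NumPy (an assumed dependency, not something the slides name):

```python
import numpy as np

# Column 1 is where miniature golfers go, column 2 is where
# frisbee golfers go; values are taken from the slides.
T = np.array([[0.60, 0.30],
              [0.40, 0.70]])
s1 = np.array([0.50, 0.50])  # initial state: night 1

s2 = T @ s1                  # S(2) = T * S(1)
print(s2)                    # [0.45 0.55]
```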
The Change in State • This shows the time evolution of the distribution between the two activities, one night at a time • It suggests that there is a steady state which is eventually reached
Steady State • S(n+1) = T·S(n); eventually multiplying by T no longer changes the state • The steady state is T^m·S(1) for large m • Equivalently, it is the normalized eigenvector of T with associated eigenvalue 1
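A sketch of both routes to the steady state, again assuming NumPy and the transition matrix above:

```python
import numpy as np

T = np.array([[0.60, 0.30],
              [0.40, 0.70]])

# Route 1: apply T many times to the initial state, i.e. compute T^m * S(1).
s = np.array([0.50, 0.50])
for _ in range(50):           # 50 nights is plenty for convergence here
    s = T @ s
print(s)                      # approx [0.4286 0.5714], i.e. [3/7 4/7]

# Route 2: the eigenvector of T with eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(T)
v = vecs[:, np.argmin(np.abs(vals - 1))].real
print(v / v.sum())            # approx [0.4286 0.5714]
```

Both routes agree: the population settles at 3/7 playing miniature golf and 4/7 playing frisbee golf.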
Initial Conditions • Start the process with a different initial state S(1) • The same steady state is reached • Next step: change the transition matrix
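A short demonstration, under the same assumed T, that two very different initial states converge to the same steady state:

```python
import numpy as np

T = np.array([[0.60, 0.30],
              [0.40, 0.70]])

for s in (np.array([0.50, 0.50]),   # the original 50-50 split
          np.array([0.90, 0.10])):  # a very different starting split
    for _ in range(50):             # iterate many nights
        s = T @ s
    print(s)                        # both print approx [0.4286 0.5714]
```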
Conclusions • A Markov chain is a mathematical process that is stochastic and discrete, with the property that the probability of a particular outcome depends only on the previous outcome • It can be visualized as a chain, or directed graph • It can be analyzed with matrix theory