Markov Chains Kevin Comer CSS 650 January 31, 2013
Definitions • Markov Chain: A random process with the Markov Property • Markov Property: Given the present state, the future and past states are independent. • “Memorylessness” • Possible values of Xi form the state space
Simple Example Source: http://en.wikipedia.org/wiki/File:MarkovChain1.png
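The memorylessness in the definition can be made concrete with a small simulation: the next state is drawn from a distribution that depends only on the current state. A minimal sketch of a two-state chain (the state names and probabilities below are assumed for illustration, not taken from the figure):

```python
import random

# Assumed two-state chain: from each state, the next state depends only
# on the current one (the Markov property), never on earlier history.
P = {
    "E": {"E": 0.3, "A": 0.7},
    "A": {"E": 0.4, "A": 0.6},
}

def step(state, transitions, rng=random):
    """Sample the next state given only the current state."""
    r = rng.random()
    cum = 0.0
    for nxt, p in transitions[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

random.seed(0)
path = ["E"]
for _ in range(10):
    path.append(step(path[-1], P))
print(path)   # a sample trajectory of 11 states
```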
Reducibility • State j is accessible from state i if there is a non-zero probability of transitioning from i to j in some number of steps • States i and j communicate if j is accessible from i and i is accessible from j • State i is essential if every state accessible from i can also access i • A Markov chain is irreducible if every state communicates with every other state
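Irreducibility can be checked mechanically: j is accessible from i within n − 1 steps exactly when the (i, j) entry of (I + P)^(n−1) is positive. A minimal sketch (the two example matrices are assumed for illustration):

```python
import numpy as np

# A chain is irreducible iff every state can reach every other state;
# reachability can be read off from powers of (I + P).
def is_irreducible(P):
    n = len(P)
    # (I + P)^(n-1) has a strictly positive (i, j) entry iff state j is
    # accessible from state i in at most n-1 steps.
    reach = np.linalg.matrix_power(np.eye(n) + np.asarray(P), n - 1)
    return bool((reach > 0).all())

P_irreducible = [[0.0, 1.0], [1.0, 0.0]]   # the two states communicate
P_reducible = [[1.0, 0.0], [0.5, 0.5]]     # state 0 is absorbing
print(is_irreducible(P_irreducible))  # True
print(is_irreducible(P_reducible))    # False
```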
Periodicity • A state i has period k if any return to i must occur in multiples of k steps; k is the greatest common divisor of the possible return times • Example: if, starting in state i, one can return to i in {6, 8, 10, 12, …} steps, state i has period k = 2. • If k = 1, the state is aperiodic, meaning returns to state i can occur at irregular times. • A Markov chain is aperiodic if and only if every state in it is aperiodic.
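The gcd definition translates directly into code. A sketch that estimates the period of a state over a finite horizon (the example matrices are assumed):

```python
from math import gcd

import numpy as np

# The period of state i is gcd{n >= 1 : (P^n)_ii > 0}, estimated here
# over a finite number of steps.
def period(P, i, max_steps=50):
    P = np.asarray(P)
    k = 0
    Pn = np.eye(len(P))
    for n in range(1, max_steps + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:   # a return to i is possible in n steps
            k = gcd(k, n)
    return k

# Deterministic 2-cycle: returns to state 0 occur at steps 2, 4, 6, ...
P_cycle = [[0.0, 1.0], [1.0, 0.0]]
print(period(P_cycle, 0))   # 2

# A self-loop makes the state aperiodic.
P_loop = [[0.5, 0.5], [1.0, 0.0]]
print(period(P_loop, 0))    # 1
```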
Recurrence • A state is transient if there is a non-zero probability that we will never return to that state. • If a state is not transient, it is recurrent. • Recurrent states have a finite hitting time with probability 1. • The mean recurrence time Mi is the expected return time to state i • If Mi is finite, the state is positive recurrent. • Otherwise, the state is null recurrent.
Recurrence • Expected number of visits: a state i is recurrent if and only if the expected number of return visits over an infinite time is infinite, i.e. the sum over n of (P^n)ii diverges • A state is absorbing if it is impossible to leave the state: pii = 1 and pij = 0 for i ≠ j
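The expected-visits criterion can be illustrated with partial sums of (P^n)ii. A sketch using an assumed two-state example with one absorbing state:

```python
import numpy as np

# The expected number of returns to state i is the sum over n of (P^n)_ii;
# a diverging partial sum indicates recurrence, a converging one transience.
def expected_returns(P, i, horizon):
    P = np.asarray(P)
    Pn, total = np.eye(len(P)), 0.0
    for _ in range(horizon):
        Pn = Pn @ P
        total += Pn[i, i]
    return total

# State 1 is absorbing (hence recurrent); state 0 leaks into it (transient).
P = [[0.5, 0.5],
     [0.0, 1.0]]
print(expected_returns(P, 0, 200))  # partial sums converge (transient)
print(expected_returns(P, 1, 200))  # grows with the horizon (recurrent)
```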
Ergodicity • A state is ergodic if it is aperiodic and positive recurrent. • If all states in an irreducible Markov chain are ergodic, then the chain is ergodic.
Variations of Markov Chains • Time-homogeneous Markov chain • Probability of transition is independent of time n • Markov chain of order m • Future state depends on the past m states (given that n > m) • Can be reformulated as a first-order chain on a state space of ordered m-tuples of X values
Steady State Analysis • A Markov process can be described by a time-independent transition matrix P with entries pij • The stationary distribution is represented by a vector π satisfying πP = π with the πi summing to 1 • Assuming the state space is finite, this gives πj = Σi πi pij for every state j
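A minimal sketch of solving πP = π with Σ πi = 1: replace one balance equation with the normalization constraint and solve the resulting linear system (the example matrix is assumed):

```python
import numpy as np

# Solve pi P = pi subject to sum(pi) = 1 by swapping one redundant
# balance equation for the normalization constraint.
def stationary(P):
    P = np.asarray(P, dtype=float)
    n = len(P)
    A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)
print(pi)        # the stationary distribution
print(pi @ P)    # equals pi: the distribution is invariant under P
```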
Model of Class Mobility • System of equations πj = Σi πi pij (since the πj sum to 1, one balance equation can be replaced by the normalization constraint) • Solving this system gives the long-run probability of membership in each class
Branching Process • Consider a population where each individual produces j ≥ 0 new offspring with probability Pj • The population at any given time n is Xn • State Xn = 0 is a recurrent state, all others are transient • Let π0 denote the probability that the population will eventually die out.
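A standard fact about branching processes is that π0 is the smallest root of π0 = Σj Pj π0^j, the fixed-point equation of the offspring probability generating function. A sketch with an assumed offspring distribution:

```python
# pi0 is the smallest root of pi0 = sum_j P_j * pi0**j, found here by
# fixed-point iteration starting from 0 (which converges to the smallest
# root). The offspring distribution below is assumed for illustration.
def extinction_probability(offspring_pmf, iterations=1000):
    pi0 = 0.0
    for _ in range(iterations):
        pi0 = sum(p * pi0**j for j, p in enumerate(offspring_pmf))
    return pi0

# P(0 offspring) = 0.25, P(1) = 0.25, P(2) = 0.5: mean offspring is
# 1.25 > 1, so the population survives with positive probability.
pmf = [0.25, 0.25, 0.5]
print(round(extinction_probability(pmf), 6))  # 0.5
```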
Time-Reversible Markov Chains • Assume a stationary ergodic Markov chain that we want to run backwards in time • The reversed chain has transition probabilities Qij = πj Pji / πi • If Qij = Pij for all i, j, the chain is time reversible • Equivalently, the detailed balance condition holds: πi Pij = πj Pji for all i, j
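Detailed balance is easy to verify numerically: the matrix of probability flows πi Pij must be symmetric. A sketch, with the stationary distribution assumed known for the example:

```python
import numpy as np

# A stationary chain is time reversible iff detailed balance holds:
# pi_i * P_ij == pi_j * P_ji for all i, j.
def is_time_reversible(P, pi, tol=1e-9):
    P, pi = np.asarray(P), np.asarray(pi)
    flows = pi[:, None] * P   # probability flow from i to j
    return bool(np.allclose(flows, flows.T, atol=tol))

# Every two-state chain is reversible; this one has pi = (5/6, 1/6).
P = [[0.9, 0.1], [0.5, 0.5]]
print(is_time_reversible(P, [5/6, 1/6]))  # True
```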
Continuous-Time Markov Process • Length of time in each state is exponentially distributed (i.e. “memoryless”), with rate parameter νi = −qii • Transitions are governed by the transition rate matrix Q • Given n systems in state i, they transition to state j at a rate of n·qij (provided n is large enough) • Each row of Q sums to 0, so qii = −Σj≠i qij
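The description above is exactly the standard simulation recipe: hold an exponential time with rate −qii, then jump with probabilities proportional to the off-diagonal rates. A sketch for an assumed two-state rate matrix:

```python
import random

# Assumed rate matrix for a two-state continuous-time chain: each row
# sums to 0, and -Q[i][i] is the total exit rate from state i.
Q = [[-2.0, 2.0],
     [1.0, -1.0]]

def simulate_ctmc(Q, state, t_end, rng):
    t, trajectory = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state][state]        # total exit rate from this state
        t += rng.expovariate(rate)     # exponential holding time
        if t >= t_end:
            return trajectory
        # Jump probabilities are proportional to the off-diagonal rates.
        weights = [q if j != state else 0.0 for j, q in enumerate(Q[state])]
        state = rng.choices(range(len(Q)), weights=weights)[0]
        trajectory.append((t, state))

rng = random.Random(0)
traj = simulate_ctmc(Q, 0, 5.0, rng)
print(traj[:3])   # the first few (jump time, state) pairs
```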
Social Science Applications • Economics • Macroeconomic business cycle • Economic development of countries • Mathematical Biology • Birth and Death Rates • Simple Example: Conway’s Game of Life • Epidemiology (SIR model) • Queuing Theory • Music