Workshop on Stochastic Differential Equations and Statistical Inference for Markov Processes January 19th – 22nd 2012 Lahore University of Management Sciences
Schedule • Day 1 (Saturday 21st Jan): Review of Probability and Markov Chains • Day 2 (Saturday 28th Jan): Theory of Stochastic Differential Equations • Day 3 (Saturday 4th Feb): Numerical Methods for Stochastic Differential Equations • Day 4 (Saturday 11th Feb): Statistical Inference for Markovian Processes
Today • Review of Probability • Simulation of Random Variables • Review of Discrete Time Markov Chains • Review of Continuous Time Markov Chains
Why Probability Models? • Are laws of nature truly probabilistic? • Coding uncertainty in models • Financial Markets, Biological Processes, Turbulence, Statistical Physics, Quantum Physics
Mathematical Foundations • S is a collection of elements (outcomes of an experiment), the sample space • Each (nice) subset of S is an event • A is a collection of (nice) subsets of S • The set function P: A → [0, 1] is called a probability measure iff P(S) = 1, P(E) ≥ 0 for every event E, and P(∪ᵢ Eᵢ) = Σᵢ P(Eᵢ) for pairwise disjoint events E₁, E₂, …
Independence • Two events A and B are independent iff P(A ∩ B) = P(A)P(B) • This means that the occurrence of one does not affect the probability of occurrence of the other
Conditional Probability • Probability of A given that B has occurred • Denoted by P(A | B) = P(A ∩ B)/P(B), defined when P(B) > 0 • Independence can be reformulated as P(A | B) = P(A)
Random Variables • A random variable X is a real-valued function defined on the sample space such that {X ≤ x} is an event for every real x • A is the state space of the random variable • If A is finite or countably infinite, X is discrete • If A is an interval, X is continuous
Cumulative Distribution Function • The cumulative distribution function of X is the function F(x) = P(X ≤ x) • F is non-decreasing and right continuous, and F(x) → 0 as x → −∞, F(x) → 1 as x → ∞
Probability Mass Function • If X is a discrete random variable, the function f(x) = P(X = x) is called the probability mass function of X • We also have Σₓ f(x) = 1 • The cdf satisfies F(x) = Σ_{y ≤ x} f(y)
Probability Density Function • If X is a continuous random variable, the probability density function is given by f(x) = F′(x) • The cdf satisfies F(x) = ∫_{−∞}^{x} f(t) dt
Discrete Distributions • Uniform: P(X = k) = 1/n for k = 1, …, n • Bernoulli: P(X = 1) = p, P(X = 0) = 1 − p • Binomial: P(X = k) = C(n, k) p^k (1 − p)^(n−k) for k = 0, …, n • Poisson: P(X = k) = e^(−λ) λ^k / k! for k = 0, 1, 2, …
Continuous Random Variables • Uniform on (a, b): f(x) = 1/(b − a) for a < x < b • Exponential: f(x) = λ e^(−λx) for x ≥ 0 • Gaussian: f(x) = (1/√(2πσ²)) e^(−(x−μ)²/(2σ²))
Expectation of a R.V. • The expectation is defined as E[X] = ∫ x f(x) dx for a continuous random variable • For a discrete random variable, E[X] = Σₓ x f(x) • What is it? Intuitively, the long-run average of X over many independent repetitions of the experiment
Expectation of Function of a R.V. • “Law of the unconscious statistician”: E[g(X)] = ∫ g(x) f(x) dx (or Σₓ g(x) f(x) in the discrete case), with no need to find the distribution of g(X) itself
Moments • The nth moment is given by E[X^n] • What do they ‘mean’? The first moment is the mean; the second central moment E[(X − E[X])²] is the variance, a measure of spread about the mean
Multivariate Distributions • Several random variables can be associated with the same sample space • Can define a joint pmf or pdf • In the case of a bivariate random vector (X1, X2), the joint pdf is f(x1, x2)
Marginal pdf • The marginal pdf of X1 is given by f1(x1) = ∫ f(x1, x2) dx2 • The marginal pdf of X2 is given by f2(x2) = ∫ f(x1, x2) dx1
Conditional Expectation • Conditional expectation is given by E[X1 | X2 = x2] = ∫ x1 f(x1 | x2) dx1, where f(x1 | x2) = f(x1, x2)/f2(x2) • Note that E[X1 | X2] is a random variable itself!
Probability Generating Function • The pgf of a random variable X is given by G(s) = E[s^X] = Σₖ P(X = k) s^k • The pmf can be recovered by taking derivatives evaluated at 0: P(X = k) = G⁽ᵏ⁾(0)/k!
Central Limit Theorem • Why are many physical processes well modeled by Gaussians? • Let X1, X2, … be i.i.d. random variables with finite mean μ and variance σ²; then as n → ∞ the limiting distribution of (X1 + … + Xn − nμ)/(σ√n) is the standard normal N(0, 1)
Law of Large Numbers • Let X1, X2, … be i.i.d. random variables with finite mean μ and variance; then (X1 + … + Xn)/n → μ as n → ∞
Numerics • Simulate a 1-D random walk • Calculate the mean • Calculate the variance • Simulate a 2-D random walk • Calculate the mean • Calculate the variance
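A minimal pure-Python sketch of the first exercise (the step count, number of paths, and function name are illustrative choices, not from the slides):

```python
import random

def random_walk_1d(n_steps, rng):
    """One path of a simple symmetric 1-D random walk; return final position."""
    return sum(rng.choice((-1, 1)) for _ in range(n_steps))

rng = random.Random(0)
n_steps, n_paths = 100, 2000
finals = [random_walk_1d(n_steps, rng) for _ in range(n_paths)]

mean = sum(finals) / n_paths
var = sum((x - mean) ** 2 for x in finals) / n_paths
# Theory: E[S_n] = 0 and Var[S_n] = n, so mean should be near 0 and var near 100
```

The 2-D version runs two independent copies of the same walk, one per coordinate.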
Simulating a Binomially Distributed Random Variable • Note: a sum of i.i.d. Bernoulli trials is binomial • Let Xi be a Bernoulli trial with probability p of success • X1 + … + Xn is binomial(n, p)
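The sum-of-Bernoullis construction above can be sketched directly (parameter values are illustrative):

```python
import random

def binomial_from_bernoulli(n, p, rng):
    """Sum of n Bernoulli(p) trials: each uniform draw below p counts as a success."""
    return sum(1 for _ in range(n) if rng.random() < p)

rng = random.Random(1)
samples = [binomial_from_bernoulli(20, 0.3, rng) for _ in range(5000)]
mean = sum(samples) / len(samples)  # theory: E[X] = n*p = 6.0
```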
Continuous Random Variables • Inverse Transform Method • Suppose a random variable has cdf F(x) • If U is uniform on (0, 1), then Y = F⁻¹(U) also has cdf F • Generating the exponential: F(x) = 1 − e^(−λx), so X = −ln(1 − U)/λ (equivalently −ln(U)/λ) • Generate the exponential, compare with the exact cdf • Generate a r.v. with a given cdf
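A sketch of the inverse transform for the exponential case (λ and sample size are illustrative):

```python
import math
import random

def exponential_inverse_transform(lam, rng):
    """Invert F(x) = 1 - exp(-lam*x): X = -ln(1 - U)/lam has cdf F."""
    u = rng.random()                  # U uniform on [0, 1)
    return -math.log(1.0 - u) / lam   # 1 - u lies in (0, 1], so the log is safe

rng = random.Random(1)
lam = 2.0
xs = [exponential_inverse_transform(lam, rng) for _ in range(10000)]
mean = sum(xs) / len(xs)  # theory: E[X] = 1/lam = 0.5
```

Comparing the empirical cdf of `xs` against 1 − e^(−λx) at a few points is a quick correctness check.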
Rejection Method • To simulate X with density f, simulate Y from a proposal density g and U uniform on (0, 1) • Choose c with f(y) ≤ c g(y) for all y • If U ≤ f(Y)/(c g(Y)) accept X = Y, else reject and repeat • To simulate N(0, 1), let Y be exponential with rate 1 as a proposal for |X| • If U ≤ e^(−(Y−1)²/2), set X = ±Y with equal probability
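A sketch of the exponential-proposal scheme for N(0, 1) described above (function name is mine; the acceptance probability e^(−(y−1)²/2) comes from f(y)/(c g(y)) with f the half-normal density, g the Exp(1) density, and c = √(2e/π)):

```python
import math
import random

def standard_normal_rejection(rng):
    """Sample N(0,1): Exp(1) proposal for |Z|, accept-reject, then a random sign."""
    while True:
        y = -math.log(1.0 - rng.random())            # Y ~ Exp(1)
        if rng.random() <= math.exp(-(y - 1.0) ** 2 / 2.0):
            return y if rng.random() < 0.5 else -y   # symmetrize

rng = random.Random(2)
zs = [standard_normal_rejection(rng) for _ in range(4000)]
mean = sum(zs) / len(zs)           # theory: 0
var = sum(z * z for z in zs) / len(zs)  # theory: 1
```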
Section Challenge • Kruskal’s Paper and Simulation of the Kruskal Count • The n-hat problem through various approaches and simulating the n-hat problem
Boring Definitions • A stochastic process is a collection of random variables {X(t) : t ∈ T} • T is the index set, S is the common sample space • For each fixed t, X(t) denotes a single random variable • For each fixed sample point ω, X(·, ω) is a function defined on T (a sample path)
Types of Stochastic Processes • Discrete Time Discrete Space (DTMC) • Discrete Time Continuous Space (Time Series) • Continuous Time Discrete Space (CTMC) • Continuous Time Continuous Space (SDE)
Discrete Time Discrete Space Processes Discrete Time Markov Chains
Discrete Time Markov Chain • The index set is discrete (finite or infinite) • Markov Property: P(X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i)
Transition Probability Matrix • The one-step transition probability is defined as p_ij = P(X_{n+1} = j | X_n = i) • If the transition probability does not depend on n, the process is stationary or homogeneous • The transition matrix is P = (p_ij), with each row summing to 1
N-step Transition Probability • The n-step transition probability is p_ij(n) = P(X_{m+n} = j | X_m = i) • How is this related to the one-step transition probability? • Guess: perhaps as the nth power of P?
Chapman-Kolmogorov Equations • To get from i to j in n steps is equivalent to getting from i to k in s steps and from k to j in n − s steps, summed over all possible intermediate k’s: p_ij(n) = Σ_k p_ik(s) p_kj(n − s) • In matrix form P(n) = P(s) P(n − s), so the n-step transitions are just powers of the one-step transition matrix: P(n) = Pⁿ!!
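The matrix identity can be checked numerically; a small pure-Python sketch (the 2×2 chain is an illustrative example, not one from the slides):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.9, 0.1],
     [0.5, 0.5]]

# Chapman-Kolmogorov with s = 1: p_ij(2) = sum_k p_ik * p_kj, i.e. P(2) = P^2
P2 = mat_mul(P, P)
# e.g. P2[0][0] = 0.9*0.9 + 0.1*0.5 = 0.86, and each row of P2 still sums to 1
```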
Communication Classes • Two states i and j ‘communicate’ (i ↔ j) if p_ij(m) > 0 and p_ji(n) > 0 for some m and n • ↔ is an equivalence relation • Each equivalence class is called a ‘class’ of the DTMC • If there is only one class in a MC, it is irreducible
Class Properties • Periodicity: the period of state i, d(i), is the GCD of all n for which p_ii(n) > 0 • First return time: f_ii(n) is the probability that the chain starting at i first returns to i at step n; let f_ii = Σ_n f_ii(n) • Transience & recurrence • Transience: f_ii < 1 (return is not certain) • Recurrence: f_ii = 1 (return is certain)
Mean Return Time • Let T_ii be the random variable defining the first return time to state i • The mean of T_ii is the mean return time μ_ii • Transient state: μ_ii = ∞ • Recurrent state: μ_ii = Σ_n n f_ii(n); the state is positive recurrent if μ_ii < ∞ and null recurrent if μ_ii = ∞
First Passage Time • The first passage time is defined as T_ij = min{n ≥ 1 : X_n = j}, starting from X_0 = i
Stationary Distribution • For a DTMC, a stationary distribution is a non-negative vector π with π P = π and Σ_i π_i = 1 • i.e. π is the left eigenvector of P corresponding to eigenvalue 1
Existence Theorem for Stationary Distribution • For a positive recurrent, aperiodic and irreducible DTMC there exists a unique stationary distribution π such that p_ij(n) → π_j as n → ∞
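The convergence p_ij(n) → π_j suggests computing π by power iteration; a sketch using an illustrative 2×2 chain (function name and matrix are mine):

```python
def stationary_distribution(P, n_iter=200):
    """Power iteration pi <- pi P, which converges for an ergodic chain."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(n_iter):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_distribution(P)
# pi solves pi = pi P with sum(pi) = 1; here pi = (5/6, 1/6)
```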
Logistic Growth • The transition probabilities follow a birth-death rule: in one step the population increases by 1, decreases by 1, or stays the same, with probabilities depending on the current size • Note the correspondence with the deterministic logistic model
DTMC SIS Epidemic Model • Compartmental Model
The Infected Class • I is a random variable that describes the size of the infected class, I ∈ {0, 1, 2, …, N} • Two classes: {0} and {1, 2, …, N} • {0} is the absorbing class • The average time spent in the infected states is obtained from the fundamental matrix (I − F)⁻¹ • F is the sub-matrix of the transition matrix corresponding to the transient states
DTMC SIR Epidemic Model • The transition probabilities are defined analogously to the SIS model, now with a recovered class R and the constraint S + I + R = N
Section Challenge • Simulate • Logistic growth • SIS model • SIR model • Compare the mean of the MC simulation with the solution of the corresponding deterministic model
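As a starting point for the challenge, a minimal pure-Python sketch of one SIS path, assuming a standard birth-death parameterization (β, γ, the time step Δt, and all parameter values are illustrative choices, not values from the workshop):

```python
import random

def simulate_sis_dtmc(N, beta, gamma, dt, n_steps, i0, rng):
    """One path of a birth-death DTMC SIS chain (assumed parameterization):
    I -> I+1 with prob beta*I*(N-I)/N*dt, I -> I-1 with prob gamma*I*dt,
    else I stays put. dt must be small enough that the two probs sum to < 1."""
    i = i0
    path = [i]
    for _ in range(n_steps):
        up = beta * i * (N - i) / N * dt   # new infection
        down = gamma * i * dt              # recovery (back to susceptible)
        u = rng.random()
        if u < up:
            i += 1
        elif u < up + down:
            i -= 1
        path.append(i)
    return path

rng = random.Random(3)
path = simulate_sis_dtmc(N=100, beta=1.0, gamma=0.5, dt=0.01,
                         n_steps=5000, i0=5, rng=rng)
```

Averaging many such paths and comparing against the deterministic SIS ODE di/dt = βi(N − i)/N − γi is the comparison the challenge asks for; state 0 is absorbing, so some paths go extinct.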
Continuous Time Discrete Space Processes Continuous Time Markov Chains
Definitions • The index set is an interval (continuous time) • States are discrete • Markov Property: for any sequence of times t_0 < t_1 < … < t_n < t, P(X(t) = j | X(t_n) = i, X(t_{n−1}), …, X(t_0)) = P(X(t) = j | X(t_n) = i)
Transition Probability • The transition probability is given by p_ij(s, t) = P(X(t) = j | X(s) = i) • If this depends only on the length t − s of the time interval, the chain is homogeneous