Concepts in Probabilistic Convergence and Theorems on the Limit of the iid Mean
Tutorial 10, STAT1301 Fall 2010, 30NOV2010, MB103@HKU
By Joseph Dong
Sure Convergence
• Convergence of a point sequence
  • The limit is a point.
• Local convergence of a random variable sequence at a particular outcome $\omega$ in the state space requires the value sequence $X_1(\omega), X_2(\omega), \ldots$ of these random variables evaluated at $\omega$ to converge.
  • Local convergence is the convergence of a point sequence.
  • Local convergence is yes or no: there is no halfway point between sure local convergence and sure local divergence.
• Sure convergence of a random variable sequence
  • The same as everywhere convergence of a function sequence. We use the word "sure" because we are addressing events rather than sets: the domain of a random variable is the state space, and the whole state space maps to the "sure event."
  • The limit is a random variable. Formal definition of sure convergence of an r.v. sequence: $X_n \to X$ surely iff $\lim_{n\to\infty} X_n(\omega) = X(\omega)$ for every $\omega$ in the state space $\Omega$.
  • Sometimes the limit can also be a single value. In this case the limit random variable is a constant, non-random variable, sending every outcome of the state space to that single point of the sample space.
• Sure convergence of a random variable sequence is a slack way of saying that the sequence locally converges everywhere on the state space; it is this "everywhere" requirement that turns local convergence into convergence of a random variable sequence (see the sketch below).
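A minimal numerical sketch of local (pointwise) convergence, using an example of my own choosing rather than one from the slides: take the state space $\Omega = [0, 1]$ and $X_n(\omega) = (1 + 1/n)\,\omega$, so that at every outcome the value sequence converges and $X_n \to X$ surely, where $X(\omega) = \omega$.

```python
# Hypothetical example (not from the slides): X_n(omega) = (1 + 1/n) * omega on
# the state space [0, 1]. At EVERY outcome omega the value sequence converges to
# omega, so X_n -> X surely, with limit random variable X(omega) = omega.
def X_n(n, omega):
    return (1.0 + 1.0 / n) * omega

for omega in [0.0, 0.25, 0.5, 0.75, 1.0]:      # a few outcomes from the state space
    values = [round(X_n(n, omega), 4) for n in (1, 10, 100, 10000)]
    print(f"omega={omega:.2f}: X_n(omega) = {values}")   # approaches omega
```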
Almost Sure Convergence
• Addresses convergence of a random variable sequence.
• It is sure convergence except for a minuscule set of places on the state space.
• It is sure convergence except for a null event.
  • A null event is a possible event (a non-empty set) with probability zero.
  • E.g.
    • For a continuous distribution on the real line, the event {1, 3, 5, ...} is a null event because a countable set of points carries probability zero.
    • {Randomly draw a point from [0, 1] and the outcome is the point 0.5} is a null event because the probability measure implied here is length, and a single point has length zero: $P(\{0.5\}) = 0$.
    • {Tossing a fair coin infinitely many times and the outcome is a sequence of heads only} is a null event because the outcome is the singleton {0000000......} and it has probability $\lim_{n\to\infty} 2^{-n} = 0$.
• Almost sure convergence is convergence with probability one, but still not necessarily on the entire state space.
• Almost sure convergence allows divergence only with probability zero, but the sequence may still diverge on a possible (non-empty) event.
• Definition
  • For almost all, but maybe not all, $\omega$'s in $\Omega$: $\lim_{n\to\infty} X_n(\omega) = X(\omega)$.
  • Using probability notation to be precise about "almost all": $P\big(\{\omega \in \Omega : \lim_{n\to\infty} X_n(\omega) = X(\omega)\}\big) = 1$ (see the numerical sketch below).
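A hedged illustration with an example of my own (not from the slides): on the state space $[0, 1]$ with the uniform probability measure, let $X_n(\omega) = \omega^n$. The sequence converges to 0 at every $\omega \in [0, 1)$ but stays at 1 when $\omega = 1$, so convergence fails only on the null event $\{1\}$: almost sure convergence, but not sure convergence.

```python
# Illustrative example (mine, not the author's): X_n(omega) = omega**n on [0, 1]
# with the uniform measure. X_n(omega) -> 0 for every omega in [0, 1), while
# X_n(1) = 1 for all n. The exceptional set {1} is a possible event with
# probability zero, so X_n -> 0 almost surely, yet not surely.
n_values = [1, 10, 100, 1000]
for omega in [0.2, 0.9, 0.999, 1.0]:
    print(f"omega={omega}:", [round(omega ** n, 6) for n in n_values])
# Every row except omega=1.0 tends to 0; the single point {1} has probability 0
# under the uniform measure, so divergence happens only on a null event.
```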
Convergence in Probability
• The random variable sequence converges on an event with arbitrarily high probability.
  • "1" is the highest possible probability.
  • "1" is an "arbitrarily high probability," but not the converse.
  • Layman's language does not distinguish between "highest" and "arbitrarily high," so we resort to mathematical language to describe the subtle difference.
  • To account for the qualifier "arbitrary," we need someone to arbitrarily specify a bound above which every value counts as "arbitrarily high." Say she specifies a very small $\varepsilon > 0$, so that $1 - \varepsilon$ is a very high bound satisfying her requirement. Convergence in probability means: for any fixed tolerance $\delta > 0$, there will be some $n$ large enough to let $P(|X_n - X| \le \delta)$ fall within the arbitrarily small $\varepsilon$-neighborhood of 1. Equivalently, $\lim_{n\to\infty} P(|X_n - X| > \delta) = 0$ for every $\delta > 0$ (see the simulation sketch below).
• "Falling at the single point 1" is a much more stringent requirement than "falling within an arbitrarily small neighborhood of 1."
  • You'll need a much larger $n$ to fulfill the former than the latter.
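A minimal simulation sketch, using an assumed example rather than one from the slides: $X_n \sim \text{Bernoulli}(1/n)$ converges to 0 in probability, because $P(|X_n - 0| > \delta) = 1/n \to 0$ for any $\delta \in (0, 1)$. The tail probability is estimated by Monte Carlo for growing $n$.

```python
# Assumed example: X_n ~ Bernoulli(1/n). P(|X_n - 0| > delta) = 1/n -> 0 for any
# delta in (0, 1), so X_n -> 0 in probability. Estimate the tail by simulation.
import numpy as np

rng = np.random.default_rng(0)
delta = 0.5
for n in (10, 100, 1000, 10000):
    x_n = (rng.random(100_000) < 1.0 / n).astype(float)   # Bernoulli(1/n) draws
    tail = np.mean(np.abs(x_n - 0.0) > delta)              # estimate of P(|X_n| > delta)
    print(f"n={n:>6}: P(|X_n| > {delta}) ~ {tail:.4f}  (theory: {1.0 / n:.4f})")
```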
Convergence in Distribution
• A very weak form of probabilistic convergence.
• For each outcome $\omega$ in the state space, $X_n(\omega)$ need not even be close to $X(\omega)$.
  • The convergence need not take place in the state space.
  • The convergence does take place in the sample space, on which the CDF is defined.
• E.g. Tossing a fair coin gives the state space {Head, Tail}. If we want to define a random variable from this state space to {0, 1}, we have at least two choices: $X$ with $X(\text{Head}) = 1$, $X(\text{Tail}) = 0$, and $Y = 1 - X$. These are completely different random variables, but they share exactly the same sample space and the same probability measure on it: both of them measure {0} with probability 0.5 and {1} with probability 0.5 in the sample space. Thus they are equal in distribution. If we define a sequence of r.v.'s all equal to $X$, then this sequence converges to $Y$ in distribution, even though $X_n(\omega)$ is never close to $Y(\omega)$ at any outcome (see the sketch below).
• Definition: $X_n \xrightarrow{d} X$ iff $\lim_{n\to\infty} F_{X_n}(x) = F_X(x)$ at every point $x$ where $F_X$ is continuous.
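A sketch of the coin example in code; the variable names and the simulation setup are mine, not the author's. $X$ and $Y = 1 - X$ have identical distributions, so the constant sequence $X_n = X$ converges to $Y$ in distribution, even though $|X_n(\omega) - Y(\omega)| = 1$ at every outcome.

```python
# Illustration of the fair-coin example: X(Head)=1, X(Tail)=0 and Y = 1 - X are
# different random variables with identical distributions. The constant sequence
# X_n = X converges to Y in distribution, yet is never pointwise close to Y.
import numpy as np

rng = np.random.default_rng(1)
heads = rng.random(100_000) < 0.5   # simulate the coin toss (state space outcome)
X = heads.astype(int)               # X: Head -> 1, Tail -> 0
Y = 1 - X                           # Y: Head -> 0, Tail -> 1

print("P(X=1) ~", X.mean(), " P(Y=1) ~", Y.mean())       # same distribution
print("P(|X - Y| = 1) =", np.mean(np.abs(X - Y) == 1))   # but never close pointwise
```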
Logical Strong-Weak Relationship
• Sure convergence $\Rightarrow$ almost sure convergence $\Rightarrow$ convergence in probability $\Rightarrow$ convergence in distribution. Each implication goes from a stronger mode to a weaker one, and none of the reverse implications holds in general.
• Convergence to a constant
  • A special situation arises when the limit random variable is a constant (not random at all). In this case, convergence in probability to a constant is equivalent to convergence in distribution to that same constant. That is, $X_n \xrightarrow{P} c \iff X_n \xrightarrow{d} c$ (a small numerical illustration follows).
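A small numerical illustration, with an example of my own choosing: $X_n \sim N(c, 1/n)$ converges in distribution to the constant $c$, and, consistent with the stated equivalence, the same sequence also converges to $c$ in probability.

```python
# Assumed example: X_n ~ Normal(c, 1/n) converges in distribution to the constant
# c; checking P(|X_n - c| > delta) -> 0 shows the matching convergence in
# probability, consistent with the equivalence for constant limits.
import numpy as np

rng = np.random.default_rng(2)
c, delta = 3.0, 0.1
for n in (10, 100, 1000, 10000):
    x_n = rng.normal(loc=c, scale=1.0 / np.sqrt(n), size=100_000)
    print(f"n={n:>6}: P(|X_n - c| > {delta}) ~ {np.mean(np.abs(x_n - c) > delta):.4f}")
```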
IID Mean
• IID = Independent and Identically Distributed.
• An iid sequence of r.v.'s $X_1, X_2, \ldots, X_n$ means
  • the $X_i$'s are mutually independent r.v.'s, and
  • all $X_i$'s follow one and the same distribution, say $F$.
• The IID mean is the arithmetic mean of the iid sequence: $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$.
  • The IID mean is a transformation of the random variables $X_1, \ldots, X_n$.
  • The IID mean is itself a random variable (see the sketch below).
  • The IID mean is closely related to its numerator, the IID sum $S_n = \sum_{i=1}^{n} X_i$.
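A small sketch to emphasize that the IID mean is a random variable; the exponential distribution and sample size here are my own assumptions, purely for illustration. Each fresh draw of the sample $X_1, \ldots, X_n$ gives a new realized value of $\bar{X}_n$.

```python
# Assumed setup: X_1, ..., X_n iid Exponential with mean 2, n = 50. The iid mean
# is itself a random variable: every replicate of the sample yields a new value.
import numpy as np

rng = np.random.default_rng(3)
n = 50
for replicate in range(3):
    sample = rng.exponential(scale=2.0, size=n)   # one realization of X_1, ..., X_n
    print(f"replicate {replicate}: iid mean = {sample.mean():.3f}")
# Each printed number is one realization of the random variable (X_1 + ... + X_n) / n.
```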
WLLN, SLLN, and CLT
Let $X_1, X_2, \ldots$ be iid with mean $\mu$ and (for the CLT) finite variance $\sigma^2$, and let $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ be the iid mean.
• CLT (Central Limit Theorem): $\dfrac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma} \xrightarrow{d} N(0, 1)$.
• WLLN (Weak Law of Large Numbers): $\bar{X}_n \xrightarrow{P} \mu$.
• SLLN (Strong Law of Large Numbers): $\bar{X}_n \xrightarrow{a.s.} \mu$.
(A simulation sketch of all three follows.)
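A minimal simulation sketch of the three theorems. The Uniform(0, 1) choice (so $\mu = 1/2$, $\sigma^2 = 1/12$) and the sample sizes are my own assumptions for illustration.

```python
# Assumed example: X_i iid Uniform(0, 1), so mu = 1/2 and sigma^2 = 1/12.
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)

# LLN flavour: the iid mean settles near mu as n grows (each printed value is one
# sample path, echoing the almost-sure / in-probability convergence of the mean).
for n in (10, 1000, 100_000):
    print(f"n={n:>6}: iid mean ~ {rng.random(n).mean():.4f}  (mu = {mu})")

# CLT flavour: sqrt(n) * (mean - mu) / sigma is approximately standard normal.
n, reps = 1000, 10_000
z = (rng.random((reps, n)).mean(axis=1) - mu) * np.sqrt(n) / sigma
print("P(Z <= 1.96) ~", np.mean(z <= 1.96), " (Phi(1.96) ~ 0.975)")
```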
Consequence of CLT
• One consequence of the CLT is that we can now at least approximate the distribution of an IID sum by a normal distribution.
  • Binomial is an IID sum of Bernoullis.
  • Poisson is an IID sum of Poissons.
  • Negative Binomial is an IID sum of Geometrics.
  • Negative Binomial is also an IID sum of Negative Binomials.
  • Gamma is an IID sum of Exponentials.
  • Gamma is also an IID sum of Gammas.
  • Chisq is an IID sum of Chisqs.
• All of the above (and many others) can be approximated by appropriately parameterized Normal distributions (see the numerical check below).
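A hedged numerical check of the first item on the list, assuming SciPy is available; the parameters $n = 100$, $p = 0.3$, $k = 35$ are my own illustrative choices. Binomial$(n, p)$ is an IID sum of Bernoulli$(p)$, so by the CLT it is approximately Normal$(np, np(1-p))$ for large $n$.

```python
# Illustrative check (assumed parameters): compare the exact Binomial(100, 0.3)
# CDF with its CLT-based normal approximation Normal(np, np(1-p)).
import numpy as np
from scipy.stats import binom, norm

n, p, k = 100, 0.3, 35
exact = binom.cdf(k, n, p)
# Continuity correction: evaluate the normal CDF at k + 0.5.
approx = norm.cdf(k + 0.5, loc=n * p, scale=np.sqrt(n * p * (1 - p)))
print(f"P(X <= {k}): exact = {exact:.4f}, normal approximation = {approx:.4f}")
```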