ORDER STATISTICS AND LIMITING DISTRIBUTIONS
ORDER STATISTICS • Let X1, X2,…,Xn be a r.s. of size n from a distribution of continuous type having pdf f(x), a<x<b. Let X(1) be the smallest of Xi, X(2) be the second smallest of Xi,…, and X(n) be the largest of Xi. • X(i) is the i-th order statistic.
ORDER STATISTICS • It is often useful to consider the ordered random sample. • Example: suppose a r.s. of five light bulbs is tested and the failure times are observed as (5, 11, 4, 100, 17). These will actually be observed in the order (4, 5, 11, 17, 100). Interest might be in the kth smallest ordered observation, e.g. stopping the experiment after the kth failure. We might also be interested in joint distributions of two or more order statistics, or in functions of them (e.g. range = max − min).
ORDER STATISTICS • If X1, X2,…,Xn is a r.s. of size n from a population with continuous pdf f(x), then the joint pdf of the order statistics X(1), X(2),…,X(n) is f(y_1, y_2, …, y_n) = n! f(y_1) f(y_2) ⋯ f(y_n) for y_1 < y_2 < … < y_n. Order statistics are not independent. The joint pdf of the ordered sample is not the same as the joint pdf of the unordered sample. Future reference: for discrete distributions, we need to take ties into account (two X's being equal). See Casella and Berger (1990), p. 231.
Example • Suppose that X1, X2, X3 is a r.s. from a population with pdf f(x) = 2x for 0 < x < 1. Find the joint pdf of the order statistics and the marginal pdf of the smallest order statistic.
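The marginal answer can be checked numerically. The sketch below (an illustration, not part of the slides) samples from f(x) = 2x by inverse-cdf sampling (F(x) = x², so X = √U) and compares the empirical cdf of the minimum of three draws with the closed form 1 − (1 − y²)³ implied by the minimum-order-statistic formula:

```python
import math
import random

random.seed(0)

# Inverse-cdf sampling for f(x) = 2x on (0,1): F(x) = x^2, so X = sqrt(U).
def sample_min(n=3):
    return min(math.sqrt(random.random()) for _ in range(n))

N = 200_000
y = 0.5
empirical = sum(sample_min() <= y for _ in range(N)) / N

# Cdf of the smallest order statistic: P(X_(1) <= y) = 1 - (1 - F(y))^3 = 1 - (1 - y^2)^3
theoretical = 1 - (1 - y**2) ** 3
print(empirical, theoretical)  # both close to 0.578125
```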
ORDER STATISTICS • The Maximum Order Statistic: X(n) • F_{X_(n)}(y) = P(X_(n) ≤ y) = P(all Xi ≤ y) = [F(y)]^n, so f_{X_(n)}(y) = n [F(y)]^{n−1} f(y).
ORDER STATISTICS • The Minimum Order Statistic: X(1) • F_{X_(1)}(y) = P(X_(1) ≤ y) = 1 − P(all Xi > y) = 1 − [1 − F(y)]^n, so f_{X_(1)}(y) = n [1 − F(y)]^{n−1} f(y).
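As a quick sanity check (illustrative, not from the slides), a simulation for Unif(0,1) comparing the empirical cdfs of the max and min with [F(y)]^n and 1 − [1 − F(y)]^n:

```python
import random

random.seed(1)

n, N, y = 5, 200_000, 0.7
count_max = count_min = 0
for _ in range(N):
    xs = [random.random() for _ in range(n)]
    count_max += max(xs) <= y
    count_min += min(xs) <= y

emp_max, emp_min = count_max / N, count_min / N
th_max = y ** n             # P(X_(n) <= y) = [F(y)]^n, F(y) = y for Unif(0,1)
th_min = 1 - (1 - y) ** n   # P(X_(1) <= y) = 1 - [1 - F(y)]^n
print(emp_max, th_max, emp_min, th_min)
```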
ORDER STATISTICS • k-th Order Statistic: partition the ordered sample at y_k — k−1 observations fall below y_k, one equals y_k, and n−k fall above. The number of such orderings is n!/{(k−1)! 1! (n−k)!}, and the three pieces contribute P(X < y_k)^{k−1}, f_X(y_k), and P(X > y_k)^{n−k}, giving
f_{X_(k)}(y_k) = [n!/((k−1)!(n−k)!)] [F(y_k)]^{k−1} [1 − F(y_k)]^{n−k} f(y_k).
Example • Same example, but now using the previous formulas (without taking the integrals): suppose that X1, X2, X3 is a r.s. from a population with pdf f(x) = 2x for 0 < x < 1. Find the marginal pdf of the smallest order statistic.
Example • X ~ Uniform(0,1). A r.s. of size n is taken. Find the p.d.f. of the kth order statistic. • Solution: Let Yk be the kth order statistic.
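For Unif(0,1) the kth order statistic Yk follows a Beta(k, n − k + 1) distribution, so E(Yk) = k/(n + 1). A Monte Carlo check of that mean (an illustration, not part of the slides):

```python
import random

random.seed(2)

n, k, N = 10, 3, 100_000
total = 0.0
for _ in range(N):
    xs = sorted(random.random() for _ in range(n))
    total += xs[k - 1]          # k-th smallest (1-indexed)

mean = total / N
print(mean, k / (n + 1))        # E(Y_k) = k/(n+1) under Beta(k, n-k+1)
```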
ORDER STATISTICS • Joint p.d.f. of the k-th and j-th order statistics (for k < j): k−1 observations fall below y_k, 1 equals y_k, j−k−1 fall between y_k and y_j, 1 equals y_j, and n−j fall above y_j. The number of such orderings is n!/{(k−1)! 1! (j−k−1)! 1! (n−j)!}, and the pieces contribute P(X < y_k), P(y_k < X < y_j), P(X > y_j), f_X(y_k), and f_X(y_j), giving
f_{X_(k),X_(j)}(y_k, y_j) = [n!/((k−1)!(j−k−1)!(n−j)!)] [F(y_k)]^{k−1} [F(y_j) − F(y_k)]^{j−k−1} [1 − F(y_j)]^{n−j} f(y_k) f(y_j), for y_k < y_j.
Motivation • The p.d.f. of a r.v. often depends on the sample size n. • If X1, X2,…, Xn is a sequence of rvs and Yn = u(X1, X2,…, Xn) is a function of them, sometimes it is possible to find the exact distribution of Yn (what we have been doing lately). • However, sometimes it is only possible to obtain approximate results when n is large; this leads to limiting distributions.
CONVERGENCE IN DISTRIBUTION • Consider a sequence of rvs X1, X2,…, Xn and let Yn = u(X1, X2,…, Xn) be a function of them, with cdfs Fn(y), n = 1, 2,…. If lim_{n→∞} Fn(y) = F(y) at every point y where F(y) is continuous, then the sequence Yn is said to converge in distribution to Y, the rv with cdf F(y).
CONVERGENCE IN DISTRIBUTION • Theorem: If lim_{n→∞} Fn(y) = F(y) for every point y at which F(y) is continuous, then Yn is said to have a limiting distribution with cdf F(y). The term "asymptotic distribution" is sometimes used instead of "limiting distribution". • The definition of convergence in distribution requires only that the limiting function agrees with the cdf at its points of continuity.
Example Let X1,…, Xn be a random sample from Unif(0,1). Find the limiting distribution of the max order statistic, if it exists.
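One way the derivation might go (a sketch):

```latex
% Max of a Unif(0,1) sample of size n:
F_{X_{(n)}}(y) = [F(y)]^n = y^n, \quad 0 < y < 1.
% Taking the limit,
\lim_{n\to\infty} F_{X_{(n)}}(y) =
\begin{cases}
0, & y < 1,\\
1, & y \ge 1,
\end{cases}
% which is the cdf of a distribution degenerate at 1:
% X_{(n)} converges in distribution to Y with P(Y = 1) = 1.
```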
Example • Let {Xn} be a sequence of rvs with the given pmf. Find the limiting distribution of Xn, if it exists.
Example • Let Yn be the nth order statistic of a random sample X1,…, Xn from Unif(0, θ). Find the limiting distribution of Zn = n(θ − Yn), if it exists.
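One way to carry out the computation (a sketch):

```latex
% Y_n = X_{(n)} from Unif(0, \theta): P(Y_n \le y) = (y/\theta)^n.
P(Z_n \le z) = P\big(Y_n \ge \theta - z/n\big)
             = 1 - \left(\frac{\theta - z/n}{\theta}\right)^n
             = 1 - \left(1 - \frac{z}{n\theta}\right)^n
\;\longrightarrow\; 1 - e^{-z/\theta}, \quad z > 0.
% So Z_n converges in distribution to an Exponential rv with mean \theta.
```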
Example Let X1,…, Xn be a random sample from Exp(θ). Find the limiting distribution of the min order statistic, if it exists.
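A sketch using the minimum-order-statistic cdf (assuming Exp(θ) is parameterized by its mean, F(y) = 1 − e^{−y/θ}; the conclusion is the same under the rate parameterization):

```latex
F_{X_{(1)}}(y) = 1 - [1 - F(y)]^n = 1 - e^{-ny/\theta}
\;\longrightarrow\;
\begin{cases} 0, & y \le 0,\\ 1, & y > 0, \end{cases}
% the cdf of a distribution degenerate at 0:
% the minimum converges in distribution (and in probability) to 0.
```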
Example • Suppose that X1, X2, …, Xn are iid from Exp(1). Find the limiting distribution of the max order statistic, if it exists.
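A sketch of what happens (using the Exp(1) cdf F(y) = 1 − e^{−y}):

```latex
F_{X_{(n)}}(y) = (1 - e^{-y})^n \;\longrightarrow\; 0 \quad \text{for every fixed } y,
% so the pointwise limit is identically 0, which is not a cdf:
% the uncentered maximum has no limiting distribution.
% With centering, however,
P(X_{(n)} - \ln n \le y) = \left(1 - \frac{e^{-y}}{n}\right)^n
\;\longrightarrow\; e^{-e^{-y}},
% the Gumbel (extreme value) distribution.
```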
LIMITING MOMENT GENERATING FUNCTIONS • Let rv Yn have an mgf Mn(t) that exists for all n. If lim_{n→∞} Mn(t) = M(t) for all t in a neighborhood of 0, where M(t) is the mgf of some distribution, then Yn has a limiting distribution which is defined by M(t).
Example • Let X ~ Binomial(n, p) and let λ = np. As n → ∞ with λ fixed, the mgf of X converges to the mgf of Poisson(λ), exp{λ(e^t − 1)}. The limiting distribution of the Binomial rv is the Poisson distribution.
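The mgf computation can be sketched as:

```latex
% X ~ Binomial(n, p) with \lambda = np held fixed:
M_X(t) = (1 - p + p e^t)^n
       = \left(1 + \frac{\lambda (e^t - 1)}{n}\right)^n
\;\longrightarrow\; e^{\lambda (e^t - 1)},
% which is the mgf of the Poisson(\lambda) distribution.
```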
CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE) • A rv Yn converges in probability to a rv Y if lim_{n→∞} P(|Yn − Y| ≥ ε) = 0 for every ε > 0.
CHEBYSHEV'S INEQUALITY • Let X be an rv with E(X) = μ and V(X) = σ². Then for any k > 0, P(|X − μ| ≥ kσ) ≤ 1/k². • Chebyshev's Inequality can be used to prove stochastic convergence in many cases.
CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE) • Chebyshev's Inequality proves convergence in probability if the following three conditions are satisfied: 1. E(Yn) = μn, where μn → μ as n → ∞; 2. V(Yn) = σn² exists for all n; 3. σn² → 0 as n → ∞. Then Yn converges in probability to μ.
Example • Let X be an rv with E(X) = μ and V(X) = σ² < ∞. For a r.s. of size n, X̄n is the sample mean. Does X̄n converge in probability to μ?
WEAK LAW OF LARGE NUMBERS (WLLN) • Let X1, X2,…,Xn be iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n) Σ Xi. Then, for every ε > 0, lim_{n→∞} P(|X̄n − μ| ≥ ε) = 0; that is, X̄n converges in probability to μ. WLLN states that the sample mean is a good estimate of the population mean: for large n, the sample mean and population mean are close to each other with probability close to 1. But more to come on the properties of estimators.
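A small simulation of the WLLN (illustrative, not from the slides) for Unif(0,1), where μ = 0.5: the probability that the sample mean lands within ε of μ rises toward 1 as n grows.

```python
import random

random.seed(3)

# Estimate P(|Xbar_n - mu| < eps) for iid Unif(0,1), mu = 0.5, by simulation.
def prob_within(n, eps=0.05, reps=2000):
    hits = 0
    for _ in range(reps):
        xbar = sum(random.random() for _ in range(n)) / n
        hits += abs(xbar - 0.5) < eps
    return hits / reps

p_small, p_large = prob_within(10), prob_within(1000)
print(p_small, p_large)  # the probability rises toward 1 as n grows
```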
STRONG LAW OF LARGE NUMBERS • Let X1, X2,…,Xn be iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n) Σ Xi. Then P(lim_{n→∞} X̄n = μ) = 1; that is, X̄n converges almost surely to μ.
Relation between convergences • Almost sure convergence ⇒ convergence in probability ⇒ convergence in distribution. Almost sure convergence is the strongest (the reverse implications are generally not true).
THE CENTRAL LIMIT THEOREM • Let X1, X2,…,Xn be a sequence of iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n) Σ Xi and Zn = √n (X̄n − μ)/σ. Then Zn → N(0, 1) in distribution; equivalently, (Σ Xi − nμ)/(σ√n) → N(0, 1). A proof can be found in many books, e.g. Bain and Engelhardt (1992), p. 239.
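A numeric illustration (not from the slides) of the CLT for Unif(0,1): standardize the sample mean and compare an empirical probability with the standard normal value Φ(1) ≈ 0.8413.

```python
import math
import random

random.seed(4)

n, reps = 30, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)   # mean and sd of Unif(0,1)

zs = []
for _ in range(reps):
    xbar = sum(random.random() for _ in range(n)) / n
    zs.append(math.sqrt(n) * (xbar - mu) / sigma)  # Z_n = sqrt(n)(Xbar - mu)/sigma

# Compare with the standard normal: Phi(1) is about 0.8413.
frac = sum(z <= 1.0 for z in zs) / reps
print(frac)
```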
Example • Let X1, X2,…,Xn be iid rvs from Unif(0,1) and let Yn be the given function of them. • Find the approximate distribution of Yn. • Find approximate values of quantities such as: • The 90th percentile of Yn.
SLUTSKY'S THEOREM • If Xn → X in distribution and Yn → a, a constant, in probability, then a) YnXn → aX in distribution; b) Xn + Yn → X + a in distribution.
SOME THEOREMS ON LIMITING DISTRIBUTIONS • If Xn → c > 0 in probability, then for any function g(x) continuous at c, g(Xn) → g(c) in probability (e.g. √Xn → √c). • If Xn → c in probability and Yn → d in probability, then • aXn + bYn → ac + bd in probability • XnYn → cd in probability • 1/Xn → 1/c in probability for all c ≠ 0.
Example • 1. X ~ Gamma(α, 1). Show that (X − α)/√α → N(0, 1) in distribution as α → ∞.
Example • 2. Let {Xn} be a sequence of r.v.s with Xn ~ Gamma(n + 1, β). Show that Xn/n converges in probability to a constant c, and find c.
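One route (a sketch, assuming the parameterization in which Gamma(n + 1, β) has mean (n + 1)β and variance (n + 1)β²):

```latex
E\!\left(\frac{X_n}{n}\right) = \frac{(n+1)\beta}{n} \longrightarrow \beta,
\qquad
V\!\left(\frac{X_n}{n}\right) = \frac{(n+1)\beta^2}{n^2} \longrightarrow 0.
% By the Chebyshev conditions for stochastic convergence,
% X_n / n \to \beta in probability, so c = \beta.
```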
Example • 3. Let Xn be a "displaced exponential" rv with the given p.d.f. Find the limiting distribution of Xn.
Example • 4. Let the sample space be S = {1, 2, 3, 4}. Define the following r.v.s: Xn(1) = Xn(2) = 1, Xn(3) = Xn(4) = 0 for n = 1, 2,…, and X(1) = X(2) = 0, X(3) = X(4) = 1. a) Does Xn converge in probability to X as n → ∞? b) Does Xn converge in distribution to X as n → ∞?
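A sketch of the answers, under the assumption (not stated on the slide) that the four outcomes are equally likely:

```latex
% |X_n(s) - X(s)| = 1 for every outcome s, so for any 0 < \varepsilon < 1,
P(|X_n - X| \ge \varepsilon) = 1 \not\to 0
% => X_n does NOT converge in probability to X.
%
% But X_n and X have the same distribution (each equals 1 with probability 1/2):
F_{X_n}(y) = F_X(y) \text{ for all } y \text{ and all } n
% => X_n DOES converge in distribution to X.
```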