SUMS OF RANDOM VARIABLES Changfei Chen
Sums of Random Variables • Let X_1, X_2, …, X_n be a sequence of random variables, and let S_n be their sum: S_n = X_1 + X_2 + … + X_n
Mean and Variance of Sums of Random Variables • The expected value of a sum of n random variables is equal to the sum of the expected values: E[S_n] = E[X_1] + E[X_2] + … + E[X_n] • Note: this holds regardless of the statistical dependence of the r.v.'s
Mean and Variance of Sums of Random Variables • Variance of a sum of r.v.'s: VAR[S_n] = Σ_{k=1..n} VAR[X_k] + Σ_{j≠k} COV[X_j, X_k]
Mean and Variance of Sums of Random Variables • Since the covariances are not necessarily zero in general, the variance of a sum is not necessarily equal to the sum of the variances of the individual r.v.'s. • In the case where all the r.v.'s are independent, the covariances are zero. Then VAR[S_n] = VAR[X_1] + VAR[X_2] + … + VAR[X_n]
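The two results above can be checked numerically. Below is a minimal sketch in Python (NumPy assumed), using one dependent pair and one independent pair of Gaussian r.v.'s chosen purely for illustration: the means always add, while the variances add only in the independent case.

```python
# Minimal sketch (assumed setup): verify that E[S] = sum of means even for
# dependent variables, while VAR[S] = sum of variances only when the
# variables are independent.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200_000

# Dependent pair: Y contains X, so COV[X, Y] != 0.
x = rng.normal(loc=1.0, scale=2.0, size=n_trials)
y = x + rng.normal(loc=3.0, scale=1.0, size=n_trials)
s = x + y
print("E[S]   ~", s.mean(), "  vs  E[X] + E[Y]     =", x.mean() + y.mean())   # always equal
print("VAR[S] ~", s.var(),  "  vs  VAR[X] + VAR[Y] =", x.var() + y.var())     # differs (dependence)

# Independent pair: the covariance term vanishes, so the variances do add.
u = rng.normal(loc=1.0, scale=2.0, size=n_trials)
v = rng.normal(loc=3.0, scale=1.0, size=n_trials)
print("independent: VAR[U+V] ~", (u + v).var(), "  vs ", u.var() + v.var())
```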
pdf of Sums of Independent R.V. • Let X_1, X_2, …, X_n be n independent r.v.'s. • First look at the sum of two independent r.v.'s: Z = X + Y • The characteristic function of Z: Φ_Z(ω) = E[e^{jωZ}] = E[e^{jωX} e^{jωY}] = E[e^{jωX}] E[e^{jωY}] = Φ_X(ω) Φ_Y(ω)   (1)
pdf of Sums of Independent R.V. • The cdf of Z: F_Z(z) = P[X + Y ≤ z] = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} f_X(x) f_Y(y) dy dx • Then the pdf of Z: f_Z(z) = (d/dz) F_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx, i.e. the convolution of f_X and f_Y • p.s. Go through the 'Leibniz rule' from calculus to justify differentiating under the integral sign
pdf of Sums of Independent R.V. • Φ_Z(ω) can be viewed as the Fourier transform of the pdf of Z, so by equation (1): f_Z(z) = (1/2π) ∫_{−∞}^{∞} Φ_X(ω) Φ_Y(ω) e^{−jωz} dω • The Fourier transform of a convolution of two functions is equal to the product of the individual Fourier transforms.
pdf of Sums of Independent R.V. • Now consider the sum of n independent r.v.'s. • Let S_n = X_1 + X_2 + … + X_n; then Φ_{S_n}(ω) = Φ_{X_1}(ω) Φ_{X_2}(ω) … Φ_{X_n}(ω) • Thus the pdf of the sum of independent r.v.'s can be found by taking the inverse Fourier transform of the product of the individual characteristic functions: f_{S_n}(x) = (1/2π) ∫_{−∞}^{∞} Φ_{X_1}(ω) … Φ_{X_n}(ω) e^{−jωx} dω
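As an illustration of the convolution result, here is a small Python sketch (NumPy assumed) that computes the pdf of Z = X + Y by numerically convolving the two pdfs and compares it against a Monte Carlo histogram; the choice of X ~ Uniform(0, 1) and Y ~ Exponential(1) is only an example.

```python
# Minimal sketch (assumed example): f_Z(z) = integral of f_X(x) f_Y(z - x) dx,
# approximated by a discrete convolution on a grid, checked by simulation.
import numpy as np

dz = 0.01
t = np.arange(0.0, 10.0, dz)
f_x = np.where(t <= 1.0, 1.0, 0.0)          # pdf of Uniform(0, 1)
f_y = np.exp(-t)                            # pdf of Exponential(1)

# Discrete convolution times the grid step approximates the convolution integral.
f_z = np.convolve(f_x, f_y)[: t.size] * dz

rng = np.random.default_rng(1)
samples = rng.uniform(0, 1, 100_000) + rng.exponential(1.0, 100_000)
hist, edges = np.histogram(samples, bins=50, range=(0, 6), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Compare the convolution result with the empirical density at a few points.
for c, h in list(zip(centers, hist))[::10]:
    print(f"z = {c:4.2f}   convolution = {f_z[int(round(c / dz))]:.3f}   simulation = {h:.3f}")
```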
The Sample Mean • Let X be a random variable for which the mean, E[X] = μ, is unknown. • Let X_1, X_2, …, X_n denote n independent, repeated measurements of X, i.e. the X_i are independent, identically distributed r.v.'s (each has the same probability distribution as the others and all are mutually independent) with the same pdf as X. • Then the sample mean, M_n, of the sequence is used to estimate E[X]: M_n = (1/n) Σ_{i=1..n} X_i
The Sample Mean • The expected value of the sample mean: E[M_n] = (1/n) Σ_{i=1..n} E[X_i] = (1/n) · nμ = μ (where E[X_i] = E[X] = μ) • So the mean of the sample mean is equal to μ = E[X] • So we say that the sample mean is an unbiased estimator for μ
The Sample Mean • Since E[M_n] = μ, the mean square error of the sample mean about μ is equal to the variance of the sample mean: E[(M_n − μ)^2] = E[(M_n − E[M_n])^2] = VAR[M_n]
The Sample Mean • Let S_n = X_1 + X_2 + … + X_n, so that M_n = S_n / n • Then VAR[M_n] = (1/n^2) VAR[S_n] = (1/n^2) · nσ^2 • So VAR[M_n] = σ^2 / n, where σ^2 = VAR[X_i] is the variance of X_i
The Sample Mean • So as n, the number of samples, increases, the variance of the sample mean approaches zero, which means that the probability that the sample mean is close to the true mean approaches one as n becomes very large.
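A quick simulation illustrates both properties of the sample mean: E[M_n] = μ and VAR[M_n] = σ²/n. The sketch below assumes Exponential(1) measurements (so μ = σ² = 1), an arbitrary choice.

```python
# Minimal sketch (assumed setup): iid Exponential(1) measurements.  The sample
# mean stays centered on the true mean while its variance shrinks like sigma^2/n.
import numpy as np

rng = np.random.default_rng(2)
sigma2 = 1.0                                    # VAR[X_i] for Exponential(1)

for n in (10, 100, 1000):
    # 20,000 independent sample means, each computed from n measurements
    m_n = rng.exponential(1.0, size=(20_000, n)).mean(axis=1)
    print(f"n = {n:4d}   E[M_n] ~ {m_n.mean():.4f}"
          f"   VAR[M_n] ~ {m_n.var():.5f}   sigma^2/n = {sigma2 / n:.5f}")
```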
The Sample Mean • Use the Chebyshev inequality to formalize this probability: P[|M_n − μ| ≥ ε] ≤ VAR[M_n]/ε^2 = σ^2/(nε^2), or equivalently P[|M_n − μ| < ε] ≥ 1 − σ^2/(nε^2) • Thus for any choice of error ε and probability 1 − δ, we can select the number of samples, n, so that the sample mean is within ε of the true mean with probability at least 1 − δ
The Sample Mean • Example: • A voltage of constant, but unknown, value is to be measured. Each measurement X_i is actually the sum of the desired voltage v and a noise voltage N_i of zero mean and standard deviation of 1 microvolt: X_i = v + N_i • Assume that the noise voltages are independent random variables. How many measurements are required so that the probability that M_n is within ε = 1 microvolt of the true mean is at least .99?
The Sample Mean • Example (continued): • From the problem, we know that each measurement X_i has mean v and variance 1. • Moreover, we know P[|M_n − v| < ε] ≥ 1 − σ^2/(nε^2) = 1 − 1/n • So we require 1 − 1/n ≥ 0.99 • We can solve the above inequality and get n = 100. • Thus if we repeat the measurement 100 times, the sample mean of the measurement results will be within 1 microvolt of the true voltage with probability at least 0.99.
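The example can be reproduced numerically. The sketch below computes n from the Chebyshev bound and then runs a Monte Carlo check; the Gaussian noise model and the value v = 5.0 are assumptions made only for the simulation.

```python
# Minimal sketch of the voltage example: sigma = 1 microvolt, eps = 1 microvolt,
# and the Chebyshev bound sigma^2 / (n * eps^2) <= 0.01 gives n = 100.
import numpy as np

sigma, eps, delta = 1.0, 1.0, 0.01
n = int(np.ceil(sigma**2 / (eps**2 * delta)))
print("measurements required by the Chebyshev bound:", n)      # 100

# Monte Carlo check: how often is the sample mean within eps of v?
rng = np.random.default_rng(3)
v = 5.0                                                         # assumed true voltage
runs = rng.normal(loc=v, scale=sigma, size=(50_000, n)).mean(axis=1)
print("P[|M_n - v| < eps] ~", np.mean(np.abs(runs - v) < eps))  # well above 0.99 (Chebyshev is loose)
```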
Weak Law of Large Numbers • If we let the number of samples, n, approach infinity, then for any ε > 0: lim_{n→∞} P[|M_n − μ| < ε] = 1 • The above is the expression of the weak law of large numbers, which states that for a large enough fixed value of n, the sample mean using n samples will be close to the true mean with high probability.
Strong Law of Large Numbers • Let X_1, X_2, … be a sequence of iid r.v.'s with finite mean μ and finite variance. Then P[lim_{n→∞} M_n = μ] = 1, which states that, with probability 1, every sequence of sample mean calculations will eventually approach and stay close to μ = E[X] • The strong law of large numbers demonstrates the consistency between the theory and the observed physical behavior.
Strong Law of Large Numbers • Example: Relative Frequency • Consider a sequence of independent repetitions of some random experiment and let the r.v. I_i(A) be the indicator function for the occurrence of event A in the ith trial. The total number of occurrences of A in the first n trials is then N_n(A) = I_1(A) + I_2(A) + … + I_n(A) • The relative frequency of event A in the first n repetitions of the experiment is then f_n(A) = N_n(A)/n • Thus the relative frequency is simply the sample mean of the random variables I_i(A)
Strong Law of Large Numbers • Example (continued): • Noting that E[I_i(A)] = P[A], apply the weak law of large numbers to the relative frequency: lim_{n→∞} P[|f_n(A) − P[A]| < ε] = 1 • Apply the strong law of large numbers to the relative frequency: P[lim_{n→∞} f_n(A) = P[A]] = 1
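A short simulation makes the relative-frequency statement concrete. In the sketch below, P[A] = 0.3 is an arbitrary assumption, and f_n(A) is computed as the sample mean of the indicator variables.

```python
# Minimal sketch: the relative frequency of an event A with P[A] = 0.3 (assumed)
# converging to P[A] as the number of trials grows, as the laws of large numbers predict.
import numpy as np

rng = np.random.default_rng(4)
p_a = 0.3
indicators = (rng.random(100_000) < p_a).astype(float)   # I_i(A) for each trial

for n in (100, 1_000, 10_000, 100_000):
    f_n = indicators[:n].mean()        # relative frequency = sample mean of the I_i(A)
    print(f"n = {n:6d}   f_n(A) = {f_n:.4f}   P[A] = {p_a}")
```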
The Central Limit Theorem • Let S_n be the sum of n iid random variables with finite mean E[X] = μ and finite variance σ^2, and let Z_n be the zero-mean, unit-variance r.v. defined by Z_n = (S_n − nμ)/(σ√n) (i.e. normalize S_n) • As we know, the pdf of a Gaussian r.v. is f_X(x) = (1/(√(2π)σ)) e^{−(x−m)^2/(2σ^2)}, where m is the mean and σ^2 is the variance of the Gaussian r.v. • Then lim_{n→∞} P[Z_n ≤ z] = (1/√(2π)) ∫_{−∞}^{z} e^{−x^2/2} dx • This states that as n becomes large, the cdf of Z_n approaches the cdf of a zero-mean, unit-variance Gaussian r.v.
The Central Limit Theorem • In the central limit theorem, the X_i can have any distribution, as long as it has a finite mean and a finite variance, which gives the theorem its wide applicability. • The central limit theorem explains why the Gaussian r.v. appears in so many applications.
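The following sketch illustrates the theorem numerically: even with strongly non-Gaussian summands (Exponential(1) is assumed here), the cdf of the normalized sum Z_n is close to the standard Gaussian cdf already for moderate n.

```python
# Minimal sketch: the cdf of the normalized sum Z_n approaches the Gaussian cdf
# even when the X_i are far from Gaussian (Exponential(1) summands assumed).
import math
import numpy as np

rng = np.random.default_rng(5)
mu, sigma = 1.0, 1.0                        # mean and std of Exponential(1)
n = 30
s_n = rng.exponential(1.0, size=(200_000, n)).sum(axis=1)
z_n = (s_n - n * mu) / (sigma * math.sqrt(n))

def gaussian_cdf(z):
    # Standard normal cdf via the complementary error function.
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

for z in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"z = {z:+.1f}   P[Z_n <= z] ~ {np.mean(z_n <= z):.4f}"
          f"   Gaussian cdf = {gaussian_cdf(z):.4f}")
```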
The Central Limit Theorem • Example: • Suppose that the orders at a restaurant are iid r.v.'s with mean μ = $8 and standard deviation σ = $2. Estimate the probability that the first 100 customers spend a total of (1) more than $840, (2) between $780 and $820. • Let X_i denote the expenditure of the ith customer; then the total spent by the first 100 customers will be S_100 = X_1 + X_2 + … + X_100 • The mean and variance of S_100 are E[S_100] = 100μ = $800 and VAR[S_100] = 100σ^2 = 400 • Normalize the sum: Z_100 = (S_100 − 800)/20
The Central Limit Theorem • Example (continued): • Thus, • (1) P[S_100 > 840] = P[Z_100 > (840 − 800)/20] = P[Z_100 > 2] ≈ Q(2) = 0.0228 • (2) P[780 ≤ S_100 ≤ 820] = P[−1 ≤ Z_100 ≤ 1] ≈ 1 − 2Q(1) = 0.6826
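A Monte Carlo check of these two estimates is sketched below. The gamma distribution used for the individual orders is an assumption, chosen only because it has mean $8 and standard deviation $2; the central limit theorem says the answers depend essentially only on those two numbers.

```python
# Minimal sketch: simulate the total spent by the first 100 customers and compare
# with the Gaussian estimates Q(2) = 0.0228 and 1 - 2Q(1) = 0.6826 from the slide.
import numpy as np

rng = np.random.default_rng(6)
orders = rng.gamma(shape=16.0, scale=0.5, size=(100_000, 100))   # assumed orders: mean 8, std 2
s_100 = orders.sum(axis=1)                                       # total of the first 100 customers

print("P[S_100 > 840]         ~", np.mean(s_100 > 840), "  (Gaussian estimate 0.0228)")
print("P[780 <= S_100 <= 820] ~", np.mean((s_100 >= 780) & (s_100 <= 820)), "  (Gaussian estimate 0.6826)")
```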
Questions? • Thank you!
More: Q-function • The values of Q(x) in the previous example come from a table of the Q-function. • The Q-function is defined by Q(x) = 1 − Φ(x) = (1/√(2π)) ∫_{x}^{∞} e^{−t^2/2} dt, where Φ(x) is the cdf of a Gaussian r.v. with zero mean and unit variance. • Properties of the Q-function: Q(0) = 1/2, Q(−x) = 1 − Q(x), and Q(x) → 0 as x → ∞
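Instead of a printed Q-table, the Q-function can be evaluated with the complementary error function, since Q(x) = ½ erfc(x/√2). A minimal sketch, including a check of the properties above:

```python
# Minimal sketch: the Q-function via math.erfc, used here in place of a Q-table.
import math

def q(x):
    """Q(x) = P[N(0, 1) > x] = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print("Q(0)  =", q(0.0))                                  # 0.5
print("Q(1)  =", q(1.0))                                  # ~0.1587, used in the restaurant example
print("Q(2)  =", q(2.0))                                  # ~0.0228, used in the restaurant example
print("Q(-1) =", q(-1.0), "= 1 - Q(1) =", 1 - q(1.0))     # symmetry property Q(-x) = 1 - Q(x)
```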
More: Proof of the central limit theorem (sketch) • The characteristic function of Z_n is given by Φ_{Z_n}(ω) = E[exp(jω(S_n − nμ)/(σ√n))] = E[∏_{i=1..n} exp(jω(X_i − μ)/(σ√n))] = (E[exp(jω(X − μ)/(σ√n))])^n • Expanding the exponential in the expression, we get E[exp(jω(X − μ)/(σ√n))] = 1 + (jω/(σ√n)) E[X − μ] − (ω^2/(2σ^2 n)) E[(X − μ)^2] + E[R_n(ω)] = 1 − ω^2/(2n) + E[R_n(ω)], since E[X − μ] = 0 and E[(X − μ)^2] = σ^2 • The remainder term E[R_n(ω)] can be neglected relative to ω^2/(2n) as n becomes large. Thus, Φ_{Z_n}(ω) ≈ (1 − ω^2/(2n))^n → e^{−ω^2/2} as n → ∞, which is the characteristic function of a zero-mean, unit-variance Gaussian r.v.
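The limiting behavior of the characteristic function can also be observed numerically. The sketch below estimates Φ_{Z_n}(ω) empirically as the sample average of exp(jωZ_n) for Exponential(1) summands (an assumed example) and compares it with e^{−ω²/2}.

```python
# Minimal sketch: empirical characteristic function of Z_n approaching exp(-w^2/2)
# as n grows, with Exponential(1) summands (an assumption) and a fixed frequency w.
import numpy as np

rng = np.random.default_rng(7)
mu, sigma = 1.0, 1.0                       # mean and std of Exponential(1)
w = 1.5                                    # arbitrary frequency at which to evaluate

for n in (5, 50, 500):
    s_n = rng.exponential(1.0, size=(200_000, n)).sum(axis=1)
    z_n = (s_n - n * mu) / (sigma * np.sqrt(n))
    phi = complex(np.mean(np.exp(1j * w * z_n)))   # empirical E[exp(j*w*Z_n)]
    print(f"n = {n:4d}   Phi_Zn({w}) ~ {phi.real:+.4f}{phi.imag:+.4f}j"
          f"   exp(-w^2/2) = {np.exp(-w**2 / 2):.4f}")
```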