CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Instructor: Longin Jan Latecki. Chapter 7: Expectation and variance
Expected values of a discrete random variable

The expectation of a discrete random variable X taking the values a1, a2, . . . and with probability mass function p is the number

E[X] = Σᵢ aᵢ P(X = aᵢ) = Σᵢ aᵢ p(aᵢ).

We also call E[X] the expected value or mean of X. Since the expectation is determined by the probability distribution of X only, we also speak of the expectation or mean of the distribution.
Example • Let X be the discrete random variable that takes the values 1, 2, 4, 8, and 16, each with probability 1/5. Compute the expectation of X: E[X] = (1 + 2 + 4 + 8 + 16)/5 = 31/5 = 6.2.
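The example can be checked directly from the definition of expectation; a minimal sketch using the example's own values and probabilities:

```python
# Expectation of a discrete random variable taking the values
# 1, 2, 4, 8, 16, each with probability 1/5 (the example above).
values = [1, 2, 4, 8, 16]
p = 1 / 5
expectation = sum(a * p for a in values)
print(expectation)  # 31/5 = 6.2
```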
Bernoulli Distribution • Let X have a Bernoulli distribution with the probability of success p. Then E[X] = 1·p + 0·(1 − p) = p and Var(X) = p(1 − p).
Binomial Distribution • Let X have a Binomial distribution with the probability of success p and the number of trials n. • Computing the expectation of X directly leads to a complicated formula, but we can use the fact that X can be represented as the sum of n independent Bernoulli variables X₁, . . . , Xₙ: E[X] = E[X₁] + · · · + E[Xₙ] = np. Note: We do not need the independence assumption for the expected value, since expectation is linear, but we do need it for the variance: Var(X) = np(1 − p).
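The Bernoulli decomposition can be sanity-checked numerically by comparing the direct summation of the binomial pmf with np; a sketch, with n and p chosen here for illustration:

```python
from math import comb

# Direct computation of E[X] for X ~ Bin(n, p) via the pmf,
# compared against the Bernoulli-sum shortcut n * p.
n, p = 10, 0.3
expectation = sum(k * comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1))
print(expectation)  # direct sum over the pmf
print(n * p)        # sum of n Bernoulli expectations: np
```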
Geometric Distribution • Let X have a Geometric distribution with the probability of success p. Then E[X] = Σ_{k≥1} k p(1 − p)^{k−1} = 1/p. We skip the derivation of the variance: Var(X) = (1 − p)/p².
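The series for the geometric expectation converges quickly, so truncating it gives a simple numerical check that it equals 1/p; a sketch with p chosen here for illustration:

```python
# Truncate the series sum_{k>=1} k * p * (1-p)**(k-1); with 1000 terms
# the remaining tail is negligible, so the result should match 1/p.
p = 0.25
approx = sum(k * p * (1 - p) ** (k - 1) for k in range(1, 1000))
print(approx)   # close to 1/p = 4
```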
Expected values of a continuous random variable

The expectation of a continuous random variable X with probability density function f is the number

E[X] = ∫_{−∞}^{∞} x f(x) dx.

We also call E[X] the expected value or mean of X. Note that E[X] is indeed the center of gravity of the mass distribution described by the function f: since ∫ f(x) dx = 1, E[X] = ∫ x f(x) dx / ∫ f(x) dx.
Uniform U(a,b) • Let X be uniform U(a, b). Then f(x) = 1/(b − a) for x in [a, b] and zero outside this interval, so E[X] = ∫ₐᵇ x/(b − a) dx = (b² − a²)/(2(b − a)) = (a + b)/2.
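A numerical sketch of the uniform case, integrating x·f(x) with a midpoint Riemann sum; the endpoints a, b and the grid size are illustrative choices:

```python
# Midpoint Riemann sum for E[X] = integral of x * (1/(b-a)) over [a, b];
# the result should approach (a + b)/2.
a, b = 2.0, 6.0
n = 100_000
dx = (b - a) / n
ex = sum((a + (i + 0.5) * dx) * (1 / (b - a)) * dx for i in range(n))
print(ex)  # close to (a + b)/2 = 4.0
```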
The EXPECTATION of a GEOMETRIC DISTRIBUTION. Let X have a geometric distribution with parameter p; then E[X] = 1/p.

The EXPECTATION of an EXPONENTIAL DISTRIBUTION. Let X have an exponential distribution with parameter λ; then E[X] = 1/λ.

The EXPECTATION of a NORMAL DISTRIBUTION. Let X be an N(μ, σ²) distributed random variable; then E[X] = μ.
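The exponential case can be checked by numerically integrating x·f(x) with f(x) = λe^{−λx} on a large finite interval; a sketch, with λ and the cutoff chosen here for illustration:

```python
from math import exp

# Midpoint Riemann sum for E[X] = integral of x * lam * exp(-lam * x)
# on [0, T]; for a large cutoff T this should approach 1/lam.
lam, T, n = 2.0, 40.0, 200_000
dx = T / n
ex = sum((i + 0.5) * dx * lam * exp(-lam * (i + 0.5) * dx) * dx for i in range(n))
print(ex)  # close to 1/lam = 0.5
```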
The CHANGE-OF-VARIABLE FORMULA. Let X be a random variable, and let g : R → R be a function. If X is discrete, taking the values a1, a2, . . . , then

E[g(X)] = Σᵢ g(aᵢ) P(X = aᵢ).

If X is continuous, with probability density function f, then

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx.

Example: Let X have a Ber(p) distribution. Compute E[2X]: E[2X] = 2·0·(1 − p) + 2·1·p = 2p.
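The discrete change-of-variable formula can be sketched for a Ber(p) variable; the function g(x) = (x + 1)² and the value of p below are illustrative choices, not part of the original example:

```python
# Discrete change-of-variable formula for X ~ Ber(p):
# E[g(X)] = g(0) * P(X = 0) + g(1) * P(X = 1),
# with an illustrative g(x) = (x + 1)**2.
p = 0.4
g = lambda x: (x + 1) ** 2
e_g = g(0) * (1 - p) + g(1) * p
print(e_g)  # 1 * (1 - p) + 4 * p = 1 + 3p
```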
The variance Var(X) of a random variable X is the number

Var(X) = E[(X − E[X])²].

Standard deviation: √Var(X).

Variance of a NORMAL DISTRIBUTION. Let X be an N(μ, σ²) distributed random variable. Then Var(X) = σ².

Variance of an EXPONENTIAL DISTRIBUTION. Let X have an exponential distribution with parameter λ; then Var(X) = 1/λ².
An alternative expression for the variance. For any random variable X,

Var(X) = E[X²] − (E[X])²,

where E[X²] is called the second moment of X. We can derive this equation from:

Var(X) = E[(X − E[X])²] = E[X²] − 2E[X]·E[X] + (E[X])² = E[X²] − (E[X])².
Example. Let X take the values 2, 3, and 4 with probabilities 0.1, 0.7, and 0.2. We can compute E[X] = 2·0.1 + 3·0.7 + 4·0.2 = 3.1, E[X²] = 4·0.1 + 9·0.7 + 16·0.2 = 9.9, and hence Var(X) = 9.9 − (3.1)² = 0.29.
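For this example, the variance can be computed both from the definition and from the second-moment expression, and the two should agree; a sketch using the example's distribution:

```python
# The example's distribution: values 2, 3, 4 with probabilities 0.1, 0.7, 0.2.
dist = [(2, 0.1), (3, 0.7), (4, 0.2)]
ex = sum(a * p for a, p in dist)                    # E[X] = 3.1
ex2 = sum(a ** 2 * p for a, p in dist)              # second moment E[X**2]
var_def = sum((a - ex) ** 2 * p for a, p in dist)   # Var(X) from the definition
var_alt = ex2 - ex ** 2                             # E[X**2] - (E[X])**2
print(ex, var_def, var_alt)
```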
Expectation and variance under change of units. For any random variable X and any real numbers r and s,

E[rX + s] = rE[X] + s   and   Var(rX + s) = r²Var(X).
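Both change-of-units rules can be verified exactly on a small discrete distribution; the distribution and the constants r, s below are chosen here for illustration:

```python
# Check E[rX + s] = r E[X] + s and Var(rX + s) = r**2 Var(X)
# on a small discrete distribution.
dist = [(0, 0.5), (1, 0.3), (2, 0.2)]
r, s = 3.0, -1.0
ex = sum(a * p for a, p in dist)
var = sum((a - ex) ** 2 * p for a, p in dist)
ey = sum((r * a + s) * p for a, p in dist)              # E[rX + s]
vary = sum((r * a + s - ey) ** 2 * p for a, p in dist)  # Var(rX + s)
print(ey, r * ex + s)        # the two expectations agree
print(vary, r ** 2 * var)    # the two variances agree
```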