
Advanced Concepts in Expectation


Presentation Transcript


  1. Advanced Concepts in Expectation

  2. Expectation of a Random Variable • Discrete distribution: E(X) = Σ x f(x), summing over every possible value x. • Continuous distribution: E(X) = ∫ x f(x) dx. • E(X) is called the expected value, mean, or expectation of X. • E(X) can be regarded as the center of gravity of the distribution. • E(X) exists if and only if E(|X|) < ∞, that is, Σ |x| f(x) < ∞ in the discrete case or ∫ |x| f(x) dx < ∞ in the continuous case. Whenever X is a bounded random variable, E(X) must exist.

  3. Examples of Expectation • Discrete distribution: Suppose X has Pr(X = -2) = 0.1, Pr(X = 0) = 0.4, Pr(X = 1) = 0.3, and Pr(X = 4) = 0.2. Then E(X) = -2(0.1) + 0(0.4) + 1(0.3) + 4(0.2) = 0.9. • Continuous distribution: for the Cauchy distribution, with p.d.f. f(x) = 1/(π(1 + x²)), E(X) does not exist. The tails of the curve y = x f(x) approach the x-axis so slowly that the area under |x| f(x) is infinite.
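
To make both examples concrete, here is a minimal Python sketch (assuming the standard Cauchy p.d.f. f(x) = 1/(π(1 + x²))): the probability-weighted sum reproduces E(X) = 0.9, and the integral of |x| f(x) over [-b, b] keeps growing with b, which is why the Cauchy expectation fails to exist.

```python
import numpy as np
from scipy import integrate

# Discrete case: E(X) is the probability-weighted sum of the values.
values = np.array([-2.0, 0.0, 1.0, 4.0])
probs = np.array([0.1, 0.4, 0.3, 0.2])
print("discrete E(X) =", values @ probs)  # 0.9

# Cauchy case: the integral of |x| f(x) over [-b, b] diverges as b grows.
# The integrand is even, so integrate over [0, b] and double.
f = lambda x: x / (np.pi * (1.0 + x * x))
for b in (1e2, 1e4, 1e6):
    half, _ = integrate.quad(f, 0.0, b, limit=200)
    print(f"b = {b:.0e}: integral of |x| f(x) over [-b, b] = {2 * half:.2f}")
```

The printed areas grow roughly like (2/π) log b, never settling toward a finite total.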

  4. The Expectation of a Function • Let Y = g(X), where X has a discrete distribution with p.f. f; then E(Y) = Σ g(x) f(x). • Let Y = g(X), where X has a continuous distribution with p.d.f. f; then E(Y) = ∫ g(x) f(x) dx. • Let Y = g(X1, …, Xn), where (X1, …, Xn) has joint p.d.f. f; it can be shown that E(Y) = ∫⋯∫ g(x1, …, xn) f(x1, …, xn) dx1 ⋯ dxn.
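
As a quick illustration of the second bullet, here is a sketch with an assumed example of my own (not from the slides): X ~ Uniform(0, 1) and g(x) = x², so E[g(X)] = ∫₀¹ x² dx = 1/3.

```python
import numpy as np
from scipy import integrate

# E[g(X)] = integral of g(x) f(x) dx; f(x) = 1 on (0, 1) for Uniform(0, 1).
g = lambda x: x ** 2
exact, _ = integrate.quad(lambda x: g(x) * 1.0, 0.0, 1.0)
print("E[g(X)] =", exact)  # 1/3

# Monte Carlo check: average g over draws of X, never forming g(E[X]).
rng = np.random.default_rng(0)
print("MC estimate =", g(rng.uniform(size=1_000_000)).mean())
```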

  5. Examples 4.1.3 & 4.1.5: Expectation of a Function • Suppose X has p.d.f. as follows: • Suppose X and Y have a uniform distribution over the square S containing all points

  6. Properties of Expectations • If Y = aX + b, where a and b are constants, then E(Y) = aE(X) + b. Proof: E(Y) = ∫ (ax + b) f(x) dx = a ∫ x f(x) dx + b ∫ f(x) dx = aE(X) + b. • If there exists a constant a such that Pr(X ≥ a) = 1, then E(X) ≥ a; likewise, if Pr(X ≤ b) = 1, then E(X) ≤ b. Proof: in the first case, E(X) = ∫ x f(x) dx ≥ ∫ a f(x) dx = a.
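
A Monte Carlo sanity check of the first property; the exponential distribution and the constants a = 3, b = -1 are my own choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1_000_000)  # E(X) = 2
a, b = 3.0, -1.0
print("E(aX + b) ≈", (a * x + b).mean())        # ≈ a E(X) + b = 5
print("a E(X) + b ≈", a * x.mean() + b)
```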

  7. Properties of Expectations • If X1, …, Xn are n random variables such that each E(Xi) exists, then E(X1 + ⋯ + Xn) = E(X1) + ⋯ + E(Xn). Proof: apply linearity of the sum or integral to the joint distribution. • For all constants a1, …, an and b, E(a1X1 + ⋯ + anXn + b) = a1E(X1) + ⋯ + anE(Xn) + b. • Usually E[g(X)] ≠ g(E(X)); only linear functions g satisfy E[g(X)] = g(E(X)).
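
The sketch below (assumed uniform examples of my own) checks the linearity property and shows how a nonlinear g breaks it: for X ~ Uniform(0, 1), E(X²) = 1/3 while [E(X)]² = 1/4.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(size=1_000_000)
y = rng.uniform(size=1_000_000)

# Linearity of expectation (holds whether or not X and Y are independent).
print((2 * x + 3 * y + 1).mean(), 2 * x.mean() + 3 * y.mean() + 1)

# A nonlinear g: E[g(X)] differs from g(E[X]).
print(np.square(x).mean(), np.square(x.mean()))  # ≈ 1/3 vs ≈ 1/4
```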

  8. Example 4.2.4 • Sampling without replacement: A box contains red balls (proportion p) and blue balls. Suppose that n balls are selected from the box at random without replacement, and let X denote the number of red balls that are selected. Determine E(X). Let Xi = 1 if the ith ball selected is red, and Xi = 0 if it is blue. Then X = X1 + ⋯ + Xn, and since E(Xi) = Pr(Xi = 1) = p for every i, E(X) = np. • Sampling with replacement (binomial distribution): Suppose n balls are selected from the box at random with replacement. The same indicator argument gives an alternative way to derive E(X) = np.
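
A simulation sketch of both sampling schemes (the box composition 30 red / 70 blue and n = 10 are assumed for illustration): the indicator argument predicts E(X) = np = 3 in both cases.

```python
import numpy as np

rng = np.random.default_rng(3)
n_red, n_blue, n = 30, 70, 10               # p = 0.3, so E(X) = np = 3
box = np.array([1] * n_red + [0] * n_blue)  # 1 = red, 0 = blue

without = [rng.choice(box, size=n, replace=False).sum() for _ in range(50_000)]
with_r = [rng.choice(box, size=n, replace=True).sum() for _ in range(50_000)]
print("without replacement:", np.mean(without))  # ≈ 3
print("with replacement:   ", np.mean(with_r))   # ≈ 3
```

Only the means coincide; the variances of the two schemes differ, as slide 14 makes precise for the with-replacement case.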

  9. Example 4.2.5: Expected Number of Matches • Suppose that a person types n letters, types the addresses on n envelopes, and then places each letter in an envelope in a random manner. Let X be the number of letters that are placed in the correct envelopes. • Let Xi = 1 if the ith letter is placed in the correct envelope, and Xi = 0 otherwise. Then Pr(Xi = 1) = 1/n, so E(Xi) = 1/n and E(X) = E(X1) + ⋯ + E(Xn) = n(1/n) = 1, regardless of n.
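
A simulation of the matching problem (n = 12 is an arbitrary choice): a random permutation models the random placement, and the average number of fixed points is 1 whatever n is.

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 12, 100_000
# A fixed point of a random permutation = a letter in its correct envelope.
matches = [(rng.permutation(n) == np.arange(n)).sum() for _ in range(trials)]
print("E(X) ≈", np.mean(matches))  # ≈ 1
```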

  10. Expectation of a Product • If X1, …, Xn are n independent random variables such that each E(Xi) exists, then E(X1X2 ⋯ Xn) = E(X1)E(X2) ⋯ E(Xn). • Proof: by independence the joint p.d.f. factors as f(x1, …, xn) = f1(x1) ⋯ fn(xn), so the integral of the product factors into a product of integrals.
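
A quick numerical check with two independent draws (the distributions are chosen only for illustration): E(XY) matches E(X)E(Y).

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.exponential(scale=2.0, size=1_000_000)  # E(X) = 2
y = rng.uniform(size=1_000_000)                 # E(Y) = 0.5, independent of X
print((x * y).mean(), x.mean() * y.mean())      # both ≈ 1
```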

  11. Variance and Standard Deviation • The variance of X, denoted by Var(X) or σ², is defined as follows: Var(X) = E[(X - μ)²], where μ = E(X). The standard deviation, denoted by σ, is defined as the square root of the variance. If X is bounded, then Var(X) must exist. • Example 4.3.1: Suppose X can take the five values -2, 0, 1, 3, 4 with equal probability. Then μ = 1.2 and Var(X) = [(-2 - 1.2)² + (0 - 1.2)² + (1 - 1.2)² + (3 - 1.2)² + (4 - 1.2)²]/5 = 4.56.
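
Example 4.3.1 in a few lines of Python:

```python
import numpy as np

values = np.array([-2.0, 0.0, 1.0, 3.0, 4.0])
probs = np.full(5, 0.2)                # equal probability
mu = values @ probs                    # mean
var = ((values - mu) ** 2) @ probs     # E[(X - mu)^2]
print(mu, var, np.sqrt(var))           # 1.2, 4.56, sd ≈ 2.135
```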

  12. Properties of the Variance • Var(X) = 0 if and only if there exists a constant c such that Pr(X = c) = 1. • For constants a and b, Var(aX + b) = a²Var(X). Proof: E(aX + b) = aμ + b, so Var(aX + b) = E[(aX + b - aμ - b)²] = a²E[(X - μ)²] = a²Var(X).
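
A simulation check of Var(aX + b) = a²Var(X); the normal distribution and the constants are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(loc=0.0, scale=2.0, size=1_000_000)  # Var(X) = 4
a, b = 3.0, 7.0
print(np.var(a * x + b))    # ≈ a^2 Var(X) = 36; b only shifts, never spreads
print(a ** 2 * np.var(x))
```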

  13. Properties of the Variance • If X1, …, Xn are independent random variables, then Var(X1 + ⋯ + Xn) = Var(X1) + ⋯ + Var(Xn). • If X1, …, Xn are independent random variables and a1, …, an, b are constants, then Var(a1X1 + ⋯ + anXn + b) = a1²Var(X1) + ⋯ + an²Var(Xn).
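
A sketch checking the second property with two independent variables (example distributions assumed): Var(2X - 5Y + 1) = 4Var(X) + 25Var(Y).

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(scale=2.0, size=1_000_000)        # Var(X) = 4
y = rng.exponential(scale=3.0, size=1_000_000)   # Var(Y) = 9, independent of X
print(np.var(2 * x - 5 * y + 1))                 # ≈ 4*4 + 25*9 = 241
print(4 * np.var(x) + 25 * np.var(y))
```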

  14. Variance of the Binomial Distribution • Suppose that a box contains red balls (proportion equals p) and blue balls. A random sample of n balls is selected with replacement. Writing X = X1 + ⋯ + Xn with independent indicator variables Xi as before, Var(Xi) = E(Xi²) - [E(Xi)]² = p - p² = p(1 - p), so Var(X) = np(1 - p).
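
Checking Var(X) = np(1 - p) against simulated binomial draws (n = 20 and p = 0.3 are assumed values):

```python
import numpy as np

rng = np.random.default_rng(8)
n, p = 20, 0.3
x = rng.binomial(n, p, size=1_000_000)
print(np.var(x), n * p * (1 - p))  # both ≈ 4.2
```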

  15. Moment Generating Functions • Consider a given random variable X, and for each real number t let ψ(t) = E(e^(tX)). The function ψ is called the moment generating function (m.g.f.) of X. • Suppose that the m.g.f. of X exists for all values of t in some open interval around t = 0. Then ψ'(0) = E(X). • More generally, the nth derivative at zero gives the nth moment: ψ^(n)(0) = E(X^n).
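
A symbolic sketch with sympy; the Exponential(1) distribution is an assumed example, whose m.g.f. is ψ(t) = 1/(1 - t) for t < 1. Differentiating at t = 0 recovers the moments.

```python
import sympy as sp

t = sp.symbols('t')
psi = 1 / (1 - t)  # m.g.f. of an Exponential(1) random variable, valid for t < 1

m1 = sp.diff(psi, t).subs(t, 0)      # psi'(0)  = E(X)
m2 = sp.diff(psi, t, 2).subs(t, 0)   # psi''(0) = E(X^2)
print(m1, m2, m2 - m1 ** 2)          # 1, 2, and Var(X) = 1
```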

  16. Example 4.4.2: Calculating an m.g.f. • Suppose that X is a random variable with p.d.f. as follows: • Determine the m.g.f. of X and also Var(X).

  17. Properties of Moment Generating Functions • Let X have m.g.f. ψ1, and let Y = aX + b have m.g.f. ψ2. Then for every value of t such that ψ1(at) exists, ψ2(t) = e^(bt)ψ1(at). Proof: ψ2(t) = E(e^(tY)) = E(e^(t(aX+b))) = e^(bt)E(e^(atX)) = e^(bt)ψ1(at). • Suppose that X1, …, Xn are n independent random variables; and for i = 1, …, n, let ψi denote the m.g.f. of Xi. Let Y = X1 + ⋯ + Xn, and let the m.g.f. of Y be denoted by ψ. Then for every value of t such that each ψi(t) exists, we have ψ(t) = ψ1(t)ψ2(t) ⋯ ψn(t). Proof: e^(tY) = e^(tX1) ⋯ e^(tXn), and by independence the expectation of the product equals the product of the expectations.
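
Continuing the sympy sketch with the same assumed Exponential(1) m.g.f.: the first property gives the m.g.f. of Y = aX + b, and its derivative at 0 returns E(Y) = aE(X) + b.

```python
import sympy as sp

t = sp.symbols('t')
psi1 = 1 / (1 - t)                          # assumed m.g.f. of X
a, b = 2, 3
psi2 = sp.exp(b * t) * psi1.subs(t, a * t)  # psi2(t) = e^(bt) psi1(at)
print(sp.simplify(sp.diff(psi2, t).subs(t, 0)))  # E(Y) = a*E(X) + b = 5
```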

  18. The m.g.f. for the Binomial Distribution • Suppose that a random variable X has a binomial distribution with parameters n and p. We can represent X as the sum of n independent Bernoulli random variables X1, …, Xn, each with m.g.f. ψi(t) = E(e^(tXi)) = pe^t + (1 - p). • Determine the m.g.f. of X = X1 + ⋯ + Xn: by the product property of slide 17, ψ(t) = (pe^t + 1 - p)^n.
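
A sympy check that this m.g.f. reproduces the mean and variance derived earlier (n = 5 chosen for the example, p kept symbolic):

```python
import sympy as sp

t, p = sp.symbols('t p')
n = 5
psi = (p * sp.exp(t) + 1 - p) ** n   # binomial m.g.f.

m1 = sp.diff(psi, t).subs(t, 0)
m2 = sp.diff(psi, t, 2).subs(t, 0)
print(sp.simplify(m1))               # n*p
print(sp.factor(m2 - m1 ** 2))       # equals n*p*(1 - p)
```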

  19. Uniqueness of Moment Generating Functions • If the m.g.f.s of two random variables X1 and X2 are identical for all values of t in an open interval around t = 0, then the probability distributions of X1 and X2 must be identical. • The additive property of the binomial distribution: Suppose X1 and X2 are independent random variables with binomial distributions with parameters n1 and p and n2 and p, respectively. Let the m.g.f. of X1 + X2 be denoted by ψ; then ψ(t) = (pe^t + 1 - p)^(n1) (pe^t + 1 - p)^(n2) = (pe^t + 1 - p)^(n1 + n2). By uniqueness, the distribution of X1 + X2 must be the binomial distribution with parameters n1 + n2 and p.

  20. Covariance and Correlation • Let X and Y be random variables having a specified joint distribution, and let E(X) = μX, E(Y) = μY, Var(X) = σX², and Var(Y) = σY². The covariance of X and Y is defined as: Cov(X, Y) = E[(X - μX)(Y - μY)]. • In order to obtain a measure of association between X and Y that is not driven by arbitrary changes in the scales of one or the other random variable, we define the correlation of X and Y as: ρ(X, Y) = Cov(X, Y)/(σX σY).
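
A numerical sketch of both definitions (the joint distribution below, Y = 0.8X plus independent noise, is an assumed example): the hand-rolled covariance and correlation agree with numpy's built-ins.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(size=1_000_000)
y = 0.8 * x + rng.normal(size=1_000_000)   # constructed to covary with X

cov = ((x - x.mean()) * (y - y.mean())).mean()   # E[(X - muX)(Y - muY)]
rho = cov / (x.std() * y.std())                  # Cov / (sigmaX * sigmaY)
print(cov, rho)                                  # ≈ 0.8 and ≈ 0.62
print(np.cov(x, y)[0, 1], np.corrcoef(x, y)[0, 1])
```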

  21. Examples 4.6.1 & 4.6.2: Computing Covariance and Correlation • Let X and Y have joint p.d.f. • Determine the covariance and correlation. The marginal p.d.f. of X is The marginal p.d.f. of Y is

  22. Properties of Covariance and Correlation • For all random variables X and Y such that 0 < σX² < ∞ and 0 < σY² < ∞, -1 ≤ ρ(X, Y) ≤ 1. Proof: by the Schwarz inequality, [Cov(X, Y)]² ≤ σX²σY². • If X and Y are independent random variables with 0 < σX² < ∞ and 0 < σY² < ∞, then Cov(X, Y) = ρ(X, Y) = 0. Proof: If X and Y are independent, then E(XY) = E(X)E(Y). Therefore, Cov(X, Y) = E(XY) - E(X)E(Y) = 0.

  23. Meaning of Correlation • Correlation only measures linear relationship. • A small value of |ρ(X, Y)| does not mean that X and Y are not close to being related: two random variables can be dependent but uncorrelated (a nonlinear relationship). • Example: Suppose that X can take only three values -1, 0, and 1, and that each of these three values has the same probability. Let Y = X², so X and Y are dependent. But E(XY) = E(X³) = E(X) = 0, so Cov(X, Y) = E(XY) - E(X)E(Y) = 0 (uncorrelated). (Figure: the three points (-1, 1), (0, 0), (1, 1) on the parabola y = x².)
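
The dependent-but-uncorrelated example, simulated directly:

```python
import numpy as np

rng = np.random.default_rng(10)
x = rng.choice([-1, 0, 1], size=1_000_000)  # each value with probability 1/3
y = x ** 2                                  # Y is a deterministic function of X

cov = ((x - x.mean()) * (y - y.mean())).mean()
print(cov)  # ≈ 0: uncorrelated, even though Y is fully determined by X
```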

  24. Properties of Variance and Covariance • If X and Y are random variables such that Var(X) < ∞ and Var(Y) < ∞, then Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y), and more generally Var(aX + bY + c) = a²Var(X) + b²Var(Y) + 2abCov(X, Y) for constants a, b, c. Proof: expand E[((X - μX) + (Y - μY))²] and identify the three terms.
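
A final simulation check of Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y), using the same style of assumed correlated pair as before:

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=1_000_000)   # correlated with X

cov = np.cov(x, y)[0, 1]
print(np.var(x + y))                        # direct estimate
print(np.var(x) + np.var(y) + 2 * cov)      # the formula agrees
```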

  25. Summary of Discrete Distributions

Distribution | p.m.f. f(x) | m.g.f. ψ(t) | E(X) | Var(X)
Bernoulli(p) | p^x (1-p)^(1-x), x = 0, 1 | pe^t + 1 - p | p | p(1-p)
Binomial(n, p) | C(n, x) p^x (1-p)^(n-x), x = 0, …, n | (pe^t + 1 - p)^n | np | np(1-p)
Poisson(λ) | e^(-λ) λ^x / x!, x = 0, 1, 2, … | e^(λ(e^t - 1)) | λ | λ
Geometric(p) | p(1-p)^x, x = 0, 1, 2, … | p/(1 - (1-p)e^t) | (1-p)/p | (1-p)/p²
Negative Binomial(r, p) | C(r+x-1, x) p^r (1-p)^x, x = 0, 1, 2, … | [p/(1 - (1-p)e^t)]^r | r(1-p)/p | r(1-p)/p²

(Here the geometric and negative binomial count the number of failures before the 1st and rth success, respectively.)

  26. Summary of Continuous Distributions

  27. Summary of Different Distributions • In a Poisson process with rate λ: the number of events in a fixed interval has a Poisson distribution, the time to the 1st event has an exponential distribution, and the time to the nth event has a gamma distribution.
