Lecture 11 Joint Probability, Distributions and Random Variables
Jointly Distributed RVs

In many situations, more than one random variable is of interest to the experimenter.

• For discrete variables: the joint pmf of two discrete random variables X1 and X2 is
    p(x1, x2) = P(X1 = x1, X2 = x2)
• The marginal pmfs of X1 and X2, p1(x1) and p2(x2), are obtained by summing the joint pmf over the other variable:
    p1(x1) = Σx2 p(x1, x2)  and  p2(x2) = Σx1 p(x1, x2)
Example

Consider two production lines that manufacture a certain item. The production rates of both lines vary randomly from day to day. Line I has a capacity of 4 units per day, line II has a capacity of 3 units per day, and both lines produce at least one unit on any given day. Let X1 = number of units produced per day by line I and X2 = number of units produced per day by line II. The joint probability (pr) distribution (JPD) of X1 and X2 is tabulated below:

    x2 \ x1      1      2      3      4    p2(x2)
       1       0.01   0.05   0.10   0.04    0.20
       2       0.05   0.10   0.15   0.15    0.45
       3       0.04   0.10   0.10   0.11    0.35
    p1(x1)     0.10   0.25   0.35   0.30    1.00

The table implies that the joint pr P(X1 = 2, X2 = 3) = p(2, 3) = 0.10, p(4, 2) = 0.15, etc. Further, the row and column totals p1(x1) and p2(x2) are referred to as the marginal pr distributions (mpds) of X1 and X2, respectively.
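As a quick consistency check, the following MATLAB sketch (a minimal illustration of the table above) encodes the joint pmf as a matrix and recovers the marginals by summing across rows and columns:

    % Joint pmf p(x1, x2): rows index x1 = 1..4, columns index x2 = 1..3
    P = [0.01 0.05 0.04;
         0.05 0.10 0.10;
         0.10 0.15 0.10;
         0.04 0.15 0.11];
    sum(P(:))            % total probability: 1.00
    p1 = sum(P, 2)       % marginal pmf of X1: [0.10 0.25 0.35 0.30]'
    p2 = sum(P, 1)       % marginal pmf of X2: [0.20 0.45 0.35]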
Further, μ1 = E(X1) = 0.10 + 0.50 + 1.05 + 1.20 = 2.85 units/day, and μ2 = E(X2) = 0.20 + 0.90 + 1.05 = 2.15 units/day. Similarly,

E(X1²) = 9.05, so σ1² = σ11 = E(X1²) − μ1² = 9.05 − (2.85)² = 0.9275 and σ1 = 0.9631;
E(X2²) = 5.15, so σ2² = σ22 = E(X2²) − μ2² = 5.15 − (2.15)² = 0.5275 and σ2 = 0.7263.

The covariance between 2 random variables (rvs) is defined as:

    σ12 = COV(X1, X2) = E[(X1 − μ1)(X2 − μ2)] = E(X1X2) − μ1·μ2

For the above example,
E(X1X2) = (1)(0.01) + (2)(0.05) + (3)(0.04) + (2)(0.05) + (4)(0.10) + (6)(0.10) + (3)(0.10) + (6)(0.15) + (9)(0.10) + (4)(0.04) + (8)(0.15) + (12)(0.11) = 6.11
σ12 = 6.11 − (2.85)·(2.15) = −0.0175

Hence the covariance matrix is:

    Σ = | σ11  σ12 |  =  |  0.9275  −0.0175 |
        | σ21  σ22 |     | −0.0175   0.5275 |

Note that the covariance matrix is always symmetric because σij = σji for all i ≠ j. Further, covariance must be taken only two rvs at a time (not 3).
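Continuing the MATLAB sketch above, the moments and the covariance matrix can be verified numerically:

    x1 = (1:4)'; x2 = (1:3)';
    mu1 = x1' * p1                      % E(X1)  = 2.85
    mu2 = p2 * x2                       % E(X2)  = 2.15
    v1  = (x1.^2)' * p1 - mu1^2         % Var(X1) = 0.9275
    v2  = p2 * (x2.^2) - mu2^2          % Var(X2) = 0.5275
    cov12 = x1' * P * x2 - mu1*mu2      % Cov(X1,X2) = 6.11 - 6.1275 = -0.0175
    Sigma = [v1 cov12; cov12 v2]        % covariance matrix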
The correlation coefficient between X1 and X2, ρ, is:

    ρ = σ12 / (σ1·σ2)

• It can be shown that −1 ≤ ρ ≤ +1, where ρ = 0 implies no correlation between X1 and X2 (ρ = 0 does not always imply that X1 and X2 are independent).
• A value of |ρ| = 1 implies perfect correlation between X1 and X2.
• A positive 0 < ρ ≤ 1 implies that the relationship between X1 and X2 is increasing, and vice versa when −1 ≤ ρ < 0.
• For example, there is a positive correlation between X1 = the amount of irrigation and X2 = crop yield, while there is a negative correlation between X1 = width of road and X2 = accident rate.
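As a worked check with the numbers computed above:

    ρ = σ12 / (σ1·σ2) = −0.0175 / ((0.9631)·(0.7263)) ≈ −0.025,

a very weak negative correlation between the daily outputs of the two lines.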
Conditional Probability Distributions – Discrete rvs

The conditional probability distribution of X2 given X1 = x1 is:

    p(x2 | x1) = P(X2 = x2 | X1 = x1) = p(x1, x2) / p1(x1), provided p1(x1) > 0

• For our example, the conditional pmf of X2 given X1 = 1 is
p(1 | 1) = 0.01/0.10 = 0.10, p(2 | 1) = 0.05/0.10 = 0.50, p(3 | 1) = 0.04/0.10 = 0.40.
Conditional Expectations

The conditional expectation is computed just as the expected value of any discrete random variable, but using the conditional pmf:

    E(X2 | X1 = x1) = Σx2 x2 · p(x2 | x1)

• For our example:
• E(X2 | X1 = 1) = (0.10)·(1) + (0.50)·(2) + (0.40)·(3) = 2.30
• E(X1 | X2 = 3) = (4/35)·(1) + (10/35)·(2) + (10/35)·(3) + (11/35)·(4) = 2.80
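The same conditional expectations can be reproduced in MATLAB (continuing the sketch above; a row or column of P divided by the corresponding marginal gives a conditional pmf):

    pX2g1 = P(1,:) / p1(1)     % p(x2 | X1=1) = [0.10 0.50 0.40]
    pX2g1 * x2                 % E(X2 | X1=1) = 2.30
    pX1g3 = P(:,3) / p2(3)     % p(x1 | X2=3) = [4 10 10 11]'/35
    x1' * pX1g3                % E(X1 | X2=3) = 2.80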
Continuous Bivariate Random Variables

Example: Suppose X1 represents the surface tension and X2 the acidity of the same sampling unit of a chemical product. The joint probability density function (pdf) of the random variables X1 and X2 is:

    f(x1, x2) = C·(6 − x1 − x2), 0 ≤ x1 ≤ 2, 2 ≤ x2 ≤ 4

Compute the value of C over 0 ≤ x1 ≤ 2 and 2 ≤ x2 ≤ 4. (The volume under f(x1, x2) should be equal to ONE.)
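Setting the volume equal to one and integrating (a worked step):

    ∫0→2 ∫2→4 C·(6 − x1 − x2) dx2 dx1 = C·∫0→2 (6 − 2x1) dx1 = 8C = 1, so C = 1/8 = 0.125.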
f(x1, x2) = 0.125·(6 − x1 − x2) is a joint pdf over RX: 0 ≤ x1 ≤ 2, 2 ≤ x2 ≤ 4, because the volume under f(x1, x2) is equal to 1.00.

• Compute the joint pr that a randomly selected unit has a surface tension less than 1 and an acidity not exceeding 3, i.e.
P(0 ≤ X1 ≤ 1 and 2 ≤ X2 ≤ 3) = ∫0→1 ∫2→3 0.125·(6 − x1 − x2) dx2 dx1 = 0.375
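A numerical MATLAB check (using the built-in integral2; both results match the exact values):

    f = @(x1,x2) 0.125*(6 - x1 - x2);
    integral2(f, 0, 2, 2, 4)     % total volume = 1.0000
    integral2(f, 0, 1, 2, 3)     % P(0<=X1<=1, 2<=X2<=3) = 0.3750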
Compute P(X1 + X2 ≤ 4), i.e. P(X2 ≤ 4 − X1). You can use the MATLAB commands below (integrating in either order):

    syms x1 x2
    double(int(int(1/8*(6-x1-x2), x2, 2, 4-x1), x1, 0, 2))   % = 0.6667
    double(int(int(1/8*(6-x1-x2), x1, 0, 4-x2), x2, 2, 4))   % = 0.6667
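If the Symbolic Math Toolbox is not available, the same probability can be obtained numerically; integral2 accepts a function handle for the variable upper limit x2 = 4 − x1:

    f = @(x1,x2) 0.125*(6 - x1 - x2);
    integral2(f, 0, 2, 2, @(x1) 4 - x1)   % P(X1 + X2 <= 4) = 0.6667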
Conditional probability density functions – Continuous rvs

The conditional pdf of X2 given X1 = x1 is defined as:

    f(x2 | x1) = f(x1, x2) / f1(x1), provided f1(x1) > 0,

where f1(x1) = ∫ f(x1, x2) dx2 is the marginal pdf of X1.

• Conditional Expectation:

    E(X2 | X1 = x1) = ∫ x2 · f(x2 | x1) dx2
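Applying these definitions to the running joint pdf f(x1, x2) = (1/8)·(6 − x1 − x2) gives the following worked results:

    f1(x1) = ∫2→4 (1/8)·(6 − x1 − x2) dx2 = (6 − 2x1)/8, 0 ≤ x1 ≤ 2
    f(x2 | x1) = (6 − x1 − x2)/(6 − 2x1), 2 ≤ x2 ≤ 4
    E(X2 | x1) = ∫2→4 x2·(6 − x1 − x2)/(6 − 2x1) dx2 = (26 − 9x1)/(3·(3 − x1))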
Independent Random Variables

Two random variables X1 and X2 are independent if, for every pair of (x1, x2) values,

    p(x1, x2) = p1(x1)·p2(x2)   when X1 and X2 are discrete, or
    f(x1, x2) = f1(x1)·f2(x2)   when X1 and X2 are continuous.

If these conditions are not satisfied for all (x1, x2), then the rvs X1 and X2 are said to be dependent. In other words, the two random variables are independent if the joint pmf or pdf is the product of the two marginal pmfs or pdfs. If X1 and X2 are independent, then E(X2 | x1) cannot be a function of x1 over the range of X1, RX1, and vice versa.
For our discrete example, is p(2, 3) = 0.10 equal to p1(2)·p2(3)?
p1(2)·p2(3) = (0.25)·(0.35) = 0.0875, and 0.0875 ≠ 0.10,
so X1 and X2 are NOT independent.

• For our continuous example, E(X2 | x1) = (26 − 9x1)/(3·(3 − x1)), as derived above. Since X2 is not independent of X1, E(X2 | x1) is a function of x1 over the range space RX1 = [0, 2].
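In MATLAB (continuing the discrete sketch), independence would require the joint pmf matrix to equal the outer product of the marginals, which it does not:

    p1 * p2       % outer product of marginals; its (2,3) entry is 0.0875
    P             % joint pmf; P(2,3) = 0.10, so X1 and X2 are dependent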
Functions of Two Random Variables

Example 5.12 on page 219: compute the expected number of seats separating any particular 2 of the 5 people. Let X1 = the seat number of the 1st person and X2 = the seat number of the 2nd. The possible (x1, x2) pairs are {(1,2), (1,3), (1,4), …, (4,5)} – a total of 20 different, equally likely ways the two might have gotten their tickets. Let h(x1, x2) be the number of seats separating the two:

    h(x1, x2) = |x1 − x2| − 1
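A quick MATLAB enumeration over the 20 equally likely ordered pairs confirms the expected separation:

    h = 0; n = 0;
    for i = 1:5
        for j = 1:5
            if i ~= j
                h = h + (abs(i - j) - 1);   % seats separating this pair
                n = n + 1;
            end
        end
    end
    h / n     % E[h(X1, X2)] = 1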
Covariance

If two rvs are not independent, a measure of how strongly they are related is often needed. The covariance between two rvs X1 and X2 is:

    Cov(X1, X2) = E[(X1 − μ1)(X2 − μ2)]

A simplified (shortcut) formula is:

    Cov(X1, X2) = E(X1X2) − μ1·μ2

Note that Cov(X1, X1) = E[(X1 − μ1)²] = V(X1).
Correlation Coefficient

Covariance is a unit-dependent measure: Cov(100·X1, 100·X2) = (100)·(100)·Cov(X1, X2), so a unitless measure is more practical. The correlation coefficient of rvs X1 and X2 is:

    ρ = Corr(X1, X2) = Cov(X1, X2) / (σ1·σ2)

Note that −1 ≤ ρ ≤ +1, and that Corr(a·X1 + b, c·X2 + d) = Corr(X1, X2) whenever a·c > 0. If X1 and X2 are independent, then ρ = 0, but ρ = 0 does not imply independence!! ρ = 1 or −1 iff X2 = a·X1 + b for some numbers a and b with a ≠ 0.