
Pairs of Random Variables


Presentation Transcript


  1. Pairs of Random Variables Random Process

  2. Introduction • In this lecture you will study: • Joint pmf, cdf, and pdf • Joint moments • The degree of “correlation” between two random variables • Conditional probabilities of a pair of random variables

  3. Two Random Variables • A pair of random variables is a mapping that assigns a pair of real numbers (X(ζ), Y(ζ)) to each outcome ζ in the sample space S

  4. Example 1

  5. Example 2

  6. Two Random Variables • The events involving a pair of random variables (X, Y) can be represented by regions in the plane

  7. Two Random Variables • To determine the probability that the pair is in some region B of the plane, we find the equivalent event in the underlying sample space; the probability of the region is the probability of that equivalent event • The joint pmf, cdf, and pdf provide approaches to specifying the probability law that governs the behavior of the pair (X, Y) • First we consider events of product form, in which each Ak is a one-dimensional event
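  For reference, a sketch of these relations in the usual notation (the equivalent event is written B'):

      B' = \{\zeta \in S : (X(\zeta), Y(\zeta)) \in B\}, \qquad P[(X, Y) \in B] = P[B']

      A = \{X \in A_1\} \cap \{Y \in A_2\} \quad \text{(product-form event)}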

  8. Two Random Variables • The probability of product-form events is • Some two-dimensional product-form events are shown below
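  In the usual notation, the probability of a product-form event reads:

      P[A] = P[\{X \in A_1\} \cap \{Y \in A_2\}] = P[X \in A_1,\; Y \in A_2]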

  9. Pairs of Discrete Random Variables • Let the vector random variable X = (X, Y) assume values from some countable set SX,Y = {(xj, yk) : j = 1, 2, …, k = 1, 2, …} • The joint pmf of X specifies the probabilities of the events {X = xj} ∩ {Y = yk} • The values of the pmf on the set SX,Y provide the essential information: they suffice to compute the probability of any event involving X and Y
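  The standard definition of the joint pmf is:

      p_{X,Y}(x_j, y_k) = P[\{X = x_j\} \cap \{Y = y_k\}] = P[X = x_j,\; Y = y_k], \qquad j, k = 1, 2, \ldots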

  10. Pairs of Discrete Random Variables

  11. Pairs of Discrete Random Variables • The probability of any event B is the sum of the pmf over the outcomes in B • When the event B is the entire sample space SX,Y, the sum of the pmf values equals one
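  In standard form:

      P[(X, Y) \in B] = \sum_{(x_j, y_k) \in B} p_{X,Y}(x_j, y_k), \qquad \sum_{j} \sum_{k} p_{X,Y}(x_j, y_k) = 1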

  12. Marginal Probability Mass Function • The joint pmf provides the information about the joint behavior of X and Y • The marginal probability mass functions describe each of the random variables in isolation; they are obtained by summing the joint pmf over all values of the other variable
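  The usual expressions for the marginal pmf's are:

      p_X(x_j) = P[X = x_j] = \sum_{k} p_{X,Y}(x_j, y_k), \qquad p_Y(y_k) = P[Y = y_k] = \sum_{j} p_{X,Y}(x_j, y_k)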

  13. Example 3

  14. The Joint Cdf of X and Y • The joint cumulative distribution function of X and Y is defined as the probability of the event {X ≤ x} ∩ {Y ≤ y} • Its basic properties are listed below
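  The standard definition and basic properties, in the usual notation, are:

      F_{X,Y}(x, y) = P[\{X \le x\} \cap \{Y \le y\}] = P[X \le x,\; Y \le y]

      F_{X,Y}(-\infty, y) = F_{X,Y}(x, -\infty) = 0, \qquad F_{X,Y}(\infty, \infty) = 1

      F_X(x) = F_{X,Y}(x, \infty), \qquad F_Y(y) = F_{X,Y}(\infty, y) \quad \text{(marginal cdf's)}

  and F_{X,Y} is nondecreasing in each of its arguments.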

  15. The Joint Cdf of X and Y

  16. Example 4

  17. The Joint Pdf of Two Continuous Random Variables • Generally, the probability of events of arbitrary shape can be approximated by rectangles of infinitesimal width, which leads to an integral operation • Random variables X and Y are jointly continuous if the probability of events involving (X, Y) can be expressed as an integral of a probability density function • The joint probability density function is given below
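  In the usual notation, X and Y are jointly continuous with joint pdf f_{X,Y} when:

      P[(X, Y) \in B] = \iint_{B} f_{X,Y}(x', y')\, dx'\, dy', \qquad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x', y')\, dx'\, dy' = 1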

  18. The Joint Pdf of Two Continuous Random Variables

  19. The Joint Pdf of Two Continuous Random Variables • The joint cdf can be obtained by integrating the joint pdf • It follows that the joint pdf is the mixed second partial derivative of the joint cdf • The probability of a rectangular region is obtained by integrating the joint pdf over the rectangle
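  The corresponding standard relations are:

      F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(x', y')\, dy'\, dx'

      f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}

      P[a_1 < X \le b_1,\; a_2 < Y \le b_2] = \int_{a_1}^{b_1} \int_{a_2}^{b_2} f_{X,Y}(x', y')\, dy'\, dx'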

  20. The Joint Pdf of Two Continuous Random Variables • We can then show that the probability of an infinitesimal rectangle is approximately the joint pdf times the area of the rectangle • The marginal pdf's are obtained by integrating out the other variable
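  In standard form:

      P[x < X \le x + dx,\; y < Y \le y + dy] \approx f_{X,Y}(x, y)\, dx\, dy

      f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y')\, dy', \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x', y)\, dx'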

  21. The Joint Pdf of Two Continuous Random Variables

  22. Example 5

  23. Example 5

  24. Example 6

  25. Independence of Two Random Variables • X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y • If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmf's

  26. Independence of Two Random Variables • If the joint pmf of X and Y equals the product of the marginal pmf’s, then X and Y are independent • Discrete random variables X and Y are independent iff the joint pmf is equal to the product of the marginal pmf’s for all xj, yk

  27. Independence of Two Random Variables • In general, the random variables X and Y are independent iff their joint cdf is equal to the product of the marginal cdf's • In the jointly continuous case, X and Y are independent iff their joint pdf is equal to the product of the marginal pdf's
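  The three equivalent characterizations of independence, in the usual notation, are:

      p_{X,Y}(x_j, y_k) = p_X(x_j)\, p_Y(y_k) \quad \text{for all } x_j, y_k \quad \text{(discrete)}

      F_{X,Y}(x, y) = F_X(x)\, F_Y(y) \quad \text{for all } x, y

      f_{X,Y}(x, y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y \quad \text{(jointly continuous)}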

  28. Joint Moments and Expected Values • The expected value of Z = g(X, Y) is given by an integral (or a sum) of g weighted by the joint pdf (or joint pmf) • In particular, the expected value of a sum of random variables equals the sum of the expected values
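  For reference, the standard expressions for Z = g(X, Y) are:

      E[Z] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f_{X,Y}(x, y)\, dx\, dy \quad \text{(jointly continuous)}

      E[Z] = \sum_{j} \sum_{k} g(x_j, y_k)\, p_{X,Y}(x_j, y_k) \quad \text{(discrete)}

      E[X + Y] = E[X] + E[Y]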

  29. Joint Moments and Expected Values • In general, the expected value of a sum of n random variables is equal to the sum of the expected values • Suppose that X and Y are independent and that g(X, Y) = g1(X) g2(Y); then the expected value of the product factors into the product of the individual expected values
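  In standard form:

      E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n]

      E[g_1(X)\, g_2(Y)] = E[g_1(X)]\, E[g_2(Y)] \quad \text{when } X \text{ and } Y \text{ are independent}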

  30. Joint Moments and Expected Values • The jk-th joint moment of X and Y is given by E[X^j Y^k] • When j = 1 and k = 1, the joint moment E[XY] is called the correlation of X and Y • When E[XY] = 0, we say that X and Y are orthogonal
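  The standard definitions of the joint moments are:

      E[X^j Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^j y^k\, f_{X,Y}(x, y)\, dx\, dy \quad \text{(jointly continuous)}

      E[X^j Y^k] = \sum_{m} \sum_{n} x_m^j\, y_n^k\, p_{X,Y}(x_m, y_n) \quad \text{(discrete)}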

  31. Conditional Probability Case 1: X is a Discrete Random Variable • For X and Y discrete random variables, the conditional pmf of Y given X = xk is the joint pmf divided by the marginal pmf of X • The probability of an event A given X = xk is found by summing the conditional pmf over the values of Y in A • If X and Y are independent, the conditional pmf of Y reduces to the marginal pmf of Y
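  In the usual notation, for p_X(x_k) > 0:

      p_Y(y_j \mid x_k) = P[Y = y_j \mid X = x_k] = \frac{p_{X,Y}(x_k, y_j)}{p_X(x_k)}

      P[Y \in A \mid X = x_k] = \sum_{y_j \in A} p_Y(y_j \mid x_k)

      p_Y(y_j \mid x_k) = p_Y(y_j) \quad \text{when } X \text{ and } Y \text{ are independent}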

  32. Conditional Probability • The joint pmf can be expressed as the product of a conditional pmf and a marginal pmf • The probability that Y is in A is then given by the theorem on total probability
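  In standard form:

      p_{X,Y}(x_k, y_j) = p_Y(y_j \mid x_k)\, p_X(x_k) = p_X(x_k \mid y_j)\, p_Y(y_j)

      P[Y \in A] = \sum_{k} P[Y \in A \mid X = x_k]\, p_X(x_k)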

  33. Conditional Probability • Example:

  34. Conditional Probability • Suppose Y is a continuous random variable; the conditional cdf of Y given X = xk is defined as shown below • Differentiating it with respect to y gives the conditional pdf of Y given X = xk • If X and Y are independent, the conditional cdf and pdf reduce to the marginal cdf and pdf of Y • The probability of an event A given X = xk is obtained by integrating the conditional pdf over A
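  A sketch of these relations in the usual notation, for P[X = x_k] > 0:

      F_Y(y \mid x_k) = \frac{P[\{Y \le y\} \cap \{X = x_k\}]}{P[X = x_k]}, \qquad f_Y(y \mid x_k) = \frac{d}{dy} F_Y(y \mid x_k)

      F_Y(y \mid x_k) = F_Y(y) \ \text{and} \ f_Y(y \mid x_k) = f_Y(y) \quad \text{when } X \text{ and } Y \text{ are independent}

      P[Y \in A \mid X = x_k] = \int_{A} f_Y(y \mid x_k)\, dy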

  35. Conditional Probability • Example: binary communications system

  36. Conditional Probability Case 2: X is a continuous random variable • If X is a continuous random variable, then P[X = x] = 0, so the conditioning event has zero probability • If X and Y have a joint pdf that is continuous and nonzero over some region of the plane, the conditional cdf of Y given X = x is defined as a limit over the event {x < X ≤ x + h}
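  The usual limiting definition is:

      F_Y(y \mid x) = \lim_{h \to 0} P[Y \le y \mid x < X \le x + h] = \frac{\int_{-\infty}^{y} f_{X,Y}(x, y')\, dy'}{f_X(x)}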

  37. Conditional Probability • The conditional pdf of Y given X = x is the joint pdf divided by the marginal pdf of X • The probability of an event A given X = x is obtained by integrating the conditional pdf over A • If X and Y are independent, then the conditional pdf of Y equals the marginal pdf of Y, and likewise with the roles of X and Y interchanged • The probability that Y is in A is then obtained by averaging the conditional probability over X
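  In standard form, for f_X(x) > 0:

      f_Y(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}, \qquad P[Y \in A \mid X = x] = \int_{A} f_Y(y \mid x)\, dy

      f_Y(y \mid x) = f_Y(y) \ \text{and} \ f_X(x \mid y) = f_X(x) \quad \text{when } X \text{ and } Y \text{ are independent}

      P[Y \in A] = \int_{-\infty}^{\infty} P[Y \in A \mid X = x]\, f_X(x)\, dx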

  38. Conditional Probability • Example

  39. Conditional Expectation • The conditional expectation of Y given X = x is the mean of the conditional pdf of Y given X = x • When X and Y are both discrete random variables, the integral becomes a sum over the conditional pmf
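  In the usual notation:

      E[Y \mid x] = \int_{-\infty}^{\infty} y\, f_Y(y \mid x)\, dy

      E[Y \mid x_k] = \sum_{j} y_j\, p_Y(y_j \mid x_k) \quad \text{(X and Y discrete)}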

  40. Conditional Expectation • In particular we have E[Y] = E[E[Y | X]], where the outer expectation is taken with respect to the distribution of X
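  A one-line derivation in the jointly continuous case:

      E[\,E[Y \mid X]\,] = \int_{-\infty}^{\infty} E[Y \mid x]\, f_X(x)\, dx = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y\, f_Y(y \mid x)\, f_X(x)\, dy\, dx = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y\, f_{X,Y}(x, y)\, dy\, dx = E[Y]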

  41. Pairs of Jointly Gaussian Random Variables • The random variables X and Y are said to be jointly Gaussian if their joint pdf has the form shown below
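  The standard bivariate Gaussian pdf, with means m_1, m_2, standard deviations \sigma_1, \sigma_2, and correlation coefficient \rho:

      f_{X,Y}(x, y) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} \exp\!\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \frac{(x - m_1)^2}{\sigma_1^2} - \frac{2\rho (x - m_1)(y - m_2)}{\sigma_1 \sigma_2} + \frac{(y - m_2)^2}{\sigma_2^2} \right] \right\}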

  42. Lab assignment • In groups of 2 (for the international class: do it individually), refer to Garcia's book, Example 5.49, page 285 • Run the program in MATLAB and analyze the result • Your analysis should contain: • The purpose of the program • A line-by-line explanation of the program (do not copy from the book; remember NO PLAGIARISM is allowed) • An explanation of Fig. 5.28 and 5.29 • The relationship between the purpose of the program and the content of Chapter 5 (i.e. it answers the question: why do we study the Gaussian distribution in this chapter?) • Try different parameter values, such as 100 observations, 10,000 observations, etc., and analyze the results • Due date: next week
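  This is not Garcia's Example 5.49 program; it is only a minimal, generic MATLAB sketch (with assumed values for N and rho) of how correlated jointly Gaussian observations can be generated and inspected, which may help when reading the assigned code:

      % Minimal sketch, not the textbook program: generate N pairs of
      % jointly Gaussian samples with correlation coefficient rho.
      N = 1000;                                    % assumed number of observations
      rho = 0.5;                                   % assumed correlation coefficient
      U = randn(N, 2);                             % two independent standard normal columns
      X = U(:, 1);                                 % X ~ N(0, 1)
      Y = rho*U(:, 1) + sqrt(1 - rho^2)*U(:, 2);   % Y ~ N(0, 1), corr(X, Y) = rho
      plot(X, Y, '.');                             % scatter plot of the pairs
      xlabel('X'); ylabel('Y');
      title('Jointly Gaussian samples');
      corrcoef(X, Y)                               % sample correlation, close to rho for large N

  Rerunning the sketch with different N (e.g. 100 versus 10,000) shows how the scatter and the sample correlation settle toward the assumed rho as the number of observations grows.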

  43. Regular Class: NEXT WEEK: QUIZ 1 Material: Chapters 1 to 5, Garcia's book Duration: max 1 hour
