
Chapter 3 Random vectors and their numerical characteristics


  1. Chapter 3 Random vectors and their numerical characteristics

  2. 3.1 Random vectors 1. n-dimensional random variables: n random variables X1, X2, ..., Xn compose an n-dimensional vector (X1, X2, ..., Xn), and the vector is named an n-dimensional random variable or random vector. 2. Joint distribution of a random vector: the function F(x1, x2, ..., xn) = P(X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn) is defined as the joint distribution function of the random vector (X1, X2, ..., Xn).
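
The definition above can be illustrated with an empirical joint distribution function computed from a sample. This is a minimal sketch; the choice of two independent U(0, 1) variables is a hypothetical example, not one from the text:

```python
import random

random.seed(0)
# Hypothetical example: X and Y independent U(0, 1).
n = 100_000
pairs = [(random.random(), random.random()) for _ in range(n)]

def empirical_F(x, y):
    """Empirical joint CDF: fraction of sample points with X <= x and Y <= y."""
    return sum(1 for (u, v) in pairs if u <= x and v <= y) / n

# For independent U(0, 1) variables the true CDF is F(x, y) = x * y,
# so empirical_F(0.5, 0.5) should be close to 0.25.
print(empirical_F(0.5, 0.5))
```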

  3. Bivariate distribution. Let (X, Y) be a bivariate random vector and (x, y) ∈ R². Define F(x, y) = P{X ≤ x, Y ≤ y} as the bivariate joint distribution function of (X, Y). Geometric interpretation: the value of F(x, y) is the probability that the random point falls in the shaded area (the quadrant below and to the left of the point (x, y)).

  4. For (x1, y1), (x2, y2) ∈ R² with x1 < x2 and y1 < y2, P{x1 < X ≤ x2, y1 < Y ≤ y2} = F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1). (The four corners of the rectangle are (x1, y1), (x2, y1), (x1, y2), (x2, y2).)
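
The inclusion-exclusion identity for rectangle probabilities can be checked numerically. The CDF F(x, y) = xy of two independent U(0, 1) variables is an assumed example, for which the rectangle probability is simply the rectangle's area:

```python
def F(x, y):
    """Joint CDF of two independent U(0, 1) variables (assumed example)."""
    cx = min(max(x, 0.0), 1.0)
    cy = min(max(y, 0.0), 1.0)
    return cx * cy

def rect_prob(x1, y1, x2, y2):
    """P{x1 < X <= x2, y1 < Y <= y2} via inclusion-exclusion on F."""
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# For independent uniforms this equals the rectangle's area:
print(rect_prob(0.2, 0.3, 0.6, 0.8))  # close to (0.6-0.2)*(0.8-0.3) = 0.2
```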

  5. EX Suppose that the joint distribution function of (X, Y) is F(x, y); find the probability that (X, Y) falls in the region G. Answer

  6. The joint distribution F(x, y) has the following characteristics: (1) For all (x, y) ∈ R², 0 ≤ F(x, y) ≤ 1; moreover F(-∞, y) = F(x, -∞) = 0 and F(+∞, +∞) = 1.

  7. (2) Monotonically increasing: for any fixed y ∈ R, x1 < x2 yields F(x1, y) ≤ F(x2, y); for any fixed x ∈ R, y1 < y2 yields F(x, y1) ≤ F(x, y2). (3) Right-hand continuous: for x ∈ R, y ∈ R, F(x + 0, y) = F(x, y) and F(x, y + 0) = F(x, y).

  8. (4) For all (x1, y1), (x2, y2) ∈ R² with x1 < x2 and y1 < y2, F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1) ≥ 0. Conversely, any real-valued function satisfying the aforementioned four characteristics must be the joint distribution function of some bivariate random vector.

  9. Example 1. Let the joint distribution of (X, Y) be • Find the values of A, B, C. • Find P{0 < X < 2, 0 < Y < 3}. Answer

  10. Discrete joint distribution. If both X and Y are discrete random variables, then (X, Y) takes values in (xi, yj), i, j = 1, 2, ..., and it is said that X and Y have a discrete joint distribution. The joint distribution law is defined to be a function such that for any point (xi, yj), P{X = xi, Y = yj} = pij, i, j = 1, 2, .... That is, (X, Y) ~ P{X = xi, Y = yj} = pij, i, j = 1, 2, ....

  11. The joint distribution law can also be specified in the following table:

      X \ Y | y1   y2   ...  yj   ...
      x1    | p11  p12  ...  p1j  ...
      x2    | p21  p22  ...  p2j  ...
      ...   | ...
      xi    | pi1  pi2  ...  pij  ...
      ...   | ...

  Characteristics of the joint distribution law:
  (1) pij ≥ 0, i, j = 1, 2, ...;
  (2) Σi Σj pij = 1.
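
A joint law like the table above is conveniently stored as a mapping from (xi, yj) to pij. The numbers below are a made-up example used only to check characteristics (1) and (2):

```python
# Hypothetical joint distribution law {(x_i, y_j): p_ij}.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Characteristic (1): every p_ij is nonnegative.
ok_nonneg = all(p >= 0 for p in joint.values())
# Characteristic (2): all p_ij sum to 1.
ok_sum = abs(sum(joint.values()) - 1.0) < 1e-12

print(ok_nonneg, ok_sum)  # True True
```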

  12. Example 2. Suppose that there are two red balls and three white balls in a bag. Draw one ball from the bag twice without replacement, and define X and Y as follows. Find the joint distribution law of (X, Y):

      X \ Y | 1    0
      1     |
      0     |
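
Assuming the natural reading of the example (X = 1 if the first draw is red, Y = 1 if the second draw is red; the slide's exact definition is not preserved in the transcript), the joint law can be computed exactly:

```python
from fractions import Fraction

red, white = 2, 3
total = red + white

# Joint law of (X, Y), multiplying the probabilities of the two draws
# (without replacement, so the second draw depends on the first).
joint = {
    (1, 1): Fraction(red, total) * Fraction(red - 1, total - 1),
    (1, 0): Fraction(red, total) * Fraction(white, total - 1),
    (0, 1): Fraction(white, total) * Fraction(red, total - 1),
    (0, 0): Fraction(white, total) * Fraction(white - 1, total - 1),
}

for xy in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(xy, joint[xy])
print(sum(joint.values()))  # 1
```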

  13. Continuous joint distributions and density functions. 1. It is said that two random variables (X, Y) have a continuous joint distribution if there exists a nonnegative function f(x, y) such that for all (x, y) ∈ R² the distribution function satisfies F(x, y) = ∫ from -∞ to x ∫ from -∞ to y f(u, v) dv du, and we denote this by (X, Y) ~ f(x, y), (x, y) ∈ R².

  14. 2. Characteristics of f(x, y): (1) f(x, y) ≥ 0, (x, y) ∈ R²; (2) ∫∫ over R² of f(x, y) dx dy = 1; (3) if f(x, y) is continuous at (x, y) ∈ R², then ∂²F(x, y)/∂x∂y = f(x, y).

  15. (4) For any region G ⊂ R², P{(X, Y) ∈ G} = ∫∫ over G of f(x, y) dx dy. EX Let (X, Y) have the given density; find P{X > Y}. (Figure: the region G inside the unit square.)

  16. EX Find (1) the value of A; (2) the value of F(1, 1); (3) the probability that (X, Y) falls in the region D: x ≥ 0, y ≥ 0, 2x + 3y ≤ 6. Answer (1) Since the density must integrate to 1,

  17. (3)

  18. 3. Bivariate uniform distribution. The bivariate vector (X, Y) is said to follow the uniform distribution on a region D if its density function is f(x, y) = 1/S(D) for (x, y) ∈ D and 0 otherwise, where S(D) is the area of D. By the definition, one can easily find that if (X, Y) is uniformly distributed, then P{(X, Y) ∈ G} = S(G ∩ D) / S(D).
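
The area-ratio property can be verified by simulation. As a hypothetical example, take D to be the unit square and G the triangle {y < x}, which covers half of D:

```python
import random

random.seed(1)
# Hypothetical case: D = unit square, G = {(x, y): y < x}.
# For uniform (X, Y) on D, P{(X, Y) in G} = area(G ∩ D) / area(D) = 1/2.
n = 200_000
hits = 0
for _ in range(n):
    x, y = random.random(), random.random()
    if y < x:
        hits += 1
print(hits / n)  # close to 0.5
```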

  19. EX Suppose that (X, Y) is uniformly distributed on the area D indicated by the graph on the left. Determine: (1) the density function of (X, Y); (2) P{Y < 2X}; (3) F(0.5, 0.5). Answer

  20. (2) Two-dimensional normal distribution. Suppose that the density function of (X, Y) is specified by f(x, y) = 1/(2π σ1 σ2 √(1-ρ²)) · exp{ -1/(2(1-ρ²)) [ (x-μ1)²/σ1² - 2ρ(x-μ1)(y-μ2)/(σ1 σ2) + (y-μ2)²/σ2² ] }, where μ1, μ2 are constants and σ1 > 0, σ2 > 0, |ρ| < 1 are also constants. Then it is said that (X, Y) follows the two-dimensional normal distribution with parameters μ1, μ2, σ1, σ2, ρ, denoted by (X, Y) ~ N(μ1, μ2, σ1², σ2², ρ).
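
A pair with this distribution can be sampled from two independent standard normals Z1, Z2 via X = μ1 + σ1 Z1, Y = μ2 + σ2(ρ Z1 + √(1-ρ²) Z2); the parameter values below are hypothetical. The sample correlation should come out near ρ:

```python
import math
import random

random.seed(2)
# Hypothetical parameters mu1, mu2, sigma1, sigma2, rho.
mu1, mu2, s1, s2, rho = 1.0, -1.0, 2.0, 0.5, 0.6

xs, ys = [], []
for _ in range(100_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(mu1 + s1 * z1)                    # X ~ N(mu1, s1^2)
    ys.append(mu2 + s2 * (rho * z1 + math.sqrt(1 - rho ** 2) * z2))

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
corr = cov / math.sqrt(vx * vy)
print(corr)  # sample correlation, close to rho = 0.6
```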

  21. The concept of joint distribution can easily be generalized to n-dimensional random vectors. Definition. Suppose that (X1, X2, ..., Xn) is an n-dimensional random vector. If for any n-dimensional cube (a1, b1] × ... × (an, bn] there exists a nonnegative f(x1, x2, ..., xn) such that the probability of the cube equals the integral of f over it, it is said that (X1, X2, ..., Xn) follows a continuous n-dimensional distribution with density function f(x1, x2, ..., xn).

  22. Definition. Suppose that (X1, X2, ..., Xn) assumes finitely or countably many points in Rⁿ. We say that (X1, X2, ..., Xn) is a discretely distributed random vector and that P{X1 = x1, X2 = x2, ..., Xn = xn}, (x1, x2, ..., xn) ∈ Rⁿ, is the distribution law of (X1, X2, ..., Xn).

  23. Summary: multidimensional random variables. Distribution function; discrete d.f. (distribution law); continuous d.f. (density function); normalization; probability for a region P{(X, Y) ∈ G}.

  24. EX: Suppose that (X, Y) has the given density function, supported on the region D in the figure. Determine: (1) P{X ≤ 0}; (2) P{X ≤ 1}; (3) P{Y ≤ y0}. Answer: P{X ≤ 0} = 0. (Figure: region D.)

  25. 3.2 Marginal distribution and independence. Marginal distribution functions for a bivariate vector. Define FX(x) = F(x, +∞) = P{X ≤ x} and FY(y) = F(+∞, y) = P{Y ≤ y} as the marginal distribution functions of (X, Y) with respect to X and Y respectively.

  26. Example 1. Suppose that the joint distribution of (X, Y) is specified by the given F(x, y). Determine FX(x) and FY(y).

  27. Marginal distribution laws for a discrete distribution. Suppose that (X, Y) ~ P{X = xi, Y = yj} = pij, i, j = 1, 2, .... Define P{X = xi} = pi. = Σj pij, i = 1, 2, ..., and P{Y = yj} = p.j = Σi pij, j = 1, 2, ..., as the marginal distribution laws of (X, Y) with respect to X and Y respectively.
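
The row and column sums above are a one-loop computation. The joint law here is a made-up example:

```python
from collections import defaultdict

# Hypothetical joint law p_ij.
joint = {(1, 1): 0.1, (1, 2): 0.3, (2, 1): 0.2, (2, 2): 0.4}

px = defaultdict(float)   # p_i. = sum over j of p_ij  (row sums)
py = defaultdict(float)   # p_.j = sum over i of p_ij  (column sums)
for (xi, yj), p in joint.items():
    px[xi] += p
    py[yj] += p

print(dict(px))  # marginal law of X
print(dict(py))  # marginal law of Y
```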

  28. Marginal density functions. Suppose that (X, Y) ~ f(x, y), (x, y) ∈ R². Define fX(x) = ∫ from -∞ to +∞ f(x, y) dy and fY(y) = ∫ from -∞ to +∞ f(x, y) dx as the marginal density functions of (X, Y) with respect to X and Y. One can easily find that the marginal distributions of N(μ1, μ2, σ1², σ2², ρ) are N(μ1, σ1²) and N(μ2, σ2²).

  29. Example 3. Suppose that the joint density function of (X, Y) is specified by the given f(x, y). Determine (1) the value of c; (2) the marginal density of (X, Y) with respect to X.

  30. Independence of random variables. Definition: It is said that X is independent of Y if for any real numbers a < b and c < d, P{a < X ≤ b, c < Y ≤ d} = P{a < X ≤ b} P{c < Y ≤ d}, i.e. the event {a < X ≤ b} is independent of {c < Y ≤ d}. Theorem: A sufficient and necessary condition for random variables X and Y to be independent is F(x, y) = FX(x) FY(y).

  31. Remark. (1) If (X, Y) is continuously distributed, then a sufficient and necessary condition for X and Y to be independent is f(x, y) = fX(x) fY(y). (2) If (X, Y) is discretely distributed with law pij = P{X = xi, Y = yj}, i, j = 1, 2, ..., then a sufficient and necessary condition for X and Y to be independent is pij = pi. p.j.
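
The discrete criterion pij = pi. p.j can be tested cell by cell. The 2×2 law below is hypothetical and was deliberately built as a product of marginals, so the check should pass:

```python
# Hypothetical 2x2 joint law constructed as a product of marginals
# (0.4, 0.6 for X and 0.3, 0.7 for Y), so independence should hold.
joint = {(0, 0): 0.12, (0, 1): 0.28, (1, 0): 0.18, (1, 1): 0.42}

px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
py = {y: sum(p for (_, yj), p in joint.items() if yj == y) for y in (0, 1)}

# Independence criterion: p_ij = p_i. * p_.j for every cell.
independent = all(abs(joint[(x, y)] - px[x] * py[y]) < 1e-9
                  for x in (0, 1) for y in (0, 1))
print(independent)  # True
```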

  32. EX: Try to determine whether the (X, Y) in Example 1, Example 2, and Example 3 are independent or not. Example 4. Suppose that the d.f. of (X, Y) is given by the following chart and that X is independent of Y; try to determine a and b.

  33. § 3.3 CONDITIONAL DISTRIBUTION. Definition 3.7 Suppose P{X = xi, Y = yj} = pij is a discrete two-dimensional distribution law. For a given yj with p.j > 0, the conditional distribution law of X given Y = yj is defined as (3.14) P{X = xi | Y = yj} = pij / p.j, i = 1, 2, .... It satisfies (1) P{X = xi | Y = yj} ≥ 0 and (2) Σi P{X = xi | Y = yj} = 1.

  34. Definition 3.9 Suppose that for any ε > 0, P{y - ε < Y ≤ y + ε} > 0 holds. If the limit of P{X ≤ x | y - ε < Y ≤ y + ε} as ε → 0+ exists, then the limit is called the conditional distribution of X given Y = y and is denoted by F(x | y) or P{X ≤ x | Y = y}.

  35. Theorem 3.6 Suppose that (X, Y) has continuous density function f(x, y) and fY(y) > 0; then (3.15) F(x | y) = ∫ from -∞ to x f(u, y) du / fY(y) is a continuous d.f. and its density function is f(x, y) / fY(y), which is called the conditional density function of X given the condition Y = y and is denoted by f(x | y) = f(x, y) / fY(y).

  36. 3.4 Functions of random vectors. Functions of discrete random vectors. Suppose that (X, Y) ~ P{X = xi, Y = yj} = pij, i, j = 1, 2, .... Then Z = g(X, Y) ~ P{Z = zk} = Σ over {(i, j): g(xi, yj) = zk} of pij = pk, k = 1, 2, ....

  37. EX Suppose that X and Y are independent and both follow the 0-1 distribution with law P{X = 0} = q, P{X = 1} = p. Try to determine the distribution laws of (1) W = X + Y; (2) V = max(X, Y); (3) U = min(X, Y); (4) the joint distribution law of W and V.

  38. Enumerate the four outcomes of (X, Y): (0,0), (0,1), (1,0), (1,1). On these outcomes W = X + Y takes the values 0, 1, 1, 2; V = max(X, Y) takes 0, 1, 1, 1; U = min(X, Y) takes 0, 0, 0, 1. Hence W takes values 0, 1, 2 and V takes values 0, 1.
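
The enumeration above can be carried out mechanically by summing pij over the outcomes mapped to each value, exactly as the formula for discrete functions prescribes; p = 0.3 is a hypothetical choice:

```python
from itertools import product

p = 0.3                 # hypothetical success probability
q = 1 - p
pmf = {0: q, 1: p}      # X and Y i.i.d. with the 0-1 distribution

law_W, law_V, law_U = {}, {}, {}
for x, y in product((0, 1), repeat=2):
    prob = pmf[x] * pmf[y]   # independence: P{X=x, Y=y} = P{X=x} P{Y=y}
    for law, val in ((law_W, x + y), (law_V, max(x, y)), (law_U, min(x, y))):
        law[val] = law.get(val, 0.0) + prob

print(law_W)  # {0: q^2, 1: 2pq, 2: p^2}
print(law_V)  # {0: q^2, 1: 1 - q^2}
print(law_U)  # {0: 1 - p^2, 1: p^2}
```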

  39. Example 3.17 Suppose and are independent of each other, then Example 3.18 Suppose and are independent of , then

  40. Let f(x, y) be the joint density function of (X, Y); then the density function of a function of (X, Y) can be obtained in the following way.

  41. Suppose f(x, y) is the density function of (X, Y), the transformation (u, v) = T(x, y) has continuous partial derivatives, and the inverse transformation (x, y) = T⁻¹(u, v) exists. The Jacobian determinant J is defined as J = ∂(x, y)/∂(u, v), the determinant of the matrix of partial derivatives ∂x/∂u, ∂x/∂v, ∂y/∂u, ∂y/∂v. Define g(u, v) = f(x(u, v), y(u, v)) |J|; then the joint density function of the transformed pair can be determined as g(u, v).
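
As a sketch of the Jacobian formula, take the linear transformation u = x + y, v = x - y, whose inverse is x = (u+v)/2, y = (u-v)/2 with |J| = 1/2; applying it to the (assumed) density of two independent standard normals and checking that the result still integrates to 1:

```python
import math

# Hypothetical f: joint density of two independent standard normals.
def f(x, y):
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

# Transformation u = x + y, v = x - y; inverse x = (u + v)/2, y = (u - v)/2.
# Jacobian of the inverse: J = (1/2)(-1/2) - (1/2)(1/2) = -1/2, so |J| = 1/2.
def g(u, v):
    x, y = (u + v) / 2, (u - v) / 2
    return f(x, y) * 0.5            # g(u, v) = f(x(u,v), y(u,v)) * |J|

# Sanity check: the transformed density must integrate to 1
# (midpoint rule over [-8, 8]^2, which captures essentially all the mass).
h = 0.05
total = sum(g(-8 + (i + 0.5) * h, -8 + (j + 0.5) * h) * h * h
            for i in range(320) for j in range(320))
print(total)  # close to 1.0
```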

  42. Example 3.23 Suppose that X is independent of Y, with marginal densities fX(x) and fY(y) respectively; then the density function of the new variable is specified by the stated formula. Remark Under the conditions of Example 3.23, one can just as easily find the corresponding density function.

  43. Numerical characteristics of random vectors. 1. Definition. Suppose that the variances of r.v.'s X and Y exist; the expectation E{[X - E(X)][Y - E(Y)]} is defined as the covariance of X and Y: Cov(X, Y) = E(XY) - E(X)E(Y). If Cov(X, Y) = 0, it is said that X and Y are uncorrelated. What is the difference between the concept of "X and Y are independent" and that of "X and Y are uncorrelated"?

  44. Example 2. Suppose that (X, Y) is uniformly distributed on D = {(x, y): x² + y² ≤ 1}. Prove that X and Y are uncorrelated but not independent. Proof

  45. Thus X and Y are uncorrelated. Since the product of the marginal densities does not equal the joint density, X is not independent of Y.
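
Example 2 can be illustrated by simulation, a sketch under the assumption of rejection sampling from the enclosing square: the sample covariance is near zero, yet the support constraint x² + y² ≤ 1 visibly ties Y to X:

```python
import random

random.seed(3)
# Sample (X, Y) uniformly on the unit disk by rejection from the square.
pts = []
while len(pts) < 100_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:
        pts.append((x, y))

n = len(pts)
ex = sum(x for x, _ in pts) / n
ey = sum(y for _, y in pts) / n
cov = sum((x - ex) * (y - ey) for x, y in pts) / n
print(abs(cov) < 0.01)  # covariance near 0: uncorrelated

# Dependence: when |X| > 0.9, |Y| is forced below sqrt(1 - 0.81) ~ 0.44,
# so knowing X restricts Y -- they cannot be independent.
near_edge = [y for x, y in pts if abs(x) > 0.9]
print(max(abs(y) for y in near_edge) < 0.5)  # True
```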

  46. 2. Properties of covariance: (1) Cov(X, Y) = Cov(Y, X); (2) Cov(X, X) = D(X); Cov(X, c) = 0; (3) Cov(aX, bY) = ab Cov(X, Y), where a, b are constants. Proof: Cov(aX, bY) = E(aX · bY) - E(aX)E(bY) = abE(XY) - aE(X) · bE(Y) = ab[E(XY) - E(X)E(Y)] = ab Cov(X, Y).

  47. (4) Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z). Proof: Cov(X + Y, Z) = E[(X + Y)Z] - E(X + Y)E(Z) = E(XZ) + E(YZ) - E(X)E(Z) - E(Y)E(Z) = Cov(X, Z) + Cov(Y, Z). (5) D(X + Y) = D(X) + D(Y) + 2Cov(X, Y). Proof Remark: D(X - Y) = D[X + (-Y)] = D(X) + D(Y) - 2Cov(X, Y).
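
Since properties (3) and (5) are algebraic identities, they hold exactly for any finite population; the four equally likely outcomes below are made-up numbers used only for verification:

```python
# Verify properties (3) and (5) on a small finite population
# (four equally likely outcomes; the numbers are made up).
X = [1.0, 2.0, 4.0, 7.0]
Y = [2.0, 1.0, 5.0, 3.0]
n = len(X)

def E(Z):
    return sum(Z) / n

def cov(A, B):
    return E([a * b for a, b in zip(A, B)]) - E(A) * E(B)

a, b = 2.0, -3.0
lhs3 = cov([a * x for x in X], [b * y for y in Y])
rhs3 = a * b * cov(X, Y)                       # (3) Cov(aX, bY) = ab Cov(X, Y)

S = [x + y for x, y in zip(X, Y)]
lhs5 = cov(S, S)                               # D(X + Y)
rhs5 = cov(X, X) + cov(Y, Y) + 2 * cov(X, Y)   # (5) D(X)+D(Y)+2Cov(X, Y)

print(abs(lhs3 - rhs3) < 1e-9, abs(lhs5 - rhs5) < 1e-9)  # True True
```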

  48. Review: definition of covariance; properties of covariance; correlation coefficient; independent vs. uncorrelated.

  49. Correlation coefficient. Definition. Suppose that r.v.'s X and Y have finite variances, denoted by DX > 0, DY > 0 respectively; then ρXY = Cov(X, Y) / √(DX · DY) is named the correlation coefficient of X and Y. Introduce X* = (X - EX) / √DX, which is the standardization of X. Obviously, EX* = 0, DX* = 1, and
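
A quick numerical check of the standardization claim EX* = 0, DX* = 1 and of the bound |ρ| ≤ 1, on hypothetical equally likely outcomes:

```python
import math

# Hypothetical equally likely outcomes (x_k, y_k).
X = [1.0, 2.0, 3.0, 4.0]
Y = [1.5, 3.1, 4.4, 6.2]
n = len(X)

EX, EY = sum(X) / n, sum(Y) / n
DX = sum((x - EX) ** 2 for x in X) / n
DY = sum((y - EY) ** 2 for y in Y) / n
covXY = sum((x - EX) * (y - EY) for x, y in zip(X, Y)) / n
rho = covXY / math.sqrt(DX * DY)      # correlation coefficient

# Standardization X* = (X - EX) / sqrt(DX): mean 0, variance 1.
Xs = [(x - EX) / math.sqrt(DX) for x in X]
mean_star = sum(Xs) / n
var_star = sum(x * x for x in Xs) / n
print(abs(mean_star) < 1e-12, abs(var_star - 1.0) < 1e-12)  # True True
print(-1.0 <= rho <= 1.0)  # True
```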
