Two Functions of Two RV.s
• X and Y are two random variables with joint p.d.f f_XY(x,y), and g(x,y) and h(x,y) are two functions.
• Define the new random variables: Z = g(X,Y), W = h(X,Y).
• How to determine the joint p.d.f f_ZW(z,w)?
• With f_ZW(z,w) in hand, the marginal p.d.fs f_Z(z) and f_W(w) can be easily determined.
Two Functions of Two RV.s
• For given z and w,
• F_ZW(z,w) = P(Z ≤ z, W ≤ w) = P(g(X,Y) ≤ z, h(X,Y) ≤ w) = ∫∫_{D_zw} f_XY(x,y) dx dy,
• where D_zw is the region in the xy plane such that g(x,y) ≤ z and h(x,y) ≤ w.
Example
• X and Y are independent, uniformly distributed r.vs in (0,1).
• Define Z = max(X,Y), W = min(X,Y).
• Determine f_ZW(z,w).
Solution
• Obviously both z and w vary in the interval (0,1), so F_ZW(z,w) = 0 for z < 0 or w < 0.
• There are two cases to consider: w ≥ z and w < z.
Example - continued
• For w ≥ z: the event {max(X,Y) ≤ z} already implies {min(X,Y) ≤ w}, so F_ZW(z,w) = P(X ≤ z, Y ≤ z) = z².
• For w < z: F_ZW(z,w) = P(X ≤ z, Y ≤ z) − P(w < X ≤ z, w < Y ≤ z) = z² − (z − w)² = 2zw − w².
• With f_ZW(z,w) = ∂²F_ZW(z,w)/∂z ∂w,
• we obtain
• f_ZW(z,w) = 2 for 0 < w < z < 1, and 0 otherwise.
Example - continued
• Also, the marginal p.d.fs follow by integration:
• f_Z(z) = ∫₀^z f_ZW(z,w) dw = 2z, 0 < z < 1,
• and f_W(w) = ∫_w^1 f_ZW(z,w) dz = 2(1 − w), 0 < w < 1.
• If g(x,y) and h(x,y) are continuous and differentiable functions,
• it is possible to develop a formula to obtain the joint p.d.f f_ZW(z,w) directly.
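The joint c.d.f derived in this example is easy to check numerically. The sketch below is not part of the original slides (the function names are mine); it compares a Monte Carlo estimate of F_ZW(z,w) for the max/min transformation against the closed form obtained above:

```python
import random

def max_min_cdf_mc(z, w, n=200_000, seed=1):
    """Monte Carlo estimate of P(max(X,Y) <= z, min(X,Y) <= w)
    for X, Y independent Uniform(0,1)."""
    rng = random.Random(seed)
    count = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if max(x, y) <= z and min(x, y) <= w:
            count += 1
    return count / n

def max_min_cdf_exact(z, w):
    """Closed form from the slides: z^2 if w >= z, else 2*z*w - w^2
    (valid for 0 < z < 1, 0 < w < 1)."""
    return z * z if w >= z else 2 * z * w - w * w
```

With 200,000 samples the two agree to within a few tenths of a percent at any test point in the unit square.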
• Let us consider the pair of equations g(x,y) = z, h(x,y) = w.
• Let (x₁,y₁), (x₂,y₂), …, (xₙ,yₙ) denote the solutions to the above equations, so that g(xᵢ,yᵢ) = z and h(xᵢ,yᵢ) = w for every i.
• Consider the problem of evaluating the probability P(z < Z ≤ z + Δz, w < W ≤ w + Δw).
• This can be rewritten as P(z < g(X,Y) ≤ z + Δz, w < h(X,Y) ≤ w + Δw) ≅ f_ZW(z,w) Δz Δw.
• To translate this probability in terms of f_XY(x,y), we need to evaluate the equivalent region for the rectangle Δz Δw in the xy plane.
• The point A with coordinates (z,w) gets mapped onto the points (xᵢ,yᵢ).
• As z changes to z + Δz (point B in the zw plane),
• let Bᵢ represent its image in the xy plane.
• As w changes to w + Δw (point C),
• let Cᵢ represent its image in the xy plane.
• Finally, D = (z + Δz, w + Δw) goes to Dᵢ.
• AᵢBᵢCᵢDᵢ represents the equivalent parallelogram in the xy plane with area ΔDᵢ.
• The desired probability can be alternatively expressed as Σᵢ f_XY(xᵢ,yᵢ) ΔDᵢ.
• Equating these, we obtain f_ZW(z,w) Δz Δw = Σᵢ f_XY(xᵢ,yᵢ) ΔDᵢ.
• To simplify this, we need to evaluate the areas ΔDᵢ of the parallelograms in terms of Δz and Δw.
• Let g₁(z,w) and h₁(z,w) denote the inverse transformations, so that xᵢ = g₁(z,w) and yᵢ = h₁(z,w).
• As the point (z,w) goes to Aᵢ = (xᵢ,yᵢ),
• the point (z + Δz, w) goes to Bᵢ,
• the point (z, w + Δw) goes to Cᵢ,
• the point (z + Δz, w + Δw) goes to Dᵢ.
• Hence the x and y coordinates of Bᵢ are given by xᵢ + (∂g₁/∂z)Δz and yᵢ + (∂h₁/∂z)Δz.
• Similarly, those of Cᵢ are given by xᵢ + (∂g₁/∂w)Δw and yᵢ + (∂h₁/∂w)Δw.
• The area of the parallelogram AᵢBᵢCᵢDᵢ is given by ΔDᵢ = (AᵢBᵢ)(AᵢCᵢ) sin(θ − φ) = (AᵢBᵢ cos φ)(AᵢCᵢ sin θ) − (AᵢBᵢ sin φ)(AᵢCᵢ cos θ).
• From the figure and the coordinates above,
• AᵢBᵢ cos φ = (∂g₁/∂z)Δz, AᵢBᵢ sin φ = (∂h₁/∂z)Δz, AᵢCᵢ cos θ = (∂g₁/∂w)Δw, AᵢCᵢ sin θ = (∂h₁/∂w)Δw,
• so that ΔDᵢ = (∂g₁/∂z · ∂h₁/∂w − ∂h₁/∂z · ∂g₁/∂w) Δz Δw,
• or ΔDᵢ / (Δz Δw) = |∂g₁/∂z · ∂h₁/∂w − ∂h₁/∂z · ∂g₁/∂w|.
• This determinant is the Jacobian J(z,w) of the inverse transformation,
• J(z,w) = det [[∂g₁/∂z, ∂g₁/∂w], [∂h₁/∂z, ∂h₁/∂w]],
• so that ΔDᵢ = |J(z,w)| Δz Δw. Using this in the equation above, and noting that |J(z,w)| = 1/|J(xᵢ,yᵢ)|, we get
• f_ZW(z,w) = Σᵢ |J(z,w)| f_XY(xᵢ,yᵢ) = Σᵢ (1/|J(xᵢ,yᵢ)|) f_XY(xᵢ,yᵢ), (*)
• where J(x,y) represents the Jacobian of the original transformation:
• J(x,y) = det [[∂g/∂x, ∂g/∂y], [∂h/∂x, ∂h/∂y]], evaluated at (xᵢ,yᵢ).
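Formula (*) can be sanity-checked on a transformation whose answer is known in closed form. The sketch below is my own illustration, not from the slides: for z = x + y, w = x − y applied to independent standard Gaussians, there is a single solution x = (z+w)/2, y = (z−w)/2 with |J(x,y)| = |det [[1,1],[1,−1]]| = 2, and the result must equal the joint density of two independent N(0,2) variables:

```python
import math

def f_xy(x, y):
    # Joint density of two independent standard Gaussians.
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

def f_zw_via_jacobian(z, w):
    """Apply (*) to z = x + y, w = x - y: the single solution is
    x = (z+w)/2, y = (z-w)/2, and |J(x,y)| = 2."""
    x, y = (z + w) / 2, (z - w) / 2
    return f_xy(x, y) / 2

def f_zw_direct(z, w):
    """Z and W are independent N(0, 2), so the joint density factors."""
    return math.exp(-(z * z + w * w) / 4) / (4 * math.pi)
```

The two expressions agree exactly (up to floating-point error) at every point (z, w).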
Example
• Example 9.2: Suppose X and Y are zero-mean independent Gaussian r.vs with common variance σ².
• Define Z = √(X² + Y²) and W = tan⁻¹(Y/X), with |W| < π/2.
• Obtain f_ZW(z,w).
Solution
• Here f_XY(x,y) = (1/2πσ²) e^{−(x²+y²)/2σ²}.
• Since tan w = y/x, if (x₁,y₁) is a solution pair, then so is (−x₁,−y₁). Thus y = x tan w.
Example - continued
• Substituting this into z, we get z = √(x² + x² tan² w) = |x| sec w,
• and x = ± z cos w, y = ± z sin w.
• Thus there are two solution sets:
• (x₁,y₁) = (z cos w, z sin w) and (x₂,y₂) = (−z cos w, −z sin w),
• so that |J(z,w)| = |det [[cos w, −z sin w], [sin w, z cos w]]| = z.
Example - continued
• Also, J(x,y) = det [[x/z, y/z], [−y/z², x/z²]] = (x² + y²)/z³ = 1/z, so that |J(x,y)| = 1/z.
• Notice that here also |J(z,w)| = 1/|J(x,y)|.
• Using (*), f_ZW(z,w) = z [f_XY(z cos w, z sin w) + f_XY(−z cos w, −z sin w)] = (z/πσ²) e^{−z²/2σ²}, 0 < z < ∞, |w| < π/2.
• Thus f_Z(z) = ∫_{−π/2}^{π/2} f_ZW(z,w) dw = (z/σ²) e^{−z²/2σ²}, 0 < z < ∞,
• which represents a Rayleigh r.v with parameter σ².
Example - continued
• Also, f_W(w) = ∫₀^∞ f_ZW(z,w) dz = 1/π, |w| < π/2,
• which represents a uniform r.v in (−π/2, π/2).
• Moreover, f_ZW(z,w) = f_Z(z) f_W(w),
• so Z and W are independent.
• To summarize: if X and Y are zero-mean independent Gaussian random variables with common variance, then
• Z = √(X² + Y²) has a Rayleigh distribution,
• W = tan⁻¹(Y/X) has a uniform distribution, and
• these two derived r.vs are statistically independent.
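The Rayleigh/uniform conclusion is easy to observe by simulation. This sketch (my own, not from the slides) draws the magnitude and phase of pairs of independent Gaussians; the sample mean of Z should approach the Rayleigh mean σ√(π/2), the phase should be centered in (−π/2, π/2), and the empirical covariance of Z and W should be near zero, consistent with independence:

```python
import math
import random

def magnitude_phase_samples(n=100_000, sigma=1.0, seed=7):
    """Sample Z = sqrt(X^2 + Y^2) and W = arctan(Y/X) for independent
    zero-mean Gaussians X, Y with common variance sigma^2."""
    rng = random.Random(seed)
    zs, ws = [], []
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        y = rng.gauss(0.0, sigma)
        zs.append(math.hypot(x, y))      # magnitude: Rayleigh(sigma^2)
        ws.append(math.atan(y / x))      # phase in (-pi/2, pi/2), as in the example
    return zs, ws
```

Note that covariance zero is only a necessary condition for independence; the full factorization f_ZW = f_Z f_W is what the derivation above establishes.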
Example - continued
• Alternatively, with X and Y as independent zero-mean Gaussian r.vs with common variance, X + jY represents a complex Gaussian r.v. But
• X + jY = Z e^{jW},
• where Z and W are as above, except that for this representation to hold good on the entire complex plane we must let the phase W take values in (−π, π).
• Conclusion: the magnitude and phase of a complex Gaussian r.v are independent, with Rayleigh and uniform distributions respectively.
Example
• Let X and Y be independent exponential random variables with common parameter λ.
• Define U = X + Y, V = X − Y.
• Find the joint and marginal p.d.fs of U and V.
Solution
• It is given that f_XY(x,y) = λ² e^{−λ(x+y)}, x > 0, y > 0.
• Now since u = x + y and v = x − y, we always have |v| ≤ u, and there is only one solution, given by x = (u + v)/2, y = (u − v)/2.
Example - continued
• Moreover, the Jacobian of the transformation is given by J(x,y) = det [[1, 1], [1, −1]] = −2, so |J(x,y)| = 2,
• and hence f_UV(u,v) = (λ²/2) e^{−λu}, 0 < |v| < u < ∞, represents the joint p.d.f of U and V. This gives
• f_U(u) = ∫_{−u}^{u} (λ²/2) e^{−λu} dv = λ² u e^{−λu}, 0 < u < ∞,
• and f_V(v) = ∫_{|v|}^{∞} (λ²/2) e^{−λu} du = (λ/2) e^{−λ|v|}, −∞ < v < ∞.
• Notice that in this case f_UV(u,v) ≠ f_U(u) f_V(v), so the r.vs U and V are not independent.
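A short simulation (my own sketch, not from the slides) makes the two key facts of this example concrete: the marginal means match the gamma and Laplace densities derived above (E[U] = 2/λ, E|V| = 1/λ), and the support constraint |V| ≤ U, which is what couples U and V and destroys independence, holds for every sample:

```python
import random

def uv_samples(lam=1.0, n=100_000, seed=3):
    """Sample U = X + Y, V = X - Y for X, Y i.i.d. Exponential(lam)."""
    rng = random.Random(seed)
    us, vs = [], []
    for _ in range(n):
        x = rng.expovariate(lam)
        y = rng.expovariate(lam)
        us.append(x + y)   # gamma(2, lam) marginal: mean 2/lam
        vs.append(x - y)   # Laplace(lam) marginal: mean of |V| is 1/lam
    return us, vs
```

The constraint |v| ≤ u is exactly why the joint p.d.f cannot factor into the product of the marginals: knowing U = u confines V to (−u, u).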
As we will show, the general transformation formula in (*), which makes use of two functions, can be useful even when only one function is specified.
Auxiliary Variables
• Suppose Z = g(X,Y), where
• X and Y are two random variables.
• To determine f_Z(z) by making use of the formulation in (*), we can define an auxiliary variable W = X or W = Y,
• and the p.d.f of Z can be obtained from f_ZW(z,w) by proper integration.
Example
• Z = X + Y.
• Let W = Y, so that the transformation z = x + y, w = y is one-to-one and the solution is given by y₁ = w, x₁ = z − w.
• The Jacobian of the transformation is given by J(x,y) = det [[1, 1], [0, 1]] = 1,
• and hence f_ZW(z,w) = f_XY(x₁,y₁) = f_XY(z − w, w),
• or f_Z(z) = ∫_{−∞}^{∞} f_ZW(z,w) dw = ∫_{−∞}^{∞} f_XY(z − w, w) dw.
• This reduces to the convolution of f_X and f_Y if X and Y are independent random variables: f_Z(z) = ∫_{−∞}^{∞} f_X(z − w) f_Y(w) dw.
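The convolution integral above can be evaluated numerically for any pair of marginal densities. As a sketch (not from the slides; function names are mine), take X and Y independent Uniform(0,1): the convolution should reproduce the well-known triangular density of their sum, f_Z(z) = z on (0,1) and 2 − z on (1,2):

```python
def f_uniform(t):
    """Uniform(0,1) density."""
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def density_of_sum(z, f_x, f_y, lo=0.0, hi=1.0, steps=10_000):
    """f_Z(z) = integral of f_X(z - w) * f_Y(w) dw over [lo, hi],
    approximated with the midpoint rule."""
    h = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        w = lo + (k + 0.5) * h
        total += f_x(z - w) * f_y(w)
    return total * h
```

For densities with unbounded support the integration limits lo and hi would need to be widened accordingly.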
Example
• Let X ~ U(0,1) and Y ~ U(0,1) be independent.
• Define Z = √(−2 ln X) cos(2πY).
• Find the density function of Z.
Solution
• Making use of the auxiliary variable W = Y, we have w = y, and solving for x gives the single solution x = e^{−z²/(2 cos²(2πw))}, valid when z and cos(2πw) have the same sign.
Example - continued
• Using these in (*), we obtain
• f_ZW(z,w) = (|z| / cos²(2πw)) e^{−z²/(2 cos²(2πw))}, 0 < w < 1, z cos(2πw) > 0,
• and f_Z(z) = ∫₀¹ f_ZW(z,w) dw.
• Let u = tan(2πw), so that du = 2π sec²(2πw) dw and 1 + u² = sec²(2πw).
• Notice that as w sweeps the portion of (0,1) where cos(2πw) has the sign of z, u varies from −∞ to +∞.
Example - continued
• Using this substitution in the previous formula, we get
• f_Z(z) = (|z|/2π) ∫_{−∞}^{∞} e^{−z²(1+u²)/2} du = (|z|/2π) e^{−z²/2} · √(2π)/|z| = (1/√(2π)) e^{−z²/2}, −∞ < z < ∞.
• As you can see, Z is a standard Gaussian r.v: a Gaussian sample has been generated from two uniform samples.
• A practical procedure to generate Gaussian random variables from two independent uniformly distributed random sequences is based on this result: Z₁ = √(−2 ln X) cos(2πY) and Z₂ = √(−2 ln X) sin(2πY) (the Box–Muller transformation).
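The procedure just described is a few lines of code. This sketch (my own implementation of the Box–Muller transformation stated above; the helper name is mine) produces standard Gaussian samples from uniform draws, whose sample mean and variance should be close to 0 and 1:

```python
import math
import random

def box_muller(n, seed=11):
    """Generate n (approximately) standard Gaussian samples from pairs of
    independent Uniform(0,1) draws via Z1 = sqrt(-2 ln X) cos(2 pi Y),
    Z2 = sqrt(-2 ln X) sin(2 pi Y)."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = 1.0 - rng.random()   # shift from [0,1) into (0,1] to avoid log(0)
        y = rng.random()
        r = math.sqrt(-2.0 * math.log(x))
        out.append(r * math.cos(2 * math.pi * y))
        out.append(r * math.sin(2 * math.pi * y))
    return out[:n]
```

Each uniform pair yields two independent Gaussians, which is why the loop appends both the cosine and the sine branch.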
Example
• Let X and Y be independent identically distributed Geometric random variables with P(X = k) = P(Y = k) = p q^k, k = 0, 1, 2, …, where q = 1 − p.
• (a) Show that min(X,Y) and X − Y are independent random variables.
• (b) Show that min(X,Y) and max(X,Y) − min(X,Y) are also independent random variables.
Solution
• (a) Let Z = min(X,Y) and W = X − Y.
• Note that Z takes only nonnegative integer values, while W takes positive, zero and negative integer values.
Example - continued
• We have P(Z = m, W = n) = P(min(X,Y) = m, X − Y = n).
• But for n ≥ 0 the event {min(X,Y) = m, X − Y = n} equals {Y = m, X = m + n}, and for n < 0 it equals {X = m, Y = m − n}.
• Thus P(Z = m, W = n) = p² q^{2m+|n|}, m = 0, 1, 2, …, n = 0, ±1, ±2, …
Example - continued
• p² q^{2m+|n|} represents the joint probability mass function of the random variables Z and W. Also
• P(Z = m) = Σₙ p² q^{2m+|n|} = p² q^{2m} (1 + 2 Σ_{n≥1} q^n) = p² q^{2m} (1 + q)/(1 − q) = (1 − q²)(q²)^m.
Example - continued
• Thus Z represents a Geometric random variable with parameter q², since P(Z = m) = (1 − q²)(q²)^m, and
• P(W = n) = Σₘ p² q^{2m+|n|} = p² q^{|n|}/(1 − q²) = (p/(1 + q)) q^{|n|}, n = 0, ±1, ±2, …
• Note that P(Z = m, W = n) = P(Z = m) P(W = n), establishing the independence of the random variables Z and W.
• The independence of X − Y and min(X,Y) when X and Y are independent Geometric random variables is an interesting observation.
Example - continued
• (b) Let Z = min(X,Y) and R = max(X,Y) − min(X,Y).
• In this case both Z and R take nonnegative integer values.
• Proceeding as before, for n ≥ 1 the event {Z = m, R = n} is the union of the disjoint events {X = m, Y = m + n} and {X = m + n, Y = m}, while for n = 0 it equals {X = Y = m}; we get
• P(Z = m, R = n) = 2 p² q^{2m+n}, n = 1, 2, …, and P(Z = m, R = 0) = p² q^{2m}.
Example - continued
• This gives the joint probability mass function of Z and R.
• Also we can obtain the marginals:
• P(Z = m) = Σₙ P(Z = m, R = n) = p² q^{2m} (1 + 2q/(1 − q)) = (1 − q²)(q²)^m, m = 0, 1, 2, …,
• and P(R = n) = p/(1 + q) for n = 0, P(R = n) = 2p q^n/(1 + q) for n = 1, 2, …
• From the joint and marginal p.m.fs above, we get P(Z = m, R = n) = P(Z = m) P(R = n).
• This proves the independence of the random variables Z and R as well.
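Both independence claims can be verified by brute-force enumeration. The sketch below is my own check, not part of the slides (function names are mine; the pmf convention P(X = k) = (1 − q) q^k follows the example): it computes the joint p.m.fs of (Z, W) and (Z, R) by summing over a truncated support and compares them with the products of the marginals derived above:

```python
def geom_pmf(k, q):
    """P(X = k) = (1 - q) * q**k, k = 0, 1, 2, ..."""
    return (1 - q) * q**k

def joint_min_diff(m, n, q, kmax=120):
    """P(min(X,Y) = m, X - Y = n) by enumeration; the tail mass beyond
    kmax (on the order of q**kmax) is negligible for moderate q."""
    return sum(geom_pmf(x, q) * geom_pmf(y, q)
               for x in range(kmax) for y in range(kmax)
               if min(x, y) == m and x - y == n)

def joint_min_range(m, n, q, kmax=120):
    """P(min(X,Y) = m, max(X,Y) - min(X,Y) = n) by enumeration."""
    return sum(geom_pmf(x, q) * geom_pmf(y, q)
               for x in range(kmax) for y in range(kmax)
               if min(x, y) == m and max(x, y) - min(x, y) == n)

def pz(m, q):
    """Marginal of Z = min(X,Y): geometric with parameter q**2."""
    return (1 - q * q) * (q * q)**m

def pw(n, q):
    """Marginal of W = X - Y: (p/(1+q)) * q**|n|."""
    return (1 - q) * q**abs(n) / (1 + q)

def pr(n, q):
    """Marginal of R = max - min: p/(1+q) at n = 0, else 2p q**n/(1+q)."""
    p = 1 - q
    return p / (1 + q) if n == 0 else 2 * p * q**n / (1 + q)
```

For any (m, n) the enumerated joint probability factors into the product of the marginals, confirming both parts (a) and (b).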