ELEC 303 – Random Signals Lecture 19 – Random processes Dr. Farinaz Koushanfar ECE Dept., Rice University Nov 12, 2009
Lecture outline • Basic concepts • Statistical averages • Autocorrelation function • Wide sense stationary (WSS) • Multiple random processes
Random processes • A random process (RP) is an extension of a RV • It is used to model random, time-varying signals • Example: “thermal noise” in circuits, caused by the random movement of electrons • An RP is a natural way to model information sources • An RP is a set of possible realizations of signal waveforms, governed by probabilistic laws • An RP instance is an entire signal (not just one number, as in the case of a RV)
Example 1 • A signal generator generates six possible sinusoids with amplitude one and phase zero • We throw a die; corresponding to the outcome F, the sinusoid frequency is 100F • Thus, each of the six possible signals is realized with equal probability • The random process is X(t) = cos(2π·100F·t)
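A minimal NumPy sketch (our addition, not part of the slides) that draws one die value F and builds the corresponding realization; the seed and time grid are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 0.02, 2000)        # 20 ms of "time", arbitrary grid

F = rng.integers(1, 7)                # die roll: 1..6, each with probability 1/6
x = np.cos(2 * np.pi * 100 * F * t)   # the realized sample function x(t; F)

idx = np.argmin(np.abs(t - 0.001))
print(f"F = {F}, frequency = {100 * F} Hz, x(0.001) ~= {x[idx]:.3f}")
```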
Example 2 • Randomly choose a phase Θ ~ U[0, 2π] • Generate a sinusoid with fixed amplitude A and fixed frequency f0 but a random phase • The RP is X(t) = A cos(2πf0t + Θ)
Example 3 • X(t)=X • Random variable X~U[-1,1]
Random processes • Corresponding to each ωi in the sample space Ω, there is a signal x(t; ωi) called a sample function or a realization of the RP • For the different ωi’s at a fixed time t0, the numbers x(t0; ωi) constitute a RV X(t0) • In other words, at any time instant, the value of a random process is a random variable
Example 4 • We throw a die; corresponding to the outcome F, the sinusoid frequency is 100F • Thus, each of the six possible signals is realized with equal probability • The random process is X(t) = cos(2π·100F·t) • Determine the values of the RV X(0.001) • The possible values are cos(0.2π), cos(0.4π), …, cos(1.2π), each with probability 1/6
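A quick numeric check (our addition), assuming the process of Example 1, X(t) = cos(2π·100F·t), so that X(0.001) = cos(0.2πF):

```python
import numpy as np

# enumerate the six equally likely values of the RV X(0.001) = cos(0.2*pi*F)
for F in range(1, 7):
    print(f"F = {F}:  X(0.001) = cos({0.2 * F:.1f}*pi) = {np.cos(0.2 * np.pi * F):+.4f}   (prob. 1/6)")
```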
Example 5 • Ω is the sample space for throwing a die • For all ωi ∈ Ω, let x(t; ωi) = ωi e^(−1) • X is a RV taking values e^(−1), 2e^(−1), …, 6e^(−1), each with probability 1/6
Example 6 • Example of a discrete-time random process • Let ωi denote the outcome of the i-th of a sequence of independent drawings from N(0,1) • The discrete-time RP is {Xn}, n = 1 to ∞, with X0 = 0 and Xn = Xn−1 + ωn for all n ≥ 1
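A short sketch (our addition, not from the slides) generating a few sample paths of this discrete-time RP, which is a Gaussian random walk; path count and length are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_paths = 100, 3                     # arbitrary sizes

w = rng.standard_normal((n_paths, n_steps))   # i.i.d. N(0,1) increments w_n
X = np.concatenate([np.zeros((n_paths, 1)),   # X_0 = 0
                    np.cumsum(w, axis=1)], axis=1)

print(X[:, :5])                               # first few values of each sample path
```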
Statistical averages • mX(t) is the mean of the random process X(t) • At each t = t0, it is the mean of the RV X(t0) • Thus, mX(t) = E[X(t)] for all t • If the PDF of X(t0) is denoted by fX(t0)(x), then mX(t0) = ∫ x fX(t0)(x) dx
Example 7 • Randomly choose a phase Θ ~ U[0, 2π] • Generate a sinusoid with fixed amplitude A and fixed frequency f0 but a random phase • The RP is X(t) = A cos(2πf0t + Θ) • We can compute the mean • For θ ∈ [0, 2π], fΘ(θ) = 1/(2π), and zero otherwise • E[X(t)] = ∫{0 to 2π} A cos(2πf0t + θ) · 1/(2π) dθ = 0
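A Monte Carlo check of this mean (our addition, not from the slides); the amplitude A, frequency f0, time t, and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
A, f0, t = 2.0, 5.0, 0.137                      # arbitrary amplitude, frequency, time
theta = rng.uniform(0, 2 * np.pi, size=200_000)  # Theta ~ U[0, 2*pi]

m_hat = np.mean(A * np.cos(2 * np.pi * f0 * t + theta))
print(f"sample mean of X({t}): {m_hat:.4f}   (theory: 0)")
```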
Autocorrelation function • The autocorrelation function of the RP X(t) is denoted by RX(t1,t2)=E[X(t1)X(t2)] • RX(t1,t2) is a deterministic function of t1 and t2
Example 8 • The autocorrelation of the RP in ex. 7 is RX(t1,t2) = E[A cos(2πf0t1 + Θ) · A cos(2πf0t2 + Θ)] = (A²/2) cos(2πf0(t1 − t2)) • We have used the identity cos(a) cos(b) = ½[cos(a − b) + cos(a + b)], noting that the cos(… + 2Θ) term averages to zero
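A Monte Carlo check of this autocorrelation (our addition); the values of A, f0, t1, t2 and the sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
A, f0, t1, t2 = 2.0, 5.0, 0.25, 0.11             # arbitrary choices
theta = rng.uniform(0, 2 * np.pi, size=200_000)  # Theta ~ U[0, 2*pi]

R_hat = np.mean(A * np.cos(2 * np.pi * f0 * t1 + theta) *
                A * np.cos(2 * np.pi * f0 * t2 + theta))
R_theory = (A ** 2 / 2) * np.cos(2 * np.pi * f0 * (t1 - t2))
print(f"estimate {R_hat:.4f}  vs  (A^2/2) cos(2 pi f0 (t1 - t2)) = {R_theory:.4f}")
```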
Example 9 • X(t)=X • Random variable X~U[-1,1] • Find the autocorrelation function
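A brief worked answer (our addition; the slide leaves this as an exercise): since X(t1) = X(t2) = X for every pair of times, RX(t1,t2) = E[X²] = ∫{−1 to 1} x² · (1/2) dx = 1/3 for all t1, t2; together with mX(t) = E[X] = 0, this process is WSS.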
Wide sense stationary process • A process is wide sense stationary (WSS) if its mean and autocorrelation do not depend on the choice of the time origin • WSS RP: the following two conditions hold • mX(t) = E[X(t)] is independent of t • RX(t1,t2) depends only on the time difference τ = t1 − t2 and not on t1 and t2 individually • From the definition, RX(t1,t2) = RX(t2,t1); if the RP is WSS, then RX(τ) = RX(−τ)
Example 8 (cont’d) • The autocorrelation of the RP in ex. 7 is RX(t1,t2) = (A²/2) cos(2πf0(t1 − t2)), which depends only on τ = t1 − t2 • Also, we saw that mX(t) = 0 • Thus, this process is WSS
Example 10 • Randomly choose a phase Θ ~ U[0, π] • Generate a sinusoid with fixed amplitude A and fixed frequency f0 but a random phase • The new RP is Y(t) = A cos(2πf0t + Θ) • We can compute the mean • For θ ∈ [0, π], fΘ(θ) = 1/π, and zero otherwise • mY(t) = E[Y(t)] = ∫{0 to π} A cos(2πf0t + θ) · 1/π dθ = −(2A/π) sin(2πf0t) • Since mY(t) is not independent of t, Y(t) is a nonstationary RP
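A Monte Carlo check (our addition) that this mean does depend on t; A, f0, the chosen times, and the sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
A, f0 = 2.0, 5.0                                 # arbitrary amplitude and frequency
theta = rng.uniform(0, np.pi, size=200_000)      # Theta ~ U[0, pi]

for t in (0.0, 0.05, 0.12):
    m_hat = np.mean(A * np.cos(2 * np.pi * f0 * t + theta))
    m_theory = -(2 * A / np.pi) * np.sin(2 * np.pi * f0 * t)
    print(f"t = {t}:  estimate {m_hat:+.4f},  theory {m_theory:+.4f}")
```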
Multiple RPs • Two RPs X(t) and Y(t) are independent if, for all t1 and t2, the RVs X(t1) and Y(t2) are independent • Similarly, X(t) and Y(t) are uncorrelated if, for all t1 and t2, the RVs X(t1) and Y(t2) are uncorrelated • Recall that independence implies uncorrelatedness, but the reverse is not generally true • The only exception is Gaussian processes (TBD next time), where the two are equivalent
Cross correlation and joint stationarity • The cross correlation between two RPs X(t) and Y(t) is defined as RXY(t1,t2) = E[X(t1)Y(t2)] • Clearly, RYX(t1,t2) = RXY(t2,t1) • Two RPs X(t) and Y(t) are jointly WSS if both are individually WSS and the cross correlation depends only on τ = t1 − t2 • For X and Y jointly stationary, RXY(τ) = RYX(−τ)
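A small sketch (our addition, not from the slides) illustrating the last property on a pair of jointly WSS processes we construct ourselves from one shared phase Θ ~ U[0, 2π]: X(t) = A cos(2πf0t + Θ) and Y(t) = A sin(2πf0t + Θ). All parameter values are arbitrary; the two estimates use different absolute times but lags +0.25 and −0.25, so they should agree.

```python
import numpy as np

rng = np.random.default_rng(5)
A, f0 = 1.0, 3.0                                 # arbitrary choices
theta = rng.uniform(0, 2 * np.pi, size=500_000)  # shared random phase Theta

def X(t):                                        # X(t) = A cos(2 pi f0 t + Theta)
    return A * np.cos(2 * np.pi * f0 * t + theta)

def Y(t):                                        # Y(t) = A sin(2 pi f0 t + Theta)
    return A * np.sin(2 * np.pi * f0 * t + theta)

R_XY_tau = np.mean(X(0.40) * Y(0.15))            # R_XY at lag t1 - t2 = +0.25
R_YX_negtau = np.mean(Y(0.60) * X(0.85))         # R_YX at lag t1 - t2 = -0.25
print(f"R_XY(0.25) = {R_XY_tau:.4f},  R_YX(-0.25) = {R_YX_negtau:.4f}")
```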