TE-306 Digital Communications
Random Signals and Random Variables • All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent. • Let a random variable X(A) represent the functional relationship between a random event A and a real number. • Notation - Capital letters, usually X or Y, denote random variables; corresponding lower-case letters, x or y, denote particular values of the random variables X or Y. • Example: P(X ≤ 3) means the probability that the random variable X will take a value less than or equal to 3.
Probability Mass Function (discrete RVs) • The probability mass function p(x) specifies the probability of each outcome x and has the properties: p(x) ≥ 0 for all x, and Σ_x p(x) = 1.
Cumulative Distribution Function • The cumulative distribution function specifies the probability that the random variable will assume a value less than or equal to a certain value x. • The cumulative distribution F(x) of a discrete random variable X with probability mass function p(x) is given by: F(x) = P(X ≤ x) = Σ_{t ≤ x} p(t)
Mean & Standard Deviation of a Discrete Random Variable X • Mean or expected value: μ = E[X] = Σ_x x p(x) • Standard deviation: σ = √( Σ_x (x − μ)² p(x) )
Example • The probability mass function p(x) and cumulative distribution function F(x) of X:
x    p(x)    F(x)
0    0.001   0.001
1    0.027   0.028
2    0.243   0.271
3    0.729   1.000
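A minimal sketch in Python (not part of the original slides) checking this example against the definitions above: the pmf sums to one, and the mean, standard deviation, and CDF follow from p(x).

```python
import numpy as np

x = np.array([0, 1, 2, 3])
p = np.array([0.001, 0.027, 0.243, 0.729])  # p(x) from the example

assert np.isclose(p.sum(), 1.0)             # pmf property: sums to 1

mean = np.sum(x * p)                        # E[X] = sum of x p(x)
std = np.sqrt(np.sum((x - mean) ** 2 * p))  # sqrt of second central moment
F = np.cumsum(p)                            # F(x) = sum of p(t) for t <= x

print(mean, std)  # 2.7, ~0.52
print(F)          # [0.001 0.028 0.271 1.   ]
```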
Example Contd. [Figure: plots of the probability mass function p(x) and cumulative distribution function F(x) for the example above.]
Probability Density Function • The function f(x) is a probability density function for the continuous random variable X, defined over the set of real numbers R, if: 1. f(x) ≥ 0 for all x ∈ R; 2. ∫_{−∞}^{∞} f(x) dx = 1; 3. P(a < X < b) = ∫_a^b f(x) dx.
Probability Density Function • Note: for any specific value x = c, f(c) is not itself a probability; rather, f(x) can be thought of as a smoothed relative frequency histogram.
Cumulative Distribution Function • The cumulative distribution function F(x) of a continuous random variable X with density function f(x) is given by: F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt
Probability Density Distribution [Figure: a density f(t) and its cumulative distribution F(t). f(t1) is a value of f, not a probability; the area under f(t) between t1 and t2 is P(t1 < T < t2) = F(t2) − F(t1). Note: F(t) is the cumulative area under f(t).]
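A small numeric check of the relation in the figure (illustration only; the choice of an exponential density is an assumption, not from the slides): the area under f(t) over (t1, t2) equals F(t2) − F(t1).

```python
from scipy.integrate import quad
from scipy.stats import expon

t1, t2 = 0.5, 2.0
area, _ = quad(expon.pdf, t1, t2)     # integral of f(t) over (t1, t2)
diff = expon.cdf(t2) - expon.cdf(t1)  # F(t2) - F(t1)
print(area, diff)                     # both ~0.4712
```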
Ensemble Averages • The first moment of the probability distribution of a random variable X is called the mean value mX, or expected value, of X: mX = E[X] = ∫_{−∞}^{∞} x f(x) dx • The second moment of a probability distribution is the mean-square value of X: E[X²] = ∫_{−∞}^{∞} x² f(x) dx • Central moments are the moments of the difference between X and mX; the second central moment is the variance of X: σ² = E[(X − mX)²] • The variance is equal to the difference between the mean-square value and the square of the mean: σ² = E[X²] − mX²
Special Continuous Probability Distributions • Uniform Distribution • Normal Distribution • Lognormal Distribution • Exponential Distribution • Weibull Distribution
Uniform Distribution • Probability density function: f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise. [Figure: rectangular pdf of height 1/(b − a) between a and b.]
Uniform Distribution • Cumulative distribution function: F(x) = 0 for x < a; F(x) = (x − a)/(b − a) for a ≤ x ≤ b; F(x) = 1 for x > b. [Figure: F(x) rising linearly from 0 at a to 1 at b.]
Uniform Distribution: Mean & Standard Deviation • Mean: μ = (a + b)/2 • Standard deviation: σ = (b − a)/√12
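A quick sanity check of these two formulas in Python (a and b here are arbitrary illustration values, not from the slides):

```python
import numpy as np

a, b = 2.0, 5.0
rng = np.random.default_rng(0)
samples = rng.uniform(a, b, size=1_000_000)

print((a + b) / 2, samples.mean())           # both ~3.5
print((b - a) / np.sqrt(12), samples.std())  # both ~0.866
```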
Random Processes • A random process X(A, t) can be viewed as a function of two variables: an event A and time t.
Statistical Averages of a Random Process • A random process whose distribution functions are continuous can be described statistically with a probability density function (pdf). • A partial description consisting of the mean and autocorrelation function is often adequate for the needs of communication systems. • Mean of the random process X(t): mX(t) = E[X(t)] = ∫_{−∞}^{∞} x p_X(t)(x) dx, where p_X(t)(x) is the pdf of X(t) (1.30) • Autocorrelation function of the random process X(t): RX(t1, t2) = E[X(t1) X(t2)] (1.31)
Stationarity • A random process X(t) is said to be stationary in the strict sense if none of its statistics are affected by a shift in the time origin. • A random process is said to be wide-sense stationary (WSS) if two of its statistics, its mean and autocorrelation function, do not vary with a shift in the time origin: mX(t) = mX = constant (1.32) RX(t1, t2) = RX(t1 − t2) (1.33)
Autocorrelation of a Wide-Sense Stationary Random Process • For a wide-sense stationary process, the autocorrelation function is only a function of the time difference τ = t1 − t2: RX(τ) = E[X(t) X(t + τ)] for all τ (1.34) • Properties of the autocorrelation function of a real-valued wide-sense stationary process: 1. RX(τ) = RX(−τ), symmetrical in τ about zero; 2. RX(τ) ≤ RX(0) for all τ, maximum value occurs at the origin; 3. RX(τ) ↔ GX(f), autocorrelation and power spectral density form a Fourier transform pair; 4. RX(0) = E[X²(t)], value at the origin is equal to the average power of the signal.
Time Averaging and Ergodicity • When a random process belongs to a special class, known as an ergodic process, its time averages equal its ensemble averages. • The statistical properties of such processes can be determined by time averaging over a single sample function of the process. • A random process is ergodic in the mean if: mX = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt (1.35) • It is ergodic in the autocorrelation function if: RX(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt (1.36)
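A minimal illustration of ergodicity in the mean. One assumption not from the slides: X(t) = cos(2πf0·t + θ), with θ uniform on [0, 2π), is used here because it is a standard example of a process that is ergodic in the mean.

```python
import numpy as np

rng = np.random.default_rng(1)
f0, fs, T = 5.0, 1000.0, 100.0
t = np.arange(0.0, T, 1.0 / fs)

# Time average over ONE long sample function (a single random theta):
theta = rng.uniform(0.0, 2.0 * np.pi)
time_avg = np.mean(np.cos(2.0 * np.pi * f0 * t + theta))

# Ensemble average at ONE fixed instant over many sample functions:
thetas = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
ens_avg = np.mean(np.cos(2.0 * np.pi * f0 * t[0] + thetas))

print(time_avg, ens_avg)  # both ~0 = mX
```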
Power Spectral Density and Autocorrelation • A random process X(t) can generally be classified as a power signal having a power spectral density (PSD) GX(f). • Principal features of PSD functions: 1. GX(f) ≥ 0, and GX(f) is always real-valued for X(t) real-valued; 2. GX(f) = GX(−f); 3. GX(f) ↔ RX(τ), PSD and autocorrelation form a Fourier transform pair; 4. PX = ∫_{−∞}^{∞} GX(f) df, the relationship between average normalized power and PSD.
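A rough numeric illustration (not from the slides) of the Fourier-transform pair between PSD and autocorrelation: the inverse FFT of a periodogram matches a direct autocorrelation estimate. The 8-tap moving average is an arbitrary choice to make RX(τ) nontrivial.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2 ** 14
x = np.convolve(rng.standard_normal(n), np.ones(8) / 8, mode="same")

psd = np.abs(np.fft.fft(x)) ** 2 / n  # periodogram (PSD estimate)
R = np.fft.ifft(psd).real             # inverse FFT -> autocorrelation

for lag in (0, 1, 4, 8):              # compare with direct estimates
    direct = np.mean(x[: n - lag] * x[lag:])
    print(lag, round(direct, 4), round(R[lag], 4))
```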
Summary: for an ergodic random process, time averages and ensemble averages can be interchanged.
Noise in Communication Systems • The term noise refers to unwanted electrical signals that are always present in electrical systems; e.g., spark-plug ignition noise, switching transients, and other radiating electromagnetic signals. • Thermal noise can be described as a zero-mean Gaussian random process. • A Gaussian process n(t) is a random function whose amplitude at any arbitrary time t is statistically characterized by the Gaussian probability density function: p(n) = (1/(σ√(2π))) exp[ −(1/2)(n/σ)² ] (1.40) where σ² is the variance of n. • The normalized or standardized Gaussian density function of a zero-mean process is obtained by assuming that σ = 1.
Noise in Communication Systems • A random signal is often represented as the sum of a Gaussian noise random variable and a dc signal: z = a + n, where z is the random signal, a is the dc component, and n is the Gaussian noise random variable. • The pdf p(z) is then expressed as: p(z) = (1/(σ√(2π))) exp[ −(1/2)((z − a)/σ)² ] where σ² is the variance of n. • The Gaussian distribution is often used as the system noise model because of the central limit theorem.
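A short sketch comparing the empirical distribution of z = a + n against this pdf; a and σ are arbitrary illustration values, not from the slides.

```python
import numpy as np
from scipy.stats import norm

a, sigma = 2.0, 0.5
rng = np.random.default_rng(3)
z = a + rng.normal(0.0, sigma, size=1_000_000)

lo, hi = 2.0, 2.1  # empirical probability of a bin vs. the Gaussian model
print(np.mean((z > lo) & (z < hi)))  # ~0.079
print(norm.cdf(hi, loc=a, scale=sigma) - norm.cdf(lo, loc=a, scale=sigma))
```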
White Noise • The primary spectral characteristic of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems. • A thermal noise source emanates an equal amount of noise power per unit bandwidth at all frequencies, from dc to about 10^12 Hz. • Therefore, a simple model for thermal noise assumes that its power spectral density Gn(f) is flat for all frequencies, as shown below.
White Noise • The power spectral density of white noise is: Gn(f) = N0/2 watts/hertz (1.42) where the factor 2 is included to indicate that Gn(f) is a two-sided power spectral density. • When the noise power has such a uniform spectral density we refer to it as white noise (by analogy with white light, which contains equal amounts of all frequencies within the visible band of electromagnetic radiation). • The autocorrelation function of white noise is given by the inverse Fourier transform of the noise power spectral density: Rn(τ) = F⁻¹{Gn(f)} = (N0/2) δ(τ) (1.43)
White Noise • Thus the autocorrelation of white noise is a delta function weighted by the factor N0/2 and occurring at τ = 0, as shown in the figure below. • Note that Rn(τ) is zero for τ ≠ 0. • The average power Pn of white noise is infinite because its bandwidth is infinite: Pn = ∫_{−∞}^{∞} (N0/2) df = ∞
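A discrete-time sketch of Eq. (1.43), with N0 an arbitrary illustration value: white noise samples of variance N0/2 have a sample autocorrelation near N0/2 at lag 0 and near 0 at every other lag.

```python
import numpy as np

N0 = 2.0
rng = np.random.default_rng(4)
w = rng.normal(0.0, np.sqrt(N0 / 2), size=1_000_000)

for lag in (0, 1, 5, 50):
    R = np.mean(w[: len(w) - lag] * w[lag:])
    print(lag, round(R, 4))  # ~1.0 (= N0/2) at lag 0, ~0.0 elsewhere
```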
White Noise • The effect on the detection process of a channel with additive white Gaussian noise (AWGN) is that the noise affects each transmitted symbol independently. • Such a channel is called a memoryless channel. • The term "additive" means that the noise is simply superimposed or added to the signal.
Signal Transmission through Linear Systems • A system can be characterized equally well in the time domain or the frequency domain; techniques will be developed in both domains. • The system is assumed to be linear and time invariant. • It is also assumed that there is no stored energy in the system at the time the input is applied.
Impulse Response • The linear time-invariant system or network is characterized in the time domain by an impulse response h(t), its response to an input unit impulse δ(t): h(t) = y(t) when x(t) = δ(t) (1.45) • The response of the network to an arbitrary input signal x(t) is found by the convolution of x(t) with h(t): y(t) = x(t) * h(t) = ∫_{−∞}^{∞} x(τ) h(t − τ) dτ (1.46) • The system is assumed to be causal, which means that there can be no output prior to the time t = 0 when the input is applied. • The convolution integral can then be expressed as: y(t) = ∫_0^∞ x(t − τ) h(τ) dτ (1.47a)
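A minimal discrete-time sketch of Eqs. (1.45)-(1.46); the 3-tap h here is an arbitrary example, not a system from the text.

```python
import numpy as np

h = np.array([0.5, 0.3, 0.2])           # impulse response (causal)
x = np.array([1.0, 0.0, 0.0, 2.0])      # arbitrary input signal

y = np.convolve(x, h)                   # y(t) = x(t) * h(t), Eq. (1.46)
print(y)                                # [0.5 0.3 0.2 1.  0.6 0.4]

# A unit impulse input returns h itself, which is Eq. (1.45):
print(np.convolve(np.array([1.0]), h))  # [0.5 0.3 0.2]
```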
Frequency Transfer Function • The frequency-domain output signal Y(f) is obtained by taking the Fourier transform of both sides of Eq. (1.46): Y(f) = X(f) H(f) (1.48) • The frequency transfer function or frequency response is defined as: H(f) = Y(f)/X(f) (1.49) H(f) = |H(f)| e^{jθ(f)} (1.50) • The phase response is defined as: θ(f) = tan⁻¹[ Im{H(f)} / Re{H(f)} ] (1.51)
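A sketch of Eqs. (1.48)-(1.51), reusing the arbitrary 3-tap system from the previous sketch: H(f) recovered as Y(f)/X(f) from FFTs of the input and output equals the transform of h.

```python
import numpy as np

rng = np.random.default_rng(5)
h = np.array([0.5, 0.3, 0.2])
x = rng.standard_normal(64)
y = np.convolve(x, h)                     # full linear convolution

m = len(y)                                # pad FFTs to the output length
H = np.fft.fft(y) / np.fft.fft(x, m)      # H(f) = Y(f)/X(f), Eq. (1.49)

print(np.allclose(H, np.fft.fft(h, m)))   # True: equals the FFT of h
print(np.abs(H[1]), np.angle(H[1]))       # |H(f)| and theta(f) at one bin
```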
Random Processes and Linear Systems • If a random process forms the input to a time-invariant linear system, the output will also be a random process. • The input power spectral density GX(f) and the output power spectral density GY(f) are related as: GY(f) = |H(f)|² GX(f) (1.53)
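A quick numeric check of Eq. (1.53), illustration only: white noise passed through an arbitrary FIR system, with Welch PSD estimates of input and output compared through |H(f)|².

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
x = rng.standard_normal(2 ** 18)         # input: white noise
h = np.array([0.5, 0.3, 0.2])
y = np.convolve(x, h, mode="same")       # output random process

f, Gx = signal.welch(x, fs=1.0, nperseg=4096)
_, Gy = signal.welch(y, fs=1.0, nperseg=4096)
_, H = signal.freqz(h, worN=f, fs=1.0)   # H(f) of the FIR system

print(np.max(np.abs(Gy - np.abs(H) ** 2 * Gx)))  # ~0 (estimation error)
```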
Distortionless Transmission • What is the required behavior of an ideal transmission line? • The output signal from an ideal transmission line may have some time delay and a different amplitude than the input. • It must have no distortion; it must have the same shape as the input. • For ideal distortionless transmission: output signal in the time domain: y(t) = K x(t − t0) (1.54) output signal in the frequency domain: Y(f) = K X(f) e^{−j2πf t0} (1.55) system transfer function: H(f) = K e^{−j2πf t0} (1.56)
What is the required behavior of an ideal transmission line? • The overall system response must have a constant magnitude response. • The phase shift must be linear with frequency. • All of the signal's frequency components must arrive with identical time delay in order to add up correctly. • The time delay t0 is related to the phase shift θ and the radian frequency ω = 2πf by: t0 (seconds) = θ (radians) / 2πf (radians/second) (1.57a) • Another characteristic often used to measure delay distortion of a signal is called envelope delay or group delay: τ(f) = −(1/2π) dθ(f)/df (1.57b)
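A numeric sketch of Eqs. (1.57a)-(1.57b); the sample rate and delay are arbitrary choices. For a pure delay of t0 seconds, θ(f) = −2πf·t0, so (up to the sign convention) both θ/(2πf) and the group delay recover t0 at every frequency.

```python
import numpy as np

fs = 1000.0                          # sample rate, Hz
d = 25                               # delay in samples -> t0 = 25 ms
h = np.zeros(64)
h[d] = 1.0                           # impulse response of a pure delay

H = np.fft.rfft(h, 4096)
f = np.fft.rfftfreq(4096, 1.0 / fs)
theta = np.unwrap(np.angle(H))       # linear phase: theta = -2*pi*f*t0

print(-theta[1] / (2 * np.pi * f[1]))            # ~0.025 s, Eq. (1.57a)
print(-np.gradient(theta, f)[10] / (2 * np.pi))  # ~0.025 s, Eq. (1.57b)
```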
Ideal Filters • The transfer function of the ideal low-pass filter with bandwidth Wf = fu hertz can be written as: H(f) = |H(f)| e^{−jθ(f)} (1.58) where |H(f)| = 1 for |f| < fu (1.59) and |H(f)| = 0 for |f| ≥ fu, with e^{−jθ(f)} = e^{−j2πf t0} (1.60) [Figure 1.11(b): Ideal low-pass filter]
Ideal Filters • The impulse response of the ideal low-pass filter, obtained by inverse Fourier transforming its transfer function, is: h(t) = F⁻¹{H(f)} = 2fu sinc[2fu(t − t0)] which is nonzero for t < 0 (noncausal), so the ideal low-pass filter is not realizable.
Ideal Filters • For the ideal band-pass filter transfer function, see Figure 1.11(a). • For the ideal high-pass filter transfer function, see Figure 1.11(c). [Figure 1.11(a): Ideal band-pass filter] [Figure 1.11(c): Ideal high-pass filter]
Realizable Filters • The simplest example of a realizable low-pass filter is the RC filter: H(f) = 1/(1 + j2πfRC) = e^{−jθ(f)} / √(1 + (2πfRC)²) (1.63) [Figure 1.13]
Realizable Filters • Phase characteristic of the RC filter: θ(f) = tan⁻¹(2πfRC) [Figure 1.13]
Realizable Filters • There are several useful approximations to the ideal low-pass filter characteristic; one of these is the Butterworth filter: |Hn(f)| = 1/√(1 + (f/fu)^{2n}), n ≥ 1 (1.65) where fu is the half-power (−3 dB) frequency. • Butterworth filters are popular because they are the best approximation to the ideal, in the sense of maximal flatness in the filter passband.
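A sketch comparing the RC magnitude of Eq. (1.63) (which coincides with n = 1 in Eq. (1.65) when RC = 1/(2πfu)) against higher-order Butterworth responses; fu is an arbitrary illustration value.

```python
import numpy as np

fu = 1000.0                                       # half-power frequency, Hz
f = np.array([500.0, 1000.0, 2000.0])

for n in (1, 4, 10):                              # n = 1 is the RC magnitude
    Hmag = 1.0 / np.sqrt(1.0 + (f / fu) ** (2 * n))
    print(n, Hmag.round(3))
# Every order passes through ~0.707 at fu; larger n is flatter below fu
# (closer to 1) and rolls off faster above it (closer to the ideal filter).
```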
Bandwidth of Digital Data: Baseband versus Bandpass • An easy way to translate the spectrum of a low-pass or baseband signal x(t) to a higher frequency is to multiply or heterodyne the baseband signal with a carrier wave cos(2πfct). • xc(t) is called a double-sideband (DSB) modulated signal: xc(t) = x(t) cos(2πfct) (1.70) • From the frequency shifting theorem: Xc(f) = 1/2 [X(f − fc) + X(f + fc)] (1.71) • Generally the carrier wave frequency is much higher than the bandwidth of the baseband signal, fc >> fm, and therefore WDSB = 2fm.
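A numeric sketch of Eqs. (1.70)-(1.71); all frequencies are arbitrary illustration values. A single tone at fm heterodyned to fc shows spectral components at fc − fm and fc + fm, i.e. WDSB = 2fm.

```python
import numpy as np

fs, fc, fm = 10_000.0, 2000.0, 100.0
t = np.arange(0.0, 1.0, 1.0 / fs)

x = np.cos(2 * np.pi * fm * t)       # baseband "message"
xc = x * np.cos(2 * np.pi * fc * t)  # DSB modulated signal, Eq. (1.70)

X = np.abs(np.fft.rfft(xc)) / len(t)
f = np.fft.rfftfreq(len(t), 1.0 / fs)

print(f[X > 0.1])  # [1900. 2100.]: the spectrum shifted to fc +/- fm
```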
Bandwidth Dilemma • Theorems of communication and information theory are based on the assumption of strictly bandlimited channels. • The mathematical description of a real signal, however, does not permit the signal to be both strictly duration limited and strictly bandlimited.
Bandwidth Dilemma • All bandwidth criteria have in common the attempt to specify a measure of the width, W, of a nonnegative real-valued spectral density defined for all frequencies |f| < ∞. • The single-sided power spectral density for a single heterodyned pulse xc(t) of duration T takes the analytical form: Gx(f) = T [sinc((f − fc)T)]² (1.73)
Different Bandwidth Criteria • (a) Half-power bandwidth. • (b) Equivalent rectangular or noise equivalent bandwidth. • (c) Null-to-null bandwidth. • (d) Fractional power containment bandwidth. • (e) Bounded power spectral density. • (f) Absolute bandwidth.
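A rough sketch (my own illustration, with T = 1 and the spectrum re-centered from fc to 0 for convenience) computing some of these criteria for the sinc²-shaped PSD of Eq. (1.73).

```python
import numpy as np

T = 1.0
f = np.linspace(-100.0, 100.0, 2_000_001)
G = T * np.sinc(f * T) ** 2          # np.sinc(x) = sin(pi x)/(pi x)

# (a) Half-power bandwidth: width over which G >= half its peak value.
half = f[G >= 0.5 * G.max()]
print("half-power:", round(half.max() - half.min(), 3))  # ~0.886/T

# (c) Null-to-null bandwidth: the first nulls sit at f = +/-1/T, so 2/T.
print("null-to-null:", 2.0 / T)

# (d) 99% fractional power containment bandwidth (roughly 20/T here;
# the finite grid truncates a little of the spectral tails).
P = np.cumsum(G) / np.sum(G)         # normalized cumulative power
lo = f[np.searchsorted(P, 0.005)]
hi = f[np.searchsorted(P, 0.995)]
print("99% power:", round(hi - lo, 2))
```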