Course Review for Final, ECE460, Spring 2012
Sampling Theorem: Able to reconstruct any bandlimited signal from its samples if we sample fast enough. If X(f) is band limited with bandwidth W, then it is possible to reconstruct x(t) from samples taken at a rate fs ≥ 2W (the Nyquist rate).
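As a quick numerical illustration of the sampling theorem (not from the slides), here is a minimal Python/NumPy sketch of sinc (ideal) reconstruction from samples taken above the Nyquist rate; the test signal, rates, and variable names are illustrative assumptions:

```python
import numpy as np

fs = 8.0           # sampling rate (Hz), chosen > 2*W
W = 3.0            # signal bandwidth (Hz)
Ts = 1.0 / fs

# Bandlimited test signal: two tones below W
x = lambda t: np.cos(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 2.5 * t)

n = np.arange(-50, 51)          # sample indices
samples = x(n * Ts)             # x[n] = x(n*Ts)

# Sinc interpolation: x_hat(t) = sum_n x[n] * sinc((t - n*Ts)/Ts)
t = np.linspace(-1, 1, 1000)
x_hat = samples @ np.sinc((t[None, :] - n[:, None] * Ts) / Ts)

print("max reconstruction error:", np.max(np.abs(x_hat - x(t))))
```

The residual error comes only from truncating the (infinite) interpolation sum to a finite number of samples.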
Bandpass Signals & Systems • Frequency Domain • Low-pass Equivalents: let x_l(t) = [x(t) + j x̂(t)] e^(-j2π f0 t), giving x(t) = Re{ x_l(t) e^(j2π f0 t) } • To solve, work with the low-pass parameters (easier mathematically), then switch back to bandpass via this relation
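A minimal numerical sketch of forming the low-pass equivalent, assuming Python with SciPy's Hilbert transform; the carrier frequency, sample rate, and component signals below are illustrative choices, not values from the course:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1e4            # sample rate (Hz), illustrative
f0 = 1e3            # carrier / center frequency (Hz)
t = np.arange(0, 0.05, 1 / fs)

# Example bandpass signal: slowly varying in-phase/quadrature components
xc = np.cos(2 * np.pi * 20 * t)          # in-phase component (assumed)
xs = 0.3 * np.sin(2 * np.pi * 35 * t)    # quadrature component (assumed)
x = xc * np.cos(2 * np.pi * f0 * t) - xs * np.sin(2 * np.pi * f0 * t)

# Analytic signal x + j*x_hat via the Hilbert transform,
# then shift down by f0 to get the low-pass equivalent x_l(t)
x_l = hilbert(x) * np.exp(-1j * 2 * np.pi * f0 * t)

# Recover in-phase/quadrature and switch back to bandpass
xc_rec, xs_rec = x_l.real, x_l.imag
x_back = np.real(x_l * np.exp(1j * 2 * np.pi * f0 * t))
print(np.allclose(x_back, x, atol=1e-6))   # back to the original bandpass signal
```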
Analog Modulation • Amplitude Modulation (AM) • Message signal m(t) • Sinusoidal carrier c(t) • AM (DSB) • DSB-SC • SSB: started with a DSB-SC signal and filtered it down to one sideband using an ideal sideband filter
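For concreteness, a small sketch of conventional AM and DSB-SC with coherent demodulation (Python/NumPy); the tone frequencies, modulation index, and the crude moving-average lowpass filter are all illustrative assumptions:

```python
import numpy as np

fs = 5e4                        # sample rate (Hz), illustrative
fc = 5e3                        # carrier frequency (Hz)
fm = 200.0                      # message tone (Hz)
t = np.arange(0, 0.1, 1 / fs)

m = np.cos(2 * np.pi * fm * t)              # message signal m(t)
c = np.cos(2 * np.pi * fc * t)              # carrier c(t)

a = 0.5                                      # modulation index (assumed)
am_conventional = (1 + a * m) * c            # conventional AM (carrier present)
dsb_sc = m * c                               # DSB-SC: carrier suppressed

# Coherent demodulation of DSB-SC: multiply by the carrier, then keep the
# low-frequency term with a crude moving-average lowpass filter
mixed = dsb_sc * c                           # = m/2 + (m/2)cos(4*pi*fc*t)
lp = np.convolve(mixed, np.ones(50) / 50, mode="same")
m_hat = 2 * lp                               # approximately recovers m(t)

print(np.corrcoef(m_hat[500:-500], m[500:-500])[0, 1])   # close to 1
```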
Angle Modulation • Definitions of instantaneous phase and frequency • FM with a sinusoidal message signal
Combinatorics • Sampling with replacement and with ordering: n^k ways • Sampling without replacement and with ordering: n!/(n-k)! ways • Sampling without replacement and without ordering: C(n,k) = n!/(k!(n-k)!) ways • Sampling with replacement and without ordering: C(n+k-1, k) ways • Bernoulli Trials • Conditional Probabilities
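The four counting rules can be checked directly with Python's math module (the values of n and k below are arbitrary examples):

```python
from math import comb, perm

n, k = 5, 3   # n distinct items, choose k (illustrative numbers)

with_repl_ordered      = n ** k                 # sequences with replacement
without_repl_ordered   = perm(n, k)             # n!/(n-k)!, ordered, no replacement
without_repl_unordered = comb(n, k)             # "n choose k"
with_repl_unordered    = comb(n + k - 1, k)     # multisets of size k

print(with_repl_ordered, without_repl_ordered,
      without_repl_unordered, with_repl_unordered)   # 125 60 10 35
```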
Random Variables • Cumulative Distribution Function (CDF) • Probability Density Function (PDF) • Probability Mass Function (PMF) • Key Distributions • Bernoulli Random Variable • Uniform Random Variable • Gaussian (Normal) Random Variable
Functions of a Random Variable • General case Y = g(X) • Statistical Averages • Mean • Variance
Multiple Random Variables • Joint CDF of X and Y • Joint PDF of X and Y • Conditional PDF of X • Expected Values • Correlation of X and Y • Covariance of X and Y, and the correlation coefficient ρX,Y = Cov(X,Y)/(σX σY) • Independence of X and Y
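A short Monte Carlo check of covariance and the correlation coefficient (Python/NumPy); the linear construction of Y from X is simply an illustrative way to obtain correlated variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated variables: Y = 0.8*X + independent noise (illustrative)
x = rng.normal(0.0, 1.0, 100_000)
y = 0.8 * x + 0.6 * rng.normal(0.0, 1.0, 100_000)

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))    # Cov(X, Y)
rho    = cov_xy / (x.std() * y.std())                # correlation coefficient

print(rho, np.corrcoef(x, y)[0, 1])                  # both close to 0.8
```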
Jointly Gaussian R.V.'s • X and Y are jointly Gaussian if their joint PDF is the bivariate Gaussian density (equivalently, every linear combination aX + bY is Gaussian) • Matrix form of the joint PDF • Density function
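A sketch of drawing jointly Gaussian samples from a chosen mean vector and covariance matrix and checking that a linear combination stays Gaussian (Python/NumPy; the particular covariance values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

mean = np.array([0.0, 0.0])
cov  = np.array([[1.0, 0.7],      # Var(X)=1, Cov(X,Y)=0.7
                 [0.7, 2.0]])     # Var(Y)=2

samples = rng.multivariate_normal(mean, cov, size=200_000)
print(np.cov(samples.T))          # sample covariance ~ cov

# Any linear combination of jointly Gaussian r.v.'s is Gaussian:
z = 3 * samples[:, 0] - samples[:, 1]
print(z.mean(), z.var())          # ~0 and ~ 9(1) + 2 - 6(0.7) = 6.8
```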
Random Processes • Notation • Understand integration across time or across the ensemble • Mean • Autocorrelation • Auto-covariance • Power Spectral Density • Stationary Processes • Strict Sense Stationary • Wide-Sense Stationary (WSS) • Cyclostationary • Ergodic
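As an illustration of time-average estimates for a WSS (and assumed ergodic) process, the sketch below filters white Gaussian noise and estimates the mean, a few autocorrelation lags, and the PSD via Welch's method; the filter and record length are arbitrary choices, not from the slides:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)

# A WSS process: white Gaussian noise through a lowpass filter (illustrative)
b, a = signal.butter(4, 0.2)
x = signal.lfilter(b, a, rng.normal(0.0, 1.0, 200_000))

mean_est = x.mean()                                   # time-average mean

# Autocorrelation estimate R_X(tau) for a few lags (relies on ergodicity)
lags = np.arange(0, 6)
R = np.array([np.mean(x[:len(x) - L] * x[L:]) for L in lags])

# Power spectral density estimate via Welch's method
f, Sx = signal.welch(x, fs=1.0, nperseg=1024)

print(mean_est, R)
```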
Transfer Through a Linear System • Mean of Y(t) where X(t) is WSS • Cross-correlation function RXY(t1,t2) • Autocorrelation function RY(t1,t2) • Spectral Analysis: SY(f) = |H(f)|² SX(f)
Energy & Power Processes • Energy and power of a sample function • Viewed across the ensemble, the energy and power are random variables • The energy and power content of the random process are their expected values
Zero-Mean White Gaussian Noise • A zero-mean white Gaussian noise W(t) is a random process with a flat power spectral density • For any n and any sequence t1, t2, …, tn, the random variables W(t1), W(t2), …, W(tn) are jointly Gaussian with zero mean and covariances Cov(W(ti), W(tj)) = (N0/2) δ(ti - tj)
Bandpass Processes • X(t) is a bandpass process • Filter X(t) using a Hilbert transform to obtain X̂(t), and define the in-phase and quadrature components Xc(t) and Xs(t) • If X(t) is a zero-mean stationary bandpass process, then Xc(t) and Xs(t) will be zero-mean jointly stationary processes
Performance of an Analog System in Noise • Metric: SNR • Power of the message signal m(t) • Noise power
Digital Systems • Discrete Memoryless Source (DMS) completely defined by its alphabet and its probability mass function • Self-information: base-2 logarithm gives bits (b), natural logarithm gives nats • Entropy: a measure of the average information content per source symbol, measured in b/symbol • For a discrete source the entropy is bounded: 0 ≤ H(X) ≤ log2 N for an alphabet of size N • Joint entropy of two discrete random variables (X, Y) • Conditional entropy of the random variable X given Y • Relationships among these quantities
Mutual Information • Mutual information denotes the amount of uncertainty about X that has been removed by revealing the random variable Y • If H(X) is the uncertainty of the channel input before the channel output is observed, and H(X|Y) is the uncertainty of the channel input after the channel output is observed, then I(X;Y) = H(X) - H(X|Y) is the uncertainty about the channel input that is resolved by observing the channel output
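A small numeric example tying entropy, conditional entropy, and mutual information together (Python/NumPy); the 2x2 joint PMF is made up for illustration:

```python
import numpy as np

# Joint PMF of (X, Y) over a 2x2 alphabet (illustrative numbers)
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

def H(p):
    """Entropy in bits of a PMF given as an array of probabilities."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())
H_X_given_Y = H_XY - H_Y            # chain rule: H(X,Y) = H(Y) + H(X|Y)
I_XY = H_X - H_X_given_Y            # mutual information

print(H_X, H_X_given_Y, I_XY)
```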
Source Coding • Viable source codes • Uniquely decodable property • Prefix-free codes are instantaneously decodable • Theorem: a source with entropy H can be encoded with arbitrarily small error probability at any rate R (bits/source output) as long as R > H; conversely, if R < H, the error probability will be bounded away from zero, independent of the complexity of the encoder and decoder employed • Average code word length per source symbol • Huffman Coding
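A minimal Huffman coding sketch in Python, assuming a small made-up source PMF; it builds the code with a heap and checks that the average code word length comes out close to the source entropy:

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for a dict {symbol: probability}.
    Returns {symbol: bit string}.  Minimal sketch, not optimized."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial code})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}   # illustrative DMS
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code, avg_len)   # average length (1.75 b) >= entropy (~1.74 b/symbol)
```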
Quantization • Quantization function Q(x) • Squared-error distortion for a single measurement • Distortion D for the source, since X is a random variable • In general, a distortion measure is a distance between X and its reproduction X̂ = Q(X) • Hamming distortion
Rate Distortion • The minimum number of bits/source output required to reproduce a memoryless source with distortion less than or equal to D is called the rate-distortion function, denoted by R(D) • For a binary memoryless source with P(X = 1) = p and Hamming distortion, the rate-distortion function is R(D) = Hb(p) - Hb(D) for 0 ≤ D ≤ min{p, 1 - p}, and 0 otherwise • For a zero-mean Gaussian source with variance σ², R(D) = (1/2) log2(σ²/D) for 0 < D ≤ σ², and 0 otherwise
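The two rate-distortion formulas above can be evaluated directly; a short Python sketch (the distortion values and source parameters are illustrative):

```python
import numpy as np

def Hb(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def R_binary(D, p=0.5):
    """Rate-distortion function, binary source with Hamming distortion."""
    return Hb(p) - Hb(D) if 0 <= D <= min(p, 1 - p) else 0.0

def R_gaussian(D, var=1.0):
    """Rate-distortion function, zero-mean Gaussian source, squared error."""
    return 0.5 * np.log2(var / D) if 0 < D <= var else 0.0

print(R_binary(0.1), R_gaussian(0.25))   # ~0.531 b and 1.0 b per source output
```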
Geometric Representation • Gram-Schmidt Orthogonalization • Begin with the first waveform s1(t), with energy ξ1, and normalize it to obtain ψ1(t) • Second waveform: determine its projection c21 onto ψ1, subtract the projection from s2(t), then normalize the remainder • Repeat for the remaining waveforms
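A numerical Gram-Schmidt sketch over sampled waveforms, assuming a simple Riemann-sum inner product; the three example waveforms are made up, with the third chosen linearly dependent to show that it contributes no new basis function:

```python
import numpy as np

def gram_schmidt(signals, dt):
    """Orthonormalize a list of sampled waveforms (arrays) under the
    inner product <x, y> = sum(x*y)*dt.  Returns the basis functions."""
    basis = []
    for s in signals:
        residual = s.astype(float).copy()
        for psi in basis:
            c = np.sum(residual * psi) * dt      # projection coefficient
            residual -= c * psi
        energy = np.sum(residual ** 2) * dt
        if energy > 1e-12:                        # skip dependent waveforms
            basis.append(residual / np.sqrt(energy))
    return basis

dt = 1e-3
t = np.arange(0, 1, dt)
s1 = np.ones_like(t)                      # illustrative waveforms
s2 = np.where(t < 0.5, 1.0, -1.0)
s3 = s1 + s2                              # linearly dependent combination

basis = gram_schmidt([s1, s2, s3], dt)
print(len(basis))                         # 2 orthonormal basis functions
print(np.sum(basis[0] * basis[1]) * dt)   # ~0 (orthogonal)
```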
Pulse Amplitude Modulation: Bandpass Signals • What type of amplitude modulation signal does this appear to be?
PAM Signals: Geometric Representation • M-ary PAM waveforms are one-dimensional • For bandpass PAM, d is the Euclidean distance between two adjacent signal points (figure: 1-D constellation with adjacent points spaced d apart)
Optimum Receivers • Start with the transmission of any one of the M-ary signal waveforms • Demodulators • Correlation-Type • Matched-Filter-Type • Optimum Detector • Special Cases (Demodulation and Detection) • Carrier-Amplitude Modulated Signals • Carrier-Phase Modulated Signals • Quadrature Amplitude Modulated Signals • Frequency-Modulated Signals (block diagram: demodulator, sampler, detector, output decision)
Demodulators: Correlation-Type • Next, obtain the joint conditional PDF of the correlator outputs
Demodulators: Matched-Filter Type • Instead of using a bank of correlators to generate {rk}, use a bank of N linear filters • The matched filter's key property: if a signal s(t) is corrupted by AWGN, the filter with impulse response matched to s(t) maximizes the output SNR
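A discrete-time sketch of the matched-filter property (Python/NumPy): the filter is the time-reversed pulse, and sampling its output at the right instant gives output SNR Es/σ²; the pulse shape and noise variance below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Discrete-time sketch: known pulse s[n] received in white Gaussian noise
N = 64
n = np.arange(N)
s = np.sin(np.pi * n / N) ** 2           # illustrative pulse shape
noise_var = 0.5
r = s + rng.normal(0.0, np.sqrt(noise_var), N)

# Matched filter: impulse response is the time-reversed pulse, h[n] = s[N-1-n]
h = s[::-1]
y = np.convolve(r, h)                     # y[N-1] equals sum_k r[k]*s[k]

Es = np.sum(s ** 2)                       # pulse energy
print("matched-filter output at the sampling instant:", y[N - 1])
print("theoretical output SNR Es/sigma^2:", Es / noise_var)
```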
Optimum Detector • Maximum a Posteriori Probability (MAP) criterion • If the a priori probabilities are equal, i.e., P(sm) = 1/M for all m, and since the denominator is a constant over all m, this reduces to maximizing the likelihood p(r|sm), called the maximum-likelihood (ML) criterion.
Probability of Error: Binary PAM Baseband Signals • Consider binary (antipodal) PAM baseband signals built from an arbitrary pulse which is nonzero in one bit interval and zero elsewhere; geometrically, the two signals are a pair of points placed symmetrically about 0 on a single axis • Assumption: the signals are equally likely and s1 was transmitted; the received signal is then the transmitted signal point plus Gaussian noise • Decision rule: compare r to the threshold at 0 • The two conditional PDFs of r are Gaussians centered at the two signal points
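A quick Monte Carlo check of the binary antipodal PAM error probability against the standard Q-function expression Pe = Q(sqrt(2Eb/N0)), assuming Python with NumPy/SciPy; the Eb and Eb/N0 values are arbitrary:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(4)

def Q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / np.sqrt(2))

Eb = 1.0                       # energy per bit (illustrative)
EbN0_dB = 6.0
N0 = Eb / (10 ** (EbN0_dB / 10))

# Antipodal signaling: points at +/- sqrt(Eb), noise variance N0/2
bits = rng.integers(0, 2, 1_000_000)
tx = np.where(bits == 1, np.sqrt(Eb), -np.sqrt(Eb))
r = tx + rng.normal(0.0, np.sqrt(N0 / 2), bits.size)
bits_hat = (r > 0).astype(int)

ber_sim = np.mean(bits_hat != bits)
ber_theory = Q(np.sqrt(2 * Eb / N0))     # Pe for binary antipodal PAM
print(ber_sim, ber_theory)
```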
Probability of Error: M-ary PAM Baseband Signals • Recall that baseband M-ary PAM signals are geometrically represented in 1-D with signal points placed symmetrically about the origin • The distance between adjacent signal points is d • Each signal has a different energy; the error probability is expressed in terms of the average energy