
Outline

This article provides an overview of communication systems, covering topics such as signals, random variables, random processes, and spectra. It discusses analog modulation, analog to digital conversion, digital transmission, signal space representation, optimal receivers, digital modulation techniques, channel coding, synchronization, and information theory.

Presentation Transcript


  1. Outline • Introduction • Signal, random variable, random process and spectra • Analog modulation • Analog to digital conversion • Digital transmission through baseband channels • Signal space representation • Optimal receivers • Digital modulation techniques • Channel coding • Synchronization • Information theory

  2. Signal, random variable, random process and spectra

  3. Signal, random variable, random process and spectra • Signals • Review of probability and random variables • Random processes: basic concepts • Gaussian and white processes • Selected from Chapters 2.1-2.6 and 5.1-5.3

  4. Signal • In communication systems, a signal is any function that carries information; it is also called an information-bearing signal.

  5. Signal • Continuous-time signal vs. discrete-time signal • Continuous-valued signal vs. discrete-valued signal • Continuous-time and continuous-valued: analog signal • Discrete-time and discrete-valued: digital signal • Discrete-time and continuous-valued: sampled signal • Continuous-time and discrete-valued: quantized signal

  6. Signal

  7. Signal • Energy vs. power signal • Energy: $E = \int_{-\infty}^{\infty} |x(t)|^2 \, dt$ • Power: $P = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} |x(t)|^2 \, dt$ • A signal is an energy signal iff its energy is finite ($0 < E < \infty$) • A signal is a power signal iff its power is finite and nonzero ($0 < P < \infty$)
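
As a rough numerical illustration of this distinction, here is a minimal Python sketch (assuming NumPy; the rectangular pulse and the sinusoid are arbitrary example signals, not taken from the slides):

```python
import numpy as np

# Minimal sketch of the energy/power distinction (example signals, not from the slides).
dt = 1e-4
t = np.arange(-50, 50, dt)

# Finite-duration rectangular pulse: finite energy, vanishing average power.
pulse = np.where(np.abs(t) <= 1.0, 1.0, 0.0)
energy = np.sum(np.abs(pulse) ** 2) * dt        # E = integral |x(t)|^2 dt  -> 2
power = energy / (t[-1] - t[0])                 # small, and -> 0 as the window grows

# Sinusoid: infinite energy, finite average power A^2 / 2.
A, f0 = 2.0, 5.0
sine = A * np.cos(2 * np.pi * f0 * t)
sine_power = np.mean(np.abs(sine) ** 2)         # -> A^2 / 2 = 2

print(f"pulse energy ~ {energy:.3f}  (energy signal)")
print(f"pulse power  ~ {power:.5f}  (tends to 0 for a growing window)")
print(f"sine power   ~ {sine_power:.3f}  (power signal, A^2/2)")
```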

  8. Signal • Fourier transform: $X(f) = \int_{-\infty}^{\infty} x(t) e^{-j2\pi f t} \, dt$, with inverse $x(t) = \int_{-\infty}^{\infty} X(f) e^{j2\pi f t} \, df$

  9. Random variable • Review of probability and random variables • Two events A and B • Conditional probability P(A|B) • Joint probability P(AB) = P(A)P(B|A) = P(B)P(A|B) • A and B are independent iff P(AB) = P(A)P(B) • Total probability: let $A_1, A_2, \dots, A_n$ be mutually exclusive events with $\bigcup_{i=1}^{n} A_i = \Omega$; then for any event B, we have $P(B) = \sum_{i=1}^{n} P(A_i) P(B|A_i)$

  10. Random variable • Review of probability and random variables • Bayes' rule: let $A_1, A_2, \dots, A_n$ be mutually exclusive such that $\bigcup_{i=1}^{n} A_i = \Omega$; for any nonzero-probability event B, we have $P(A_i|B) = \dfrac{P(A_i)\,P(B|A_i)}{\sum_{j=1}^{n} P(A_j)\,P(B|A_j)}$

  11. Random variable • Review of probability and random variables Consider a binary communication system
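
The slide's channel diagram and numbers were not preserved in the transcript, so the sketch below assumes an illustrative binary channel (prior P(0 sent) = 0.6, crossover probabilities 0.1 and 0.2) and simply applies the total-probability and Bayes formulas from the previous two slides:

```python
# Hypothetical binary channel; the prior and crossover probabilities below are
# assumed for illustration and are not the values used on the slide.
p0, p1 = 0.6, 0.4          # P(0 sent), P(1 sent)
p_r1_s0 = 0.1              # P(receive 1 | sent 0)
p_r0_s1 = 0.2              # P(receive 0 | sent 1)

# Total probability: P(receive 1) = P(1|0)P(0) + P(1|1)P(1)
p_r1 = p_r1_s0 * p0 + (1 - p_r0_s1) * p1

# Bayes' rule: P(sent 0 | receive 1)
p_s0_r1 = p_r1_s0 * p0 / p_r1

print(f"P(receive 1)          = {p_r1:.3f}")    # 0.380
print(f"P(sent 0 | receive 1) = {p_s0_r1:.3f}") # 0.158
```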

  12. Random variable • Review of probability and random variables • A random variable is a mapping from the sample space to the set of real numbers • Discrete r.v.: range is finite (e.g., {0,1}) or countably infinite (e.g., {0,1,…}) • Continuous r.v.: range is uncountably infinite (e.g., the real numbers)

  13. Random variable • Review of probability and random variables • The cumulative distribution function (CDF) of a r.v. X is $F_X(x) = P(X \le x)$ • The key properties of the CDF: $F_X$ is non-decreasing, $\lim_{x\to-\infty} F_X(x) = 0$, and $\lim_{x\to+\infty} F_X(x) = 1$

  14. Random variable • Review of probability and random variables • The probability density function (PDF) of a r.v. X is $f_X(x) = \dfrac{dF_X(x)}{dx}$ • The key properties of the PDF: $f_X(x) \ge 0$, $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$, and $P(a < X \le b) = \int_a^b f_X(x)\,dx$

  15. Random variable • Review of probability and random variables • Common random variables: Bernoulli, binomial, uniform, and Gaussian • Bernoulli distribution: $P(X=1) = p$, $P(X=0) = 1-p$ • Binomial distribution: the sum of n independent Bernoulli r.v.s, with $P(X=k) = \binom{n}{k} p^k (1-p)^{n-k}$ for $k = 0, 1, \dots, n$

  16. Random variable • Review of probability and random variables • Suppose that we transmit a 31-bit sequence with error-correction capability of up to 3 bit errors • If the probability of a bit error is p = 0.001, the probability that the received sequence is in error is $1 - \sum_{k=0}^{3} \binom{31}{k} p^k (1-p)^{31-k} \approx 3\times 10^{-8}$ • On the other hand, if no error correction is used, the error probability is $1 - (1-p)^{31} \approx 0.03$
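
A quick check of these two probabilities, as a sketch using only the Python standard library:

```python
from math import comb

n, p, t = 31, 0.001, 3     # block length, bit-error probability, correctable errors

# With correction of up to t errors: the block fails iff more than t bit errors occur.
p_ok = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))
p_block_err_coded = 1 - p_ok

# Without correction: the block fails iff at least one bit error occurs.
p_block_err_uncoded = 1 - (1 - p)**n

print(f"with 3-error correction: {p_block_err_coded:.2e}")   # ~3e-08
print(f"without correction     : {p_block_err_uncoded:.2e}") # ~3.1e-02
```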

  17. Random variable • Review of probability and random variables • Uniform distribution: $f_X(x) = \frac{1}{b-a}$ for $a \le x \le b$, and 0 otherwise • The random phase of a sinusoid is often modeled as a uniform r.v. between 0 and $2\pi$ • The mean or expected value of X, $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$, is the first moment of X

  18. Random variable • Review of probability and random variables • The n-th moment of X is $E[X^n]$ • If n = 2, we have the mean-squared value of X, $E[X^2]$ • The n-th central moment is $E[(X - E[X])^n]$ • If n = 2, we have the variance of X, $\sigma_X^2 = E[(X - E[X])^2]$ • $\sigma_X$ is called the standard deviation

  19. Random variable • Review of probability and random variables • Gaussian distribution: $f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-m)^2}{2\sigma^2}\right)$ • A Gaussian r.v. is completely determined by its mean and variance, and hence is usually denoted as $\mathcal{N}(m, \sigma^2)$ • The most important distribution in communications!

  20. Random variable • Review of probability and random variables • The Q-function is a standard form for expressing error probabilities that have no closed form: $Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}} e^{-t^2/2}\,dt$ • The Q-function is the area under the tail of a Gaussian PDF with mean 0 and variance 1 • Extremely important in error probability analysis!

  21. Random variable • Review of probability and random variables • Some key properties of the Q-function • Q(x) is monotonically decreasing, with $Q(0) = 1/2$ and $Q(-x) = 1 - Q(x)$ • Craig's alternative form: $Q(x) = \frac{1}{\pi} \int_0^{\pi/2} \exp\!\left(-\frac{x^2}{2\sin^2\theta}\right) d\theta$ for $x \ge 0$ • Upper bound: $Q(x) \le \frac{1}{2} e^{-x^2/2}$ • For a Gaussian variable $X \sim \mathcal{N}(m, \sigma^2)$: $P(X > x) = Q\!\left(\frac{x-m}{\sigma}\right)$
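
Since Q(x) has no closed form, it is commonly evaluated through the complementary error function. The sketch below (assuming NumPy and SciPy are available) computes Q(x) this way and numerically checks Craig's form, the exponential upper bound, and the tail formula for a general Gaussian:

```python
import numpy as np
from scipy.special import erfc
from scipy.integrate import quad

def Q(x):
    """Gaussian tail probability via the complementary error function."""
    return 0.5 * erfc(x / np.sqrt(2))

def Q_craig(x):
    """Craig's form: (1/pi) * integral_0^{pi/2} exp(-x^2 / (2 sin^2 theta)) dtheta, x >= 0."""
    val, _ = quad(lambda th: np.exp(-x**2 / (2 * np.sin(th)**2)), 0, np.pi / 2)
    return val / np.pi

for x in (0.5, 1.0, 2.0, 3.0):
    print(f"x={x}: Q={Q(x):.6f}  Craig={Q_craig(x):.6f}  bound={0.5 * np.exp(-x**2 / 2):.6f}")

# Tail of a general Gaussian N(m, sigma^2): P(X > x) = Q((x - m) / sigma)
m, sigma, x = 1.0, 2.0, 4.0
print(f"P(X > {x}) for N({m}, {sigma**2}) = {Q((x - m) / sigma):.6f}")
```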

  22. Random variable • Review of probability and random variables • The joint distribution of two r.v.s X and Y is $F_{XY}(x, y) = P(X \le x, Y \le y)$ • And the joint PDF is $f_{XY}(x, y) = \dfrac{\partial^2 F_{XY}(x, y)}{\partial x\,\partial y}$

  23. Random variable • Review of probability and random variables • Marginal distribution: $F_X(x) = F_{XY}(x, \infty)$ • Marginal density: $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy$ • X and Y are said to be independent iff $f_{XY}(x, y) = f_X(x) f_Y(y)$

  24. Random variable • Review of probability and random variables • The correlation of the two r.v.s X and Y is $E[XY]$ • The correlation of the two centered r.v.s $X - E[X]$ and $Y - E[Y]$ is called the covariance of X and Y: $\mathrm{Cov}(X, Y) = E\big[(X - E[X])(Y - E[Y])\big]$ • If $\mathrm{Cov}(X, Y) = 0$, then X and Y are called uncorrelated.

  25. Random variable • Review of probability and random variables • The covariance of X and Y normalized w.r.t. $\sigma_X \sigma_Y$ is referred to as the correlation coefficient of X and Y: $\rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$ • If X and Y are independent, then they are uncorrelated. • The converse is not true, except for jointly Gaussian r.v.s (see the sketch below).
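
A small Monte Carlo sketch of the last point (assuming NumPy): with X ~ N(0, 1) and Y = X^2, the pair is uncorrelated, since E[XY] = E[X^3] = 0, yet clearly dependent. This choice of X and Y is an illustrative example, not one taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x**2                                          # a deterministic function of x, hence dependent

cov = np.mean(x * y) - np.mean(x) * np.mean(y)    # Cov(X, Y) = E[XY] - E[X]E[Y]
rho = cov / (np.std(x) * np.std(y))               # correlation coefficient

print(f"correlation coefficient ~ {rho:+.4f}  (close to 0: uncorrelated)")
print(f"E[Y | |X| > 2] ~ {y[np.abs(x) > 2].mean():.2f} vs E[Y] ~ {y.mean():.2f}  (clearly dependent)")
```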

  26. Random variable • Review of probability and random variables • Functions of a random variable: how to obtain the PDF of the r.v. $Y = g(X)$? • Two steps: • Calculate the CDF of Y through $F_Y(y) = P\big(g(X) \le y\big)$ • Take the derivative of the CDF: $f_Y(y) = \dfrac{dF_Y(y)}{dy}$ • For functions of several r.v.s, the same approach generally applies (a worked instance follows below)
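
As a concrete instance of the two-step method (an illustrative example, not one taken from the slide): for Y = cos(Θ) with Θ uniform on [0, 2π), the CDF is F_Y(y) = 1 - arccos(y)/π, and differentiating gives f_Y(y) = 1/(π·sqrt(1 - y^2)) for |y| < 1. The sketch below (assuming NumPy) checks this analytic PDF against a histogram of simulated samples:

```python
import numpy as np

# Y = cos(Theta), Theta ~ Uniform[0, 2*pi)  =>  f_Y(y) = 1 / (pi * sqrt(1 - y^2)), |y| < 1.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 1_000_000)
y = np.cos(theta)

hist, edges = np.histogram(y, bins=50, range=(-0.95, 0.95), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
analytic = 1.0 / (np.pi * np.sqrt(1 - centers**2))

print(f"max |empirical - analytic| PDF error over the bins: {np.max(np.abs(hist - analytic)):.3f}")
```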

  27. Random variable • Review of probability and random variables • $X_1, \dots, X_n$ are jointly Gaussian iff their joint PDF is $f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} |\mathbf{C}|^{1/2}} \exp\!\left(-\frac{1}{2} (\mathbf{x}-\mathbf{m})^{T} \mathbf{C}^{-1} (\mathbf{x}-\mathbf{m})\right)$, where $\mathbf{m}$ is the mean vector and $\mathbf{C}$ the covariance matrix

  28. Random variable • Review of probability and random variables • In the case n = 2, we have $f(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1-\rho^2}} \exp\!\left(-\frac{1}{2(1-\rho^2)} \left[\frac{(x_1-m_1)^2}{\sigma_1^2} - \frac{2\rho (x_1-m_1)(x_2-m_2)}{\sigma_1 \sigma_2} + \frac{(x_2-m_2)^2}{\sigma_2^2}\right]\right)$

  29. Random variable • Review of probability and random variables • And if X1 and X2 are uncorrelated, i.e., $\rho = 0$, then $f(x_1, x_2) = f(x_1)\,f(x_2)$: if X1 and X2 are Gaussian and uncorrelated, they are independent.

  30. Random variable • Review of probability and random variables • If n random variables are jointly Gaussian, then any set of them is also jointly Gaussian. • Jointly Gaussian r.v.s are completely characterized by mean vector and the covariance matrix. • Any linear combination of the n r.v.s is Gaussian.

  31. Random variable • Review of probability and random variables • Law of large numbers: consider a sequence of r.v.s $X_1, X_2, \dots, X_n$ and let $Y = \frac{1}{n}\sum_{i=1}^{n} X_i$; if the r.v.s are uncorrelated with the same mean $m_X$ and variance $\sigma_X^2 < \infty$, then $Y \to m_X$ in probability as $n \to \infty$ • The average r.v. converges to the mean value

  32. Random variable • Review of probability and random variables • Central limit theorem: if $X_1, X_2, \dots, X_n$ are i.i.d. r.v.s with the same mean $m_X$ and variance $\sigma_X^2$, then $\frac{1}{\sqrt{n}} \sum_{i=1}^{n} \frac{X_i - m_X}{\sigma_X}$ converges in distribution to $\mathcal{N}(0, 1)$ as $n \to \infty$ • Thermal noise, which results from the random movements of many electrons, is well modeled by a Gaussian distribution.

  33. Random variable • Review of probability and random variables • Consider for example a sequence of uniformly distributed r.v.s with mean 0 and variance 1/12 (figure: PDF of the normalized sum for n = 1, 2, 4, 8, plotted against the Gaussian PDF, one curve solid and the other dashed)
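
A simulation sketch of this slide (assuming NumPy; the figure is replaced by a coarse histogram comparison against the standard Gaussian PDF):

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 200_000

for n in (1, 2, 4, 8):
    # Normalized sum of n i.i.d. Uniform(-0.5, 0.5) r.v.s (each with mean 0, variance 1/12).
    u = rng.uniform(-0.5, 0.5, size=(trials, n))
    z = u.sum(axis=1) / np.sqrt(n / 12.0)          # zero mean, unit variance

    # Compare the empirical density with the standard Gaussian PDF on a coarse grid.
    hist, edges = np.histogram(z, bins=40, range=(-4, 4), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    gauss = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
    print(f"n={n}: max |empirical - Gaussian| = {np.max(np.abs(hist - gauss)):.3f}")
```

The reported deviation shrinks as n grows, which is the point of the figure.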

  34. Random variable • Review of probability and random variables • Rayleigh distribution: $f_R(r) = \frac{r}{\sigma^2} e^{-r^2/(2\sigma^2)}$ for $r \ge 0$ • The Rayleigh distribution is usually used to model fading for non-line-of-sight (NLOS) signal transmissions • A very important distribution for analysis in mobile and wireless communications.

  35. Random variable • Review of probability and random variables • Consider for example h = a + jb, where a and b are i.i.d. Gaussian r.v.s with mean 0 and variance $\sigma^2$; then the magnitude $|h| = \sqrt{a^2 + b^2}$ follows a Rayleigh distribution • and the phase $\angle h$ follows a uniform distribution.
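
A sketch of this example (assuming NumPy, with the per-component variance fixed to sigma^2 = 1 for illustration), checking the Rayleigh magnitude and uniform phase statistics:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0                                   # per-component standard deviation (assumed value)
n = 1_000_000

a = rng.normal(0, sigma, n)
b = rng.normal(0, sigma, n)
h = a + 1j * b                                # complex Gaussian channel coefficient

r = np.abs(h)                                 # Rayleigh-distributed magnitude
phi = np.angle(h)                             # phase, uniform on (-pi, pi]

print(f"E[|h|]   ~ {r.mean():.4f}  (Rayleigh mean sigma*sqrt(pi/2) = {sigma * np.sqrt(np.pi / 2):.4f})")
print(f"E[|h|^2] ~ {(r**2).mean():.4f}  (2*sigma^2 = {2 * sigma**2:.4f})")
print(f"phase    ~ mean {phi.mean():+.3f}, var {phi.var():.3f}  (uniform: 0 and pi^2/3 = {np.pi**2 / 3:.3f})")
```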

  36. Random process and spectra • Random processes: basic concepts • A random process (stochastic process, or random signal) is the evolution of random variables over time. • A random process is a function $X(t, \omega)$ defined over time and the sample space, whose value at a specific time $t_i$ and outcome $\omega_j$ is the sample value $X(t_i, \omega_j)$. • Given $t = t_i$, the $X(t_i, \omega)$ are random variables. • Given $\omega = \omega_j$, $X(t, \omega_j)$ is a sample path function.

  37. Random process and spectra • Random processes: basic concepts • Let $X(t) = A\cos(2\pi f_0 t + \Theta)$ with random phase $\Theta$; for fixed amplitude and frequency, the random process • has different sample path functions for different outcomes (figure: sample paths from the first, second, and third trials)

  38. Random process and spectra • Random processes: basic concepts • The statistics of the random variables at different times: • CDF: $F_X(x_1, \dots, x_n; t_1, \dots, t_n) = P\big(X(t_1) \le x_1, \dots, X(t_n) \le x_n\big)$ • Expectation function: $m_X(t) = E[X(t)]$ • Auto-correlation function: $R_X(t_1, t_2) = E[X(t_1) X(t_2)]$ • Covariance: $C_X(t_1, t_2) = R_X(t_1, t_2) - m_X(t_1)\,m_X(t_2)$

  39. Random process and spectra • Random processes: basic concepts • Consider for example $X(t) = A\cos(2\pi f_0 t + \Theta)$ with $\Theta$ uniformly distributed on $[0, 2\pi)$; we have • Expectation function: $m_X(t) = E[X(t)] = 0$ • Auto-correlation function (assuming A and $f_0$ are deterministic): $R_X(t_1, t_2) = \frac{A^2}{2} \cos\!\big(2\pi f_0 (t_1 - t_2)\big)$
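
Because the slide's symbols were lost in extraction, the sketch below assumes the standard random-phase sinusoid X(t) = A cos(2*pi*f0*t + Theta) with Theta uniform on [0, 2*pi), and estimates the ensemble mean and autocorrelation by averaging over many realizations (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
A, f0 = 1.0, 5.0                               # assumed amplitude and frequency
t1, t2 = 0.13, 0.21                            # two arbitrary time instants
n_paths = 500_000

theta = rng.uniform(0, 2 * np.pi, n_paths)     # one random phase per sample path
x1 = A * np.cos(2 * np.pi * f0 * t1 + theta)
x2 = A * np.cos(2 * np.pi * f0 * t2 + theta)

mean_est = x1.mean()                           # ensemble mean at t1 (theory: 0)
R_est = np.mean(x1 * x2)                       # R_X(t1, t2)
R_theory = 0.5 * A**2 * np.cos(2 * np.pi * f0 * (t1 - t2))

print(f"mean estimate    ~ {mean_est:+.4f}  (theory 0)")
print(f"R_X(t1, t2) est. ~ {R_est:+.4f}  (theory {R_theory:+.4f})")
```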

  40. Random process and spectra • Random processes: Stationary processes • A stochastic process is said to be stationary if, for any n and any time shift $\Delta$, the joint distribution of $\big(X(t_1), \dots, X(t_n)\big)$ equals that of $\big(X(t_1+\Delta), \dots, X(t_n+\Delta)\big)$ • The expectation function is then independent of time • The auto-correlation function then depends only on the time difference

  41. Random process and spectra • Random processes: Stationary processes • A random process is said to be wide-sense stationary (WSS) if $m_X(t) = m_X$ is constant and $R_X(t_1, t_2) = R_X(t_1 - t_2)$ depends only on the time difference • A strictly stationary process satisfies, for any n and $\Delta$, $F_X(x_1, \dots, x_n; t_1, \dots, t_n) = F_X(x_1, \dots, x_n; t_1+\Delta, \dots, t_n+\Delta)$

  42. Random process and spectra • Random processes: Stationary processes • Some properties of WSS processes: if $X(t)$ is WSS, then $R_X(\tau) = R_X(-\tau)$, $|R_X(\tau)| \le R_X(0)$, and $R_X(0) = E[X^2(t)]$ is the average power

  43. Random process and spectra • Random processes: Stationary processes • Statistical (ensemble) averaging: averaging across realizations at a fixed time, e.g., $E[X(t)]$ • Time averaging: averaging along a single sample path, e.g., $\langle x \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$ • If time averaging equals statistical averaging, the process is said to be ergodic: one sample path is statistically representative of the whole ensemble (see the sketch below)
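
The sketch below (assuming NumPy, and reusing the random-phase sinusoid example) illustrates the idea: the time average over one long sample path agrees with the ensemble average over many realizations, both for the mean and for the power, which is what ergodicity requires.

```python
import numpy as np

rng = np.random.default_rng(5)
A, f0, dt = 1.0, 5.0, 1e-3         # assumed example process: A*cos(2*pi*f0*t + Theta)
t = np.arange(0, 200, dt)          # one long observation window

# Time averages over a single sample path (one realization of Theta).
theta0 = rng.uniform(0, 2 * np.pi)
x_single = A * np.cos(2 * np.pi * f0 * t + theta0)
time_avg = x_single.mean()
time_avg_power = np.mean(x_single**2)

# Ensemble averages at one fixed time, across many independent realizations.
thetas = rng.uniform(0, 2 * np.pi, 100_000)
x_ensemble = A * np.cos(2 * np.pi * f0 * t[1234] + thetas)
ens_avg = x_ensemble.mean()
ens_avg_power = np.mean(x_ensemble**2)

print(f"time average     {time_avg:+.4f}, time-average power {time_avg_power:.4f}")
print(f"ensemble average {ens_avg:+.4f}, ensemble power     {ens_avg_power:.4f}  (both ~ 0 and A^2/2)")
```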

  44. Random process and spectra • Random processes: Stationary processes

  45. Random process and spectra • Random processes: Stationary processes • Frequency domain characteristics

  46. Random process and spectra • Random processes: Stationary processes • Given a deterministic signal $x(t)$, the power is $P = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} |x(t)|^2\,dt$ • Truncate $x(t)$ to $x_T(t) = x(t)$ for $|t| \le T$ (and 0 otherwise) to get an energy signal • Parseval's theorem: $\int_{-T}^{T} |x_T(t)|^2\,dt = \int_{-\infty}^{\infty} |X_T(f)|^2\,df$ • Hence $P = \lim_{T\to\infty} \frac{1}{2T} \int_{-\infty}^{\infty} |X_T(f)|^2\,df$

  47. Random process and spectra • Random processes: Stationary processes • Consider $x(t)$ as a sample path function of a random process X(t); the average power of X(t) is $P_X = E\!\left[\lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} |X(t)|^2\,dt\right]$ • Power spectral density of X(t): $S_X(f) = \lim_{T\to\infty} \dfrac{E\big[|X_T(f)|^2\big]}{2T}$

  48. Random process and spectra • Random processes: Stationary processes • Consider a binary semi-random signal • Find its power spectral density

  49. Random process and spectra • Random processes: Stationary processes • Wiener-Khinchin theorem: for a WSS process, the power spectral density is the Fourier transform of the auto-correlation function, $S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j2\pi f \tau}\,d\tau$ • Property: $P_X = R_X(0) = \int_{-\infty}^{\infty} S_X(f)\,df$
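
A numerical sketch of the property P_X = R_X(0) = integral of S_X(f) df (assuming NumPy and SciPy): a WSS process is generated by passing white Gaussian noise through an arbitrary example low-pass filter, and the time-domain power is compared with the integral of a Welch PSD estimate.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
fs = 1000.0                                    # sampling rate (assumed)

# Example WSS process: white Gaussian noise through an illustrative low-pass filter.
w = rng.standard_normal(400_000)
b, a = signal.butter(4, 0.2)                   # arbitrary filter, not from the slides
x = signal.lfilter(b, a, w)

# Power from the time domain: R_X(0) = E[X^2(t)], estimated by a time average.
power_time = np.mean(x**2)

# Power from the frequency domain: integral of the estimated (one-sided) PSD.
f, Pxx = signal.welch(x, fs=fs, nperseg=4096)
power_freq = np.sum(Pxx) * (f[1] - f[0])

print(f"R_X(0) from the time average   ~ {power_time:.4f}")
print(f"integral of S_X(f) df estimate ~ {power_freq:.4f}  (Wiener-Khinchin property)")
```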

  50. Random process and spectra • Random processes: Stationary processes • Consider for instance the random process $X(t) = A\cos(2\pi f_0 t + \Theta)$ with $\Theta$ uniform on $[0, 2\pi)$; we have $R_X(\tau) = \frac{A^2}{2}\cos(2\pi f_0 \tau)$ • Then $S_X(f) = \frac{A^2}{4}\big[\delta(f - f_0) + \delta(f + f_0)\big]$
