
Lecture 13 Communications Channels (Section 4.1)

This lecture, part of a course on the theory of information, covers the channel models used in communication systems. It discusses encoded, received, and decoded messages, the formal definition of a communications channel, memoryless channels, and crossover probability, with binary symmetric and binary erasure channels as examples. It also contrasts forward and backward channel probabilities and includes exercises for practice.



Presentation Transcript


  1. Lecture 13: Communications Channels (Section 4.1). Theory of Information

  2. The Big Picture. [Block diagram of the communications channel model: message → encoded message → communications channel (subject to noise) → received message → decoded message → message.] We will focus on this part, the communications channel.

  3. Formal Definitions. A communications channel consists of:
     - a finite channel alphabet A = {a1, …, ar};
     - a set of (forward) channel or transition probabilities P(aj received | ai sent) that sum to 1 for every i = 1, …, r:
       P(a1 received | ai sent) + … + P(ar received | ai sent) = 1.
     A channel is called memoryless if the outcome of any one symbol transmission is independent of the outcomes of the previous transmissions. If codeword c = c1...cn is sent through a memoryless channel, then the probability of receiving word x = x1...xn is equal to the product
       P(x1 received | c1 sent) • … • P(xn received | cn sent).
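
To make the product formula concrete, here is a minimal Python sketch (not from the slides; the dictionary layout and the function name word_probability are illustrative) that computes the probability of receiving a word x when a codeword c is sent through a memoryless channel:

    # Forward (transition) probabilities P(received | sent), stored as
    # {sent_symbol: {received_symbol: probability}}. The 10%-crossover BSC
    # from the next slide is used here purely as an illustration.
    BSC_10 = {
        "0": {"0": 0.9, "1": 0.1},
        "1": {"0": 0.1, "1": 0.9},
    }

    def word_probability(sent, received, channel):
        """P(received word | sent word) over a memoryless channel:
        the product of the per-symbol forward probabilities."""
        assert len(sent) == len(received)
        prob = 1.0
        for c, x in zip(sent, received):
            prob *= channel[c].get(x, 0.0)  # a missing arrow means probability 0
        return prob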

  4. Example 1: Binary symmetric channel (BSC). [Diagram: 0→0 and 1→1 each with probability 90%; 0→1 and 1→0 each with probability 10%, the crossover probability.] What is the probability that 001 is received when 001 is sent? What is the probability that 000 is received when 001 is sent? What is the probability that 110 is received when 001 is sent?
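
Assuming the channel is memoryless with crossover probability 0.1, the three questions can be checked in a few lines of Python (a sketch, not part of the slides; the helper name bsc_prob is mine):

    def bsc_prob(sent, received, crossover=0.1):
        """P(received | sent) over a memoryless binary symmetric channel."""
        prob = 1.0
        for c, x in zip(sent, received):
            prob *= crossover if c != x else 1.0 - crossover
        return prob

    print(bsc_prob("001", "001"))  # 0.9 * 0.9 * 0.9 = 0.729
    print(bsc_prob("001", "000"))  # 0.9 * 0.9 * 0.1 = 0.081
    print(bsc_prob("001", "110"))  # 0.1 * 0.1 * 0.1 = 0.001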

  5. Example 2: Binary erasure channel (BEC). [Diagram: 0→0 with probability 80%, 0→? with 15%, 0→1 with 5%; ?→? with 100%; 1→1 with 80%, 1→? with 15%, 1→0 with 5%.] ? signifies an erased or illegible symbol. Missing arrows mean 0 probability.
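
If the percentages are read off the diagram as above, the channel can be written down and sanity-checked against the definition on slide 3 (each row of forward probabilities must sum to 1). A sketch, with the probabilities stored as a nested dictionary:

    # Forward probabilities of the binary erasure channel on this slide;
    # '?' stands for an erased or illegible symbol, and symbols that do not
    # appear in a row have probability 0 (the "missing arrows").
    BEC = {
        "0": {"0": 0.80, "?": 0.15, "1": 0.05},
        "?": {"?": 1.00},
        "1": {"1": 0.80, "?": 0.15, "0": 0.05},
    }

    for sent, row in BEC.items():
        assert abs(sum(row.values()) - 1.0) < 1e-9  # P(a1 rec | ai sent) + ... = 1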

  6. Forward vs. Backward Channel Probabilities. [Same BEC diagram as on the previous slide.] Forward probability: P(aj received | ai sent). Backward probability: P(aj sent | ai received). The picture shows forward probabilities. What would be the backward probabilities?

  7. Forward vs. Backward Channel Probabilities. [Same BEC diagram as on the previous slide.] Forward probability: P(aj received | ai sent). Backward probability: P(aj sent | ai received). The picture shows forward probabilities. What would be the backward probabilities?

  8. Forward vs. Backward Channel Probabilities. [Diagram of the same channel, now labeled with its backward probabilities.] The picture shows the backward probabilities for the same channel.
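
Backward probabilities are not determined by the channel alone: they also depend on how often each symbol is sent. As a hedged sketch (the input distribution below is my assumption for illustration, not something stated on the slide), Bayes' rule gives P(ai sent | aj received) = P(aj received | ai sent) · P(ai sent) / P(aj received):

    # Backward probabilities for the BEC above via Bayes' rule, assuming
    # (purely for illustration) that 0 and 1 are each sent with probability 1/2
    # and '?' is never sent.
    BEC = {
        "0": {"0": 0.80, "?": 0.15, "1": 0.05},
        "?": {"?": 1.00},
        "1": {"1": 0.80, "?": 0.15, "0": 0.05},
    }
    prior = {"0": 0.5, "?": 0.0, "1": 0.5}

    def backward(channel, prior, sent, received):
        """P(sent | received) = P(received | sent) * P(sent) / P(received)."""
        p_received = sum(prior[s] * channel[s].get(received, 0.0) for s in channel)
        return prior[sent] * channel[sent].get(received, 0.0) / p_received

    print(backward(BEC, prior, "0", "0"))  # ~0.941: a received 0 was almost surely a sent 0
    print(backward(BEC, prior, "0", "?"))  # 0.5: an erasure says nothing about which bit was sent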

  9. Exercise 1 Draw the graph of a binary erasure channel that either transmits a bit correctly, or else erases it.

  10. Exercise 3 Why are channels of the following form called deterministic?
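
The diagram for this exercise is not reproduced in the transcript. For intuition only (this is the usual textbook meaning, assumed here rather than taken from the slide): a channel is deterministic when every input symbol is received as exactly one output symbol with probability 1, so each row of forward probabilities contains a single 1.

    # A hypothetical deterministic channel over the alphabet {a, b, c}:
    # every row has exactly one probability-1 arrow, so the received symbol
    # is completely determined by the sent one.
    DETERMINISTIC = {
        "a": {"a": 1.0},
        "b": {"a": 1.0},  # b is always received as a
        "c": {"c": 1.0},
    }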

  11. Exercise 5. Why are channels of the following form called useless? [Diagram: 0→0 with probability p, 0→1 with probability 1-p; 1→0 with probability p, 1→1 with probability 1-p.]
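
Reading the diagram as above, the forward probabilities do not depend on which symbol was sent: the received symbol is 0 with probability p and 1 with probability 1-p either way, so observing it tells the receiver nothing about the input. A quick numerical check (the value p = 0.7 is arbitrary):

    # "Useless" channel: both rows of forward probabilities are identical.
    p = 0.7  # arbitrary value, for illustration only
    USELESS = {
        "0": {"0": p, "1": 1 - p},
        "1": {"0": p, "1": 1 - p},
    }
    assert USELESS["0"] == USELESS["1"]  # output distribution is independent of the input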

  12. Exercise 5. What should we do with a binary symmetric channel whose crossover probability is significantly greater than ½? [Diagram: BSC with 0→0 and 1→1 each with probability 10%, and 0→1 and 1→0 each with probability 90%.]
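
One natural answer, sketched here under the assumption that this is what the exercise is after: relabel the outputs, i.e. flip every received bit. A BSC with crossover probability 0.9 then behaves exactly like one with crossover probability 0.1:

    # Flipping each received bit turns a BSC with crossover probability q > 1/2
    # into an equivalent BSC with crossover probability 1 - q < 1/2.
    def flip(received_word):
        return "".join("1" if bit == "0" else "0" for bit in received_word)

    print(flip("110"))  # -> "001"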

  13. Homework: Exercises 2, 4, 6 of Section 4.1.
