Understanding Information Theory: Communication Systems and Encoding

Dive into information theory: analyzing the flow of data, the encoding of symbols, noise interference, and error correction in communication systems. Learn about information entropy, efficient encoding, and how information is quantified. Explore applications in telecommunications and data compression.

Presentation Transcript


  1. Information Theory Michael J. Watts http://mike.watts.net.nz

  2. Lecture Outline • What is information theory? • Communication systems • Measuring information and uncertainty • Information entropy and encoding

  3. What is Information Theory? • Area of applied mathematics • Concerned with the analysis of a “communications system”

  4. Components of Communications Systems • Source • Where the information is coming from • Encoder • Encodes the information and transmits it as a signal • Channel • Medium over which information is transmitted

  5. Components of Communications Systems • Noise • Random interference added to signal • Decoder • Receives the information and decodes it for the destination • Destination • What is receiving the information
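
To make this pipeline concrete, here is a minimal sketch in Python (the function names are illustrative, not from the lecture): a source message is encoded to bits, a noisy channel randomly flips some of them, and a decoder reconstructs the message for the destination.

    import random

    def encode(message: str) -> list[int]:
        """Encoder: turn each character into 8 bits (ASCII, most significant bit first)."""
        return [(ord(ch) >> i) & 1 for ch in message for i in range(7, -1, -1)]

    def channel(bits: list[int], flip_prob: float = 0.01) -> list[int]:
        """Noisy channel: each bit is flipped independently with probability flip_prob."""
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def decode(bits: list[int]) -> str:
        """Decoder: regroup the bits into 8-bit characters for the destination."""
        chars = []
        for i in range(0, len(bits), 8):
            value = sum(bit << (7 - j) for j, bit in enumerate(bits[i:i + 8]))
            chars.append(chr(value))
        return "".join(chars)

    received = decode(channel(encode("hello"), flip_prob=0.05))
    print(received)  # usually close to "hello"; noise may corrupt some characters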

  6. Information Theory • Information theory analyses the flow of information through these systems • How much information can go through it • How the information can be encoded • Must quantify information first

  7. Information Theory • Definitions of information • Data used to make a decision • Data that informs • Information tells you something you did not already know

  8. Quantity of Information • Assume we have a random variable X • X can have exactly N states • Each state n has a probability P(n) • The probability of a state determines its uncertainty (self-information): I(n) = -log2 P(n) bits • A state with P(n) = 0.5 carries exactly 1 bit of uncertainty; less probable states carry more
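
A quick sketch in Python, using the standard definition of self-information, I(n) = -log2 P(n):

    import math

    def self_information(p: float) -> float:
        """Uncertainty (self-information) of a state with probability p, in bits."""
        return -math.log2(p)

    for p in (0.9, 0.5, 0.25, 0.1):
        print(f"P(n) = {p}: {self_information(p):.3f} bits")
    # Near-certain states (0.9) carry little information, P(n) = 0.5 gives
    # exactly 1 bit, and rarer states carry progressively more.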

  9. Quantity of Information • High uncertainty == more information • The more uncertain an event is, the more information is conveyed when it occurs • Surprise! • Entropy is the expected uncertainty over all states: H(X) = -Σ P(n) log2 P(n) • Total uncertainty of the system
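
Entropy is the probability-weighted average of those per-state uncertainties. A minimal sketch:

    import math

    def entropy(probs: list[float]) -> float:
        """Shannon entropy H(X) = -sum of P(n) * log2 P(n), in bits per symbol."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximum for 4 states
    print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits: skew lowers uncertainty
    print(entropy([1.0]))                     # 0.0 bits: a certain outcome is no surprise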

  10. Encoding • Encoding is representing symbols as bits • With a fixed-length code, bits per symbol grows with the size of the alphabet: ceil(log2 N) bits for N symbols • Minimum bits for symbol n = its uncertainty, -log2 P(n); minimum average bits per symbol = the entropy H(X) • More certain == fewer bits • Efficient encoding == fewer average bits • Data compression
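
Huffman coding is one standard way to approach this minimum (the slide names the goal, not this particular algorithm; the implementation below is an illustrative sketch). For the dyadic probabilities in the example, the average code length matches the entropy exactly:

    import heapq
    import itertools
    import math

    def huffman_codes(freqs: dict[str, float]) -> dict[str, str]:
        """Build a Huffman code: frequent symbols get shorter bit strings."""
        counter = itertools.count()  # tie-breaker so the heap never compares dicts
        heap = [(p, next(counter), {sym: ""}) for sym, p in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, left = heapq.heappop(heap)   # two least probable subtrees
            p2, _, right = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (p1 + p2, next(counter), merged))
        return heap[0][2]

    freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    codes = huffman_codes(freqs)
    avg_bits = sum(p * len(codes[s]) for s, p in freqs.items())
    h = -sum(p * math.log2(p) for p in freqs.values())
    print(codes)        # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    print(avg_bits, h)  # 1.75 1.75 -- the code achieves the entropy bound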

  11. Error Correction • Noise complicates things • Noise: random changes to bits in the signal • Error correcting codes account for noise • Correct a change in any one bit • Add redundant bits to an encoding scheme • The additional parity bits carry redundancy about the information, not new information • Hamming code • Used in ECC memory and elsewhere
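
Here is a minimal Hamming(7,4) sketch (an illustrative implementation, not the lecture's code): 4 data bits gain 3 parity bits, and the parity-check syndrome points directly at any single flipped bit.

    def hamming74_encode(d: list[int]) -> list[int]:
        """Encode 4 data bits as a 7-bit codeword (positions 1-7).
        Parity bits sit at positions 1, 2, 4; data bits at 3, 5, 6, 7."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4  # checks positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4  # checks positions 2, 3, 6, 7
        p4 = d2 ^ d3 ^ d4  # checks positions 4, 5, 6, 7
        return [p1, p2, d1, p4, d2, d3, d4]

    def hamming74_decode(c: list[int]) -> list[int]:
        """Correct any single-bit error, then return the 4 data bits."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
        error_pos = s1 + 2 * s2 + 4 * s4  # 0 means no error detected
        if error_pos:
            c = c.copy()
            c[error_pos - 1] ^= 1         # flip the corrupted bit back
        return [c[2], c[4], c[5], c[6]]

    codeword = hamming74_encode([1, 0, 1, 1])
    codeword[5] ^= 1                      # channel noise flips one bit
    print(hamming74_decode(codeword))     # [1, 0, 1, 1] -- error corrected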

  12. Summary • Information theory deals with the transmission and encoding of symbols (information) • Information is measured as how “surprising” a symbol is • Applications in telecommunications and compression • Noise can interfere with a signal • Error correction is needed
