
History and Overview of Information Theory

Explore the history of information theory, from the optical telegraph to modern-day computers and the internet, and the mathematical theory behind communication. This Day 2 session also covers civil engineering and cultural exchange.


Presentation Transcript


  1. Day 2 • Information theory (信息論) • Civil engineering (土木工程) • Cultural exchange

  2. Information Theory 信息論

  3. Information Theory • History and overview • Mathematical theory

  4. A history of communication • ??? • 1790 Optical telegraphs • 1838 Electrical telegraph (電報) • 1876 Telephone • 1896 Radio • 1927 Television • 1940s Computers • 1983 Internet

  5. Talking drums • African talking drums were among the earliest long-distance communication systems, relaying messages from village to village at about 100 mph • Talking drums: “Return here” = Make your feet come back the way they went, Make your legs come back the way they went, Plant your feet and your legs below, In the village which belongs to us.

  6. The optical telegraph • How do we communicate 26 possible letters? • Optical telegraph: 5 shutters opened and closed with pulleys (so 2^5 = 32 possibilities) • In France, 1/3 of messages arrived within a day

  7. The electrical telegraph • Samuel Morse 1838 • Letters represented with sequences of ● and —

  8. An alternate history… • Imagine the telegraph (電報) were invented in China. • Devise a system to send messages using 3 symbols: ●, —, and _ (space) • Each symbol costs $1. Try to spend as little money as possible!

  9. A twist • Devise a system to send messages using 3 symbols: ●, —, and _ (space) • Each symbol costs $1. Try to spend as little money as possible! • 10% of the symbols will be changed.
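One possible approach to the two exercises above, as a rough sketch (the letter-frequency ordering is an assumption, not from the slides): use the space symbol as a letter separator and give shorter ●/— sequences to more frequent letters, exactly as Morse did.

```python
# A sketch of one strategy for the 3-symbol telegraph exercise:
# '_' (space) separates letters; frequent letters get short codewords.
from itertools import product

# English letters, roughly ordered from most to least frequent (assumed).
LETTERS_BY_FREQUENCY = "ETAOINSHRDLCUMWFGYPBVKJXQZ"

def build_codebook():
    """Assign the shortest ●/— strings to the most frequent letters."""
    codewords = []
    length = 1
    while len(codewords) < len(LETTERS_BY_FREQUENCY):
        for combo in product("●—", repeat=length):
            codewords.append("".join(combo))
            if len(codewords) == len(LETTERS_BY_FREQUENCY):
                break
        length += 1
    return dict(zip(LETTERS_BY_FREQUENCY, codewords))

CODEBOOK = build_codebook()

def encode(message):
    """Encode a message; every symbol, including each separator, costs $1."""
    return "_".join(CODEBOOK[c] for c in message.upper() if c.isalpha())

msg = encode("RETURN HERE")
print(msg, f"cost = ${len(msg)}")
```

The twist (10% of symbols corrupted) cuts the other way: the shorter the message, the more damage each flipped symbol does, which is exactly the compression-versus-error-correction tension the later slides develop.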

  10. Discussion • How much information did you have to transmit? • What problems did you face?

  11. Flaws with the telegraph • Shortened messages are easily corrupted by small errors • Telegraph: BAY -> BUY

  12. Big questions • How do we measure information? • How do we transmit a message in the most compact way? Data compression • How do we transmit a message over a noisy channel? Error correction

  13. Information Theory • History and overview • Mathematical theory

  14. “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.” Claude Shannon, A Mathematical Theory of Communication, 1948

  15. Why care about information theory? Internet, computers, satellites, CDs, telephones, TV, … None of these would exist without a mathematical theory of communication.

  16. Big questions • How do we measure information? • How do we transmit a message in the most compact way? Data compression • How do we transmit a message over a noisy channel? Error correction

  17. How do we measure information? • We measure information in bits and bytes. • What does this mean?

  18. How do we measure information? • Guess a number 1-100, I’ll tell you if my number is greater or smaller. How many tries do you need? • How many weighings to find a heavier coin in 9 coins? • How many weighings to find a lighter/heavier coin in 12 coins? Can you know whether it’s lighter or heavier?

  19. How do we measure information? Guess a number 1-100, I’ll tell you if my number is greater or smaller. How many tries do you need? We need about log₂(100) ≈ 7 guesses, about 7 bits.
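A minimal sketch of why 7 guesses always suffice: binary search halves the candidate range with each greater/smaller answer, so ⌈log₂(100)⌉ = 7 questions pin down any number.

```python
import math

def guesses_needed(secret, low=1, high=100):
    """Binary search for `secret`; each question asks 'greater or smaller?'"""
    count = 0
    while low < high:
        mid = (low + high) // 2
        count += 1
        if secret > mid:
            low = mid + 1
        else:
            high = mid
    return count

worst = max(guesses_needed(n) for n in range(1, 101))
print(worst, math.ceil(math.log2(100)))  # both 7
```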

  20. How many weighings to find a heavier coin in 9 coins? log₃(9) = 2 • How many weighings to find a lighter/heavier coin in 12 coins? Can you know whether it’s lighter or heavier? Guess: log₃(24) ≈ 3

  21. More problems • You are given 16 balls, one of which is heavier/lighter. You are also given a balance that only says that the two sides balance, or do not balance. How many weighings do you need to find the odd ball? • How many weighings to find a lighter/heavier coin in 39 coins? • How many weighings to find two different-weight coins in 12 coins? (Take a guess!)

  22. Information is distinguishing one possibility in a sea of possibilities.

  23. Possibilities multiply • 1 bit = distinguishing between 2 possibilities • Which gives more information: a number between 1 and 1000, or 10 switches? • Notation: log₂ = lg

  24. The idea of the bit • Why binary? • To store a number between 1 and n, need lg(n) bits. • A string of s symbols, each symbol drawn from an alphabet of n symbols, contains slg(n) bits of information.
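A quick numeric check of the formula above, as a sketch; it also answers the slide-23 question:

```python
import math

def bits_of_information(s, n):
    """Information in a string of s symbols from an alphabet of n symbols."""
    return s * math.log2(n)

print(bits_of_information(1, 1000))  # a number 1..1000: ~9.97 bits
print(bits_of_information(10, 2))    # 10 on/off switches: exactly 10 bits
```

So 10 switches carry slightly more information than one number between 1 and 1000.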

  25. Big questions • How do we measure information? • How do we transmit a message in the most compact way? Data compression • How do we transmit a message over a noisy channel? Error correction

  26. Language is redundant If u cn rd ths u cn gt a gd jb w hi pa

  27. Language is redundant If you can read this you can get a good job with high pay (25/44 letters)

  28. Language is redundant

  29. Language is redundant… • Because letter probabilities are not all the same! • Which is easier to transmit: 10 fair coin flips or 10 weighted coin flips?

  30. How much can we compress information? • Suppose letters appear with probabilities p₁, …, pₙ and there are l letters in the message. • Naively this takes space l·log₂(n) bits. • But… only around 2^(lH) of the possible messages are likely, where H is the entropy of the letter distribution.

  31. Entropy = disorder • Information content of a letter with probability p: -lg(p); averaged over the alphabet, H = -Σ p·lg(p) • Entropy measures disorder
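A short sketch tying slides 29-31 together: computing H = -Σ p·lg(p) for fair versus weighted coins, and the 2^(lH) typical-message count from slide 30.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * lg p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = entropy([0.5, 0.5])      # 1.0 bit per flip
weighted = entropy([0.9, 0.1])  # ~0.47 bits per flip
print(fair, weighted)

# Of the 2**l possible l-flip sequences, only about 2**(l*H) are typical,
# so 10 weighted flips compress to about 4.7 bits on average.
l = 10
print(2**l, 2**(l * weighted))  # 1024 possible vs ~26 typical sequences
```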

  32. Data compression • Huffman encoding • Bigrams, trigrams,…
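A minimal sketch of Huffman encoding, the technique this slide names; the letter frequencies below are illustrative assumptions.

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code: repeatedly merge the two least-frequent nodes."""
    # Heap entries: (frequency, tiebreaker, {symbol: codeword-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Illustrative (assumed) letter frequencies:
freqs = {"e": 0.4, "t": 0.25, "a": 0.2, "q": 0.1, "z": 0.05}
code = huffman_code(freqs)
print(code)  # frequent letters get short codewords, rare ones long

avg_len = sum(freqs[s] * len(c) for s, c in code.items())
print(avg_len)  # average bits/letter, close to the entropy lower bound
```

Bigrams and trigrams (the slide's second bullet) extend the same idea by treating frequent letter pairs and triples as single symbols.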

  33. Big questions • How do we measure information? • How do we transmit a message in the most compact way? Data compression • How do we transmit a message over a noisy channel? Error correction

  34. Error Correction • The key to correcting errors is redundancy.

  35. Get in groups of 3. • We will place a black or white hat on each person's head, each color with probability ½, so that no one can see his or her own hat but everyone can see the hats of the others. At the count of three, each of you may either guess the color of your own hat or stay silent. • If at least one person guesses, and everyone who guesses is correct, then you all win. • If anyone guesses wrong, or no one guesses, then you all lose. • What is your optimal strategy?
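The slide leaves the strategy open; here is a simulation sketch of one well-known answer (an assumption on my part, not stated on the slide): if you see two hats of the same color, guess the opposite color; otherwise stay silent. The group then loses only when all three hats match, 2 of the 8 equally likely configurations.

```python
import itertools

def play(hats):
    """Seeing two same-colored hats, guess the other color; else stay silent."""
    guesses = []
    for i, my_hat in enumerate(hats):
        others = [h for j, h in enumerate(hats) if j != i]
        if others[0] == others[1]:
            guesses.append((my_hat, 1 - others[0]))  # guess the opposite color
    # Win: at least one guess, and every guess correct.
    return bool(guesses) and all(actual == guess for actual, guess in guesses)

wins = sum(play(hats) for hats in itertools.product([0, 1], repeat=3))
print(f"{wins}/8 configurations won")  # 6/8 = 75%
```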

  36. Error-correcting code • Ex. Repetition code R3 • Rate = 1/3 (bad) • Probability of block error ≈ 3f² ≈ 0.03 for f = 0.1 • Rate vs. probability of block error
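A sketch of the R3 repetition code from this slide, assuming the usual binary symmetric channel: each bit is sent three times and decoded by majority vote, so a block fails only when two or three of its copies flip, probability 3f²(1-f) + f³ ≈ 3f² ≈ 0.03 for f = 0.1.

```python
import random

def encode_r3(bits):
    """Repetition code R3: send each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode_r3(received):
    """Majority vote over each block of three received bits."""
    return [int(sum(received[i:i+3]) >= 2) for i in range(0, len(received), 3)]

def noisy_channel(bits, f, rng):
    """Binary symmetric channel: flip each bit independently with probability f."""
    return [b ^ (rng.random() < f) for b in bits]

rng = random.Random(0)
f, n = 0.1, 100_000
message = [rng.randint(0, 1) for _ in range(n)]
decoded = decode_r3(noisy_channel(encode_r3(message), f, rng))
errors = sum(a != b for a, b in zip(message, decoded))
print(errors / n)  # ≈ 3·f²·(1-f) + f³ ≈ 0.028
```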

  37. The problem • Inefficient: to store 1 GB with no errors over 10 years (error probability 10⁻¹⁵), the repetition code needs 60 GB. • Inefficient error correction!

  38. (7,4) Hamming code

  39. (7,4) Hamming code
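A sketch of the (7,4) Hamming code named on the two slides above: 3 parity bits protect 4 data bits, and the 3-bit syndrome points directly at any single flipped position.

```python
def hamming74_encode(d):
    """Encode 4 data bits d1..d4 as (p1, p2, d1, p3, d2, d3, d4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(r):
    """Locate a single flipped bit via the parity-check syndrome and fix it."""
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]  # checks positions 1,3,5,7
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]  # checks positions 2,3,6,7
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]  # checks positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # binary position of the error (0 = none)
    if syndrome:
        r = r[:]
        r[syndrome - 1] ^= 1
    return [r[2], r[4], r[5], r[6]]  # recover d1..d4

word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1                   # flip one bit in transit
print(hamming74_decode(word))  # [1, 0, 1, 1] — error corrected
```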

  40. Comparing codes • Rate = 4/7

  41. Noisy-channel coding theorem • Information can be communicated over a noisy channel at a non-zero rate with arbitrarily small error probability.
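In symbols (a standard statement of the theorem, not spelled out on the slide): any rate R below the channel capacity C is achievable with arbitrarily small error probability. For the binary symmetric channel with flip probability f, the capacity is:

```latex
% Noisy-channel coding theorem: every rate R < C is achievable.
% Capacity of the binary symmetric channel with flip probability f:
C = 1 - H_2(f), \qquad H_2(f) = -f \log_2 f - (1 - f) \log_2 (1 - f)
```

For f = 0.1 this gives C = 1 - H₂(0.1) ≈ 0.53 bits per channel use, so reliable transmission needs only about 1/0.53 ≈ 1.9 times the raw data; this is where the ~2x GB figure on the next slide comes from.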

  42. Implications • To store x GB with no errors over 10 years (error probability 10⁻¹⁵), we do not need 60x GB after all; about 2x GB suffices.
