
Information Theory and Games (Ch. 16)



  1. Information Theory and Games (Ch. 16)

  2. Information Theory • Information theory studies the flow of information • In this context, information has no intrinsic meaning • Information may be partial (e.g., a sound) • Information measures the degree of uncertainty • Basic model: a sender (1) passes information to a receiver (2) • For a single yes/no message, the information gained is a number in the [0, 1] range: • 0 bits: gained no information • 1 bit: gained the most information • How much information did the receiver (2) gain? • Was there any distortion ("noise") while passing the information? [Diagram: sender (1) passes information to receiver (2)]

  3. Recall: Probability Distribution • The events E1, E2, …, Ek must meet the following conditions: • Exactly one of them always occurs • No two can occur at the same time • The probabilities p1, …, pk are numbers associated with these events such that 0 ≤ pi ≤ 1 and p1 + … + pk = 1 • A probability distribution assigns probabilities to events so that the two properties above hold
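
The two conditions on the probabilities are easy to check mechanically. A minimal Python sketch (illustrative only; the helper name is our own, not from the slides):

```python
# Minimal sketch: checking that a list of numbers forms a valid
# probability distribution (each pi in [0, 1], and they sum to 1).
def is_probability_distribution(probs, tol=1e-9):
    # Each pi must satisfy 0 <= pi <= 1.
    if any(p < 0 or p > 1 for p in probs):
        return False
    # The probabilities must sum to 1 (within floating-point tolerance).
    return abs(sum(probs) - 1.0) <= tol

print(is_probability_distribution([0.5, 0.5]))        # True  (fair coin)
print(is_probability_distribution([0.99, 0.01]))      # True  (unfair coin)
print(is_probability_distribution([0.7, 0.7, -0.4]))  # False
```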

  4. Information Gain versus Probability • Suppose that I flip a "totally unfair" coin (it always comes up heads): • What is the probability that it will come up heads? 1 • How much information do you gain when it lands? 0 • Suppose that I flip a fair coin: • What is the probability that it will come up heads? 0.5 • How much information do you gain when it lands? 1 bit

  5. Information Gain versus Probability (2) • Suppose that I flip a "very unfair" coin (99% of the time it comes up heads): • What is the probability that it will come up heads? 0.99 • How much information do you gain when it lands? A fraction of a bit [Plot: information gain versus probability; gain decreases as probability increases]

  6. Information Gain versus Probability (3) • Imagine a stranger, “JL”. Which of the following questions, once answered, will provide more information about JL: • Did you have breakfast this morning? • What is your favorite color? • Hints: • What are your chances of guessing the answer correctly? • What if you knew JL and you knew his preferences?

  7. Information Gain versus Probability (4) • If the probability that an event occurs is high, I gain less information when the event actually occurs • If the probability that an event occurs is low, I gain more information when the event actually occurs • In general, the information provided by an event decreases as the probability of that event increases • Information gain of an event e (Shannon and Weaver, 1949): I(e) = log2(1/p(e))
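
The formula can be checked against the coin examples from slides 4 and 5. A minimal Python sketch (the function name is our own):

```python
import math

# Shannon information gain of an event e with probability p(e):
# I(e) = log2(1 / p(e))   (Shannon and Weaver, 1949)
def information_gain(p):
    return math.log2(1.0 / p)

# The coin examples from slides 4-5:
print(information_gain(1.0))   # 0.0 bits     -- "totally unfair" coin
print(information_gain(0.5))   # 1.0 bit      -- fair coin
print(information_gain(0.99))  # ~0.0145 bits -- "very unfair" coin
```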

  8. Information, Uncertainty, and Meaningful Play • Recall the discussion of the relation between uncertainty and games • What happens if there is no uncertainty at all in a game (both at the macro level and the micro level)? • What is the relation between uncertainty and information gain? If there is no uncertainty, then information gain is 0. As a result, the player's actions are not meaningful!

  9. Let's Play Twenty Questions • I am thinking of an animal: • You can ask "yes/no" questions only • Winning condition: • You guess the animal correctly after asking 20 questions or fewer, and • you make no more than 3 attempts to guess the right animal

  10. What is happening? (Constitutive Rules) • We are building a binary (two-children) decision tree: each question splits the possibilities into a "yes" branch and a "no" branch • # potential questions per level: level 0 → 2^0 = 1, level 1 → 2^1 = 2, level 2 → 2^2 = 4, level 3 → 2^3 = 8, … • # questions made = log2(# potential questions)
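
To see the arithmetic behind the formula: 20 yes/no questions span a binary tree with 2^20 leaves, so the game can separate roughly a million possible answers. A quick illustrative check in Python:

```python
import math

# With k yes/no questions, a binary decision tree distinguishes 2**k
# outcomes, matching the level table on the slide.
for levels in range(4):
    print(levels, 2**levels)   # 0 -> 1, 1 -> 2, 2 -> 4, 3 -> 8

print(2**20)                   # 1048576 distinguishable answers
print(math.log2(2**20))        # 20.0 questions needed
```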

  11. Same Principle Operates for Online Version • Game: http://www.20q.net/ • OK, so how can this be done? • It uses information gain: a decision tree built over the table of movies stored in the system • Nice: the resulting tree is optimal [Decision tree diagram: root Patrons? (none / some / full), with further tests on WaitEstimate? (0-10, 10-30, 30-60, >60), Alternate?, Hungry?, Reservation?, Fri/Sat?, Raining?, and Bar?, ending in Yes/No leaves]

  12. Example

  13. Expected Information Gain • We are given a probability distribution: • The events E1, E2, …, Ek • The probabilities p1, …, pk associated with these events • We have the information gain for those events: I(E1), I(E2), …, I(Ek) • The Expected Information Gain (EIG): • EIG = p1 * I(E1) + … + pk * I(Ek)
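
EIG is exactly the Shannon entropy of the distribution. A minimal Python sketch (the helper name is our own), reusing the coin distributions from earlier slides:

```python
import math

# Expected Information Gain of a distribution p1..pk:
# EIG = p1*I(E1) + ... + pk*I(Ek), with I(Ei) = log2(1/pi).
# (This is the Shannon entropy of the distribution.)
def expected_information_gain(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(expected_information_gain([0.5, 0.5]))    # 1.0 bit    (fair coin)
print(expected_information_gain([0.99, 0.01]))  # ~0.08 bits (unfair coin)
```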

  14. Decision Tree • Obtained using expected information gain • In this example it has the minimum height, which is nice (why?) [Decision tree diagram: root Patrons? (none → No, some → Yes, full → Hungry?); Hungry? (no → No, yes → Type?); Type? (french → Yes, italian → No, thai → Fri/Sat?, burger → Yes); Fri/Sat? (no → No, yes → Yes)]
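
The slide does not show the construction itself. A standard greedy approach (ID3-style selection, shown here as an assumed mechanism) splits on the attribute with the highest expected information gain, i.e., the one that leaves the least remaining entropy in the answers. A toy sketch with made-up data loosely modeled on the slide:

```python
import math
from collections import Counter

# Entropy of a list of class labels (expected information gain of
# guessing the label): sum over labels of (c/n) * log2(n/c).
def entropy(labels):
    n = len(labels)
    return sum((c / n) * math.log2(n / c) for c in Counter(labels).values())

# Weighted entropy of the labels after splitting the rows on attr.
def remaining_entropy(rows, labels, attr):
    n = len(rows)
    total = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        total += (len(subset) / n) * entropy(subset)
    return total

# Highest information gain == lowest remaining entropy after the split.
def best_attribute(rows, labels, attrs):
    return min(attrs, key=lambda a: remaining_entropy(rows, labels, a))

# Toy data (invented for illustration), in the spirit of the slide.
rows = [
    {"Patrons": "some", "Hungry": "yes"},
    {"Patrons": "full", "Hungry": "yes"},
    {"Patrons": "none", "Hungry": "no"},
    {"Patrons": "some", "Hungry": "no"},
]
labels = ["yes", "no", "no", "yes"]
print(best_attribute(rows, labels, ["Patrons", "Hungry"]))  # Patrons
```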

  15. Noise and Redundancy • Noise: distortion in the communication between components [Diagram: sender (1) passes distorted information to receiver (2)] • Example in a game? Charades: playing with noise • Redundancy: a counterbalance to noise; passing the same information through two or more different channels [Diagram: sender (1) passes the same information to receiver (2) over two channels] • Making sure information is communicated properly • Example in a game? Crossword puzzles. Other examples? • Balancing act: noise versus redundancy • Too much information: the signal might be lost • Too little information: the signal might be lost
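
One concrete way to picture the trade-off (our illustration, not from the slides) is a repetition code: send each bit several times over a noisy channel and let the receiver take a majority vote:

```python
import random

# Channel that flips each bit independently with probability `noise`.
def transmit(bits, noise, rng):
    return [b ^ (rng.random() < noise) for b in bits]

# Redundancy: send `copies` independent noisy transmissions of the
# message, then majority-vote each bit position at the receiver.
def send_with_redundancy(bits, noise, rng, copies=3):
    received = [transmit(bits, noise, rng) for _ in range(copies)]
    return [int(sum(col) > copies // 2) for col in zip(*received)]

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
print(transmit(message, 0.2, rng))              # single noisy copy
print(send_with_redundancy(message, 0.2, rng))  # majority vote, fewer errors
```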
